Synchronization does not complete, and the node crashes when starting farming

Issue Report

Environment

Operating system: linux
2023-03-28T04:32:30.853626Z INFO sc_sysinfo: 💻 CPU architecture: x86_64
2023-03-28T04:32:30.853682Z INFO sc_sysinfo: 💻 Target environment: gnu
2023-03-28T04:32:30.853697Z INFO sc_sysinfo: 💻 CPU: DO-Regular
2023-03-28T04:32:30.853711Z INFO sc_sysinfo: 💻 CPU cores: 4
2023-03-28T04:32:30.853724Z INFO sc_sysinfo: 💻 Memory: 7948MB
2023-03-28T04:32:30.853738Z INFO sc_sysinfo: 💻 Kernel: 5.19.0-23-generic
2023-03-28T04:32:30.853751Z INFO sc_sysinfo: 💻 Linux distribution: Ubuntu 22.10
2023-03-28T04:32:30.853764Z INFO sc_sysinfo: 💻 Virtual machine: yes
2023-03-28T04:32:30.891643Z INFO sc_service::builder: 📦 Highest known block at #637318

Problem

Node crashes after launch

2023-03-28T04:34:37.730158Z INFO substrate: ✨ Imported #637400 (0x96ae…a750)
2023-03-28T04:34:39.969994Z INFO substrate: ✨ Imported #637401 (0x4774…677d)
2023-03-28T04:34:41.350227Z INFO substrate: ⚙️ Preparing 0.6 bps, target=#637407 (33 peers), best: #637401 (0x4774…677d), finalized #637301 (0xb459…6fb6), ⬇ 16.0kiB/s ⬆ 1.3MiB/s
2023-03-28T04:34:41.461001Z INFO substrate: ✨ Imported #637402 (0xbff4…6f2a)
2023-03-28T04:34:41.924588Z INFO substrate: ✨ Imported #637403 (0x74c7…7c75)
2023-03-28T04:34:44.095288Z INFO substrate: ✨ Imported #637404 (0x2eb8…971b)
2023-03-28T04:34:45.825516Z INFO substrate: ✨ Imported #637405 (0xe24f…d417)
2023-03-28T04:34:46.356825Z INFO substrate: 💤 Idle (38 peers), best: #637405 (0xe24f…d417), finalized #637305 (0x7aad…2b43), ⬇ 20.7kiB/s ⬆ 1.5MiB/s
2023-03-28T04:34:48.107840Z INFO substrate: ✨ Imported #637405 (0x5960…1405)
2023-03-28T04:34:51.607121Z INFO substrate: 💤 Idle (48 peers), best: #637406 (0x5fa8…ce48), finalized #637306 (0x8af6…0d4b), ⬇ 18.4kiB/s ⬆ 2.0MiB/s
2023-03-28T04:34:53.225514Z INFO substrate: ✨ Imported #637407 (0xb51c…1ec1)
2023-03-28T04:34:55.722034Z INFO substrate: ✨ Imported #637408 (0x9b73…b027)
2023-03-28T04:34:56.608111Z INFO substrate: 💤 Idle (43 peers), best: #637408 (0x9b73…b027), finalized #637308 (0x1023…5b13), ⬇ 17.2kiB/s ⬆ 2.0MiB/s
2023-03-28T04:34:57.717239Z INFO substrate: ✨ Imported #637409 (0x5b4c…dabd)
2023-03-28T04:34:59.940726Z INFO substrate: ✨ Imported #637410 (0xa82b…93cb)
2023-03-28T04:35:01.608510Z INFO substrate: 💤 Idle (58 peers), best: #637410 (0xa82b…93cb), finalized #637310 (0xfc11…0f70), ⬇ 19.2kiB/s ⬆ 1.7MiB/s
2023-03-28T04:35:01.826522Z INFO substrate: ✨ Imported #637411 (0xe175…7517)
2023-03-28T04:35:02.484056Z INFO subspace_sdk::farmer: Initializing piece cache… db_path="/root/.local/share/subspace-cli/cache/piece_cache_db" size=330
2023-03-28T04:35:02.778233Z INFO subspace_sdk::farmer: Piece cache initialized successfully current_size=330
2023-03-28T04:35:03.086348Z INFO substrate: ✨ Imported #637411 (0x5728…204c)
2023-03-28T04:35:03.347150Z INFO substrate: ✨ Imported #637412 (0xefc5…c0db)
2023-03-28T04:35:06.611396Z INFO substrate: 💤 Idle (56 peers), best: #637412 (0xefc5…c0db), finalized #637312 (0xa8cc…bf2c), ⬇ 22.2kiB/s ⬆ 1.7MiB/s
2023-03-28T04:35:07.043545Z INFO substrate: ✨ Imported #637413 (0x6e13…4251)
2023-03-28T04:35:07.713237Z INFO substrate: ✨ Imported #637414 (0xd3a1…a86c)
Farmer started successfully!
2023-03-28T04:35:11.324173Z INFO single_disk_plot{disk_farm_index=0}: subspace_farmer::single_disk_plot: Subscribing to slot info notifications
2023-03-28T04:35:11.326141Z INFO single_disk_plot{disk_farm_index=0}: subspace_farmer::reward_signing: Subscribing to reward signing notifications
The application panicked (crashed).
Message: Cannot block the current thread from within a runtime. This happens because a function attempted to block the current thread while the thread is being used to drive asynchronous tasks.
Location: /home/runner/.cargo/git/checkouts/subspace-sdk-cf03f5a296714cdf/dfa443f/src/farmer.rs:647
Backtrace omitted. Run with RUST_BACKTRACE=1 environment variable to display it.
Run with RUST_BACKTRACE=full to include source snippets.
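
For context, this panic is tokio's standard complaint when something calls a blocking wait (such as Handle::block_on) from a thread that is itself driving the async runtime. Without seeing farmer.rs:647 this is only the generic pattern, not subspace-sdk's actual code, but a minimal sketch of the failure mode and the usual fix looks like this:

// tokio = { version = "1", features = ["full"] }
use tokio::runtime::Handle;

#[tokio::main]
async fn main() {
    // Reproduces the crash above: blocking on a runtime worker thread
    // panics with "Cannot block the current thread from within a runtime."
    // let _ = Handle::current().block_on(async { 42 });

    // The usual fix: move the blocking wait onto a dedicated blocking
    // thread via spawn_blocking, which is allowed to park.
    let handle = Handle::current();
    let value = tokio::task::spawn_blocking(move || handle.block_on(async { 40 + 2 }))
        .await
        .expect("blocking task panicked");
    assert_eq!(value, 42);
}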
2023-03-28T04:35:11.562966Z INFO new{user_space_pledged=Some(1000.0 MB)}: subspace_cli::summary: close time.busy=287µs time.idle=58.8s
2023-03-28T04:35:11.613701Z INFO substrate: :zzz: Idle (69 peers), best: #637414 (0xd3a1…a86c), finalized #637314 (0x773b…4a2e), :arrow_down: 16.3kiB/s :arrow_up: 2.8MiB/s
2023-03-28T04:35:12.040101Z INFO single_disk_plot{disk_farm_index=0}: subspace_farmer::single_disk_plot: close time.busy=8.52s time.idle=720ms
subspaced.service: Main process exited, code=exited, status=101/n/a
subspaced.service: Failed with result 'exit-code'.
subspaced.service: Consumed 13min 42.984s CPU time.
subspaced.service: Scheduled restart job, restart counter is at 87.
Stopped Subspace Node.
subspaced.service: Consumed 13min 42.984s CPU time.
Started Subspace Node.
2023-03-28T04:35:13.891862Z INFO subspace_cli::utils: Increase file limit from soft to hard (limit is 1024000)
2023-03-28T04:35:13.894975Z INFO validate_config:parse_config: subspace_cli::config: close time.busy=2.90ms time.idle=69.4µs
2023-03-28T04:35:13.895123Z INFO validate_config: subspace_cli::config: close time.busy=3.11ms time.idle=30.1µs
Starting node …
2023-03-28T04:35:15.502049Z INFO subspace_service::piece_cache: Storage provider cache loaded - 1106956 items.


Hello, thanks for reporting! Please check:

Already discussed here.

Can you help me update? I didn't find instructions on how to do it.

I updated like this:

wget -O subspace-cli https://github.com/subspace/subspace-cli/releases/download/v0.1.11-alpha/subspace-cli-ubuntu-x86_64-v3-v0.1.11-alpha

sudo chmod +x subspace-cli

sudo mv subspace-cli /usr/local/bin/

sudo systemctl restart subspaced

journalctl -u subspaced -f

But now I'm seeing these errors:

2023-03-28T06:53:26.362830Z  WARN libp2p_swarm: Incoming connection rejected: ConnectionLimit { limit: 100, current: 100 }
2023-03-28T06:53:26.364527Z  WARN libp2p_swarm: Incoming connection rejected: ConnectionLimit { limit: 100, current: 100 }
2023-03-28T06:53:26.369303Z  WARN libp2p_swarm: Incoming connection rejected: ConnectionLimit { limit: 100, current: 100 }
[… the same warning repeated ~28 more times within the same second]

Update:

The node either does not start, or it's not clear whether it has:

2023-03-28T07:04:11.561140Z  WARN libp2p_swarm: Incoming connection rejected: ConnectionLimit { limit: 100, current: 100 }
2023-03-28T07:04:11.561184Z  WARN libp2p_swarm: Incoming connection rejected: ConnectionLimit { limit: 100, current: 100 }
[… the same warning repeated ~17 more times within the same millisecond]
2023-03-28T07:04:11.902190Z  INFO sc_informant: ♻️  Reorg on #638892,0xe1ed…ba6d to #638893,0xffcf…0fbb, common ancestor #638891,0x5b75…ce02
2023-03-28T07:04:11.902728Z  INFO substrate: ✨ Imported #638893 (0xffcf…0fbb)
2023-03-28T07:04:12.143499Z  INFO substrate: ✨ Imported #638894 (0xc5b0…2f43)
2023-03-28T07:04:12.393581Z  INFO substrate: ✨ Imported #638894 (0x8c5e…b19c)
2023-03-28T07:04:12.479996Z  INFO substrate: 💤 Idle (24 peers), best: #638894 (0xc5b0…2f43), finalized #638794 (0x171d…f190), ⬇ 18.6kiB/s ⬆ 1.1MiB/s
2023-03-28T07:04:12.490637Z  INFO single_disk_plot{disk_farm_index=0}: subspace_farmer::single_disk_plot: close time.busy=8.15s time.idle=1.38s
subspaced.service: Deactivated successfully.
subspaced.service: Consumed 10min 42.011s CPU time.

You can ignore the WARNs: they are known, they will be gone in the next releases, and they don't cause any trouble for the farmer. What you have, I think, is the same as mine: the farmer stops with no error.
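
For the curious: these warnings are libp2p's swarm refusing new inbound connections once a configured cap is reached, which is why they are harmless; rejected peers simply retry or connect elsewhere. A sketch of how such a cap is typically configured with the pre-0.52 libp2p ConnectionLimits API (subspace's actual networking setup may differ):

use libp2p::swarm::ConnectionLimits;

// A cap of 100 established inbound connections yields exactly the
// "ConnectionLimit { limit: 100, current: 100 }" lines above once the
// node is well connected: the 101st incoming dial is rejected.
fn inbound_limits() -> ConnectionLimits {
    ConnectionLimits::default().with_max_established_incoming(Some(100))
}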

@Insperative, it looks like the new version (v0.1.12) fixed the issue. Please check.

Yes, there is no crash after the update, but the node still lags about 100 blocks behind and never fully syncs: Node can't catch up to 100 units, doesn't fully sync - #3 by Insperative

Hello Insperative.

We shared an update on the thread you linked 🙂

Hello Fradique.

Thanks for the help 🙃


Same problem:

2023-03-29T16:22:56.709165Z INFO validate_config:parse_config: subspace_cli::config: close time.busy=57.3µs time.idle=1.87ms
2023-03-29T16:22:56.709218Z INFO validate_config: subspace_cli::config: close time.busy=65.9µs time.idle=1.92ms

There is a new version released within the last couple of days; please make sure you've updated. It seems to have solved this problem.

Also keep an eye out: there are v2 and v3 versions of the file, depending on how recent your processor is.
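
The v2/v3 suffix refers to x86-64 microarchitecture levels: the v3 build assumes AVX2 (plus BMI1/2, FMA, and friends), roughly Intel Haswell / AMD Excavator or newer. A quick way to check is std's CPU feature detection; /proc/cpuinfo carries the same information:

fn main() {
    // x86-64-v3 binaries assume AVX2 among other extensions; if this
    // prints "false", download the v2 build instead.
    #[cfg(target_arch = "x86_64")]
    println!("avx2 supported: {}", std::is_x86_feature_detected!("avx2"));
}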