Piece request to announcer provider failed, Unexpected end of file

Issue Report

Users running on the new version are facing an error while attempting to sync.

Environment

  • Operating System: Ubuntu
  • CPU Architecture: x86
  • RAM: 16GB
  • Storage: 512GB SSD
  • Plot Size: 50GB
  • Subspace Deployment Method: Docker

Problem

Steps to reproduce

  1. Start node & farmer
  2. Let it attempt to sync
  3. After 10-15 minutes this error will appear

Expected result

  • Sync farmer and node with no major errors

What happens instead

  • We get an error that keeps repeating while syncing is attempted:
subspace-farmer-1  | 2023-03-01T16:38:39.770028Z ERROR yamux::connection: 93a71c3b: socket error: decode error: i/o error: unexpected end of file    
subspace-farmer-1  | 2023-03-01T16:38:39.913900Z  WARN subspace_farmer::utils::farmer_provider_record_processor: Piece request to announcer provider failed. provider=12D3KooWNjYtDoqsRXZuwdVWXNBSWY16rgLjuwhaHNX1AqUyPrAm piece_index_hash=PieceIndexHash([213, 229, 45, 138, 213, 68, 203, 135, 31, 249, 189, 10, 111, 156, 141, 233, 42, 246, 151, 47, 181, 225, 220, 126, 212, 11, 161, 179, 3, 66, 221, 37]) error=ProtocolFailure(Refused)
subspace-farmer-1  | 2023-03-01T16:38:40.152169Z  WARN subspace_farmer::utils::farmer_provider_record_processor: Piece request to announcer provider failed. provider=12D3KooWNjYtDoqsRXZuwdVWXNBSWY16rgLjuwhaHNX1AqUyPrAm piece_index_hash=PieceIndexHash([139, 214, 14, 190, 150, 72, 201, 8, 215, 93, 230, 67, 213, 145, 33, 155, 241, 216, 52, 192, 162, 192, 243, 180, 0, 169, 23, 0, 13, 161, 70, 69]) error=ProtocolFailure(Refused)
subspace-node-1    | 2023-03-01 16:38:40 Accepting new connection 1/100
subspace-farmer-1  | 2023-03-01T16:38:40.597686Z  WARN subspace_farmer::utils::farmer_provider_record_processor: Piece request to announcer provider failed. provider=12D3KooWNjYtDoqsRXZuwdVWXNBSWY16rgLjuwhaHNX1AqUyPrAm piece_index_hash=PieceIndexHash([88, 195, 52, 42, 137, 109, 244, 169, 27, 185, 70, 128, 39, 131, 55, 126, 172, 9, 89, 129, 204, 181, 117, 122, 75, 144, 34, 192, 98, 4, 94, 71]) error=ProtocolFailure(Refused)
subspace-farmer-1  | 2023-03-01T16:38:41.019036Z  WARN subspace_farmer::utils::farmer_provider_record_processor: Piece request to announcer provider failed. provider=12D3KooWNjYtDoqsRXZuwdVWXNBSWY16rgLjuwhaHNX1AqUyPrAm piece_index_hash=PieceIndexHash([206, 41, 87, 155, 156, 178, 13, 138, 105, 235, 132, 75, 186, 182, 251, 96, 210, 252, 209, 79, 178, 128, 119, 186, 161, 101, 153, 152, 42, 252, 230, 103]) error=ProtocolFailure(Refused)

It is worth noting that these nodes are syncing, as far as telemetry shows.

We have had this issue reported by 3 users so far.

I have also seen this issue reported by @xorinox, who builds from source as well.

This is annoying, but it is a non-fatal error and shouldn’t really break anything.

cc @shamil for potential investigation on why this may happen and how to resolve it in the future.


The node belongs to me. Warnings and errors are putting up a fight in my log screen, but it is syncing slowly :slight_smile:


Maybe this can help.
Environment
Operating System: Ubuntu 20.04
CPU Architecture: x86
RAM: 16GB
Storage: 150GB SSD
Plot Size: 50GB
Subspace Deployment Method: cli-service

ERROR messages in logs:

2023-03-01T17:17:01.191649Z  INFO substrate: ⚙️  Syncing  1.6 bps, target=#248986 (32 peers), best: #5026 (0xc9ff…38fe), finalized #4926 (0xac06…0c9d), ⬇ 21.3kiB/s ⬆ 1.2kiB/s
2023-03-01T17:17:04.268190Z  INFO subspace_service::dsn: Processing a segment. segment_index=17
2023-03-01T17:17:06.192152Z  INFO substrate: ⚙️  Syncing  0.6 bps, target=#248988 (32 peers), best: #5029 (0xd1b9…f972), finalized #4929 (0x7729…006c), ⬇ 20.5kiB/s ⬆ 0.6kiB/s
2023-03-01T17:17:09.745140Z ERROR yamux::connection: 6bc6cc41: socket error: decode error: i/o error: unexpected end of file
2023-03-01T17:17:11.192718Z  INFO substrate: ⚙️  Syncing  1.6 bps, target=#248989 (32 peers), best: #5037 (0xcbf7…73b7), finalized #4937 (0x7399…2717), ⬇ 22.1kiB/s ⬆ 2.5kiB/s
2023-03-01T17:17:16.193397Z  INFO substrate: ⚙️  Syncing  1.2 bps, target=#248990 (32 peers), best: #5043 (0x4a42…1a0f), finalized #4943 (0xf05f…f6f2), ⬇ 17.4kiB/s ⬆ 1.3kiB/s
2023-03-01T17:17:21.193865Z  INFO substrate: ⚙️  Syncing  0.8 bps, target=#248991 (32 peers), best: #5047 (0x0161…5db6), finalized #4947 (0xdde1…9e68), ⬇ 18.8kiB/s ⬆ 0.9kiB/s
2023-03-01T17:17:26.194260Z  INFO substrate: ⚙️  Syncing  1.0 bps, target=#248993 (32 peers), best: #5052 (0x0eeb…e2ab), finalized #4952 (0x10b1…41c7), ⬇ 21.3kiB/s ⬆ 0.7kiB/s
2023-03-01T17:17:31.195745Z  INFO substrate: ⚙️  Syncing  1.1 bps, target=#248995 (32 peers), best: #5058 (0x7265…258f), finalized #4958 (0xbf38…1f79), ⬇ 33.5kiB/s ⬆ 8.4kiB/s
2023-03-01T17:17:36.196274Z  INFO substrate: ⚙️  Syncing  1.4 bps, target=#248996 (32 peers), best: #5065 (0xf77a…8c60), finalized #4964 (0x92e0…5836), ⬇ 20.6kiB/s ⬆ 1.5kiB/s
2023-03-01T17:17:39.465173Z  INFO subspace_service::dsn: Segment publishing was successful. segment_index=17
2023-03-01T17:17:41.196862Z  INFO substrate: ⚙️  Syncing  2.2 bps, target=#248996 (32 peers), best: #5076 (0x3876…4c09), finalized #4976 (0x181a…6b4e), ⬇ 19.0kiB/s ⬆ 0.8kiB/s
2023-03-01T17:17:46.197406Z  INFO substrate: ⚙️  Syncing  1.0 bps, target=#248997 (32 peers), best: #5081 (0x9ba5…0225), finalized #4981 (0x62d8…725f), ⬇ 19.8kiB/s ⬆ 0.8kiB/s
2023-03-01T17:17:51.197948Z  INFO substrate: ⚙️  Syncing  1.2 bps, target=#248998 (32 peers), best: #5087 (0x80c1…4dc7), finalized #4987 (0x4667…300c), ⬇ 18.6kiB/s ⬆ 1.0kiB/s
2023-03-01T17:17:56.198502Z  INFO substrate: ⚙️  Syncing  1.2 bps, target=#249000 (32 peers), best: #5093 (0x2df6…4939), finalized #4993 (0x5e9a…865d), ⬇ 20.0kiB/s ⬆ 1.1kiB/s
2023-03-01T17:18:01.199015Z  INFO substrate: ⚙️  Syncing  1.0 bps, target=#249002 (32 peers), best: #5098 (0x3650…9114), finalized #4998 (0x3eac…ab33), ⬇ 19.6kiB/s ⬆ 2.4kiB/s
2023-03-01T17:18:06.199641Z  INFO substrate: ⚙️  Syncing  1.0 bps, target=#249003 (32 peers), best: #5103 (0x2e06…acc0), finalized #5003 (0xa3c8…4463), ⬇ 19.7kiB/s ⬆ 1.6kiB/s
2023-03-01T17:18:11.200135Z  INFO substrate: ⚙️  Syncing  1.2 bps, target=#249005 (32 peers), best: #5109 (0x02ed…d69c), finalized #5009 (0x1533…1150), ⬇ 16.3kiB/s ⬆ 0.9kiB/s

But it keeps syncing OK after that.

For cleaner logs on Docker, you can use:
docker compose logs --tail=1000 -f | grep -E 'PrimaryChain|INFO'
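Note the `|` alternation in the pattern: it needs extended regular expressions, so use `grep -E` (or escape it as `\|` with basic grep). A minimal self-contained check of the filter, using stand-in log lines rather than real docker output:

```shell
# Stand-in log lines (the real ones would come from `docker compose logs`)
printf '%s\n' \
  '2023-03-01 16:38:40 INFO substrate: Syncing' \
  '2023-03-01 16:38:41 ERROR yamux::connection: socket error' \
  '2023-03-01 16:38:42 PrimaryChain: imported block' |
  grep -E 'PrimaryChain|INFO'
# Keeps the INFO and PrimaryChain lines; drops the ERROR line
```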