[Bug]: streamingnode log too much
Is there an existing issue for this?
- [x] I have searched the existing issues
Environment
- Milvus version: 2.6.2
- Deployment mode(standalone or cluster): cluster
- MQ type(rocksmq, pulsar or kafka): woodpecker
- SDK version(e.g. pymilvus v2.0.0rc2): 2.6.2
- OS(Ubuntu or CentOS):
- CPU/Memory: 512m/2048Mi
- GPU: 0
- Others:
Current Behavior
Outputs a large volume of warnings such as `Call Sync, but storage is not writable, quick fail all append requests`.
Expected Behavior
It shouldn't constantly log this during a service restart; logging should stabilize quickly.
Steps To Reproduce
Restarting a running pod in EKS causes the error.
Milvus Log
Streaming node
streamingnode [2025/10/09 19:18:27.263 +00:00] [WARN] [grpclog/grpclog.go:155] ["[core][Channel #17 SubChannel #28]grpc: addrConn.createTransport failed to connect to {Addr: \"10.0.133.213:22222\", ServerName: \"streamingnode\", Attributes: {\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=50)>\" }, BalancerAttributes: {\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc0011ba7e0>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=50)>\" }}. Err: connection error: desc = \"transport: Error while dialing: dial tcp 10.0.133.213:22222: connect: connection refused\""]
streamingnode [2025/10/09 19:18:27.365 +00:00] [WARN] [grpclog/grpclog.go:155] ["[core][Channel #17 SubChannel #28]grpc: addrConn.createTransport failed to connect to {Addr: \"10.0.133.213:22222\", ServerName: \"streamingnode\", Attributes: {\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=50)>\" }, BalancerAttributes: {\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc0011ba7e0>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=50)>\" }}. Err: connection error: desc = \"transport: Error while dialing: dial tcp 10.0.133.213:22222: connect: connection refused\""]
streamingnode [2025/10/09 19:18:27.525 +00:00] [WARN] [grpclog/grpclog.go:155] ["[core][Channel #17 SubChannel #28]grpc: addrConn.createTransport failed to connect to {Addr: \"10.0.133.213:22222\", ServerName: \"streamingnode\", Attributes: {\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=50)>\" }, BalancerAttributes: {\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc0011ba7e0>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=50)>\" }}. Err: connection error: desc = \"transport: Error while dialing: dial tcp 10.0.133.213:22222: connect: connection refused\""]
streamingnode [2025/10/09 19:18:27.694 +00:00] [WARN] [server/logstore.go:276] ["get batch entries failed"] scope=LogStore,intent=GetBatchEntriesAdv,traceID=f9b3362583fae81e66f24173823ad7f8 [logId=1,segId=11,fromEntryId=5,maxEntries=200 [error="no more data"]] [stack="github.com/zilliztech/woodpecker/server.(*logStore).GetBatchEntriesAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/logstore.go:276\ngithub.com/zilliztech/woodpecker/woodpecker/client.(*logStoreClientLocal).ReadEntriesBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/client/logstore_client.go:64\ngithub.com/zilliztech/woodpecker/woodpecker/segment.(*segmentHandleImpl).ReadBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/segment/segment_handle.go:443\ngithub.com/zilliztech/woodpecker/woodpecker/log.(*logBatchReaderImpl).ReadNext\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/log/log_reader.go:162\ngithub.com/milvus-io/milvus/pkg/v2/streaming/walimpls/impls/wp.(*scannerImpl).executeConsumer\n\t/workspace/source/pkg/streaming/walimpls/impls/wp/scanner.go:55"]
streamingnode [2025/10/09 19:18:27.706 +00:00] [WARN] [server/logstore.go:276] ["get batch entries failed"] scope=LogStore,intent=GetBatchEntriesAdv,traceID=0b817beb4ac18ee37b8c5d9fee6f36bd [logId=3,segId=11,fromEntryId=5,maxEntries=200 [error="no more data"]] [stack="github.com/zilliztech/woodpecker/server.(*logStore).GetBatchEntriesAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/logstore.go:276\ngithub.com/zilliztech/woodpecker/woodpecker/client.(*logStoreClientLocal).ReadEntriesBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/client/logstore_client.go:64\ngithub.com/zilliztech/woodpecker/woodpecker/segment.(*segmentHandleImpl).ReadBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/segment/segment_handle.go:443\ngithub.com/zilliztech/woodpecker/woodpecker/log.(*logBatchReaderImpl).ReadNext\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/log/log_reader.go:162\ngithub.com/milvus-io/milvus/pkg/v2/streaming/walimpls/impls/wp.(*scannerImpl).executeConsumer\n\t/workspace/source/pkg/streaming/walimpls/impls/wp/scanner.go:55"]
streamingnode [2025/10/09 19:18:27.745 +00:00] [WARN] [grpclog/grpclog.go:155] ["[core][Channel #17 SubChannel #28]grpc: addrConn.createTransport failed to connect to {Addr: \"10.0.133.213:22222\", ServerName: \"streamingnode\", Attributes: {\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=50)>\" }, BalancerAttributes: {\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc0011ba7e0>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=50)>\" }}. Err: connection error: desc = \"transport: Error while dialing: dial tcp 10.0.133.213:22222: connect: connection refused\""]
streamingnode [2025/10/09 19:18:27.750 +00:00] [WARN] [objectstorage/reader_impl.go:889] ["Failed to stat block object"] scope=MinioFileReader,intent=readDataBlocks,traceID=a3189779c398c23c2b2aaf0d4ecd6cd1 [segmentFileKey=sound-troopers-db/wp/1/12,blockNumber=1 [error="Head \"https://s3.us-east-1.amazonaws.com/soundtroopers-vectordb-us-east-1-prod/sound-troopers-db/wp/1/12/1.blk\": context canceled"]] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileReaderAdv).readBlockBatchUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/reader_impl.go:889\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileReaderAdv).readDataBlocksUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/reader_impl.go:518\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileReaderAdv).ReadNextBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/reader_impl.go:454\ngithub.com/zilliztech/woodpecker/server/processor.(*segmentProcessor).ReadBatchEntriesAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/processor/segment_processor.go:216\ngithub.com/zilliztech/woodpecker/server.(*logStore).GetBatchEntriesAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/logstore.go:271\ngithub.com/zilliztech/woodpecker/woodpecker/client.(*logStoreClientLocal).ReadEntriesBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/client/logstore_client.go:64\ngithub.com/zilliztech/woodpecker/woodpecker/segment.(*segmentHandleImpl).ReadBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/segment/segment_handle.go:443\ngithub.com/zilliztech/woodpecker/woodpecker/log.(*logBatchReaderImpl).ReadNext\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/log/log_reader.go:162\ngithub.com/milvus-io/milvus/pkg/v2/streaming/walimpls/impls/wp.(*scannerImpl).executeConsumer\n\t/workspace/source/pkg/streaming/walimpls/impls/wp/scanner.go:55"]
streamingnode [2025/10/09 19:18:27.754 +00:00] [WARN] [adaptor/scanner_switchable.go:93] ["scanner consuming was interrpurted with error, start a backoff"] [module=streamingnode] [component=scanner] [name=recovery] [channel=sound-troopers-db-rootcoord-dml_5:rw@12] [startMessageID=11/0] [error="context canceled"]
streamingnode [2025/10/09 19:18:27.761 +00:00] [WARN] [objectstorage/reader_impl.go:889] ["Failed to stat block object"] scope=MinioFileReader,intent=readDataBlocks,traceID=96a3fc3b35f1f5fd115c22c4aadcc672 [segmentFileKey=sound-troopers-db/wp/3/12,blockNumber=1 [error="Head \"https://s3.us-east-1.amazonaws.com/soundtroopers-vectordb-us-east-1-prod/sound-troopers-db/wp/3/12/1.blk\": context canceled"]] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileReaderAdv).readBlockBatchUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/reader_impl.go:889\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileReaderAdv).readDataBlocksUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/reader_impl.go:518\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileReaderAdv).ReadNextBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/reader_impl.go:454\ngithub.com/zilliztech/woodpecker/server/processor.(*segmentProcessor).ReadBatchEntriesAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/processor/segment_processor.go:216\ngithub.com/zilliztech/woodpecker/server.(*logStore).GetBatchEntriesAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/logstore.go:271\ngithub.com/zilliztech/woodpecker/woodpecker/client.(*logStoreClientLocal).ReadEntriesBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/client/logstore_client.go:64\ngithub.com/zilliztech/woodpecker/woodpecker/segment.(*segmentHandleImpl).ReadBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/segment/segment_handle.go:443\ngithub.com/zilliztech/woodpecker/woodpecker/log.(*logBatchReaderImpl).ReadNext\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/log/log_reader.go:162\ngithub.com/milvus-io/milvus/pkg/v2/streaming/walimpls/impls/wp.(*scannerImpl).executeConsumer\n\t/workspace/source/pkg/streaming/walimpls/impls/wp/scanner.go:55"]
streamingnode [2025/10/09 19:18:27.761 +00:00] [INFO] [resolver/resolver_with_discoverer.go:189] ["service discover update, update resolver"] [component=grpc-resolver] [scheme=channel-assignment] [state="{\"Version\":{\"Global\":10,\"Local\":110},\"State\":{\"Addresses\":[{\"Addr\":\"10.0.156.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=51)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc00147d500>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=51)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}}"] [resolver_count=1]
streamingnode [2025/10/09 19:18:27.762 +00:00] [INFO] [resolver/watch_based_grpc_resolver.go:57] ["update resolver state success"] [component=grpc-resolver] [scheme=channel-assignment] [id=1] [state="{\"Addresses\":[{\"Addr\":\"10.0.156.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=51)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc00147d500>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=51)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}"]
streamingnode [2025/10/09 19:18:27.762 +00:00] [INFO] [resolver/resolver_with_discoverer.go:199] ["update resolver done"] [component=grpc-resolver] [scheme=channel-assignment]
streamingnode [2025/10/09 19:18:27.766 +00:00] [WARN] [adaptor/scanner_switchable.go:93] ["scanner consuming was interrpurted with error, start a backoff"] [module=streamingnode] [component=scanner] [name=recovery] [channel=sound-troopers-db-rootcoord-dml_14:rw@12] [startMessageID=11/0] [error="context canceled"]
streamingnode [2025/10/09 19:18:27.773 +00:00] [INFO] [resolver/resolver_with_discoverer.go:189] ["service discover update, update resolver"] [component=grpc-resolver] [scheme=channel-assignment] [state="{\"Version\":{\"Global\":10,\"Local\":111},\"State\":{\"Addresses\":[{\"Addr\":\"10.0.156.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=51)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc001f18bc0>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=51)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}}"] [resolver_count=1]
streamingnode [2025/10/09 19:18:27.773 +00:00] [INFO] [resolver/watch_based_grpc_resolver.go:57] ["update resolver state success"] [component=grpc-resolver] [scheme=channel-assignment] [id=1] [state="{\"Addresses\":[{\"Addr\":\"10.0.156.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=51)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc001f18bc0>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=51)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}"]
streamingnode [2025/10/09 19:18:27.773 +00:00] [INFO] [resolver/resolver_with_discoverer.go:199] ["update resolver done"] [component=grpc-resolver] [scheme=channel-assignment]
streamingnode [2025/10/09 19:18:27.777 +00:00] [INFO] [resolver/resolver_with_discoverer.go:189] ["service discover update, update resolver"] [component=grpc-resolver] [scheme=channel-assignment] [state="{\"Version\":{\"Global\":10,\"Local\":112},\"State\":{\"Addresses\":[{\"Addr\":\"10.0.156.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=51)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc001f193c0>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=51)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}}"] [resolver_count=1]
streamingnode [2025/10/09 19:18:27.777 +00:00] [INFO] [resolver/watch_based_grpc_resolver.go:57] ["update resolver state success"] [component=grpc-resolver] [scheme=channel-assignment] [id=1] [state="{\"Addresses\":[{\"Addr\":\"10.0.156.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=51)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc001f193c0>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=51)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}"]
streamingnode [2025/10/09 19:18:27.777 +00:00] [INFO] [resolver/resolver_with_discoverer.go:199] ["update resolver done"] [component=grpc-resolver] [scheme=channel-assignment]
streamingnode [2025/10/09 19:18:27.868 +00:00] [WARN] [server/logstore.go:276] ["get batch entries failed"] scope=LogStore,intent=GetBatchEntriesAdv,traceID=b9499c3bcefe1390fe7b36ef042fd168 [logId=1,segId=11,fromEntryId=5,maxEntries=200 [error="no more data"]] [stack="github.com/zilliztech/woodpecker/server.(*logStore).GetBatchEntriesAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/logstore.go:276\ngithub.com/zilliztech/woodpecker/woodpecker/client.(*logStoreClientLocal).ReadEntriesBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/client/logstore_client.go:64\ngithub.com/zilliztech/woodpecker/woodpecker/segment.(*segmentHandleImpl).ReadBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/segment/segment_handle.go:443\ngithub.com/zilliztech/woodpecker/woodpecker/log.(*logBatchReaderImpl).ReadNext\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/log/log_reader.go:162\ngithub.com/milvus-io/milvus/pkg/v2/streaming/walimpls/impls/wp.(*scannerImpl).executeConsumer\n\t/workspace/source/pkg/streaming/walimpls/impls/wp/scanner.go:55"]
stream closed: EOF for milvus/sound-troopers-db-milvus-streamingnode-56f7b9cf8f-dbnfz (config)
streamingnode [2025/10/09 19:18:27.917 +00:00] [WARN] [objectstorage/reader_impl.go:889] ["Failed to stat block object"] scope=MinioFileReader,intent=readDataBlocks,traceID=4220b37a4b3a03c9cadb9bcadc4ee43d [segmentFileKey=sound-troopers-db/wp/1/12,blockNumber=1 [error="context canceled"]] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileReaderAdv).readBlockBatchUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/reader_impl.go:889\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileReaderAdv).readDataBlocksUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/reader_impl.go:518\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileReaderAdv).ReadNextBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/reader_impl.go:454\ngithub.com/zilliztech/woodpecker/server/processor.(*segmentProcessor).ReadBatchEntriesAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/processor/segment_processor.go:216\ngithub.com/zilliztech/woodpecker/server.(*logStore).GetBatchEntriesAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/logstore.go:271\ngithub.com/zilliztech/woodpecker/woodpecker/client.(*logStoreClientLocal).ReadEntriesBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/client/logstore_client.go:64\ngithub.com/zilliztech/woodpecker/woodpecker/segment.(*segmentHandleImpl).ReadBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/segment/segment_handle.go:443\ngithub.com/zilliztech/woodpecker/woodpecker/log.(*logBatchReaderImpl).ReadNext\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/log/log_reader.go:162\ngithub.com/milvus-io/milvus/pkg/v2/streaming/walimpls/impls/wp.(*scannerImpl).executeConsumer\n\t/workspace/source/pkg/streaming/walimpls/impls/wp/scanner.go:55"]
streamingnode [2025/10/09 19:18:27.918 +00:00] [WARN] [server/logstore.go:276] ["get batch entries failed"] scope=LogStore,intent=GetBatchEntriesAdv,traceID=183a792d316f6ec2c62c76c46ccac5a3 [logId=3,segId=11,fromEntryId=5,maxEntries=200 [error="no more data"]] [stack="github.com/zilliztech/woodpecker/server.(*logStore).GetBatchEntriesAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/logstore.go:276\ngithub.com/zilliztech/woodpecker/woodpecker/client.(*logStoreClientLocal).ReadEntriesBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/client/logstore_client.go:64\ngithub.com/zilliztech/woodpecker/woodpecker/segment.(*segmentHandleImpl).ReadBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/segment/segment_handle.go:443\ngithub.com/zilliztech/woodpecker/woodpecker/log.(*logBatchReaderImpl).ReadNext\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/log/log_reader.go:162\ngithub.com/milvus-io/milvus/pkg/v2/streaming/walimpls/impls/wp.(*scannerImpl).executeConsumer\n\t/workspace/source/pkg/streaming/walimpls/impls/wp/scanner.go:55"]
streamingnode [2025/10/09 19:23:12.211 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=3fd2efb853d16174a3348033a9293c68 [segmentFileKey=sound-troopers-db/wp/12/12] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
streamingnode [2025/10/09 19:23:12.211 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=3fd2efb853d16174a3348033a9293c68 [segmentFileKey=sound-troopers-db/wp/12/12] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
Anything else?
- I observe this issue happening when etcd restarts. I have 3 etcd replicas, but I set the PDB on them to 2.
- It can also happen when I restart a streamingNode. I have 2 at the moment; when I restart one, it reports before shutdown that the WAL manager is failing, and then the other node starts to output a lot of logs.
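For reference, the etcd PDB mentioned above looks roughly like the sketch below. This is an illustrative manifest (names and labels are placeholders, not the exact one we apply); with `minAvailable: 2` out of 3 replicas, at most one etcd pod can be voluntarily evicted at a time, so etcd should keep quorum during node drains:

```yaml
# Illustrative PodDisruptionBudget for a 3-replica etcd StatefulSet.
# minAvailable: 2 means voluntary disruptions may evict at most one pod.
apiVersion: policy/v1
kind: PodDisruptionBudget
metadata:
  name: milvus-etcd-pdb      # placeholder name
  namespace: milvus
spec:
  minAvailable: 2
  selector:
    matchLabels:
      app.kubernetes.io/name: etcd   # adjust to your etcd chart's labels
```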
/assign @chyezh /unassign
@hung-phan
- Does it keep logging forever?
- Could you provide more logs for debugging?
streamingnode [2025/10/09 19:23:12.211 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=3fd2efb853d16174a3348033a9293c68 [segmentFileKey=sound-troopers-db/wp/12/12] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
I think the main issue is this. Did you check whether MinIO still has enough space? Does it last forever?
We use S3; this is our config:
apiVersion: milvus.io/v1beta1
kind: Milvus
metadata:
namespace: milvus
  name: my-release
labels:
app: milvus
spec:
mode: cluster
config:
log:
level: warn
minio:
bucketName: my_bucket
# we share multiple milvus clusters with same bucket
rootPath: /sound-troopers-db
useSSL: true
useIAM: true
dependencies:
msgStreamType: woodpecker
etcd:
inCluster:
deletionPolicy: Delete
pvcDeletion: true
values:
replicaCount: 3
persistence:
size: 20Gi # default pvc size
storage:
external: true
type: S3
endpoint: s3.us-east-1.amazonaws.com:443
components:
serviceAccountName: milvus-service-account
enableRollingUpdate: true
image: milvusdb/milvus:v2.6.2-gpu # Milvus image with gpu support
proxy:
serviceType: ClusterIP
replicas: -1
resources:
requests:
cpu: 512m
memory: 1Gi
limits:
cpu: 2048m
memory: 4Gi
mixCoord:
replicas: -1
resources:
requests:
cpu: 512m
memory: 2Gi
limits:
cpu: 2048m
memory: 8Gi
dataNode:
replicas: -1
resources:
requests:
cpu: 512m
memory: 2Gi
limits:
cpu: 2048m
memory: 8Gi
streamingNode:
replicas: -1
resources:
requests:
cpu: 512m
memory: 2Gi
limits:
cpu: 2048m
memory: 8Gi
queryNode:
replicas: -1
resources:
requests:
cpu: 512m
memory: 4Gi
limits:
cpu: 4096m
memory: 16Gi
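As an aside on the noisy gRPC transport warnings: grpc-go reads the standard `GRPC_GO_LOG_SEVERITY_LEVEL` / `GRPC_GO_LOG_VERBOSITY_LEVEL` environment variables, so a possible (untested) mitigation is to raise the gRPC severity on the streaming node. The `env` field on the component spec is an assumption about the operator version and may not be supported in yours:

```yaml
# Sketch only: quiet grpc-go connection warnings on the streaming node.
# Assumes the Milvus operator's component spec accepts an `env` list.
components:
  streamingNode:
    env:
      - name: GRPC_GO_LOG_SEVERITY_LEVEL
        value: error   # grpc-go then logs only errors, not connect warnings
```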
The log will continue for 15-20 minutes and then stabilize. This is what happens when one node restarts and the other node starts emitting logs:
streamingnode [2025/10/11 17:26:29.772 +00:00] [WARN] [handler/handler_client_impl.go:208] ["assignment not found"]
streamingnode [2025/10/11 17:26:29.811 +00:00] [WARN] [handler/handler_client_impl.go:208] ["assignment not found"]
streamingnode [2025/10/11 17:26:29.992 +00:00] [WARN] [handler/handler_client_impl.go:208] ["assignment not found"]
streamingnode [2025/10/11 17:26:29.997 +00:00] [WARN] [handler/handler_client_impl.go:208] ["assignment not found"]
streamingnode [2025/10/11 17:26:30.277 +00:00] [WARN] [handler/handler_client_impl.go:208] ["assignment not found"]
streamingnode [2025/10/11 17:26:30.280 +00:00] [WARN] [handler/handler_client_impl.go:208] ["assignment not found"]
streamingnode [2025/10/11 17:26:30.305 +00:00] [INFO] [resolver/resolver_with_discoverer.go:189] ["service discover update, update resolver"] [component=grpc-resolver] [scheme=channel-assignment] [state="{\"Version\":{\"Global\":156,\"Local\":243},\"State\":{\"Addresses\":[{\"Addr\":\"10.0.105.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" , \"<%!p(attributes.attributesKeyType=1)>\": \"<0xc006c78660>\" },\"Metadata\":null},{\"Addr\":\"10.0.126.18:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc006c78680>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}}"] [resolver_count=1]
streamingnode [2025/10/11 17:26:30.305 +00:00] [INFO] [resolver/watch_based_grpc_resolver.go:57] ["update resolver state success"] [component=grpc-resolver] [scheme=channel-assignment] [id=3] [state="{\"Addresses\":[{\"Addr\":\"10.0.105.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" , \"<%!p(attributes.attributesKeyType=1)>\": \"<0xc006c78660>\" },\"Metadata\":null},{\"Addr\":\"10.0.126.18:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc006c78680>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}"]
streamingnode [2025/10/11 17:26:30.305 +00:00] [INFO] [resolver/resolver_with_discoverer.go:199] ["update resolver done"] [component=grpc-resolver] [scheme=channel-assignment]
streamingnode [2025/10/11 17:26:30.337 +00:00] [INFO] [resolver/resolver_with_discoverer.go:189] ["service discover update, update resolver"] [component=grpc-resolver] [scheme=channel-assignment] [state="{\"Version\":{\"Global\":156,\"Local\":244},\"State\":{\"Addresses\":[{\"Addr\":\"10.0.126.18:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc0068d6e40>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"Metadata\":null},{\"Addr\":\"10.0.105.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc0068d6e60>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}}"] [resolver_count=1]
streamingnode [2025/10/11 17:26:30.337 +00:00] [INFO] [resolver/watch_based_grpc_resolver.go:57] ["update resolver state success"] [component=grpc-resolver] [scheme=channel-assignment] [id=3] [state="{\"Addresses\":[{\"Addr\":\"10.0.126.18:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc0068d6e40>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"Metadata\":null},{\"Addr\":\"10.0.105.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc0068d6e60>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}"]
streamingnode [2025/10/11 17:26:30.337 +00:00] [INFO] [resolver/resolver_with_discoverer.go:199] ["update resolver done"] [component=grpc-resolver] [scheme=channel-assignment]
streamingnode [2025/10/11 17:26:30.344 +00:00] [INFO] [resolver/resolver_with_discoverer.go:189] ["service discover update, update resolver"] [component=grpc-resolver] [scheme=channel-assignment] [state="{\"Version\":{\"Global\":156,\"Local\":245},\"State\":{\"Addresses\":[{\"Addr\":\"10.0.105.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc006c78cc0>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"Metadata\":null},{\"Addr\":\"10.0.126.18:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc006c78ce0>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}}"] [resolver_count=1]
streamingnode [2025/10/11 17:26:30.344 +00:00] [INFO] [resolver/watch_based_grpc_resolver.go:57] ["update resolver state success"] [component=grpc-resolver] [scheme=channel-assignment] [id=3] [state="{\"Addresses\":[{\"Addr\":\"10.0.105.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc006c78cc0>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"Metadata\":null},{\"Addr\":\"10.0.126.18:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc006c78ce0>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}"]
streamingnode [2025/10/11 17:26:30.344 +00:00] [INFO] [resolver/resolver_with_discoverer.go:199] ["update resolver done"] [component=grpc-resolver] [scheme=channel-assignment]
streamingnode [2025/10/11 17:26:30.351 +00:00] [INFO] [resolver/resolver_with_discoverer.go:189] ["service discover update, update resolver"] [component=grpc-resolver] [scheme=channel-assignment] [state="{\"Version\":{\"Global\":156,\"Local\":246},\"State\":{\"Addresses\":[{\"Addr\":\"10.0.105.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc004d76b20>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"Metadata\":null},{\"Addr\":\"10.0.126.18:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" , \"<%!p(attributes.attributesKeyType=1)>\": \"<0xc004d76b40>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}}"] [resolver_count=1]
streamingnode [2025/10/11 17:26:30.351 +00:00] [INFO] [resolver/watch_based_grpc_resolver.go:57] ["update resolver state success"] [component=grpc-resolver] [scheme=channel-assignment] [id=3] [state="{\"Addresses\":[{\"Addr\":\"10.0.105.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc004d76b20>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"Metadata\":null},{\"Addr\":\"10.0.126.18:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc004d76b40>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}"]
streamingnode [2025/10/11 17:26:30.351 +00:00] [INFO] [resolver/resolver_with_discoverer.go:199] ["update resolver done"] [component=grpc-resolver] [scheme=channel-assignment]
streamingnode [2025/10/11 17:26:30.355 +00:00] [INFO] [resolver/resolver_with_discoverer.go:189] ["service discover update, update resolver"] [component=grpc-resolver] [scheme=channel-assignment] [state="{\"Version\":{\"Global\":156,\"Local\":247},\"State\":{\"Addresses\":[{\"Addr\":\"10.0.105.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc004d76c20>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"Metadata\":null},{\"Addr\":\"10.0.126.18:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc004d76c40>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}}"] [resolver_count=1]
streamingnode [2025/10/11 17:26:30.355 +00:00] [INFO] [resolver/watch_based_grpc_resolver.go:57] ["update resolver state success"] [component=grpc-resolver] [scheme=channel-assignment] [id=3] [state="{\"Addresses\":[{\"Addr\":\"10.0.105.209:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc004d76c20>\" , \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=230)>\" },\"Metadata\":null},{\"Addr\":\"10.0.126.18:22222\",\"ServerName\":\"\",\"Attributes\":{\"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"BalancerAttributes\":{\"<%!p(attributes.attributesKeyType=1)>\": \"<0xc004d76c40>\" ,
│ \"<%!p(attributes.attributesKeyType=0)>\": \"<%!p(int64=240)>\" },\"Metadata\":null}],\"Endpoints\":null,\"ServiceConfig\":null,\"Attributes\":null}"] │
│ streamingnode [2025/10/11 17:26:30.355 +00:00] [INFO] [resolver/resolver_with_discoverer.go:199] ["update resolver done"] [component=grpc-resolver] [scheme=channel-assignment] │
streamingnode [2025/10/11 17:26:35.085 +00:00] [WARN] [server/logstore.go:276] ["get batch entries failed"] scope=LogStore,intent=GetBatchEntriesAdv,traceID=16704c2ab035e21fb2dc11f9820fe370 [logId=15,segId=46,fromEntryId=3,maxEntries=200 [error="no more data"]] [stack="github.com/zilliztech/woodpecker/server.(*logStore).GetBatchEntriesAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/logstore.go:276\ngithub.com/zilliztech/woodpecker/woodpecker/client.(*logStoreClientLocal).ReadEntriesBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/client/logstore_client.go:64\ngithub.com/zilliztech/woodpecker/woodpecker/segment.(*segmentHandleImpl).ReadBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/segment/segment_handle.go:443\ngithub.com/zilliztech/woodpecker/woodpecker/log.(*logBatchReaderImpl).ReadNext\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/log/log_reader.go:162\ngithub.com/milvus-io/milvus/pkg/v2/streaming/walimpls/impls/wp.(*scannerImpl).executeConsumer\n\t/workspace/source/pkg/streaming/walimpls/impls/wp/scanner.go:55"]
streamingnode [2025/10/11 17:26:35.088 +00:00] [WARN] [server/logstore.go:276] ["get batch entries failed"] scope=LogStore,intent=GetBatchEntriesAdv,traceID=9e6425fa7a5d2e5eeff262b6af0d3528 [logId=3,segId=50,fromEntryId=3,maxEntries=200 [error="no more data"]] [stack="github.com/zilliztech/woodpecker/server.(*logStore).GetBatchEntriesAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/logstore.go:276\ngithub.com/zilliztech/woodpecker/woodpecker/client.(*logStoreClientLocal).ReadEntriesBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/client/logstore_client.go:64\ngithub.com/zilliztech/woodpecker/woodpecker/segment.(*segmentHandleImpl).ReadBatchAdv\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/segment/segment_handle.go:443\ngithub.com/zilliztech/woodpecker/woodpecker/log.(*logBatchReaderImpl).ReadNext\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/woodpecker/log/log_reader.go:162\ngithub.com/milvus-io/milvus/pkg/v2/streaming/walimpls/impls/wp.(*scannerImpl).executeConsumer\n\t/workspace/source/pkg/streaming/walimpls/impls/wp/scanner.go:55"]
streamingnode [2025/10/11 17:26:35.590 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=464bf6c803c76d0a52628807e51076ef [segmentFileKey=sound-troopers-db/wp/15/46] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
streamingnode [2025/10/11 17:26:35.604 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=7afcd0032ec0bccb8b8b700bbe64410d [segmentFileKey=sound-troopers-db/wp/4/49] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
streamingnode [2025/10/11 17:26:35.609 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=9c20b2dbb0ece1c70bcb5c336bece0be [segmentFileKey=sound-troopers-db/wp/3/50] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
streamingnode [2025/10/11 17:26:35.623 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=d1457946b6f91908134d61c1aa4ba757 [segmentFileKey=sound-troopers-db/wp/16/47] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
streamingnode [2025/10/11 17:26:35.641 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=0bb451b1f8313259e434f049bd154ec2 [segmentFileKey=sound-troopers-db/wp/12/51] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
streamingnode [2025/10/11 17:26:35.643 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=b336848da79f769592949b038b079703 [segmentFileKey=sound-troopers-db/wp/14/52] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
streamingnode [2025/10/11 17:26:35.682 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=63221ca953b4d99cf718279d81b9be65 [segmentFileKey=sound-troopers-db/wp/1/50] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
streamingnode [2025/10/11 17:26:35.688 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=fc2bb46b7615e351306910b86eee1f8e [segmentFileKey=sound-troopers-db/wp/7/52] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
streamingnode [2025/10/11 17:26:35.790 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=7b75371cc0d7dfcbb15c5f9fd170d66f [segmentFileKey=sound-troopers-db/wp/15/46] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
streamingnode [2025/10/11 17:26:35.805 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=90ab1c72b11a0a827f331cd6d44b8d3d [segmentFileKey=sound-troopers-db/wp/4/49] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
streamingnode [2025/10/11 17:26:35.810 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=8a37459288f59a075ce870ad66435b94 [segmentFileKey=sound-troopers-db/wp/3/50] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
streamingnode [2025/10/11 17:26:35.824 +00:00] [WARN] [objectstorage/writer_impl.go:1237] ["Call Sync, but storage is not writable, quick fail all append requests"] scope=MinioFileWriter,intent=Sync,traceID=7315185570097774b0f41a1fda292588 [segmentFileKey=sound-troopers-db/wp/16/47] [stack="github.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).quickSyncFailUnsafe\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:1237\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).Sync\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:922\ngithub.com/zilliztech/woodpecker/server/storage/objectstorage.(*MinioFileWriter).run\n\t/root/go/pkg/mod/github.com/zilliztech/[email protected]/server/storage/objectstorage/writer_impl.go:585"]
Milvus runs multiple WALs across different streaming nodes, so when a streaming node restarts, Milvus recovers the WALs of the closing node on another one; some of these logs are expected during that handover. While the node is down, related clients will also log connection failures. That said, most of the noise appears to come from woodpecker. /assign @tinswzy please check whether this works as expected, and whether we should lower the log level of these messages.
Ok. I'll confirm and make some improvements.
After a flush retry failure, the log may keep printing redundant "sync error" messages (error = storage not writable) until the writer is cleaned up. This does not affect correctness. The log behavior will be fixed in the next version. Thanks for your feedback!
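One common way to tame this kind of tight failure loop, independent of whatever fix woodpecker actually ships, is to deduplicate repeated messages: emit the first occurrence, then only every nth repeat. A hedged illustrative sketch (the `suppressor` type is hypothetical, not woodpecker's code):

```go
package main

import (
	"fmt"
	"sync"
)

// suppressor emits the first occurrence of each distinct message and then
// only every nth repeat, so a tight failure loop produces a handful of
// log lines instead of hundreds per second.
type suppressor struct {
	mu    sync.Mutex
	every int
	seen  map[string]int // occurrence count per message
}

func newSuppressor(every int) *suppressor {
	return &suppressor{every: every, seen: make(map[string]int)}
}

// shouldLog reports whether this occurrence of msg should be emitted.
func (s *suppressor) shouldLog(msg string) bool {
	s.mu.Lock()
	defer s.mu.Unlock()
	n := s.seen[msg]
	s.seen[msg]++
	return n%s.every == 0 // emit occurrences 0, every, 2*every, ...
}

func main() {
	s := newSuppressor(5)
	emitted := 0
	for i := 0; i < 12; i++ { // simulate 12 identical sync failures
		if s.shouldLog("Call Sync, but storage is not writable") {
			emitted++
		}
	}
	fmt.Println("emitted:", emitted) // occurrences 0, 5, 10: three lines
}
```

A production version would usually suppress by time window rather than count (e.g. at most one line per key per second) and report how many repeats were dropped, similar to zap's built-in sampling.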
I ran 2.6.4 in prod and it still outputs a large volume of these logs until it stabilizes. cc @tinswzy
@tinswzy please cherry-pick 44934 to the 2.6 branch
The logging continues for 15-20 minutes and then stabilizes. This is what happens when one node restarts: the other node starts emitting these logs.
It does not "stabilize" for me. It just keeps printing 100+ warnings per second until I finally stop it.
(I am on 2.6.5)
@Classic298 If your log content is
wal is on shutdown
then it has already been fixed by #45154 and will be released in 2.6.6.
@chyezh it is Sync failed: segment storage not writable