[Bug]: Checking through Attu, all Milvus collections are being loaded and the progress remains at 0%
Is there an existing issue for this?
- [X] I have searched the existing issues
Environment
- Milvus version: 2.4.0-rc.1
- Deployment mode(standalone or cluster):
- MQ type(rocksmq, pulsar or kafka):
- SDK version(e.g. pymilvus v2.0.0rc2):
- OS(Ubuntu or CentOS):
- CPU/Memory:
- GPU:
- Others:
Current Behavior
When I started the Milvus service with docker compose up -d, I saw through Attu that all of Milvus's collections were being loaded and the progress remained at 0%.
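To confirm the symptom outside Attu, the load state can also be checked from a pymilvus client. This is only a minimal sketch, assuming pymilvus 2.4.x and a standalone instance on the default port 19530; the collection names are read from the server, nothing here is specific to this report.

```python
# Sketch: list every collection and print its load state / progress.
# Assumptions: pymilvus 2.4.x client, Milvus standalone on localhost:19530.
from pymilvus import connections, utility

connections.connect(host="127.0.0.1", port="19530")

for name in utility.list_collections():
    state = utility.load_state(name)             # e.g. LoadState.Loading
    progress = utility.loading_progress(name)    # e.g. {'loading_progress': '0%'}
    print(f"{name}: state={state}, progress={progress}")
```

If every collection also reports a loading state with 0% progress here, the problem is on the server side rather than in Attu.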
Expected Behavior
milvus-standalone | [2024/04/01 07:50:20.986 +00:00] [WARN] [querycoordv2/services.go:822] ["failed to get replica info"] [traceID=f8b919c471737b5e04fcd7cc1a92ef8c] [collectionID=448526295529002681] [replica=448526296074944530] [error="failed to get channels, collection not loaded: collection not found[collection=448526295529002681]"] [errorVerbose="failed to get channels, collection not loaded: collection not found[collection=448526295529002681]\n(1) attached stack trace\n -- stack trace:\n | github.com/milvus-io/milvus/pkg/util/merr.WrapErrCollectionNotFound\n | \t/go/src/github.com/milvus-io/milvus/pkg/util/merr/utils.go:416\n | github.com/milvus-io/milvus/internal/querycoordv2.(*Server).fillReplicaInfo\n | \t/go/src/github.com/milvus-io/milvus/internal/querycoordv2/handlers.go:320\n | github.com/milvus-io/milvus/internal/querycoordv2.(*Server).GetReplicas\n | \t/go/src/github.com/milvus-io/milvus/internal/querycoordv2/services.go:820\n | github.com/milvus-io/milvus/internal/distributed/querycoord.(*Server).GetReplicas\n | \t/go/src/github.com/milvus-io/milvus/internal/distributed/querycoord/service.go:402\n | github.com/milvus-io/milvus/internal/proto/querypb._QueryCoord_GetReplicas_Handler.func1\n | \t/go/src/github.com/milvus-io/milvus/internal/proto/querypb/query_coord.pb.go:6313\n | github.com/milvus-io/milvus/pkg/util/interceptor.ServerIDValidationUnaryServerInterceptor.func1\n | \t/go/src/github.com/milvus-io/milvus/pkg/util/interceptor/server_id_interceptor.go:54\n | github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1.1.1\n | \t/go/pkg/mod/github.com/grpc-ecosystem/[email protected]/chain.go:25\n | github.com/milvus-io/milvus/pkg/util/interceptor.ClusterValidationUnaryServerInterceptor.func1\n | \t/go/src/github.com/milvus-io/milvus/pkg/util/interceptor/cluster_interceptor.go:48\n | github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1.1.1\n | \t/go/pkg/mod/github.com/grpc-ecosystem/[email protected]/chain.go:25\n | github.com/milvus-io/milvus/pkg/util/logutil.UnaryTraceLoggerInterceptor\n | \t/go/src/github.com/milvus-io/milvus/pkg/util/logutil/grpc_interceptor.go:23\n | github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1.1.1\n | \t/go/pkg/mod/github.com/grpc-ecosystem/[email protected]/chain.go:25\n | go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.UnaryServerInterceptor.func1\n | \t/go/pkg/mod/go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/[email protected]/interceptor.go:342\n | github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1.1.1\n | \t/go/pkg/mod/github.com/grpc-ecosystem/[email protected]/chain.go:25\n | github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1\n | \t/go/pkg/mod/github.com/grpc-ecosystem/[email protected]/chain.go:34\n | github.com/milvus-io/milvus/internal/proto/querypb._QueryCoord_GetReplicas_Handler\n | \t/go/src/github.com/milvus-io/milvus/internal/proto/querypb/query_coord.pb.go:6315\n | google.golang.org/grpc.(*Server).processUnaryRPC\n | \t/go/pkg/mod/google.golang.org/[email protected]/server.go:1360\n | google.golang.org/grpc.(*Server).handleStream\n | \t/go/pkg/mod/google.golang.org/[email protected]/server.go:1737\n | google.golang.org/grpc.(*Server).serveStreams.func1.1\n | \t/go/pkg/mod/google.golang.org/[email protected]/server.go:982\n | runtime.goexit\n | \t/usr/local/go/src/runtime/asm_amd64.s:1598\nWraps: (2) failed to get channels, collection not loaded\nWraps: (3) collection not 
found[collection=448526295529002681]\nError types: (1) *withstack.withStack (2) *errutil.withPrefix (3) merr.milvusError"]
Steps To Reproduce
No response
Milvus Log
No response
Anything else?
No response
Looks like the channel is not loaded. @smallshallot could you offer server-side logs so we can understand why this happened?
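While the logs are being collected, one thing that can be tried is releasing the collection and triggering a fresh load. This is only a sketch under the assumption that the stuck load task can simply be replaced; "my_collection" is a placeholder name and pymilvus 2.4.x is assumed, so treat it as a diagnostic step rather than a confirmed fix.

```python
# Sketch: re-trigger a stuck load.
# Assumptions: pymilvus 2.4.x, "my_collection" is a placeholder, data is intact.
from pymilvus import connections, Collection, utility

connections.connect(host="127.0.0.1", port="19530")

coll = Collection("my_collection")
coll.release()   # cancel the stuck load task
coll.load()      # ask QueryCoord to schedule a new load
utility.wait_for_loading_complete("my_collection", timeout=120)
```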
@smallshallot Could you please refer to this doc to export the full Milvus logs for investigation? For Milvus installed with docker-compose, you can use docker-compose logs > milvus.log to export the logs.
/assign @smallshallot
/unassign
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Rotten issues close after 30d of inactivity. Reopen the issue with /reopen.