
[Bug]: Milvus search exception

Open HuaJieHappy opened this issue 1 year ago • 12 comments

Is there an existing issue for this?

- [X] I have searched the existing issues

Environment

- Milvus version: 2.2.12
- Deployment mode (standalone or cluster): standalone
- MQ type (rocksmq, pulsar, or kafka): default
- SDK version (e.g. pymilvus v2.0.0rc2): 2.2.12
- OS (Ubuntu or CentOS): CentOS 7.5
- CPU/Memory: 8C/32G
- GPU: No
- Others:

Current Behavior

The application was stress-tested with 30 concurrent users, and an error occurred during the query method. The error message is:

```
The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/chatfile/app/service/chat_sse_support/multiple_recall.py", line 41, in multiple_recall
    milvus_docs, milvus_image_ids = milvus_search_based_service(data, question_embedding, search_limit,
  File "/chatfile/app/recall_pics/milvus_recall.py", line 167, in milvus_search_based_service
    result = search_vectors(collection=collection, vectors_to_search=content_embedding, search_params=search_params,
  File "/chatfile/app/service/chat/milvus_search.py", line 19, in search_vectors
    results = collection.search(data=[vectors_to_search], param=search_params, limit=milvus_search_limit,
  File "/usr/local/lib/python3.8/site-packages/pymilvus/orm/collection.py", line 629, in search
    res = conn.search(self._name, data, anns_field, param, limit, expr,
  File "/usr/local/lib/python3.8/site-packages/pymilvus/decorators.py", line 109, in handler
    raise e
  File "/usr/local/lib/python3.8/site-packages/pymilvus/decorators.py", line 105, in handler
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/pymilvus/decorators.py", line 136, in handler
    ret = func(self, *args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/pymilvus/decorators.py", line 56, in handler
    raise MilvusException(message=str(e)) from e
pymilvus.exceptions.MilvusException: <MilvusException: (code=1, message=<_InactiveRpcError of RPC that terminated with:
    status = StatusCode.CANCELLED
    details = "Channel closed!"
    debug_error_string = "UNKNOWN:Channel closed! {created_time:"2024-10-09T19:01:25.108709223+08:00", grpc_message:"Channel closed!", grpc_status:1}"
>
)>
```
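For readers debugging the same failure: the `Channel closed!` comes from the client-side gRPC channel, so one mitigation is to retry the search on a fresh connection. A minimal sketch, assuming pymilvus 2.2.x; the alias, endpoint, and `anns_field` below are placeholders, not values from this report:

```python
import time

from pymilvus import Collection, MilvusException, connections

def search_with_retry(collection: Collection, vectors, search_params, limit,
                      retries=3, backoff=0.5):
    """Retry collection.search() after a dropped gRPC channel.

    On MilvusException (e.g. "Channel closed!"), reconnect the default
    alias and try again with linear backoff.
    """
    for attempt in range(retries):
        try:
            return collection.search(
                data=vectors,
                anns_field="embedding",  # hypothetical vector field name
                param=search_params,
                limit=limit,
            )
        except MilvusException:
            if attempt == retries - 1:
                raise
            time.sleep(backoff * (attempt + 1))
            connections.disconnect("default")  # drop the dead channel
            connections.connect("default", host="127.0.0.1", port="19530")  # placeholder endpoint
```

This only papers over the symptom; it does not explain why the channel is being closed under load.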

Expected Behavior

Is it necessary to modify the configuration or to upgrade the version?

Steps To Reproduce

Use the Milvus default config.

Milvus Log

No response

Anything else?

No response

HuaJieHappy avatar Oct 16 '24 03:10 HuaJieHappy

@HuaJieHappy Please refer to this doc to export the complete Milvus logs for investigation. For Milvus installed with docker-compose, you can use `docker-compose logs > milvus.log` to export them.

Also quick questions:

  1. Do you have write requests during the stress testing?
  2. Do you have any metrics on resource usage and the Milvus pods? If convenient, I suggest you upgrade to 2.3.22 or 2.4.13, as the running Milvus 2.2.12 is quite old.

/assign @HuaJieHappy
/unassign

yanliang567 avatar Oct 16 '24 03:10 yanliang567

Do we have a complete upgrade plan, e.g., from 2.2.12 to 2.4.13? The data can't be lost.

HuaJieHappy avatar Oct 16 '24 05:10 HuaJieHappy

How can Milvus be upgraded from version 2.2.12 to 2.4.13 (or another version) while preserving the data and ensuring that queries can still be executed?

HuaJieHappy avatar Oct 16 '24 05:10 HuaJieHappy

Our suggestion is to upgrade from 2.2.12 to 2.3.22 (the latest 2.3) and see how it goes.

xiaofan-luan avatar Oct 18 '24 01:10 xiaofan-luan
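As a hedged post-upgrade check (not an official procedure), one can confirm the server version and run a smoke-test query. A minimal sketch, assuming pymilvus ≥ 2.2; the endpoint, collection name, field name, and vector dimension are placeholders:

```python
from pymilvus import Collection, connections, utility

connections.connect("default", host="127.0.0.1", port="19530")  # placeholder endpoint

# Should report a 2.3.x build once the upgrade has taken effect.
print(utility.get_server_version())

col = Collection("my_collection")  # hypothetical collection name
col.load()
res = col.search(
    data=[[0.0] * 768],  # dummy query vector; 768 is a placeholder dimension
    anns_field="embedding",  # hypothetical field name
    param={"metric_type": "L2", "params": {"nprobe": 10}},
    limit=1,
)
print(res)
```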

If there are any issues, we can help investigate. All Zilliz Cloud instances have already been upgraded to 2.4, and 2.3 is already at its end of life.

xiaofan-luan avatar Oct 18 '24 01:10 xiaofan-luan

```
milvus-standalone | [2024/10/18 02:17:12.614 +00:00] [WARN] [grpcclient/client.go:341] ["ClientBase ReCall grpc first call get error"] [role=datacoord] [error="err: rpc error: code = Canceled desc = context canceled
, /go/src/github.com/milvus-io/milvus/internal/util/trace/stack_trace.go:51 github.com/milvus-io/milvus/internal/util/trace.StackTrace
/go/src/github.com/milvus-io/milvus/internal/util/grpcclient/client.go:340 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).ReCall
/go/src/github.com/milvus-io/milvus/internal/distributed/datacoord/client/client.go:435 github.com/milvus-io/milvus/internal/distributed/datacoord/client.(*Client).GetRecoveryInfoV2
/go/src/github.com/milvus-io/milvus/internal/indexcoord/index_coord.go:855 github.com/milvus-io/milvus/internal/indexcoord.(*IndexCoord).getIndexedStats
/go/src/github.com/milvus-io/milvus/internal/indexcoord/index_coord.go:936 github.com/milvus-io/milvus/internal/indexcoord.(*IndexCoord).DescribeIndex
/go/src/github.com/milvus-io/milvus/internal/distributed/indexcoord/service.go:279 github.com/milvus-io/milvus/internal/distributed/indexcoord.(*Server).DescribeIndex
/go/src/github.com/milvus-io/milvus/internal/proto/indexpb/index_coord.pb.go:2669 github.com/milvus-io/milvus/internal/proto/indexpb._IndexCoord_DescribeIndex_Handler.func1
/go/src/github.com/milvus-io/milvus/internal/util/interceptor/cluster_interceptor.go:69 github.com/milvus-io/milvus/internal/util/interceptor.ClusterValidationUnaryServerInterceptor.func1
/go/pkg/mod/github.com/grpc-ecosystem/[email protected]/chain.go:25 github.com/grpc-ecosystem/go-grpc-middleware.ChainUnaryServer.func1.1.1
/go/src/github.com/milvus-io/milvus/internal/util/logutil/grpc_interceptor.go:22 github.com/milvus-io/milvus/internal/util/logutil.UnaryTraceLoggerInterceptor
"]
```

@xiaofan-luan

HuaJieHappy avatar Oct 18 '24 02:10 HuaJieHappy

There are 15 threads calling the search function.

HuaJieHappy avatar Oct 18 '24 02:10 HuaJieHappy
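A side note on the 15-thread setup: sharing one gRPC channel across many threads means a single dropped channel fails every concurrent search at once, which matches the CANCELLED / `Channel closed!` symptom above. One hedged workaround is a connection alias per worker; a sketch assuming pymilvus 2.2.x, with the collection name, field name, and endpoint as placeholders:

```python
import threading

from pymilvus import Collection, connections

def search_worker(thread_id, vectors, search_params, limit, results):
    # One alias per thread -> one gRPC channel per thread, so a single
    # dropped channel no longer takes down every concurrent search.
    alias = f"search-{thread_id}"
    connections.connect(alias, host="127.0.0.1", port="19530")  # placeholder endpoint
    try:
        col = Collection("my_collection", using=alias)  # hypothetical collection name
        results[thread_id] = col.search(
            data=vectors,
            anns_field="embedding",  # hypothetical field name
            param=search_params,
            limit=limit,
        )
    finally:
        connections.disconnect(alias)

# Usage: spawn 15 workers, one alias each (dummy vectors for illustration).
results = {}
params = {"metric_type": "L2", "params": {"nprobe": 10}}
threads = [threading.Thread(target=search_worker,
                            args=(i, [[0.0] * 768], params, 5, results))
           for i in range(15)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The trade-off is more open connections on the server side, so this is only worth trying if the shared-channel setup is confirmed to be the bottleneck.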

> Our suggestion is to upgrade from 2.2.12 to 2.3.22 (the latest 2.3) and see how it goes.

Can the previous data be retained after the upgrade?

HuaJieHappy avatar Oct 18 '24 02:10 HuaJieHappy

Yes, it should be fully compatible.

xiaofan-luan avatar Oct 18 '24 06:10 xiaofan-luan

Doing a backup beforehand is recommended, just in case.

xiaofan-luan avatar Oct 18 '24 06:10 xiaofan-luan
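Alongside a proper backup, a cheap pre-upgrade sanity check is to snapshot per-collection row counts and diff them after the upgrade. A minimal sketch; the endpoint is a placeholder:

```python
import json

from pymilvus import Collection, connections, utility

connections.connect("default", host="127.0.0.1", port="19530")  # placeholder endpoint

# Record num_entities for every collection so the figures can be
# compared against the post-upgrade cluster.
counts = {name: Collection(name).num_entities for name in utility.list_collections()}

with open("pre_upgrade_counts.json", "w") as f:
    json.dump(counts, f, indent=2)
print(counts)
```

Matching counts do not prove the data is intact, but a mismatch is an immediate red flag before anything else is debugged.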

> Doing a backup beforehand is recommended, just in case.

Using the Milvus backup tools?

HuaJieHappy avatar Oct 18 '24 06:10 HuaJieHappy

> milvus-standalone | [2024/10/18 02:17:12.614 +00:00] [WARN] [grpcclient/client.go:341] ["ClientBase ReCall grpc first call get error"] [role=datacoord] […full log quoted in the earlier comment] @xiaofan-luan

Need help, please. @yanliang567

HuaJieHappy avatar Oct 18 '24 06:10 HuaJieHappy

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. Rotten issues close after 30d of inactivity. Reopen the issue with /reopen.

stale[bot] avatar Nov 17 '24 07:11 stale[bot]

/reopen

dbc-2024 avatar Nov 22 '24 09:11 dbc-2024