
[Bug]: [perf][cluster] Milvus builds the DiskANN index, and the indexnode OOMs many times

Open · jingkl opened this issue 1 year ago · 3 comments

Is there an existing issue for this?

  • [X] I have searched the existing issues

Environment

- Milvus version:2.2.0-20230317-bbc21fe8
- Deployment mode(standalone or cluster):cluster
- MQ type(rocksmq, pulsar or kafka):    
- SDK version(e.g. pymilvus v2.0.0rc2):
- OS(Ubuntu or CentOS): 
- CPU/Memory: 
- GPU: 
- Others:

Current Behavior

release_name_prefix perf-cluster-1679185800 deploy_config fouramf-server-cluster-8c16m case_params: fouramf-client-gist1m-concurrent-diskann other_params: --milvus_tag_prefix=2.2.0 -s --deploy_mode=cluster case_name: test_concurrent_locust_custom_parameters

perf-cluster-1685800-3-96-9029-etcd-0                             1/1     Running                  0                 26h     10.104.1.2      4am-node10   <none>           <none>
perf-cluster-1685800-3-96-9029-etcd-1                             1/1     Running                  0                 26h     10.104.5.201    4am-node12   <none>           <none>
perf-cluster-1685800-3-96-9029-etcd-2                             1/1     Running                  0                 26h     10.104.9.159    4am-node14   <none>           <none>
perf-cluster-1685800-3-96-9029-milvus-datacoord-6dc8bb67f6qh82m   1/1     Running                  1 (26h ago)       26h     10.104.13.100   4am-node16   <none>           <none>
perf-cluster-1685800-3-96-9029-milvus-datanode-f9d5786bc-q2dkr    1/1     Running                  1 (26h ago)       26h     10.104.12.219   4am-node17   <none>           <none>
perf-cluster-1685800-3-96-9029-milvus-indexcoord-7d6bdbd4bjq8sv   1/1     Running                  1 (26h ago)       26h     10.104.14.49    4am-node18   <none>           <none>
perf-cluster-1685800-3-96-9029-milvus-indexnode-768477bcd4lgpjx   1/1     Running                  192 (2m54s ago)   26h     10.104.14.51    4am-node18   <none>           <none>
perf-cluster-1685800-3-96-9029-milvus-proxy-75754559b9-scqfd      1/1     Running                  1 (26h ago)       26h     10.104.14.48    4am-node18   <none>           <none>
perf-cluster-1685800-3-96-9029-milvus-querycoord-66985cc4bck89z   1/1     Running                  1 (26h ago)       26h     10.104.13.99    4am-node16   <none>           <none>
perf-cluster-1685800-3-96-9029-milvus-querynode-588f45479bnlvtf   1/1     Running                  0                 26h     10.104.13.101   4am-node16   <none>           <none>
perf-cluster-1685800-3-96-9029-milvus-rootcoord-7879fdbb9dcjr5l   1/1     Running                  1 (26h ago)       26h     10.104.14.50    4am-node18   <none>           <none>
perf-cluster-1685800-3-96-9029-minio-0                            1/1     Running                  0                 26h     10.104.6.134    4am-node13   <none>           <none>
perf-cluster-1685800-3-96-9029-minio-1                            1/1     Running                  0                 26h     10.104.4.229    4am-node11   <none>           <none>
perf-cluster-1685800-3-96-9029-minio-2                            1/1     Running                  0                 26h     10.104.5.205    4am-node12   <none>           <none>
perf-cluster-1685800-3-96-9029-minio-3                            1/1     Running                  0                 26h     10.104.1.5      4am-node10   <none>           <none>
perf-cluster-1685800-3-96-9029-pulsar-bookie-0                    1/1     Running                  0                 26h     10.104.6.137    4am-node13   <none>           <none>
perf-cluster-1685800-3-96-9029-pulsar-bookie-1                    1/1     Running                  0                 26h     10.104.1.7      4am-node10   <none>           <none>
perf-cluster-1685800-3-96-9029-pulsar-bookie-2                    1/1     Running                  0                 26h     10.104.9.168    4am-node14   <none>           <none>
perf-cluster-1685800-3-96-9029-pulsar-bookie-init-869c5           0/1     Completed                0                 26h     10.104.6.122    4am-node13   <none>           <none>
perf-cluster-1685800-3-96-9029-pulsar-broker-0                    1/1     Running                  0                 26h     10.104.1.231    4am-node10   <none>           <none>
perf-cluster-1685800-3-96-9029-pulsar-proxy-0                     1/1     Running                  1 (19h ago)       26h     10.104.6.126    4am-node13   <none>           <none>
perf-cluster-1685800-3-96-9029-pulsar-pulsar-init-2mchf           0/1     Completed                0                 26h     10.104.1.226    4am-node10   <none>           <none>
perf-cluster-1685800-3-96-9029-pulsar-recovery-0                  1/1     Running                  1 (19h ago)       26h     10.104.6.123    4am-node13   <none>           <none>
perf-cluster-1685800-3-96-9029-pulsar-zookeeper-0                 1/1     Running                  0                 26h     10.104.9.157    4am-node14   <none>           <none>
perf-cluster-1685800-3-96-9029-pulsar-zookeeper-1                 1/1     Running                  0                 26h     10.104.4.252    4am-node11   <none>           <none>
perf-cluster-1685800-3-96-9029-pulsar-zookeeper-2                 1/1     Running                  0                 26h     10.104.5.240    4am-node12   <none>           <none>
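The indexnode pod above has restarted 192 times in 26 hours, which matches the reported OOM behavior. A minimal sketch for confirming the kill reason from the cluster, assuming the kubernetes Python client and kubeconfig access; the namespace below is a placeholder:

```python
# Check whether the indexnode restarts were OOM kills by reading the last
# terminated state of its containers. Namespace is a placeholder.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when run in-cluster
v1 = client.CoreV1Api()

pod = v1.read_namespaced_pod(
    name="perf-cluster-1685800-3-96-9029-milvus-indexnode-768477bcd4lgpjx",
    namespace="qa-milvus",  # placeholder namespace
)
for cs in pod.status.container_statuses or []:
    terminated = cs.last_state.terminated
    if terminated is not None:
        # An OOM-killed container reports reason "OOMKilled" and exit code 137.
        print(cs.name, cs.restart_count, terminated.reason, terminated.exit_code)
```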

Indexnode: screenshot 2023-03-20 11:06:37 (image attached)

client log:

[2023-03-19 00:43:06,888 -  INFO - fouram]: [Base] Number of vectors in the collection(fouram_u4Bggbei): 987498 (base.py:313)
[2023-03-19 00:43:06,914 -  INFO - fouram]: [Base] Start inserting, ids: 995000 - 995999, data size: 1,000,000 (base.py:161)
[2023-03-19 00:43:07,098 -  INFO - fouram]: [Time] Collection.insert run in 0.1834s (api_request.py:41)
[2023-03-19 00:43:07,101 -  INFO - fouram]: [Base] Number of vectors in the collection(fouram_u4Bggbei): 987498 (base.py:313)
[2023-03-19 00:43:07,130 -  INFO - fouram]: [Base] Start inserting, ids: 996000 - 996999, data size: 1,000,000 (base.py:161)
[2023-03-19 00:43:07,462 -  INFO - fouram]: [Time] Collection.insert run in 0.3315s (api_request.py:41)
[2023-03-19 00:43:07,465 -  INFO - fouram]: [Base] Number of vectors in the collection(fouram_u4Bggbei): 987498 (base.py:313)
[2023-03-19 00:43:07,493 -  INFO - fouram]: [Base] Start inserting, ids: 997000 - 997999, data size: 1,000,000 (base.py:161)
[2023-03-19 00:43:07,813 -  INFO - fouram]: [Time] Collection.insert run in 0.3198s (api_request.py:41)
[2023-03-19 00:43:07,817 -  INFO - fouram]: [Base] Number of vectors in the collection(fouram_u4Bggbei): 987498 (base.py:313)
[2023-03-19 00:43:07,845 -  INFO - fouram]: [Base] Start inserting, ids: 998000 - 998999, data size: 1,000,000 (base.py:161)
[2023-03-19 00:43:08,294 -  INFO - fouram]: [Time] Collection.insert run in 0.4485s (api_request.py:41)
[2023-03-19 00:43:08,296 -  INFO - fouram]: [Base] Number of vectors in the collection(fouram_u4Bggbei): 987498 (base.py:313)
[2023-03-19 00:43:08,323 -  INFO - fouram]: [Base] Start inserting, ids: 999000 - 999999, data size: 1,000,000 (base.py:161)
[2023-03-19 00:43:08,504 -  INFO - fouram]: [Time] Collection.insert run in 0.1808s (api_request.py:41)
[2023-03-19 00:43:08,506 -  INFO - fouram]: [Base] Number of vectors in the collection(fouram_u4Bggbei): 993046 (base.py:313)
[2023-03-19 00:43:08,517 -  INFO - fouram]: [Base] Total time of insert: 255.1678s, average number of vector bars inserted per second: 3918.9898, average time to insert 1000 vectors per time: 0.2552s (base.py:230)
[2023-03-19 00:43:08,517 -  INFO - fouram]: [Base] Start flush collection fouram_u4Bggbei (base.py:131)
[2023-03-19 00:43:11,555 -  INFO - fouram]: [Base] Number of vectors in the collection(fouram_u4Bggbei): 1000000 (base.py:313)
[2023-03-19 00:43:11,565 -  INFO - fouram]: [Base] Params of index: {'index_type': 'DISKANN', 'metric_type': 'L2', 'params': {}} (base.py:291)
[2023-03-19 00:43:11,565 -  INFO - fouram]: [Base] Start build index of DISKANN for collection fouram_u4Bggbei, params:{'index_type': 'DISKANN', 'metric_type': 'L2', 'params': {}} (base.py:278)

Expected Behavior

No response

Steps To Reproduce

1. create a collection
2. build a DISKANN index on the vector column
3. insert 1,000,000 vectors
4. flush the collection
5. build the DISKANN index on the vector column again with the same parameters
6. count the total number of rows
7. load the collection
8. run concurrent [1, 20] search operations (a pymilvus sketch of these steps follows below)
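For readers who want to approximate these steps outside the fouram harness, here is a minimal pymilvus sketch. It assumes a reachable Milvus at localhost:19530 and 768-dim float vectors; the collection name (fouram_repro) and field names (id, embedding) are placeholders, not the harness's actual names.

```python
# Minimal pymilvus sketch of the reproduction steps above (assumptions noted in the lead-in).
import numpy as np
from pymilvus import (
    Collection, CollectionSchema, DataType, FieldSchema, connections,
)

connections.connect(host="localhost", port="19530")

schema = CollectionSchema([
    FieldSchema(name="id", dtype=DataType.INT64, is_primary=True, auto_id=False),
    FieldSchema(name="embedding", dtype=DataType.FLOAT_VECTOR, dim=768),
])
collection = Collection("fouram_repro", schema)                      # 1. create collection

index_params = {"index_type": "DISKANN", "metric_type": "L2", "params": {}}
collection.create_index("embedding", index_params)                   # 2. build DISKANN index

for start in range(0, 1_000_000, 1000):                              # 3. insert 1,000,000 vectors, 1000 per batch
    ids = list(range(start, start + 1000))
    vectors = np.random.random((1000, 768)).tolist()
    collection.insert([ids, vectors])

collection.flush()                                                    # 4. flush
collection.create_index("embedding", index_params)                   # 5. build again, same parameters
print(collection.num_entities)                                       # 6. count rows
collection.load()                                                     # 7. load

result = collection.search(                                           # 8. one search; the test runs these concurrently
    data=np.random.random((1, 768)).tolist(),
    anns_field="embedding",
    param={"metric_type": "L2", "params": {"search_list": 30}},
    limit=1,
)
```

The indexnode OOMs are observed around step 5, when the DISKANN build starts on the flushed 1M-vector collection.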

Milvus Log

No response

Anything else?

scene_concurrent_locust required params: {'dataset_params': {'dim': 768, 'dataset_name': 'gist', 'dataset_size': 1000000, 'ni_per': 1000, 'metric_type': 'L2'}, 'collection_params': {'other_fields': []}, 'load_params': {}, 'search_params': {}, 'index_params': {'index_type': 'DISKANN', 'index_param': {}}, 'concurrent_params': {'concurrent_number': [1, 20], 'during_time': 3600, 'interval': 20}, 'concurrent_tasks': [{'type': 'search', 'weight': 1, 'params': {'nq': 1, 'top_k': 1, 'search_param': {'search_list': 30}, 'random_data': True}}]} (params_check.py:31)
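As a rough illustration of the concurrent search task defined above (outside the Locust runner), the same load shape could be approximated with a thread pool; the collection name, vector field name, and the shortened 20-second step length below are placeholders, not the harness's actual values.

```python
# Approximation of the concurrent search task: nq=1, top_k=1, search_list=30,
# random query data, worker counts stepping through [1, 20].
import random
import time
from concurrent.futures import ThreadPoolExecutor

from pymilvus import Collection, connections

connections.connect(host="localhost", port="19530")
collection = Collection("fouram_u4Bggbei")  # must already be indexed and loaded

def one_search():
    vec = [[random.random() for _ in range(768)]]                 # random_data: True
    return collection.search(
        data=vec,
        anns_field="embedding",                                   # placeholder field name
        param={"metric_type": "L2", "params": {"search_list": 30}},
        limit=1,                                                  # top_k: 1
    )

for workers in (1, 20):                                           # concurrent_number: [1, 20]
    deadline = time.time() + 20                                    # shortened step; the real run uses during_time=3600
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while time.time() < deadline:
            list(pool.map(lambda _: one_search(), range(workers)))
```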

jingkl avatar Mar 20 '23 03:03 jingkl

/assign @xige-16 /unassign

yanliang567 avatar Mar 20 '23 03:03 yanliang567

image: 2.2.0-20230411-3fe5fdb3

release_name_prefix: perf-standalone-1-1681201800 deploy_config: fouramf-server-cluster-8c16m other_params: --milvus_tag_prefix=2.2.0 -s case_name: test_concurrent_locust_glove_diskann_search_standalone

server:

perf-standalone01800-3-68-5052-etcd-0                             1/1     Running            0               13m     10.104.5.155    4am-node12   <none>           <none>
perf-standalone01800-3-68-5052-milvus-standalone-6b9f6f5c7ft6v5   0/1     CrashLoopBackOff   4 (45s ago)     13m     10.104.14.11    4am-node18   <none>           <none>
perf-standalone01800-3-68-5052-minio-6fdfbd5647-tfg2n             1/1     Running            0               13m     10.104.9.250    4am-node14   <none>           <none> 

client log:

[2023-04-11 08:35:16,359 -  INFO - fouram]: [Base] Params of index: {'index_type': 'DISKANN', 'metric_type': 'IP', 'params': {}} (base.py:291)
[2023-04-11 08:35:16,359 -  INFO - fouram]: [Base] Start build index of DISKANN for collection fouram_b7VWo5Na, params:{'index_type': 'DISKANN', 'metric_type': 'IP', 'params': {}} (base.py:278)
[2023-04-11 08:35:33,515 - WARNING - fouram]: [get_index_state] retry:4, cost: 0.27s, reason: <_MultiThreadedRendezvous: StatusCode.UNAVAILABLE, failed to connect to all addresses> (decorators.py:71)
[2023-04-11 08:35:33,786 - WARNING - fouram]: [get_index_state] retry:5, cost: 0.81s, reason: <_MultiThreadedRendezvous: StatusCode.UNAVAILABLE, failed to connect to all addresses> (decorators.py:71)
[2023-04-11 08:35:34,598 - WARNING - fouram]: [get_index_state] retry:6, cost: 2.43s, reason: <_MultiThreadedRendezvous: StatusCode.UNAVAILABLE, failed to connect to all addresses> (decorators.py:71)
[2023-04-11 08:35:37,031 - WARNING - fouram]: [get_index_state] retry:7, cost: 7.29s, reason: <_MultiThreadedRendezvous: StatusCode.UNAVAILABLE, failed to connect to all addresses> (decorators.py:71)
[2023-04-11 08:35:44,329 - WARNING - fouram]: [get_index_state] retry:8, cost: 21.87s, reason: <_MultiThreadedRendezvous: StatusCode.UNAVAILABLE, failed to connect to all addresses> (decorators.py:71)
[2023-04-11 08:36:06,221 - WARNING - fouram]: [get_index_state] retry:9, cost: 60.00s, reason: <_MultiThreadedRendezvous: StatusCode.UNAVAILABLE, failed to connect to all addresses> (decorators.py:71)
[2023-04-11 08:37:06,233 - WARNING - fouram]: [get_index_state] retry:10, cost: 60.00s, reason: <_MultiThreadedRendezvous: StatusCode.UNAVAILABLE, failed to connect to all addresses> (decorators.py:71)
[2023-04-11 08:41:02,908 - WARNING - fouram]: [get_index_state] retry:4, cost: 0.27s, reason: <_MultiThreadedRendezvous: StatusCode.UNAVAILABLE, failed to connect to all addresses> (decorators.py:71)
[2023-04-11 08:41:03,180 - WARNING - fouram]: [get_index_state] retry:5, cost: 0.81s, reason: <_MultiThreadedRendezvous: StatusCode.UNAVAILABLE, failed to connect to all addresses> (decorators.py:71)
[2023-04-11 08:41:03,992 - WARNING - fouram]: [get_index_state] retry:6, cost: 2.43s, reason: <_MultiThreadedRendezvous: StatusCode.UNAVAILABLE, failed to connect to all addresses> (decorators.py:71)
[2023-04-11 08:41:06,425 - WARNING - fouram]: [get_index_state] retry:7, cost: 7.29s, reason: <_MultiThreadedRendezvous: StatusCode.UNAVAILABLE, failed to connect to all addresses> (decorators.py:71)
[2023-04-11 08:41:13,723 - WARNING - fouram]: [get_index_state] retry:8, cost: 21.87s, reason: <_MultiThreadedRendezvous: StatusCode.UNAVAILABLE, failed to connect to all addresses> (decorators.py:71)
[2023-04-11 08:41:35,615 - WARNING - fouram]: [get_index_state] retry:9, cost: 60.00s, reason: <_MultiThreadedRendezvous: StatusCode.UNAVAILABLE, failed to connect to all addresses> (decorators.py:71)
[2023-04-11 08:42:35,676 - WARNING - fouram]: [get_index_state] retry:10, cost: 60.00s, reason: <_MultiThreadedRendezvous: StatusCode.UNAVAILABLE, failed to connect to all addresses> (decorators.py:71)
[2023-04-11 08:43:35,737 - ERROR - fouram]: RPC error: [get_index_state], <MilvusUnavailableException: (code=1, message=server Unavailable: Retry run out of 10 retry times)>, <Time:{'RPC start': '2023-04-11 08:41:02.774745', 'RPC error': '2023-04-11 08:43:35.737153'}> (decorators.py:108)
[2023-04-11 08:43:35,737 - ERROR - fouram]: RPC error: [wait_for_creating_index], <MilvusUnavailableException: (code=1, message=server Unavailable: Retry run out of 10 retry times)>, <Time:{'RPC start': '2023-04-11 08:35:16.365436', 'RPC error': '2023-04-11 08:43:35.737711'}> (decorators.py:108)
[2023-04-11 08:43:35,737 - ERROR - fouram]: RPC error: [create_index], <MilvusUnavailableException: (code=1, message=server Unavailable: Retry run out of 10 retry times)>, <Time:{'RPC start': '2023-04-11 08:35:16.361186', 'RPC error': '2023-04-11 08:43:35.737862'}> (decorators.py:108)
[2023-04-11 08:43:35,781 - ERROR - fouram]: Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 50, in handler
    return func(self, *args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/client/grpc_handler.py", line 632, in get_index_state
    response = rf.result()
  File "/usr/local/lib/python3.8/dist-packages/grpc/_channel.py", line 744, in result
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "failed to connect to all addresses"
	debug_error_string = "{"created":"@1681202615.736689129","description":"Failed to pick subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3260,"referenced_errors":[{"created":"@1681202615.736688379","description":"failed to connect to all addresses","file":"src/core/lib/transport/error_utils.cc","file_line":167,"grpc_status":14}]}"
>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/fouram/client/util/api_request.py", line 33, in inner_wrapper
    res = func(*args, **kwargs)
  File "/src/fouram/client/util/api_request.py", line 70, in api_request
    return func(*arg, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/orm/index.py", line 74, in __init__
    conn.create_index(self._collection.name, self._field_name, self._index_params, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 109, in handler
    raise e
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 105, in handler
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 136, in handler
    ret = func(self, *args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 85, in handler
    raise e
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 50, in handler
    return func(self, *args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/client/grpc_handler.py", line 571, in create_index
    index_success, fail_reason = self.wait_for_creating_index(collection_name=collection_name,
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 109, in handler
    raise e
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 105, in handler
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 136, in handler
    ret = func(self, *args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 85, in handler
    raise e
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 50, in handler
    return func(self, *args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/client/grpc_handler.py", line 654, in wait_for_creating_index
    state, fail_reason = self.get_index_state(collection_name, index_name, timeout=timeout, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 109, in handler
    raise e
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 105, in handler
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 136, in handler
    ret = func(self, *args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 66, in handler
    raise MilvusUnavailableException(message=f"server Unavailable: {timeout_msg}") from e
pymilvus.exceptions.MilvusUnavailableException: <MilvusUnavailableException: (code=1, message=server Unavailable: Retry run out of 10 retry times)>
 (api_request.py:48)
[2023-04-11 08:43:35,781 - ERROR - fouram]: (api_response) : <MilvusUnavailableException: (code=1, message=server Unavailable: Retry run out of 10 retry times)> (api_request.py:49)
[2023-04-11 08:43:35,782 - ERROR - fouram]: [CheckFunc] init_index request check failed, response:<MilvusUnavailableException: (code=1, message=server Unavailable: Retry run out of 10 retry times)> (func_check.py:49)
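Because the standalone pod is in CrashLoopBackOff, the client spends about eight minutes walking through the full retry ladder before the failure surfaces. A hedged sketch of one way a test client could bound this, assuming pymilvus 2.2.x; the field name and timeout are illustrative placeholders, and whether the timeout caps the retry ladder depends on the pymilvus version:

```python
# Pass an explicit timeout to create_index so a crash-looping server fails the
# test sooner (not the fouram client's actual implementation).
from pymilvus import Collection
from pymilvus.exceptions import MilvusException

def build_diskann(collection: Collection, field: str = "embedding",
                  timeout_s: float = 1800.0) -> None:
    index_params = {"index_type": "DISKANN", "metric_type": "IP", "params": {}}
    try:
        collection.create_index(field, index_params, timeout=timeout_s)
    except MilvusException as exc:
        # MilvusUnavailableException (seen above) derives from MilvusException.
        print(f"DISKANN build failed: {exc}")
        raise
```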

jingkl avatar Apr 11 '23 08:04 jingkl

release_name_prefix perf-cluster-1681405200 deploy_config fouramf-server-cluster-8c16m case_params fouramf-client-gist1m-concurrent-diskann other_params --milvus_tag=2.2.6-20230413-d0e87113 -s --deploy_mode=cluster case_name test_concurrent_locust_custom_parameters

server:


perf-cluster-1605200-3-54-2672-etcd-0                             1/1     Running     0               9h      10.104.1.12     4am-node10   <none>           <none>
perf-cluster-1605200-3-54-2672-etcd-1                             1/1     Running     0               9h      10.104.6.185    4am-node13   <none>           <none>
perf-cluster-1605200-3-54-2672-etcd-2                             1/1     Running     0               9h      10.104.4.251    4am-node11   <none>           <none>
perf-cluster-1605200-3-54-2672-milvus-datacoord-79b444b576tb6dc   1/1     Running     1 (9h ago)      9h      10.104.12.72    4am-node17   <none>           <none>
perf-cluster-1605200-3-54-2672-milvus-datanode-767f79f498-w9qwx   1/1     Running     1 (9h ago)      9h      10.104.14.72    4am-node18   <none>           <none>
perf-cluster-1605200-3-54-2672-milvus-indexcoord-7cb68c96cjdhfb   1/1     Running     1 (9h ago)      9h      10.104.13.88    4am-node16   <none>           <none>
perf-cluster-1605200-3-54-2672-milvus-indexnode-69f79fc49fdzzn9   1/1     Running     58 (10m ago)    9h      10.104.13.87    4am-node16   <none>           <none>
perf-cluster-1605200-3-54-2672-milvus-proxy-74756648c6-lsxwd      1/1     Running     1 (9h ago)      9h      10.104.13.86    4am-node16   <none>           <none>
perf-cluster-1605200-3-54-2672-milvus-querycoord-7457cc9bdg8zrh   1/1     Running     1 (9h ago)      9h      10.104.12.71    4am-node17   <none>           <none>
perf-cluster-1605200-3-54-2672-milvus-querynode-78d99fd8cc2xrms   1/1     Running     0               9h      10.104.12.69    4am-node17   <none>           <none>
perf-cluster-1605200-3-54-2672-milvus-rootcoord-754865b64ffh8jq   1/1     Running     1 (9h ago)      9h      10.104.12.68    4am-node17   <none>           <none>
perf-cluster-1605200-3-54-2672-minio-0                            1/1     Running     0               9h      10.104.1.11     4am-node10   <none>           <none>
perf-cluster-1605200-3-54-2672-minio-1                            1/1     Running     0               9h      10.104.5.139    4am-node12   <none>           <none>
perf-cluster-1605200-3-54-2672-minio-2                            1/1     Running     0               9h      10.104.6.186    4am-node13   <none>           <none>
perf-cluster-1605200-3-54-2672-minio-3                            1/1     Running     0               9h      10.104.4.253    4am-node11   <none>           <none>
perf-cluster-1605200-3-54-2672-pulsar-bookie-0                    1/1     Running     0               9h      10.104.1.17     4am-node10   <none>           <none>
perf-cluster-1605200-3-54-2672-pulsar-bookie-1                    1/1     Running     0               9h      10.104.6.190    4am-node13   <none>           <none>
perf-cluster-1605200-3-54-2672-pulsar-bookie-2                    1/1     Running     0               9h      10.104.9.59     4am-node14   <none>           <none>
perf-cluster-1605200-3-54-2672-pulsar-bookie-init-74flf           0/1     Completed   0               9h      10.104.5.124    4am-node12   <none>           <none>
perf-cluster-1605200-3-54-2672-pulsar-broker-0                    1/1     Running     0               9h      10.104.6.172    4am-node13   <none>           <none>
perf-cluster-1605200-3-54-2672-pulsar-proxy-0                     1/1     Running     0               9h      10.104.5.127    4am-node12   <none>           <none>
perf-cluster-1605200-3-54-2672-pulsar-pulsar-init-4pb7s           0/1     Completed   0               9h      10.104.5.125    4am-node12   <none>           <none>
perf-cluster-1605200-3-54-2672-pulsar-recovery-0                  1/1     Running     0               9h      10.104.4.238    4am-node11   <none>           <none>
perf-cluster-1605200-3-54-2672-pulsar-zookeeper-0                 1/1     Running     0               9h      10.104.1.15     4am-node10   <none>           <none>
perf-cluster-1605200-3-54-2672-pulsar-zookeeper-1                 1/1     Running     0               9h      10.104.4.31     4am-node11   <none>           <none>
perf-cluster-1605200-3-54-2672-pulsar-zookeeper-2                 1/1     Running     0               9h      10.104.5.171    4am-node12   <none>           <none>

client log:

[2023-04-13 17:15:27,323 -  INFO - fouram]: [Base] Total time of insert: 350.4721s, average number of vector bars inserted per second: 2853.2942, average time to insert 1000 vectors per time: 0.3505s (base.py:231)
[2023-04-13 17:15:27,324 -  INFO - fouram]: [Base] Start flush collection fouram_oOvx4n1K (base.py:132)
[2023-04-13 17:15:31,450 -  INFO - fouram]: [Base] Number of vectors in the collection(fouram_oOvx4n1K): 1000000 (base.py:314)
[2023-04-13 17:15:31,457 -  INFO - fouram]: [Base] Params of index: {'index_type': 'DISKANN', 'metric_type': 'L2', 'params': {}} (base.py:292)
[2023-04-13 17:15:31,457 -  INFO - fouram]: [Base] Start build index of DISKANN for collection fouram_oOvx4n1K, params:{'index_type': 'DISKANN', 'metric_type': 'L2', 'params': {}} (base.py:279)

jingkl avatar Apr 14 '23 02:04 jingkl

@jingkl can you run a test to verify how much memory is needed to build a DiskANN index on a 2GB segment?
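One possible way to measure this (a sketch, not an agreed procedure): poll the indexnode pod's memory from metrics-server while a DISKANN build runs on a collection sized to one ~2GB segment. It assumes the kubernetes Python client and a metrics-server in the cluster; the namespace is a placeholder.

```python
# Sample indexnode memory usage from the metrics.k8s.io API every 15 seconds
# during the index build; the peak of the printed values approximates the
# memory needed for the build.
import time
from kubernetes import client, config

config.load_kube_config()
metrics_api = client.CustomObjectsApi()

while True:  # stop manually (Ctrl-C) once the index build finishes
    pods = metrics_api.list_namespaced_custom_object(
        group="metrics.k8s.io", version="v1beta1",
        namespace="qa-milvus", plural="pods",   # placeholder namespace
    )
    for item in pods["items"]:
        if "indexnode" in item["metadata"]["name"]:
            usage = item["containers"][0]["usage"]["memory"]  # e.g. "1234567Ki"
            print(time.strftime("%H:%M:%S"), item["metadata"]["name"], usage)
    time.sleep(15)
```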

yanliang567 avatar Jun 09 '23 02:06 yanliang567

/assign @jingkl

xige-16 avatar Aug 09 '23 08:08 xige-16

deploy_config fouramf-server-cluster-8c16m-disk case_params fouramf-client-gist1m-concurrent-diskann other_params -s --deploy_mode=cluster --update_helm_file case_name test_concurrent_locust_custom_parameters

image: 2.2.0-20230809-3322c5de

jingkl avatar Aug 11 '23 10:08 jingkl