[Bug]: [benchmark][cluster] DQL requests raise error Operator::GetOutput failed for [Operator:, plan node id: 186504] : Assert "it->second <= insert_barrier" => delete record beyond insert barrier, 759 DQL & DML scene #38472

Status: Closed

Labels: kind/bug (Issues or changes related to a bug) · test/benchmark (benchmark test) · triage/accepted (Indicates an issue or PR is ready to be actively worked on.)

Milestone: 2.5.0

wangting0128 opened this issue Dec 16, 2024 · 7 comments
@wangting0128
Contributor

Is there an existing issue for this?

  • I have searched the existing issues

Environment

- Milvus version: master-20241215-c3edc853-amd64
- Deployment mode (standalone or cluster): cluster
- MQ type (rocksmq, pulsar or kafka): pulsar
- SDK version (e.g. pymilvus v2.0.0rc2): 2.5.0rc124
- OS(Ubuntu or CentOS): 
- CPU/Memory: 
- GPU: 
- Others:

Current Behavior

argo task: fouram-memory-index-stab-1734274800
test case name: test_concurrent_locust_hnsw_dml_dql_filter_cluster

server:

NAME                                                              READY   STATUS      RESTARTS         AGE     IP              NODE         NOMINATED NODE   READINESS GATES
fouram-memory-i74800-2-1-5088-etcd-0                              1/1     Running     0                5h7m    10.104.25.233   4am-node30   <none>           <none>
fouram-memory-i74800-2-1-5088-etcd-1                              1/1     Running     0                5h7m    10.104.26.61    4am-node32   <none>           <none>
fouram-memory-i74800-2-1-5088-etcd-2                              1/1     Running     0                5h7m    10.104.18.231   4am-node25   <none>           <none>
fouram-memory-i74800-2-1-5088-milvus-datanode-ddd9b5b48-fk8kz     1/1     Running     4 (5h5m ago)     5h7m    10.104.33.17    4am-node36   <none>           <none>
fouram-memory-i74800-2-1-5088-milvus-indexnode-776975968f-mn8qj   1/1     Running     4 (5h6m ago)     5h7m    10.104.30.211   4am-node38   <none>           <none>
fouram-memory-i74800-2-1-5088-milvus-mixcoord-7496874bb8-2brnj    1/1     Running     4 (5h6m ago)     5h7m    10.104.23.40    4am-node27   <none>           <none>
fouram-memory-i74800-2-1-5088-milvus-proxy-d4d85698b-4g7rm        1/1     Running     4 (5h6m ago)     5h7m    10.104.30.208   4am-node38   <none>           <none>
fouram-memory-i74800-2-1-5088-milvus-querynode-74969fd568-r6c24   1/1     Running     4 (5h6m ago)     5h7m    10.104.23.41    4am-node27   <none>           <none>
fouram-memory-i74800-2-1-5088-minio-0                             1/1     Running     0                5h7m    10.104.19.252   4am-node28   <none>           <none>
fouram-memory-i74800-2-1-5088-minio-1                             1/1     Running     0                5h7m    10.104.25.234   4am-node30   <none>           <none>
fouram-memory-i74800-2-1-5088-minio-2                             1/1     Running     0                5h7m    10.104.32.74    4am-node39   <none>           <none>
fouram-memory-i74800-2-1-5088-minio-3                             1/1     Running     0                5h7m    10.104.26.62    4am-node32   <none>           <none>
fouram-memory-i74800-2-1-5088-pulsarv3-bookie-0                   1/1     Running     0                5h7m    10.104.19.254   4am-node28   <none>           <none>
fouram-memory-i74800-2-1-5088-pulsarv3-bookie-1                   1/1     Running     0                5h7m    10.104.25.235   4am-node30   <none>           <none>
fouram-memory-i74800-2-1-5088-pulsarv3-bookie-2                   1/1     Running     0                5h7m    10.104.32.76    4am-node39   <none>           <none>
fouram-memory-i74800-2-1-5088-pulsarv3-bookie-init-wp4vs          0/1     Completed   0                5h7m    10.104.25.205   4am-node30   <none>           <none>
fouram-memory-i74800-2-1-5088-pulsarv3-broker-0                   1/1     Running     0                5h7m    10.104.21.128   4am-node24   <none>           <none>
fouram-memory-i74800-2-1-5088-pulsarv3-broker-1                   1/1     Running     0                5h7m    10.104.6.106    4am-node13   <none>           <none>
fouram-memory-i74800-2-1-5088-pulsarv3-proxy-0                    1/1     Running     0                5h7m    10.104.6.107    4am-node13   <none>           <none>
fouram-memory-i74800-2-1-5088-pulsarv3-proxy-1                    1/1     Running     0                5h7m    10.104.9.162    4am-node14   <none>           <none>
fouram-memory-i74800-2-1-5088-pulsarv3-pulsar-init-s8j4v          0/1     Completed   0                5h7m    10.104.25.204   4am-node30   <none>           <none>
fouram-memory-i74800-2-1-5088-pulsarv3-recovery-0                 1/1     Running     0                5h7m    10.104.9.161    4am-node14   <none>           <none>
fouram-memory-i74800-2-1-5088-pulsarv3-zookeeper-0                1/1     Running     0                5h7m    10.104.20.131   4am-node22   <none>           <none>
fouram-memory-i74800-2-1-5088-pulsarv3-zookeeper-1                1/1     Running     0                5h7m    10.104.25.232   4am-node30   <none>           <none>
fouram-memory-i74800-2-1-5088-pulsarv3-zookeeper-2                1/1     Running     0                5h7m    10.104.32.72    4am-node39   <none>           <none> 

client logs:

[2024-12-15 15:10:37,493 - ERROR - fouram]: RPC error: [query], <MilvusException: (code=65535, message=fail to Query on QueryNode 3: worker(3) query failed: Operator::GetOutput failed for [Operator:, plan node id: 186504] : Assert "it->second <= insert_barrier"  => delete record beyond insert barrier, 759 : 758 at /workspace/source/internal/core/src/segcore/DeletedRecord.h:194
)>, <Time:{'RPC start': '2024-12-15 15:10:36.806259', 'RPC error': '2024-12-15 15:10:37.493448'}> (decorators.py:140)
[2024-12-15 15:10:42,493 - ERROR - fouram]: RPC error: [query], <MilvusException: (code=65535, message=fail to Query on QueryNode 3: worker(3) query failed: Operator::GetOutput failed for [Operator:, plan node id: 199189] : Assert "it->second <= insert_barrier"  => delete record beyond insert barrier, 801 : 800 at /workspace/source/internal/core/src/segcore/DeletedRecord.h:194
)>, <Time:{'RPC start': '2024-12-15 15:10:41.776490', 'RPC error': '2024-12-15 15:10:42.493286'}> (decorators.py:140)
[2024-12-15 15:10:47,668 - ERROR - fouram]: RPC error: [search], <MilvusException: (code=65535, message=fail to search on QueryNode 3: worker(3) query failed: Operator::GetOutput failed for [Operator:, plan node id: 212216] : Assert "it->second <= insert_barrier"  => delete record beyond insert barrier, 851 : 850 at /workspace/source/internal/core/src/segcore/DeletedRecord.h:194
)>, <Time:{'RPC start': '2024-12-15 15:10:46.993980', 'RPC error': '2024-12-15 15:10:47.668367'}> (decorators.py:140)
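The failing assert at `DeletedRecord.h:194` means a delete record's position exceeded the reader's insert-barrier snapshot: in the first log line, a delete record at position 759 was seen while the barrier was 758. A toy Python model (hypothetical names, greatly simplified relative to segcore) of how this interleaving trips the check:

```python
# Toy model (hypothetical, simplified) of the insert-barrier check in
# segcore's DeletedRecord: a reader snapshots the number of inserted rows
# (the "insert barrier"), but a concurrent writer can append a delete
# record whose position is already past that snapshot.

class SegmentModel:
    def __init__(self):
        self.insert_count = 0   # rows appended by concurrent inserts
        self.delete_log = []    # positions recorded by concurrent deletes

    def insert(self, n=1):
        self.insert_count += n

    def delete_latest(self):
        # a delete that targets the most recently inserted row
        self.delete_log.append(self.insert_count)

    def query_with_strict_assert(self, insert_barrier):
        # mirrors: Assert "it->second <= insert_barrier"
        for pos in self.delete_log:
            assert pos <= insert_barrier, (
                f"delete record beyond insert barrier, {pos} : {insert_barrier}")

seg = SegmentModel()
seg.insert(758)               # 758 rows visible when the reader starts
barrier = seg.insert_count    # reader snapshots insert_barrier = 758

# interleaved writer: one more insert, then a delete of that row (position 759)
seg.insert(1)
seg.delete_latest()

try:
    seg.query_with_strict_assert(barrier)
except AssertionError as e:
    print(e)   # delete record beyond insert barrier, 759 : 758
```

This matches the `759 : 758`, `801 : 800`, `851 : 850` off-by-one pairs in the logs: each time, exactly one insert+delete slipped in after the reader's snapshot.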

Expected Behavior

No response

Steps To Reproduce

1. create a collection with fields: 'id', 'float_vector', 'float_1'
2. build an HNSW index on the vector field 'float_vector'
3. insert 100,000 (10w) rows of data
4. flush the collection
5. rebuild the index
6. load the collection
7. concurrent requests:
   - query
   - search
   - load
   - delete: delete_length=1
   - insert: insert_nb=1
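The concurrent mix in step 7 follows the weights in the locust config reported below (search 10, query 5, load 2, delete 1, insert 1, with 20 concurrent workers). A stdlib-only sketch (hypothetical stubs; the real test drives pymilvus through locust via the fouram harness) of such a weighted concurrent workload:

```python
# Stdlib-only sketch (hypothetical; the real test uses locust via the
# fouram harness) of the weighted concurrent mix from step 7:
# search:query:load:delete:insert = 10:5:2:1:1, 20 concurrent workers.
import random
import threading
from collections import Counter

TASKS = {"search": 10, "query": 5, "load": 2, "delete": 1, "insert": 1}

def run_worker(rng, n_ops, counts, lock):
    names = list(TASKS)
    weights = [TASKS[n] for n in names]
    for _ in range(n_ops):
        op = rng.choices(names, weights=weights)[0]
        # a real worker would issue the pymilvus call here, e.g.
        # collection.search(...) / collection.query(...) / load / delete / insert
        with lock:
            counts[op] += 1

counts, lock = Counter(), threading.Lock()
threads = [
    threading.Thread(target=run_worker,
                     args=(random.Random(i), 1000, counts, lock))
    for i in range(20)   # concurrent_number: 20, as in the config below
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counts.most_common())   # search dominates, roughly 10/19 of all ops
```

The high search/query rate against a steady trickle of single-row deletes and inserts is what makes a reader's snapshot race against the delete log likely over a 5h run.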

Milvus Log

No response

Anything else?

test result:

[2024-12-15 20:09:24,352 -  INFO - fouram]: Print locust final stats. (locust_runner.py:56)
[2024-12-15 20:09:24,353 -  INFO - fouram]: Type     Name                                                                          # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s (stats.py:789)
[2024-12-15 20:09:24,353 -  INFO - fouram]: --------|----------------------------------------------------------------------------|-------|-------------|-------|-------|-------|-------|--------|----------- (stats.py:789)
[2024-12-15 20:09:24,353 -  INFO - fouram]: grpc     delete                                                                        338673     0(0.00%) |     63       2    1186     56 |   18.82        0.00 (stats.py:789)
[2024-12-15 20:09:24,353 -  INFO - fouram]: grpc     insert                                                                        340030     0(0.00%) |     70       6     748     63 |   18.89        0.00 (stats.py:789)
[2024-12-15 20:09:24,353 -  INFO - fouram]: grpc     load                                                                          679065     0(0.00%) |    126       6    1285    110 |   37.73        0.00 (stats.py:789)
[2024-12-15 20:09:24,353 -  INFO - fouram]: grpc     query                                                                        1698439     2(0.00%) |     66       2    1219     59 |   94.36        0.00 (stats.py:789)
[2024-12-15 20:09:24,353 -  INFO - fouram]: grpc     search                                                                       3398647     1(0.00%) |     29       3     916     17 |  188.81        0.00 (stats.py:789)
[2024-12-15 20:09:24,353 -  INFO - fouram]: --------|----------------------------------------------------------------------------|-------|-------------|-------|-------|-------|-------|--------|----------- (stats.py:789)
[2024-12-15 20:09:24,354 -  INFO - fouram]:          Aggregated                                                                   6454854     3(0.00%) |     53       2    1285     46 |  358.60        0.00 (stats.py:789)
[2024-12-15 20:09:24,354 -  INFO - fouram]:  (stats.py:790)
[2024-12-15 20:09:24,355 -  INFO - fouram]: [PerfTemplate] Report data: 
{'server': {'deploy_tool': 'helm',
            'deploy_mode': 'cluster',
            'config_name': 'cluster_2c8m',
            'config': {'queryNode': {'resources': {'limits': {'cpu': '8.0', 'memory': '4Gi'}, 'requests': {'cpu': '5.0', 'memory': '3Gi'}}, 'replicas': 1},
                       'indexNode': {'resources': {'limits': {'cpu': '2.0', 'memory': '8Gi'}, 'requests': {'cpu': '2.0', 'memory': '5Gi'}}, 'replicas': 1},
                       'dataNode': {'resources': {'limits': {'cpu': '2.0', 'memory': '8Gi'}, 'requests': {'cpu': '2.0', 'memory': '5Gi'}}},
                       'cluster': {'enabled': True},
                       'pulsarv3': {},
                       'kafka': {},
                       'minio': {'metrics': {'podMonitor': {'enabled': True}}},
                       'etcd': {'metrics': {'enabled': True, 'podMonitor': {'enabled': True}}},
                       'metrics': {'serviceMonitor': {'enabled': True}},
                       'log': {'level': 'debug'},
                       'image': {'all': {'repository': 'harbor.milvus.io/milvus/milvus', 'tag': 'master-20241215-c3edc853-amd64'}}},
            'host': 'fouram-memory-i74800-2-1-5088-milvus.qa-milvus.svc.cluster.local',
            'port': '19530',
            'uri': ''},
 'client': {'test_case_type': 'ConcurrentClientBase',
            'test_case_name': 'test_concurrent_locust_hnsw_dml_dql_filter_cluster',
            'test_case_params': {'dataset_params': {'metric_type': 'L2', 'dim': 128, 'dataset_name': 'sift', 'dataset_size': 100000, 'ni_per': 50000},
                                 'collection_params': {'other_fields': ['float_1'], 'shards_num': 2},
                                 'resource_groups_params': {'reset': False},
                                 'database_user_params': {'reset_rbac': False, 'reset_db': False},
                                 'index_params': {'index_type': 'HNSW', 'index_param': {'M': 8, 'efConstruction': 200}},
                                 'concurrent_params': {'concurrent_number': 20, 'during_time': '5h', 'interval': 20, 'spawn_rate': None},
                                 'concurrent_tasks': [{'type': 'search',
                                                       'weight': 10,
                                                       'params': {'nq': 10,
                                                                  'top_k': 10,
                                                                  'search_param': {'ef': 16},
                                                                  'expr': {'float_1': {'GT': -1.0, 'LT': 50000.0}},
                                                                  'guarantee_timestamp': None,
                                                                  'partition_names': None,
                                                                  'output_fields': None,
                                                                  'ignore_growing': False,
                                                                  'group_by_field': None,
                                                                  'timeout': 60,
                                                                  'random_data': True,
                                                                  'check_task': 'check_response',
                                                                  'check_items': None}},
                                                      {'type': 'query',
                                                       'weight': 5,
                                                       'params': {'ids': [0, 1, 2, 3, 4, 5, 6, 7, 8, 9],
                                                                  'expr': None,
                                                                  'output_fields': None,
                                                                  'offset': None,
                                                                  'limit': None,
                                                                  'ignore_growing': False,
                                                                  'partition_names': None,
                                                                  'timeout': 60,
                                                                  'consistency_level': None,
                                                                  'random_data': False,
                                                                  'random_count': 0,
                                                                  'random_range': [0, 1],
                                                                  'field_name': 'id',
                                                                  'field_type': 'int64',
                                                                  'custom_expr': None,
                                                                  'custom_range': [0, 1],
                                                                  'check_task': 'check_response',
                                                                  'check_items': None}},
                                                      {'type': 'load',
                                                       'weight': 2,
                                                       'params': {'replica_number': 1, 'timeout': 30, 'check_task': 'check_response', 'check_items': None}},
                                                      {'type': 'delete',
                                                       'weight': 1,
                                                       'params': {'expr': '',
                                                                  'delete_length': 1,
                                                                  'timeout': 30,
                                                                  'check_task': 'check_response',
                                                                  'check_items': None}},
                                                      {'type': 'insert',
                                                       'weight': 1,
                                                       'params': {'nb': 1,
                                                                  'timeout': 30,
                                                                  'random_id': True,
                                                                  'random_vector': True,
                                                                  'varchar_filled': False,
                                                                  'start_id': 0,
                                                                  'shuffle_id': False,
                                                                  'check_task': 'check_response',
                                                                  'check_items': None}}]},
            'run_id': 2024121549808685,
            'datetime': '2024-12-15 15:03:00.033964',
            'client_version': '2.5.0'},
 'result': {'test_result': {'index': {'RT': 25.6646},
                            'insert': {'total_time': 5.4915, 'VPS': 18209.9608, 'batch_time': 2.7458, 'batch': 50000},
                            'flush': {'RT': 2.5918},
                            'load': {'RT': 1.4985},
                            'Locust': {'Aggregated': {'Requests': 6454854,
                                                      'Fails': 3,
                                                      'RPS': 358.6,
                                                      'fail_s': 0.0,
                                                      'RT_max': 1285.41,
                                                      'RT_avg': 53.71,
                                                      'TP50': 46,
                                                      'TP99': 210.0},
                                       'delete': {'Requests': 338673,
                                                  'Fails': 0,
                                                  'RPS': 18.82,
                                                  'fail_s': 0.0,
                                                  'RT_max': 1186.31,
                                                  'RT_avg': 63.68,
                                                  'TP50': 56,
                                                  'TP99': 170.0},
                                       'insert': {'Requests': 340030,
                                                  'Fails': 0,
                                                  'RPS': 18.89,
                                                  'fail_s': 0.0,
                                                  'RT_max': 748.92,
                                                  'RT_avg': 70.73,
                                                  'TP50': 63,
                                                  'TP99': 190.0},
                                       'load': {'Requests': 679065,
                                                'Fails': 0,
                                                'RPS': 37.73,
                                                'fail_s': 0.0,
                                                'RT_max': 1285.41,
                                                'RT_avg': 126.41,
                                                'TP50': 110.0,
                                                'TP99': 300.0},
                                       'query': {'Requests': 1698439,
                                                 'Fails': 2,
                                                 'RPS': 94.36,
                                                 'fail_s': 0.0,
                                                 'RT_max': 1219.15,
                                                 'RT_avg': 66.75,
                                                 'TP50': 59,
                                                 'TP99': 180.0},
                                       'search': {'Requests': 3398647,
                                                  'Fails': 1,
                                                  'RPS': 188.81,
                                                  'fail_s': 0.0,
                                                  'RT_max': 916.52,
                                                  'RT_avg': 29.97,
                                                  'TP50': 17,
                                                  'TP99': 140.0}}}}}
@wangting0128 wangting0128 added kind/bug Issues or changes related a bug needs-triage Indicates an issue or PR lacks a `triage/foo` label and requires one. test/benchmark benchmark test labels Dec 16, 2024
@wangting0128 wangting0128 added this to the 2.5.0 milestone Dec 16, 2024
@xiaofan-luan
Collaborator

/assign @zhagnlu
please help on it

@zhagnlu
Contributor

zhagnlu commented Dec 17, 2024

Actually, with concurrent reads and writes there is no guarantee that the write (delete) length is always less than or equal to the read length; this incorrect check should be removed.
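The actual fix lands in segcore's C++ code (via the commits referenced below); a minimal Python sketch of the idea, assuming the semantics described above: delete records past a reader's insert-barrier snapshot are simply not applicable to that read, so the reader skips them instead of asserting.

```python
# Minimal sketch (not the actual segcore code) of the fix direction:
# under concurrent reads and writes, delete records can legitimately sit
# past a reader's insert-barrier snapshot, so a reader only applies
# deletes at positions within its barrier rather than asserting.

def applicable_deletes(delete_log, insert_barrier):
    """Deletes visible to a reader whose snapshot covers `insert_barrier` rows."""
    return [pos for pos in delete_log if pos <= insert_barrier]

delete_log = [100, 758, 759]               # 759 arrived after the snapshot
print(applicable_deletes(delete_log, 758))  # -> [100, 758]
```

The skipped delete is still in the log and will be applied by any later reader whose barrier covers it, so correctness is preserved.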

@wangting0128
Contributor Author

verified: private image zhagnlu-fix_delete_assert-6ce237b-20241217 passed

argo task: fouramf-n99xv

sre-ci-robot pushed a commit that referenced this issue Dec 17, 2024
@wangting0128
Contributor Author

wangting0128 commented Dec 18, 2024

different case, same error

argo task: fouram-disk-stab-1734440400
test case name: test_concurrent_locust_diskann_dml_dql_filter_cluster
image: master-20241217-f0096ec2-amd64

server:

NAME                                                              READY   STATUS      RESTARTS        AGE     IP              NODE         NOMINATED NODE   READINESS GATES
fouram-disk-sta40400-1-46-9374-etcd-0                             1/1     Running     0               5h7m    10.104.26.120   4am-node32   <none>           <none>
fouram-disk-sta40400-1-46-9374-etcd-1                             1/1     Running     0               5h7m    10.104.23.25    4am-node27   <none>           <none>
fouram-disk-sta40400-1-46-9374-etcd-2                             1/1     Running     0               5h7m    10.104.20.11    4am-node22   <none>           <none>
fouram-disk-sta40400-1-46-9374-milvus-datanode-55bc4d97f4-gvs25   1/1     Running     3 (5h6m ago)    5h7m    10.104.25.66    4am-node30   <none>           <none>
fouram-disk-sta40400-1-46-9374-milvus-indexnode-655cd59bc59bll5   1/1     Running     3 (5h6m ago)    5h7m    10.104.33.72    4am-node36   <none>           <none>
fouram-disk-sta40400-1-46-9374-milvus-mixcoord-7598d9fc46-bch8n   1/1     Running     3 (5h6m ago)    5h7m    10.104.33.70    4am-node36   <none>           <none>
fouram-disk-sta40400-1-46-9374-milvus-proxy-7dc6c95fb8-6z425      1/1     Running     3 (5h6m ago)    5h7m    10.104.33.71    4am-node36   <none>           <none>
fouram-disk-sta40400-1-46-9374-milvus-querynode-757d5b686bgh4k2   1/1     Running     3 (5h6m ago)    5h7m    10.104.17.223   4am-node23   <none>           <none>
fouram-disk-sta40400-1-46-9374-minio-0                            1/1     Running     0               5h7m    10.104.23.14    4am-node27   <none>           <none>
fouram-disk-sta40400-1-46-9374-minio-1                            1/1     Running     0               5h7m    10.104.26.119   4am-node32   <none>           <none>
fouram-disk-sta40400-1-46-9374-minio-2                            1/1     Running     0               5h7m    10.104.16.51    4am-node21   <none>           <none>
fouram-disk-sta40400-1-46-9374-minio-3                            1/1     Running     0               5h7m    10.104.20.12    4am-node22   <none>           <none>
fouram-disk-sta40400-1-46-9374-pulsarv3-bookie-0                  1/1     Running     0               5h7m    10.104.23.29    4am-node27   <none>           <none>
fouram-disk-sta40400-1-46-9374-pulsarv3-bookie-1                  1/1     Running     0               5h7m    10.104.16.55    4am-node21   <none>           <none>
fouram-disk-sta40400-1-46-9374-pulsarv3-bookie-2                  1/1     Running     0               5h7m    10.104.33.82    4am-node36   <none>           <none>
fouram-disk-sta40400-1-46-9374-pulsarv3-bookie-init-z6stc         0/1     Completed   0               5h7m    10.104.9.165    4am-node14   <none>           <none>
fouram-disk-sta40400-1-46-9374-pulsarv3-broker-0                  1/1     Running     0               5h7m    10.104.9.167    4am-node14   <none>           <none>
fouram-disk-sta40400-1-46-9374-pulsarv3-broker-1                  1/1     Running     0               5h7m    10.104.14.104   4am-node18   <none>           <none>
fouram-disk-sta40400-1-46-9374-pulsarv3-proxy-0                   1/1     Running     0               5h7m    10.104.14.103   4am-node18   <none>           <none>
fouram-disk-sta40400-1-46-9374-pulsarv3-proxy-1                   1/1     Running     0               5h7m    10.104.9.168    4am-node14   <none>           <none>
fouram-disk-sta40400-1-46-9374-pulsarv3-pulsar-init-bz4rb         0/1     Completed   0               5h7m    10.104.9.166    4am-node14   <none>           <none>
fouram-disk-sta40400-1-46-9374-pulsarv3-recovery-0                1/1     Running     0               5h7m    10.104.9.169    4am-node14   <none>           <none>
fouram-disk-sta40400-1-46-9374-pulsarv3-zookeeper-0               1/1     Running     0               5h7m    10.104.26.117   4am-node32   <none>           <none>
fouram-disk-sta40400-1-46-9374-pulsarv3-zookeeper-1               1/1     Running     0               5h7m    10.104.23.15    4am-node27   <none>           <none>
fouram-disk-sta40400-1-46-9374-pulsarv3-zookeeper-2               1/1     Running     0               5h7m    10.104.25.80    4am-node30   <none>           <none>

client log:
(screenshot attachment: Screenshot 2024-12-18 10 59 05)

@wangting0128
Contributor Author

wangting0128 commented Dec 18, 2024

different case, same error

argo task: fouram-disk-stab-1734440400
test case name: test_concurrent_locust_diskann_dml_dql_filter_standalone
image: master-20241217-f0096ec2-amd64

server:

NAME                                                              READY   STATUS        RESTARTS        AGE     IP              NODE         NOMINATED NODE   READINESS GATES
fouram-disk-sta40400-6-40-3738-etcd-0                             1/1     Running       0               5h5m    10.104.17.233   4am-node23   <none>           <none>
fouram-disk-sta40400-6-40-3738-milvus-standalone-7b5654967lg8l9   1/1     Running       0               5h5m    10.104.21.223   4am-node24   <none>           <none>
fouram-disk-sta40400-6-40-3738-minio-7669bd7c46-bbmsg             1/1     Running       0               5h5m    10.104.33.79    4am-node36   <none>           <none>

client log:
(screenshot attachment: Screenshot 2024-12-18 11 00 44)

@yanliang567 yanliang567 added triage/accepted Indicates an issue or PR is ready to be actively worked on. and removed needs-triage Indicates an issue or PR lacks a `triage/foo` label and requires one. labels Dec 18, 2024
@yanliang567 yanliang567 removed their assignment Dec 18, 2024
@wangting0128
Contributor Author

reproduced again

argo task: fouramf-dt68t
test case name: test_concurrent_locust_hnsw_dml_dql_filter_cluster
image: master-20241217-29e620fa-amd64

server:

NAME                                                              READY   STATUS      RESTARTS        AGE     IP              NODE         NOMINATED NODE   READINESS GATES
fouramf-dt68t-7-4072-etcd-0                                       1/1     Running     0               5h6m    10.104.23.130   4am-node27   <none>           <none>
fouramf-dt68t-7-4072-etcd-1                                       1/1     Running     0               5h6m    10.104.33.202   4am-node36   <none>           <none>
fouramf-dt68t-7-4072-etcd-2                                       1/1     Running     0               5h6m    10.104.16.172   4am-node21   <none>           <none>
fouramf-dt68t-7-4072-milvus-datanode-78bf6c949d-jgxmn             1/1     Running     1 (5h6m ago)    5h6m    10.104.9.124    4am-node14   <none>           <none>
fouramf-dt68t-7-4072-milvus-indexnode-6d5b9bd5b7-tf6fw            1/1     Running     0               5h6m    10.104.14.79    4am-node18   <none>           <none>
fouramf-dt68t-7-4072-milvus-mixcoord-5fd4ffc76-xz54d              1/1     Running     1 (5h6m ago)    5h6m    10.104.9.123    4am-node14   <none>           <none>
fouramf-dt68t-7-4072-milvus-proxy-6bbf75f55b-9q6vj                1/1     Running     1 (5h6m ago)    5h6m    10.104.9.122    4am-node14   <none>           <none>
fouramf-dt68t-7-4072-milvus-querynode-f5bd4cdb6-fvh95             1/1     Running     0               5h6m    10.104.14.80    4am-node18   <none>           <none>
fouramf-dt68t-7-4072-minio-0                                      1/1     Running     0               5h6m    10.104.33.200   4am-node36   <none>           <none>
fouramf-dt68t-7-4072-minio-1                                      1/1     Running     0               5h6m    10.104.25.35    4am-node30   <none>           <none>
fouramf-dt68t-7-4072-minio-2                                      1/1     Running     0               5h6m    10.104.23.131   4am-node27   <none>           <none>
fouramf-dt68t-7-4072-minio-3                                      1/1     Running     0               5h6m    10.104.16.171   4am-node21   <none>           <none>
fouramf-dt68t-7-4072-pulsarv3-bookie-0                            1/1     Running     0               5h6m    10.104.25.37    4am-node30   <none>           <none>
fouramf-dt68t-7-4072-pulsarv3-bookie-1                            1/1     Running     0               5h6m    10.104.23.134   4am-node27   <none>           <none>
fouramf-dt68t-7-4072-pulsarv3-bookie-2                            1/1     Running     0               5h6m    10.104.16.173   4am-node21   <none>           <none>
fouramf-dt68t-7-4072-pulsarv3-bookie-init-qmf6b                   0/1     Completed   0               5h6m    10.104.6.134    4am-node13   <none>           <none>
fouramf-dt68t-7-4072-pulsarv3-broker-0                            1/1     Running     0               5h6m    10.104.23.120   4am-node27   <none>           <none>
fouramf-dt68t-7-4072-pulsarv3-broker-1                            1/1     Running     0               5h6m    10.104.9.128    4am-node14   <none>           <none>
fouramf-dt68t-7-4072-pulsarv3-proxy-0                             1/1     Running     0               5h6m    10.104.9.130    4am-node14   <none>           <none>
fouramf-dt68t-7-4072-pulsarv3-proxy-1                             1/1     Running     0               5h6m    10.104.33.194   4am-node36   <none>           <none>
fouramf-dt68t-7-4072-pulsarv3-pulsar-init-24lsn                   0/1     Completed   0               5h6m    10.104.9.125    4am-node14   <none>           <none>
fouramf-dt68t-7-4072-pulsarv3-recovery-0                          1/1     Running     0               5h6m    10.104.9.129    4am-node14   <none>           <none>
fouramf-dt68t-7-4072-pulsarv3-zookeeper-0                         1/1     Running     0               5h6m    10.104.33.201   4am-node36   <none>           <none>
fouramf-dt68t-7-4072-pulsarv3-zookeeper-1                         1/1     Running     0               5h6m    10.104.23.128   4am-node27   <none>           <none>
fouramf-dt68t-7-4072-pulsarv3-zookeeper-2                         1/1     Running     0               5h6m    10.104.25.36    4am-node30   <none>           <none> 

client log:
(screenshot attachment: Screenshot 2024-12-18 11 13 10)

sre-ci-robot pushed a commit that referenced this issue Dec 18, 2024
@wangting0128
Contributor Author

verification passed

argo task: fouramf-concurrent-pwgwj
image: master-20241218-87056be7-amd64

test cases:

  • test_concurrent_locust_diskann_dml_dql_filter_cluster
  • test_concurrent_locust_diskann_dml_dql_filter_standalone
  • test_concurrent_locust_hnsw_dml_dql_filter_cluster

sre-ci-robot pushed a commit that referenced this issue Dec 19, 2024