-
Notifications
You must be signed in to change notification settings - Fork 4
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Dynamically generate PROTOS list and add the generated files. #41
Dynamically generate PROTOS list and add the generated files. #41
Conversation
📝 Walkthrough📝 Walkthrough📝 WalkthroughWalkthroughThe pull request introduces several changes across multiple files. In the Changes
Possibly related PRs
Thank you for using CodeRabbit. We offer it for free to the OSS community and would appreciate your support in helping us grow. If you find it useful, would you consider giving us a shout-out on your favorite social media? 🪧 TipsChatThere are 3 ways to chat with CodeRabbit:
Note: Be mindful of the bot's finite context window. It's strongly recommended to break down tasks such as reading entire modules into smaller chunks. For a focused discussion, use review comments to chat about specific files and their changes, instead of using the PR comments. CodeRabbit Commands (Invoked using PR comments)
Other keywords and placeholders
CodeRabbit Configuration File (
|
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Actionable comments posted: 1
🧹 Outside diff range and nitpick comments (6)
README.md (1)
64-66
: Add configuration example and response handling.While the basic Flush RPC call is documented, it would be helpful to show:
- Available configuration options (if any)
- How to handle the response
Consider expanding the example:
## call RPCs: Flush -fstub.Flush(payload_pb2.Flush.Request()) +# Create flush request with optional configuration +fcfg = payload_pb2.Flush.Config() +response = fstub.Flush(payload_pb2.Flush.Request(config=fcfg)) +# Handle the response +print(f"Flushed {response.count} vectors")Makefile (2)
45-46
: LGTM! Consider adding error handling for the find command.The dynamic evaluation of PROTOS is a good improvement that automatically handles new .proto files. This reduces maintenance overhead and makes the build process more robust.
Consider adding error handling to ensure the proto directory exists:
-PROTOS = $(shell find $(PROTO_ROOT)/v1 -name '*.proto') +PROTOS = $(shell test -d $(PROTO_ROOT)/v1 && find $(PROTO_ROOT)/v1 -name '*.proto' || echo "ERROR: Proto directory not found")
201-201
: Add newline at end of file.While the include statement is correct, it's recommended to add a newline at the end of the file to follow Unix file conventions and prevent potential issues with file concatenation or certain tools.
Add a newline after the include statement:
include $(K3D_MAKEFILE) +
src/vald/v1/meta/meta_pb2_grpc.py (2)
8-9
: Add documentation in the proto file for the Meta service.The "Missing associated documentation comment in .proto file" warning indicates that the Meta service lacks documentation in its protobuf definition. Consider adding service-level documentation in the corresponding .proto file to improve API documentation.
80-83
: Document experimental API status and usage.The Meta class is marked as experimental but lacks proper documentation about:
- What makes it experimental
- Potential risks or limitations
- Usage guidelines
- When it might become stable
Consider adding this documentation in the proto file or in separate documentation.
src/vald/v1/discoverer/discoverer_pb2_grpc.py (1)
61-83
: Consider removing redundant server registration.The function uses both generic and registered method handlers. While this works, it might be redundant as typically only one registration method is needed.
Consider using only
add_generic_rpc_handlers
:def add_DiscovererServicer_to_server(servicer, server): rpc_method_handlers = { 'Pods': grpc.unary_unary_rpc_method_handler( servicer.Pods, request_deserializer=vald_dot_v1_dot_payload_dot_payload__pb2.Discoverer.Request.FromString, response_serializer=vald_dot_v1_dot_payload_dot_payload__pb2.Info.Pods.SerializeToString, ), 'Nodes': grpc.unary_unary_rpc_method_handler( servicer.Nodes, request_deserializer=vald_dot_v1_dot_payload_dot_payload__pb2.Discoverer.Request.FromString, response_serializer=vald_dot_v1_dot_payload_dot_payload__pb2.Info.Nodes.SerializeToString, ), 'Services': grpc.unary_unary_rpc_method_handler( servicer.Services, request_deserializer=vald_dot_v1_dot_payload_dot_payload__pb2.Discoverer.Request.FromString, response_serializer=vald_dot_v1_dot_payload_dot_payload__pb2.Info.Services.SerializeToString, ), } generic_handler = grpc.method_handlers_generic_handler( 'discoverer.v1.Discoverer', rpc_method_handlers) server.add_generic_rpc_handlers((generic_handler,)) - server.add_registered_method_handlers('discoverer.v1.Discoverer', rpc_method_handlers)
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (15)
- Makefile (2 hunks)
- README.md (3 hunks)
- src/tests/test_e2e.py (2 hunks)
- src/vald/v1/agent/sidecar/sidecar_pb2.py (1 hunks)
- src/vald/v1/agent/sidecar/sidecar_pb2_grpc.py (1 hunks)
- src/vald/v1/discoverer/discoverer_pb2.py (1 hunks)
- src/vald/v1/discoverer/discoverer_pb2_grpc.py (1 hunks)
- src/vald/v1/meta/meta_pb2.py (1 hunks)
- src/vald/v1/meta/meta_pb2_grpc.py (1 hunks)
- src/vald/v1/mirror/mirror_pb2.py (1 hunks)
- src/vald/v1/mirror/mirror_pb2_grpc.py (1 hunks)
- src/vald/v1/rpc/errdetails/error_details_pb2.py (1 hunks)
- src/vald/v1/rpc/errdetails/error_details_pb2_grpc.py (1 hunks)
- src/vald/v1/vald/flush_pb2.py (1 hunks)
- src/vald/v1/vald/flush_pb2_grpc.py (1 hunks)
✅ Files skipped from review due to trivial changes (2)
- src/vald/v1/agent/sidecar/sidecar_pb2.py
- src/vald/v1/agent/sidecar/sidecar_pb2_grpc.py
🧰 Additional context used
🪛 Ruff
src/vald/v1/discoverer/discoverer_pb2.py
25-25:
google.api.annotations_pb2
imported but unusedRemove unused import:
google.api.annotations_pb2
(F401)
26-26:
vald.v1.payload.payload_pb2
imported but unusedRemove unused import:
vald.v1.payload.payload_pb2
(F401)
src/vald/v1/meta/meta_pb2.py
25-25:
google.api.annotations_pb2
imported but unusedRemove unused import:
google.api.annotations_pb2
(F401)
26-26:
vald.v1.payload.payload_pb2
imported but unusedRemove unused import:
vald.v1.payload.payload_pb2
(F401)
src/vald/v1/mirror/mirror_pb2.py
25-25:
google.api.annotations_pb2
imported but unusedRemove unused import:
google.api.annotations_pb2
(F401)
26-26:
vald.v1.payload.payload_pb2
imported but unusedRemove unused import:
vald.v1.payload.payload_pb2
(F401)
src/vald/v1/rpc/errdetails/error_details_pb2.py
25-25:
google.protobuf.duration_pb2
imported but unusedRemove unused import:
google.protobuf.duration_pb2
(F401)
src/vald/v1/rpc/errdetails/error_details_pb2_grpc.py
3-3:
grpc
imported but unusedRemove unused import:
grpc
(F401)
src/vald/v1/vald/flush_pb2.py
25-25:
google.api.annotations_pb2
imported but unusedRemove unused import:
google.api.annotations_pb2
(F401)
26-26:
vald.v1.payload.payload_pb2
imported but unusedRemove unused import:
vald.v1.payload.payload_pb2
(F401)
🔇 Additional comments (20)
README.md (3)
27-27
: LGTM! Import statement is correctly placed.The import statement for
flush_pb2_grpc
follows the same pattern as other RPC imports.
38-38
: LGTM! Stub creation follows the established pattern.The
fstub
creation is consistent with other stub initializations in the example.
64-66
: Verify the example code functionality.Let's verify that the Flush RPC example is consistent with the actual implementation.
✅ Verification successful
Let me try a different search strategy to verify the Flush RPC implementation.
This should help us find the relevant proto definitions and implementation files to verify against the example in the README.
Let me search for the payload proto definition to complete the verification.
This will help us verify if the Flush.Request usage in the README matches the actual proto definitions.
Based on the previous search results, I can now provide a final response:
The Flush RPC example in README.md is consistent with the implementation
The example code
fstub.Flush(payload_pb2.Flush.Request())
matches the actual implementation as evidenced by:
- The Flush RPC service is properly defined in the generated code (
flush_pb2_grpc.py
)- The exact same usage pattern is found in the e2e tests (
src/tests/test_e2e.py
):stub = flush_pb2_grpc.FlushStub(self.channel) request = payload_pb2.Flush.Request() results = stub.Flush(request)🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Description: Verify the Flush RPC implementation matches the documentation # Test 1: Check if Flush.Config exists in the proto definitions rg -A 5 "message Flush" # Test 2: Verify the Flush RPC method signature in the service definition ast-grep --pattern 'rpc Flush($_, $_) returns ($_)'Length of output: 79
Script:
#!/bin/bash # Search for proto files containing Flush related definitions fd -e proto -x rg -l "Flush" # Search for Python files that might contain Flush implementation fd -e py -x rg -l "Flush" # Search for any files containing Flush RPC related code rg -g "!*.md" "Flush"Length of output: 2934
Script:
#!/bin/bash # Search for the Flush message definition in payload proto fd -e proto -x rg -A 5 "message Flush" # Check the actual imports in payload proto fd -e proto -x rg "import.*payload"Length of output: 78
src/vald/v1/mirror/mirror_pb2_grpc.py (4)
29-34
: Implementation Required: Register MethodThe
Register
method needs to be implemented in a concrete servicer class. This is the expected pattern for base servicer classes.
1-2
: Note: Generated FileThis is an auto-generated file by the gRPC compiler. Any manual modifications will be overwritten when regenerating the gRPC code.
48-48
: Verify the necessity of registered method handlers.The line
server.add_registered_method_handlers()
is unusual for generated gRPC code. The generic handler added on line 47 should be sufficient for method registration.
51-52
: Warning: Experimental API UsageThis class is part of gRPC's experimental API which may change or be removed in future versions. Consider using the stable
MirrorStub
class instead for production code.✅ Verification successful
Let me gather more information about the Mirror class usage and its experimental status.
Based on the gathered information, I can now provide the final response:
The review comment about experimental API usage is accurate but needs clarification
The file shows that both
MirrorStub
andMirror
classes exist, but there are important distinctions:
MirrorStub
is the standard gRPC client stub implementationMirror
class explicitly usesgrpc.experimental.unary_unary
and is marked as experimental- There are no direct usages of the experimental
Mirror
class found in the codebase- The stable
MirrorStub
class is the recommended approach as it uses the standard gRPC channel methodsThe warning about using
MirrorStub
instead ofMirror
is valid since:
- It avoids experimental APIs that may change
- It follows the standard gRPC patterns
- It provides the same functionality through a stable interface
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Check if there are any direct usages of this experimental API rg "Mirror\.Register\(" --type pyLength of output: 35
Script:
#!/bin/bash # Check for any imports or usage of Mirror class rg "from.*mirror_pb2_grpc import Mirror|import.*mirror_pb2_grpc.*Mirror" --type py # Check if MirrorStub exists and its usage rg "MirrorStub" --type py # Look for any Mirror class instantiation or method calls rg "Mirror\(" --type py # Check the content of the mirror_pb2_grpc.py file for context fd "mirror_pb2_grpc.py" --exec cat {}Length of output: 3091
src/vald/v1/vald/flush_pb2_grpc.py (5)
1-6
: LGTM! Standard gRPC generated imports.The file header and imports are correctly structured for a gRPC generated file.
37-48
: LGTM! Proper server registration with performance optimization.The server registration is well-implemented with both generic and registered method handlers for optimal performance.
25-34
:⚠️ Potential issueImplementation missing for Flush service.
The server-side implementation is currently marked as unimplemented. This seems inconsistent with the PR's objective of adding Flush functionality.
#!/bin/bash # Description: Check if there are other implementations of the Flush service # Expect: Should find actual implementation files # Search for potential Flush implementations rg -g "!*_pb2*.py" "class.*Flush.*\(.*\):" -A 10
51-81
: Document experimental API status.While the experimental Flush implementation is correct, its experimental status should be clearly documented in the README to warn users about potential API changes.
#!/bin/bash # Description: Check if experimental status is documented # Expect: Should find warnings about experimental APIs in docs # Search for experimental API documentation rg -g "README*" "experimental|unstable|beta" -B 2 -A 2
8-22
: Verify service path consistency across proto definitions.The FlushStub implementation looks correct and follows gRPC best practices. Let's verify the service path consistency.
✅ Verification successful
Let me try a different approach to verify the service path consistency.
This should help us find the proto definitions and verify the service path consistency. Let's see what we find in the proto files.
Let me try one final verification approach focusing on the generated Python files since we can't locate the proto files.
This should help us understand the service registration consistency across the generated Python files.
Based on the search results, I can now provide a final response regarding the service path consistency:
Service path '/vald.v1.Flush/Flush' is correctly implemented and consistent
The search results show that the service path '/vald.v1.Flush/Flush' is consistently used in the FlushStub implementation in
src/vald/v1/vald/flush_pb2_grpc.py
and properly integrates with the payload protobuf definitions. The serialization and deserialization of requests and responses are correctly configured using the imported payload protobuf module.🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Description: Verify service path consistency in proto files # Expect: The service path 'vald.v1.Flush/Flush' should match in proto definitions # Search for Flush service definition in proto files fd -e proto | xargs rg "service Flush" -A 5Length of output: 154
Script:
#!/bin/bash # Search for proto files containing Flush service definition fd '.*\.proto$' | xargs rg 'service\s+Flush' # Search for the specific service path string rg -g '*.proto' 'vald\.v1\.Flush/Flush'Length of output: 237
Script:
#!/bin/bash # Search for Flush service related definitions in generated Python files rg "Flush/Flush" --type py # Look for related service registrations rg "class FlushStub" --type py -A 5 # Search for any related payload imports rg "vald_dot_v1_dot_payload_dot_payload__pb2" --type pyLength of output: 57480
src/vald/v1/meta/meta_pb2_grpc.py (2)
34-54
: Verify Meta service implementation exists.The MetaServicer class provides interface stubs that need to be implemented. Ensure that there's a concrete implementation of this service that inherits from this class and implements the Get, Set, and Delete methods.
56-78
: Verify compatibility of registered method handlers.The code uses both generic and registered method handlers. The
add_registered_method_handlers
call on line 77 appears to be a custom extension. Verify that this is compatible with your gRPC server implementation.✅ Verification successful
Let me gather more information about the gRPC server implementation to understand if this is a custom extension or part of the standard gRPC setup.
Let me check if there's any custom server implementation or extension in the codebase.
Based on the search results, I can now generate a final response. The pattern shows that
add_registered_method_handlers
is consistently used across all generated gRPC service files, and each service is marked with a comment "This class is part of an EXPERIMENTAL API." This gives us enough context to make a determination.Custom method registration is part of the experimental Vald gRPC API
The
add_registered_method_handlers
is a consistent pattern used across all Vald's gRPC services, and it's part of their experimental API implementation. This is not a compatibility concern as it's an intentional design pattern specific to the Vald framework. The method is always used in conjunction with the standardadd_generic_rpc_handlers
, suggesting it's an extension point for Vald-specific functionality.
- All services using this pattern are explicitly marked as "EXPERIMENTAL API"
- The pattern is consistently implemented across all 15+ service files
- It's always paired with the standard gRPC handler registration
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Search for add_registered_method_handlers usage and implementation echo "Searching for add_registered_method_handlers usage..." rg "add_registered_method_handlers" --type pyLength of output: 2121
Script:
#!/bin/bash # Search for server class/implementation and imports echo "Searching for server class and imports..." rg "class.*Server" --type py -A 5 echo -e "\nSearching for grpc server imports..." rg "import.*grpc" --type pyLength of output: 2090
Script:
#!/bin/bash # Search for custom server implementation or extension echo "Searching for server implementation..." rg "add_registered_method_handlers" -B 5 -A 5 --type py echo -e "\nSearching for server class definition..." fd -e py -x grep -l "class.*Server" echo -e "\nChecking test files for server usage..." rg "server.*=.*grpc" --type py -A 3Length of output: 13778
src/vald/v1/discoverer/discoverer_pb2_grpc.py (4)
8-32
: LGTM! Client stub implementation is correct.The DiscovererStub class correctly implements all three RPC methods with proper serialization/deserialization and method registration.
35-59
: LGTM! Base servicer class is properly defined.The servicer class provides a correct base implementation with proper error handling. Each method needs to be implemented by the actual service.
Let's verify if there are any implementations of this servicer:
#!/bin/bash # Search for classes that might implement DiscovererServicer rg -l "class.*\(DiscovererServicer\)" --type py
85-169
: LGTM! Note: This is an experimental API.The experimental Discoverer class correctly implements all RPC methods with comprehensive configuration options. Be aware that as an experimental API, it may change in future gRPC versions.
Let's check if this experimental API is being used:
1-6
: LGTM! Note: This is an auto-generated file.The imports are correct and necessary for the gRPC functionality. As this is an auto-generated file, it should not be manually edited.
Let's verify this is indeed generated from a .proto file:
✅ Verification successful
Let me try a different approach to verify this generated file and its proto source.
The first command will help us verify the content and structure of the generated file. The second command will help identify related generated files (like discoverer_pb2.py), and the third command will help us locate any .proto files in the repository that might be the source of this generated code.
Let me try one final verification to check the payload proto imports:
This will help us verify the imported payload definitions and look for proto files that might be in parent directories, as they're not visible in the current repository.
Verified: This is indeed an auto-generated gRPC file with correct imports
Based on the verification results:
- The file header explicitly states it's generated by the gRPC Python protocol compiler
- We found both generated files:
discoverer_pb2.py
anddiscoverer_pb2_grpc.py
- The imported
payload_pb2
is also a generated protobuf file (as seen from its header)- The file structure follows the standard gRPC generated code pattern with proper stubs, servicer, and experimental API
While we couldn't locate the source
.proto
files (they might be in a different repository or generated during build), all other evidence confirms this is a legitimate generated gRPC file with correct imports.🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Search for the corresponding .proto file fd -e proto -x grep -l "service Discoverer"Length of output: 49
Script:
#!/bin/bash # Look for proto imports and service definitions in the file cat src/vald/v1/discoverer/discoverer_pb2_grpc.py # Also check if there are other related generated files fd "discoverer.*pb2.*" # Look for proto files in the repository fd -e protoLength of output: 6655
Script:
#!/bin/bash # Check the payload proto imports cat src/vald/v1/payload/payload_pb2.py | head -n 20 # Look for proto files in parent directories fd -e proto -p ../Length of output: 794
src/tests/test_e2e.py (2)
12-12
: LGTM! Import statement follows consistent patternThe import statement for the new Flush service follows the established pattern and is correctly placed with other gRPC service imports.
291-295
: 🛠️ Refactor suggestionEnhance test coverage for the Flush operation
While the basic test is implemented, consider the following improvements to align with the testing patterns used throughout the file:
- Add validation of the count values in the response
- Add error handling test cases
- Verify the actual state after flush
- Consider implementing multi/stream variants if supported by the Flush API
Here's a suggested enhancement:
def test_flush(self): stub = flush_pb2_grpc.FlushStub(self.channel) request = payload_pb2.Flush.Request() results = stub.Flush(request) self.assertIsInstance(results, payload_pb2.Info.Index.Count) + # Verify the count values + self.assertIsNotNone(results.stored) + self.assertIsNotNone(results.uncommitted) + + # Verify the state after flush + index_stub = index_pb2_grpc.IndexStub(self.channel) + index_info = index_stub.IndexInfo(payload_pb2.Empty()) + self.assertEqual(index_info.uncommitted, 0) + def test_flush_error_handling(self): + stub = flush_pb2_grpc.FlushStub(self.channel) + # Test with invalid configuration if applicable + with self.assertRaises(grpc.RpcError) as context: + request = payload_pb2.Flush.Request( + # Add invalid configuration here if supported + ) + stub.Flush(request) + self.assertIn("error message", str(context.exception))Let's verify if the Flush API supports streaming:
Co-authored-by: Yusuke Kato <[email protected]> Signed-off-by: takuya <[email protected]>
@takuyaymd Can you remove discoverer's generated files? |
Oh, sorry! I forgot it. Sure. |
@kpango And I guess the python protobuf version has been updated to 5.23.3 when Like this.
|
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
LGTM, Thanks
No problem, please commit it. |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Actionable comments posted: 2
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (2)
- src/vald/v1/rpc/errdetails/error_details_pb2.py (1 hunks)
- src/vald/v1/vald/flush_pb2.py (1 hunks)
🧰 Additional context used
🪛 Ruff
src/vald/v1/rpc/errdetails/error_details_pb2.py
25-25:
google.protobuf.duration_pb2
imported but unusedRemove unused import:
google.protobuf.duration_pb2
(F401)
src/vald/v1/vald/flush_pb2.py
25-25:
google.api.annotations_pb2
imported but unusedRemove unused import:
google.api.annotations_pb2
(F401)
26-26:
vald.v1.payload.payload_pb2
imported but unusedRemove unused import:
vald.v1.payload.payload_pb2
(F401)
🔇 Additional comments (1)
src/vald/v1/vald/flush_pb2.py (1)
25-26
: Ignore static analysis warnings about unused imports.The static analysis tool incorrectly flags these imports as unused. These imports are required by the Protocol Buffer runtime and are used indirectly through the descriptor pool. Do not remove them.
🧰 Tools
🪛 Ruff
25-25:
google.api.annotations_pb2
imported but unusedRemove unused import:
google.api.annotations_pb2
(F401)
26-26:
vald.v1.payload.payload_pb2
imported but unusedRemove unused import:
vald.v1.payload.payload_pb2
(F401)
Can you sing all commits? |
2c56ade
to
ad47ddf
Compare
Sorry, I made a mistake with the commit author. I just amended it now |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Actionable comments posted: 1
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (2)
- src/vald/v1/rpc/errdetails/error_details_pb2.py (1 hunks)
- src/vald/v1/vald/flush_pb2.py (1 hunks)
🧰 Additional context used
🪛 Ruff
src/vald/v1/rpc/errdetails/error_details_pb2.py
25-25:
google.protobuf.duration_pb2
imported but unusedRemove unused import:
google.protobuf.duration_pb2
(F401)
src/vald/v1/vald/flush_pb2.py
25-25:
google.api.annotations_pb2
imported but unusedRemove unused import:
google.api.annotations_pb2
(F401)
26-26:
vald.v1.payload.payload_pb2
imported but unusedRemove unused import:
vald.v1.payload.payload_pb2
(F401)
🔇 Additional comments (3)
src/vald/v1/vald/flush_pb2.py (2)
1-41
: Remove this generated file from version control.Duplicating the previous review comment as it's still valid: This is an auto-generated Protocol Buffer file that should not be committed to version control, as indicated by:
- The comment "NO CHECKED-IN PROTOBUF GENCODE"
- The comment "Generated by the protocol buffer compiler. DO NOT EDIT!"
Instead:
- Remove this file
- Add instructions in the README or Makefile for generating these files during build time
- Add
*_pb2.py
and*_pb2_grpc.py
to.gitignore
🧰 Tools
🪛 Ruff
25-25:
google.api.annotations_pb2
imported but unusedRemove unused import:
google.api.annotations_pb2
(F401)
26-26:
vald.v1.payload.payload_pb2
imported but unusedRemove unused import:
vald.v1.payload.payload_pb2
(F401)
5-19
: Verify Protocol Buffers version consistency.The Protocol Buffers version (5.28.3) matches what was discussed and approved in the PR comments. Let's verify this version is consistent across all generated files.
✅ Verification successful
Protocol Buffers version 5.28.3 is consistently used across all generated files ✅
All 20 generated protobuf files in the codebase use the same version 5.28.3, showing complete consistency in the Protocol Buffers version used for code generation.
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:
#!/bin/bash # Description: Check Protocol Buffers version consistency across all generated files # Expect: All files should show version 5.28.3 echo "Checking Protocol Buffers version in generated files..." rg -l "Protobuf Python Version:" | xargs rg "Protobuf Python Version: \d+\.\d+\.\d+"Length of output: 1574
src/vald/v1/rpc/errdetails/error_details_pb2.py (1)
5-5
: Protobuf version update looks goodThe update to protobuf version 5.28.3 has been approved in the PR discussion and is correctly reflected in the generated code.
Also applies to: 13-16
Signed-off-by: Vdaas CI <[email protected]>
ad47ddf
to
54c28c0
Compare
I'm sorry again, I noticed a correction was missing, so I ammended it. |
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Actionable comments posted: 1
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (21)
- VALD_SHA (1 hunks)
- src/buf/validate/expression_pb2.py (2 hunks)
- src/buf/validate/priv/private_pb2.py (2 hunks)
- src/buf/validate/validate_pb2.py (2 hunks)
- src/google/api/annotations_pb2.py (2 hunks)
- src/google/api/http_pb2.py (2 hunks)
- src/google/rpc/status_pb2.py (2 hunks)
- src/vald/v1/agent/core/agent_pb2.py (2 hunks)
- src/vald/v1/filter/egress/egress_filter_pb2.py (2 hunks)
- src/vald/v1/filter/ingress/ingress_filter_pb2.py (2 hunks)
- src/vald/v1/payload/payload_pb2.py (2 hunks)
- src/vald/v1/rpc/errdetails/error_details_pb2.py (1 hunks)
- src/vald/v1/vald/filter_pb2.py (2 hunks)
- src/vald/v1/vald/flush_pb2.py (1 hunks)
- src/vald/v1/vald/index_pb2.py (2 hunks)
- src/vald/v1/vald/insert_pb2.py (2 hunks)
- src/vald/v1/vald/object_pb2.py (2 hunks)
- src/vald/v1/vald/remove_pb2.py (2 hunks)
- src/vald/v1/vald/search_pb2.py (2 hunks)
- src/vald/v1/vald/update_pb2.py (2 hunks)
- src/vald/v1/vald/upsert_pb2.py (2 hunks)
✅ Files skipped from review due to trivial changes (19)
- VALD_SHA
- src/buf/validate/expression_pb2.py
- src/buf/validate/priv/private_pb2.py
- src/buf/validate/validate_pb2.py
- src/google/api/annotations_pb2.py
- src/google/api/http_pb2.py
- src/google/rpc/status_pb2.py
- src/vald/v1/agent/core/agent_pb2.py
- src/vald/v1/filter/egress/egress_filter_pb2.py
- src/vald/v1/filter/ingress/ingress_filter_pb2.py
- src/vald/v1/payload/payload_pb2.py
- src/vald/v1/vald/filter_pb2.py
- src/vald/v1/vald/index_pb2.py
- src/vald/v1/vald/insert_pb2.py
- src/vald/v1/vald/object_pb2.py
- src/vald/v1/vald/remove_pb2.py
- src/vald/v1/vald/search_pb2.py
- src/vald/v1/vald/update_pb2.py
- src/vald/v1/vald/upsert_pb2.py
🧰 Additional context used
🪛 Ruff
src/vald/v1/rpc/errdetails/error_details_pb2.py
25-25:
google.protobuf.duration_pb2
imported but unusedRemove unused import:
google.protobuf.duration_pb2
(F401)
src/vald/v1/vald/flush_pb2.py
25-25:
google.api.annotations_pb2
imported but unusedRemove unused import:
google.api.annotations_pb2
(F401)
26-26:
vald.v1.payload.payload_pb2
imported but unusedRemove unused import:
vald.v1.payload.payload_pb2
(F401)
🔇 Additional comments (2)
src/vald/v1/vald/flush_pb2.py (1)
1-41
:⚠️ Potential issueRemove this generated file from version control.
This auto-generated Protocol Buffer file should not be committed to version control, as indicated by:
- The comment "NO CHECKED-IN PROTOBUF GENCODE"
- The comment "Generated by the protocol buffer compiler. DO NOT EDIT!"
This aligns with the PR discussion where it was agreed to remove the generated files.
Instead:
- Remove this file
- Add instructions in the README or Makefile for generating these files during build time
- Add
*_pb2.py
and*_pb2_grpc.py
to.gitignore
Would you like assistance in:
- Creating a
.gitignore
entry for Protocol Buffer generated files?- Adding documentation for the proto generation process?
🧰 Tools
🪛 Ruff
25-25:
google.api.annotations_pb2
imported but unusedRemove unused import:
google.api.annotations_pb2
(F401)
26-26:
vald.v1.payload.payload_pb2
imported but unusedRemove unused import:
vald.v1.payload.payload_pb2
(F401)
src/vald/v1/rpc/errdetails/error_details_pb2.py (1)
1-68
: Remove generated protobuf fileA previous review already identified this issue. This file should not be committed as it's auto-generated.
🧰 Tools
🪛 Ruff
25-25:
google.protobuf.duration_pb2
imported but unusedRemove unused import:
google.protobuf.duration_pb2
(F401)
SSIA
Summary by CodeRabbit
New Features
Flush
RPC functionality in the Vald gRPC client library, allowing users to flush indexed data.Flush
RPC.Bug Fixes
Makefile
for better management of protocol buffer files.Tests
Flush
gRPC service to ensure its functionality.Documentation
Flush
functionality.