fix: Improve error message when validating IO config #353
Some user models can have a large number of model inputs/outputs. Without this change, the model tensor name is not part of the error message. This change improves the server's error logging, giving the user better debugging ability.
Also included is a fix that correctly reports a model input as an input rather than as an output.
The tests are updated to expect this new behavior here: triton-inference-server/server#7340
For a model configuration whose input is missing a `data_type`, the error message changes as follows:
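The original configuration is not reproduced here; a minimal sketch of a `config.pbtxt` that would trigger this validation error (tensor names and dims assumed from the log messages below) might look like:

```protobuf
name: "missing_datatype"
input [
  {
    name: "input"
    # data_type intentionally omitted to trigger the validation error
    dims: [ 1 ]
  }
]
output [
  {
    name: "output"
    data_type: TYPE_FP32
    dims: [ 1 ]
  }
]
```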
Before
E0611 19:35:52.422993 138 model_repository_manager.cc:1371] "Poll failed for model directory 'missing_datatype': model output must specify 'data_type' for missing_datatype"
After
E0611 19:31:47.991606 9486 model_repository_manager.cc:1371] "Poll failed for model directory 'missing_datatype': model input 'input' must specify 'data_type' for missing_datatype"