Related Issues
With llama-cpp-python==0.3.0, integration tests were running for > 4 hours. In #1111, I pinned the package to a previous version (llama-cpp-python<0.3.0). With llama-cpp-python==0.3.1, I spent some time debugging the problem.
Proposed Changes:
Set n_ctx to 512 (the default in llama-cpp-python, while the default in our integration was 0 = inferred from the model).
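The change can be sketched as a kwargs-merging helper. This is a hypothetical illustration, not the integration's actual code: the helper name `build_model_kwargs` is invented, and only the `n_ctx` default (512, matching llama-cpp-python's own default, instead of the previous 0 = inferred from the model) reflects the change described above.

```python
def build_model_kwargs(user_kwargs=None):
    # Hypothetical sketch: merge user-supplied model kwargs with defaults.
    # n_ctx=512 matches llama-cpp-python's default; the integration
    # previously passed n_ctx=0, meaning "infer the context size from
    # the model", which triggered the slow behavior with >=0.3.0.
    defaults = {"n_ctx": 512}
    kwargs = dict(defaults)
    kwargs.update(user_kwargs or {})
    return kwargs


# A caller can still override the default explicitly:
kwargs = build_model_kwargs({"n_ctx": 2048})
```

Users who relied on the old behavior can still pass `n_ctx=0` explicitly to infer the context size from the model.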
How did you test it?
CI
Notes for the reviewer
I think in the future we will rework function calling with llama-cpp-python and remove the test with this model (functionary-small-v2.4), which is becoming obsolete. In the meantime, I prefer to try to stay compatible with the latest versions of this library.
Checklist
fix:, feat:, build:, chore:, ci:, docs:, style:, refactor:, perf:, test: