Merge pull request #188 from janhq/update-submodule-2024-08-13-17-07
Update llama.cpp submodule to latest release b3580
Showing 1 changed file with 1 addition and 1 deletion.
Submodule llama.cpp updated (16 files):
| Changes | File |
| --- | --- |
| +3 −1 | .github/workflows/bench.yml |
| +10 −12 | .github/workflows/build.yml |
| +2 −4 | .github/workflows/python-check-requirements.yml |
| +2 −0 | README.md |
| +3 −0 | common/grammar-parser.cpp |
| +106 −39 | docs/backend/SYCL.md |
| +3 −3 | examples/export-lora/README.md |
| +26 −9 | examples/export-lora/export-lora.cpp |
| +1 −1 | examples/llava/requirements.txt |
| +3 −2 | examples/server/server.cpp |
| +3 −3 | flake.lock |
| +1 −1 | ggml/src/ggml.c |
| +978 −3 | gguf-py/gguf/quants.py |
| +237 −0 | gguf-py/tests/test_quants.py |
| +2 −2 | src/llama-sampling.cpp |
| +2 −7 | src/llama.cpp |