```
-- Installing: /wrkdirs/usr/ports/devel/tabby/work/target/release/build/llama-cpp-server-8837603d1835d022/out/bin/llama-tokenize
cargo:root=/wrkdirs/usr/ports/devel/tabby/work/target/release/build/llama-cpp-server-8837603d1835d022/out

--- stderr
CMake Warning at cmake/build-info.cmake:14 (message):
  Git not found. Build info will not be accurate.
Call Stack (most recent call first):
  CMakeLists.txt:77 (include)

CMake Warning at ggml/src/CMakeLists.txt:274 (message):
  AMX requires gcc version > 11.0. Turning off GGML_AMX.

CMake Warning at common/CMakeLists.txt:30 (message):
  Git repository not found; to enable automatic generation of build info,
  make sure Git is installed and the project is a Git repository.

CMake Warning:
  Manually-specified variables were not used by the project:

    CMAKE_ASM_COMPILER
    CMAKE_ASM_FLAGS

thread 'main' panicked at crates/llama-cpp-server/build.rs:66:36:
Failed to copy server binary to output directory: No such file or directory (os error 2)
```
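The "os error 2" in the panic means the *source* path handed to the copy does not exist, i.e. the CMake step never produced the server binary where build.rs expects it. A minimal sketch of the failure mode (the paths here are hypothetical stand-ins, not the ones build.rs actually computes) shows how including the path in the error would make this panic much easier to diagnose:

```rust
use std::fs;
use std::io::ErrorKind;
use std::path::Path;

fn main() {
    // Hypothetical paths standing in for the ones build.rs derives
    // from the CMake output directory.
    let src = Path::new("/nonexistent/llama-server");
    let dst = Path::new("/tmp/llama-server-copy");

    match fs::copy(src, dst) {
        Ok(_) => println!("copied server binary"),
        // A missing source binary surfaces as NotFound, which is
        // exactly the "No such file or directory (os error 2)" panic.
        Err(e) if e.kind() == ErrorKind::NotFound => {
            eprintln!("server binary not found at {}: {}", src.display(), e);
        }
        Err(e) => panic!("Failed to copy server binary to output directory: {}", e),
    }
}
```

Reporting `src.display()` in the message would tell a porter immediately which path the FreeBSD build failed to produce.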
Information about your version
0.20.0
Additional context
FreeBSD 14.1
Also: a llama-cpp package is already available, so there should be no need to bundle it in tabby. Is it possible to use the external llama-cpp package instead?
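One way the build could honor a system-installed llama.cpp, sketched here with `command -v` (this is only an illustration; tabby does not currently document any option or variable for this):

```shell
#!/bin/sh
# Hypothetical check: prefer a system-installed llama-server if one
# exists on PATH, otherwise fall back to building the bundled copy.
if command -v llama-server >/dev/null 2>&1; then
    echo "using system llama-server: $(command -v llama-server)"
else
    echo "no system llama-server found; building the bundled llama.cpp"
fi
```

On FreeBSD the binary would come from the existing llama-cpp port, so the bundled CMake build (and this whole failure) could be skipped entirely.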