fix(llama.cpp): embed metal file into result binary for darwin #4279
Conversation
...and if ggml-metal.metal is put back, we get the same error:

LocalAI/backend/cpp/llama/Makefile, line 35 (at 5f688d7)

Removing this line fixes the whole thing.
Signed-off-by: Ettore Di Giacinto <[email protected]>
Force-pushed from f5f73d1 to 1d391cd
argh right, good catch! I forgot we explicitly disabled it in the backend Makefile. Thanks for testing
Can confirm that this works and we're using the embedded metal lib
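One way to sanity-check a build like this is to scan the produced binary for Metal kernel source, since an embedded shader library means the kernel text is baked into the executable rather than loaded from ggml-metal.metal at runtime. The sketch below is not part of this PR; the marker string `kernel void` is an assumption (it appears in Metal kernel definitions, but the exact embedded form can differ between llama.cpp versions), and the helper name is hypothetical:

```python
# Hypothetical helper: check whether a built binary appears to embed
# the ggml Metal shader source by searching its raw bytes for a
# distinctive Metal kernel marker. "kernel void" is an assumption about
# what the embedded shader text contains.
def has_embedded_metal(binary_path: str, marker: bytes = b"kernel void") -> bool:
    with open(binary_path, "rb") as f:
        return marker in f.read()

# Usage (path is hypothetical):
# print(has_embedded_metal("backend/cpp/llama/grpc-server"))
```

If this returns False while Metal is enabled, the binary is likely still expecting to find ggml-metal.metal on disk, which matches the failure mode discussed above.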
Description
This PR fixes #4274
Notes for Reviewers
Signed commits