Should we add a ngram cache API to llama.cpp? #4235

ggerganov started this conversation in Ideas
Replies: 2 comments 2 replies

Comment (2 replies)
- Reply: ggerganov (Maintainer, Author), Nov 27, 2023
- Reply: @cmp-nct

Comment (0 replies)
Category: Ideas
Labels: question (Further information is requested)
4 participants