
fix: use playground image based on llama-cpp-python #140

Merged: jeffmaury merged 2 commits into containers:main from the GH-139 branch on Jan 26, 2024

Conversation

jeffmaury (Contributor)

Fixes #139

feloy (Contributor) commented on Jan 25, 2024

I get an unauthorized access when I try to pull the image quay.io/bootsy/playground:v0

jeffmaury (Contributor, Author)

> I get an unauthorized access when I try to pull the image quay.io/bootsy/playground:v0

Should be OK now.

feloy (Contributor) commented on Jan 25, 2024

It seems that we get only the beginning of the response from the model

[screenshot: model-begin]

jeffmaury (Contributor, Author)

Need to verify with a curl request
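
For reference, such a check could look like the sketch below — a minimal example assuming llama-cpp-python's OpenAI-compatible server is reachable on localhost:8000 (the host/port and the prompt are placeholders, not necessarily what the extension actually uses):

```bash
# Minimal sanity check against llama-cpp-python's OpenAI-compatible chat endpoint.
# Assumption: the playground container publishes the server on localhost:8000;
# adjust the host/port to whatever the container actually maps.
curl -s http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "What is the capital of France?"}]}'
```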

feloy (Contributor) commented on Jan 25, 2024

The reply does not contain anything indicating that more is to follow.

Is this an internal problem in the container, or are we sending the right request?

[screenshot: reply]

feloy (Contributor) commented on Jan 26, 2024

Yes, that works for me now.
The container takes more time to start now; I have created a related issue: #147

lstocchi (Contributor) left a comment


Super slow on Windows, and the answer is not really the one I expected, but it works 😅

[screenshot]

llama-cpp-python uses 24 as a default for max_tokens
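
Setting max_tokens explicitly in the request would avoid that truncation. A rough sketch of such a request follows (localhost:8000 and the value 256 are illustrative assumptions, not what the extension currently sends):

```bash
# Same kind of request, but with max_tokens set explicitly so the reply is not
# cut off after the default number of tokens. 256 is an arbitrary example value.
curl -s http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "What is the capital of France?"}],
       "max_tokens": 256}'
```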

Signed-off-by: Jeff MAURY <[email protected]>
jeffmaury merged commit 9dd52d4 into containers:main on Jan 26, 2024
3 checks passed
jeffmaury deleted the GH-139 branch on January 26, 2024 at 09:44
mhdawson pushed a commit to mhdawson/podman-desktop-extension-ai-lab that referenced this pull request on Nov 22, 2024: "Ability to trigger the testing-framework workflow manually"
Successfully merging this pull request may close these issues.

Replace localAI by llama-cpp-python
3 participants