
VRAM required? #15

Open
Koorvux opened this issue Jan 22, 2024 · 2 comments

Koorvux commented Jan 22, 2024

No description provided.

@mp3pintyo

I'm trying the basic workflow now: it uses 15 GB of VRAM under Windows 11 :(


WASasquatch commented Feb 1, 2024

Interestingly, when I load the model, it maxes out both RAM and VRAM (32 GB RAM and 24 GB VRAM), and the system starts glitching badly while the console spits out lots of checkpoint missing-key errors, invalid selections (so it falls back to selecting all), etc.

After that it calms down and memory use during inference is much lower, but generation is super slow.
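Since the thread is comparing per-machine VRAM numbers, a minimal sketch for measuring usage on an NVIDIA GPU follows, assuming `nvidia-smi` is on the PATH (the query flags shown are standard; the `parse_memory_used` helper is just illustrative):

```python
import subprocess


def parse_memory_used(csv_output: str) -> list[int]:
    """Parse nvidia-smi CSV output ('memory.used [MiB]' header,
    then one 'NNNN MiB' row per GPU) into a list of MiB integers."""
    lines = [ln.strip() for ln in csv_output.strip().splitlines()]
    # Skip the header row; take the leading number from each data row.
    return [int(ln.split()[0]) for ln in lines[1:]]


def query_vram_used() -> list[int]:
    """Return current per-GPU VRAM usage in MiB (requires nvidia-smi)."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_memory_used(out)
```

Running `query_vram_used()` before and after loading the model gives a rough idea of how much VRAM the checkpoint itself takes, which is what the numbers reported above are describing.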
