
Add support for higher vram GPUs (24GB+) #267

Open
aperire opened this issue Dec 16, 2024 · 2 comments

Comments


aperire commented Dec 16, 2024

Hi, I understand that fluxgym was designed to let low-VRAM cards train LoRAs. However, because of its simple UI, there are many users with 4090s, or users waiting for 5090s (36GB VRAM), who would like to use Fluxgym.

Support for higher-VRAM cards would be a great addition, and it would not sacrifice any convenience for other users.

@vivek-kumar-poddar

Need this too. Please add 24, 36, and 48 GB options, or a field to enter a custom GPU memory size. This is much needed when using platforms like Vultr or RunPod. Thanks!
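A minimal sketch of what such a field could look like in a Gradio UI (fluxgym is built on Gradio). The preset values, the `resolve_vram` helper, and the 24 GB threshold are illustrative assumptions, not fluxgym's actual code:

```python
# Sketch only: a VRAM preset dropdown plus a custom-size field.
# Preset list, threshold, and wiring are assumptions for illustration.
import gradio as gr

VRAM_PRESETS = ["12G", "16G", "20G", "24G", "48G", "Custom"]

def resolve_vram(preset, custom_gb):
    # Use the numeric field only when "Custom" is selected.
    gb = int(custom_gb) if preset == "Custom" else int(preset.rstrip("G"))
    # Hypothetical mapping: disable low-VRAM optimizations (e.g. block
    # swapping) once the card has 24 GB or more.
    mode = "off" if gb >= 24 else "on"
    return f"Selected {gb} GB: low-VRAM optimizations {mode}"

with gr.Blocks() as demo:
    preset = gr.Dropdown(VRAM_PRESETS, value="20G", label="VRAM")
    custom = gr.Number(value=24, label="Custom VRAM (GB)", precision=0)
    out = gr.Textbox(label="Result")
    gr.Button("Apply").click(resolve_vram, [preset, custom], out)

demo.launch()
```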


K-O-N-B commented Dec 27, 2024

Yes, 4090 owners are waiting for this.
