
# ComfyUI-UltraPixel (WIP)

ComfyUI node for UltraPixel


ComfyUI-UltraPixel was built on the original code from https://github.com/catcathh/UltraPixel. I intend to eventually rewrite ComfyUI-UltraPixel with much better integration into ComfyUI's native code, as it is currently essentially a 'modified wrapper' around the original UltraPixel code. This might take some time, as some rather important life events have just happened and I need to dedicate my time to them accordingly; I hope you all understand.


Now works (as of 7/17) with 10GB/12GB/16GB GPUs:

- 10GB GPUs work up to (about) 2048x2048 (for text2image and controlnet)
- 12GB GPUs work up to (about) 3072x3072 (for text2image and controlnet)
- 16GB GPUs work up to (about) 4096x4096 (for text2image) and 3840x4096 (for controlnet)
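The tiers above can be summarized in a small lookup helper. This is purely an illustrative sketch of the table, not part of the node's actual code; the function and dictionary names are made up for this example.

```python
# Illustrative summary of the approximate VRAM tiers listed above.
# Not part of ComfyUI-UltraPixel; names here are hypothetical.
APPROX_MAX_RES = {
    10: {"text2image": (2048, 2048), "controlnet": (2048, 2048)},
    12: {"text2image": (3072, 3072), "controlnet": (3072, 3072)},
    16: {"text2image": (4096, 4096), "controlnet": (3840, 4096)},
}

def approx_max_resolution(vram_gb: int, mode: str = "text2image"):
    """Return the approximate max (width, height) for a given VRAM size, or None."""
    for tier in sorted(APPROX_MAX_RES, reverse=True):
        if vram_gb >= tier:
            return APPROX_MAX_RES[tier][mode]
    return None  # below the smallest tier reported above
```

For example, `approx_max_resolution(16, "controlnet")` returns `(3840, 4096)`, matching the last bullet. These are rough ceilings ("about"), so treat them as starting points rather than hard limits.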

Install by git cloning this repo into your ComfyUI custom_nodes directory:

```shell
cd ComfyUI/custom_nodes
git clone https://github.com/2kpr/ComfyUI-UltraPixel
```

Install the requirements from within your conda/venv:

```shell
pip install -r requirements.txt
```

Load one of the provided workflow json files in ComfyUI and hit 'Queue Prompt'.

When the workflow first runs, the first node will download all the necessary files into a ComfyUI/models/ultrapixel directory.
(Make sure to update, as there was an issue with downloading stage_b_lite_bf16.safetensors, which was fixed here.)

To enable ControlNet usage, simply use the Load Image node in ComfyUI and connect it to the controlnet_image input on the UltraPixel Process node. You can also attach a Preview/Save Image node to the edge_preview output of the UltraPixel Process node to see the ControlNet edge preview. The easiest approach is to load the included workflow_controlnet.json file in ComfyUI.

As mentioned above, the default directory for the downloaded UltraPixel and StableCascade model files is ComfyUI/models/ultrapixel. If you want to change this, set ultrapixel_directory or stablecascade_directory in the UltraPixel Load node from 'default' to the full path/directory you desire.

Example Output for prompt: "A close-up portrait of a young woman with flawless skin, vibrant red lipstick, and wavy brown hair, wearing a vintage floral dress and standing in front of a blooming garden."


Example Output for prompt: "A highly detailed, high-quality image of the Banff National Park in Canada. The turquoise waters of Lake Louise are surrounded by snow-capped mountains and dense pine forests. A wooden canoe is docked at the edge of the lake. The sky is a clear, bright blue, and the air is crisp and fresh."


Example ControlNet Output for prompt: "A close-up portrait of a young woman with flawless skin, vibrant red lipstick, and wavy brown hair, wearing a vintage floral dress and standing in front of a blooming garden, waving"


Example ControlNet Output for prompt: "A close-up portrait of a young woman with a blonde bobcut, wearing a beautiful blue dress and giving a thumbs up"

Credits:

All thanks to the team that made UltraPixel:
https://jingjingrenabc.github.io/ultrapixel/
https://arxiv.org/abs/2407.02158
https://github.com/catcathh/UltraPixel