Ziggurat@fedia.io to AI Generated Images@sh.itjust.works · 2 days ago
Ladies and gentleman, this is your catpain
Naz@sh.itjust.works · 10 hours ago
You need a lot of VRAM and a large visual model for higher complexity. Lower VRAM means you run models that only do one thing consistently/well. See: FLUX
bradd@lemmy.world · 9 hours ago
I have 2× 24 GB 3090s, but IIRC ComfyUI doesn’t support multiple GPUs. Does that seem too low?
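Since the thread turns on how much VRAM is actually available per card, a quick way to check is `nvidia-smi --query-gpu=memory.total --format=csv,noheader,nounits`, which prints one total (in MiB) per GPU. A minimal Python sketch that wraps that call (the function names here are illustrative, not from any library):

```python
import subprocess

def parse_vram_mib(csv_output: str) -> list[int]:
    """Parse nvidia-smi's noheader/nounits CSV output into per-GPU MiB totals."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]

def query_vram_mib() -> list[int]:
    """Ask nvidia-smi for each GPU's total memory (requires an NVIDIA driver)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.total", "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_vram_mib(out)

# Two 3090s would show up as two ~24 GiB entries, one line each, e.g.:
sample = "24576\n24576\n"
print(parse_vram_mib(sample))  # -> [24576, 24576]
```

Note this reports per-GPU totals, not a pooled figure: as the comment above says, a tool that only uses one GPU sees at most one card's 24 GB regardless of how many cards are installed.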