Comments:
Can't wait for 10 years from now, when we'll be able to generate full-length movies in real time from a single prompt on low-end hardware.
Oh god, I wish I had a 4090 🥲
Is it better to download Stable Diffusion or use the website?
Thanks for sharing this. I ran your prompt using ComfyUI (Windows 10) with SDXL 1.0 on my RTX 2060 12GB card. It took 55 sec, around 1.9 s/it, with VRAM around 7.1/12 GB. The results are very nice. BTW: Steps: 28, Batch size: 4, Sampler: Euler, CFG scale: 7, Seed: 1181893402, Size: 768x960, Denoising: 0.75
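The reported s/it can be sanity-checked from the numbers in the comment above, assuming one iteration is one sampler step over the whole batch of 4 images (a common way these UIs count iterations):

```python
# Sanity check of the reported speed: 55 sec total, 28 sampler steps.
# Assumes one iteration = one step over the whole batch of 4 images.
total_seconds = 55.0
steps = 28

seconds_per_it = total_seconds / steps
print(f"{seconds_per_it:.2f} s/it")  # ~1.96 s/it, close to the reported ~1.9 s/it
```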
What do you mean by "this video is out of date"? Do you have another one?
How's the performance when rendering Deforum videos? Can you post a comparison?
How about a V100?
Meanwhile my AMD 6700 XT takes 1 minute per image in text-to-image at 512x512.
Now if only I could find out why I suddenly started getting CUDA out-of-memory errors as of today, even on old images I previously had no trouble generating...
Nice, thanks for this. It's hard to find videos on the 4090's performance outside of speculative "is it worth it" vids, and those are often not focused on VRAM.
What about the generation time when adding a LoRA and upscaling to a checkpoint model?
A graphics card that costs 40,000 and it's still this slow.
With the 3090 Ti I'm getting 1 image per second.
Thanks for the test. Do you know if dual GPUs are supported with Stable Diffusion? Maybe even with NVLink on the last-gen RTX 3090.
What about using 2x 3090 in SLI? Can you do that in Stable Diffusion?
Thanks bro... I was looking for this.
On a 1070 Ti it takes 3 min 36 sec for batch size 4 (~54 sec per image).
768x960, Steps 28, Batch size 4
There is a way to get more out of the 4090 and there is a guide that desperately needs to be made on how to do that.
On a 4070 Ti it takes 28 sec for batch size 4 (~7 sec per image).
Steps: 28, Sampler: Euler, CFG scale: 7, Seed: 2902406594, Size: 768x960, Model hash: d8691b4d16, Model: deliberate_v11, Denoising strength: 0.75, Mask blur: 4
I followed the instructions in the description, added the --xformers argument to the bat file, and got up to 24 it/s generating 512x512 images with Euler a at 25 steps.
2048x1440 images take me 20-ish seconds to generate. Crazy. Overclocked 4090.
With an M1 Max MBP, it took me 10 min!
4090 - 11.4s
3090 Ti - 17.49s
Difference = 1.53 times. So where did "42.16 times faster" come from?!?
It would take me 1.5 hours to make it on my 1650 (not even a Ti) 💀💀💀
So that's why I spent money on rented virtual GPU servers.
I think we need to compare the price difference in % vs the speed difference in %.
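A quick sketch of that comparison, using the times quoted earlier in the thread; the prices are assumed US launch MSRPs (RTX 4090: $1599, RTX 3090 Ti: $1999), and street prices vary:

```python
# Rough price-vs-speed comparison for the 4090 vs the 3090 Ti.
# Times come from the comment thread; MSRPs are assumed launch prices.
time_4090 = 11.4       # seconds for the benchmark run
time_3090ti = 17.49
price_4090 = 1599.0    # assumed US launch MSRP
price_3090ti = 1999.0  # assumed US launch MSRP

speedup = time_3090ti / time_4090        # how much faster the 4090 is
price_ratio = price_4090 / price_3090ti  # 4090 cost relative to 3090 Ti

print(f"4090 is {speedup:.2f}x faster")               # ~1.53x, not 42.16x
print(f"4090 costs {price_ratio:.0%} of a 3090 Ti")   # cheaper AND faster at MSRP
```

By these numbers the 4090 is both faster and cheaper at MSRP, so the price-vs-speed question only gets interesting at actual street prices.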
No xformers installed?
That's the best advertising for an RTX 4090 :D
☠️☠️☠️☠️ With my AMD card I wait 2 hours for the CPU, not the GPU, to render 1 image...
My GTX 1650 Ti Mobile:
🗿🗿🗿🗿
A single text-to-image generation on my 1080 Ti takes 60 seconds.
Thanks for the benchmark. I was under the impression that the 4090 is much slower.
Thanks, I was wondering how it would perform.
Now I want an RTX 4090, damn.
Hmmm, sounds like you'd need about 120 RTX 4090s to do a live cartoon.
Definitely switching to a 4090 in a few days.
Novel card, I see x)
Yakuza_Suske Thank you. Thanks to your question, my system is now much faster.
I can't keep up with all the updates. Today I found that changing this line in the webui-user.bat file:
set COMMANDLINE_ARGS= --xformers
significantly speeds up generation and reduces video memory usage.
Now I can generate an image with the test parameters on a GTX 1080:
Time taken: 4m 8.04s. Torch active/reserved: 3608/5178 MiB, Sys VRAM: 5699/8192 MiB (69.57%)
This is a huge boost.
Have you tried "--xformers"? I have a 3080 Ti, and with xformers my generation at 512x512 is 17-18 it/s.
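For comparing it/s figures like these with the seconds-per-image numbers elsewhere in the thread: divide the step count by the rate. A minimal sketch, assuming batch size 1, one iteration per sampler step, and a 20-step generation (the step count is my assumption; the comment doesn't state one):

```python
# Convert an it/s rate into seconds per image.
# Assumes batch size 1 and one iteration per sampler step; the 20-step
# count is an assumption, not stated in the comment above.
def seconds_per_image(steps: int, its_per_second: float) -> float:
    return steps / its_per_second

print(f"{seconds_per_image(20, 17.5):.2f} s")  # ~1.14 s per 512x512 image
```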
Hello fellow pirate 🏴‍☠️ 1337x
With my GTX 1080 it takes a long time (7m 4.96s) with these settings:
Steps: 100, Sampler: Euler a, CFG scale: 16, Seed: 1578922111, Size: 2048x512, Eta: 0
Time taken: 7m 4.96s. Torch active/reserved: 4636/5386 MiB, Sys VRAM: 7238/8192 MiB (88.35%)
What is the max resolution you can get with 24 GB of VRAM?
Thank you. Finally someone answered the question of how much faster the 4090 is in SD. Can I ask you to test the full-EMA model with the parameters Steps: 100, Sampler: Euler a, CFG scale: 16, Size: 2048x512?