RTX 4090 vs 3090 ti stable diffusion test. (UPDATE) This video is now out of date!

John Williams

1 year ago

41,588 views

Comments:

@Oreoezi - 12.11.2023 02:05

Can't wait to be able, in 10 years, to generate full-length movies in real time from a single prompt on low-end hardware.

@mada_faka - 30.09.2023 22:14

Oh god, I wish I had a 4090 🥲

@haiderdiego680 - 04.09.2023 05:21

Is it better to download Stable Diffusion or use the website?

@tuiastro - 24.08.2023 17:19

Thanks for sharing this. I ran your prompt using ComfyUI (Windows 10) with SDXL 1.0 on my RTX 2060 12GB card. It took 55 sec, around 1.9 s/it, with VRAM at about 7.1/12 GB. The results are very nice. BTW: Steps: 28, Batch size: 4, Sampler: Euler, CFG scale: 7, Seed: 1181893402, Size: 768x960, Denoising: 0.75
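
For anyone who wants to reproduce roughly the same run outside ComfyUI, here is a minimal sketch using the diffusers library. The checkpoint name and prompt are placeholders; the sampler, steps, CFG scale, seed, batch size and resolution follow the settings above, and the Denoising 0.75 (a second img2img/hires pass) is omitted.

# Minimal sketch of the settings above with diffusers (checkpoint name and prompt are placeholders).
import torch
from diffusers import StableDiffusionXLPipeline, EulerDiscreteScheduler

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",   # assumed SDXL 1.0 base checkpoint
    torch_dtype=torch.float16,
)
pipe.scheduler = EulerDiscreteScheduler.from_config(pipe.scheduler.config)   # "Euler" sampler
pipe.to("cuda")

generator = torch.Generator("cuda").manual_seed(1181893402)   # seed from the comment
images = pipe(
    prompt="your prompt here",       # placeholder; the original prompt is in the video description
    num_inference_steps=28,
    guidance_scale=7.0,              # CFG scale 7
    width=768,
    height=960,
    num_images_per_prompt=4,         # batch size 4
    generator=generator,
).images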

@user-bu5tx3yt9f - 27.07.2023 18:16

What do you mean by "this video is out of date"? Do you have another one?

@ParvathyKapoor - 13.07.2023 21:01

How's the performance when rendering Deforum videos? Can you post a comparison?

@user-du2jb4wu6r - 16.06.2023 16:57

How about a V100?

@__-fi6xg - 04.06.2023 04:02

Meanwhile, my AMD 6700 XT takes 1 minute per image for 512x512 text-to-image.

@Squeezitgirdle - 29.05.2023 08:26

Now if only I could find out why, as of today, I suddenly started getting CUDA out-of-memory errors even on old images I previously had no trouble generating...

@rat_king_reginald - 23.05.2023 06:39

Nice, thanks for this. It's hard to find videos on the 4090's performance outside of speculative "is it worth it" vids, and those are often not focused on the VRAM.

@RoboMagician - 16.05.2023 10:10

What about the generation time when adding a LoRA and upscaling to a checkpoint model?

@SuperSuperSuperSuperSuper - 08.05.2023 02:46

A NT$40,000 graphics card is still this slow.

@eyevenear - 28.04.2023 07:21

With the 3090 Ti I'm getting 1 image per second.

@AOTanoos22 - 12.04.2023 12:57

Thanks for the test. Do you know if dual GPUs are supported with Stable Diffusion, maybe even with NVLink on the last-gen RTX 3090?

@Maisonier - 27.03.2023 04:33

What about using 2x 3090 in SLI? Can you do that with Stable Diffusion?

@inkinno - 19.02.2023 07:04

Thanks bro... I was looking for this.

@jkfi - 15.02.2023 21:37

On a 1070 Ti it takes 3 min 36 sec for batch size 4 (~54 sec per image).

768x960, Steps 28, Batch size 4

@ArisenProdigy - 15.02.2023 16:28

There is a way to get more out of the 4090, and a guide desperately needs to be made on how to do it.

@PIDAGOK - 06.02.2023 20:56

On a 4070 Ti it takes 28 sec for batch size 4 (~6 sec per image).
Steps: 28, Sampler: Euler, CFG scale: 7, Seed: 2902406594, Size: 768x960, Model hash: d8691b4d16, Model: deliberate_v11, Denoising strength: 0.75, Mask blur: 4

@moosiemoose1337 - 01.02.2023 01:46

I followed the instructions in the description and also used the --xformers argument in the .bat file, and got up to 24 it/s with 512x512 image generation, Euler a, 25 steps.

2048x1440 images take me 20-ish seconds to generate. Crazy. Overclocked 4090.

@lol-di3tf - 20.01.2023 01:42

With an M1 Max MBP, it took me 10 min!

@user-rl4hd2iz6c - 31.12.2022 15:20

4090 - 11.4s
3090 Ti - 17.49s
That's a difference of 1.53x. Where did "42.16 times faster" come from?!?
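
For reference, the 1.53x figure is just the two times divided (numbers taken from the comment above):

# Quick check of the speedup implied by the two timings quoted above.
t_4090 = 11.4      # seconds, RTX 4090
t_3090ti = 17.49   # seconds, RTX 3090 Ti

print(f"speedup: {t_3090ti / t_4090:.2f}x")                    # ~1.53x faster
print(f"time saved: {(1 - t_4090 / t_3090ti) * 100:.1f}%")     # ~34.8% less time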

@shavel6418 - 30.12.2022 01:38

It'd me took 1.5 hours to make it on my 1650 (not even TI) 💀💀💀
So that's why I spent money on virtual GPU supply servers

@spider853 - 16.12.2022 03:39

I think we need to compare price difference in % vs speed difference in %
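
A rough sketch of that comparison, using the timings quoted in this thread; the prices are assumed launch MSRPs, not figures from the video:

# Price difference vs speed difference; prices are assumed launch MSRPs, timings are from this thread.
price_4090, price_3090ti = 1599.0, 1999.0   # USD, assumed launch MSRPs
time_4090, time_3090ti = 11.4, 17.49        # seconds for the test batch (from the comments above)

price_diff_pct = (price_4090 - price_3090ti) / price_3090ti * 100
speed_diff_pct = (time_3090ti / time_4090 - 1) * 100

print(f"price difference: {price_diff_pct:+.1f}%")   # ~-20% (cheaper at these assumed MSRPs)
print(f"speed difference: {speed_diff_pct:+.1f}%")   # ~+53% faster on this test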

@Cuteexe - 15.12.2022 16:11

no xformers installed?

@crckdns - 14.12.2022 19:02

that'S the best advertising for a RTX4090 :D

@versonind8197 - 13.12.2022 14:47

☠️☠️☠️☠️ With my AMD card I wait 2 hours at a time for the CPU, not the GPU, to render a single image...

@yuduzfridoed3672 - 07.12.2022 16:20

My GTX 1650 Ti mobile:
🗿🗿🗿🗿

@techpriest4787 - 27.11.2022 07:43

A single text-to-image render with my 1080 Ti takes 60 seconds.
Thanks for the benchmark. I was under the impression that the 4090 was much slower.

@nathaniellin2632 - 14.11.2022 19:30

Thanks I'm wondering how it would perform

@shakal_ - 08.11.2022 20:17

Now I want an RTX 4090, damn.

@xirtus - 03.11.2022 05:47

Hmmm, sounds like you'd need about 120 RTX 4090s to do a live cartoon.

@koctf3846 - 31.10.2022 18:47

definitely switching to a 4090 in a few days

@GyroO7 - 31.10.2022 18:00

Novel Card I see x)

@profiWork - 31.10.2022 14:46

Yakuza_Suske, thank you. Thanks to your question, my system is now much faster.
I can't keep up with all the updates. Today I found that changing the line in the webui-user.bat file to:
set COMMANDLINE_ARGS= --xformers
significantly speeds up generation and reduces the amount of video card memory needed.
Now I can generate a picture with the test parameters on a GTX 1080:
Time taken: 4m 8.04s. Torch active/reserved: 3608/5178 MiB, Sys VRAM: 5699/8192 MiB (69.57%)
This is a huge boost.
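
For anyone running Stable Diffusion from a plain Python script instead of the webui, roughly the same memory-efficient attention can be turned on through diffusers. A sketch; the checkpoint name and prompt are placeholders, and the xformers package must be installed:

# Sketch: enabling xformers attention in diffusers (placeholder checkpoint and prompt).
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # placeholder SD 1.5 checkpoint
    torch_dtype=torch.float16,
).to("cuda")

pipe.enable_xformers_memory_efficient_attention()   # analogous to --xformers in webui-user.bat
image = pipe("your prompt here", num_inference_steps=28).images[0]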

@yakuza_suske3189 - 31.10.2022 08:34

Have you tried with "--xformers"? I have a 3080 Ti, and with xformers my generation at 512x512 is 17-18 it/s.

@joe.todddq - 30.10.2022 06:05

hello fellow pirate 🏴‍☠ 1337x

@alexeia3747 - 28.10.2022 12:44

With my GTX 1080 it takes a lot of time (7m 4.96s) with these settings.
Steps: 100, Sampler: Euler a, CFG scale: 16, Seed: 1578922111, Size: 2048x512, Eta: 0
Time taken: 7m 4.96s. Torch active/reserved: 4636/5386 MiB, Sys VRAM: 7238/8192 MiB (88.35%)

@zerog4879 - 28.10.2022 10:16

What is the max resolution you can get with 24 GB of VRAM?

@alexeia3747 - 28.10.2022 00:34

Thank you. Finally, someone has answered the question of how much faster the 4090 is in SD. Can I ask you to run a test of the full-EMA model with the parameters Steps: 100, Sampler: Euler a, CFG scale: 16, Size: 2048x512?
