Your GPU is Useless Now

Vex

10 months ago

483,786 views

Comments:

Daniel Battaglia - 20.09.2023 14:43

lol I'm still using an Intel i5-8600K with a 4070 Ti and I can't even turn ray tracing on xD, the CPU has a party every time a big graphics-heavy game starts xD

MrGivmedew - 20.09.2023 14:18

You shouldn’t be talking about this. You don’t have ANY INSIGHTS INTO THIS AT ALL!!!

You could have at least reached out to developers and asked.

What you are saying right now is a bunch of nonsense.

There are things that need to be done that GPUs can't do. It's always been like that, but now more than ever.

Peter Johnson - 20.09.2023 13:04

Tell me you know nothing about game development without telling me you've never looked into game development.

Sean B - 20.09.2023 12:08

I always thought resolution was a factor in CPU usage as well. For example, a lower resolution hits the CPU harder than, say, 4K.

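The intuition in the comment above is usually modeled like this: CPU work per frame is roughly resolution-independent, while GPU work scales with pixel count, so the frame rate is set by whichever side is slower, and at low resolutions the CPU becomes the limiter. A toy sketch (every number here is invented purely for illustration):

```python
def frame_rate(cpu_ms_per_frame: float, gpu_ms_per_megapixel: float,
               width: int, height: int) -> float:
    """Frame rate limited by whichever of CPU or GPU takes longer per frame."""
    gpu_ms = gpu_ms_per_megapixel * (width * height / 1e6)
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms)

# Hypothetical rig: 8 ms of CPU work per frame, 4 ms of GPU work per megapixel.
fps_1080p = frame_rate(8.0, 4.0, 1920, 1080)  # ~8.3 ms on the GPU: GPU-bound
fps_720p = frame_rate(8.0, 4.0, 1280, 720)    # 8 ms on the CPU: CPU-bound
```

Note that dropping to 720p barely raises the frame rate in this model: the CPU cost doesn't shrink with resolution, which is why low resolutions "expose" the CPU rather than literally hitting it harder.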
Винчанац - 20.09.2023 11:42

It's simple.
Almost all developers don't care much about optimization anymore.
I think most of them do this on purpose, in collaboration with graphics card manufacturers, so that users have to buy stronger and more expensive graphics cards.

BryanXan - 20.09.2023 10:45

RTX, DLSS, FSR, upscaling, D3D12, it all sucks

Eliee - 20.09.2023 09:52

I don't care about stunning graphics; I just want a good game experience, something actually fun, not the same good-looking shit.

Lenka A. - 20.09.2023 09:17

I'm no tech genius, but it's clear to me that if I want to play AAA games at 1440p or higher, I need one of the latest CPUs. I mean, why is it even a question? My question would be: why doesn't the 7800X3D get more fps when paired with an RTX 4090? Best CPU with best GPU, and apparently the frame rates aren't much different from a 5800X3D.

Christopher Privett - 20.09.2023 08:46

Things are moving along too quickly. Back in the day, hardware restrictions forced devs to be creative just to fit games on a cart or disk, so they had to optimize from the jump. Now there's lots of space, lots of power, and nothing forcing them to optimize. In previous generations improvements came slower; consoles could hang around for 10 years, and to make better games devs had to make those obsolete little boxes scream, and they did. Now you can wait for a patch to fix a broken game that gets released before it's finished. In the past, updates weren't a thing, so they had to make sure it at least worked before shipping (and it usually did).

Rodney - 20.09.2023 08:32

The pendulum is swinging the other direction

Morgan Draegar - 20.09.2023 08:21

We need more quality, and less quantity...

CJ - 20.09.2023 08:12

This is an issue that has been creeping up ever since the 8th-gen consoles and has now ballooned.
CPUs falling behind is one of the many reasons frame generation exists.

YouTube Other - 20.09.2023 08:09

I don't think CPUs are that big of a bottleneck... but the optimization is crap nowadays.

durp hurp - 20.09.2023 08:02

Draw calls strike again.

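For anyone wondering what the one-liner above refers to: each draw call carries a roughly fixed CPU-side cost in the driver, so a scene with many small draws can be CPU-bound no matter how fast the GPU is. A toy sketch, with the per-call overhead number invented purely for illustration:

```python
def cpu_submit_ms(num_draw_calls: int, overhead_us_per_call: float = 20.0) -> float:
    """CPU time spent just submitting draw calls to the driver, in milliseconds."""
    return num_draw_calls * overhead_us_per_call / 1000.0

# 10,000 individual draws vs the same scene instanced into 100 draws.
unbatched = cpu_submit_ms(10_000)  # 200 ms of pure submission work per frame
instanced = cpu_submit_ms(100)     # 2 ms: submission is no longer the limiter
```

This is why batching and instancing (and the lower per-call overhead promised by DX12/Vulkan) matter: the pixels drawn are identical, only the submission count changes.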
AmericanRX - 20.09.2023 07:52

Honestly I don't play mainstream games these days because of the braindead gameplay and lackluster plots. Now I get to add unoptimized garbage on top of that.

wackyracoon - 20.09.2023 06:56

Simple answer from my POV: Unreal and Unity got so popular and user-friendly that people just use those instead of making more modest games with more efficient self-developed engines. Now we're in a landscape where nearly all game devs have been using Unity and Unreal for the past 15 years, those engines have gotten bloated to the point where games look and perform worse than they did 5 years ago, and there aren't enough people feeding back into engine development, which leads to a stagnation of sorts.

Tpecep - 20.09.2023 06:37

There are many pipelines that get loaded, so the % load tells you literally nothing. It might just be the RT cores overloading and making the card the bottleneck, but if you turn off DLSS or raise your resolution you can get the same fps at 100% load.

Franco Adrian - 20.09.2023 04:35

Me with an i9-7980XE from 2017, rocking any game with an RTX 4090 FE (watercooled with a Vector V2), a 4K 144Hz monitor, and 64GB of RAM.

twothreeoneoneseventwoonefourfive - 20.09.2023 03:32

That's why I bought a Ryzen 9 7950X (joke, I'm not a gamer)

OG Gamers Live - 20.09.2023 01:04

It's not the CPU. I have a 5900X and it's working just fine with my 3090 Ti. I ran into an issue back in February where either Windows or Ryzen Master had changed some settings in the BIOS, and it was destroying my gaming performance. I did a factory reset of the BIOS, re-set my XMP profile, re-enabled Resizable BAR, and it's been performing fine ever since.

Idiotontheinternet - 20.09.2023 01:00

Me playing Cyberpunk 2077 on a GTX 1050 mobile and an Intel i5-8300H at 30-60 fps, thanks to the Radeon shebang I'm playing at 200p

Crazych1cken96 - 20.09.2023 00:57

Lazy ports by devs, that's the problem

agodelianshock - 20.09.2023 00:56

Developers have seen that GPUs can't keep up with what they need, but CPUs have had headroom to spare for quite a few years. So they use a CPU solution to solve a GPU inadequacy.

biggrill man - 20.09.2023 00:04

DX12 is trash

Viper Benchmarks - 19.09.2023 21:56

Also, 8 vs 18 cores with almost no difference in performance, but both heavily used. Why?

Dr HakenNase - 19.09.2023 20:53

My brother and I downloaded the new Payday 3. An i7-8700K delidded at 4.8GHz and an RTX 2080 Ti are really struggling, like 70 fps on medium settings, and the game doesn't even look that good. CPU and GPU both sitting at 60% usage.

45eno - 19.09.2023 18:41

Running a midrange 5900X/6900XT/64GB/X570S system, and I'm surprised to sometimes see what appears to be a CPU bottleneck: the GPU isn't at 99% utilization and the game isn't hitting any artificial caps like an fps limit or Vsync. I see this even in older games, and at almost 4K-level pixel counts, 5120x1440, which is about 11% fewer pixels than 4K.

André Felipe - 19.09.2023 17:57

Unreal DOES NOT have 'CPU issues', dude; it all comes down to whether developers implement things correctly. Fortnite doesn't have issues because Epic knows what they're doing.

Mazuto - 19.09.2023 17:48

Good thing that I'm running a 36 core dual Xeon setup with 256gb ram 😄

Erik Merckx - 19.09.2023 17:26

The market is saturated. Tons of developers with games coming out, and too many other types of entertainment competing for people's time. This is a market in which it's increasingly hard to recoup the cost of development.

Games keep coming out with bugs and optimization problems because game development is complex and they need to release on a schedule to start recouping costs, so I don't see optimization getting better unless technology (hardware and software) stops innovating and throwing new things at developers that they need to handle.


Also, I believe I've seen an article discussing Nvidia's GPU architecture and how the chips are designed with far more cores than the available buses can feed (which still works out fine), so not being able to fully utilize the GPU may not be all that indicative in certain cases.

musikSkool - 19.09.2023 17:11

ChatGPT "AI" might solve the problem of optimizing code for better CPU runtime.

Nikhil Chouhan - 19.09.2023 17:06

In all the footage you've shown, the CPU usage isn't at 100 percent, so what makes you think the CPU is actually the problem? I think the problem is too little VRAM, even on higher-tier cards like the 3080. With most games now using 4K textures, Nvidia releases a fuckin 8GB card in 2020. I'd even say CPUs are actually much, much further ahead than GPUs. CPUs haven't been a bottleneck in games since the Ryzen launch forced the industry toward 8-core CPUs instead of the dual-core ones Intel offered.

jer j - 19.09.2023 15:38

DLSS on, at 1440p.

Believe me, I have a 4080 (and a 7800X3D) and it's very easy to overpower it. I play a lot of VR sim racing now, and it's very hard on the GPU. The Witcher with its new updates, settings maxed out: if I turn frame generation off, I top out at around 55 fps.

Also, I don't know if you want the GPU pinned at 99% all the time; that leaves no headroom for when every car is on screen, etc.

e1m1 - 19.09.2023 15:33

I've been barking about this for ages but no one seems to listen. My 3080 has been idling hard with an 8700K while the internet is filled with misleading and outright wrong benchmarks. One of the big pain points is how slow DirectStorage adoption has been; the Spider-Man games do a lot of compression/decompression work to accommodate hard disk storage.

For reference, consoles are running the equivalent of Zen 2 CPUs right now.

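On the compression point above: assets shipped compressed (originally to suit hard disks) are traditionally inflated on CPU worker threads while loading or streaming, which is exactly the load DirectStorage-style GPU decompression aims to take off the CPU. A minimal illustration of that CPU-side cost, using Python's zlib as a stand-in for whatever codec a game actually uses:

```python
import zlib

# Stand-in "asset"; in a real game this would be texture or mesh data.
asset = bytes(range(256)) * 4096        # 1 MiB of sample data

packed = zlib.compress(asset, level=9)  # paid once, at build/packaging time
unpacked = zlib.decompress(packed)      # paid by the CPU on every load/stream
assert unpacked == asset
```

Streaming worlds repeat that decompress step constantly in the background, which is one way a game loads CPU cores without the frame rate obviously pointing at them.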
Matej Drenjančević - 19.09.2023 15:03

If you look at your CPU usage, it's at 30%, so it's not the CPU's fault. It's most likely a memory bottleneck.

R L - 19.09.2023 13:44

4k my boy

R - 19.09.2023 13:28

One of the biggest things this gen is DirectStorage on consoles. Consoles now have 7-8 CPU cores pinned plus an extremely fast SSD. There's been a huge gap between that and DX12 + DirectStorage support on PCs. I reckon once DirectStorage on PC is actually on par with the console offerings, developers can use it more to reduce CPU load.

Outis - 19.09.2023 13:27

A lot of the newer games are built for 4K so they look good, but plenty of them are CPU single-core/RAM heavy, and that also comes down to how they were coded. And then you have AAA devs releasing unoptimized garbage on top of that.

Andrei Marin - 19.09.2023 12:24

Get one of the Ryzen X3D CPUs and your GPU will start sweating.

TowerC Gaming - 19.09.2023 11:59

Dude really does not know what he is talking about.

Tanya I Am Your Father - 19.09.2023 10:07

Next-gen CPUs are supposedly going to make big jumps in power in both the late-2024 and late-2026 product lines, so I think we'll see next-gen consoles Holiday 2026 because of this. The PS6 and NextBox are gonna be beasts.

James Deluxa - 19.09.2023 10:05

Devs do the bare minimum. PC ports are an afterthought to them.

Bulletproof_Vaan - 19.09.2023 09:23

Doesn't matter; the guy who can't afford a high-end CPU also can't afford a 3080. You can see that your 5900X is only at 26% max utilization. Developers still can't make good use of all those cores and threads.

Rhobson Vanzella - 19.09.2023 08:04

Games are simply not optimized. Computing power is "cheap", and developers (publishers, actually) don't care about doing it right, just doing it fast, so brute force it is!

HOWEVER, there are games and then there are games. You can't say "the CPU is holding the GPU back from putting out more frames" when the GPU is close to 90% utilization and the CPU is only halfway there!
This is basically FUD; we need to call out the few offending triple-A publishers and demand that they stop focusing on the quantity of game launches per year.

As always, vote with your wallet...

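The utilization argument above can be written down as a rough rule of thumb. This is only a heuristic and the thresholds are arbitrary; in particular, a low overall CPU % can still hide one maxed-out thread, a point several other comments here make:

```python
def likely_bottleneck(gpu_util_pct: float, cpu_util_pct: float) -> str:
    """First-guess classification from overall utilization counters only."""
    if gpu_util_pct >= 95:
        return "gpu"
    if cpu_util_pct >= 90:
        return "cpu"
    return "unclear (check per-thread load, RAM/VRAM, or an fps cap)"

# The case from the comment: GPU near 90%, CPU around half.
verdict = likely_bottleneck(90, 50)  # lands in "unclear", not "the CPU's fault"
```

A fairer diagnosis would look at the busiest single thread, frame-time graphs, and memory pressure rather than two aggregate percentages.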
Zeus .Edwards - 19.09.2023 07:01

Everything new is a test, and consumers are the lab rats.

Gerardo Avelini - 19.09.2023 04:47

In the new Cyberpunk 2077 update, the game will now use up to 90% of the CPU.

Elliot Marty - 19.09.2023 04:34

I'm gonna say it's because AI performance-enhancing software like DLSS/FSR/XeSS has become a crutch for developers. Take Remnant 2 for example: they straight up admitted that they optimized the game fully intending for people to use DLSS. They don't bother to optimize as much thanks to this software, and it's so annoying.

Joseph Harris - 19.09.2023 03:32

TL;DR: ray tracing is a gimmick.

The Editor AMVs - 19.09.2023 03:26

I believe a huge issue here is how Vulkan/DX12 are being utilized. In theory they can perform better, but in practice more control means more work, which means you really need to know what you're doing to get it right.
