Comments:
lol im still using an intel i5-8600k with a 4070 Ti and i can't even turn ray tracing on xD, the cpu has a party every time a big graphics game starts up xD
You shouldn't be talking about this. You don't have ANY INSIGHT INTO THIS AT ALL!!!
You could have at least reached out to developers and asked.
What you are saying right now is a bunch of nonsense.
There are things that need to be done that GPUs can't do. It's always been like that, but now more than ever.
Tell me you know nothing about game development without telling me you've never looked into game development.
I always thought resolution was a factor in CPU usage as well. For example, a lower resolution puts a heavier relative load on the CPU than, say, 4K does.
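The intuition in that comment can be sketched with a toy frame-time model (the numbers below are hypothetical, chosen only for illustration): CPU work per frame is roughly resolution-independent, while GPU work scales with pixel count, so the frame rate is set by whichever is slower, and lowering the resolution is what exposes the CPU as the limiter.

```python
# Toy model: fps is limited by the slower of CPU and GPU per-frame time.
# CPU_MS and GPU_MS_PER_MPIX are assumed illustrative values, not
# measurements from any real game or hardware.

def fps(cpu_ms, gpu_ms_per_mpix, width, height):
    """Frames per second, limited by the slower of CPU and GPU."""
    mpix = width * height / 1e6
    gpu_ms = gpu_ms_per_mpix * mpix
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 8.0           # hypothetical: 8 ms of game logic + draw submission
GPU_MS_PER_MPIX = 2.0  # hypothetical: 2 ms of GPU work per megapixel

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    limiter = "CPU" if CPU_MS > GPU_MS_PER_MPIX * w * h / 1e6 else "GPU"
    print(f"{w}x{h}: {fps(CPU_MS, GPU_MS_PER_MPIX, w, h):.0f} fps ({limiter}-bound)")
```

With these made-up numbers, 1080p and 1440p both sit at the same CPU-bound fps, and only 4K shifts the limit onto the GPU, which matches the commenter's point about low resolutions hitting the CPU harder.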
It's simple.
Almost all developers don't care much about optimization anymore.
I think most of them do this on purpose, in collaboration with graphics card manufacturers, so that users have to buy stronger and more expensive graphics cards.
RTX, DLSS, FSR upscaling, D3D12 — it all sucks
I don't care about stunning graphics. I just want a good game experience, something actually fun, not just the same good-looking shit.
I'm no tech genius, but it's clear to me that if I want to play AAA games at 1440p or higher, I need one of the latest CPUs. I mean, why is it even a question? My question would be: why doesn't the 7800X3D deliver more fps when paired with an RTX 4090? Best CPU with best GPU, and apparently the frame rates aren't much different from a 5800X3D.
Things are moving along too quickly. Back in the day, hardware restrictions forced devs to be creative just to fit games onto a cart or disk, so they had to optimize from the jump. Now there's lots of space, lots of power, and nothing forcing them to optimize. In previous generations, improvements came more slowly; consoles could hang around for 10 years, and to make better games devs had to make those obsolete little boxes scream, and they did. Now you can wait for a patch to fix the broken game that got released before it was finished. In the past, updates weren't a thing, so they had to make sure it at least worked before shipping. (It usually did.)
The pendulum is swinging in the other direction.
We need more quality and less quantity...
this is an issue that has crept up ever since the 8th-gen consoles and has now ballooned into this.
CPUs falling behind is one of the many reasons why frame generation exists.
I don't think CPUs are that big of a bottleneck... but optimization is crap nowadays.
draw calls strike again
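The "draw calls" quip refers to a real cost model: each draw call carries fixed CPU-side validation and submission overhead, so an unbatched scene can burn the whole frame budget on the CPU before the GPU renders anything. A back-of-the-envelope sketch, with an assumed (hypothetical) per-call cost:

```python
# Hypothetical per-draw-call CPU overhead in microseconds; real costs
# vary by API, driver, and state changes between calls.
DRAW_CALL_OVERHEAD_US = 5.0

def cpu_submit_ms(objects, objects_per_batch=1):
    """CPU time spent just issuing draw calls for one frame."""
    calls = -(-objects // objects_per_batch)  # ceiling division
    return calls * DRAW_CALL_OVERHEAD_US / 1000.0

# 10,000 objects drawn one call each vs. instanced 100 per call.
print(cpu_submit_ms(10_000))        # 50.0 ms: blows a 16 ms frame budget
print(cpu_submit_ms(10_000, 100))   # 0.5 ms after batching/instancing
```

This is why batching and instancing matter: the GPU workload is identical, but the CPU submission cost drops by the batching factor.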
Honestly, I don't play mainstream games these days because of the braindead gameplay and lackluster plots. Now I get to add unoptimized garbage on top of that.
Simple answer from my POV: Unreal and Unity got so popular and user-friendly that people just use those instead of making more modest games with more efficient self-developed engines. Now we're in a landscape where nearly all game devs have been using Unity and Unreal for the past 15 years, those engines have gotten so bloated that games look and perform worse than they did 5 years ago, and there aren't enough people feeding back into engine development, which leads to a stagnation of sorts.
There are many pipelines getting loaded, so the % load figure tells you literally nothing on its own. It might just be the RT cores overloading and making the card the bottleneck, but if you turn off DLSS or raise your resolution, you can get the same fps at 100% load.
Me with an i9-7980XE from 2017, rocking any game hard with an RTX 4090 FE under a watercooled Vector V2, a 4K 144Hz monitor, and 64GB of RAM.
That's why I bought a Ryzen 9 7950X (joke, I am not a gamer).
it's not the CPU. I have a 5900X and it's working just fine with my 3090 Ti. I ran into an issue back in February where either Windows or Ryzen Master had changed some settings in the BIOS, and it was destroying my gaming performance. I did a factory reset of the BIOS, re-set my XMP profile, re-enabled Resizable BAR, and it's been performing fine ever since.
me playing Cyberpunk 2077 on a GTX 1050 mobile and an Intel i5-8300H at 30-60 fps thanks to the Radeon shebang, I'm playing at 200p
lazy ports by devs, that's the problem
Developers have seen that GPUs can't keep up with what they need, but CPUs have had headroom to spare for quite a few years. So they use a CPU solution to solve a GPU inadequacy.
DX12 is trash
also, 8 vs 18 cores and almost no difference in performance, yet both are heavily used. why?
My brother and I downloaded the new Payday 3. An i7-8700K delidded at 4.8GHz with an RTX 2080 Ti is really struggling, like 70 fps on medium settings, and the game doesn't even look that good. CPU and GPU are both at 60% usage.
Running a midrange 5900X/6900XT/64GB/X570S system, and I'm surprised to sometimes see what appears to be a CPU bottleneck: the GPU isn't at 99% utilization, and the game isn't hitting any artificial caps like an fps limit or Vsync. I see this even in older games, and at almost 4K pixel counts — 5120x1440, which is only 11% fewer pixels than 4K.
Unreal DOES NOT have 'CPU issues', dude. It all comes down to whether developers implement things correctly or not. Fortnite doesn't have issues, because Epic knows what they're doing.
Good thing I'm running a 36-core dual Xeon setup with 256GB of RAM 😄
The market is saturated. Tons of developers with games coming out, and too many other types of entertainment competing for people's time. This is a market in which it is increasingly hard to recoup the cost of development.
Games keep coming out with bugs and optimization problems because game development is complex and they need to release on a certain schedule to start recouping costs, so I don't see optimization getting better unless technology (hardware and software) stops innovating and throwing new things at developers that they need to handle.
Also, I believe I've seen an article discussing NVIDIA GPU architecture and how the chips are designed with far more cores than the available buses can feed (which still works out fine), so not being able to fully utilize the GPU may not be all that indicative in certain cases.
ChatGPT-style "AI" might solve the problem of optimizing code for better CPU runtime.
In all the footage you have shown, the CPU usage isn't at 100 percent, so what makes you think the CPU is actually the problem? The problem, I think, is too little VRAM even on higher-tier cards like the 3080. With games now using 4K textures, Nvidia releases a fuckin 8GB card in 2020. I would even go so far as to say CPUs are actually much further ahead of the competition than GPUs are. CPUs haven't been a bottleneck in games since the Ryzen launch forced the industry to move toward 8-core CPUs instead of the dual-core ones Intel offered.
DLSS on, and 1440p.
Believe me, I have a 4080 (and a 7800X3D) and it's very easy to overpower it. I play a lot of VR sim racing now, and it's very hard on the GPU. The Witcher with its new updates, with settings maxed out and frame generation off, tops out at like 55 FPS for me.
Also, IDK if you even want the GPU pinned at 99% all the time; it leaves no headroom for when every car is on screen, etc.
I've been barking about this for ages, but no one seems to listen. My 3080 has been idling hard with my 8700K while the internet is filled with misleading and outright wrong benchmarks. One of the big pain points is how slow DirectStorage adoption has been; the Spider-Man games do a lot of compression/decompression on the CPU to accommodate hard disk storage.
For reference, consoles are running the equivalent of Zen 2 CPUs right now.
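The decompression point above can be illustrated with a toy benchmark: when compressed assets stream in, the CPU spends real time in the decompressor. Here zlib stands in for whatever codec a real game actually uses (an assumption for illustration); DirectStorage's GPU decompression path exists precisely to move this work off the CPU.

```python
# Toy asset-streaming benchmark: measure CPU time spent decompressing.
# zlib is a stand-in codec; sizes and iteration counts are arbitrary.
import time
import zlib

asset = bytes(range(256)) * 4096          # ~1 MB of fake asset data
blob = zlib.compress(asset, level=6)      # "on-disk" compressed form

start = time.perf_counter()
for _ in range(50):                       # pretend 50 such assets stream in
    restored = zlib.decompress(blob)
elapsed_ms = (time.perf_counter() - start) * 1000

assert restored == asset                  # round-trip is lossless
print(f"CPU spent {elapsed_ms:.1f} ms decompressing ~50 MB of assets")
```

Even a few milliseconds of decompression per streamed chunk competes directly with game logic and draw submission for the same CPU time every frame.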
if you look at your CPU usage, it's at 30%, so it's not the CPU's fault. It's most likely a memory bottleneck.
4K, my boy
One of the biggest things this gen is DirectStorage on consoles. Consoles now have 7-8 CPU cores pinned plus an extremely fast SSD. There's been a huge gap between that and DX12 + DirectStorage support on PCs. I reckon once DirectStorage is actually on par with the console offerings, developers can use it to reduce CPU load.
a lot of the newer games are built around 4K assets so they look good, but plenty of them are CPU single-core/RAM heavy, and that also comes down to how they were coded. and then you have AAA devs releasing unoptimized garbage on top of that.
Get one of the Ryzen X3D CPUs and your GPU will start sweating.
Dude really does not know what he is talking about.
Next-gen CPUs are supposedly gonna have big jumps in power in both the late-2024 and late-2026 product lines, so I think we'll be seeing next-gen consoles Holiday 2026 because of this. The PS6 and NextBox are gonna be beasts.
devs do the bare minimum. PC ports are an afterthought to them.
Doesn't matter; the guy who can't afford a high-end CPU also can't afford a 3080. You can see that your 5900X is only at 26% usage max. Developers still can't make good use of all the cores and threads.
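A low aggregate number like that 26% can hide a hard single-thread bottleneck: task managers report the mean across all logical threads, so one pegged render thread on a 12-core/24-thread 5900X barely moves the overall figure. Illustrative arithmetic only, with hypothetical per-thread loads:

```python
# Aggregate CPU % as commonly reported: the mean across logical threads.
def aggregate_usage(per_thread_usage):
    """Overall CPU % given one utilization figure per logical thread."""
    return sum(per_thread_usage) / len(per_thread_usage)

# Hypothetical: the render thread is pegged at 100% while the other
# 23 logical threads of a 5900X idle along at 20%.
threads = [100.0] + [20.0] * 23
print(f"{aggregate_usage(threads):.1f}%")  # → 23.3%
```

So a game can be completely fps-capped by one maxed thread while the headline CPU usage still "looks idle" in the mid-20s.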
Games are simply not optimized. Computing power is "cheap", and developers (publishers, actually) don't care about doing it right, just about doing it fast, so brute force it is!
HOWEVER, there are games and there are games. You cannot say "the CPU is holding the GPU back from putting out more frames" when the GPU is close to 90% utilization and the CPU is only halfway there!
This is basically FUD. We need to call out the few offending triple-A publishers and demand they stop focusing on the quantity of game launches per year.
As always, vote with your wallet....
Everything new is a test, and consumers are the lab rats.
In the new Cyberpunk 2077 update, the game will now use up to 90% of the CPU.
I'm gonna say it's because AI performance-enhancing software like DLSS/FSR/XeSS has become a crutch for developers. Take Remnant 2, for example: they straight-up admitted that they optimized the game fully intending for people to use DLSS. They can't be bothered to optimize properly thanks to this software, and it's so annoying.
TL;DR: ray tracing is a gimmick.
I believe a huge issue here is how Vulkan/DX12 is being utilized. In theory it can deliver better performance, but in practice more control means more work, which means you really need to know what you're doing to get it right.