Not Enough VRAM!!!

Daniel Owen

1 year ago

235,359 views

Comments:

Portman - 04.10.2023 21:27

How many of you can tell apart medium or high textures from ultra though? I'm willing to wager almost none unless pixel peeping.

apenas um nome de canal - 01.10.2023 18:52

The good thing about the RTX 40xx is that it will force devs to be a little extra careful with Vram usage

LaryWulf - 28.09.2023 12:57

Who cares about texture bruh, 😂

Prince of 1000 Enemies - 26.09.2023 21:34

I recently saw a comment on Reddit asking what's the best budget GPU with the highest VRAM, and PC users bagged on him for it, wondering why on earth that would matter... THIS VIDEO. This is why it matters.

Tessa Wolf - 25.09.2023 16:17

Remember, games like Crysis 3 could run on 1GB of VRAM, and a game like Starfield uses almost 10GB.

Jeanne - 20.09.2023 13:46

This is why I won't build a new pc with the 4070ti

Eric yapp - 13.09.2023 01:27

So that Nvidia can sell more graphics cards.

DeathStriker88 - 12.09.2023 11:06

But what if I don't play these overly expensive paid games or AAA games?

Zin Armagadan - 26.08.2023 13:37

I still have a 1080 Ti and I tried out RE4 Remake with settings well over the VRAM limit and it never crashed. I wonder why that's the case for the 3070?

bill clay - 21.08.2023 07:59

at least it ain't Atari 2600 pac man looking lol.

Hircine92h - 12.08.2023 11:34

I've always said 3070 and 3070Ti are garbage GPUs. Even 3080 10GB is meh. RX 6700XT > 3070/3070Ti.

Blurredborderlines - 11.08.2023 23:15

They may have patched it for better performance, but that atrocious screen tearing doesn't lie. If your game is this poorly optimized, it doesn't matter how much you fuck around with the source code; it's running into a resource limitation and you're not addressing the issue at hand. Compress your textures, for the love of god. GPUs are literally designed for high-transfer-speed compression and decompression. Stop forcing in resource-hogging workarounds like DLSS and actually make a game that works natively.
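For a rough sense of what texture compression buys, here is a back-of-the-envelope sketch (Python). The texture size and formats are illustrative assumptions, not figures from the video: RGBA8 stores 4 bytes per texel, while a block-compressed format like BC7 stores 16 bytes per 4x4 block, i.e. 1 byte per texel.

```python
# Rough VRAM footprint of one 4096x4096 texture with a full mip chain,
# uncompressed (RGBA8, 4 bytes/texel) vs. block-compressed (BC7, 1 byte/texel).
# A full mip chain adds ~1/3 on top of the base level (1 + 1/4 + 1/16 + ... -> 4/3).

def texture_bytes(width: int, height: int, bytes_per_texel: float, mips: bool = True) -> int:
    base = width * height * bytes_per_texel
    return int(base * 4 / 3) if mips else int(base)

uncompressed = texture_bytes(4096, 4096, 4.0)  # RGBA8
compressed = texture_bytes(4096, 4096, 1.0)    # BC7: 16 bytes per 4x4 block

print(uncompressed / 2**20)  # ~85.3 MiB
print(compressed / 2**20)    # ~21.3 MiB
```

A roughly 4x saving per texture, which is why shipping uncompressed or lightly compressed textures inflates VRAM requirements so quickly.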

Unknown - 11.08.2023 12:31

Me with 128 mb vram : ohh.... out of syllabus💀

Luke - 11.08.2023 04:47

I was so looking forward to playing RE4 remake but seeing that the game is riddled with crashes even with a beefy system made me think twice. I guess I'll wait for about 6-12 months till the devs iron out the issues.

Kent Bergström - 05.08.2023 15:24

4070 TI should have 16 gb vram, fuck Nvidia.

Ray CCXR - 04.08.2023 08:23

Does a controller work on the RE4 remake? If it works, please suggest a controller for me.

Tsumiaee - 31.07.2023 11:34

8GB is large enough, in fact it's perfect; the problem is game studios releasing unoptimised piles of shit.

Miniyi Art - 27.07.2023 05:23

Unless you are standing very still and actually looking at a side-by-side comparison between the 2GB texture and the 8GB texture, you might see a slight difference, but other than that the gameplay stays pretty much the same, so I would say the bigger issue is with the player. And not just that: game optimization in this day and age is horrible. Just because good PC specs are out there doesn't mean devs can start being lazy about optimizing their games.

For example, a game like RDR2, with huge detail in an open world, has better-looking textures than some games that require 12GB of VRAM on extreme settings for whatever reason. So yes, "not enough VRAM", but it's also not for us consumers to fix the issue; it's the game devs who need to fix it.

TinTown - 26.07.2023 04:39

I’m here learning why my 3070 ti was on sale for SO cheap

Fran Totti - 21.07.2023 09:45

Nvidia is still releasing new GPUs with 8GB of VRAM, and surprisingly people still buy them. If people want that, Nvidia can do nothing for them. The market definitely needs a third competitor.

GamingLovers - 21.07.2023 06:15

just turn off the raytracing

Rid Omi - 07.07.2023 20:15

My video memory is 12010MB, but this game is showing 8000. Why?

Tech - 03.07.2023 17:55

I play at 1440p. I just picked up a 6700xt because of the VRAM issue.

siuuuu - 02.07.2023 22:32

bro i played it on 1050 with 2gb vram💀

CamaroWarrior - 30.06.2023 13:21

I have a Ryzen 5800X, 32GB DDR4 3600MHz, and a 3090 FTW3. In RE4 at 4K with FSR on and fully maxed-out settings, my system uses around 13GB of system RAM and up to 16.5GB of VRAM, and I get around 75-85fps depending on the scene... which shows 16GB should be the minimum a GPU has nowadays.

Captain Hamburgers - 26.06.2023 15:59

8GB of VRAM is why I won't buy an RTX 4060 unless some company releases a 16GB or at least 12GB version. I mainly play single-player games, so framerates higher than 60fps don't really interest me. I'd happily play some games at 30fps.

Mendonça Pais - 22.06.2023 04:46

If they were talking about 4K I'd understand, but even at 1440p it's not right.

Mendonça Pais - 22.06.2023 04:45

For me 8GB of VRAM is enough, at least up to 1440p, and for an RTX 3060 Ti with its 8GB of VRAM.

MacAaroni - 16.06.2023 10:08

Im switching to AMD

BredPitt Brasileiro - 15.06.2023 13:57

Gtx 1060 3gb ☠️

J P A - 14.06.2023 23:28

I wonder what was the reason for NVidia suddenly deciding that 8GB is enough VRAM... since the 1070ti.
Now with consoles having 16GB of RAM, and this memory being accessible by GPUs, we will see GPUs perfectly capable of running games fail disgracefully because NVidia decided that the VRAM from three generations of GPUs ago is enough.
I have an RTX3070 and it is almost useless for working with things like 3D rendering because of the 8GB, which would have been perfectly acceptable 10 years ago...

blakedmc1989RaveHD - 06.06.2023 14:10

i wouldn't play higher than 1080p but i'm on a GTX 1080Ti and it has 11gb of Vram lol

Hydrargyrum Night - 01.06.2023 01:48

I disagree a bit with the presentation of a few of the points here; in other words, with the assessment approach and the claims made.

It's true that by now there is a bit of a discrepancy between the VRAM capacities GPU manufacturers supply at specific price points and the options modern games allow, alongside the level of performance work done (or not done) by game developers to minimize VRAM requirements. However,

1. First, there is almost no limit to how bad an option a developer can easily add, from a "gaming experience improvement per performance cost" point of view. The difference in developer behavior between supplying, say, an 8 GB texture pack with the game and not supplying it, capping the highest option at 4 GB, is very minimal; and yet, with the approach used in this video, in the latter case the gamer will be happy that they were able to "max out" the game, and in the former upset that they couldn't. It makes no sense. What matters is whether you can enjoy playing the game, have a good time with it actually playing rather than analyzing options or staring at a wall point blank, on a GPU you can buy at a reasonable price.

A developer merely providing an option that negligibly improves the experience (if at all), that works only on expensive GPUs today and might start working on less expensive GPUs in the future, should not lower your assessment of the GPU's value. The value of the GPU depends on the quality of your experience playing the game, not on whether the developer chose to spend a few hours of their time including an option to use textures at the resolution they were mastered at by the artist (and thus readily available to the developer anyway) alongside the standard, necessary option of lower-resolution textures. It's a silly and impractical option, though not completely pointless to include.

2. Second, by lowering texture sizes, VRAM capacity is not the only requirement you're lowering. You are also lowering VRAM bandwidth requirements and, more subtly and provably in fewer applications/games, other requirements as well, such as the capacities and bandwidths of texture caches internal to the microarchitecture of the GPU chip. Chances are, your card would need more than "just more VRAM" to match the performance being demonstrated; it would almost certainly also need a wider memory bus and/or (less realistically) higher-frequency DRAM chips, all of which would drive up the costs of the board and the GPU chip on it even further.

3. Third, VRAM capacity is indeed a parameter that has a very threshold-like impact on performance: if it's not enough, performance drops off a cliff (or drops to 0, crashing the application), usually making the application unusable. On the other hand, having more of it than necessary has zero positive impact on performance, but it still needs to be paid for by the gamer. Thus the question becomes, crucially,

- what proportion of the hours spent playing (the entire library of) games did the gamer have to spend with asset quality (like textures) turned down, to fit the VRAM capacity of their GPU, so much that the visual quality associated with those assets dropped substantially below both the rest of the game's graphical properties and what is currently deemed acceptable, objectively worsening the gamer's experience with the game?

If all you need to do to handle the VRAM capacity of your GPU is to turn down texture resolution from pointlessly high to okay-ish-but-distractingly-not-so-great in a couple of games that cumulatively account for 1/10 of the time you spend playing, then the VRAM capacity chosen is exactly right, by virtue of saving you money.
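That cost/benefit framing can be made concrete with a toy model (Python; the library numbers below are made up purely for illustration):

```python
# Toy model of the "threshold" argument: VRAM is a pass/fail resource.
# Given per-game working-set sizes and hours played, estimate what fraction
# of playtime a given VRAM capacity forces degraded texture settings.

def degraded_fraction(games: list, vram_gb: float) -> float:
    """games: list of (working_set_gb_at_max_textures, hours_played) pairs."""
    total_hours = sum(hours for _, hours in games)
    degraded_hours = sum(hours for ws, hours in games if ws > vram_gb)
    return degraded_hours / total_hours

# Hypothetical library: most titles fit in 8 GB, two recent ports don't.
library = [(5.0, 200), (6.5, 150), (7.0, 100), (10.5, 30), (11.0, 20)]
print(degraded_fraction(library, 8.0))   # 0.1 -> 10% of hours need lowered textures
print(degraded_fraction(library, 12.0))  # 0.0 -> never degraded, but you paid for it
```

In this framing, the right capacity is the smallest one for which the degraded fraction stays acceptably low, not the one that is never exceeded.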

If the VRAM capacity you have is always enough, never mind enough not just to get a good experience playing but to formally "max out" whatever nonsense options the game developer had fun adding for giggles or otherwise, then the VRAM capacity you have is way too high and you wasted a lot of money buying it.

This sort of principle is applied in engineering across the board, at every level of the design, the vast majority of which is never discussed by the public as it's way too technical and involved. The principle applies nonetheless: only exceptionally rarely is a component in a well-built engineering system (computers, cars, even airplanes, anything else) ever "maxed out"; it's always a balance of cost against positive impact on the main, target, user-facing quality metrics of the end product or overall experience.

That all being said, clearly, switching from PS4-era consoles with 8 GB of RAM to PS5-era consoles with 16 GB of RAM may indeed strongly suggest that the overwhelming majority of graphically intensive games from now on may have a baseline expectation of increased VRAM capacity even for the most basic assets necessary to run at all, forcing users to dramatically reduce the quality of more flexibly sized assets like textures, below what was normal years ago, merely to run the game on GPUs whose VRAM capacities weren't scaled up appropriately in response to that shift in the console market. It's important to remember that consoles have unified RAM: they have to fit into those 16 GB all of the data that in a PC is partially distributed (and partially duplicated) between RAM and VRAM.

Clearly, the transition in consoles has not been addressed by Nvidia: the GTX 1070, released 6 years ago almost to the day, also had 8 GB of VRAM; it's been 2.5 years since consoles doubled the amount of VRAM/RAM they have, and yet today's RTX 3070 Ti is still shipping with 8 GB.

Perhaps this means that all modern GPUs priced even remotely close to a console (or higher) really do have to have 12 GB of VRAM (or more) to make sense in the world of 16 GB RAM+VRAM-combined consoles; and clearly the RTX 3070 Ti, selling with 8 GB of VRAM at a price ever so slightly exceeding that of a PS5, does not deliver. (The comparison is with the GPU's price, not the entire PC's: if you're buying a PC only for gaming and nothing else and want a budget option, you should be buying a console. It's the decision to add a discrete GPU to a PC you already have for other purposes that competes with consoles.)

It puts a card like the 3080 10GB in an even tougher spot: a card that started as an amazing value proposition was doomed to rapidly lose adequacy (for its price) the moment the new generation of consoles released, which is basically the moment the card released. Very sad, and very convenient for Nvidia. I have a hard time believing Nvidia's leadership didn't know how much VRAM the consoles would have long before their public release dates; in all likelihood they had all the information and all the time necessary to architect and market Ampere GPUs accordingly. They didn't. In fact, Ada is doing even worse (the 3060 is a 12 GB card, while the 4060 and even a $400 version of the 4060 Ti are 8 GB).

Armin.8688 - 30.05.2023 20:20

Imagine having a perfect SSD with 256GB of capacity and wanting to move a 300GB file to it, but you can't...
This guy is saying "my SSD is so fast, but Windows doesn't let me move this file to it".

ROG GAMER - 29.05.2023 11:18

Today the same thing happened to my AMD RX 6600 XT (8GB). The game ran fine for a while at 10GB of VRAM usage, but after some time it started lagging and stuttering and then gave a fatal D3D error just like yours, so I turned down some settings and I was good to go.

fpslucky13 - 27.05.2023 08:26

I don’t understand why people think a last gen mid tier card is going to run new releases at max….i don’t think that’s unreasonable.

But the fact a 3060 had more than the 3070 blows my mind.

That’s the part that makes no sense to me.

AshS - 26.05.2023 09:10

Maybe it'll force developers to actually optimize somewhat instead of requiring the latest and greatest cards to run a game.

Starly - 17.05.2023 01:22

This is why I'd get an RX 6800 XT instead of an RTX 3070; that said, I was glad I was able to find a current-gen GPU at the end of 2020, since they were almost non-existent then.
Still, personally I don't think it's that big of a deal. Most games have DLSS, and it's usually lighting and art style that make a game pretty, not texture quality. RDR2 is still one of the best-looking games despite its textures not being that good.

Daniel Augusto Gomes Diogenes - 15.05.2023 09:20

Hey man, I just bought myself a RTX 3070 and it isn't using more than 70% on Resident Evil 4 for some reason... Is it faulty? I can't get a benchmark as good as yours, not even close actually

Avaraz - 12.05.2023 12:57

Let's hope games never asks for more than 16gb of VRAM or else we are all doomed

Delsere - 09.05.2023 20:01

Thank you for validating this concern, I wasn't too sure who to side with. Easy sub!

thousandyoung - 09.05.2023 12:22

Ngreedia doing what they do best and you always falling for it. Also, it's Called Intentional Shitporting to force Hardware Upgrade. There are Games that look a million times better than this Shit and don't ask for all that crazy amounts of VRAM. You People just love eating Shit from Corpos. The only SHIT these Asswipes don't forget to add is DENUVO so it always runs even shittier than it should and hopefully destroys your PC in the Process. Devs are also Garbage.

MyLateDroid101 D - 08.05.2023 01:56

I went RED a year ago because of the VRAM restriction. Glad I did.

Edmundo studios - 05.05.2023 12:26

Are the textures even better than PS5 at this point on 8GB?

roystan gomes - 05.05.2023 09:52

I am happy I got a 6700 XT 12GB... the 12GB will be useful in many games over the next few years.

Damazy Włodarczyk - 04.05.2023 02:19

You think game devs will ignore the fact that most Steam users have 3060s and 3060 Tis? Of course they won't. You're supposedly a math teacher, so try to think a little. 8GB will last until PS6 ports.

nikolygtx - 03.05.2023 22:02

The game doesn't look anywhere near good enough to fill up 4GB of VRAM.

Daniel Oset Romera - 01.05.2023 12:37

Something is really, REALLY wrong when RDR2 asks for just 5.6GB of VRAM in 4K while Hogwarts Legacy basically can't run properly with 8GB at 1440p, and barely at 1080p. These ports are simply bad (TLOU, Hogwarts, and to some extent RE4 Remake), any way you slice it.

Jeff the Truth Bringer - 30.04.2023 08:23

VRAM was the deciding factor in going with an RX 6800 over another 3080. My 3080 died when Hogwarts came out, and I was hitting that 12GB a lot, so when I was buying another GPU I found an RX 6800 new for $400 with tax and warranty and everything, a deal I couldn't pass on. Then I went back to play Hogwarts and I stopped having stutter problems, crashes, and muddy textures. I saw I was using 12GB of VRAM on average at 1440p; then I did some testing and was getting 13-14GB of VRAM at 1080p with RT.
I also saw Dead Space at 1440p using over 12GB of VRAM a lot of the time, so my old 3080 would have had problems where my RX 6800 wouldn't, even though the 3080 was a little stronger. I feel really bad for all the 3060 and 3070 users in the years to come, because modern AAA games have started to hit a wall where even at 1080p games are using over 10GB of VRAM.
