Comments:
How tf do you spot 1 frame out of hundreds per second? I just believe everyone is happy to have more frames.
Decent video mate, decided not to use DLSS 3 on my 70 Ti with Cyberpunk.
It makes me wonder what FSR 3 frame generation will be like in comparison.
What about the price and raw performance? I am, personally, confused and don't know whether to buy a 4060 or a 3070.
Whilst it still has all sorts of problems, DLSS and ray tracing are both pretty interesting, and it will be great to see how they mature. I remember when bloom was first a thing and for the first few years every single surface became like a mirror; I usually turned it off. Over time they tuned it down and improved it, and now the whole ultra-reflective-everything look has largely gone. Hopefully DLSS and ray tracing will follow a similar path.
So if I play at 10 fps it's an issue? Feels like I'm being trolled by reviewers.
I just hope my eyes will continue not keeping up with these artifacts and issues lmao. I just hope it doesn't get too noticeable, like the horse-tail hair tragedy in Red Dead Redemption.
It def got better on my Cyborg 15 RTX 4060 45 W.
I have an i5-8400 and a 2070.
Do I need to replace the CPU to go 40-series? I want HDMI 2.1, better raster, and state-of-the-art DLSS/RT.
Thinking 4070.
The real question is: why are NVIDIA's motion-vector calculations for those generated frames so far off that we're forced to see crazy artifacting? On the UI there are vectors that are off by a factor of like 10. Either this tech simply isn't meant for gaming, or games aren't a priority for NVIDIA's developers at the moment.
It'll get better
I think frame generation sounds like a gimmick. Adding "fake frames" to double FPS on a graph without really increasing performance. Really? How is increasing latency, while just posting higher FPS numbers, not a gimmick? Just know exactly what it is and what it is not, as explained greatly in this video. I also agree with some videos that say the best idea is to not use it unless you need to. Just use DLSS, which is awesome. But remember, DLSS puts more load on the CPU and takes some load off the GPU. Poorly optimized games released too early force gamers to buy more computing power to compensate for poor programming.
This level of critique is asinine for groundbreaking, amazing technology. Who the hell plays their games in slow motion and zoomed in this much? I don't feel any lag in any of my games and play at max settings (i9, 32 GB, 4070 Ti).
Can someone just confirm this for me, out of curiosity: is DLSS 3 on a 60 Hz monitor without FreeSync/G-Sync simply useless? (Yeah, I'm still using this monitor and play games at 60 fps with V-Sync ^^.)
Mind you, both Hitman and Plague play wonderfully at 40 fps; they don't need 200 Hz presentation. It's difficult to spot the difference between 80 and 160 fps in these games to begin with. So frame interpolation working well with them is... a very dubious benefit.
Any fixes for those UIs in the games mentioned, now 4 months later?
It's possible that Nvidia only told the devs of the other games after they were sure the fix they did on Cyberpunk worked, which left the others with not enough time to apply it. They didn't want to delay launches or forgo fixing other issues, because they probably assumed this feature is quite niche for now and they'd rather fix crashes and performance issues for lower-end systems.
DLSS 3 is already GODLIKE quality and will only get better in time. Meanwhile, AMD's FSR 3 is dead in the water because it's impossible to design a platform-agnostic frame generation technology. Nvidia proving yet again they are blazing the way forward while AMD is the cheap copy trying to keep up.
I just found out that I like this kind of content.
I'll answer the question so that people don't have to wonder. No.
Let's test the RTX 4060 Ti with DLSS 3 vs the Radeon RX 6800 XT with FSR 2 and compare cost per frame in $.
Complaining about frame gen in F1 is dumb because it's worthless in racing titles due to the latency. You shouldn't even be looking at that game.
It seems like DLSS 3 frame generation is a kind of "win more" deal. If you're already running the game at the level you want, it can just increase the framerate and smoothness, but if the game isn't running where you want it, it can't really fix your issues without making significant compromises elsewhere. Thank you for the very well made and easy to understand video! I understood on a basic level that DLSS used AI to make fake frames to increase framerates, but I didn't know at all what that meant in actual game implementation. Now I know where and how to use it :D
Not fond of per-game tweaks that are sometimes lost with updates; when you want to play your classic game and new drivers break it, it's hell. I would prefer finer user control, for example reverting to safer, slower iterations for older or unsupported games.
Dumb question. Can they evolve DLSS tech so that it only applies to 3D objects and not the 2D elements?
UI elements being garbled seems kind of unsolvable with the way DLSS works, to my understanding, without a complete overhaul of DLSS or the game engine (and I'm not sure how to do that practically). I don't really think we can fault the gamedevs for those artifacts, except for providing a DLSS option in the first place, which they probably shouldn't with this type of content. The problem is that those UI elements often don't write to the G-buffer and are rendered even when they're partially occluded, so there's no information there, like the depth of the text being drawn, for the supersampler to interpolate properly. There's a similar issue for objects behind transparent objects.
G-buffers are always single-layer, so to speak. The depth buffer only stores the depth of the frontmost geometry being rendered, even in state-of-the-art engines like UE5 that support transparency. And multi-layer G-buffers are typically too costly to even conceive (most production engines use 384+ bits per pixel). But without a multi-layer G-buffer and a multi-layer frame buffer, a supersampler can't interpolate things across time and space correctly for occluded objects that are still rendered behind or partially behind others, as in the case of UI/HUD elements and things behind transparent surfaces.
I think the only solution for games that want to lean heavily on DLSS is to avoid showing things like text and to limit transparency as much as possible, favoring more HUD-less designs and far fewer/smaller transparent/translucent surfaces. I can't see any uniform way to really correct this problem short of storing multi-layered, multi-depth G-buffers. DLSS always seemed a bit of a dead end to me. Dead ends and workarounds to existing problems can help for a while, and I'm not saying it's obsolete by any stretch, but spatiotemporal supersampling can't solve these types of issues. So for a while we can put in the workarounds and design games with content that works well with DLSS, but it's a type of hardware feature we need to outgrow as we find software techniques or hardware improvements that make it no longer necessary. It's a band-aid feature treating symptoms, so to speak, not the root problem, which is that we lack the software techniques and hardware to render fast enough at full resolution for high-res displays; the future definitely shouldn't require DLSS.
As an aside, I've been looking at the DLSS SDK for solutions to these types of problems. Maybe it's possible to render things like text as a post-process if DLSS can provide an intermediate/interpolated G-buffer or projection/view matrix (ideally an interpolated G-buffer, since we need the temporally interpolated depth information). I'm admittedly a novice with the DLSS API, but I still can't seem to find a way to do that. Even if it could, this is still a huge problem for 3D objects behind other transparent/translucent 3D objects, and some pipelines would need to change completely to even do this properly. DLSS is far from a solution that "just works".
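Since this thread keeps circling back to the "composite the UI after upscaling" idea, here is a minimal sketch of what that post-process step amounts to: straight alpha blending of a native-resolution UI layer over an already upscaled/interpolated frame. The function name and the NumPy array representation are my own illustration, not anything from the DLSS SDK.

```python
import numpy as np

def composite_ui(frame_rgb: np.ndarray, ui_rgba: np.ndarray) -> np.ndarray:
    """Alpha-blend a UI layer over a frame of the same resolution.

    frame_rgb: (H, W, 3) floats in [0, 1] -- the upscaled/interpolated frame.
    ui_rgba:   (H, W, 4) floats in [0, 1] -- UI rendered at native resolution.
    """
    alpha = ui_rgba[..., 3:4]                      # (H, W, 1), broadcasts over RGB
    return ui_rgba[..., :3] * alpha + frame_rgb * (1.0 - alpha)

# Tiny 2x2 example: opaque white "text" covers only the top-left pixel.
frame = np.zeros((2, 2, 3))                        # pretend this came out of the upscaler
ui = np.zeros((2, 2, 4))
ui[0, 0] = [1.0, 1.0, 1.0, 1.0]                    # opaque UI pixel
out = composite_ui(frame, ui)
# The UI pixel is never touched by upscaling/interpolation; everywhere else
# the frame shows through, which is exactly why HUDs composited this way
# can't be garbled by frame generation.
```

The catch, as the comment above notes, is that this only works for UI drawn strictly on top; anything that interleaves with 3D depth or sits behind transparency still needs information the single-layer G-buffer doesn't have.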
DLSS 3 actually fixed my CPU bottleneck for some reason. Didn't see anyone mention it.
6800 XT seems like a better deal imo
Nvidia, and you testers, should explain to us why frame generation gives only 25-50% more FPS vs plain DLSS! Aren't all frames doubled, or is there extreme overhead?
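A back-of-the-envelope answer to the question above: generating a frame isn't free, and that cost comes out of the time budget for real frames, so doubling the frame count doesn't double the FPS. The 3 ms overhead below is a made-up illustrative figure, not a measured number.

```python
def fps_with_framegen(base_fps: float, overhead_ms: float) -> float:
    """Effective output FPS when every real frame is followed by one
    generated frame, and generation adds a fixed per-pair overhead."""
    real_frame_ms = 1000.0 / base_fps
    pair_ms = real_frame_ms + overhead_ms   # one real + one generated frame
    return 2000.0 / pair_ms                 # two output frames per pair

print(fps_with_framegen(100, 0))   # 200.0 -- ideal doubling, zero overhead
print(fps_with_framegen(100, 3))   # ~153.8 -- +54%, not +100%
```

The shorter the base frame time (i.e. the higher the base FPS), the larger a fixed overhead is relative to it, which is consistent with the 25-50% gains reviewers report at high frame rates.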
So it's DOA...? Good!
If DLSS 3 needs 120 base FPS (interpolated to 240) to look and feel good, what is even the use of it? It would be useful if it allowed a 30 FPS RTX 4050 to get 60 FPS at 4K. But in its current state it's just a gimmick. It won't improve latency in esports titles, and you don't need 240 FPS in The Witcher 3.
Nvidia won't do squat. Neither will developers. Because it doesn't bring in money.
Cyberpunk 2077 is the best working game. How things have changed!
As a developer, I'm amazed at how badly DLSS 3 performs frame generation. There are vertex IDs in every 3D model, but the dumb software can't connect the dots between frames and keeps blurring them.
DLSS 3 has weird artifacts.
I am looking forward to the future of DLSS 3
Live frame generation isn't really a good idea with the current technology.
RDR2 has one of the most broken versions of DLSS and you never even spoke about it 😭
How does a frame cap work with frame generation? If I set the frame cap to 140 on a 144 Hz monitor, do I get a max of 70 real frames? If so, do I need a 240 Hz monitor to make use of real frame rates between 70 and 144 Hz?
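The arithmetic behind this question can be sketched as follows, assuming 2x frame generation and that the limiter caps presented (not real) frames; both are assumptions, since actual limiter behavior can differ per driver and game.

```python
def real_fps_under_cap(presented_cap: float, gen_factor: int = 2) -> float:
    """If every real frame yields `gen_factor` presented frames,
    a cap on presented frames implies this many real frames per second."""
    return presented_cap / gen_factor

print(real_fps_under_cap(140))   # 70.0  -- capped at 140 on a 144 Hz panel
print(real_fps_under_cap(240))   # 120.0 -- a 240 Hz panel leaves room for 120 real
```

Under these assumptions the commenter's reading is right: to present real frame rates between 70 and 144 with frame generation on, the display would need headroom up to double that, i.e. beyond 144 Hz.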
I hope all these issues are fixed by Diablo 4 :>
Good stuff
Just so you know, it's magic. You have no right to be upset about image enhancement and frame generation.
Otherwise you'll look like a child who's upset that his laptop doesn't have a touch screen. DLSS shouldn't have existed at all; it's a miracle that it exists. I'm an AI enthusiast, and it's a very amazing real-time technology.
By the way, I don't see the problem with DLSS-ing the UI separately. But there is a simpler, more cunning way: overlay the UI in its original form on top of the improved DLSS image. Of course, you'd have to add settings so the UI renders at a higher resolution than the game. But this problem was to be expected.
This tech is a revolution waiting to happen. Once this goes into mainstream cards / consoles, we’ll see a generational leap overnight.
165 Hz is “moderate” lmaooo
For anyone reading this who enjoys MSFS: Nvidia's DLSS 3 frame generation does wonders for this title! Crisp and clear gameplay for me. Yeah, I know there are fake frames, HOWEVER I don't notice any on my 4K monitor. I absolutely think DLSS 3 is awesome.
And my stance on this hasn't changed at all: frame generation brings nothing useful to gamers, because what we care about is response time, and that doesn't change and may actually get worse due to the processing load. The only way this will ever work properly is if the game engine is specifically told a frame is generated so it can take input on the ghost frames, and that won't happen for at least a few years, if ever. By then EVEN more powerful hardware will exist that can run this whole generation of games flawlessly anyway.