Borderlands 4 has had some varying results re: launch performance, and it seems to have re-ignited a debate over DLSS/Frame Generation. I think it’s an impressive technique that gives older PCs a little more staying power/stability, but there’s obviously some concern it’s being leaned on as a crutch for developers to avoid the need to properly optimize their engines.
What are your thoughts on DLSS (and similar technologies)? Do you use it and have you found it useful? Is it a cheat around better codework, and more importantly, does that even matter if the end result is that a game runs smoothly?
Does that technology have the plague of “hallucinated” information, which is common with AI-generated stuff?
Yes, it introduces ghosting and makes everything a motion-blur-like smeary mess, even with motion blur turned off. Especially at lower output framerates.
I think it’s actually pretty awful and yet people seem to love it, as if framerate is more important than anything else.
People obsessing over a single metric to the point of detriment to everything else? Didn’t we go through this with graphics and trying to achieve hyper realism?
So does TAA, often to an even greater extent. The other options are no AA, which leads to jaggies, or supersampling, which will tank your FPS.
All methods have their pros and cons. What’s worse is a matter of taste.
As a game developer: yes, it is overused (just like all the other AI tools), and it lets studios avoid actually doing a good job of optimising performance.
More specifically, if you can’t be bothered to write optimised code, then you also don’t care enough to write bug-free code, and you are probably creating an unmaintainable mess.
Typical developer: In future it will be maintained by AI anyway, so it is not my problem. 🙂
As a developer myself: and then they go to release on console and discover the Series S.
The Series S doesn’t have DLSS or the like and is quite low-spec, and MS insists devs must support it if they want to release on Xbox at all – so it forces a lot of optimisation.
There is a saying for what is happening right now: Minimum Viable Product. We’re being given the worst form the product can take because it’s cheaper to produce that way, and people think they can get away with it because technology exists to excuse their fuck-ups. Unfortunately, the consumer is calling out their laziness because it’s so overtly visible. Randy fucked up, and fucked up again when he tried to deflect from his failings. Now, here’s the question: if DLSS wasn’t a thing, would Randy have let the game be released in such a messy state to cut down…
Those were definitely all words.
I’m gonna go boot up Fallout New Vegas again…
Side note: playing Borderlands on console has been pretty smooth in general but I had to turn a hell of a lot of things off to avoid the massive motion sickness it was giving me. No FOV slider on console seems to be another huge gap for them.
Based on my experience as a gamer, I don’t think developers would properly optimize their engines regardless of whether they used DLSS. If they didn’t have things like DLSS to help, they would still push it out and just raise the system requirements. Everything is secondary to the deadline. Once it hits, optimization or no, it goes out. Heck, sometimes the game isn’t even working and they still ship it.
I’m just… NOT going to open anything that’s bigger than 100 GB? It’s a good place to start.
I take GamersNexus’ stance on DLSS 4: Frame Generation is “fake frames”, and Nvidia’s claims about “4090 performance in the 5070” are complete bullshit.
That said, DLSS 3 is a perfectly fine technology for eking a few extra FPS out of games your GPU isn’t quite up to snuff for. That’s just running the game at a lower resolution and using real-time AI upscaling to get back to native res.
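To make the "render low, upscale to native" idea concrete, here's a rough sketch of the internal resolutions the GPU actually renders at in each upscaling preset. The per-axis scale factors below are the commonly quoted ones for DLSS Super Resolution; actual values can vary by game and DLSS version, so treat them as illustrative, not authoritative.

```python
# Commonly cited per-axis render-scale factors for DLSS Super
# Resolution presets (illustrative; exact values vary by title).
SCALE = {
    "Quality": 2 / 3,           # ~67% per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, preset):
    """Resolution the GPU renders before AI upscaling to output size."""
    s = SCALE[preset]
    return round(out_w * s), round(out_h * s)

# A 4K output in Performance mode is really rendered at 1080p:
print(internal_resolution(3840, 2160, "Performance"))  # -> (1920, 1080)
```

This is why the technique helps weaker GPUs so much: Performance mode at 4K means shading only a quarter of the output pixels.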
I have an older GPU (RTX 2060) and a laptop which is not exactly a powerhouse. So I like what DLSS can do for me with regards to games which would otherwise look or run way worse on those systems.
But if you have a fairly powerful new system and need to resort to these AI shenanigans right away with a new release then I would call it a crutch.
DLSS in quality mode, in most games I’ve tried, has given better image quality than TAA, plus better performance. When that’s the case, I don’t see any issue.
Even frame generation has its uses, but it’s not a panacea. It does nothing for input latency, for example.
I’m just annoyed by the CPU core-count requirements. I was playing the Oblivion remaster just fine on my i7-7700K until they patched it. I haven’t even bought Borderlands because it has the same requirements, so I guess it’s time to upgrade.
But in general, games have been doing that for a long time: put in as much work as the publisher will allow, then patch afterwards. Now they use new tools to skimp even further.