Borderlands 4 has had varying results with its launch performance, and it seems to have reignited the debate over DLSS/frame generation. I think it’s an impressive technique that gives older PCs a little more staying power/stability, but there’s obviously some concern it’s being leaned on as a crutch for developers to avoid properly optimizing their engines.
What are your thoughts on DLSS (and similar technologies)? Do you use it and have you found it useful? Is it a cheat around better codework, and more importantly, does that even matter if the end result is that a game runs smoothly?
Does that technology suffer from the “hallucinated” information that plagues AI-generated stuff?
Yes, it introduces ghosting and makes everything a motion-blur-like smeary mess, even with motion blur turned off. Especially at lower output framerates.
I think it’s actually pretty awful and yet people seem to love it, as if framerate is more important than anything else.
People obsessing over a single metric to the detriment of everything else? Didn’t we go through this with graphics and trying to achieve hyper-realism?
We go through this with every single thing.
So does TAA, often to an even greater extent. The other options are no AA, which leads to jaggies, or supersampling, which will tank your FPS.
All methods have their pros and cons. What’s worse is a matter of taste.
The new versions are much better than they used to be. The “Balanced” and especially “Quality” modes in the latest DLSS version are virtually indistinguishable from the native rendering.
Except that in reality DLSS actually massively reduces ghosting and smearing in most cases, and frame gen tends to make motion clearer as well (even though the generated frame quality can be iffy, you generally don’t see the artifacts.)
Gotta be honest, I haven’t seen the difference. Having said that, I use it to get from 45-50 FPS to 60, and with that starting point it works really well. I’ve heard that if you’re getting 15-20 FPS then yeah, it looks like a melted turd, which I can understand, as there aren’t enough reference frames.
So I am running B4 on a 4080 laptop GPU, at 1440p with settings mostly on high. Without frame generation I’m sitting between 60-80; with frame generation I am mostly at 120. The crazy thing is, the input latency, while there, is so minimal it doesn’t bother me, and it has extremely bothered me in every other game I have tried it on. Visually I see very, very little blur. This is the first game where I have actually left it on. Borderlands 4 definitely needs better optimization, as it is not that great-looking of a game, especially given how…
No. Older versions could have ghosting problems, particularly when you were dealing with things like bright lines on dark backgrounds, but it never invented things that aren’t there. New versions of the software (which are mostly backwards compatible to old hardware) are virtually indistinguishable from native rendering on the medium and high quality presets.
As a game developer: yes, it is overutilized (just like all the other AI tools), and it is used to avoid actually doing a good job of optimising performance.
More specifically, if you can’t be bothered to write optimised code, then you also don’t care enough to write bug-free code, and you are probably creating an unmaintainable mess.
Typical developer: In future it will be maintained by AI anyway, so it is not my problem. 🙂
Yeah, I’m sorry, but I’m going to put a few question marks next to the notion that everyone in every profession is just an opportunist looking for the easiest route to the highest paycheck. I know a few developers; they’re not the ones who look at it like this. That’s usually more the people on the finance side. And even though I only know a few developers, the same goes for every other profession in my experience: most of the people who make stuff want to be able to take pride in their work and have a personal interest in doing a…
Thank you Mor for reading me correctly, and xvasek for re-reading it. It is refreshing to see proper conversation on the Internet 🙂
In my defense, I responded quickly, upset about yet another big game coming out in bad shape, with AI used as a bandaid.
As Dorander put it perfectly, the devs don’t want to be sloppy, it is pressure from management that makes us cut corners and program things in a less than ideal way. A good programmer really takes pride in their work, and is not happy when half-baked solutions get released.
Xvasek, I’m pretty sure he’s criticizing it. Please re-read this sentence:
He’s clearly having an issue with it. And it’s understandable. The moment we allow technology to take over, we will become sloppy at what we do. Do that enough times and no AI is going to save your project.
Sorry for this, the comment was meant to be ironic. 🙁 Now reading it again: OK, it was a really bad joke, but the sad point is that everybody is taking it seriously. The world has really changed…
Nah, it wasn’t even that bad of a joke, but way too many people read it as addressed at harinezumi instead of just being about some hypothetical typical corpo dev.
Sometimes jokes just don’t come across very well in text, you might hear your own voice in your head when you’re writing it but other people don’t have that advantage. Lay it on thicker next time? 😀
You are right though, something has changed. There’s so much insanity being spoken seriously these days that it’s become harder to tell if somebody’s joking or not.
As a developer myself, I’ll add: then they go to release on console and discover the Series S.
The Series S doesn’t have DLSS or the like, and is quite low spec, and MS insist devs must support it if they want to release on Xbox at all – so it forces a lot of optimisations.
There is a saying for what is happening right now: Minimum Viable Product. We’re being given the worst form the product can take because it’s cheaper to produce that way, and people think they can get away with it because technology to excuse their fuck-ups exists. Unfortunately, the consumer is calling out their laziness because it’s so overtly visible. Randy fucked up, and fucked up again when he tried to deflect from his failings. Now, here’s the question: if DLSS wasn’t a thing, would Randy have let the game be released in such a messy state to cut down…
Yeah, I’m sorry, but that’s just not what Minimum Viable Product means. MVP (heh) is a strategy in which you first make sure the basics of your product are in order (minimum viability) and then start expanding with additional functions. This strategy is frequently a good idea. It has nothing to do with offering a piece of basic junk and then never expanding on it. I’ll give you a real-life example. A few years ago I was part of a group that was responsible for rolling out a big piece of educational software that was replacing a similar program after…
Those were definitely all words.
I’m gonna go boot up Fallout New Vegas again…
Side note: playing Borderlands on console has been pretty smooth in general but I had to turn a hell of a lot of things off to avoid the massive motion sickness it was giving me. No FOV slider on console seems to be another huge gap for them.
Having played on medium quality and/or 30FPS for most of my life, enabling DLSS for better framerate and maybe a few artifacts no worse than janky AA or pop-in or whatever else seems like a perfectly fine tradeoff.
Haven’t had much of a use case for frame gen, tho.
Based on my experience as a gamer, I don’t think developers would properly optimize their engines regardless of whether they used DLSS or not. If they didn’t have things like DLSS to help, they would still push the game out and just up the system requirements instead. Everything is secondary to the deadline. Once it hits, optimization or no, it is going out. Heck, sometimes the game isn’t even working and they still ship it.
This is why I’ve flat out stopped pre-ordering and will not buy a game until it’s been properly tested and optimized, and has had a few patches.
Since somebody’s likely to come along and tell me that I’m hurting the industry by relying on other customers for this: the fact that testing now has to happen post-release, by other consumers, is a tragedy, but I’m not responsible for that choice. The industry decides to work this way, not me.
I mean pre-ordering wasn’t bad, until everything started going digital and developers realized that all they had to do was sell you a key to gain access to the “game”.
Pre-ordering has always been bad. Money has a time value, and while there’s not a lot of time value to $80.00, the amount that can be made off of 1,000,000 people’s $80 is staggering. Pre-ordering is simply a way for companies, especially AAA ones, to make massive interest on your money without even giving you a product in return. That’s truly the only reason it ever became a thing. If ever there’s a game you KNOW you want for sure, and you really want the “pre-order bonuses” to go with it, wait until just before it comes out…
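For what it’s worth, here’s a back-of-the-envelope on that interest claim (the 5% annual rate and three-month holding period are my assumptions, not figures from the comment):

    # Rough back-of-the-envelope for the "time value" claim above.
    # The 5% rate and 3-month hold are illustrative assumptions.
    preorders = 1_000_000
    price = 80.00
    annual_rate = 0.05
    months_held = 3

    pool = preorders * price  # $80,000,000 collected up front
    interest = pool * annual_rate * months_held / 12
    print(f"pool: ${pool:,.0f}, interest earned: ${interest:,.0f}")
    # pool: $80,000,000, interest earned: $1,000,000

Even at modest rates, that’s seven figures of float on a single AAA launch.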
Maybe you are hurting the industry, maybe you aren’t, but it’s not your responsibility to avoid doing so. That industry exists to sell you a product, and they’re doing an absolutely terrible job of making that sale. So yeah, they’re the ones that get hurt by their stupid, short-sighted development decisions. I too won’t pre-order anything, and I’m incredibly skeptical of early access titles now as well. I think Abiotic Factor and Enshrouded are the only early access games I’ve been willing to buy into in recent years.
Exactly. When did it become the customer’s responsibility to quality-test products, and to pay for the privilege, rather than the company demonstrating the product’s qualities so we can make an informed decision on whether or not to spend our money?
I’m just… NOT going to open anything that’s bigger than 100 GB? It’s a good place to start.
I take GamersNexus’ stance on DLSS 4: Frame Generation is “fake frames”, and Nvidia’s claims about “4090 performance in the 5070” are complete bullshit.
That said, DLSS 3 is a perfectly fine technology for eking a few extra FPS out of games that your GPU isn’t quite up to snuff for. That’s just running the game at a lower resolution and using real-time AI upscaling to get back to native res.
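To make the “lower resolution, upscale back” point concrete, here’s a rough sketch using the per-axis render scales Nvidia has published for the DLSS presets (treat the exact ratios as approximate):

    # Internal render resolution per DLSS preset; the output frame is
    # AI-upscaled back to native. Scale factors are per axis.
    PRESETS = {"Quality": 0.667, "Balanced": 0.58,
               "Performance": 0.50, "Ultra Performance": 0.333}

    def internal_resolution(out_w, out_h, preset):
        s = PRESETS[preset]
        return round(out_w * s), round(out_h * s)

    w, h = internal_resolution(3840, 2160, "Quality")
    share = (w * h) / (3840 * 2160)
    print(f"{w}x{h}, shading {share:.0%} of native pixels")
    # 2561x1441, shading 44% of native pixels

That pixel saving is where the extra FPS comes from.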
I have an older GPU (RTX 2060) and a laptop which is not exactly a powerhouse. So I like what DLSS can do for me with regards to games which would otherwise look or run way worse on those systems.
But if you have a fairly powerful new system and need to resort to these AI shenanigans right away with a new release then I would call it a crutch.
Personally, I think DLSS is a crutch for Nvidia. I’m calling them out because DLSS came first, and AMD had to follow suit with a similar technology. Basically, it’s a way for them to show an “increase” in performance by using an upscaler. Instead of improving their chips more, they used the crutch of DLSS to show theirs is “better” and get a product out on shelves. To me, it doesn’t feel like a great value for my dollar, as I’m now paying for “filler” and not raw performance. I do have to hand it to AMD, though: by…
DLSS in quality mode, in most games I’ve tried, has given better image quality than TAA, plus better performance. When that’s the case, I don’t see any issue.
Even frame generation has its use, but it’s not a panacea. It does nothing for input latency, for example.
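A toy model of that latency point, assuming 2x interpolation-style frame generation that holds back roughly one real frame (a simplification, not any vendor’s exact pipeline):

    # Generated frames are interpolated between two real frames, so input
    # is still sampled at the real-frame rate, and the interpolation adds
    # roughly one real frame of buffering. Simplified model.
    def framegen(base_fps, multiplier):
        displayed_fps = base_fps * multiplier
        real_frame_ms = 1000 / base_fps   # input cadence, unchanged
        buffering_ms = real_frame_ms      # ~1 real frame held back
        return displayed_fps, real_frame_ms, buffering_ms

    fps, cadence, extra = framegen(60, 2)
    print(f"{fps} fps displayed; input sampled every {cadence:.1f} ms "
          f"plus ~{extra:.1f} ms of buffering")
    # 120 fps displayed; input sampled every 16.7 ms plus ~16.7 ms of buffering

Which is why the image can look smoother while the controls feel no more responsive.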
I’m just annoyed by the core requirements on CPUs. I was playing the Oblivion remaster just fine on my i7-7700K until they patched it. I haven’t even bought Borderlands because it has the same requirements… so I guess it’s time to upgrade.
But in general, games have been doing that for a long time: put in as much work as they can push the publisher for, then patch afterwards. Now they use new tools to skimp even further.
What I liked about DLSS and FSR is that they made systems capable of a tier up in resolution. Systems that were good at 1080p were suddenly comfortably doing 1440p, and so on. Just like that, 4K wasn’t just for the high end; it was in reach of mid-range, with 8K being eyed as the next step for enthusiasts. It is disheartening to hear companies try to undo this, and especially to see fans pretend that it is perfectly normal again to have to set everything to the lowest setting to play at 4K, claiming it just isn’t time yet for 4K…
I have mixed feelings about this. On the one hand, I fully support and endorse companies including ultra-high-end settings that exceed the top end of what modern machines can support: 8K+ resolutions, 500 Hz refresh rates, ultra-realistic complex shadows and lighting, etc. I view this as future-proofing, putting in extra work to ensure the game can look good on the gaming machines of five years from now. This is a good thing. On the other hand, if just doing 1440p with fancy graphics on par with other modern games for some reason…
Let’s be fair: we’ve never really had insight into the code behind any game. Whether a game was deemed “optimized” or not was only ever measured by the end result that consumers experienced. Smooth experience? Game must be “properly optimized”. Glitches, stuttering, tearing, and whatnot? Game must be “badly optimized”.
So if that end result is a positive experience of a smooth running and good looking game, why would we suddenly now start caring what’s “under the hood” when we never really cared how it was done before?
DLSS is … fine … if you want to up your frame rate by rendering at a lower resolution and then upscaling. If a game comes with it on by default I’ll leave it to see if it feels or looks wrong. So far, no issues.
But frame generation? The issue there is that it’s *not* something to lift up a lower-spec PC if that PC can’t manage a high enough frame rate “naturally”. It’ll just start feeling floaty (only real frames are informed by user input), and be vulnerable to artifacts from *multi*-frame generation.
Developers (and the misers they’re leashed to) using new tech as a crutch to get away with badly optimized code is a tale way older than most of you reading this comment. It doesn’t really matter what you think of it, because all of you will vote with your voices and practically none of you will vote with your wallets and feet, so why would they care?
This is absolutely true. DLSS is being used as a crutch to avoid optimizing games.
The other side of the issue is that the game optimization that IS still being done is done backwards. They first create the game, then at the very end try to quickly optimize where they still can before or even after release. Optimization should be a part of the process from the very beginning. But big companies don’t want that, as then development time would increase and the big companies wouldn’t be able to keep the yearly release cycle going…
It is absolutely a crutch. The AAA industry can’t be trusted with anything.
My CPU and graphics card don’t meet the minimum for Borderlands 4. Since my computer came with Windows 8 and is upgraded to 11, it’s not surprising.
It’s a tool, just like any other. It can be used properly for great benefit, or be used as a crutch by teams who are not given the time by higher ups to do things properly. If you put bean-counters in charge of game development… that’s exactly what’s going to happen. The game runs with X, you get X. That’s it. No thought put to proper development and engineering.
So, the last bastion of code optimization (games) is about to fall? The world of ridiculous hardware requirements is probably here to stay.
The ensh*ttification of the world continues.
I think the DLSS thing matters way less than the condescending “fuck you, you try doing X thing if you think you’re so smart” attitude from devs.
Running Flight Simulator 2024 on a Core i5-9600K and an RTX 4060 in 4K. I’m quite thankful for DLSS, because if it weren’t there, I’d be playing a mess of pixels.
I could run it in FHD or some other resolution instead of 4K, but on my screen it looks really awful when I lower the resolution.
I love DLSS, but yeah, it can be viewed as a crutch sometimes by developers. In this case, though, I think Randy is just using it as an excuse. People forget that BL3 also launched in nearly as poor a shape (and never got fixed), and that was before upscaling was a thing.
I’ve had good luck with BL4 on my nearly five-year-old PC, though it’s seen some upgrades since the beginning. I have mixed feelings about DLSS, as I’ve always wanted a PC powerful enough to run at “Maximum Pretty” on whatever games I had at the time. And I used to be able to do that with mid-to-high-tier GPUs. Now it seems that with DLSS and frame gen stuff, even the mighty top-of-the-line card can’t quite keep up. Although I also remember a time when game devs would occasionally put something out with graphics options…
I have to confess that I don’t understand any of this
Part of the problem is Nvidia; the other part is Unreal. Basically, Nvidia is promising graphical standards that even the 5090 can’t deliver on. Unreal games are built on those promises, and you end up with a cluster cluck. Like, 4K gaming on ultra is just not a thing that works; maybe on medium, but even on the most high-end card, path tracing just eats too much processing power for a 5090 to handle. To hide this, Nvidia has pushed DLSS, upscaling, and frame gen. So even though you may get enough frames, your latency is horrible. Now, I have…
The main problem is that “older PCs” don’t have the capability for frame generation at the level required for Borderlands 4 to run smoothly. The people who actually need frame gen don’t have it, and the people with GPUs new enough to support it shouldn’t need it in a cel-shaded game. Also: input lag. It’s definitely a crutch the industry is leaning on way too hard. Personally, I’m not interested in frame generation at all.
Gearbox can go fu*k themselves. I refuse to use any of that crap and would rather turn down the settings; it just doesn’t look great. And their terrible optimization makes them even worse.
EVERY AAA UE5 game runs like shit when it’s released now. They all go by the “release now, fix later, fu*k the gamers” motto. They already got your money, so they don’t give a fuck.
The tech is certainly very useful, but like any technology it can make things look worse if it isn’t used properly, as Jedi: Survivor’s infamous fizzle can attest.
Most of the time I just use whatever the game defaults to. I have noticed a few games be “smeary”, but usually only games whose graphics I was going to be adjusting anyway.
It feels like AAA studios, at least, are relying on it to forgo optimization, with Borderlands 4 being the prime example.
What is this and do I need it for NetHack? 😉
Yes I use it and love it. But it also shouldn’t be required for top end PCs to run the game at 60FPS and max details.
Yeah, the dev of that game seems to aggressively try to get everyone to hate him… considering what a disaster BL3 was, I wasn’t going to get 4 until I saw “extremely good reviews”… and then the reviews for the game were bad, the performance was bad… and then the dev was extremely anti-consumer… yeah, won’t be getting that game.
Missed opportunity to have 4 plates on the bar instead of just 2.
The problem isn’t with Gearbox not optimizing on PC. The problem is GPU power stagnation, and the blame *should* be on Nvidia and AMD for allowing it. Every Nvidia chart nowadays has that little “using DLSS” asterisk on it. In a perfect world, DLSS and similar tech would be used to let lower-end PCs play higher-end stuff, but that’s not the reality. Also, DLSS and similar tech is used as an anti-aliasing solution as well, so it’s double-dipping in the performance category. I would love to have high-end GPUs…
Frames are frames. Insert Mr Incredible meme here.
I don’t consider the generated frames to be “fake” even if they did come from an AI.
Frame generation gives my computer a little extra overhead so that it can handle higher settings without taking as much of a hit to game performance. That said, it can become a crutch for unoptimized code, and I’d rather have an optimized game with a higher frame rate than have to rely on frame generation.
I use it to lower wattage in my games. Civ VII does not need full rendering of every frame, but when I scroll, I do not want to see lag. And I do not need the full power of my GPU running when it’s just replaying the near-same frame over and over again.
DLSS drastically lowered my GPU load, which means fewer watts to pay for.
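A rough sketch of that wattage argument, assuming a 300 W card and GPU load scaling roughly with pixels shaded (both are illustrative assumptions, not measurements):

    # DLSS renders internally at a lower resolution, so fewer pixels are
    # shaded per frame; under a crude load-proportional model, power
    # drops with the pixel count. The 300 W TDP is an assumed example card.
    tdp_watts = 300
    native_pixels = 3840 * 2160      # 4K output
    internal_pixels = 2560 * 1440    # e.g. rendering internally at 1440p
    load_fraction = internal_pixels / native_pixels
    saved = tdp_watts * (1 - load_fraction)
    print(f"~{saved:.0f} W less draw under the proportional-load model")
    # ~167 W less draw under the proportional-load model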
If Gearbox can’t be arsed to optimise the game prior to release, then I can’t be arsed to pay release price for it.
Man, am I glad I’ve always hated Borderlands.