[–] hot_milky@lemmy.ml 4 points 1 year ago

It's not a prediction; {Company} will simply push whatever future benefits {Company}.

[–] Krotiuz@kbin.social 4 points 1 year ago (1 children)

I'm one of those people that uses DLSS, because I've got a large fancy 4K monitor that's big enough that it looks like shit at lower resolutions.

DLSS is better than nothing, but it's no replacement for native rendering. It introduces a heap of visual anomalies and inconsistencies, especially in games with constant motion (racing games look like shit with DLSS), so I wait until I'm seeing lows of 50fps on medium before I'll even think about DLSS.
I'm also pretty sure Nvidia is paying devs to have it on by default, because every time it's patched into a game it clears all the current graphics settings to turn on DLSS, at least in my experience.

[–] Nefyedardu@kbin.social 4 points 1 year ago

I hate how AI upscaling looks, and I really don't get why everyone seems to be gaga over it. In addition to the artifacts and other weirdness it can introduce, it just generally looks to me like someone smeared vaseline over the picture.

[–] Fizz@lemmy.nz 2 points 1 year ago (1 children)

Why is native gaming out and DLSS here to stay? I hate the feel and look of DLSS and FSR.

[–] refurbishedrefurbisher@lemmy.sdf.org 1 points 1 year ago* (last edited 1 year ago)

Because Nvidia wants an excuse to continue price gouging consumers on midrange cards.

[–] vrighter@discuss.tchncs.de 2 points 1 year ago (1 children)

I prefer native. If you can't render something, then just don't. Don't make everything else worse just so you can claim to use a feature, then make up junk to fill in the gaps. Upscaling is upscaling; it will never be better than native.

[–] Amir@lemmy.ml 1 points 1 year ago* (last edited 1 year ago) (1 children)

Have you tried DLSS Quality at 1440p or 4K? I genuinely think it gives better anti-aliasing than 4x MSAA or whatever you'd usually use.

[–] vrighter@discuss.tchncs.de -1 points 1 year ago

they have to "guess" what data they should fill up the missing data with. Or you could render natively and calculate, so you don't have to guess. So you can't get it wrong.

[–] mindbleach@sh.itjust.works 2 points 1 year ago

I like the concept. I don't like Nvidia making up neat gimmicks as anti-competitive behavior.

[–] lorty@lemmy.ml 1 points 1 year ago (1 children)

So long as games don't force it on, whatever. Although I expect it to become a requirement for a usable framerate in next-gen games. Big developers don't want to optimize anymore, and upscaling/framegen technologies are a great crutch.

[–] emptyother@programming.dev 0 points 1 year ago (2 children)

Of course nobody wants to optimize. It's boring. It messes up the code. It often requires cheating the player with illusions. And it's difficult. Not something just any junior developer can be put to work on.

[–] miss_brainfart@lemmy.ml 0 points 1 year ago (1 children)

You'd expect that when Raytracing/Pathtracing and the products that drive it have matured enough to be mainstream, devs will have more time for that.
Just place your light source and the tech does the rest. It's crazy how much less work that is.

But we all know the Publishers and shareholders will force them to use that time differently.

[–] DWin@sh.itjust.works 0 points 1 year ago

Eh, in my experience that's not how development works. With every new tool to improve efficiency, the result is just more features rather than using your newfound time to improve your code base.

It's not just the publishers and shareholders either. Fixing technical debt is hard, and the solutions often need a lot of time for retrospection. It's far easier to add a crappy new feature on top and call it a day. It's the lower-effort thing to do for everyone, management and the rank-and-file programmers alike.

[–] iegod@lemm.ee -1 points 1 year ago (1 children)

Who are you directing the comments at? The dev company or individuals? I disagree on the latter. On the former, I still think it's a mischaracterization of the situation. If the choice is to spend budget on scope and graphics at the expense of optimization, that doesn't seem a hard choice to make.

[–] emptyother@programming.dev -1 points 1 year ago

I might have generalized a bit too much. Of course some individual devs love the challenge of getting better performance out of anything.

But not enough of them that every dev company has an army of good developers with expertise in exactly the areas where they need performance. There are a lot of ways one dev can specialize: GPU APIs (DirectX/OpenGL/Vulkan/etc.), OS, game engine, disk access, database queries. Someone who knows a graphics API well might not know how to optimize database queries. Throwing money at the problem doesn't help either; those who know this stuff usually already have good jobs. So you might have no choice but to use the devs you have, and the money you have budgeted, to release the game within the contracted time.

[–] PenguinTD@lemmy.ca 1 points 1 year ago (1 children)

I don't get where this "raw pixels are the best pixels" sentiment comes from. Judging from the thread, everyone has their own opinion but hasn't actually looked at why people built these upscalers. Well, bad news for you: games have been using virtual pixels for all kinds of effects for ages, and your TV upscales its broadcast too (4K broadcast isn't that popular yet).

I play Rocket League with FSR from 1440p to 2160p and it looks practically the same as native 2160p, AND it feels more visually pleasing, since the upscale also serves as an extra AA filter that smooths and sharpens "at the same time". Frame rate is pretty important for older upscaler tech (or features like distance field AO), as many techniques rely on information from previous frame(s) as well.
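
To make that last point concrete, here's a minimal sketch (in Python, with illustrative names, not any real engine's API) of the temporal accumulation that TAA-style upscalers are built on:

```python
import numpy as np

def temporal_accumulate(current: np.ndarray, history: np.ndarray,
                        alpha: float = 0.1) -> np.ndarray:
    """Blend the current frame with (reprojected) frame history.

    A low alpha integrates more samples over time -- smoother, better AA --
    but ghosts badly when the history is wrong, which is why fast motion
    is the stress test. Real upscalers reproject and reject history per
    pixel; this toy version skips all of that.
    """
    return alpha * current + (1.0 - alpha) * history
```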

Traditionally, render engines did it the stupid way when the GPU was more powerful than the engine demanded: render at something like 4x resolution, then downscale for AA. Sure, it looks nice and sharp, BUT it's a brute-force, stupid way to approach it, and many follow-up AA techniques proved more useful for gamedev. Upscaler tech is the same. It's not intended for you to render 320x240 and then upscale all the way to 4K or 8K; it paves the way for better post-processing features and lighting tech like Lumen or raytracing/pathtracing to actually become usable in games with decent "final output". (Remember the PS4 Pro's checkerboard 4K? That was really decent, genuinely good tech for overcoming the PS4 Pro's hardware limits in more quality-demanding games.)
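
For contrast, a rough sketch of that brute-force supersampling approach (render_fn here is a hypothetical stand-in for the engine's renderer):

```python
import numpy as np

def render_ssaa(render_fn, width: int, height: int, factor: int = 4):
    """Brute-force supersampling AA: render at factor x the target
    resolution on each axis, then box-filter down. The oversized
    render_fn call is exactly the cost upscalers try to avoid."""
    hi_res = render_fn(width * factor, height * factor)   # (H*f, W*f, 3)
    blocks = hi_res.reshape(height, factor, width, factor, 3)
    return blocks.mean(axis=(1, 3))  # average each factor x factor block
```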

In the end, consumers vote with their wallets for nicer-looking games all the time; that's what drives developers toward photoreal/feature-film-quality rendering. There are still plenty of studios geared toward stylized or pixel-art looks, and everyone flips their shit and praises them, while that tech mostly relies on the underlying hardware advances pushed by the photoreal approach; they use the same pipeline, just in their own way, to reach their desired look. Octopath Traveler II used Unreal Engine.

Game rendering is always about trade-offs. We've come a LONG way and will keep pushing boundaries. Will upscaler tech become obsolete somewhere down the road? I have no idea; maybe AI will generate everything at native pixels, right?

[–] miss_brainfart@lemmy.ml 1 points 1 year ago* (last edited 1 year ago)

I don't have anything against upscaling per se, in fact I am surprised at how good FSR 2 can look even at 1080p. (And FSR is open source, at least. I can happily try it on my GTX 970)

What I hate about it is how Nvidia uses it as a tool to price gouge harder than they've ever done.

[–] hubobes@sh.itjust.works 1 points 1 year ago (1 children)

But it looks bad and frame gen has very noticeable latency.

[–] exscape@kbin.social -1 points 1 year ago (1 children)

It doesn't always look bad; it's game-dependent.

In Hardware Unboxed's testing, it's better than native in some titles, and better than or equivalent to native in half of the titles tested.

It looks way better than native w/ TAA for me in BG3 (at 1440p). TAA is way too blurry. And yet it's also faster.

[–] hubobes@sh.itjust.works 1 points 1 year ago

That's, like, just their opinion. In every single game I tried, it looked worse than native. I always try it first and then usually disable it.

[–] Maruki_Hurakami@lemm.ee 0 points 1 year ago (1 children)

Maybe I did something wrong but I tried DLSS in BG3 with my 2080 and it looked really bad.

[–] jacaw@sh.itjust.works 0 points 1 year ago (1 children)

For some reason, Larian shipped an old version of DLSS with the game. It looks better if you swap out the DLL for a newer one. I use DLAA on my 3070 Ti and it looks good, but I did have to swap the DLL.

[–] SHOW_ME_YOUR_ASSHOLE@lemm.ee 0 points 1 year ago (1 children)

I'm down to try that on my 3070 laptop. Are there good instructions for this process?

[–] jacaw@sh.itjust.works 1 points 1 year ago* (last edited 1 year ago)

You can either use DLSS Swapper or manually download a new DLL and drop it in yourself. It's essentially just replacing the nvngx_dlss.dll in the game's directory with a new one.
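
If you go the manual route, the whole swap is just a copy with a backup. A small sketch in Python (the paths are examples only; adjust for your own install):

```python
import shutil
from pathlib import Path

# Example paths -- point these at your own game directory and download.
game_dll = Path(r"C:\Games\Baldurs Gate 3\bin\nvngx_dlss.dll")
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")  # the newer DLSS release

backup = game_dll.with_name(game_dll.name + ".bak")
if not backup.exists():          # keep the shipped DLL, just in case
    shutil.copy2(game_dll, backup)
shutil.copy2(new_dll, game_dll)  # drop the replacement in place
print(f"Swapped {game_dll.name}; backup at {backup.name}")
```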

There are some issues, though. For example, upgrading from a version prior to 2.5.1 disables the sharpness slider (built-in sharpening was removed from DLSS in 2.5.1). I mitigate this by using DLSSTweaks to force preset C, which favors the newest frame more heavily.