this post was submitted on 16 Sep 2024
77 points (92.3% liked)

PC Gaming

top 37 comments
[–] Kyrgizion@lemmy.world 106 points 3 days ago (3 children)
[–] alessandro@lemmy.ca 52 points 2 days ago

Can’t or won’t?

"money"

[–] dan1101@lemm.ee 3 points 2 days ago

Bosses said use AI so we use AI.

[–] givesomefucks@lemmy.world 1 points 2 days ago (1 children)

Why wouldn't it?

It's talking about two things labeled "AI", which is actually a pretty fair use of the label:

  1. Rendering at a lower resolution and upscaling the result.

  2. Generating additional frames based on what might happen in between real frames.

There's no valid reason not to use that. Raw hardware costs more, so you'd be paying a lot more money for the same performance. With fewer people making that choice, the price difference would be even greater.

Like, this is right. They can't make GPUs without this at a price low enough that people will buy them.

It's facts bro
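
Rough sketch of what those two things mean, in plain numpy (the real DLSS/FSR pipelines use trained neural networks plus motion vectors; this is just naive interpolation for illustration):

```python
import numpy as np

def upscale_nearest(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Point 1: render at low res, then blow it up by repeating pixels."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

def interpolate_frames(prev: np.ndarray, nxt: np.ndarray) -> np.ndarray:
    """Point 2: guess an in-between frame as a 50/50 blend of two real frames."""
    blend = (prev.astype(np.float32) + nxt.astype(np.float32)) / 2
    return blend.astype(prev.dtype)

low_res = np.random.randint(0, 256, (540, 960, 3), dtype=np.uint8)   # "rendered" at 960x540
shown = upscale_nearest(low_res, 2)                                   # displayed at 1920x1080
frame_a = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
frame_b = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
generated = interpolate_frames(frame_a, frame_b)                      # the extra "fake" frame
```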

[–] Eldritch@lemmy.world 2 points 2 days ago (1 children)

In the future it'll be less about upscaling and more about giving the algorithm a minimal, basic 3D representation as a starting point and telling it to make the result photoreal. Ray tracing isn't really going anywhere, but AI radiosity is going to supplant it in many applications.

Think about it: these algorithms are already producing impressive, if uncanny, images from simple text prompts in less time than it would take most CPU/GPU combinations on consumer hardware to actually ray trace a scene. Ray tracing will always be there when you need the accuracy, but AI radiosity is going to offer benefits most people don't even comprehend yet.

For instance, once it makes its way into consumer hardware, a lot of older games will suddenly be able to have their graphics upgraded with no recoding or tricks, just using the input video stream as a reference.

[–] givesomefucks@lemmy.world 1 points 2 days ago (1 children)

That's an entirely different thing and not happening anytime soon...

That's the worst thing about labeling this stuff "AI": it gets lumped in with crazy, non-feasible shit like what you're talking about.

[–] Eldritch@lemmy.world 3 points 2 days ago (1 children)

Wrong as usual. Here's a video from two years ago, from an actual light transport researcher in 3D rendering. Again, this likely won't be on your 50-series GPU, but it is something being very actively researched, and it will likely start showing up in consumer hardware before the end of the decade.

[–] givesomefucks@lemmy.world -4 points 2 days ago (1 children)

A two year old YouTube prediction with no other evidence?

Obviously you're a man of science...

[–] Eldritch@lemmy.world 4 points 2 days ago (1 children)

It's literally just a two-to-three-minute-long video, with links to the research paper in question. Perhaps you should read more and talk less.

[–] givesomefucks@lemmy.world -1 points 2 days ago* (last edited 2 days ago)

It's weird that on a day-old thread you immediately get two extra upvotes and I get three downvotes as soon as you reply...

I don't help people who play those silly games. Have a nice life.

[–] _sideffect@lemmy.world 10 points 2 days ago

Yeah, that's a big load of bullcrap.

[–] henfredemars@infosec.pub 48 points 3 days ago (1 children)

That's rather depressing to hear. AI is often used as a crutch to pave over crappy code that would cost money to properly optimize. Maybe Nvidia is also using AI as a crutch instead of developing better GPUs that can actually render more pixels?

[–] catloaf@lemm.ee 10 points 2 days ago (1 children)

Usually people are against just throwing more hardware at a problem.

They're going to keep making more powerful hardware either way, since parallel processing capability supports graphics and AI just fine. But if they can use a novel software solution to drastically increase performance, why not?

[–] tunetardis@lemmy.ca 10 points 2 days ago (2 children)

They’re going to keep making more powerful hardware either way, since parallel processing capability supports graphics and AI just fine.

It's not quite as simple as that. AI needs less precision than regular graphics, so chips developed with AI in mind do not necessarily translate into higher performance for other things.

In science/engineering, people want more precision, not less. So we look for GPUs with capable 64-bit floating point, while AI is driving the industry in the other direction, from 32-bit down to 16-bit.
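
To make that concrete, here's a tiny numpy example (illustrative only): a small increment that 64-bit floats track fine simply vanishes at the 16-bit precision AI inference often runs at.

```python
import numpy as np

tiny = 1e-4
print(np.float16(1.0) + np.float16(tiny))  # 1.0     -> the increment is lost entirely
print(np.float32(1.0) + np.float32(tiny))  # 1.0001
print(np.float64(1.0) + np.float64(tiny))  # 1.0001
```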

[–] catloaf@lemm.ee 4 points 2 days ago (1 children)

For science and engineering, workstation cards like the A6000 aren't going anywhere.

[–] henfredemars@infosec.pub 1 points 2 days ago

That’s true, but I would like to see improvements driven in the consumer segment as well. AI rendering is a nice software addition, but I could easily see it becoming a distraction from hardware improvements.

Consumers generally can’t just throw more money at a problem the way professionals and businesses can.

[–] averyminya@beehaw.org 3 points 2 days ago

It's funny because we don't even need GPUs. There's tech that offloads the model's "search" to an analog computer, which is ~98% accurate for a fraction of the energy.

I imagine NVIDIA isn't too excited about that side of AI, though.

[–] Mango@lemmy.world 34 points 2 days ago (1 children)

AMD can.

Sucks to suck, Nvidia!

[–] d00ery@lemmy.world 9 points 2 days ago (1 children)
[–] Mango@lemmy.world 8 points 2 days ago (1 children)

K, but do we NEED it? Can we not continue without it?

[–] d00ery@lemmy.world 2 points 2 days ago

I should say that that's a fairly recent news article talking about the next FSR in development.

As to if we need it ... I don't know, we certainly managed a long time without AI everything!

[–] TommySoda@lemmy.world 18 points 3 days ago* (last edited 2 days ago) (2 children)

This just seems like they are trying to take a shortcut that might end up having unforeseen consequences. I have no problem with AI upscaling as a technology. It's already proven its merit with almost all triple A games that have come out in the past few years. But this just seems like a way to push the cost off onto consumers by making them buy more expensive hardware at the cost of efficiency. Games are so poorly optimized these days that this just seems like another way to release games that run like ass. If you see this as a benefit in any way, just remember that we will all be paying the extra cost that they get to save.

And of course there's gonna be people that'll just be like "upgrade your PC, bro" which just makes us fight amongst ourselves instead of fighting the companies that are fucking us over. We'll fight each other for hours on end about how shitty someone's PC is before we even consider that the game they are playing is so poorly optimized it's a miracle it even works on a high end PC. It's already to the point that a $4,000 PC isn't even enough to play some common triple A titles at a good frame rate. I can play God of War at the highest setting with no issues whatsoever but can't even play Jedi Survivor at a stable frame rate. Sure a better PC would achieve better results, but that's not a hardware issue.

[–] MonkderVierte@lemmy.ml 2 points 2 days ago* (last edited 2 days ago)

Games are so poorly optimized these days

Yeah. Valheim runs at 2 FPS in the menu on my iGPU, which runs even the badly optimized Ark Survival on medium settings.

[–] sunzu2@thebrainbin.org 5 points 2 days ago

Maybe gamers should start withholding money... that's the most effective way to regulate these clowns. Deny them profit unless their product/service DESERVES to be rewarded.

Remember, every time you give a shitty company money, you are feeding your enemy. They turn around and use that money to enslave you as worker AND customer.

The last 15 years have clearly painted a picture of who and what we are dealing with... don't collaborate with the corpo oppressor. Something about a six-foot pole...

[–] jlow@beehaw.org 14 points 2 days ago (1 children)

You better look for a new job and let someone take over who can, then.

[–] Artyom@lemm.ee 1 points 1 day ago

Lazy CEOs just don't want to work anymore!

[–] warm@kbin.earth 15 points 2 days ago

Fuck off Nvidia.

[–] reddig33@lemmy.world 10 points 2 days ago (1 children)

Gotta sell more video cards!

[–] MonkderVierte@lemmy.ml 1 points 2 days ago
[–] szczuroarturo@programming.dev 2 points 2 days ago

Yes, they can't. DLSS is something they developed, and every single one of their GPUs has CUDA cores (not only for AI, they're just generally useful). People expect them to work with DLSS. It's kinda stating the obvious.

[–] edgemaster72@lemmy.world 9 points 2 days ago

Can't be bothered to*

[–] Ghostalmedia@lemmy.world 5 points 3 days ago (3 children)

Can someone ELI5 the pros and cons of upscaling? Why is this so controversial with some gamers?

[–] warm@kbin.earth 16 points 2 days ago (1 children)

Pros: more fps on low-end hardware. Cons: worse image quality, ghosting, blur, artifacting, and lower overall performance because devs come to rely on upscaling.

Its existence is a crutch. Games should be made properly and not rely on ML upscaling for meaningful performance.
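
For a sense of where the "more fps" comes from, here's the rough pixel math (numbers illustrative; a 4K "Quality" upscaling mode typically renders internally around 1440p):

```python
native_4k = 3840 * 2160     # pixels the display actually shows
internal = 2560 * 1440      # pixels the GPU shades per frame before upscaling
print(internal / native_4k) # ~0.44 -> a bit under half the shading work
```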

[–] Ghostalmedia@lemmy.world 5 points 2 days ago (1 children)

Thanks! So, from what I grok, the claim is basically that the games could probably run fine if they were written and optimized properly, but since they’re probably not, people have to buy a GPU that applies a bandaid solution. Right?

[–] warm@kbin.earth 8 points 2 days ago

Yep. The more people buy GPUs capable of machine learning upscaling (the bandaid), the more likely developers are to use it instead of spending time improving performance.

I see it the most in Unreal Engine games. Unreal Engine lets devs make a "realistic"-style game fast, but performance is often left in the dirt. UE also has some of the worst anti-aliasing out of the box, so DLSS, for example, is a good catch-all to try to improve framerates and provide some AA, but instead you just get a lot of blur and poor graphical fidelity. The issues probably don't exist at higher resolutions like 4K (which is maybe what they develop at), but the majority of people still use 1080p.

Oops sorry for the rant! I just got pissed off with it again recently in Satisfactory!

[–] swab148@lemm.ee 8 points 2 days ago

Basically, they use AI as a crutch instead of making the games better. This is bad because it will require more power and more expensive hardware to run the AI.

[–] catloaf@lemm.ee 4 points 2 days ago

It's just another graphics-enhancing tool, but instead of fixed-algorithm antialiasing, it's machine-learning-powered "fill in the blank".

I don't think this application is really all that controversial. Some people are unhappy that AAA games will expect or require compatible hardware in the future for no good reason, when the game itself would run just fine without it.