This post was submitted on 29 Dec 2023
306 points (97.5% liked)

Technology


2023 was the year that GPUs stood still: A new GPU generation did very little to change the speed you get for your money.

[–] Stovetop@lemmy.world 155 points 10 months ago

But boy did it change the price you have to pay for it.

[–] lemann@lemmy.dbzer0.com 125 points 10 months ago (6 children)

Hands up if you/someone you know purchased a Steam Deck or other computer handheld, instead of upgrading their GPU 🙋‍♂️

To be honest, I stopped following PC hardware altogether because things were so stagnant outside of Intel's Alder Lake and the new x86 P/E cores. GPUs that would give me a noticeable performance uplift from my 1060 aren't really at appealing prices outside the US either, IMO.

[–] givesomefucks@lemmy.world 62 points 10 months ago (2 children)

It's diminishing returns.

We need a giant leap forward to show a noticeable effect now.

Like, if a car's top speed was 10mph, a 5mph increase is fucking huge.

But getting a supercar to top out at 255 instead of 250 just isn't a huge deal. And you wouldn't notice unless you were testing it.

So even if they keep increasing power at a steady rate, the end user is going to notice it less and less every time.
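
To put the analogy in numbers, here's a quick illustrative sketch (using only the speeds from the comment above):

```python
# Same absolute gain, shrinking relative improvement as the baseline grows.
def relative_gain(old: float, new: float) -> float:
    """Return the percentage improvement from old to new."""
    return (new - old) / old * 100

print(relative_gain(10, 15))    # 50.0 -> a 5mph bump on a 10mph car feels huge
print(relative_gain(250, 255))  # 2.0  -> the same 5mph on a supercar is barely noticeable
```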

[–] Xiaz@lemmy.world 53 points 10 months ago (1 children)

We had hardware making massive leaps for years. Problem is, devs got used to hardware having enough grunt to overcome a lack of optimization. Now we've got shit coming out that barely holds 60+ on a 4080 and requires FSR or DLSS as a band-aid to get back to playable framerates.

If you've got a 30-series card or AMD's 7000 series, you don't need a more performant card; you need devs to put in time for polish and optimization before launch, not 6 months down the line IF the game is a commercial success.

[–] anlumo@lemmy.world 17 points 10 months ago (6 children)

Hell, Cyberpunk 2077 dropped 10-20fps with the last patch on my 4090, and the devs don’t care enough to fix it.

Cities Skylines 2 aims for only 30fps, and it can’t even hit that on my pretty good gaming PC.

[–] ugjka@lemmy.world 18 points 10 months ago (1 children)

The money is in AI chips for datacenters; I think regular consumers will more and more only be getting the dinner leftovers.

[–] uninvitedguest@lemmy.ca 17 points 10 months ago (1 children)

From 2020 onward I'd been planning to build a new gaming PC. I bought an ITX case and followed hardware releases closely... and then got disillusioned with it all.

Picked up a Steam Deck in August of 2022 and couldn't be happier with it. The ITX case is collecting dust.

[–] theangryseal@lemmy.world 13 points 10 months ago (1 children)

I game exclusively on my Steam deck these days.

I absolutely love it. I dock it and use the desktop as my standard pc too. It does everything I need it to do.

[–] myfavouritename@lemmy.world 5 points 10 months ago

That's exactly what I did!

[–] TheDarkKnight@lemmy.world 4 points 10 months ago

Yep. In fact I bought a second deck (OLED) instead of upgrading GPU’s. Prices are nuts, I’ll wait.

[–] crsu@lemmy.world 77 points 10 months ago (3 children)

You guys think I should upgrade my Voodoo 3 card? No one is joining my Quake server anymore anyway.

[–] xan1242@lemmy.ml 18 points 10 months ago

Come play Unreal with us then hehehe

[–] Rai@lemmy.dbzer0.com 8 points 10 months ago

We’ve all moved over to The Specialists!

Nah you need that Fireball upgrade man

[–] just_change_it@lemmy.world 60 points 10 months ago (2 children)

Given technological progress and efficiency improvements, I would argue that 2023 is the year the GPU ran backwards. We've been in a rut since 2020... and arguably since the 2018 crypto explosion.

[–] Vash63@lemmy.world 12 points 10 months ago

Nah, 2022 was when it was running backwards far more. 2023 was a slight recovery, but still worse than 2021.

[–] AnneBonny@lemmy.dbzer0.com 7 points 10 months ago (6 children)

I feel the same way. I don't have the data to prove it.

[–] Cyberjin@lemmy.world 42 points 10 months ago (1 children)

I wanted to upgrade my 1060 for the longest time to something like the 3080, but due to demand and price hikes, I waited... The 40 series got released and the prices stayed high.

So I just gave up; I got a Steam Deck and a PS5 instead.

[–] nova_ad_vitum@lemmy.ca 12 points 10 months ago (1 children)

A lot of people did this. The GPU market for gaming might have actually shrunk. You would think Nvidia would panic, but thanks to AI chip demand their stock is at an all-time high, and no company changes course or reevaluates what it's doing when shareholders are lining up to suck their dicks, so... no end in sight. Meanwhile, AMD doesn't seem to want to even try to make a play for market share.

[–] Cyberjin@lemmy.world 7 points 10 months ago (5 children)

Technically, AMD does have more market share when you count all the devices that have AMD chips in them, like the PlayStation, Xbox, Steam Deck, and other handhelds.

But yeah, Nvidia doesn't care about gaming anymore. If I had to pick a GPU today, I would pick AMD, because the 6-8GB of VRAM on Nvidia's cards isn't enough and AMD is better on Linux.

[–] trag468@lemmy.world 29 points 10 months ago (4 children)

Still rocking a 1080. I don't see a big enough reason to upgrade yet. I mostly play PC games on my Steam Deck anyway. I thought Starfield was going to give me a reason. Cyberpunk before that. I'm finally playing Cyberpunk, but the advanced haptics on PS5 sold me on going that route over my PC.

[–] Kit@lemmy.blahaj.zone 7 points 10 months ago

I just "upgraded" from a GTX 1080 to an RTX 4060 Ti 16Gb, but only because I was building a PC for my boyfriend and gave him the 1080. I'm really not seeing a noticeable difference in frame rate on 1440p.

[–] Yokozuna@lemmy.world 6 points 10 months ago

1080 gang rise up.

But seriously, my 1080 does fine for most things, and I have a 2K 144Hz monitor. It's JUST starting to show its age, as I can't blast everything on high/ultra anymore and have to turn down the biggest fps-guzzling settings.

[–] ATDA@lemmy.world 6 points 10 months ago

Yeah, I keep waiting for a good deal to retire my 1080 Ti.

Guess I could go for a 3060 or something, but the 40 series will probably leave my old CPU behind.

[–] ManosTheHandsOfFate@lemmy.world 27 points 10 months ago (1 children)

I just upgraded from a 1070 to a 3060 Ti. The numbers definitely did not justify a 4060 Ti.

[–] kaitco@lemmy.world 9 points 10 months ago (1 children)

How was that change? I'm thinking of doing the same, but it requires a power supply upgrade too, so I'm on the fence.

[–] HeyJoe@lemmy.world 26 points 10 months ago (1 children)

As someone who upgraded from a 2016 GPU to a 2023 one, I was completely fine with this. Prices finally came down and I got the best card 2023 offered me, which may not have been impressive for this generation but was incredible compared to what I was coming from.

[–] DacoTaco@lemmy.world 12 points 10 months ago (1 children)

And how much did you pay for the 2016 card, what range was it in, and what is the new card's cost and range?

Overall, GPUs have been a major ripoff, despite these upgrades giving good performance boosts.

[–] HeyJoe@lemmy.world 13 points 10 months ago (4 children)

I believe about $300 for an AMD RX 480 (great card and still going strong). This time I had a bit more money and wanted something more powerful. I went with the AMD 7800 XT Nitro ($550), which I got on release day. Sure, it's not top of the line, but it has played pretty much everything I throw at it with all settings maxed out while still maintaining 60fps or above. I have an ultrawide monitor with a max resolution of 5120x1440, which is what most games play at, and everything still runs fine. It's almost crazy to me that this card would be considered mid-range.

[–] verdantbanana@lemmy.world 21 points 10 months ago (3 children)

Intel GPUs definitely won out for what you get for the money.

[–] crsu@lemmy.world 20 points 10 months ago

That's not a sentence I'm used to seeing

[–] CalcProgrammer1@lemmy.ml 10 points 10 months ago (2 children)

I've been very happy with my Arc A770, it works great on Linux and performs well for what I paid for it.

[–] Kit@lemmy.blahaj.zone 8 points 10 months ago

I'm so glad that Intel has stepped into the GPU space, even if their cards are weaker. More competition will hopefully light a fire under Nvidia to get their shit together.

[–] aluminium@lemmy.world 19 points 10 months ago* (last edited 10 months ago)

I finally upgraded my GTX 970 to a used RTX 3080 for €300. For the same €300, the difference, at least for me, was insane.

[–] DrPop@lemmy.ml 18 points 10 months ago (1 children)

I just don't see the point in upgrading every new release anyway, or even buying the most expensive one. I've had my Gigabyte RX 570 for several years and I can play Baldur's Gate 3 at full settings with no issues. Maybe I haven't tasted 120fps, but I'm just happy I can play modern games. When it comes time to get a new graphics card, which may be soon since I am planning to build my wife's PC, maybe then I'll see what's going on with the higher-end ones. Maybe I'm just a broke ass though.

[–] cyberpunk007@lemmy.world 5 points 10 months ago* (last edited 10 months ago)

Yeah, the problem I landed in was not anticipating how hard it would be to push my new monitor: ultrawide 2.5K resolution at 144Hz. I can't run Cyberpunk at full res above 60fps, and that's with DLSS enabled and not all settings at max.

2070 Super
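
For a rough sense of why that panel is hard to drive, here's an illustrative back-of-the-envelope calculation; the exact 3440x1440 resolution is my assumption, since the comment only says "2.5K ultrawide":

```python
# Rough pixel-throughput comparison (illustrative only).
# Assumes the "2.5K ultrawide" panel is 3440x1440; the comment doesn't say exactly.
ultrawide = 3440 * 1440 * 144   # pixels per second at 144Hz
baseline  = 1920 * 1080 * 60    # a common 1080p60 target for comparison
print(f"~{ultrawide / baseline:.1f}x the pixels per second of 1080p60")  # ~5.7x
```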

[–] Paddzr@lemmy.world 16 points 10 months ago (1 children)

I had to buy a 3070 Ti at a scalped price. Ended up paying £700 for it. I hate myself for it, but the prices didn't shift for months after, and my GTX 1080 kicked the bucket. No way in hell am I buying anything this gen. My wife's 1080 is still going for now; maybe we'll get a 5080 if it's not a rip-off.

[–] DacoTaco@lemmy.world 47 points 10 months ago (2 children)

It's Nvidia, it's always a ripoff :p

[–] filister@lemmy.world 14 points 10 months ago (1 children)

Especially now when gaming GPUs are an afterthought for them.

[–] DacoTaco@lemmy.world 4 points 10 months ago (5 children)

That's only Nvidia though. AMD still seems to be trying to compete with Nvidia one way or another.

[–] Anti_Face_Weapon@lemmy.world 13 points 10 months ago (1 children)

Nvidia fucking sucks. But I do a lot of modeling in Blender, and holy damn do I want that RTX.

[–] Buffalox@lemmy.world 9 points 10 months ago* (last edited 10 months ago)

So how about the 2½ years from 2016 to 2018 between the Nvidia GTX 1080 Ti and the RTX 2080?
I think the headline should say A Year, not THE year.

[–] AlpacaChariot@lemmy.world 5 points 10 months ago (8 children)

What's everyone's recommendation for a cheap AMD GPU to use with Linux? I was recently looking at a Radeon RX 580; I know there are much better cards out there, but the prices are about double (£350-400 instead of £180). I'd mostly be using it to play games like the remastered Rome: Total War.

[–] konalt@lemmy.world 5 points 10 months ago (1 children)

I upgraded from an RX 480 to an RTX 3060 a few days ago. Crazy difference, especially in compute

[–] autotldr@lemmings.world 4 points 10 months ago

This is the best summary I could come up with:


The performance gains were small, and a drop from 12GB to 8GB of RAM isn't the direction we prefer to see things move, but it was still a slightly faster and more efficient card at around the same price.

In all, 2023 wasn't the worst time to buy a $300 GPU; that dubious honor belongs to the depths of 2021, when you'd be lucky to snag a GTX 1650 for that price.

But these numbers were only possible in games that supported these GPUs' newest software gimmick, DLSS Frame Generation (FG).

The technology is impressive when it works, and it's been successful enough to spawn hardware-agnostic imitators like the AMD-backed FSR 3 and an alternate implementation from Intel that's still in early stages.

And DLSS FG also adds a bit of latency, though this can be offset with latency-reducing technologies like Nvidia Reflex.

But to put it front-and-center in comparisons with previous-generation graphics cards is, at best, painting an overly rosy picture of what upgraders can actually expect.


The original article contains 787 words, the summary contains 168 words. Saved 79%. I'm a bot and I'm open source!
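
As a rough, hedged illustration of the frame-generation latency point in the summary above (the 40fps baseline and the one-held-frame assumption are mine, not from the article):

```python
# Illustrative model of why frame generation inflates FPS numbers while
# responsiveness still tracks the underlying render rate. Numbers are assumed.
base_fps = 40                           # frames actually rendered per second
base_frame_time_ms = 1000 / base_fps    # 25 ms per rendered frame

displayed_fps = base_fps * 2            # roughly one interpolated frame per rendered frame
added_latency_ms = base_frame_time_ms   # about one rendered frame held back for interpolation

print(f"displayed: ~{displayed_fps} fps, but input responsiveness tracks ~{base_fps} fps")
print(f"extra latency: on the order of {added_latency_ms:.0f} ms, before Reflex-style mitigation")
```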
