this post was submitted on 31 Jan 2025
197 points (95.4% liked)

PC Gaming

9121 readers

For PC gaming news and discussion. PCGamingWiki

founded 2 years ago
top 50 comments
[–] kingshrubb@lemmy.ml 10 points 5 hours ago
[–] umbrella@lemmy.ml 3 points 5 hours ago (1 children)

amd exists, people.

please put your money where your mouth is.

[–] stevedice@sh.itjust.works 2 points 5 hours ago

Not at the higher end, it doesn't.

[–] m750@lemmy.world 2 points 6 hours ago

They tried this with the 4080 12 GB... This time they dug in and went whole hog with all the 5080s.

Shame AMD is not going to be competitive.

[–] the_q@lemm.ee 2 points 6 hours ago

Or, you know, buy an AMD card and quit giving your money to the objectively worse company.

[–] ZeDoTelhado@lemmy.world 16 points 23 hours ago (1 children)

For the people looking to upgrade: always check the used market in your area first. It's quite obvious that, for now, the best thing to do is try to get a 40 series from the drones that must have the 50 series.

[–] Squizzy@lemmy.world 3 points 15 hours ago (3 children)

I have never had a GPU. I want 4k120, so which one should I get?

[–] ZeDoTelhado@lemmy.world 3 points 13 hours ago* (last edited 13 hours ago) (1 children)

Not sure if it's a thing in your area, but a second-hand 4090 at a decent price should do it.

[–] Trilobite@lemm.ee 3 points 11 hours ago

Good luck with that; they're selling for $1800 used on eBay.

[–] lengau@midwest.social 1 points 14 hours ago (1 children)

What do you want to do at 4k, 120 Hz?

[–] Squizzy@lemmy.world 2 points 8 hours ago

Watch Yellowstone.

Nah, it's just that I have the G series OLED capable of it, so why not get a system that can keep up. GoW and the like.

[–] sunzu2@thebrainbin.org 1 points 14 hours ago (1 children)
[–] Squizzy@lemmy.world 1 points 8 hours ago (1 children)

I thought it was a standard enough option to expect?

[–] sunzu2@thebrainbin.org 1 points 7 hours ago (1 children)

I am assuming you mean 120 fps for modern vidya games maxed out?

[–] Squizzy@lemmy.world -1 points 6 hours ago

Yes. Given a console can do it handily enough, I thought PCs would have a good handle on it. I understand it's game dependent, but I've been hearing 4k120 for over a decade now from PC gamers.
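As a side note on why 4k120 is a heavy target: even before the GPU renders a single frame, just moving 4K at 120 Hz over the cable takes serious link bandwidth. A rough back-of-the-envelope sketch (uncompressed 8-bit RGB, ignoring blanking intervals and DSC, so the real figures run a bit higher):

```python
# Rough uncompressed video bandwidth for a given display mode.
# Illustrative only: ignores blanking intervals and compression (DSC).
def bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"4K @ 120 Hz:    {bandwidth_gbps(3840, 2160, 120):.1f} Gbit/s")
print(f"1440p @ 144 Hz: {bandwidth_gbps(2560, 1440, 144):.1f} Gbit/s")
```

4K120 works out to roughly 24 Gbit/s of raw pixel data, which already exceeds HDMI 2.0's 18 Gbit/s ceiling; that's part of why the mode only became common with HDMI 2.1 hardware.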

[–] cyberpunk007@lemmy.ca 19 points 1 day ago (4 children)

lol this reminds me of whatever that card was back in the 2000s or so, where you could literally draw a trace with a pencil to upgrade the lower version to the higher one.

[–] root@aussie.zone 2 points 5 hours ago

I remember using the pencil trick on my old AMD Duron processor to unlock it. 🤣

[–] inclementimmigrant@lemmy.world 9 points 17 hours ago (1 children)

Yeah, those were the days when cost control was simply using the same PCB but with the traces left out. There were also quite a few cards that used the exact same PCB, traces intact, where you could simply flash the next tier card's BIOS and get significant performance bumps.

Did a few of those mods myself back in the day, those were fun times.

[–] cyberpunk007@lemmy.ca 4 points 15 hours ago (5 children)

Ok now how do I turn my 2070s into a 5090? 😅

[–] CheeseNoodle@lemmy.world 3 points 9 hours ago* (last edited 9 hours ago)

Get 500 dollars then use AI to generate the other 3/4 of the money and buy a 5090.

[–] inclementimmigrant@lemmy.world 3 points 12 hours ago* (last edited 12 hours ago) (1 children)

Well, if you ask Nvidia, it's now just the driver making frames for you.

[–] cyberpunk007@lemmy.ca 3 points 12 hours ago

Ya I'm sad to see and exit. Maybe in a year or two I'll get that sapphire 7900 xtx or whatever it is.

[–] Nomecks@lemmy.ca 2 points 14 hours ago

Liquid nitrogen cooling

[–] sunzu2@thebrainbin.org 1 points 14 hours ago (1 children)

This would be more like 2070 to 2080

[–] cyberpunk007@lemmy.ca 2 points 14 hours ago

I know it was a joke

[–] vonbaronhans@midwest.social 1 points 14 hours ago (1 children)

My 2070 is still treating me pretty well!

[–] cyberpunk007@lemmy.ca 1 points 14 hours ago* (last edited 12 hours ago) (2 children)

Mine too, except mine drives 3440x1440 at up to 144 Hz, and I hardly hit that resolution and refresh rate in more demanding games.

[–] vonbaronhans@midwest.social 1 points 10 hours ago

Yeah, same, but I only have a 1440p monitor, and I can barely tell the difference above 90 Hz anyhow.
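There's a simple arithmetic reason high refresh rates have diminishing returns: the per-frame time savings shrink as the rate climbs. A quick sketch (illustrative numbers only):

```python
# Frame time (ms) at a given refresh rate. The absolute gap between
# adjacent rates shrinks as the rate climbs, which is one reason
# gains above ~90 Hz are harder to perceive.
def frame_time_ms(hz):
    return 1000.0 / hz

for hz in (60, 90, 120, 144):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.1f} ms per frame")
```

Going from 60 to 90 Hz saves about 5.6 ms per frame, while going all the way from 90 to 144 Hz saves only about 4.2 ms more.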

[–] sunzu2@thebrainbin.org 2 points 14 hours ago (1 children)
[–] cyberpunk007@lemmy.ca 2 points 12 hours ago

Yeah my mistake

[–] circuitfarmer@lemmy.sdf.org 67 points 1 day ago (26 children)

Vote with your wallets. DLSS and Ray Tracing aren't worth it to support this garbage.

[–] Majorllama@lemmy.world 23 points 1 day ago (4 children)

Just like I rode my 1080ti for a long time, it looks like I'll be running my 3080 for a while lol.

I hope in a few years, when I'm actually ready to upgrade, the GPU market isn't so dire... All signs are pointing to no, unfortunately.

[–] stevedice@sh.itjust.works 1 points 5 hours ago (1 children)

Just like I rode my 1080ti for a long time

You only skipped a generation. What are you talking about?

[–] Majorllama@lemmy.world 1 points 3 hours ago

I had the 1080ti well after the 3080 release. I got a great deal and had recently switched to 1440p, so I pulled the trigger on a 3080 not long before the 4000 series cards dropped.

[–] Bronzie@sh.itjust.works 2 points 20 hours ago (2 children)

Same here.

Only upgraded when my 1080 died, so I snagged a 3080 for an OK price. Not buying a new card until this one dies. Nvidia can get bent.

Maybe team red next time….

[–] Briict@lemmy.dbzer0.com 2 points 14 hours ago

I'm extremely petty and swore off Nvidia for turning off ShadowPlay on my 660 unless I made an account; I had to downgrade my driver to get it back without an account. I only have an Nvidia card now because I got a 3080ti for free after my RX 580 died.

[–] vonbaronhans@midwest.social 1 points 14 hours ago

I'm still rocking a 2070 and doing great. Turns out the games that I like rarely depend on graphical fidelity, but rather on good visual design and game design.

But yeah if graphical fidelity is your bag, or if you need every possible frame for competitive reasons, then your options are much more limited and far more expensive. Sucks.

[–] OminousOrange@lemmy.ca 11 points 1 day ago (6 children)

I'm still riding my 1080ti...

[–] Cypher@lemmy.world 1 points 8 hours ago

The 1080ti has had the longest useful lifespan of any GPU ever.

I'm still running mine, but I'm likely to buy a 5090 very soon as I'm making the leap to 4K gaming and VR from 2K.

[–] leftzero@lemmynsfw.com 4 points 1 day ago

980ti here, playing Cyberpunk 2077 at sufficiently high settings without issues (if you call ~30fps 1440p with no path tracing “without issues”, that is).

[–] ogeist@lemmy.world 39 points 1 day ago (1 children)

I've got the feeling that GPU development is plateauing: new flagships are consuming an immense amount of power and the sizes are humongous. I do give DLSS, local AI and similar technologies the benefit of the doubt, but it's just not there yet. GPUs should be more efficient and improve in other ways.

[–] PM_Your_Nudes_Please@lemmy.world 32 points 1 day ago (1 children)

I’ve said for a while that AMD will eventually eclipse all of the competition, simply because their design methodology is so different compared to the others. Intel has historically relied on simply cramming more into the same space. But they’re reaching theoretical limits on how small their designs can be; they’re being limited by things like atom size and the speed of light across the distance of the chip. But AMD has historically used the same dies for as long as possible, and relied on improving their efficiency to get gains instead. They were historically a generation (or even two) behind Intel in terms of pure hardware power, but still managed to compete because they used the chips more efficiently. As AMD also begins to approach those theoretical limits, I think they’ll do a much better job of actually eking out more computing power.

And the same goes for GPUs. With Nvidia recently resorting to the “just make it bigger and give it more power” design philosophy, it likely means they’re also reaching theoretical limitations.
