this post was submitted on 21 Apr 2025
231 points (98.7% liked)

PC Gaming

[–] Ulrich@feddit.org 88 points 4 days ago* (last edited 4 days ago) (8 children)

tl;dw: some of the testing shows 300-500% improvements on the 16GB model. Some games are completely unplayable on 8GB while delivering an excellent experience on 16GB.

It really does seem like Nvidia is intentionally trying to confuse their own customers for some reason.

[–] empireOfLove2@lemmy.dbzer0.com 35 points 4 days ago* (last edited 4 days ago) (2 children)

It really does seem like Nvidia is intentionally trying to confuse their own customers for some reason.

It's more so for OEM system integrators, who can buy up thousands of these 8GB 5060 Tis and sell them in systems as just "5060 Ti", and the average Joe who buys prebuilts won't know to go looking at the bottom half of the spec sheet to see whether it's an 8GB or a 16GB.
As well as, yes, directly scamming consumers, because Jensen needs more leather jackets off the AI craze and couldn't give a rat's ass about gamers.

[–] CalipherJones@lemmy.world 2 points 3 days ago

Ngl gamers don't deserve respect.

[–] MBech@feddit.dk 6 points 4 days ago* (last edited 4 days ago) (1 children)

I agree that they don't give half a shit about their actual product, but their biggest competitor has never been more competitive, and Nvidia knows it. Pissing off your customer base when you don't have a monopoly is fucking stupid, and Nvidia and the prebuilt manufacturers know this. It's business 101.

There's gotta be something else. I know businesses aren't known for making long-term plans, because all that will ever matter to them is short-term profits. But this is just way too stupid to be explained by that alone.

[–] empireOfLove2@lemmy.dbzer0.com 5 points 4 days ago (1 children)

There’s gotta be something else.

That something else is that they don't need the gamer market. Providing consumer cards is an inconvenience for them at this point: they make about $2 billion a quarter from gaming cards but $18 billion from datacenter compute, with insane ~76% gross margins on those products (which continue to fund R&D).
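Taking the quarterly figures above at face value (this is just illustrative arithmetic on the numbers quoted in the comment, not independently sourced data):

```python
# Rough revenue split implied by the quarterly figures above (illustrative).
gaming_q = 2e9       # ~$2B/quarter from gaming cards
datacenter_q = 18e9  # ~$18B/quarter from datacenter compute

share = gaming_q / (gaming_q + datacenter_q)
print(f"gaming share of the two segments: {share:.0%}")  # 10%
```

At roughly a tenth of the revenue of datacenter compute, gaming simply isn't where the incentives point.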

[–] sunzu2@thebrainbin.org 1 points 3 days ago

And just like that, gamers are not a priority.

AMD and Intel need to work harder tho.

[–] inclementimmigrant@lemmy.world 18 points 4 days ago* (last edited 4 days ago) (1 children)

To me it sounds like they're preying on gamers who aren't tech savvy or are desperate. Just a continuation of being anti-consumer and anti-gamer.

[–] recursive_recursion@lemmy.ca 6 points 4 days ago* (last edited 4 days ago) (7 children)

It really does seem like Nvidia is intentionally trying to confuse their own customers for some reason.

for money/extreme greed

[–] WormFood@lemmy.world 66 points 4 days ago (2 children)

it is 2019, the 2060 Super has 8GB of VRAM. it is 2020, the 3060 Ti has 8GB of VRAM. it is 2023, the 4060 Ti has 8GB of VRAM. it is 2025, the 5060 Ti has 8GB of VRAM.

[–] nik282000@lemmy.ca 16 points 4 days ago (8 children)

My 1080 from 2017 has 8GB of VRAM. Still works fine.

[–] CanadianCarl@sh.itjust.works 2 points 3 days ago* (last edited 3 days ago) (1 children)

My 3060 has 12GB of VRAM...

[–] AnUnusualRelic@lemmy.world 2 points 3 days ago

You probably have to return the 4GB extra then.

[–] nutsack@lemmy.dbzer0.com 21 points 3 days ago (11 children)

Why the fuck does a 50-series Ti card have only 8GB of VRAM?

[–] AnUnusualRelic@lemmy.world 4 points 3 days ago

Maybe it has an SD slot or something?

[–] filister@lemmy.world 47 points 4 days ago (2 children)

The fact that NVIDIA is not allowing AIBs to send the 8GB card to reviewers is quite telling. They are simply banking on uninformed purchasers and system integrators to sell this variant. That's another low for NVIDIA, but it hardly surprises anyone.

Planned obsolescence.

[–] sleep_deprived@lemmy.dbzer0.com 38 points 4 days ago (1 children)

This is worse than planned obsolescence. This is basically manufactured e-waste.

[–] CalipherJones@lemmy.world 2 points 3 days ago

They should use the 5060s for disposable vapes.

[–] HeyJoe@lemm.ee 4 points 4 days ago

I agree, but it's still crazy that there are people out there making $500-plus purchases without the smallest bit of research. I really hope this card fails, if only because it deserves to.

[–] dumblederp@aussie.zone 13 points 4 days ago (3 children)

Can we get a GPU that just has some DDR5 slots in it?

[–] Avg@lemm.ee 4 points 3 days ago

You would need so many channels for that to be viable.
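To put rough numbers on that (a back-of-envelope sketch; the 128-bit bus width, ~28 Gbps GDDR7 data rate, and DDR5-6400 speed are assumed approximate figures, not from the thread):

```python
import math

def gddr_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

def ddr5_channels_needed(target_gbs: float, mt_per_s: int = 6400) -> int:
    """64-bit DDR5 channels needed to match a target bandwidth."""
    per_channel_gbs = mt_per_s * 8 / 1000  # 8 bytes per transfer per channel
    return math.ceil(target_gbs / per_channel_gbs)

gpu_bw = gddr_bandwidth_gbs(128, 28)   # e.g. 128-bit GDDR7 at ~28 Gbps
print(gpu_bw)                          # 448.0 GB/s
print(ddr5_channels_needed(gpu_bw))    # 9 channels of DDR5-6400 (~51.2 GB/s each)
```

Roughly nine DIMM channels just to break even on bandwidth, and socketed DIMMs would also add trace length and signal-integrity problems on top of the channel count, which is part of why GPUs solder their memory close to the die.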

[–] ikidd@lemmy.world 2 points 3 days ago
[–] SharkAttak@kbin.melroy.org 1 points 3 days ago (1 children)

Why not many little simple sockets into which you pop as many memory chips as needed?

[–] dumblederp@aussie.zone 1 points 3 days ago

DDR5 was just a placeholder in the above statement; whatever works. TPTB are welcome to release a line of GPU RAM with appropriate connections.

[–] kugmo@sh.itjust.works 6 points 4 days ago* (last edited 4 days ago) (11 children)

On the flip side, every game worth playing uses 2GB of VRAM or less at 1080p.

[–] FeelzGoodMan420@eviltoast.org 6 points 3 days ago

That's an absolute lie and you know it.
