Considering that the AI craze is what's fueling the shortage and massive increase in GPU prices, I really don't see gamers ever embracing AI.
> […] I really don’t see gamers ever embracing AI.
They've spent years training to fight it, so that tracks.
The Nvidia GPUs in data centers are separate from gaming GPUs; they're even built on separate process nodes, with different memory chips. The sole exception is the 4090/5090, which do see some use in data center form, but at low volumes. And this problem is pretty much nonexistent for AMD.
…No, it’s just straight-up price gouging and anti-competitive behavior. It’s Nvidia being Nvidia, AMD being anticompetitive too (their CEOs are like cousins twice removed), and Intel unfortunately not getting traction, even though Battlemage is excellent.
For local AI, the only things that get sucked up are 3060s, 3090s, and, for the rich/desperate, 4090s/5090s; anything else is a waste of money with too little VRAM. And this is a pretty small niche.
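The "too little VRAM" point comes down to simple arithmetic: a model's weights take roughly (parameter count × bytes per parameter), plus some overhead for the KV cache and activations. A back-of-envelope sketch (the 20% overhead factor and the 13B example size are illustrative assumptions, not measured values):

```python
# Rough VRAM estimate for running a local LLM.
# overhead=1.2 is an assumed ~20% margin for KV cache/activations.

def vram_needed_gb(params_billions: float, bytes_per_param: float,
                   overhead: float = 1.2) -> float:
    """Weights (params * quantization width) plus overhead, in GB."""
    return params_billions * bytes_per_param * overhead

# A hypothetical 13B-parameter model:
q4 = vram_needed_gb(13, 0.5)    # 4-bit quantized: ~7.8 GB, fits a 12 GB 3060
fp16 = vram_needed_gb(13, 2.0)  # 16-bit: ~31.2 GB, needs a 32 GB+ card
print(round(q4, 1), round(fp16, 1))  # -> 7.8 31.2
```

This is why the 24 GB cards (3090, 4090, 5090) are the ones that get hoovered up, while 8–12 GB cards are mostly useless for the hobby.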
Chip fab allocations are limited, and whatever capacity chips for AI datacenters take up, desktop GPUs don't get made with. And what's left of the desktop chips gets sold for workstation AI use, like the RTX 5090 and even the RX 7900 XTX, because they have more memory. Meanwhile they still sell 8GB cards to gamers when that hasn't been enough for a while. The whole situation is just absurd.
The fabs still have limited wafer supply. The chips going to datacenters could have been consumer parts instead. Besides, they (Nvidia, Apple, AMD) are all fabricated at TSMC.
Local AI benefits from platforms with unified memory that can be expanded. Watch platforms based on AMD's Ryzen AI Max chip, or whatever they call it, take off. Framework lets you configure a machine with that chip with up to 128 GB of RAM, iirc. It's the main reason I believe Apple's memory upgrades cost a ton: so that they aren't a financially viable option for local AI applications.
I'm pretty sure the fabs making the chips for datacenter cards could be making more consumer-grade cards, but those are less profitable. And since fab capacity isn't infinite, the price of datacenter cards is still going to affect consumer ones.
There’s what AI could’ve been (collaborative and awesome), and then there’s what the billionaire class is pushing today (exploitative shit that they hit everyone over the head with until they say they like it). But the folks frothing at the mouth over it are unwilling to listen to why so many people are against the AI we’ve had forced upon us today.
Yesterday, Copilot hallucinated four different functions when I asked it to refactor a ~20 line TS function, despite my handing it two helper files containing everything available for it to use. If I can’t confidently ask it to do anything, it’s immediately useless to me. It’s like being stuck with a compulsive liar that you have to get the truth out of.
Dude, I couldn't even get Copilot to generate a picture at the size I wanted, despite specifying the exact pixel height and width.
A guy I used to work with would, I'd swear it, submit shit code just so I would comment about the right way to do it. No matter how many times I told him how to do something. Sometimes it was code that didn't actually do anything. Working with Copilot is a lot like working with that guy again.
Funny enough, here’s a description of AI I wrote yesterday that I think you’ll relate to:
AI is the lazy colleague that will never get fired because their dad is the CTO. You’re forced to pair with them on a daily basis. You try to hand them menial tasks that they still manage to get completely wrong, while dear ol’ dad is gassing them up in every all-hands meeting.
It's fundamentally a make-shit-up device. It's like pulling words out of a hat. You cannot get mad at the hat for giving you poetry when you asked for nonfiction.
Get mad at the company which bolted the hat to your keyboard and promised you it was psychic.
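The "words out of a hat" metaphor can be made almost literal: at its core, a language model repeatedly samples the next token from a probability distribution, with no built-in notion of truth. A toy sketch (the vocabulary and weights here are made up for illustration; real models have tens of thousands of tokens and context-dependent probabilities):

```python
# Toy next-token sampler: draw words weighted by (fictional) probabilities.
import random

vocab = ["the", "cat", "moon", "refactor", "hat"]
probs = [0.4, 0.25, 0.15, 0.1, 0.1]  # made-up per-word weights

def next_word(rng: random.Random) -> str:
    # random.choices draws one item, weighted by the given probabilities
    return rng.choices(vocab, weights=probs, k=1)[0]

rng = random.Random(0)  # seeded for reproducibility
sentence = " ".join(next_word(rng) for _ in range(6))
print(sentence)  # plausible-looking strings, zero concept of fact
```

Nothing in that loop checks whether the output is true; fluency and factuality are entirely decoupled, which is the "hat" problem in a nutshell.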
I think that's exactly who they're mad at
Carmack is an AI sent from the future, so he's a bit biased.