Man, they are going to ride the pandemic as a cause for high prices until it's a skeleton just skidding along the ground. It's been four years since the pandemic supply issues; pretty sure those are over by now. Unless they mean the price gouging that happened then and never came back down.
So now we can finally go back to good old code optimization, right? Right? (Padme.jpg)
We'll ask AI to make it performant, and when it breaks, we'll just go back to the old version. No way in hell are we paying someone to fix it.
Damn. I hate how it hurts to know that's what will happen
Also, they're not going to play Silksong any better than a ten-year-old console.
Is it just me, or is this title weird?
It's not just you. The title gets causation totally wrong. If people made bad assumptions about how technology would change in the future, it's their assumptions that are the problem, not reality.
Game graphics and design peaked in 2008. The N64 was more optimized than anything that came after. I'm so over current gen, and last gen, and the gen before that too. Let it all burn. :)
Edit: Furthermore,
https://lemmy.blahaj.zone/pictrs/image/222c26df-19d9-4fce-9ce3-2f3dcffefc60.webp
Was about to say this too. Can't tell a difference between most games made in 2013 vs 2023.
Battlefield 1 still beats 99% of games releasing now
Ironic that the image is of a Switch, like Nintendo has been on the cutting edge at all in the last 20+ years.
It’s not that they’re not improving like they used to, it’s that the die can’t shrink any more.
Price cuts and “slim” models used to be possible due to die shrinks. A console might have released on 100nm, and then a process improvement comes out that means it can be made on 50nm. Half the linear feature size means a quarter of the die area, so roughly 4x as many chips per wafer, plus a big drop in power usage and heat generation. This allowed smaller and cheaper revisions.
Now that the current ones are already on like 4nm, there’s just nowhere to shrink to.
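To put rough numbers on that, here's a back-of-the-envelope Python sketch. The wafer size, die areas, and the classic edge-loss approximation are all illustrative assumptions, not any real console's figures:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Classic rough estimate of usable dies on a round wafer.

    Subtracts a simple edge-loss term; real output also depends on
    defect density, scribe lines, yield, and so on.
    """
    gross = math.pi * (wafer_diameter_mm / 2) ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

# Hypothetical SoC: halving the linear feature size quarters the die
# area, so the same 300 mm wafer yields roughly 4x as many chips.
launch = dies_per_wafer(300, 360)  # e.g. launch revision at 360 mm^2
slim = dies_per_wafer(300, 90)     # e.g. "slim" revision at 90 mm^2
print(launch, slim, slim / launch)  # ~161, ~715, ~4.4x
```

The ratio comes out a bit above 4x because edge losses hurt big dies more, which is part of why shrinks were such a reliable cost lever.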
This is absolutely right. We're getting to the point where circuit features are hundreds or even just dozens of atoms wide. The fact that we can even make circuits that small in quantity is fucking amazing. But we are rapidly approaching laws-of-physics-type limits on how much smaller we can go.
Plus let's not forget an awful lot of the super high-end production is being gobbled up by AI training farms and GPU clusters. Companies that will buy 10,000 chips at a time are absolutely the preferred customers.
Did you read the article? That's exactly what it said.
Not to mention that even when some components do shrink, the scaling isn't uniform across all the components on a chip, so designers can't just do 1:1 layout shrinks like in the past; they pretty much have to redo the physical design with a new layout and new timings (which then cascade out into many other required changes).
Porting to a new process node (even at the same foundry company) isn't quite as much work as a new project, but it's close.
Same thing applies to changing to a new foundry company, for all of those wondering why chip designers don't just switch some production from TSMC to Samsung or Intel since TSMC's production is sold out. It's almost as much work as just making a new chip, plus performance and efficiency would be very different depending on where the chip was made.
Consoles are just increasingly bad value for consumers compared to PCs.
Are they tho? Have you seen graphics card prices?
I can get PS5-level graphics with a $280 video card, games are often way cheaper, I can hook the PC up to my TV, and I can still play with a PS5 or Xbox controller, or mouse and keyboard.
I suspect next gen there will be a PS6, and Xbox will make a cheap cloud gaming box and just go subscription-only.
Didn't Google Stadia do the cloud thing and fail miserably?
Microsoft's cloud gaming is already profitable. Also, they got their ass kicked so badly by the PS5 that there's no profitable avenue in developing and trying to sell another console. They're better off concentrating on PC games and cloud gaming. Sony can't really compete with them in that market, just like Microsoft is unlikely to find it worthwhile to compete with Sony on consoles.
The internet isn't good enough globally for that, and it still won't be by 2030 when the PS6/nextbox is out. Maybe the gen after next. Even then, there are a lot of countries I could see still being patchy. Right now in Australia, Sony won't even let you access the PS3 streaming games because they know it won't work well enough.
You overestimate how much Microsoft would care about people with bad internet. They'll opt for a smaller number of people paying a monthly or yearly subscription on cheap hardware. No point in losing money against Sony directly.
My 4070 cost $300 and runs everything.
The whole PC cost around $1000, and I have had it since the Xbox One released.
You can get similar performance from a $400 Steam Deck, which is a computer.
On what planet does a Steam Deck give 4070 performance?
And on which does a 4070 cost $300 for that matter? They cost more than a whole PS5.
This article doesn't factor in the new demand that is gobbling up all the CPU and GPU production: AI server farms. For example, Nvidia, which once made graphics cards mainly for gamers, has been struggling to keep up with global demand for AI. The whole market is different now, and then toss tariffs on top of the rest.
I wouldn't blame the death of Moore's law; technology is still advancing, but, as usual, based on demand.
> technology is still advancing
Actually, not really: performance per watt of the high-end stuff has been stagnating since the Ampere generation. Nvidia hides it by changing which models it compares in benchmarks, or by advertising raw performance without power figures.
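To make the "raw performance without power figures" trick concrete, here's a toy Python comparison. The generation labels and numbers are placeholders made up for illustration, not real GPU specs:

```python
# Toy numbers: raw throughput climbs 50%, but so does board power,
# so efficiency is flat. Placeholders, not real GPU specs.
cards = {
    "gen N":   {"tflops": 30.0, "watts": 320},
    "gen N+1": {"tflops": 45.0, "watts": 480},
}

for name, spec in cards.items():
    perf_per_watt = spec["tflops"] / spec["watts"]
    print(f"{name}: {spec['tflops']:.0f} TFLOPS at {spec['watts']} W "
          f"-> {perf_per_watt:.3f} TFLOPS/W")

# Both lines print 0.094 TFLOPS/W: a card that's 50% "faster" on the
# marketing slide, with zero actual efficiency gain.
```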