this post was submitted on 31 Jan 2024
501 points (97.0% liked)
The only downside, if integrated graphics becomes a thing, is that you can’t upgrade if the next gen needs a different motherboard. It’s pretty easy to swap from a 2080 to a 3080.
Integrated graphics is already a thing. Intel iGPU has over 60% market share. This is really competing with Intel and low-end discrete GPUs. Nice to have the option!
Yeah, I know integrated graphics is a thing. And that’s been fine for running a web browser, watching videos, or whatever other low-demand graphical application was needed for office work. Now they’re holding it up against gaming, which typically places large demands on graphical processing power.
The only reason I brought it up is that it’s an if… if people start looking at CPU integrated graphics as an alternative to expensive GPUs, it makes the upgrade path more costly in exchange for the short-term savings of skipping a good GPU purchase.
Again, if one’s gaming consists of games that aren’t high demand, like Fortnite, then upgrades and performance probably aren’t a concern for the user. One could still buy a GPU and add it to the system for more power, assuming the PSU has enough wattage and the case has room.
For a slightly different perspective, I will not game on anything other than a Steam Deck. So, this is kind of perfect for me. But I am a long hauler with hardware, so I typically upgrade everything all at once anyway.
AMD has been pretty good about this, though. AM4 lasted 2016-2022. Compare that to Intel, which seems to change the socket every 1-2 years.
Actually, AMD is still releasing new AM4 CPUs now. The 5700X3D was just announced.
Oh, now that sounds like something I might like
I don't have the fastest RAM out there, so whenever I upgrade from my 1600, I want an X3D variant to help with that
There's a 5600X3D as well.
I think I'm gonna get one of the higher end models, since it'll be the last possible upgrade I can do on my motherboard.
So 5700X3D or 5800X3D, depending on what the prices look like whenever I'm gonna be in the market for them. And then I'll be set for a looong while. Well, an appropriately fast GPU would be nice to go along with it, but you know.
But it's pretty cool that they made a 5600 variant too. Might as well use the chips they evidently still have left over
Do it!
That's true, but I'm excited about the future of laptops. Some of the specs are getting really impressive while keeping power draw low. I'm currently jealous of what Apple has accomplished with literally all-day battery life in a 14-inch laptop. I'm hopeful some of the AMD chips will get us there in other hardware.
Could you not just slot in a dedicated video card if you needed one, keeping the integrated as a backup?
Yeah, maybe. I commented on that elsewhere here. If we follow a possible path for integrated graphics: eliminating a large GPU could mean the computer is sold with a smaller case and a lower-wattage power supply. Why would you need a full tower when you can have a more compact PC with a sleek NVMe/SSD and a smaller motherboard form factor? Now there’s no room to cram a 3080 in the box and no power to drive it.
Again, someone depending on CPU integrated graphics to play Fortnite probably isn’t gonna be looking for upgrade paths. This is just an observation of a limitation imposed on users should integrated graphics become more prominent. All hypothetical at this point.
Or y'know, upgrade the case at the same time.
Or even build the computer yourself. Outside of the graphics card shortage a couple of years back, it's usually been cheaper to source parts yourself than pay an OEM for a prebuilt machine.
A small side note: If you buy a Dell/Alienware machine, you're never upgrading just the case. The front panel IO is part of the motherboard, and the power supply is some proprietary crap. If you replace the case, you need to replace the motherboard, which also requires you to replace the power supply. At that point, you've replaced half the computer.
Same thing with HP. Their "Pavilion" series of towers contains a proprietary motherboard and power supply. Also, on the model a friend of mine had, the CPU was AMD, but the cooler screwed on top was designed for Intel boards, so it looked kinda frankensteined.
So in essence, it's the same story with HP.
And the shared RAM. Games like Star Trek Fleet Command will crash your computer by mismanaging that shared memory; memory leaks galore. Far less crashy with a dedicated GPU. How many other games interact poorly with integrated GPUs?
AMD keeps the same sockets for ages. I was able to upgrade a 5 year old Ryzen 5 2600G to a 5600G last month. Can't do that with Intel in general.
Or it may end up creating a push for longer motherboard lifetimes.