[-] Hirom@beehaw.org 26 points 1 week ago* (last edited 1 week ago)

They were years ahead of the curve with AI hardware, and they're well placed to benefit from the AI craze.

Regardless of whether a company's AI product is useful or profitable, they need a lot of hardware to make it run.

[-] DdCno1@beehaw.org 11 points 1 week ago

To illustrate your point, my old GPU, a GTX 1080 from 2016 (basically ancient history - Obama was still president back then), remains very useful for ML applications today - and it isn't even their oldest card that is still relevant for AI. This card was never meant for this, but thanks to Nvidia investing in CUDA, and CUDA being useful for all sorts of non-gaming applications, the API became a natural first choice when ML tools that run on consumer hardware started to get developed.

My current GPU, an RTX 2080, is just two years younger and yet it's so powerful (for everything I throw at it, including ML) that I won't have to upgrade it for years to come.

Whatever makes RTX work is also what accelerates a lot of AI tasks. I'd argue the 1080 would be bordering on irrelevant if it weren't for its 8 GB of VRAM saving it. The 2060 should be much faster for ML despite being about on par for gaming.

this post was submitted on 19 Jun 2024
62 points (100.0% liked)

Technology
