this post was submitted on 17 Jul 2024
681 points (99.0% liked)

PC Gaming

8205 readers
1005 users here now

For PC gaming news and discussion. PCGamingWiki

Rules:

  1. Be Respectful.
  2. No Spam or Porn.
  3. No Advertising.
  4. No Memes.
  5. No Tech Support.
  6. No questions about buying/building computers.
  7. No game suggestions, friend requests, surveys, or begging.
  8. No Let's Plays, streams, highlight reels/montages, random videos or shorts.
  9. No off-topic posts/comments.
  10. Use the original source, no clickbait titles, no duplicates. (Submissions should be from the original source if possible, unless from paywalled or non-English sources. If the title is clickbait or lacks context, you may lightly edit the title.)

founded 1 year ago
[–] eran_morad@lemmy.world 201 points 1 month ago (4 children)

I’d pay extra for no AI in any of my shit.

[–] BlueLineBae@midwest.social 108 points 1 month ago (15 children)

I would already like to buy a 4K TV that isn't smart and have yet to find one. Please don't add AI into the mix as well :(

[–] HATEFISH@midwest.social 38 points 1 month ago (2 children)

Look into commercial displays

[–] Diplomjodler3@lemmy.world 46 points 1 month ago (4 children)

The simple trick to turn a "smart" TV into a regular one is to cut off its internet access.

[–] HATEFISH@midwest.social 25 points 1 month ago (1 children)

Except it will still run like shit, and it may send telemetry through other means, like your neighbor's same-brand TV

[–] Diplomjodler3@lemmy.world 13 points 1 month ago (2 children)

I've never heard of that. Do you have a source on that? And how would it run like shit if you're using something like a Chromecast?

[–] pedz@lemmy.ca 11 points 1 month ago

Mine still takes several seconds to boot Android TV just to display the HDMI input, even when it's not connected to the internet. It has to stay plugged into power, because after a power cut it has to boot Android TV all over again.

My old dumb TV did that in a second without booting an entire OS. Next time I need a big screen, it will be a computer monitor.

[–] notnotmike@programming.dev 12 points 1 month ago (1 children)

I was just thinking the other day how I'd love to "root" my TV like I used to root my phones. Maybe install some free OS instead

[–] rtxn@lemmy.world 81 points 1 month ago (7 children)

The dedicated TPM chip is already being used for side-channel attacks. A new processor running arbitrary code would be a black hat's wet dream.

[–] MajorHavoc@programming.dev 50 points 1 month ago (1 children)

It will be.

IoT devices are already getting owned at staggering rates. Adding a learning model that currently cannot be secured is absolutely going to happen, and it's going to cause a whole new wave of breaches.

The “s” in IoT stands for “security”

[–] NounsAndWords@lemmy.world 66 points 1 month ago (20 children)

I would pay for AI-enhanced hardware...but I haven't yet seen anything that AI is enhancing, just an emerging product being tacked on to everything they can for an added premium.

[–] DerisionConsulting@lemmy.ca 26 points 1 month ago

In the 2010s, it was cramming a phone app and wifi into things to try to justify the higher price, while also spying on users in new ways. The device might even have a screen for basically no reason.
In the 2020s, it's those same useless features, now with a bit of software with a flashy name that removes even more control from the user and lets the manufacturer spy on the user even further.

[–] Fermion@feddit.nl 18 points 1 month ago

It's like RGB all over again.

At least RGB didn't make a giant stock market bubble...

[–] ryathal@sh.itjust.works 12 points 1 month ago

Anything actually enhanced by AI would be advertised for the enhancement, not the AI part.

[–] PenisWenisGenius@lemmynsfw.com 57 points 1 month ago* (last edited 1 month ago) (1 children)

I'm generally opposed to anything that involves buying new hardware. This isn't the 1980s. Computers are powerful as fuck. Stop making software that barely runs on them. If they can't make AI more efficient, then fuck it. If they can't make game graphics good without a minimum of a $1000 GPU that produces as much heat as a space heater, maybe we need to go back to 2000s-era 3D. There is absolutely no point in making graphics more photorealistic than maybe Skyrim. The route they're going is not sustainable.

[–] reev@sh.itjust.works 27 points 1 month ago* (last edited 1 month ago) (2 children)

The point of software like DLSS is to run stuff better on computers with worse specs than what you'd normally need to run a game at that quality. There's plenty of AI tech that can actually improve experiences, and saying that Skyrim graphics are the absolute max we as humanity "need" or "should want" is a weird take ¯\_(ツ)_/¯
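(As an aside, here is a toy sketch of the general idea behind upscalers: render at a lower internal resolution, then upscale to the display resolution. This is not how DLSS itself works; DLSS is proprietary and uses a trained neural network plus motion vectors, while the placeholder below just repeats pixels.)

```python
import numpy as np

# Toy illustration of upscaled rendering: draw fewer pixels, then upscale to the
# display resolution. A real upscaler like DLSS replaces the naive repetition
# below with a trained model; this is only a sketch of the concept.

def render_frame(height, width):
    """Stand-in for an expensive renderer whose cost scales with pixel count."""
    return np.random.rand(height, width, 3)

def upscale_nearest(frame, scale):
    """Cheap nearest-neighbour upscale; the 'AI' part would replace this step."""
    return frame.repeat(scale, axis=0).repeat(scale, axis=1)

low_res = render_frame(540, 960)        # roughly a quarter of 1080p's pixels
presented = upscale_nearest(low_res, 2)
print(presented.shape)                  # (1080, 1920, 3)
```

Rendering at 540p is about a quarter of the pixel work of 1080p; the bet is that the upscaler recovers most of the perceived quality for far less than the saved rendering cost.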

[–] warm@kbin.earth 10 points 1 month ago* (last edited 1 month ago) (4 children)

The quality of games has dropped a lot. They make them fast, and as long as it can just about reach 60fps at 720p, they release it. Hardware is insane these days, yet the games mostly look the same as they did 10 years ago (Skyrim never looked amazing for 2011; BF3, Crysis 2, Forza, Arkham City etc. came out then too), while their performance has dropped significantly.

I don't want DLSS and I refuse to buy a game that relies on upscaling to have any meaningful performance. Everything should be over 120fps at this point, way over. But people accept the shit and buy the games up anyway, so nothing is going to change.

The point is, we would rather have games looking like Skyrim with great performance vs '4K RTX real time raytracing ultra AI realistic graphics wow!' at 60fps.

[–] the_post_of_tom_joad@sh.itjust.works 43 points 1 month ago (2 children)

Only 7% say they would pay more, which to my mind is the percentage of respondents who have no idea what "AI" in its current bullshit context even is

[–] taiyang@lemmy.world 10 points 1 month ago (3 children)

Or they know a guy named Al and got confused. ;)

[–] rainynight65@feddit.de 39 points 1 month ago (1 children)

I am generally unwilling to pay extra for features I don't need and didn't ask for.

[–] UltraGiGaGigantic@lemm.ee 31 points 1 month ago (5 children)

We're not gonna make it, are we? People, I mean.

[–] crazyminner@lemmy.ml 30 points 1 month ago (1 children)

I was recently looking for a new laptop and I actively avoided laptops with AI features.

[–] lamabop@lemmings.world 18 points 1 month ago

Look, me too, but the average punter on the street just looks at new AI features and goes, "OK, sure, give it to me." Tell them about the dodgy shit that goes with AI and you'll probably get a shrug at most

[–] n3m37h@sh.itjust.works 24 points 1 month ago (1 children)

Let me put it in lamens terms..... FUCK AI.... Thanks, have a great day

[–] iAmTheTot@sh.itjust.works 21 points 1 month ago (1 children)

FYI the term is "layman's", as if you were using the language of a layman, or someone who is not specifically experienced in the topic.

[–] krashmo@lemmy.world 19 points 1 month ago (1 children)

Sounds like something a lameman would say

[–] cygnus@lemmy.ca 23 points 1 month ago (2 children)

The biggest surprise here is that as many as 16% are willing to pay more...

Acktually it's 7% that would pay, with the remainder 'unsure'

[–] kemsat@lemmy.world 23 points 1 month ago (3 children)

What does AI enhanced hardware mean? Because I bought an Nvidia RTX card pretty much just for the AI enhanced DLSS, and I’d do it again.

[–] WhyDoYouPersist@lemmy.world 27 points 1 month ago

When they start calling everything AI, soon enough it loses all meaning. They're gonna have to start marketing things as AI-z, AI 2, iAI, AIA, AI 360, AyyyAye, etc. Got their work cut out for em, that's for sure.

[–] alessandro@lemmy.ca 21 points 1 month ago

I don't think the poll question was well made... "would you like to part with your money for..." vaguely shakes hand in air "...AI?"

People were already paying for "AI" even before ChatGPT came out and popularized things: DLSS

[–] UnderpantsWeevil@lemmy.world 21 points 1 month ago (2 children)

Okay, but hear me out. What if the OS got way worse, and then I told you that paying me for the AI feature would restore it to a near-baseline level of original performance? What then, eh?

[–] bouldering_barista@lemmy.world 21 points 1 month ago (4 children)

Who in the heck are the 16%?

[–] Honytawk@lemmy.zip 16 points 1 month ago (3 children)
  • The ones who have investments in AI

  • The ones who listen to the marketing

  • The ones who are big Weird Al fans

  • The ones who didn't understand the question

[–] Glytch@lemmy.world 10 points 1 month ago

I would pay for Weird-Al enhanced PC hardware.

[–] AVincentInSpace@pawb.social 17 points 1 month ago

I'm willing to pay extra for software that isn't

[–] qaz@lemmy.world 17 points 1 month ago* (last edited 1 month ago) (5 children)

I would pay extra to be able to run open LLMs locally on Linux. I wouldn't pay for Microsoft's Copilot stuff that's shoehorned into every interface imaginable while also causing privacy and security issues. The context matters.
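(For reference, a minimal sketch of what running an open-weight model locally can look like, assuming the Hugging Face transformers and accelerate packages; the model choice and settings are illustrative, not a recommendation from the commenter.)

```python
# Minimal local text-generation sketch; model name and settings are assumptions
# for illustration only. Uses a GPU if one is available, otherwise the CPU.
from transformers import pipeline

generate = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # small open-weight chat model
    device_map="auto",                           # GPU if available, else CPU
)

out = generate("Explain what an NPU is in one sentence.", max_new_tokens=60)
print(out[0]["generated_text"])
```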

[–] smokescreen@lemmy.ca 17 points 1 month ago

Pay more for a shitty ChatGPT clone in your operating system that can get exploited to hack your device. I see no flaw in this at all.

[–] capital@lemmy.world 14 points 1 month ago (1 children)

My old-ass GTX 1060 runs some of the open source language models. I imagine the more recent cards would handle them easily.

What’s the “AI” hardware supposed to do that any gamer with recent hardware can’t?
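(Rough back-of-envelope math supports this; the figures below are illustrative assumptions, not measurements.)

```python
# Back-of-envelope VRAM estimate for a quantized LLM (illustrative assumptions).
params = 7e9           # a 7B-parameter model
bits_per_weight = 4    # 4-bit quantization
overhead = 1.2         # rough allowance for KV cache and runtime buffers

vram_gb = params * bits_per_weight / 8 / 1e9 * overhead
print(f"~{vram_gb:.1f} GB")  # ~4.2 GB, which fits in a 6 GB GTX 1060
```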

[–] ArchRecord@lemm.ee 13 points 1 month ago* (last edited 1 month ago) (4 children)

And when traditional AI programs can be run on much lower end hardware with the same speed and quality, those chips will have no use. (Spoiler alert, it's happening right now.)

Corporations, for some reason, can't fathom why people wouldn't want to pay hundreds of dollars more just for a chip that can run AI models they won't need most of the time.

If I want to use an AI model, I will, but if you keep developing shitty features that nobody wants using it, just because "AI = new & innovative," then I have no incentive to use it. LLMs are useful to me sometimes, but an LLM that tries to summarize the activity on my computer isn't that useful to me, so I'm not going to pay extra for a chip that I won't even use for that purpose.

[–] UltraMagnus0001@lemmy.world 12 points 1 month ago (1 children)

Fuck, they won't even upgrade to TPM for Windows 11.

[–] FMT99@lemmy.world 12 points 1 month ago (3 children)

Show the actual use case in a convincing way and people will line up around the block. Generating some funny pictures or making generic suggestions about your calendar won't cut it.

[–] chicken@lemmy.dbzer0.com 11 points 1 month ago

I can't tell how good any of this stuff is, because none of the language they're using to describe performance makes sense in comparison with running AI models on a GPU. How big a model can this stuff run, and how does it compare to the graphics cards people use for AI now?
