submitted 9 months ago by JRepin@lemmy.ml to c/technology@beehaw.org

cross-posted from: https://lemmy.ml/post/5400607

This is a classic case of tragedy of the commons, where a common resource is harmed by the profit interests of individuals. The traditional example of this is a public field that cattle can graze upon. Without any limits, individual cattle owners have an incentive to overgraze the land, destroying its value to everybody.

We have commons on the internet, too. Despite all of its toxic corners, it is still full of vibrant portions that serve the public good — places like Wikipedia and Reddit forums, where volunteers often share knowledge in good faith and work hard to keep bad actors at bay.

But these commons are now being overgrazed by rapacious tech companies that seek to feed all of the human wisdom, expertise, humor, anecdotes and advice they find in these places into their for-profit A.I. systems.

all 24 comments
[-] jarfil@beehaw.org 72 points 9 months ago

serve the public good — places like Wikipedia and Reddit forums

Sorry, lost me at Reddit.

[-] storksforlegs@beehaw.org 47 points 9 months ago

Reddit is a flaming monetized dumpster now but it used to be what the article is describing. (mostly)

[-] Franzia@lemmy.blahaj.zone 18 points 9 months ago

Exactly, and the idea that they serve the public good? They don't serve us; we built what's good there.

[-] lol3droflxp@kbin.social 5 points 9 months ago

I think they meant the community. The same applies to Wikipedia; it's not like the site would be worth anything without dedicated users.

[-] noctisatrae@beehaw.org 7 points 9 months ago

Reddit has been welcoming for some years tbh

Now I don’t want to hear about it!

[-] sculd@beehaw.org 46 points 9 months ago

The problem is neoliberalism that seeks to turn everything into profit and sees money as the only valuable target in the world.

[-] Jummit@lemmy.one 16 points 9 months ago* (last edited 9 months ago)

Yes. The "tragedy of the commons" is a myth.

Without any limits, individual cattle owners have an incentive to overgraze the land, destroying its value to everybody.

This is factually false, because the land will be destroyed and individuals don't benefit, not even in the short term. Commons work great (see open source software), but capitalism and power structures abuse and destroy them for short-term profit.

[-] conciselyverbose@kbin.social 6 points 9 months ago

The fact that the land is destroyed is literally the point.

It doesn't matter on what time scale the land is destroyed. At every individual point, having your cattle eat more is better for you than having them eat less, because even if you starve your cattle completely, that still won't stop the destruction.

The fact that you somehow don't understand the very simple metaphor is not a failing of the metaphor.

[-] Jummit@lemmy.one 7 points 9 months ago* (last edited 9 months ago)

If this were how everyone acted in their daily lives, you would see crime, theft and abuse on an unimaginable scale. No, people don't always do what benefits them "at every individual point". We are social creatures, acting as a community in which individuals benefit from working together, although this has been successfully undermined by capitalism and other hierarchies.

This whole concept is also known as the Prisoner's Dilemma, one of my favorite thought experiments, because it shows how being individually rational can result in everyone being worse off.
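To make that incentive structure concrete, here is a minimal sketch of the grazing commons as an N-player prisoner's dilemma. The herder count and payoff numbers are purely hypothetical; they are chosen only so that overgrazing is the dominant choice for each individual herder, even though everyone ends up worse off when all of them overgraze.

```python
# Hypothetical numbers for illustration only, not from the article.
N = 4                 # herders sharing the pasture
A, B = 10, 1          # per-cow value = A - B * (total cows grazing)

def payoff(my_cows: int, others_cows: int) -> int:
    """Value an individual herder gets from their own cows."""
    per_cow_value = A - B * (my_cows + others_cows)
    return my_cows * per_cow_value

# No matter how many cows the other herders graze, grazing 2 cows
# ("overgraze") pays the individual more than grazing 1 ("restrain")...
for others in range(N - 1, 2 * (N - 1) + 1):   # others graze 3..6 cows in total
    assert payoff(2, others) > payoff(1, others)

# ...yet if everyone overgrazes, each herder ends up worse off than
# if everyone had shown restraint.
everyone_restrains = payoff(1, N - 1)          # 1 cow each  -> 6 per herder
everyone_overgrazes = payoff(2, 2 * (N - 1))   # 2 cows each -> 4 per herder
print("all restrain: ", everyone_restrains, "per herder")
print("all overgraze:", everyone_overgrazes, "per herder")
assert everyone_overgrazes < everyone_restrains
```

With these made-up numbers, mutual restraint yields 6 per herder while mutual overgrazing yields 4, yet overgrazing beats restraint no matter what the others do: the "rational individually, worse off collectively" structure both comments are arguing about.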

[-] conciselyverbose@kbin.social 1 points 9 months ago* (last edited 9 months ago)

It is never not advantageous, as an individual, to graze as much as possible. Your "analysis" ignores that very basic, unarguable fact.

[-] Jummit@lemmy.one 4 points 9 months ago

Sure, it's advantageous in the short term. I think this is where we misunderstand each other. What I'm trying to say is that under normal circumstances, individuals aren't maximizing their output. They are just living as part of the community, following the unwritten rules and benefiting from that. (In the prisoner's dilemma, this would be the cooperative choice.)

[-] conciselyverbose@kbin.social 1 points 9 months ago* (last edited 9 months ago)

That's the point. That's the entire (and entirely correct) metaphor.

People ignore the communal benefit because in the short term it's better to do so.

[-] CanadaPlus@lemmy.sdf.org 1 points 9 months ago

Although this has been successfully undermined by capitalism and other hierarchies.

Yeah, but when we've had no hierarchy we've always had constant warfare, which is also highly abusive (not to say that's inevitable). We're just as naturally capable of antisocial behavior as other species, when we can either socially get away with it (cow grazing, Easter Islanders killing all their trees) or do it to people we've decided are others (every time hunter-gatherer bands killed or enslaved each other).

You're right we can act cooperatively in the right situation, but let's not make it sound like human ignorance is new or unnatural.

[-] astraeus@programming.dev 30 points 9 months ago

Enshittification is the middle name of AI

[-] Kolanaki@yiffit.net 13 points 9 months ago

Artificial Enshittification Intelligence

[-] loops@beehaw.org 7 points 9 months ago

Aeeeiiiii!!

[-] lloram239@feddit.de 14 points 9 months ago* (last edited 9 months ago)

As far as I am concerned, the Internet has been in a downward spiral ever since smartphones got popular, with nothing in sight to stop that trend. The Web got crippled by content getting moved into proprietary apps and what's left of the Web is filled with so many ads that an adblocker is pretty much mandatory.

AI provides a way out of this darkness, as it can absorb raw information and regurgitate it in whatever form you desire. That's huge; it's like ad blocking, reader mode and a whole lot of other tools rolled into one, just even more flexible and controllable by natural language. You can finally separate the information from its (often malicious) presentation. Bots like AutoTL;DR are just the start of it; a lot more little helpers like that will follow, especially once we get multi-modal models that can understand and navigate graphical elements.

That would be a good step forward, since many A.I. systems do not fully disclose the data they were trained on.

Neither do most journalists. Most of the articles out there are just copied from other blogs, lacking any originality or fact checking and not even providing links to those sources. Even this very article is just regurgitating the same tired old talking points that have been circulating for a year or so.

[-] StereoTypo@beehaw.org 3 points 9 months ago* (last edited 9 months ago)

While I admire your optimism, I think AI will snuff out journalists, writers and artists regardless of the merit of their work. That is possibly beneficial when looking for some factual data, but I think we will face a crisis when it comes to creative, investigative and critical content. Like a garden hose of original work spraying against a tsunami of undisclosed generative media.

[-] MummifiedClient5000@feddit.dk 9 points 9 months ago

It's already much worse. But it's about to get much worser.

[-] autotldr@lemmings.world 8 points 9 months ago

🤖 I'm a bot that provides automatic summaries for articles:

Thanks to artificial intelligence, however, IBM was able to sell Mr. Marston’s decades-old sample to websites that are using it to build a synthetic voice that could say anything.

A.I.-generated books — including a mushroom foraging guide that could lead to mistakes in identifying highly poisonous fungi — are so prevalent on Amazon that the company is asking authors who self-publish on its Kindle platform to also declare if they are using A.I.

But these commons are now being overgrazed by rapacious tech companies that seek to feed all of the human wisdom, expertise, humor, anecdotes and advice they find in these places into their for-profit A.I. systems.

Consider, for instance, that the volunteers who build and maintain Wikipedia trusted that their work would be used according to the terms of their site, which requires attribution.

A Washington Post investigation revealed that OpenAI’s ChatGPT relies on data scraped without consent from hundreds of thousands of websites.

Whether we are professional actors or we just post pictures on social media, everyone should have the right to meaningful consent on whether we want our online lives fed into the giant A.I.


Saved 83% of original text.

[-] fixmycode@feddit.cl 8 points 9 months ago

I don't want to sound like a pessimist, but the Internet has never been the open grass field that the OP paints. Every time you connect to the Internet, you're connecting to a server that some entity is providing, through a connection that another entity has set up. Even this Lemmy instance is paid for out of somebody's pocket. Servers and network infrastructure have always represented a cost to providers. Maybe in times of olde, when AOL and others offered services attached to their core service, we had services that were directly paid for by the fee we paid for the connection. The owner of this Lemmy instance doesn't see a dime of what you pay your ISP.

I know this is not at the core of this discussion, but if content is something that entities find valuable and, somehow, the owner of this instance can directly receive a monetary incentive from me to keep posting these inadequate long texts, then by all means, I'm happy to be part of the training data. I type this while I'm bored as hell and need my upvote-provided dopamine hit. I will be the grass on the field.

[-] StereoTypo@beehaw.org 3 points 9 months ago

I think you make some valid points about server and network ownership, but I think the mid-to-late 2000s had an interesting blend of corporate and personal internet spaces. Yes, there were some leviathans like MySpace and Facebook, but a lot of the places that had relevancy were forums, portals or imageboards. Hell, YTMND was an early example of an individual hosting and building a massive site out of complete nonsense. That landscape has certainly changed significantly since.

[-] CanadaPlus@lemmy.sdf.org 5 points 9 months ago* (last edited 9 months ago)

For the trust issues: Yeah, but on the other hand.

We'll just have to go back to figuring out who we can trust again. That's a loss, but not as catastrophic as this article makes it sound.

As for the creativity issues, that's not a new problem at all. Portrait painting used to be a lucrative profession before cameras, for example. If it's only a few types of content (namely, in this case, the kind that doesn't rely on concept originality), society adjusts with only short-term pain. Bug me about it when we have GAI in a few years that can do everything, or about how the shift this will encourage in our society could cause secondary problems.

I'm sure glad we're talking about this stuff, though! If AI happened quietly with no debate I'd be much more afraid.

this post was submitted on 23 Sep 2023
79 points (100.0% liked)
