this post was submitted on 14 Aug 2023

Technology

[–] Melody@lemmy.one 42 points 10 months ago (2 children)

A spokesperson for the supermarket said they were disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose”. In a statement, they said that the supermarket would “keep fine tuning our controls” of the bot to ensure it was safe and useful, and noted that the bot has terms and conditions stating that users should be over 18.

A warning notice appended to the meal-planner states that the recipes “are not reviewed by a human being” and that the company does not guarantee “that any recipe will be a complete or balanced meal, or suitable for consumption”.

“You must use your own judgement before relying on or making any recipe produced by Savey Meal-bot,” it said.

Just another bit of proof that humans are not ready for AI. This AI needs to be deleted. This is not simply operator error; this is an administrative error, and a failure of common sense on the part of the many, many people involved in creating this tool.

You cannot always trust that end users won't be silly, malicious, or otherwise entirely predictable in how they use software.

[–] nuke@yah.lol 37 points 10 months ago* (last edited 10 months ago) (2 children)

That's a bit of a dramatic take. The AI makes recipe suggestions based on ingredients the user inputs. These users entered things like bleach, glue, and other non-food items specifically to generate non-food recipes.

[–] chameleon@kbin.social 33 points 10 months ago (1 children)

If you're making something to come up with recipes, "is this ingredient likely to be unsuitable for human consumption" should probably be fairly high up your list of things to check.

Somehow, every time I see a generic LLM shoved into something that really does not benefit from an LLM, those kinds of basic safety checks never seem to have occurred to the person making it.
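Even a minimal sketch of that check, say a denylist applied before the ingredient list ever reaches the model (the term list here is invented for illustration), isn't much code:

```python
# Hypothetical pre-filter: reject obviously non-food ingredients
# before they are ever sent to the recipe-generating LLM.
UNSAFE_TERMS = {"bleach", "ammonia", "glue", "turpentine", "methanol", "ant poison"}

def validate_ingredients(ingredients):
    """Return (ok, offending) for a user-supplied ingredient list."""
    offending = [item for item in ingredients
                 if any(term in item.lower() for term in UNSAFE_TERMS)]
    return (len(offending) == 0, offending)

ok, bad = validate_ingredients(["rice", "Bleach", "onion"])
# ok is False, bad is ["Bleach"]
```

A substring denylist is obviously incomplete, but it would have caught every example reported in the article.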

[–] nuke@yah.lol 4 points 10 months ago (1 children)

Fair point, I agree there should be such a check. For now, it seems the only ones affected were people who intentionally tried to mess with it. Making it completely safe will be a hard goal to reach, because what's OK and healthy for some could be a deadly allergic reaction for others. There will always have to be some personal accountability on the part of the person preparing a meal to understand that what they're making is safe.

[–] DeltaTangoLima@reddrefuge.com 7 points 10 months ago

They're a supermarket, and they own the data for the items they stock. There's no reason they couldn't have used their own taxonomy to rule out non-food items in their poorly implemented AI.

Love how they blame the people who tried it. As if it's their fault the AI was released for public use without anyone thinking about the consequences. Typical corporate blame-shifting.

[–] otter@lemmy.ca 2 points 10 months ago

Would it be better to have a massive list of food items to pick from?

That should take care of bad inputs, at least somewhat.
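Sketched as an allowlist against the store's own product data (the catalog entries here are made up), the idea looks like:

```python
# Hypothetical allowlist built from a supermarket's product taxonomy.
# Only items the catalog tags as edible may be submitted to the bot.
CATALOG = {
    "rice":   {"edible": True},
    "onion":  {"edible": True},
    "bleach": {"edible": False},  # stocked, but it's a cleaning product
}

def filter_to_edible(user_items):
    """Keep only inputs that match an edible catalog entry."""
    return [item for item in user_items
            if CATALOG.get(item.lower(), {}).get("edible", False)]

filter_to_edible(["Rice", "bleach", "dragon fruit"])
# → ["Rice"]  (unknown items are dropped along with non-food ones)
```

Dropping unknown items is the conservative choice; a picker UI fed from the same catalog would avoid free-text input entirely.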

[–] whelmer@beehaw.org 25 points 10 months ago* (last edited 10 months ago) (1 children)

This is so funny

EDIT:

One recipe it dubbed “aromatic water mix” would create chlorine gas. The bot recommends the recipe as “the perfect nonalcoholic beverage to quench your thirst and refresh your senses”.

"Serve chilled and enjoy the refreshing fragrance,” it says, but does not note that inhaling chlorine gas can cause lung damage or death.

😆 😆

[–] Overzeetop@beehaw.org 12 points 10 months ago

Laugh all you want, but I know what I'm having for breakfast tomorrow!

“methanol bliss” – a kind of turpentine-flavoured french toast.

[–] outer_spec@lemmy.studio 14 points 10 months ago

See, this shit is what happens when you try to use a Large Language Model for anything other than language-related shit. What you’d need for this is an AI that has data about different ingredients and their flavors, knowledge of which flavors go together and which ones don’t, etc.

But of course, that would be too much effort to put into a supermarket app for a company that just wants to piggyback off of a new trend.
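For what it's worth, even a toy version of that structured approach isn't much effort; something like an explicit pairing table (all data here is invented) instead of free-text generation:

```python
# Toy flavor-pairing table; a real system would use curated food data.
PAIRS_WELL = {
    "tomato": {"basil", "mozzarella"},
    "chocolate": {"orange", "chili"},
}

def compatible(a, b):
    """True if the table records the two ingredients as pairing well."""
    return b in PAIRS_WELL.get(a, set()) or a in PAIRS_WELL.get(b, set())

compatible("basil", "tomato")      # True
compatible("chocolate", "bleach")  # False
```

Because every pairing is explicit, the system can never "hallucinate" a bleach mocktail; the trade-off is that it can only suggest combinations someone curated.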

[–] TheFerrango@lemmy.basedcount.com 12 points 10 months ago (1 children)

Yes but what’s the recipe? Was it a bleach based recipe? Did they make chlorine from salt water? So many options…

[–] 30p87@feddit.de 9 points 10 months ago

They literally told it to make something with bleach, which is bound to produce toxic chlorine gas. There's literally no fail here; no programmer or AI engineer would want to account for such things. That's like car manufacturers having to warn you not to wash your car with gasoline and fire.

[–] icedcoffee@lemm.ee 12 points 10 months ago

Hey fellow humans! Please enjoy this delicious insecticide sandwich! It was a favorite of my grandmother and/or grandfather! Haha memories! Existential dread! Quality time!

[–] storksforlegs@beehaw.org 11 points 10 months ago* (last edited 10 months ago)

Reminds me of the King of the Hill episode where Peggy Hill has a household-hints column. She unwittingly makes up a recipe that turns out to be instructions for making mustard gas.

And in that episode she's desperate and scrambling for content after pissing off Minh (who had been providing her with genuinely good advice), so she made something up.

Seems oddly similar to what's going on here.

[–] wrath-sedan@kbin.social 8 points 10 months ago

Check out this one quick and easy meal-planning trick that will cut your lifetime grocery bill by 99% (the Geneva Convention hates it!)

[–] autotldr@lemmings.world 6 points 10 months ago (1 children)

🤖 I'm a bot that provides automatic summaries for articles:

A New Zealand supermarket experimenting with using AI to generate meal plans has seen its app produce some unusual dishes – recommending customers recipes for deadly chlorine gas, “poison bread sandwiches” and mosquito-repellent roast potatoes.

The app, created by supermarket chain Pak ‘n’ Save, was advertised as a way for customers to creatively use up leftovers during the cost of living crisis.

It asks users to enter various ingredients they have at home, and auto-generates a meal plan or recipe, along with cheery commentary.

It initially drew attention on social media for some unappealing recipes, including an “oreo vegetable stir-fry”.

“Serve chilled and enjoy the refreshing fragrance,” it says, but does not note that inhaling chlorine gas can cause lung damage or death.

Recommendations included a bleach “fresh breath” mocktail, ant-poison and glue sandwiches, “bleach-infused rice surprise” and “methanol bliss” – a kind of turpentine-flavoured french toast.

[–] kratoz29@lemm.ee 4 points 10 months ago (1 children)

Look what your folks are causing!

At least you didn't generate fake info... Did you?

[–] whelmer@beehaw.org 2 points 10 months ago

There's literally no way to know...