this post was submitted on 18 Feb 2024
184 points (100.0% liked)

Technology

[–] Zworf@beehaw.org 24 points 7 months ago (1 children)

Experts told the Vancouver Sun that Air Canada may have succeeded in avoiding liability in Moffatt's case if its chatbot had warned customers that the information that the chatbot provided may not be accurate.

Just no.

If you can't guarantee it's accurate then don't offer it.

As a customer, I don't want to deal with lying chatbots and then have to figure out whether what they told me is true.

[–] intensely_human@lemm.ee 17 points 7 months ago

Exactly. The goal of customer service is to resolve issues. If communication isn't precise and accurate, then nothing can be resolved.

Imagine this:

"Okay Mr Jones. I've filed the escalation as we've discussed and the reference number is 130912831"

"Okay, so are we done here?"

"You may end this conversation if you would like. Please keep in mind that 20% of everything I say is false"

"But we're done right?"

"Yes"

"What was that confirmation number again?"

"783992831"

"That's different from the one you gave me before"

"Oh sorry my mistake the confirmation number is actually 130912831-783992831. Don't forget the dash! Is there anything else I can help you with?"