submitted 10 months ago by L4s@lemmy.world to c/technology@lemmy.world

New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times::Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

[-] daikiki@lemmy.world 31 points 10 months ago

The difference is that cruise control will maintain your speed, but 'autopilot' may avoid or slow down for obstacles. Maybe it avoids obstacles 90% of the time or 99% of the time. It apparently avoids obstacles enough that people can get lulled into a false sense of security, but once in a while it slams into the back of a stationary vehicle at highway speed.

It's easy to say it's the driver's responsibility, and ultimately it is, of course, but in practice, a system that works almost all of the time but occasionally casually kills somebody is very dangerous indeed, and saying it's all the driver's fault isn't really realistic or fair.

[-] abhibeckert@lemmy.world 17 points 10 months ago* (last edited 10 months ago)

A lot of modern cruise control systems will match the speed of the car in front of you and stop if they stop. They'll also keep the car in the current lane. And even without cruise control, most modern cars will stop if a pedestrian steps onto the road.

It's frustrating that Tesla's system can't detect a stationary police car in the middle of the road... but at the same time apparently that's quite a difficult thing to do and it's not unique to Tesla.

It's honestly not too much to ask a driver to step on the brakes if there's a cop car stopped on the road.

[-] SpaceNoodle@lemmy.world 2 points 10 months ago* (last edited 10 months ago)

It's actually not that hard to do, but Tesla is not willing to spend the necessary time and resources to solve the hard problems.

[-] NeoNachtwaechter@lemmy.world 5 points 10 months ago

Maybe it avoids obstacles 90% of the time or 99% of the time.

99% is not enough!

99% means many, many more dead people.

You need to go for 99.99%

[-] ilickfrogs@lemmy.world 3 points 10 months ago* (last edited 10 months ago)

Actually it's absolutely realistic and fair. I don't like Musk, or Tesla for that matter. But they make it pretty damn clear that you're 100% responsible for the vehicle when using that feature. Anyone who assumes they don't need to pay attention is a moron and should be held responsible. If a 747 autopilot system starts telling the pilot to take control of the plane and they don't... we wouldn't blame the manufacturer, we'd blame the shitty pilot that didn't do their job.

[-] ShittyBeatlesFCPres@lemmy.world 20 points 10 months ago

I can’t wait to get smacked by a Tesla beta tester and have everyone debate whether the car or the driver is responsible for my innards being spread across 4 lanes. Progress!

[-] daikiki@lemmy.world 9 points 10 months ago

If the driver gets lulled into a false sense of security by a convenience system like this and the automation fails, it's one thing to blame the driver, and that may or may not be fair depending on how much trust you place in the average driver's competence. But the (hypothetical) victim is still dead, and who we decide to blame won't make one iota of difference to that.

this post was submitted on 14 Aug 2023
505 points (96.7% liked)
