this post was submitted on 22 Nov 2023

A judge has found “reasonable evidence” that Elon Musk and other executives at Tesla knew that the company’s self-driving technology was defective but still allowed the cars to be driven in an unsafe manner, according to a recent ruling issued in Florida.

Palm Beach county circuit court judge Reid Scott said he had found evidence that Tesla “engaged in a marketing strategy that painted the products as autonomous” and that Musk’s public statements about the technology “had a significant effect on the belief about the capabilities of the products”.

The ruling, reported by Reuters on Wednesday, clears the way for a lawsuit over a fatal crash in 2019 north of Miami involving a Tesla Model 3. The vehicle crashed into an 18-wheeler truck that had turned on to the road into the path of driver Stephen Banner, shearing off the Tesla’s roof and killing Banner.

top 50 comments
[–] bedrooms@kbin.social 42 points 10 months ago* (last edited 10 months ago) (7 children)

The concept of autonomous cars might be game over.

As always, advocates forgot about corporate greed. Do you trust your manufacturer not to lie to you? Enough that you'd risk killing yourself, your family, and other people on the road?

[–] Ottomateeverything@lemmy.world 37 points 10 months ago (25 children)

Yeah, the scary part of this is that as much as I would absolutely never go near this shit with a ten-foot pole while it's clearly still woefully inadequate and overhyped... these cars very frequently drive within ten feet of me, because for some reason it's legal to put this shit on roads with unwilling participants.

[–] Fedizen@lemmy.world 8 points 10 months ago* (last edited 10 months ago) (1 children)

Autonomous vehicles ten years ago: Human drivers are slow and prone to lapses in judgement

Autonomous vehicles today: Elon Musk, a guy who famously destroyed a rare vehicle like a dumbass, will be training the AI that drives you around. It won't know how to respond to an event not encountered in its training data, and it will occasionally run into an ambulance.

[–] jopepa@lemmy.world 5 points 10 months ago

And his employees hate him so much I wouldn’t be surprised if there’s a patch released that makes one sustained fart noise when airbags deploy.

[–] mateomaui@reddthat.com 4 points 10 months ago

Maybe, at least until there’s a comprehensive infrastructure of external sensors on roads, at intersections, and so on to control and limit vehicle movement. But those improvements are probably a long way off, considering how far behind routine road and bridge maintenance already is.

[–] RushingSquirrel@lemm.ee 2 points 10 months ago (1 children)

To me, autonomous vehicles are like AI (in Tesla's case it actually is AI): the public perception is that they're way better than they really are, because they're really good in 80% of cases. But getting to 90-95% will still take many, many years. That doesn't mean we shouldn't use them, nor abandon them. To progress, we have to keep using them with caution: learn the limits and work within them. Don't start firing people to replace them with AI, because in a few months or years you'll realize that the remaining 20% hurts more than you thought. In the same way, you shouldn't remove drivers just yet.

[–] IphtashuFitz@lemmy.world 2 points 10 months ago* (last edited 10 months ago) (1 children)

But it’s not true AI. In my decades of experience driving cars I’ve encountered numerous edge cases that I never explicitly learned about back in driver’s ed. One recent case in point: I pulled up to a red light at a fairly busy intersection and stopped. While the light was still red, a police officer on the corner at a construction site walked out and tried to wave me through the intersection. I was watching the red light, so I didn’t even see him until he yelled at me.

How would an autonomous AI car handle that situation if it’s not explicitly trained to recognize it? It would need to recognize the police officer as an authority that legitimately overrides the red light.

Same intersection a few years earlier, I saw a car engulfed in flames right in the middle of it. I saw and heard the fire trucks rapidly approaching as I got to the intersection. I, and others, realized we needed to get out of the way quickly. Would a Tesla AI (or any other) recognize that the car is on fire and safely move away, or would it just recognize the shape of the car and patiently wait for it to move out of the intersection before proceeding?

The point is that it’s virtually impossible to predict for, and program an AI to handle, every single situation it might ever encounter. A true AI would be trained on a lot of these sorts of scenarios but would need to be capable of recognizing edge cases it hasn’t encountered before as well. It would then need to react as safely as possible to those edge cases in a manner similar to how a human would.
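
A rough sketch of what that kind of defensive fallback could look like in code, purely for illustration; the `Detection` type, `plan_action` function, labels, and the 0.85 threshold below are invented, not anything Tesla or any other vendor actually ships:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    label: str          # e.g. "red_light", "officer_waving_traffic_through"
    confidence: float   # 0.0 .. 1.0, as reported by a hypothetical perception model

CONFIDENCE_FLOOR = 0.85  # made-up threshold for "I actually know what I'm looking at"

def plan_action(detections: List[Detection]) -> str:
    """Pick a driving action, defaulting to the safest option whenever the
    model is unsure -- i.e. whenever it hits a scene it wasn't trained on."""
    if not detections or min(d.confidence for d in detections) < CONFIDENCE_FLOOR:
        # Unfamiliar scene: don't guess, slow down and hand control back.
        return "slow_to_stop_and_alert_driver"
    if any(d.label == "officer_waving_traffic_through" for d in detections):
        # A human directing traffic legitimately overrides the signal.
        return "follow_officer_direction"
    if any(d.label == "red_light" for d in detections):
        return "stop_at_line"
    return "proceed"

# The officer is detected, but only at 40% confidence, so the planner
# refuses to guess and defers to the human driver instead.
print(plan_action([Detection("officer_waving_traffic_through", 0.40)]))
```

The hard part, of course, is everything hidden behind that confidence score: knowing how unsure the model really is on inputs it has never seen is exactly the unsolved bit.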

Edit: Downvotes must be from Tesla fanbois who can’t face reality. If they had legitimate arguments they would have replied…

[–] RushingSquirrel@lemm.ee 1 points 10 months ago

This is why AI is the solution, rather than coding everything by hand. How does one learn to react in these situations? Either you've learned from watching your parents, from taking lessons, from reading the rules of the road, or simply from following other drivers. The goal of an AI is to be able to do just that. Coding every single use case is way too complex.

I know Tesla has worked on improving how it handles emergency-vehicle situations, but I don't know how, or what the current state is.

Why are you being downvoted?

[–] CmdrShepard@lemmy.one 2 points 10 months ago

Hell yeah, let these drivers behind the wheel plow into more semi trucks. They deserve it after all. /s

[–] stolid_agnostic@lemmy.ml 2 points 10 months ago (2 children)

I think you need protected ways where no pedestrians or non-autonomous vehicles may enter. Short of that, I think you’re right.

[–] Stache_@lemmy.ml 3 points 10 months ago

You’ve just described a train line sir, and we need way more of that shit around here

[–] Patches@sh.itjust.works 1 points 10 months ago (1 children)

Yes it's called the entire public road system. Alpha test your product on your own roads!

[–] stolid_agnostic@lemmy.ml 1 points 10 months ago (1 children)

That’s the point. The road system has cyclists and pedestrians that these cars like to kill.

[–] Patches@sh.itjust.works 1 points 10 months ago (1 children)

Yes, but what you said sounded like we needed to make protected ways. It should be them making the protected ways.

[–] stolid_agnostic@lemmy.ml 1 points 10 months ago

I’m saying that this is stupid technology that will only work if you separate it from the public.

[–] Death_Equity@lemmy.world 1 points 9 months ago

The Wright brothers’ first flight covered less than the wingspan of a Boeing 747, an aircraft with a range of over 8,000 miles. The Internet was once called a fad.

Autonomous cars will be the future, and people will die before they become the de facto method of personal transport. The unwilling sacrifices of a public alpha test of the technology are the losses we must endure to achieve the unparalleled safety of ubiquitous autonomous vehicles that mitigate traffic congestion, pedestrian deaths, unwieldy public transit, and the shortcomings of urban sprawl.

The deaths caused by early adoption benefit the greater good, and we should be willing to accept them as a necessary evil.

Not that I would ever trust a computer to drive my car. I will drive my own car until it kills me, financially or literally, but I can see what good an imperfect system struggling with growing pains will create.

[–] CanadianCorhen@lemmy.ca 29 points 10 months ago (1 children)

It’s definitely insane how early Tesla started selling their “self driving” cars. The fact that there are cars whose owners paid for self driving and never got more than a Level 3 system is crazy.

[–] polygon6121@lemmy.world 10 points 10 months ago

Even the beta is still considered a Level 2 system. Level 3 would require the system to conditionally take over in certain situations; you would quickly win a Darwin Award if you consistently trusted FSD in any given situation.

[–] bedrooms@kbin.social 19 points 10 months ago (1 children)

Maybe they should start a new motor racing series where autonomous cars race for 24 hours at 80 km/h with random people walking on the circuit. Then we can trust autonomous cars.

[–] Nobody@lemmy.world 13 points 10 months ago

“We decided to bring the issue to Mr. Musk after the 5000th child died in the simulations. He asked if the children were going to be white.”

[–] leaky_shower_thought@feddit.nl 10 points 10 months ago

AI employee: "We can't release the cars, Senior Muskrat! We just don't have enough test data to guarantee the algorithm works!"

Senior Muskrat: "test data, you say..."

galaxy brain intensifies

[–] dan1101@lemm.ee 7 points 10 months ago (1 children)

"Full Self Driving" was such bullshit, call it Tesla Driving Assistant or something.

[–] CmdrShepard@lemmy.one 2 points 10 months ago

This article is about Autopilot, which is advanced cruise control.

[–] mateomaui@reddthat.com 3 points 10 months ago

Poor guy is having a rough week.

[–] little_hermit@lemmus.org 3 points 10 months ago

The real elephant in the room with AI is that when it works, it's because the network has been over-fitted to its training data. And when something completely novel is fed into it, it spits out nonsense and runs over your dog because it looked like a shadow on the asphalt. Poor Fido.
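
A toy illustration of what over-fitting means in practice — a numpy sketch with made-up numbers standing in for a model, nothing to do with any real driving stack: give a model enough capacity to memorise its noisy training points and it looks flawless on them, then produces confident garbage the moment you query an input it never saw.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: a simple linear relationship (y = x) plus a little sensor noise.
x_train = np.linspace(0.0, 4.0, 8)
y_train = x_train + rng.normal(scale=0.2, size=x_train.size)

# A degree-7 polynomial has enough free parameters to thread through all
# eight noisy points exactly -- zero training error, i.e. over-fitted.
overfit = np.poly1d(np.polyfit(x_train, y_train, deg=7))

print(np.abs(overfit(x_train) - y_train).max())  # ~0: flawless on inputs it memorised
print(overfit(8.0))  # a novel input outside the training range: typically nowhere near 8,
                     # even though the underlying relationship is just y = x
```

Real perception networks are regularised and trained on vastly more data, but the failure mode described above is the same shape: a confident output with no basis for it.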

[–] Mark132012@lemmy.world -2 points 10 months ago

Y'all read like you finally found a foothold in a larger PR campaign and are trying to fill your quota.