Tesla ‘full self-driving’ triggered an eight-car crash, a driver tells police (cnn.com)
78 points by petilon on Dec 22, 2022 | hide | past | favorite | 63 comments



> Tesla’s driver-assist technologies, Autopilot and “full self-driving” are already being investigated by the National Highway Traffic Safety Administration following reports of unexpected braking that occurs “without warning, at random, and often repeatedly in a single drive.”

> The agency has received hundreds of complaints from Tesla drivers. Some have described near crashes and concerns about their safety. This summer NHTSA upgraded the investigation to what it calls an engineering analysis, an indication that it’s seriously considering a recall.

Very curious about how Tesla could continue to justify selling full self driving if NHTSA demanded a recall.


The quote you cited states they haven’t demanded a recall, only that they’re seriously considering one. It’s not a recall until it’s a recall.

https://www.nhtsa.gov/technology-innovation/automated-vehicl... (NHTSA: Automated Vehicles for Safety)

> If a vehicle is driving itself, who is liable if the vehicle crashes? How is the vehicle insured?

> It is vital to emphasize that drivers will continue to share driving responsibilities for the foreseeable future and must remain engaged and attentive to the driving task and the road ahead with the consumer-available technologies today. However, questions about liability and insurance are among many important questions, in addition to technical considerations, that policymakers are working to address before automated driving systems reach their maturity and are available to the public.


I believe that's why it would be phrased 'could ... if' instead of 'will ... as'


That's interesting, I actually read it the unintended way on the first go. I guess 'would ... if' might be clearer, but it's also kind of a different question.


The article doesn't say it but I believe this issue (of going from highway speed suddenly down to ~20mph) has been called 'phantom braking' by Tesla owners.


Yep, and after my five years of owning a Model S they still haven't been able to fix it. At least it keeps me from getting too complacent when I have AP activated. It's not really a full-on slamming of the brakes for me, just a very brisk slowdown, and I can usually catch it after it's slowed by about 20 kph. But yeah, AP/FSD really hasn't been making any progress, and I assume legacy automakers' distance- and lane-keeping systems must be safer and more reliable than Tesla's by now.


I guess all 8 cars were Teslas with FSD enabled because obviously attentive human drivers would have been paying attention with adequate distance and been able to handle the sudden slow down.


New cars without self driving have assists to avoid exactly these kind of crashes - it takes over and brakes for you.


I have a MY with FSD, and I do not trust it at all. I don’t want to make excuses for the car, but if you’re paying attention then your car won’t scrub 35 mph and cause a pileup. In my experience I can catch it within a 5-8 mph drop, and the instant acceleration can recover. I don’t trust FSD, and therefore I can recover quickly.


I’ve been in a Model 3 that had a fender bender and a near head-on collision due to using FSD in just two days. I wouldn’t trust FSD myself for at least a few more years. A major breakthrough has to happen, because the current version doesn’t feel safe enough to be in public beta.


How many here believe today's robot cars are fully safe, and why? Because Tesla/Musk says so? He says a lot on Twitter and changes his mind constantly!

Personally I'd never trust him with my life and the lives of the other drivers surrounding me. He's a showman/car salesman.


I can only speak from my experience; I’ve driven over 100k miles with Autopilot (2.5 rev hardware, ‘18 Model S) over the last 4.5 years, sometimes at speeds upwards of 70mph-80mph through mountainous terrain in driving rain at night (both in Appalachia and the Rockies). Disengagement is rare, and I will purposely push the vehicle harder to find its limits. Fully safe? Hardly, nothing is risk free, but 38k people a year die in accidents caused by humans, so the bar is not perfection. It’s your average human, which is fairly terrible at the task. They get tired, they don’t pay attention, they drink too much and get behind the wheel, and so on.

https://www.tesladeaths.com/

Tesla Deaths Total as of 12/21/2022: 335

Tesla Autopilot Deaths Count: 19 (my note: this total death count is 1/5th the number of people who die daily [104] in human-caused auto accidents)

I trust the vehicles and Tesla engineers enough to put my family in them. The evidence is clear on how well they stand up in an accident, receiving the highest crash test marks both in the US and Europe (with the Model Y on the Vision-only stack receiving some of the highest Euro NCAP scores ever), so I can make that decision on data rather than on emotion about its leadership team. I really don’t care what Musk says, as long as my family is going to walk away from the worst accidents they might encounter, and the data shows they likely will.

> Euro NCAP Secretary General Michiel van Ratingen issued a statement praising the Model Y for its stellar safety scores. “Congratulations to Tesla for a truly outstanding, record-breaking Model Y rating. Tesla has shown that nothing but the best is good enough for them, and we hope to see them continue to aspire to that goal in the future,” van Ratingen remarked.

https://www.tesla.com/blog/model-s-receives-5-star-euro-ncap...

https://evannex.com/blogs/news/tree-falls-on-a-tesla-model-3...

https://www.teslarati.com/tesla-vision-model-y-highest-euro-...

(No experience with the “full self driving” software stack, so I apologize if your questions were scoped specifically to that constraint)


>sometimes at speeds upwards of 70mph-80mph through mountainous terrain in driving rain at night

You shouldn't be on the roads.


I must say I am a bit skeptical of your claim about driving 100K with autopilot at 70-80MPH on mountainous, rainy roads. That's not consistent with anything that has been reported about the capabilities of the device.

Note that 38K people dying a year in the US, against roughly 3.2 trillion vehicle miles driven annually, works out to about 1 death per 80 million miles. So obviously humans can't be that bad at driving.

Edit: why did you delete your reply?


good to hear that you are speeding in poor visibility conditions on mountain roads at night just so you can push the car to its limits


> Tesla Autopilot Deaths Count: 19 (my note: this total death count is 1/5th the number of people who die daily [104] in human-caused auto accidents)

Come on, you’re way too smart to make this comparison. The percentage of Autopilot-driven miles in the US per year rounds to zero.


I think you’re misunderstanding my assertion or I’m not communicating it properly. People will die behind the wheel, this is inevitable. You will not get to zero auto fatalities in our lifetime. 19 deaths for how many Tesla miles are driven autonomously, in your words, rounds down to zero. For comparison, GM’s ignition switch recall (which they knew was deadly for a decade), required a 30 million vehicle recall, resulted in 124 deaths, and a $3B loss in GM shareholder value.

https://en.wikipedia.org/wiki/General_Motors_ignition_switch...

~5 billion miles have been driven on Autopilot (and that growth rate accelerates as Tesla continues to ramp manufacturing), and if it was truly wildly dangerous, we’d be discussing far more than 19 deaths from 16 incidents. Check my math: assuming the above, that’s 1 death every 263,157,894 Autopilot miles driven.
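A quick back-of-envelope check of the rates being argued here, using the figures claimed in this thread plus one outside assumption (the ~3.2 trillion annual US vehicle miles is my own rough FHWA-style figure, not from the article):

```python
# Sanity check of the fatality-rate arithmetic in this thread.
# All inputs below are the thread's claimed/contested figures,
# except us_miles_per_year, which is an assumed ballpark.
autopilot_miles = 5_000_000_000        # ~5B Autopilot miles (claimed)
autopilot_deaths = 19                  # tesladeaths.com count (claimed)
us_deaths_per_year = 38_000            # US road deaths per year (claimed)
us_miles_per_year = 3_200_000_000_000  # assumed annual US vehicle miles

miles_per_ap_death = autopilot_miles / autopilot_deaths
miles_per_us_death = us_miles_per_year / us_deaths_per_year

print(f"{miles_per_ap_death:,.0f}")  # 263,157,895 miles per Autopilot death
print(f"{miles_per_us_death:,.0f}")  # 84,210,526 miles per US road death
```

On these (disputed) inputs, Autopilot's miles-per-death comes out roughly 3x the overall US figure, which is exactly the ratio the replies below are fighting over once selection effects (new cars, ideal conditions, disengagements) are considered.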

You are more likely to be killed by a shark attack or a falling coconut than Autopilot, statistically speaking.

https://www.tesladeaths.com/miles.html

https://electrek.co/2020/04/22/tesla-autopilot-data-3-billio...

(I appreciate the kind words by the way, thank you)


Autopilot is typically driven only in the newest cars by the richest people in ideal conditions. Heck, Autopilot disengages when it gets confused, which is often. And even with all of that, the fatality rate (at least the deaths we know of) is only a third of the overall U.S. rate on a per-mile basis. Autopilot is likely more dangerous than the same set of drivers on the same routes in the same Teslas with Autopilot disabled.


And yet, regulators across the world (US, Europe, Canada, China, Japan) allow its continued sale and use. Maybe they’re right and HN is wrong? How would you reconcile conservative regulators not taking action while participants here carry on about how unsafe it is? Inaction by one regulator might be explained, but all of them? Occam's razor leads us to a tolerance for the risk exposure, not a conspiracy.

It appears that it has to be proven dangerous rather than proven foolproof, since the driver is still responsible and assumes liability for its use, and that burden has not been met by critics.

Edit: I’m appealing, broadly, to transport safety professionals across various governments and continents (in aggregate) versus randos on an Internet forum. That seems like a pretty solid strategy from a risk perspective, as your risk math isn’t impacted by failures at a single regulatory body, nor by a layman who, without the professional body of domain knowledge, might be missing something critical.


> And yet, regulators across the world … allow its continued sale and use.

Well, for one thing, regulators generally don’t “allow sales” but disallow things for various reasons.

By this logic the continued sale and use of tobacco is perfectly safe because the regulators allow it.


I don't understand why you are appealing to authority while earlier pointing to the GM ignition switches that killed 124 people. If authorities are a valid judge of safety, why didn't they act on the GM ignition switches earlier?


> And yet, regulators across the world (US, Europe, Canada, China, Japan) allow its continued sale and use

The article is about Tesla FSD, not Autopilot (fancy word for cruise control and lane keeping).

I'm not aware of this feature being allowed for sale outside of North America, is this not the case?


The regulators probably don't have the expertise or access to sufficiently evaluate autopilot safety. It's not in Musk's interest to be transparent about the dangers. This is the guy who keeps cutting sensors to increase profits over safety.


> And yet, regulators across the world (US, Europe, Canada, China, Japan) allow its continued sale and use.

I wouldn't be so sure, with more and more agencies signaling their intent to escalate...


I guess you could trust Cruise ADS from GM, the same company that killed people with a faulty ignition switch (124 deaths) or literally burned down customers' homes (see the Chevy Bolt). You might find that NHTSA is investigating Cruise ADS too. (https://static.nhtsa.gov/odi/inv/2022/INOA-PE22014-4871.PDF)


Tesla stock taking another beating today ($126 right now), is that because of this article or is Elon selling again? Was Elon aware that this would hit the news last week? If so that might explain some of his stock sales.


These articles pop up regularly and don’t move the stock. It’s not related. There have also been times when it was reported that Autopilot caused an accident, but the NHTSA investigation revealed it wasn’t even engaged.


I'm not sure there's not at least some correlation. When there was lots of upwards momentum with TSLA, then certainly these kinds of articles didn't have much sway. But with everyone apparently looking for a reason to sell, news like this can certainly add to the pressure.

But in general, yes, TSLA has been taking a beating for the last few weeks.


The total deaths caused by Tesla AP/FSD is 1900% higher than the self-driving of every other automaker in the world combined.

The total number of accidents caused by Tesla AP/FSD is 2.5x higher than the company with the next worst self-driving system, and is more than 60x higher than Toyota's self-driving systems.

Anecdotally speaking, on LA freeways self-driving Teslas are easy to spot: they drive like drunk teenagers behind the wheel for the first time. My coworker has a Tesla and had so many disengagements on his daily commute (3+ each way) that he no longer uses the system.

Normalizing the data (i.e., scaling it up), Teslas are only safer than drunk drivers.


I’m not a fan of Tesla’s self-driving claims either (I specifically avoided a Tesla completely when buying this year), but total deaths seems like the least relevant piece of information that could be used here. It needs some sort of adjustment, at least deaths per mile or similar, before throwing up comparisons. As is, one can’t tell whether this is just like saying cities are ridiculously unsafe because more people die in them than in the rest of the entire world combined, or whether it’s actually that much worse.


The additional aspect is that other manufacturers do not exaggerate capabilities that much, and consequently their customers are less likely to engage the system in situations where it is about to perform badly.


I imagine Tesla has more than 19x the self-driving miles of every other automaker in the world combined too.


Yeah we need deaths per ownership or something.


Okay, here's a better comparison: Since 2015, Toyota has sold more than 18 million cars in the U.S., all of which have advanced cruise control functionality comparable to AP. Tesla has sold approximately 3 million vehicles worldwide since 2015.

Teslas using AP are more than 60x as likely to get into an accident compared to Toyotas. That's just from the U.S. data. It gets even worse, for Tesla, if you take into account Toyota's worldwide sales.

However you crunch the numbers, Tesla's advanced driving systems are objectively worse than any other car maker's competing systems.


> advanced cruise control functionality comparable to AP

No… not at all comparable. Not even close. Advanced cruise control doesn’t navigate you anywhere; it only assists a driver by keeping them centered in the lane or keeping pace with the car ahead of them. Tesla has this too, and it is not the same thing as Autopilot. You clearly have no idea what you are talking about.


I’m sure there’s no incentive for a driver who’s responsible for an eight-car crash to pass the buck.


You can’t even be responsible for an 8-car crash like this. Since they shifted lanes, they are responsible for the first collision, but the 6-7 cars after that are at fault themselves for tailgating.


It's really a travesty of how financial liability is handled in this country that your comment is not a falsehood.

If this were a normal negligence tort vs an automobile crash there would be a very strong case to be made that the driver who set the whole chain of events in motion is responsible for all of it under something akin to the eggshell skull rule.


No. The driver is responsible for keeping a speed and distance from the car ahead such that they can stop if necessary.

When you hit someone from behind, it is your fault, because you should have been driving more slowly or further back.

And that is also absolutely the correct way of evaluating it: the one in the back controls attention, speed and distance. The one in front does not.


>no. The driver is responsible to keep speed and distance from previous car such that he/she can stop if necessary.

And pretty much every state also has a law prohibiting drivers from doing things like panic stopping on the highway without a reason. The rules of the road have redundant responsibility baked in almost everywhere, on purpose.

We abridge a lot of normal rules and legal doctrine for handling civil liability when it comes to car stuff because we have to have quick rules of thumb to deal with the sheer volume of incidents. What you are arguing in favor of here is a very sloppy rule of thumb and such advocacy is likely because you have assessed yourself to be far more likely to be the source of a multi-car wreck than one of the people caught up in it.

Ask yourself, if you brake checked someone and caused a chain reaction would you be morally culpable? The overwhelming majority of the population would say yes. So why should you not be financially culpable?

Now ask yourself if instead of brake checking someone you were stopping because a child wandered into the road how would that change things.

On what grounds do you justify using the same allocation of liability for both these situations?


In this case there was a reason. The self driving system determined it was required for safety to stop. It wasn’t just to cause an accident. Had the other drivers been following the law and driving safely, they wouldn’t have crashed.


Presumably the same incentive as Tesla has.


And the article doesn’t state Tesla is making the opposing claim. If it were I’d be equally critical of that press report.

Or maybe to the folks here “some guy claims” is a valid source. Maybe I’m in the wrong lol


“Some guy claims”, is a valid source when the claim is credible. We can’t be sure it’s true of course, until we learn more, but it’s just silly to dismiss it out of hand.


Same? It’s way more liability for a single person to handle than for a large company.


If Tesla admits "FSD" can cause this accident, they admit that it can cause any Tesla accident. It could result in millions in fines and potentially much more in lost sales.


Publicly traded companies have fiduciary duty to protect their reputation.


Tesla should have logs that show whether the driver was responsible for the braking, if so.


My uncle works for an insurance company, and Teslas are among the absolute cheapest vehicles to insure.

I hear all these "Tesla hit pieces", but my understanding is that they are much safer than human drivers, and insurance reflects that.

I don't see the issue as Autopilot killing people; I see it as saving lives. But the insurance aspect, the AI aspect, and responsibility need to be sorted out.


This sounded off. I've driven my dad's Tesla, and given its combination of blazing speed and total silence, not to mention that Teslas are well within the high-end luxury category, it does not seem like insurance would be cheap. They are also expensive to repair if accidents occur.

Looked it up and found https://www.nerdwallet.com/article/insurance/tesla-insurance

"The high cost of a Tesla doesn’t stop at its price tag. Insurance for a Tesla tends to be significantly more expensive than that of other popular vehicles, despite high safety ratings."

"Still, rates can vary drastically, even among different Tesla models. Below are the average annual insurance rates by Tesla model, ranked from cheapest to most expensive:"

    Model Y ($2,040).

    Model 3 ($2,115).

    Model S ($3,008).

    Model X ($3,044).

That is not cheap at all. I own three cars at the moment, one of which is also a new (non-Tesla) EV and I pay $3600 /yr in total.


It looks like the cheapest rates per state are a fair bit cheaper. There might be other reasons besides safety, like repair cost and parts availability. How much damage a car causes (usually a function of weight and speed) is generally factored in, too.

Still high, though.


I stand corrected. 30% more than average.

I believe that must have been a value judgement then.

Would need the average car price as well to get a true understanding, but at best that's going to make it middle of the pack.


I can't believe people pay these carrying costs. Two Toyotas in CA is $600/yr for me, 100k/300k bodily, 100k property.


OT, but you are way underinsured, unless you think you are judgment-proof. At 100/300 your insurance company will pay the policy limit and you will be in court on your own.


Interesting, I have Geico and my understanding when I filled out the form was 100/300 was the max option, unless you meant needing umbrella.


Most Teslas (and most cars overall) are presumably financed, with the associated comprehensive coverage costs.


Mine includes comprehensive, but not collision. True, a big part of the cost must be collision on a $50k asset. I think that's a hidden cost for people: they don't realize they're paying so much more because they risk a $50k asset every time they drive and have to spread that risk, which is still hefty.


Tell us more about the insurance company


Geico


geez what insurance company is that


Geico


That's who I have! Haven't looked in a while to see what I'm paying for.


That's hilarious, because in the UK we have car insurance groups that anyone can check. They rank from 1-50, with 50 being the most expensive to insure, and 1 being the least expensive. Let's see how Tesla's entire range do:

- Roadster - Group 50

- Model 3 - Groups 48-50 depending on exact model

- Model S - Group 50 across all models

- Model X - Group 50 across all models

- Model Y - Groups 46-49 depending on exact model

Yeah, seems pretty bad to me!





