> but I really want to see apples-to-apples comparison.
I'm in Sweden, and during the winter months the sun shining directly into your eyes from barely above the horizon, while a wet or snow-covered road reflects that sun back at you, is a regular occurrence. I doubt Tesla's cameras will be able to see anything.
This is the reason why a single camera alone cannot be the sole source of information for a self-driving system. The camera technology currently available does not capture a high enough dynamic range to see details in dark areas of the scene when the Sun is in the frame. You could use multiple cameras, each with a different sensitivity to light, and combine them, but it's going to be very difficult.
I really don't see what's difficult. You don't even need multiple cameras: you can simply use very short exposures and combine several short-exposure shots into a longer-exposure one when needed. Multiple cameras are useful for handling glare, though.
Why would it be very difficult? You can split the light beam after the lens and send it to two sensors with different diaphragm settings or sensitivities. You'd then synthesize a perfectly aligned HDR picture.
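To make the short-exposure idea concrete, here is a minimal sketch (all names and numbers are made up for illustration, not any real camera pipeline): each frame is normalized by its exposure time so all frames estimate the same scene radiance, clipped pixels are ignored, and the valid values are averaged:

```python
import numpy as np

def merge_exposures(frames, exposure_times, saturation=0.95):
    """Merge short-exposure frames (linear sensor values in [0, 1])
    into one HDR radiance estimate per pixel."""
    frames = [np.asarray(f, dtype=np.float64) for f in frames]
    num = np.zeros_like(frames[0])
    den = np.zeros_like(frames[0])
    for frame, t in zip(frames, exposure_times):
        valid = frame < saturation              # ignore blown-out pixels
        num += np.where(valid, frame / t, 0.0)  # normalize by exposure time
        den += valid.astype(np.float64)
    # Pixels saturated in every frame: fall back to the shortest exposure.
    shortest = frames[np.argmin(exposure_times)] / min(exposure_times)
    return np.where(den > 0, num / np.maximum(den, 1), shortest)

# Two "shots" of the same scene: the bright region clips the long exposure,
# the dark region is barely registered by the short one.
radiance = np.array([0.02, 0.5, 40.0])     # true scene radiance
short = np.clip(radiance * 0.01, 0, 1)     # 1/100 s exposure
long_ = np.clip(radiance * 0.5, 0, 1)      # 1/2 s exposure
hdr = merge_exposures([short, long_], [0.01, 0.5])  # recovers all three
```

The catch in practice is motion: the scene moves between exposures at highway speed, which is one argument for the beam-splitting approach, since both sensors see the exact same instant.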
It's because Tesla cars are regularly causing "phantom braking" events.
Tesla is trapped between a rock and a hard place. Their "phantom braking" events are causing a lot of dismay to their drivers (https://electrek.co/2021/11/15/tesla-serious-phantom-braking...). But if they reduce phantom-braking, they increase the chance of hitting that child on the road.
Elon claims that the radar was the primary source of phantom braking. He said that matching up a high fidelity sensor (the cameras) with a lower fidelity sensor (the radar) was proving near impossible. I also suspect the supply chain pains massively factored into his decision to remove the radar from all vehicles since roughly late January of last year.
Anyone in the car industry would know this is obviously false, right? Radar-based emergency braking has been available, and has worked really well, in many cars for 5+ years.
Radar was removed in May 2021, which predates the article I quoted by multiple months.
I'm sure Elon was blaming Radar for phantom braking in the April / May time period. We can give a few months for the cars to update to the newest version as well.
But by November 2021, RADAR was no longer a relevant excuse. I think you may be mistaken about when Elon said what and when. You gotta keep the dates in mind.
Respectfully, you're incorrect on the date of the Tesla Vision-only hardware release. My wife got a Model Y in early Feb 2021 and it was in the first batch of Tesla Vision vehicles that did not ship with a radar. It was manufactured in January, as that's when we got the VIN. This is first-hand experience, not hearsay. Elon announced it after they'd been shipping those vehicles for a bit. I was both amused and surprised. She was pissed off that Autopilot was nerfed compared to my 2018 Model 3 for max speed while they worked out bugs in the Tesla Vision branch of the code.
I also never said a date about when Elon said those things in my comment, but now understand what you mean about post-vision. But the FSD Beta and Autopilot codebases are so different I am not sure I’d compare them for phantom braking (though recent FSD Beta appears to have way less of this occurrence).
But maybe I’m biased. We have two Teslas, one with, and one without a radar. We’ve seen much more phantom braking with my radar equipped model 3. Anecdotally, I find it happening less in the Y. Also, I didn’t click the article originally as Fred is a click diva and generally disliked by the Tesla community for his questionable reporting. Electrek is an EV fan blog, not much else.
WashPo reports a huge spike of federal complaints from Tesla owners starting in Oct 2021, well into the Vision-only era of Tesla's technology.
These are some pretty respectable sources. Federal complaints are public.
> “We primarily drove the car on two-lane highways, which is where the issues would show themselves consistently,” he said in an email. “Although my 2017 Model X has phantom braked before, it is very rare, the vision-based system released May 2021 is night and day. We were seeing this behavior every day.”
So we have Electrek, Washington Post, and the official NHTSA Federal registry in agreement over these phantom braking events spiking in October / November timeframe of 2021. I don't think this is an issue you can brush off with anecdotal evidence or anti-website kind of logic.
That’s totally fair. I’m not pretending it isn’t a problem. Phantom braking is scary as hell when you’re on the highway. I misread your comment on the date and think that’s the thing you really focused on, when I didn’t. You’re right. This is a serious problem.
Tesla partnered with Luminar by the way and even tested their LiDAR on a model 3 last year. I guess they weren't impressed though, since they seem to still be all-in on passive optical recognition.
> I guess they weren't impressed though, since they seem to still be all-in on passive optical recognition.
That's one take. The other take is that they have been selling cars claiming the hardware is capable of full self-driving without lidar, and have been selling FSD as a $5k bolt-on, so swapping to lidar at this point would be a PR nightmare even if it were the better solution...
That's the cynical view, though... (Although I also wouldn't want to be the one to tell people who have spent lots of money on Autopilot that they bought total vaporware, or be the CFO who announces they are retrofitting lidar sensors.) Once you are all-in on 'lidar is shit', it becomes hard to reverse course, despite rapidly falling costs.
>Once you are all-in on 'lidar is shit' it makes it hard to reverse the trend
It can be done, if there's good cause. Just partner with your lidar OEM of choice, get them to do a white paper about how the latest point release of the hardware or firmware is "revolutionary!", and then claim that your earlier criticisms of lidar have been fully addressed by the groundbreaking new lidar tech.
I've actually been suspecting this will happen once solid state LIDAR technology crosses a certain threshold.
Traditional old-school LIDAR units with spinning scan heads are why quite a few self-driving cars have those odd bumps and protrusions on them. It's easy to see someone who wants to make a "cool car" looking at those protrusions, deciding "lidar is shit," and doing everything possible to avoid it, and there are some legitimate engineering reasons to avoid traditional lidar units. Meanwhile, solid-state LIDAR tech has only been on the market for a few years and is still quite expensive compared to traditional LIDAR models, but it's definitely superior for a lot of places people want to use LIDAR, or where LIDAR would be an excellent competitor to technology currently in use such as 3D depth mapping and time-of-flight cameras. I briefly looked into some of this stuff when considering work on an "art game" using VR and various 3D scanning technologies to fake an Augmented Reality experience as part of the project's deliberate aesthetic choices.
Solid-state LIDAR will definitely be pushed forward by market demand for wider fields of view, lower costs, and smaller module sizes. All of which will eventually lead to a situation where it would be stupid not to augment the self-driving stack with it, given the massive benefits and essentially zero downsides.
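For what it's worth, the ranging principle shared by lidar and time-of-flight cameras is just round-trip timing of a light pulse; a toy sketch (the function name is mine, not any real API):

```python
# Speed of light in vacuum, m/s
C = 299_792_458.0

def tof_distance_m(round_trip_seconds: float) -> float:
    """Target distance from the round-trip time of an emitted light pulse.

    The pulse travels out and back, so the one-way distance is half
    the total path length.
    """
    return C * round_trip_seconds / 2.0

# A return after 100 ns puts the target roughly 15 m away.
d = tof_distance_m(100e-9)
```

Resolving centimeters therefore means resolving tens of picoseconds of timing, which is part of why these sensors were historically expensive.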
One way out of the LIDAR PR dead end would be for Tesla:
1.) When solid state LIDAR is ready, re-brand it as something like "SSL technology" (Solid State LIDAR) and put it on new high-end Teslas.
2.) Wait for all 'camera only' enabled Teslas with FSD beta to age out of service and upsell the owners on a heavily discounted FSD subscription for their brand new Teslas with SSL.
A third path would be to present the addition of solid-state LiDAR purely as an enhancement to the existing cameras, pitching it as a camera upgrade rather than a new, separate sensor.
That's straight out of Apple's playbook. I recall how Tim Apple ridiculed OLED displays until they became impossible to ignore. So I guess it can be done.
> The accusations could be valid or totally baseless
Read the listed report. All 11 accidents were confirmed to involve:
1. Tesla vehicles,
2. operating on Autopilot / Full Self-Driving,
3. striking a stopped emergency vehicle with flashing lights or road flares.
These facts are not in dispute. The accusations aren't "baseless", the only question remaining is "how widespread" is this phenomenon.
These 11 accidents have resulted in 1 fatality and 11 injuries.
--------
We are _WAY_ past debating the "validity" of the claims. We're at "let's set up demos at CES to market ourselves using Tesla as a comparison point," because Tesla is provably that unreliable at stopping in these conditions.
EDIT: Luminar's car is on the other lane, and there's also a balloon-child in the Luminar's lane. You can see Luminar's car clearly stop in the head-to-head test.
There's also the "advanced" test, where the kid moves out from behind an obstacle here. Luminar's tech does well:
https://twitter.com/PatrickMoorhead/status/14787645152609116...
> I would expect Tesla to also stop if a child was running across the movement path in broad daylight.
Nope.
https://jalopnik.com/this-clip-of-a-tesla-model-3-failing-an...
https://www.latimes.com/business/story/2019-09-03/tesla-was-...
This "tech" can't even see a firetruck in broad daylight. Why do you think it can see a child?
This isn't a one-off freak accident either. "crashing into stopped emergency vehicles with flashing lights in broad daylight" is common enough that NHTSA has opened up an investigation into this rather specific effect: https://static.nhtsa.gov/odi/inv/2021/INOA-PE21020-1893.PDF