
There were a bunch of articles a couple of days ago about how emergency responders are having increasing trouble with Waymo and Cruise. The root issue seems to be that the cars can't handle exceptional situations on their own, and there aren't enough human operators to deal with them quickly enough.



Cruise and Waymo have slightly inconvenienced emergency responders. Tesla crashed into a firetruck and killed someone.


How? FSD means being constantly alert and ready to take over.

If the driver didn't do this, then he killed someone, not the car. It's abundantly clear to anyone opting into the beta.


How then is that FULL self driving? Cruise and Waymo cars have driven millions of miles with nobody in the driver's seat and have avoided killing anyone. They don't require a human constantly alert and ready to take over. That is full self driving.


Legal liability is not the same as causation. If a Tesla using FSD crashes into something, then the Tesla software crashed into something.




