
While this is very encouraging, the TC article is also very misleading. For example: "There have, of course been some accidents that involved Google’s self-driving cars in the past. All of these, however, happened while humans were in control of the cars." - isn't this far more likely to be the case, since the human only takes control when the car can't figure it out?

Also, whose commute doesn't involve temporary construction signs?




Good points. Without any statistics about the frequency and nature of situations where the human driver still has to intervene, numbers like '300K miles without an accident' are meaningless. For all I know the Google cars might have had a near-accident every 10 miles if it weren't for the human driver who's always there to save the AI when it fails. The fact that Google never discloses anything besides mile counts, and only lets outsiders take a ride in the Google cars under extremely controlled conditions, is quite telling. Statements about Google car achievements like this are not much more than propaganda to me, until Google starts to share actual in-depth details about the performance of the cars. If it really works as well as they claim it does, what do they have to lose by disclosing everything?

Myself, I'm extremely skeptical about self-driving cars, and I've never understood why I seem to be massively outnumbered by people who are utterly convinced it is possible to create AI that can handle the billions of different and sometimes completely arbitrary traffic situations you'll encounter while driving on current road infrastructure. Sure enough it will work great 90% of the time in some places, but that's not enough. It needs to work 99% of the time everywhere. Simply looking at the achievements made in AI in the last 50 years or something doesn't really warrant a whole lot of optimism we'll get there anytime soon.


Would you be ok with a computer that augments your own driving (similar to those cars that automatically brake when they sense an accident)? If so, then it's just a matter of continuously improving on this augmentation.

Even if it never reaches the level where you can truly step away from the wheel while the computer drives 100 miles in rush hour traffic, attaching an AI to help avoid accidents in many common situations still sounds revolutionary. I foresee something similar to a plane's autopilot -- where it greatly reduces user error yet sometimes still needs a human to take over when it doesn't know what to do.


That's exactly what I think the future will be: AI-assisted cars, not AI-driven cars. But I don't think it will progress beyond that, for the simple reason that I think it's all or nothing. You cannot build a car that drives itself sometimes but still needs someone to take the wheel every now and then. People will stop paying attention to driving, which will inevitably lead to accidents when they have to take over from the AI. It more or less defeats the purpose of a 'car that drives itself' anyway.

Don't get me wrong by the way, I'm not 'against self-driving cars' or anything, I simply think it can't be done. Not unless we replace all road infrastructure with something that supports autonomous vehicles and introduce some form of centralized traffic control, i.e. something more like air traffic control.


We are being promised a self-driving car. Either it drives itself, or it doesn't. If it doesn't, it's not what it claims. And if it does, what are the driving aids for?! That's like wearing a hat indoors.


Who promised you that? To me it's not a binary yes/no on whether this project is successful. Even a marginal reduction in the number of accidents will save lots of money and lives -- even if you still end up behind the wheel helping it along.



> All of these, however, happened while humans were in control of the cars." - isn't this far more likely to be the case, since the human only takes control when the car can't figure it out?

That's a really good point, although I'd wager that most catastrophic accidents (e.g. crashes at highway speeds and T-bones at left-hand turns) happen when the driver is lulled into complacency by normal conditions.



