
VW could have given the EPA the complete source code for its system, and it would likely have made no difference.

The main thing preventing inspect-ability isn't lack of access to the code. It's the incredible complexity of the software. Even barring deliberate attempts at code obfuscation, it would be prohibitively time-consuming and expensive for the EPA to gain any sort of understanding of a codebase of this size.

Comparing a complex software system to an elevator is absurd.



"The main thing preventing inspect-ability isn't lack of access to the code."

No, I'm pretty sure you can't inspect code that you don't have access to.

I get what you are saying about complexity, but I don't think that's an argument for prohibiting inspection altogether. This story directly illustrates that, at least for this particular car, there was a group of people willing and able to inspect at least one particular part of the code, if only they had access to it.


>This story directly illustrates that, at least for this particular car, there was a group of people willing and able to inspect at least one particular part of the code, if only they had access to it.

How does it illustrate that?

They discovered the scam by doing better emissions testing, not by reverse engineering the code.


That's true, but there then followed a long stretch (18 months?) in which VW denied any wrongdoing, and nobody could really prove them wrong. If the source code were available, the investigators could have quickly gone from "this looks wrong" to "hey, check out this really sketchy source code!" It might also have proven that it was deliberate, rather than a complicated accident.

Also, I have a hunch that they wouldn't have done it in the first place. If you know your code is going to be public that's an incentive not to do bad things, even if there's a chance nobody will ever read it.


>Also, I have a hunch that they wouldn't have done it in the first place.

That's a good point. But I see no practical way to enforce that the source code that has been published is actually the same as what's running on the car. Not without prohibitively slowing the pace of development.

> the investigators could have quickly gone from "this looks wrong" to "hey, check out this really sketchy source code!"

I really don't think this is true. Not quickly. The investigators would have to be a big team of expensive code-audit experts to achieve this. And even then, if the authors wanted to, they could easily obfuscate the code to the point of making the cheat effectively unfindable.

Apple can't even guarantee that the apps it's auditing for inclusion on the App Store are malware-free. And that's Apple. And apps are relatively simple compared with the codebase for a modern automobile.

The real way to fix this is just to have better and harder-to-defeat testing procedures.


>>I see no practical way to enforce that the source code that has been published is actually the same as that running on the car.

I was going to say that you could dump the code from a random vehicle and compare the hashes. But I suppose that would require having the entire toolchain to go from source to shipped code.
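The hash comparison above can be sketched in a few lines. This is a minimal illustration, not a real verification procedure: the file names are hypothetical, and as noted it only works if a bit-identical image can be rebuilt from the published source (reproducible builds), which real toolchains often don't guarantee.

```python
import hashlib

def sha256_of(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical inputs: a firmware image dumped from a random vehicle's ECU,
# and an image rebuilt from the published source with the vendor's toolchain.
#
#   dumped  = sha256_of("ecu_dump.bin")
#   rebuilt = sha256_of("rebuilt_from_source.bin")
#   assert dumped == rebuilt  # fails unless the build is bit-for-bit reproducible
```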

>>Not quickly.

Perhaps I should've said "more quickly." You've got a point that it's difficult to verify that code doesn't do anything bad. However, it's relatively easy to start from the notion that it's doing one particular bad thing and then find the code that does it, which would've been the case here.
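To illustrate why a targeted search is easier than open-ended auditing: if investigators already suspect "the car behaves differently on the test stand," they can hunt for exactly that branch. This is a purely hypothetical sketch of what such defeat-device logic might look like; every name, threshold, and sensor here is invented for illustration, not taken from VW's actual code.

```python
# Hypothetical defeat-device logic: the controller infers a dynamometer
# test from sensor inputs and switches emission-control strategies.

def on_test_stand(steering_angle_deg, wheel_speed_kph, duration_s):
    """Guess the car is on a dynamometer: wheels turning for a long
    stretch while the steering wheel never moves (invented heuristic)."""
    return wheel_speed_kph > 0 and steering_angle_deg == 0 and duration_s > 600

def select_emissions_map(steering_angle_deg, wheel_speed_kph, duration_s):
    # An auditor hunting for exactly this behavior could search for the
    # branch that keys emissions strategy off test-stand detection.
    if on_test_stand(steering_angle_deg, wheel_speed_kph, duration_s):
        return "low_nox_map"      # clean calibration during the test
    return "performance_map"      # normal calibration on the road
```

The point is that the suspicious condition and the branch it guards sit right next to each other; starting from a concrete hypothesis, even a large codebase can be searched for code of this shape far faster than it can be audited for all possible bad behavior.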


What I mean is that changing that variable would not have much effect.


> Comparing a complex software system to an elevator is absurd.

No it's not, it's a metaphor, a simplification for the purpose of illustrating a point. It's true that many software systems are complex, but it's still important that they be inspectable.



