
I agree completely, but on the other hand I'm not convinced that tighter government regulation of ECU code would be better. Can a bunch of government bureaucrats come up with a set of standards and regulations that would actually be beneficial? Given the track record with similar projects, it looks doubtful.

Really I'd say that part of the problem here is that academia has been letting us down. CS programs are, in my opinion, universally of fairly low quality, and proper software engineering programs are very rare. There has been insufficient pure research into software development practices, design patterns, and language features, particularly into what is required, and what is beneficial, when creating control software and firmware. Industry too has been letting us down with its lack of pure research in general, but that's been obvious for a while.

We're starting to reap what we've been sowing for the last several decades in software engineering. We got out of the first "software crisis", where many software projects didn't even deliver anything worthwhile or functional, but now we are in another, perhaps even more severe, software crisis. One where shipping software that "works" isn't a problem, but where making sure that it does the "right thing" and is sufficiently secure, robust, etc. for the intended use is becoming a huge issue. And not just a financial issue, but one that can result (and has resulted, and will again) in injury, death, and destruction. We very much need to wake up to the seriousness of this problem; it's not going to get better without concerted efforts to fix it.



I develop safety critical software for railway applications. We have to follow some ISO standards that contain some sensible rules. For example, code reviews are mandatory, we need to have 100% test coverage, the person who writes the tests must be different from the person who writes the code, and so on. This leads to reasonably good code.

It also makes some things a lot more difficult. For example the compiler must be certified by a government authority. This means we're stuck with a compiler nobody ever heard of that contains known (and unknown) bugs that can't be fixed because that would mean losing the certification.

I assume the car industry has a similar set of rules and the problem is not a lack of rules, but a lack of enforcement.


> We have to follow some ISO norms that contain some sensible rules. For example, code reviews are mandatory, we need to have 100% test coverage, the person who writes the tests must be different from the person who writes the code etc.

The exact same thing happens in the car industry.

> I assume the car industry has a similar set of rules and the problem is not a lack of rules, but a lack of enforcement.

Bingo! Right now I'm staring at some ECU code (not safety relevant, thankfully) that looks like it was written by a monkey, but I'm a new addition to the team, I have no authority here yet, and we have to ship it, like, yesterday.

Guess what will happen.

Truth be told, for safety relevant applications, I've seen the code and it's quite good. And the issue in this case is not that the software was badly built, it's that it was built with deceit in mind.


> some ECU code (not safety relevant, thankfully) //

What parts of the running of an automobile engine aren't safety relevant?

Sounds like "oh, we made the stock for that shotgun from cheap, brittle plastic since the stock isn't safety relevant; how were we to know that it would crack and embed itself in someone's shoulder?".

You're right that the primary issue here is deceit, but the issue with closed source code in such systems is how that deceit was possible [edit: should probably say "facilitated that deceit", as the deceit would still be possible with open source, just harder and more discoverable]. And that leads to questions of safety: if companies will screw over the environment in defiance of democratic legislation, they're unlikely to be mindful of other morally negative consequences.


Infotainment, air conditioning, etc. There are many, many more ECUs in a car than just the one in the engine.


When you have an organizational culture that places meeting deadlines without sufficient planning or resourcing above quality and safety ... the result is inevitable.


I work in a different industry. We have to follow some sensible rules: code reviews, 80% minimum test coverage. What happens in practice is that the tests verify that the result is not null (and nothing else), and the code reviews pass... God knows how. I have seen methods with a cyclomatic complexity of 65 and methods a few hundred lines long. Oh, and this is in Python - the Java code is worse.

[I was also told by my team leader "no, you can't fix that code, it belongs to X from the main office and he will get angry and not help us anymore".]


> This means we're stuck with a compiler nobody ever heard of that contains known (and unknown) bugs that can't be fixed because that would mean losing the certification.

This is why regulators should embrace formal methods as an alternative to process-heavy regulation. Formal methods actually measure ground truth, and today they are not that much more expensive once you account for all the costs of the certification process.


... or highly intensive 8-years-of-in-service-operation-equivalent testing at the system level ...


System-level testing doesn't always suffice; see the Toyota UA case.

Or, more topically, see the VW case for examples of why testing "in-service-operation-equivalent" requires a certain level of trust that's not ideal in a regulatory relationship.


Governments just need to regulate one thing on the ECU: access to the code. They don't have to make any specific laws. Of course, to be able to prove that this really is the code the ECU runs, one would need to be able to compile it and upload it to the ECU.

Which raises another issue: people breaking the law by changing their ECU map so the engine emits more pollutants.

But I suppose this is no more of an issue than it was/is with WiFi access points: not a lot of people do it, and you don't want to brick your car :D


I'd say there are some successful efforts to regulate software in safety critical areas. The FAA comes to mind. I worked in the avionics industry for a while, and there are strict standards to which flight management and avionics display software must adhere. The DO-178 family of documents defines these standards/guidance/whatever. As a young engineer at the time, I remember two of the things we were not allowed to do under DO-178B: pointer math of any kind, and dynamic memory allocation.

These standards have been around a long time too.

https://en.wikipedia.org/wiki/DO-178B


I find your comment about CS programs a bit misplaced. Are you aware that a research team from WVU actually uncovered this emissions problem in the first place?

The "VW diesel-gate" aside, I do share your feelings about the quality of CS programs in general. There is nowhere near enough education about real-time systems, high-reliability systems, and formal verification methods. All of these topics are completely appropriate academic material, in addition to being fundamentally useful for business needs. I'm not sure any of these topics are covered in the usual undergraduate curriculum.


Auto makers have crash tests and even pay private independent companies for it, AFAIR. Let's create a software certification entity that gives stars and stuff that auto makers can display in their ads.


What about mandatory "preferred form for modification" source code releases and mandatory bug bounties?

That is, if anyone finds a bug impacting safety in the ECU code, the manufacturer has to pay $1 million to them.

If any employee shows that the company released obfuscated source code, or that the shipped binary is not compiled from that source code, they get a $100 million reward paid by the company, and criminal charges are filed against those responsible.



