Safety or security? An embedded wake-up call from Toyota's misfortunes

It started with the Camry that allegedly “surged” with no apparent accelerator input from the driver. Before long, the media reported customer complaints about many Toyota products, from Corollas to the vanguard Prius and even some Lexus models. As of press time, the only things certain about Toyota’s drive-by-wire vehicles are that the Feds are poking around, Toyota has lost face (and revenue), and digital electronics in vehicles need to be regarded the same way as the avionics controlling a Boeing 777 at 35,000 feet. Safety-critical systems – which certainly ought to include automobiles – demand special attention from embedded developers.

With so much software controlling life’s everyday conveniences like cars, cell phones, medical monitors, and the world’s banking systems atop the Internet, have we yet come to grips with embedded security? Worrying about the latest DDoS attack on a server is commonplace – just ask the political leadership of Estonia how its banking system was shut down by foreign hackers. But we should be particularly terrified of what a bad actor could do with access to your life via your iPhone – or your Toyota.

According to Robert Dewar, Emeritus Professor of Computer Science at New York University and president of AdaCore, there’s a big difference between safety and security. A secure embedded system must provide assurance that it meets the security constraints of an established profile, while a safety-critical system must always work as intended. For example, the NSA’s Tokeneer Project established the requirements for a secure biometric door entry system and used COTS software with formal methods to establish and prove compliance with the security profile (for details, including all the code written in SPARK Ada, see: www.adacore.com/tokeneer).

In the defense industry, myriad standards exist for developing secure software and systems, including separation kernels in operating systems, Multiple Independent Levels of Security (MILS), and time- and space-partitioned environments per ARINC 653, to name a few. But are these ever used in commercial applications? Doubtful.

Similarly, for safety-critical systems such as avionics, the FAA-recognized DO-178B (software) and DO-254 (hardware) standards establish criticality levels and rigorous verification objectives intended to ensure that software and systems behave predictably. It also follows that every safety-critical system must be secure, because an attacker who gains access could compromise safety.

Safety and security are both hugely complicated subjects and ones we routinely cover in our sister magazine Military Embedded Systems. Yet ample standards, resources, and tools – such as static analysis and code verification suites – are available to improve tomorrow’s embedded systems. With Intel’s goal of “15 billion mobile Internet devices” by 2015 – including those used in automobiles – how much longer can we ignore the issues of security and safety?

Chris A. Ciufo, Group Editorial Director cciufo@opensystemsmedia.com