Countering CAN bus vulnerability

Physical security and cybersecurity are needed to protect avionics in modern small airplanes and helicopters.

Instruments and avionics in the cockpit of Northeast Helicopters' Robinson R44 helicopter at the Ellington, Connecticut, Airport. Instruments like these could be fed false information by a hacker exploiting the CAN bus vulnerability.
Images courtesy of Kelser Corp.

Poor visibility conditions force the pilot of a small aircraft to operate on an instrument flight plan, navigating and maintaining aircraft control by instrument readings alone. It’s OK – she’s trained for this. Suddenly, the plane crashes into an obstacle that should never have been a factor. The instruments were feeding her false readings, steering her directly into it.

It sounds like something out of an action movie – a similar scene plays out in Die Hard 2 when a plane crashes after villains modified the instrument landing system. In reality, the recent discovery of a cybersecurity vulnerability in the Controller Area Network (CAN bus) that allows instruments and avionics to communicate reveals that this type of scenario is technically possible. An attacker with unsupervised physical access to a plane could compromise the data it shows the pilot or alter its flight path.

Thankfully, this vulnerability has not been exploited – it’s simply been exposed by cybersecurity researchers. The risk is serious enough that the U.S. Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) issued an informational alert in July 2019. The CAN bus vulnerability raises important questions for the aerospace industry and highlights concepts central to a sound cybersecurity strategy.

CAN bus vulnerability

The CAN bus is essentially a network of wires that allows instruments and avionics in modern small airplanes and helicopters to share information without a central computer. On aircraft, the CAN bus is often routed through an easily accessible compartment to simplify maintenance.

In tests conducted by cybersecurity firm Rapid7, a small device plugged into the CAN bus could alter displayed information such as altitude, airspeed, and engine readings. It could even control or disable the autopilot.
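The underlying weakness is that classic CAN has no concept of sender authentication: a frame carries only an arbitration ID and up to 8 data bytes, and receivers trust whatever arrives. As a minimal sketch of why spoofing is possible, the hypothetical snippet below uses the open-source python-can library to put an arbitrary frame on a bus; the arbitration ID and payload are made up, since the IDs real avionics listen for are proprietary.

```python
# Minimal sketch: a classic CAN frame carries only an arbitration ID and
# up to 8 data bytes -- no field identifies or authenticates the sender,
# so receivers cannot tell a legitimate sensor from an imposter.
# Requires the open-source python-can package and a Linux SocketCAN
# interface. The arbitration ID and payload below are hypothetical.
import can

# Open the bus (channel name assumes a Linux SocketCAN setup).
bus = can.interface.Bus(channel="can0", bustype="socketcan")

# Any node can claim any ID. 0x123 is a made-up example; the IDs that
# drive real instrument displays are proprietary.
spoofed = can.Message(
    arbitration_id=0x123,
    data=[0x00, 0x10, 0x27, 0x00],  # arbitrary payload bytes
    is_extended_id=False,
)
bus.send(spoofed)
```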

A hacker would need specialized knowledge, careful planning, and a few minutes of unsupervised access to the aircraft – a high bar, especially since aircraft are typically stored in secure facilities. While it’s not something a run-of-the-mill hacker could pull off, the potential for destruction is great, and the worst-case outcome is more spectacular than hacking the automation systems in a car.

So far, the only fix for the vulnerability is keeping planes under lock and key. Savvy aviators would likely notice and compensate for bogus data showing up on their instruments. For instance, there is almost always an old-school, non-powered magnetic compass in the aircraft that would be unaffected by hacking of CAN bus-connected avionics. Many pilots also fly with extra navigation equipment that isn’t networked into the aircraft, such as an iPad running ForeFlight.

While flying under instrument conditions, a properly trained pilot would likely notice the discrepancy between the in-aircraft avionics and the non-networked, supplemental systems. There are, however, times during an instrument flight when the margin for error is very small, especially on a precision instrument approach into an airport with low visibility.

Cybersecurity oversight

Aside from finance, aerospace manufacturing is now among the most tightly regulated U.S. industries for cybersecurity. NIST 800-171’s strict and sweeping cybersecurity guidelines took effect Dec. 31, 2017. The guidelines apply to any manufacturer handling controlled unclassified information (CUI) anywhere in the federal government supply chain. NIST 800-171 spans 14 families of security requirements covering how networks are configured, how data is stored and tracked, and how employees are granted access.

Though the deadline has long passed, many smaller manufacturers are still getting up to speed, spurred by compliance requirements from the larger companies they supply. The guidelines are also so complex and thorough that achieving compliance can take months.

With increased focus on aerospace manufacturers protecting their environments and data, the CAN bus vulnerability is a reminder to keep cybersecurity top of mind when designing software and networked systems. Engineers must put themselves in hackers’ shoes and imagine how a given system could be exploited, now and in the future. With new Internet-of-Things (IoT) exploits coming to light regularly, it’s easy to see the danger of designing an open CAN bus system – a protocol that dates to the 1980s – even when no immediate threat is apparent. The best outcomes are achieved when security is part of the design spec, not an afterthought.

Physical cybersecurity

The CAN bus vulnerability illustrates that data must be secured both physically and electronically. A manufacturing facility’s physical security is covered in the NIST 800-171 cybersecurity guidelines. The flip side is that the physical security of aircraft, airports, and hangars is the only thing keeping the CAN bus vulnerability from being exploited. We would not be having this discussion if the CAN bus had a robust authentication mechanism that allowed it to communicate only with trusted devices.
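What might such an authentication mechanism look like? One common pattern – used, for example, in automotive schemes such as AUTOSAR SecOC – is to append a truncated message authentication code, computed with a key shared by legitimate devices, to each payload. The sketch below is illustrative only, not a deployed avionics protocol; the key handling, counter, and MAC length are all assumptions.

```python
# Illustrative sketch of authenticated payloads: the sender appends a
# truncated HMAC over the data plus a monotonic counter (the counter
# defeats replay of old frames). NOT a real avionics protocol -- key
# distribution, counter sync, and MAC length are assumptions here.
import hmac
import hashlib

SHARED_KEY = b"example-key-provisioned-at-install"  # hypothetical key
MAC_LEN = 3  # classic CAN payloads max out at 8 bytes, so the MAC is short

def protect(data: bytes, counter: int) -> bytes:
    """Return data + truncated MAC; data must leave room for the MAC."""
    assert len(data) + MAC_LEN <= 8, "classic CAN frame is 8 data bytes"
    msg = data + counter.to_bytes(4, "big")
    mac = hmac.new(SHARED_KEY, msg, hashlib.sha256).digest()[:MAC_LEN]
    return data + mac

def verify(payload: bytes, counter: int) -> bytes | None:
    """Return the data if the MAC checks out, else None (drop the frame)."""
    data, mac = payload[:-MAC_LEN], payload[-MAC_LEN:]
    expected = hmac.new(
        SHARED_KEY, data + counter.to_bytes(4, "big"), hashlib.sha256
    ).digest()[:MAC_LEN]
    return data if hmac.compare_digest(mac, expected) else None

# A receiver rejects frames from any device that lacks the shared key:
frame = protect(b"\x00\x10\x27", counter=42)
assert verify(frame, counter=42) == b"\x00\x10\x27"
assert verify(b"\x00\x10\x27" + b"bad", counter=42) is None
```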

Effective cybersecurity strategies have multiple layers. Physical security is often neglected: some companies fail to secure server closets or monitor facility access. Although the aerospace industry does this quite well, any cybersecurity system with a single layer is bound to fail eventually. Higher-security environments restrict physical access to their network and logically control which devices can connect to it.
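One concrete way to add a logical layer on top of physical security is a filtering gateway between bus segments that forwards only expected traffic. The sketch below is a simplified illustration, not a certified avionics design; the allowlisted IDs are made up, and a real gateway would also rate-limit traffic and log anomalies.

```python
# Simplified sketch of a gateway that enforces an allowlist: frames whose
# arbitration IDs are not expected on the protected segment are dropped.
# The IDs here are hypothetical examples, not real avionics identifiers.
ALLOWED_IDS = {0x100, 0x101, 0x2A0}  # made-up IDs for known instruments

def forward(frame_id: int, data: bytes) -> bool:
    """Return True if the frame may cross onto the protected bus segment."""
    if frame_id not in ALLOWED_IDS:
        # Unknown talker: drop the frame and flag it for investigation.
        return False
    return True

assert forward(0x100, b"\x01\x02")      # known instrument: forwarded
assert not forward(0x123, b"\xde\xad")  # unexpected ID: dropped
```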

In the long term, manufacturers of instruments and avionics connected to the CAN bus will likely need to add security features that limit how components interact. The challenge is going to be finding a way to make this change before a catastrophic cyberattack. Since the CAN bus vulnerability was announced in July, there hasn’t been much talk of designing components to compensate for it. The notion seems to be that planes are stored securely, alerts have been issued to pilots, and everything should be fine. I hope that’s the case. When it comes to cybersecurity, hoping for the best is often a way to get the worst.


About the author: Jonathan Stone is an instrument-rated commercial helicopter pilot and CTO of Kelser Corp., a Connecticut-based IT consulting firm. He can be reached at jstone@kelsercorp.com.
