The Liability Nightmare of Proprietary Software in Critical Systems
The Invisible Crisis: When Proprietary Software Controls Our Lives
In a stark address to the Software Sustainability and Compliance Lab, Columbia Law Professor Eben Moglen—co-author of the GPLv3 and founder of the Software Freedom Law Center—sounded the alarm on an escalating technological crisis. As software embeds itself into automobiles, medical implants, aircraft, and infrastructure, Moglen argues that proprietary code’s lack of transparency isn’t just inconvenient; it’s a lethal liability time bomb.
The Specter of Uninspectable Code
Moglen’s case rests on chilling realities:
- Automotive Disasters: Sudden unintended accelerations (as in Toyota’s infamous cases) are routinely blamed on drivers through "expert" testimony about pedal confusion, while the proprietary throttle-control software escapes independent scrutiny.
- Medical Perils: Implanted devices like pacemakers run software tested under protocols Moglen deems "inadequate" for even trivial applications, let alone life-critical systems.
- Aviation Catastrophes: Air France Flight 447 crashed into the Atlantic, potentially because its flight software received conflicting airspeed data after the pitot tubes iced over; 13 months later the cause remains unresolved, with the flight recorders lost in ocean depths. Regulators had no means of auditing the code before the disaster.
"Proprietary software is an unsafe building material. You can’t inspect it. You can’t assess its complex failure modes... If you were aware of a catastrophic failure mode, you couldn’t do anything about it."
— Eben Moglen
Regulatory Paralysis Meets Corporate Incentives
Even diligent agencies like the U.S. NHTSA (auto safety) and FDA (medical devices) face impossible hurdles. When investigating Toyota, NHTSA admitted it had no in-house engineers capable of analyzing embedded software and had to outsource the work to NASA. The FDA, meanwhile, delegates medical-device software testing to contractors under protocols that Moglen’s team will soon show to be dangerously insufficient.
Manufacturers compound the problem:
- Relying on third-party code covered only by limited liability indemnities
- Illegally incorporating software without clear provenance
- Prioritizing cost-cutting over catastrophic-risk prevention (echoing BP’s pre-disaster negligence)
Linus’s Law as Lifesaver
Moglen’s antidote is radical simplicity: Mandate inspectable, modifiable free software for all safety-critical systems. This leverages:
1. Transparency: Code auditable by regulators, researchers, and white-hat hackers worldwide.
2. Collective Vigilance: Harnessing "Linus’s Law" (given enough eyeballs, all bugs are shallow), so that a worldwide community can spot flaws proprietary vendors miss.
3. Rapid Response: Critical patches can be developed without vendor bottlenecks.
"If you tell everybody on Earth, ‘the software that could fail, killing your mother the next time she takes an airplane, is on the Web, you might want to have a look at it,’ there’s a remarkably high number of talented people who will do exactly that."
The Democratic Imperative
Current regulations often backfire—like the EU banning user-modifiable software in medical devices, pushing manufacturers toward secrecy and even GPL violations. Moglen urges societies to demand:
- Legal requirements for inspectable code in critical infrastructure
- Regulatory frameworks that embrace open collaboration, not proprietary black boxes
"We shouldn’t allow people to build black-box elevators," he notes wryly, referencing a recent Edinburgh hotel incident. Why tolerate them in pacemakers or aircraft?
The stakes transcend profit margins: When software governs physical safety, opacity equals ethical malpractice. As Moglen concludes: Freedom isn’t ideological—it’s a survival mechanism.
Source: Adapted from Eben Moglen’s 2010 lecture at the Software Sustainability and Compliance Lab.