There’s an unfolding story right now of how software killed nearly 400 souls over five months.
Its lessons speak to why we take a careful approach to TextBlade software.
TextBlade is all-new technology. It makes keyboards far smarter, far less work, and more powerful than ever before.
A big foundation that makes this possible is extensive software engineering.
There is perhaps 100X more software inside TextBlade than in any other real keyboard ever made.
The way we architected that software defines the user experience.
Software produces advances in ease of use, reduction of stress, and many new conveniences.
Done well, you never want to go back to legacy keyboards. It feels magical.
But done too casually, it would have risked frustrating users. We all hate autocorrect software when it overwrites a perfectly reasonable entry with something absurd. That is why TextBlade has no autocorrect agents at all.
Software run amok is much, much worse than no software at all. Which is why we validate so extensively with real, everyday use by our customers.
And it is also why we architect our software with checks and balances, self-monitoring, and auto-recovery to make it resilient, even against quirks in your computer that are totally outside TextBlade’s job.
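The self-monitoring and auto-recovery idea above can be illustrated with a small sketch. This is a generic supervisor pattern, not WayTools’ actual firmware; the function names here are hypothetical. A fragile operation runs under a monitor that detects a fault, restores a known-good state, and retries before giving up.

```python
# Illustrative sketch only (not TextBlade's actual code): a common
# self-monitoring / auto-recovery pattern wraps a fragile operation
# in a supervisor that detects failure, restores a known-good state,
# and retries a bounded number of times.

def supervise(operation, recover, max_attempts=3):
    """Run `operation`; on failure, call `recover` and retry.

    Returns the operation's result, or raises RuntimeError after
    `max_attempts` consecutive failures.
    """
    last_error = None
    for _ in range(max_attempts):
        try:
            return operation()              # the happy path
        except Exception as err:            # a check caught a fault
            last_error = err
            recover()                       # restore known-good state
    raise RuntimeError("unrecoverable after retries") from last_error
```

The key design point mirrors the text: the system assumes things outside its control can misbehave, so every attempt is checked, and recovery is automatic rather than left to the user.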
Your life is not at risk from a keyboard. It does not hold your body in mid air, 6 miles up.
But you rely on a keyboard every day to do what you do, more than you realize. So it’s wise to think through what could possibly go wrong, and carefully design that out, through the fundamental architecture itself.
For the Boeing jet however, not doing that right, actually killed people.
What went wrong?
The new jet has bigger, more efficient engines, placed farther forward, which make it more prone to stall. They compensated by adding software to automatically resist stalls.
They fast-tracked approval by keeping pilot training unchanged from their popular prior models, to compete with Airbus more quickly.
But the software took control away from the pilot. And it made crucial decisions relying on only a single sensor.
It violated a basic rule of robustly designed systems: never be vulnerable to a single-point failure, especially in life-safety applications.
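The design rule above has a classic remedy: triple-redundant sensors with majority voting, plus a manual override that always wins. The sketch below is purely illustrative (hypothetical names and thresholds, not actual avionics code), but it shows how one faulty sensor gets outvoted instead of trusted.

```python
# Hypothetical sketch (not Boeing's actual system): avoid a single-point
# sensor failure by reading three sensors, trusting only their majority,
# and never letting automation override the pilot.

def vote(readings, tolerance=2.5):
    """Return the median of three sensor readings, or None when the
    sensors disagree too much for any automated action to be trusted."""
    low, mid, high = sorted(readings)
    # The median is trustworthy only if it agrees with at least one
    # neighbor within tolerance; otherwise flag the data as unreliable.
    if (mid - low) <= tolerance or (high - mid) <= tolerance:
        return mid
    return None  # total disagreement: automation must stand down

def auto_trim_command(readings, pilot_override, stall_angle=15.0):
    """Decide whether automated nose-down trim may engage."""
    if pilot_override:
        return "PILOT_IN_CONTROL"       # manual override always wins
    angle = vote(readings)
    if angle is None:
        return "AUTOMATION_DISENGAGED"  # unreliable data: do nothing
    return "TRIM_NOSE_DOWN" if angle > stall_angle else "NO_ACTION"
```

With this structure, a single sensor stuck at a false high reading is simply outvoted by the two healthy ones, and the pilot can always take back control; both properties were missing from the system the article describes.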
So when the sensor gave a false reading, that software trusted the sensor more than the pilot.
It put the jet into a hard dive the pilots could not overcome, forcing the plane into a high-speed nose dive straight into the ground, blowing a large crater, and killing everyone instantly.
This was not some weird fluke event. It was a fundamental, systemic error in design.
There was organizational failure at multiple levels in Boeing. The engineers at Boeing are plenty smart enough to understand the design rules of redundancy and manual override for any system failure. But commercial pressure led managers to overrule the engineers and push the product out faster.
Boeing is not a small company, and we have all flown in their trusted, excellent jets for half a century.
But size won’t save you. Hubris is how even the mighty fall.
When building great change, it pays to stay humble.
Below is an article from the Wall Street Journal that explains what happened to the jet.
Boeing needed the launch of its 737 MAX to go quickly and smoothly. This is how it went wrong.