Companies can build software any way they want
There are currently no software safety laws. Yet software is everywhere today, and even though it's invisible, it is making decisions on your behalf.
There is no problem, per se, with software running machines and making decisions. The problem is that no regulations direct companies to build safe software. None. There are "coding standards," but these are voluntary guidelines, generally defined over time by software developers or by industry groups. Again: all voluntary.
Here at Glitch Watch, we capture and reference media reports of software problems, including auto-industry recalls in which people were killed, one by one, because a piece of software didn't work.
One of the deadliest software glitches took the lives of at least 89 Americans, while the company responsible for the faulty software used its financial and media power to insist that there was nothing wrong with its vehicles, blaming driver error instead.
It took the enormous willpower of the dead drivers' loved ones, fighting the multinational automobile company for years, before the truth was finally revealed.
Automobile software: companies test, but who’s making sure the tests go well?
Certainly, automobile companies test their software. In the business world, though, testing is always the "poor cousin": testing time is the first thing to get cut. And even when car companies think they've got it right and go out of their way to show the media how well their software works... well, things go wrong. Very wrong.
Take a look at what happened to both Volvo and Mercedes when they wanted to show that their automated braking system was ready for the world to see.
And, by the way, Volvo and Mercedes are two companies that actually DO have safety-focused corporate cultures. Imagine what's happening at car companies that DON'T fixate on safety and quality.
Now that Google wants to have cars that drive themselves
. . . we must ask: Who confirms that these cars have been properly tested, and who will be responsible, truly accountable, when the software fails?
In a democratic society, citizens have a right to expect that technology is built safely.
History repeating itself
When electricity was first introduced, people were regularly electrocuted because there were no regulations governing electrical standards. One jurisdiction after another passed laws requiring adherence to safety standards. Today it is extraordinarily rare for someone to be electrocuted while doing an everyday household chore.
Similarly, when those new-fangled "horseless carriages" became increasingly popular, there were no regulations about how, where, and when people could sell gasoline. In those early days, many people were burned alive simply while filling up a tank, sometimes with a cigarette in their mouth, sometimes because gasoline was poured in with a funnel from an open bucket.
Regulations were passed that protected people from the dangerous ways that gasoline was handled in the early part of the 20th century.
We are now in the 21st century. People doing ordinary, everyday things, like driving with their children in the car, are finding that software makes the car go out of control, stops the brakes from working, causes the doors to pop open, disables the airbags, or dims the headlights.
Software safety regulations are needed to prevent this from happening.
Software is the invisible killer of the 21st century. Let's work together to change that. Let's help our politicians and regulators understand that just because software is invisible to most people doesn't mean it's impossible to make it a safe and reliable tool.