Until Consumer Watchdog pressed Google at a shareholders’ meeting, whenever a robot car crashed on public roads, Google would say very little or claim that it was the fault of the other driver.
But then a 13th crash involving a Google robot car occurred. Now the company says it will provide online updates for each of its crashes, but only a summary of each.
Google recently provided a summary report of its first 12 crashes but would not give either the Associated Press or Consumer Watchdog the detailed report findings.
“Crash reports are essential to understanding how the robot cars interact with human drivers, which likely will be the biggest challenge the vehicles will face,” Consumer Watchdog said. “In most of the crashes the Google robot cars were rear-ended. That could mean that the [robot] vehicles tend to stop more quickly than human drivers expect.” — Consumerwatchdog.org
The Associated Press also requested a full report from the California Department of Motor Vehicles but was refused. The government agency cited “privacy rules.”
Privacy rules? Google is using public roads.
Tests on Public Roads Require More Rigorous Scrutiny
On public roads, there should be no secrets and, in a democracy, car crash analysis should be reported by a government agency responsible for protecting human life.
Companies other than Google are champing at the bit to get their own driverless solutions on the road, so now is the time to create a group, accountable for public safety, to monitor how the new technology is developing and how it is affecting public health and safety.
Corporate use of public property that puts humans at risk should be fully monitored by a joint group of corporate representatives, government safety agencies, external software experts and the public. This group should then provide the following information for every robot car crash:
- an understanding of the “root cause” of the accident: driver error (which driver?), software error, road design . . . any of a number of things could have happened.
- if the root cause was software error, confirmation that it has been identified and fixed (or a timeline for the fix)
- if the root cause was how another human was “thrown off” by the way a robot car functions, confirmation that a software solution will be investigated and a timeline for this correction — after all, this is (so far) a human world and the robot cars need to accommodate human expectations
The future deserves our attention and care today.