A U.S. safety official is worried that automakers will not be able to make their self-driving cars “ethical.” The issue relates to whether the artificial intelligence powering self-driving cars can make sound and ethical decisions, like avoiding an accident that might harm a person outside the vehicle.
National Transportation Safety Board Chairman Christopher Hart says that federal regulations will be necessary to create the moral basis for the decisions made by autonomous cars. Regulators will also need to create special safety standards concerning the reliability of self-driving cars. The NHTSA will also need to require that designers include fail-safes in the critical components of self-driving vehicles, not unlike those that aircraft manufacturers are required to have.
Hart says that ethical prerogatives must be built into the software of self-driving cars — rules that would give a self-driving car direction when deciding, for example, whether to drive onto a sidewalk, and potentially strike a pedestrian, in order to avoid colliding with an out-of-control 18-wheeler.
The NHTSA is currently studying how it will regulate self-driving automobiles and plans to release its guidelines soon. However, the agency has yet to address the ethical concerns surrounding decisions made by self-driving automobiles.
In the years ahead, the U.S. court system, including Alabama state courts, will have to address issues of liability when an auto accident is blamed on a self-driving vehicle. Lawyers, judges, juries and lawmakers alike will need to grapple with what appears to be an inevitable near future in which U.S. roadways are filled with autonomous cars piloted entirely by artificially intelligent computers.
Source: technologyreview.com, “Top Safety Official Doesn’t Trust Automakers to Teach Ethics to Self-Driving Cars,” Andrew Rosenblum, Sep. 02, 2016