By Kenneth Wyatt
Sr. Technical Editor, Interference Technology
[email protected]
One of the most exciting technologies today is the development of semi-autonomous (driver-assisted) and fully autonomous vehicles. Dozens of manufacturers are racing to develop autonomous, or self-driving, vehicles and the related intra-vehicle and vehicle-to-vehicle communications and controls.
Numerous trial programs are under way throughout the U.S. and Europe. Google’s self-driving program was among the first, and Autoliv (a Swedish company) has been running a pilot program in the Boston area for several months now, gathering data on both rural roads and congested city streets.
Federal and state authorities are also rethinking the regulations governing autonomous vehicles. For example, California’s Department of Motor Vehicles announced on September 30, 2016, that the most advanced self-driving cars will no longer be required to have a licensed driver aboard, provided federal approval is given. The U.S. Department of Transportation has also recently issued a comprehensive policy on self-driving cars.
Autonomous Trucks
In Europe, driverless vehicles may actually arrive first in the form of trucks. There has been research on “platooning,” the concept of running convoys of closely spaced autonomous semi-trucks.
Here in Colorado just two weeks ago, Uber teamed up with Otto, a San Francisco startup developing self-driving trucks, to deliver a trailer load of 50,000 cans of Budweiser beer from the Anheuser-Busch brewery in Fort Collins to distributors 120 miles away in Colorado Springs. With only an observer on board, the run is believed to be the first commercial delivery made by an autonomous truck.
Ethics With Autonomous Vehicles
But all this new technology raises an interesting ethical question: how will driverless vehicles make life-or-death choices?
For example, let’s say an autonomous vehicle rounds a corner to find a school bus, kids crossing the street, and trees on either side of the road. Assuming the brakes can’t stop the vehicle in time, how will the software decide what to do: hit the bus, the kids, or the trees?
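To make the question concrete, here is a minimal, purely hypothetical sketch (in Python) of the kind of logic such a planner might use: each unavoidable outcome is assigned a harm cost, and the lowest-cost maneuver is chosen. The maneuver names and weights below are invented for illustration, and no manufacturer has published such a policy; the ethical difficulty is precisely that someone has to decide what those weights should be.

    def choose_action(options):
        """Return the candidate maneuver with the lowest estimated harm cost."""
        return min(options, key=lambda o: o["harm_cost"])

    # Invented outcomes and weights for the corner/school-bus scenario above.
    options = [
        {"action": "brake hard and hit the bus",  "harm_cost": 40},
        {"action": "swerve toward the crosswalk", "harm_cost": 100},
        {"action": "swerve into the trees",       "harm_cost": 25},
    ]

    print(choose_action(options)["action"])  # prints "swerve into the trees"

However the weighting is done, the vehicle ends up encoding a value judgment about whose safety comes first, which is exactly the issue raised in the discussion below.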
Here are some additional scenarios and questions from a recent discussion on the IEEE PSES email reflector:
With a little imagination, I can come up with many scenarios that appear “no-win”. Imagine you are driving down a mountain road with a rock face on one side and a long drop off a cliff on the other. Vehicle-to-vehicle communications allow your self-driving vehicle to stay close to the car in front of you. It is a straight road and high speeds are allowed. Now imagine a rock slide starts dropping a large boulder onto the roadway. The vehicle in front of yours may hit the rocks, but it remains intact enough to protect its occupants. Your vehicle can either hit the vehicle in front of you, potentially injuring its passengers, or take evasive action, risking your health. What does the vehicle do? […] These aren’t situations that are new with self-driving cars. They just create a new issue of liability. – Ted Eckert
“[…] Mercedes has made it clear that if a situation arises where a car has to choose between saving the lives of its occupants or those of bystanders, it will save the occupants. ‘If you know you can save at least one person, at least save that one. Save the one in the car,’ Christoph von Hugo, manager of driver assistance systems and active safety at Mercedes, told the Paris Motor Show recently.” – As reported by James Pawson.
Aside from the obvious concerns about vehicle safety, it occurs to me that there are two problems presently missing from recent media reporting. […] (1) I understand that these vehicles, such as the fully automated Budweiser truck, have avoidance systems. Given the human condition of today, I foresee the distinct possibility of drivers in other vehicles “playing around” in such a way as to try and force a response from the avoidance algorithms and cause these vehicles to crash themselves. This kind of sport would be exactly what some types would enjoy. What sort of preventative measures have been taken in this regard? (2) Given the lack of attention to hacking we have already witnessed in the Internet of Things (IoT) crowd, how are the driverless vehicle people doing with regard to the cybersecurity of these vehicles? That is, is it conceivable that someone may try to hack the truck’s operating system and hijack it? – Doug Powell
Commercial airliners largely fly themselves. And, they have automated anti-collision systems. Most airline accidents are the result of pilot error. […] Cars will drive themselves. I hope that the self-driving scheme will drop out and demand that the driver take over in case the self-driving features cannot cope with the situation. As with train locomotives, I would hope that self-driving cars would have some sort of “dead-man” switch that would keep the driver awake and alert, ready to take over in the event the self-driving car cannot handle the situation. I don’t (yet) have much confidence in the self-driving beer truck with the human driver in the back seat. – Rich Nute
What level of modification to a vehicle will allow the manufacturer to shed liability if something goes wrong? Will the manufacturer require the vehicle to immobilize itself if one of the sensors isn’t working properly? What happens if a sensor fails while you are driving through the middle of the Outback in Australia, hundreds of kilometers from anywhere? What data will the manufacturer be allowed to collect from your personal vehicle? Things will occasionally go wrong. We know that there are clever lawyers who will find a new avenue to launch lawsuits. We know that there are politicians eager to please their constituencies who will propose heavy-handed regulations without giving it enough thought. There will be purchasers of self-driving cars who will find ways to bend the rules to their own ends. In other words, humans will be humans. In my opinion, it isn’t the robots we should worry about. – Ted Eckert
These are all excellent questions, and ones that our keynote speaker, Robert Neff, will address during the upcoming EMC Live Test & Design Bootcamp on November 16th. Please tune in!