Autonomous cars are not mature enough to assist us in achieving zero traffic fatalities

Posted on Oct 25, 2016 in Writing Assignment 4

Human error has long been the leading cause of car accidents, accounting for about 90% of such occurrences. Eliminating human involvement in the driving process therefore has the potential to significantly reduce the number of traffic fatalities, which is the goal and direction of the “Vision Zero” project. The question is: will we ever be able to safely enjoy completely autonomous driving?

Tesla’s recent fatal incident showed that automated driving technology is far from perfect: sensors and cameras are not yet intelligent enough to perceive and interpret unusual traffic situations. Moreover, defining and predicting every “unusual” scenario during the development of an autopilot system is an extremely difficult task.

Tesla, an American automaker that sells semi-autonomous cars, underestimated the importance of driver education that would help drivers learn and get used to the new “Autopilot” update: its benefits, deficiencies, and dangers. Even though Tesla did not leave its drivers entirely uninformed, it “clearly needs to go beyond the pages of fine print that’s displayed on-screen when a driver installs an Autopilot software update” (Rash, 2016). On the other hand, Tesla warns drivers that they must remain alert and in control of their cars at all times (see Figure 1 below): Autopilot is a beta program still in development, which means that the driver who died in the fatal crash in Florida was never supposed to cede full control to the automated system.

It was later found that the driver was watching Harry Potter at the time of the crash (Diana, 2016) instead of paying close attention as he should have. When the truck driver made an improper left turn into the path of the Tesla (see Figure 2 below), the Tesla driver was distracted by the car’s entertainment system, which left him out of the loop at the moment of the crash. His distracted mind could not react quickly enough to the car’s failure to distinguish the white side of the truck from the bright sky, a failure that allowed the car to run under the trailer without even braking. Understandably, the incident held the public’s attention for a long time, since the autopilot failed to stop the vehicle in a dangerous situation – a failure of the most important feature of an automated vehicle. Studies show that self-braking systems that work as designed reduce collisions by 38% (Nicas, 2015).

Figure 1: Tesla’s Autopilot Safety Notice to Users
Source: Tesla Motors Website

Figure 2: Tesla Fatal Crash Diagram
Source: Florida Traffic Crash Report


Ethical analysis of the evolution of autonomous cars leads to an important question of liability: who is to be held responsible for accidents involving autonomous cars? Even though it seems natural to blame manufacturers for the failures of their products, holding them liable for every crash weakens their incentive to keep improving a technology that is safer overall, and manufacturers may stop developing it altogether (Hevelke and Nida-Rümelin, 2015). On the other hand, we could hold users liable for accidents, making it the user’s “duty” to pay attention to traffic and intervene when necessary. In that case, however, autonomous cars lose much of their value: users could no longer let their cars find parking spots and park by themselves, send children to school, get home safely when drunk, or relax and even sleep while traveling (Borowsky and Oron-Gilad, 2016). The user would essentially be held liable for taking on the risk of using an autonomous car. In any case, many aspects of automated driving remain to be developed; autonomy therefore cannot be the key to achieving zero traffic fatalities today, because we cannot address one risk to society by exposing it to an even greater one.

Even though the fatal crash may slow the advance of self-driving cars, manufacturers are unlikely to halt the development of their technology unless they face heavy pressure from liability claims. The Freightliner Inspiration, the first autonomous truck licensed to operate on public roads in the U.S., was unveiled in Las Vegas in May 2015 (Van Hampton, 2015), a sign of the technology’s momentum. Hopefully, autonomous cars will soon be advanced enough to bring the number of traffic fatalities down to zero through their numerous safety features and vehicle-to-vehicle communication.

Figure 3: Tesla Model S
Source: New York Times


Works Cited (MLA Format)

Borowsky, A., and T. Oron-Gilad. “The Effects of Automation Failure and Secondary Task on Drivers’ Ability to Mitigate Hazards in Highly or Semi-automated Vehicles.” Advances in Transportation Studies 1.1 (2016): 59-70.

Diana, Carla. “Don’t Blame the Robots; Blame Us.” Popular Science 288.6 (2016): 46-47.

Hevelke, Alexander, and Julian Nida-Rümelin. “Responsibility for Crashes of Autonomous Vehicles: An Ethical Analysis.” Science and Engineering Ethics 21.3 (2015): 619-30.

Nicas, Jack. “NCAP Finds Self-Braking Cars Reduce Collisions by 38%.” Professional Safety 3.4 (2015): 12-13.

Rash, Wayne. “Autopilot Feature Likely Not at Fault in Tesla Model S Fatal Crash.” National Highway Traffic Safety Journal 1.1 (2016): 1.

Van Hampton, Tudor. “Freightliner Debuts First Autonomous Truck Licensed in the US.” Engineering News-Record 274.14 (2015): 5.
