The Road to the Future

On September 19th, the United States government released new guidelines for self-driving cars in an attempt to make self-driving technology more transparent. To avoid overregulation, the guidelines are deliberately broad and intended to evolve alongside the technology. Yet a number of accidents caused by self-driving software raise questions about the technology’s actual capability.

The guidelines address issues of safety, user privacy, software malfunctions, and hardware regulation. Their vagueness reflects the wishes of developers, who argue that too much regulation will hinder innovation. The guidelines also urge states not only to create regulations for self-driving vehicles, but also to standardize laws across state lines. Today, state regulations range from permitting fully driverless cars for non-testing purposes to no laws at all. Such inconsistencies set the stage for confusion, and states without regulations may face a dilemma if a self-driving car gets into an accident within their borders.

Google’s self-driving car (photo: https://flic.kr/p/C8fz4j)

So far, there have been a number of instances where self-driving software has caused accidents. In February, one of Google’s self-driving cars caused its first accident: in an attempt to avoid sandbags blocking the road while turning right, the car struck the side of a city bus, damaging both vehicles. While Google hasn’t explicitly admitted the accident was the car’s fault, it’s clear that its self-driving car was incapable of handling such a tricky situation. Similarly, in May, a Tesla Model S operating on Autopilot on a highway failed to recognize a white truck against the bright sky, drove underneath the truck’s trailer, and killed the driver. In response, Tesla wrote a blog post explaining that “the system is a new technology and still in a public beta phase” and that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times.”

In the wrecked Tesla Model S, investigators found a portable DVD player. While it’s unclear whether it was in use at the time of the accident, the possibility that the driver was watching it while driving draws attention to a contradiction in the marketing of self-driving cars. One glaring point of confusion is the distinction between autonomous and automated vehicles, which tend to be grouped together under the label of “self-driving” cars. Autonomous vehicles should be able to drive an entire route without human interference, from planning the path to reacting to sudden emergencies. Automated, or driver-assistance, vehicles require active driving and can only control the car for short periods. Automated features like automatic parallel parking and object detection have been incorporated into commercially available vehicles for years. In other words, automated systems are intended to enhance the driving experience, while autonomous driving aims to replace it.

Tesla’s Model S Autopilot system falls somewhere between levels two and three on the five-level vehicle automation scale defined by the U.S. Department of Transportation’s National Highway Traffic Safety Administration (NHTSA). This means that while the car can drive itself under certain conditions, the driver’s full attention is still required. Therefore, if the driver in the fatal accident in May was in fact watching a DVD while driving, he was not operating the vehicle appropriately. In this case, misleading marketing of self-driving vehicles may be partly to blame. For instance, an ad released in May for Mercedes’ 2017 E-Class sedan came under scrutiny for its deceptive claims: “Is the world truly ready for a vehicle that can drive itself? An autonomous-thinking automobile that protects those inside and out… The all new E-Class: self-braking, self-correcting, self-parking. A Mercedes-Benz concept that’s already a reality.” The ad suggests that the car is fully autonomous, which it isn’t. After initially dismissing the criticism, Mercedes eventually pulled the ad.
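To make that scale concrete, here is a minimal Python sketch of NHTSA’s five levels (numbered 0 through 4) and the driver-attention rule they imply. The enum names paraphrase the official level titles, and the attention check is a deliberate simplification for illustration, not any manufacturer’s actual supervision logic.

```python
from enum import IntEnum

class NHTSALevel(IntEnum):
    """NHTSA's five levels of vehicle automation (2013 policy), paraphrased."""
    NO_AUTOMATION = 0         # driver controls everything at all times
    FUNCTION_SPECIFIC = 1     # one function automated, e.g. cruise control
    COMBINED_FUNCTION = 2     # multiple functions work together, e.g. adaptive
                              # cruise control plus lane centering
    LIMITED_SELF_DRIVING = 3  # car drives itself under certain conditions, but
                              # the driver must be ready to take over
    FULL_SELF_DRIVING = 4     # car performs the entire trip unaided

def driver_attention_required(level: NHTSALevel) -> bool:
    """Below full self-driving, the human is still part of the control loop."""
    return level < NHTSALevel.FULL_SELF_DRIVING

# Autopilot sits "somewhere between levels two and three" -- either way,
# the driver's attention is still required:
for level in (NHTSALevel.COMBINED_FUNCTION, NHTSALevel.LIMITED_SELF_DRIVING):
    print(level.name, "-> attention required:", driver_attention_required(level))
```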

Technology isn’t the only barrier autonomous cars must overcome before mass adoption. In the US, truck, delivery, or tractor driver is the most common job in 29 states, and replacing these jobs with self-driving vehicles could put millions out of work. Truck driving is also one of the few remaining jobs that doesn’t require a college degree yet pays a middle-class salary. Nonetheless, self-driving trucks are extremely appealing to trucking companies: although the technology is expensive to install, companies would no longer have to pay drivers’ wages. Additionally, accidents caused by driver fatigue are common, and self-driving trucks could eliminate that risk.

In a perfect world, self-driving vehicles would eliminate virtually all vehicle accidents. However, as recent crashes have shown, the technology is not yet advanced enough. Although developers stress how critical real-world testing is, is it really fair to conduct that testing if it puts civilians at risk? Because self-driving vehicles are such a recent concept, car companies should strive to educate people on the different types of self-driving cars and be honest about their capabilities. Misleading marketing promotes unsafe driving habits, which, as the fatal accident in May showed, can lead to a collision. While there’s no doubt that self-driving vehicles will soon be an everyday reality, the eagerness of automakers shouldn’t take precedence over the safety of citizens.