At the beginning of July 2016, news broke that a Tesla vehicle with the built-in Autopilot system had crashed into a tractor-trailer on a Florida highway, killing the driver. Neither the driver (who was allegedly watching Harry Potter while letting Autopilot do the driving) nor the Tesla hit the brakes when the trailer appeared in front of the car. Pending an investigation by the National Highway Traffic Safety Administration (NHTSA), Tesla CEO Elon Musk said that the vehicle’s radar “tunes out what looks like an overhead road sign to avoid false braking events.” Just days after that news, another driver reportedly avoided a potentially fatal accident thanks in part to Autopilot.
As with all new and exciting technologies, there are people who fear a “Skynet” scenario and plead for excessive regulation. However, emotions should not rule in deciding this matter: heavy-handed regulation of the sector is not the outcome we should strive for. If we are to believe Musk, Autopilot would save half a million lives per year if it were universally available. The numbers lend support to this claim. The fatal accident involving a Tesla was the first known fatality in over 130 million miles driven with Autopilot activated. In comparison, there is roughly one fatality per 60 million miles driven globally, which means Autopilot-assisted driving shows about half the fatality rate of the global average. This is why it is important to proceed with cool heads. And we are talking about a system that is not yet perfected – just imagine its potential once it is. Of course, regulating driverless vehicles is a daunting task. But without a robust, modern legal framework addressing the matter, we are left with legal uncertainty in the event of accidents such as those described above.
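A rough back-of-the-envelope check of those figures, taking the cited mileages at face value: one fatality in 130 million Autopilot miles works out to about 0.77 fatalities per 100 million miles, while one fatality per roughly 60 million miles globally corresponds to about 1.67 per 100 million miles. On those numbers, Autopilot-assisted driving has a little under half the fatality rate of the global average, although a single fatality is far too small a sample to draw firm statistical conclusions from.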
Only recently has a declaration been adopted at the EU level recognizing the need for deliberate action in this sector. Countries are still operating under the Vienna Convention on Road Traffic (1968), which requires a driver to be in control of every vehicle and thus effectively keeps driverless cars off open roads.
Liability in the event of an accident
Autopilot sparked a debate over the legal implications concerning liability in the event of a traffic accident involving, or directly caused by, a Tesla or another self-driving car. Musk himself (rightly) asserted that, for the time being, responsibility rests solely with the driver of the vehicle and not with the company that manufactures it, since Autopilot relies on human input in order to reach the desired destination: “I think we’re going to be quite clear with customers that the responsibility remains with the driver. We’re not asserting that the car is capable of driving in the absence of driver oversight.”
On the other hand, representatives of Google’s self-driving car project have taken the stance that responsibility for accidents involving their self-driving cars should fall on Google rather than the individual behind the wheel, who is not in charge of the car when an accident occurs: “What we’ve been saying to the folks in the DMV, even in public session, for unmanned vehicles, we think the ticket should go to the company. Because the decisions are not being made by the individual,” said Ron Medford, safety director for Google’s self-driving car program.
There are other issues that need to be considered. Bryant Walker Smith, in his essay “My Other Car is a…Robot? Defining vehicle automation“, notes the subtle distinction between the terms “automation” and “autonomous”. The prevailing term in Europe is “automation”, described as the replacement of human labor with technology, with “automated driving” meaning driving performed by a computer. This contrasts with “autonomous driving”, the term prevalent in the US, which could be described as driving the vehicle performs by itself. The Autopilot system in Tesla cars would therefore count as automated driving, because a human driver is still required to direct and monitor the software and ensure it functions properly.
For cars with automated driving systems like those now present in Teslas, the legal situation is murky (no laws clearly regulate the matter, though some rules are expected in the near future), but the practical answer is reasonably clear: the driver should generally be responsible for any accident that occurs on the road, barring a technical and/or mechanical error. Autopilot is disabled by default, and drivers of Tesla cars can only activate it after acknowledging that it is still in beta testing. In the event of an accident involving a fully autonomous vehicle, however, the driver should not be held responsible. The manufacturer has a much closer connection to the day-to-day operation of the vehicle and is responsible for developing the software that controls it, so in that case the company should be held responsible.
Future of policy
First, it is crucial to update the international instruments governing the area: the Vienna Convention needs to be amended to accommodate the novel technologies entering road traffic.
Second, it is extremely important that the law catch up with technological advances and breakthroughs, so as to prevent legal chaos as the number of accidents involving driverless vehicles rises. The EU needs to adopt a regulation that sets out the basic principles of driver liability in the event of an accident and ensures an adequate level of data protection and privacy, in line with the privacy rules already being adopted across the EU. The Joint Declaration is a step forward, but a more robust legal framework is needed.
Third, an awareness campaign should be implemented to educate citizens about the benefits and safety of driverless vehicles, so that these technologies can be eased into society. The adoption of adequate legal regulation depends on the trust citizens place in driverless technology, i.e. how safe they would feel “handing the wheel” to an artificially intelligent system.
If we fail to address these matters properly and in good time, we risk creating confusion at both the policy and the practical level, hindering the growth of the industry. More importantly, it could hold back a technology that may save millions of lives. In the meantime, keep your hands firmly on the steering wheel.