What happens when you’re injured by a car on autopilot?
Technological advances are coming thick and fast in all areas of our lives: healthcare, industry, agriculture, genetics, and travel. Now that driverless cars have become a real possibility, we have all begun to wonder whether the future will be safer or more dangerous without humans at the wheel. We’re still wondering. In 2016, after a man from Ohio became the first person to die while using Autopilot in a Tesla Model S, having taken his hands off the steering wheel for a few seconds, Tesla added safeguards to its system. Drivers already contend with heavier traffic, with more distracted, impaired, or enraged motorists, and with roads in need of repair; now they have the added burden of worrying about the legal implications of cars on autopilot. So do the personal injury attorneys who will be called upon to respond to the injuries resulting from collisions involving such cars.
How Autopilot Works
Autopilot uses radar and cameras to detect lane markings, other vehicles, and objects in the road. It can steer, brake, and accelerate automatically with little input from the driver. In addition, drivers are warned on the dashboard and in the owner’s manual to remain engaged and alert while using the Autopilot system, and Tesla has built in safeguards meant to keep drivers from looking away from the road or taking their hands off the steering wheel for any extended period of time.
In spite of these improvements, however, a fatal crash occurred this past March involving a Tesla equipped with the updated Autopilot. Investigators determined that the driver failed to heed audible warnings to put his hands back on the steering wheel. Because the driver died in the collision with a road divider, no one knows precisely why he did not respond appropriately. No matter how many technological advances occur, it seems we will never be able to avoid all accidents. If a car on autopilot causes serious injuries to its “passenger,” or to another driver or a pedestrian, whom will we hold responsible?
Who will solve the problems associated with autopilot cars?
Legislators, insurance companies, personal injury attorneys, drivers and pedestrians are examining this question from various angles. According to Mike Ramsey, a Gartner analyst who focuses on self-driving technology, “The system as it is now tricks you into thinking it has more capability than it does. It’s not an autonomous system. It’s not a hands-free system. But that’s how people are using it, and it works fine, until it suddenly doesn’t.” Tesla readily points out that Autopilot — despite the implications in its name — is only a driver-assistance system and is not intended to pilot cars on its own.
While the company acknowledges that Autopilot “does not prevent all accidents,” it insists that the system “makes them much less likely to occur” and “unequivocally makes the world safer.” The rest of us have yet to be convinced. Interestingly, despite the three deaths in Autopilot-enabled vehicles, the National Highway Traffic Safety Administration has determined that these cars do not have to be recalled, since no flaws in the system were found to have caused any of the crashes.