Driverless cars may be the future of the auto industry — but there are still some dings to be worked out before mainstream adoption.
Although some experts have predicted that self-driving cars could be common by 2025, a recent crash of a driverless Uber vehicle in Arizona raises questions about the safety of this technology. The vehicles are loaded with sophisticated devices designed to prevent human error, yet they remain susceptible to accidents, much like autos driven by real people.
In late March, Uber Technologies, Inc.’s self-driving Volvo SUV approached an intersection in Tempe as the light switched to yellow. Meanwhile, a human-driven Honda approached in the opposite direction and attempted to make a left-hand turn at the same intersection. The two cars collided, and the Uber vehicle crashed into a traffic signal pole. The SUV then flipped on its side and hit two other cars before it came to a stop. Fortunately, no injuries were reported.
Although two test drivers sat in the front of the Uber SUV, which was outfitted with autonomous driving sensors, one of the drivers told police a blind spot created by traffic prevented him from reacting quickly enough to override the technology and avoid the collision. A spokesperson for the Tempe Police Department absolved the Uber SUV of any fault in the crash, saying the other car failed to yield to oncoming traffic, according to Bloomberg News. Some ambiguity remains, however: one witness told police that the Uber car was “trying to beat the light and hitting the gas so hard.”
Even On Autopilot, Accidents Still Happen
Uber briefly suspended its self-driving tests after the Arizona incident, but its fleet soon returned to roadways in Tempe, Pittsburgh, and San Francisco. Nevertheless, crashes like the one in Tempe highlight the challenges confronted by autonomous cars when sharing the road with human drivers. After all, streets overflow with drivers who speed, cut off other vehicles, or abruptly change lanes. Can an autonomous car detect those driving behaviors quickly enough to prevent an accident, or to allow a human driver to take control?
Deciding whether to drive through a yellow light is another judgment call, and the wrong decision can also lead to an accident. An Uber spokesperson said its driverless cars will proceed through a yellow light only if the vehicles sense they can do so safely, and at the proper speed. But as the witness in the Tempe crash suggested, the Uber SUV may have sped through the intersection to beat the red light, a move human drivers make all the time.
The Uber crash in Tempe isn’t the first one involving driverless cars. Last year, the driver of a Tesla operating on Autopilot was killed on a Florida highway. The National Highway Traffic Safety Administration later determined that the car’s autonomous driving systems were working properly, but that the driver may have reacted too slowly to avoid the fatal collision with a tractor-trailer.
Another potential problem with driverless cars is that such vehicles tend to drive too slowly or too cautiously, making them more susceptible to crashes with aggressive drivers. For instance, autonomous vehicles developed by Alphabet Inc.’s Waymo have experienced rear-end collisions for that reason, according to a report from Reuters.
Should Your Clients Expect Autonomous Vehicles Soon?
Although Uber and other tech companies are working to perfect their driving sensors using data gathered from accidents like the one in Arizona, the technology has yet to prove 100% reliable in preventing collisions. That doesn’t mean driverless cars aren’t in your clients’ futures, but it will be several years before they are widely available for consumer use.
Meanwhile, insurance companies stung by soaring underwriting losses for auto insurance claims continue to wrestle with insurance policies for autonomous cars. For as long as insurers have provided auto policies, premiums have been based on a driver’s age or other demographic information, including credit score. But how can insurers calculate a rate when there is no human driver? Who is at fault when a driverless car hits another car — the manufacturer, or the owner?
Those questions will eventually be answered by tech companies, insurers, and regulators. For now, if your clients want to take advantage of new auto technologies, you might suggest that they install a telematics device in their cars. These devices monitor each driver’s actual driving habits, and some insurers set premiums based on those behaviors. For your safety-conscious or urban-dwelling clients, this could significantly lower their premiums, or at least nudge them toward better driving habits.