The real reason we get freaked out by self-driving car accidents (TSLA, GOOG)

Uber's self-driving car crash in Arizona. National Transportation Safety Board/Handout via REUTERS

  • Self-driving cars have the potential to save lives.

  • But as the technology is being tested in public, the companies leading in the space need to do a better job of responding when one of their vehicles is involved in an accident.

  • Tesla's response to accidents involving Autopilot — blaming the victim and citing statistics — makes it harder for the public to place trust in the technology.


It's hard to dispute the upside to autonomous vehicles.

Fewer accidents. Fewer deaths and injuries. No more worries about speeding or people driving under the influence of drugs or alcohol. Increased accessibility to affordable transportation in communities that need it most.


So far, the data show that an autonomous future is full of benefits with very few drawbacks. Humans are flawed creatures and make mistakes behind the wheel. And when they do, people can die. Self-driving technology has the potential to save tens of thousands of lives each year in the US alone.

But in the meantime, the companies testing autonomous and semi-autonomous vehicles — the Waymos, Ubers, and Teslas of the world — are setting themselves up for greater scrutiny than traditional automakers with each accident they're involved in.

It's not because their robotic vehicles aren't technically safer than human-operated vehicles. They almost certainly are. It's because when there is an accident involving a self-driving or semi-autonomous vehicle, especially one where there's a death or injury involved, there's an added level of discomfort. It's technology making — or at least contributing to — the accident. It's easy and understandable to blame a human for a car accident. It's not as easy to understand when a car powered by a bunch of algorithms and AI is to blame.

In these cases, it's a company's product that's contributing to death, injury, or property damage. And when a company's product is involved, it's up to the company to take responsibility, not shift the blame back to its own customers.

Tesla is the biggest culprit here.

The handful of accidents involving Tesla's semi-autonomous Autopilot system have happened because the driver wasn't using it properly. (Drivers have to keep their hands on the steering wheel while Autopilot is engaged in case they have to take over, for example.) But the problem with Autopilot is that it blends autonomous driving with human driving, which leaves it open to misuse and error. And as we've seen in a few cases, that misuse can result in an injury or even death.

Tesla's response to each of these accidents has been to blame everyone but itself. The company routinely points out how a driver misused Autopilot and insists that the data and statistics show Autopilot-equipped cars are far safer than regular cars.

Here's part of the statement Tesla gave after one of its customers, Walter Huang, died while using Autopilot in his Tesla Model X in April:

"We empathize with Mr. Huang's family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive."

Tesla's rebuttal may be technically and factually correct, but it's wrong in spirit and lacks empathy. When one of your users dies, it's not the time to blame the victim or the media for covering the incident. It's time to talk about your plan for preventing it from happening again.

But Tesla's stance seems to be to fight back against its critics, naysayers, and the media covering each of these accidents.

Here's Tesla CEO Elon Musk's tweet from earlier this week about an accident involving an Autopilot-equipped Tesla that resulted in a broken ankle:

Elon Musk's tweet about the Tesla Autopilot accident. Elon Musk/Twitter

I'll take a stab at answering Musk's implied question.

The outsized media coverage of these accidents happens because there's an extra level of creepiness when humans aren't involved, or in Tesla's case, aren't supposed to be involved. There's going to be outrage when a corporation could be at fault for creating a flawed and dangerous product, especially when those products are effectively being beta tested in public, where human lives may be at risk.

It's easy to understand when a human driver causes an accident by driving while drunk. We're still exploring the ramifications of a traffic accident caused by a robot.

Even a recent accident involving a self-driving car from Google's sister company Waymo drew a lot of attention, though the Waymo vehicle clearly wasn't at fault. (A video from the Waymo vehicle's dashboard camera showed a driver swerving over a median and hitting the Waymo vehicle head on.)

Uber took a better approach recently. After one of its self-driving vehicles hit and killed a woman in March, Uber pulled all of its self-driving vehicles off the road until it could study the problem and figure out what to do next. Maybe Uber didn't need to make such an extreme move, but it was a demonstration that the company was taking responsibility for the accident instead of blaming the victim and citing a bunch of statistics. 

The problem isn't the fundamental technology behind self-driving cars. It's the attitude of the companies operating those vehicles and their failure to come to terms with the public's unease about, and unfamiliarity with, this growing trend. They're not beta testing a new version of iOS or a new Snapchat filter. They're testing vehicles carrying human beings on roads where other human beings drive, walk, and cycle.

And the communication from the leaders in the space should reflect that new reality.
