Tesla Autopilot Crashes Linked to Overreliance on Computer Vision, Says WSJ

Report: Tesla Autopilot Crashes Linked to Cameras (Car and Driver)
  • Tesla has been the subject of a great deal of criticism for its Autopilot semi-autonomous technology.

  • The latest report, a video series by the Wall Street Journal, shows footage of several crashes reportedly linked to the use of the Autopilot system.

  • The Journal concludes that Tesla's reliance on cameras and computer vision, rather than lidar, is one reason for the system's problems.

Despite being investigated by the National Highway Traffic Safety Administration (NHTSA) over its controversial semi-autonomous drive mode, Autopilot, Tesla hasn't faced any substantial consequences. Tesla models new and old continue to roam city streets and interstates with technology that—while technically an SAE Level 2 semi-autonomous drive mode—can be misused as a fully autonomous system.

Hence the controversy over the name Autopilot, and one of the reasons for the myriad investigations into Tesla by the federal government as well as news outlets. The most recent investigation, by the Wall Street Journal, attempts to identify why some Teslas have crashed.

The WSJ's roughly 11-minute video, which requires a subscription to view, is the second in a series that puts Tesla's Autopilot system under the microscope. It links the cause of some crashes to Autopilot's overreliance on computer vision, the practice of teaching computers to interpret visual information from digital inputs such as video.

1000 Tesla Crashes Reported to NHTSA

Automakers in the U.S. have had to report all serious real-world crashes involving SAE Level 2 or higher automated driving systems since NHTSA issued a General Order on crash reporting in June 2021. Tesla has reportedly submitted reports on more than 1000 crashes to NHTSA since 2016, but the WSJ claims most of that data is hidden from the public because Tesla considers it proprietary. However, the news outlet says it worked around that by gathering reports from individual states and cross-referencing them with the crash data Tesla submitted to NHTSA.

Among the 222 crashes the WSJ says it pieced together for this report, the paper said 44 occurred when a Tesla with Autopilot activated suddenly veered, while another 31 reportedly happened when Autopilot failed to yield or stop for an obstacle. Incidents in which the Tesla failed to stop are said to have resulted in the most serious injuries and deaths. The WSJ had experts analyze one fatal accident in which Autopilot didn't recognize an overturned truck on the highway and the car crashed into it.

Some experts interviewed by the Journal said that crash is evidence of Autopilot's gravest flaw. Unlike some other automakers, which pair cameras with radar and lidar (laser-based imaging) to detect objects, Tesla relies mainly on camera-based computer vision, with radar as a backup on some models. John Bernal, who was fired from Tesla in 2022 for posting videos of Autopilot failing, tells the WSJ that he has found that the cameras used on some Tesla models are not calibrated properly. He says that when the cameras don't see the same thing, they can have problems identifying obstacles. And as the investigation suggests, Tesla's overreliance on cameras to control Autopilot can lead to crashes.
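To make the experts' redundancy argument concrete, here is a purely illustrative sketch; it is not Tesla's or any automaker's actual software, and the sensor names, thresholds, and data structures are invented for this example. It shows, in toy form, how a perception stack might require agreement between independent sensors before treating a detection as a confirmed obstacle, and why a camera-heavy setup leaves fewer independent checks when one camera is miscalibrated or confused.

```python
# Illustrative sketch only (assumed names and thresholds, not any real system):
# a toy check of whether enough independent sensors agree that an obstacle exists.

from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # e.g. "front_camera", "radar", "lidar" (hypothetical labels)
    obstacle: bool     # did this sensor report an obstacle ahead?
    confidence: float  # 0.0 to 1.0

def obstacle_confirmed(detections: list[Detection],
                       min_agreeing_sensors: int = 2,
                       min_confidence: float = 0.5) -> bool:
    """Return True only if enough sensors independently report an obstacle."""
    agreeing = [d for d in detections
                if d.obstacle and d.confidence >= min_confidence]
    return len(agreeing) >= min_agreeing_sensors

# Toy version of the overturned-truck scenario: the camera fails to classify
# the object, radar returns a hit, and there is no third sensor to break the tie.
readings = [
    Detection("front_camera", obstacle=False, confidence=0.9),
    Detection("radar", obstacle=True, confidence=0.8),
]
print(obstacle_confirmed(readings))  # False: no confirmation without more sensors
```

In this toy setup, adding a third independent sensor type (for example, a lidar reading that also reports the obstacle) would tip the vote toward confirmation; with cameras alone, there is nothing to catch a camera's mistake.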

One thing is certain: This investigation, and Tesla's responses to it, will be important to follow.
