
Hyperdrive Daily: Car Crashes Have a Black Box Problem

Welcome to the Hyperdrive daily briefing, decoding the revolution reshaping the auto world, from EVs to self-driving cars and beyond.

On Monday, the U.S. National Transportation Safety Board released a preliminary report on last month’s fatal Tesla crash in Texas. While it revealed some key facts about the incident, there’s still a lot we don’t know.

The report confirmed that one of the victims was in the driver’s seat when he got into the car, though it didn’t explain how he ended up in the back seat, where his body was found. It also suggested the driver may not have been able to engage Tesla’s driver-assistance system, Autopilot, prior to the crash. CEO Elon Musk made this case a few weeks ago.

Nearly a month after the crash, the ambiguities of the incident, which the NTSB is investigating along with the National Highway Traffic Safety Administration, highlight just how much catching up U.S. authorities have to do to evaluate the safety of self-driving technology.

The NTSB has no power to regulate, and instead issues safety recommendations to agencies and industry. Ever since a fatal 2016 crash in Florida involving a Tesla with Autopilot engaged, the NTSB has called for tougher industrywide standards to prevent abuse of driver-assistance systems. The real action must come from NHTSA, which is authorized to enact auto-safety rules.

Is NHTSA up to the task? It has examined 28 Tesla crashes in recent years and has concluded only four of those investigations. But safety advocates say NHTSA has been slow to update guidelines, both for the driver-assist features already in consumer vehicles and for standards governing autonomous vehicles. NHTSA said at the time of the crash that it was engaging with local law enforcement and Tesla “to learn more about the details of the crash.” An agency spokesperson declined to comment for this column.

Part of the problem may be that NHTSA has neither the budget nor the software expertise to effectively investigate accidents involving driver-assist technology, said Philip Koopman, a professor of computer engineering at Carnegie Mellon University and co-founder of AV consultancy Edge Case Research. This has led NHTSA to ascribe blame for crashes to human drivers without fully exploring the role of software, said Koopman, who has published academic papers to back up his claims.

“They never ask whether it’s software,” he said. The agency “is going to have a problem when there is no human driver to blame.”

Koopman also wants NHTSA to require AV companies to adopt broader safety standards. Organizations like the Society of Automotive Engineers and the International Organization for Standardization have already published such guidelines, which NHTSA included in an AV rulemaking framework it floated last November as a starting point for future regulations. NHTSA’s stance is that it’s too early to impose regulations because there isn’t enough data to inform such standards. That position aligns with the arguments of AV company lobbyists, who claim the use cases are so diverse that any regulations now would stifle innovation.

The NTSB, which has issued opinions on accidents involving both consumer vehicles and autonomous test cars, has said NHTSA’s perception of the safety of AV testing is “probably unrealistic.”

The NTSB doesn’t limit its critique to test vehicles. There is a transparency problem in examining car crashes today: regulators rely on automakers to retrieve software data, and in many instances that data is insufficient to capture the full picture of what an automated system was doing in the moments before a crash.

Regulators also lack the tools to independently retrieve and review vehicle data, even though that data is critical in determining whether human error or software is ultimately at fault, the NTSB says.
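To make the black-box problem concrete, here is a minimal sketch, in Python, of the kind of question investigators want recorder data to answer: was a driver-assist system engaged, and when did it last report itself active? The record fields and the summarize_pre_crash helper are hypothetical illustrations, not any automaker’s actual data format or API; real event-data-recorder formats vary by manufacturer and, as the NTSB notes, aren’t independently accessible.

```python
# Hypothetical sketch: what independent analysis of pre-crash recorder
# data could look like. Field names and the data format are assumptions
# for illustration only.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class EdrRecord:
    t: float              # seconds relative to impact (negative = before)
    speed_mph: float      # vehicle speed
    brake_applied: bool   # whether the brake pedal was pressed
    assist_engaged: bool  # whether a driver-assist system reported itself active

def summarize_pre_crash(records: List[EdrRecord]) -> dict:
    """Summarize the final seconds of recorder data before impact."""
    records = sorted(records, key=lambda r: r.t)
    last = records[-1]
    engaged_times = [r.t for r in records if r.assist_engaged]
    return {
        "final_speed_mph": last.speed_mph,
        "brake_at_impact": last.brake_applied,
        "assist_ever_engaged": bool(engaged_times),
        "assist_last_engaged_at": max(engaged_times) if engaged_times else None,
    }

# Synthetic trace: driver assistance disengages two seconds before
# impact and the driver never brakes.
trace = [
    EdrRecord(t=-5.0, speed_mph=62.0, brake_applied=False, assist_engaged=True),
    EdrRecord(t=-3.0, speed_mph=64.0, brake_applied=False, assist_engaged=True),
    EdrRecord(t=-2.0, speed_mph=66.0, brake_applied=False, assist_engaged=False),
    EdrRecord(t=-0.5, speed_mph=67.0, brake_applied=False, assist_engaged=False),
]
print(summarize_pre_crash(trace))
```

Even a toy summary like this shows why the data matters: whether blame lands on the human or the software can turn on a timestamp, which is why investigators want access that doesn’t run through the automaker.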

That question applies not just to Tesla, but to all automakers as they continue to add automated-driving features to cars.

The rendering might suggest otherwise, but come 2025, Volkswagen’s autonomous ID.Buzz minibus will be very real and operational. The 21st-century version of VW’s iconic microbus, popular in the 1960s, is being developed with Ford-backed startup Argo AI to take on commercial routes with Level 4 autonomy, meaning it will drive itself under certain conditions. Ultimately, colleagues Christoph Rauwald and Keith Naughton write, VW will focus on densely populated urban areas for the bus: environments that, according to Christian Senger, VW’s head of autonomous driving, “offer the basis for intensive use of mobility offerings.”
