Tesla drives on Autopilot through a regulatory gray zone

The fatal crash of a Tesla without anyone apparently getting behind the wheel has shed new light on the safety of semi-autonomous vehicles and the nebulous U.S. regulatory terrain they navigate.

Police in Harris County, Texas, said a Tesla Model S crashed into a tree at high speed on Saturday after failing to negotiate a turn, and then burst into flames.

Tesla CEO Elon Musk tweeted on Monday that preliminary data downloaded by Tesla indicates the vehicle was not operating on Autopilot and was not equipped with the carmaker’s Full Self-Driving (FSD) system.

Tesla’s Autopilot and FSD, as well as the growing number of similar semi-autonomous driving features in cars made by other automakers, pose a challenge to officials responsible for vehicle and highway safety.

The U.S. National Highway Traffic Safety Administration (NHTSA) has not yet issued specific regulations or performance standards for semi-autonomous systems such as Autopilot, or for fully autonomous vehicles (AVs).

There are no NHTSA rules requiring automakers to ensure the systems are used as intended or to prevent drivers from misusing them. The only significant federal restriction is that vehicles must have steering wheels and human controls.

With no performance or technical standards, systems like Autopilot exist in a gray area.

The crash in Texas follows a series of accidents involving Tesla cars driving on Autopilot, the partially automated driving system that performs a range of functions, such as helping drivers stay in their lanes and steer on highways.

Since October, Tesla has also released to around 2,000 customers what it describes as a ‘beta’ version of its FSD system, enabling them to test how well it works on public roads.

Harris County police are now seeking a warrant for the Tesla data, saying witnesses told them the victims intended to test the car’s automated driving.

NO REQUIREMENTS

The regulatory confusion stems in part from the fact that NHTSA has traditionally regulated vehicle safety while motor vehicle departments (DMVs) in individual states oversee drivers.

With semi-autonomous features, it may be unclear whether the onboard computer or the driver is controlling the car, or whether oversight is shared, according to the U.S. National Transportation Safety Board (NTSB).

California has introduced AV regulations, but they apply only to cars equipped with technology that can perform the dynamic driving task without active physical control or monitoring by a human operator, the state’s DMV told Reuters.

The DMV said Tesla’s Full Self-Driving system does not yet meet that standard and is considered a type of Advanced Driver Assistance System, which it does not regulate.

That allows Tesla to operate Autopilot and its FSD system in California while the automaker rolls out new versions of the systems for its customers to test.

NHTSA, the federal body responsible for vehicle safety, said this week it has opened 28 investigations into accidents involving Tesla vehicles, of which 24 remain active, and at least four, including the fatal crash in Texas, have taken place since March.

NHTSA has repeatedly argued that its broad authority to require automakers to recall any vehicle that poses an unreasonable safety risk is sufficient to address driver assistance systems.

So far, NHTSA has taken no enforcement action against Tesla’s driver assistance systems.

White House spokeswoman Jen Psaki said NHTSA was “actively working with Tesla and local law enforcement” on the Texas crash.

The NTSB, a U.S. government agency charged with investigating road accidents, has criticized NHTSA’s hands-off approach to regulating cars with self-driving features and AVs.

“NHTSA refuses to take action for vehicles classified as partial or lower-level automation, and continues to wait for higher levels of automation before requiring AV systems to meet minimum national standards,” NTSB chairman Robert Sumwalt wrote in a February 1 letter to NHTSA.

“Because NHTSA has put in place no requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the AV control system’s limitations,” the letter said.

REVISION OF REGULATIONS

NHTSA told Reuters that, under the new administration, it is reviewing the regulations surrounding AVs and welcomes the NTSB’s input as it advances policy on automated driving systems.

It said the most advanced vehicle technologies on sale require a fully attentive human driver at all times.

“Abusing these technologies is, at a minimum, distracted driving. Every state in the country holds the driver responsible for the safe operation of the vehicle,” NHTSA told Reuters.

The NTSB also says NHTSA has no method of verifying whether automakers have adopted system safeguards. For example, there are no federal regulations requiring that drivers touch the steering wheel within a specified time.

“NHTSA is drafting rules on autonomous vehicles, but it has been slow to regulate semi-autonomous vehicles,” said Bryant Walker Smith, a law professor at the University of South Carolina. “There is a growing awareness that they deserve more priority and regulatory action.”

New York has a law requiring drivers to keep at least one hand on the wheel, but no other state has legislation that would bar the use of semi-autonomous cars.

As for AVs, 35 states have enacted legislation or their governors have signed executive orders covering AVs, according to the National Conference of State Legislatures.

Such rules allow companies like Alphabet’s (GOOGL.O) Waymo and General Motors’ (GM.N) Cruise to test their vehicles on public roads.

However, regulations vary from state to state.

AV regulations in Texas stipulate that vehicles must comply with NHTSA processes, although there are no such federal regulations. The Texas Department of Public Safety, the regulator overseeing AVs, did not respond to a request for comment.

The Arizona Department of Transportation requires companies to submit regular filings verifying, among other things, that vehicles can operate safely if the autonomous technology fails.

Although most automakers offer vehicles with various types of driver assistance, no fully autonomous vehicles are sold to customers in the United States.

RED FLAGS

Concerns about the safety of autonomous driving technology, however, have grown in recent years, and Tesla itself has warned of its limitations.

In February 2020, Tesla’s director of autonomous driving technology, Andrej Karpathy, identified a challenge for its Autopilot system: how to recognize when a parked police car’s emergency lights are on.

“This is an example of a new task we would like to know about,” Karpathy said in a talk at a conference on Tesla’s effort to deliver FSD technology.

In the year or so since then, Tesla vehicles have crashed into police cars on four separate occasions, and since 2016 at least three Tesla vehicles operating on Autopilot have been involved in fatal crashes.

U.S. safety regulators, police and local governments have investigated all four incidents, officials told Reuters.

At least three of the cars were on Autopilot, police said. In one case, a doctor was watching a movie on a phone when his vehicle crashed into a North Carolina police car.

Tesla did not immediately respond to a request for comment.

The accidents and investigations have not slowed Musk’s efforts to promote Tesla cars as self-driving.

In a recent tweet, Musk said that Tesla “is almost ready with FSD Beta V9.0. The improvement in step changes is great, especially for odd angles and bad weather. Pure vision, no radar.”

Tesla also says it uses data collected from 1 million cars on the road to improve Autopilot through machine learning and artificial intelligence.

Tesla’s Karpathy said his Tesla had driven him for 20 minutes to get coffee in Palo Alto without intervention.

“It’s not a perfect system, but it’s getting there,” he said on a “Robot Brains” podcast in March. “I definitely keep my hands on the steering wheel.”
