Tesla Autopilot's safety questioned again: direct sunlight can easily confuse the system


Tencent Technology News, September 22 — When American owner Robin Geoulla bought his Tesla Model S electric car in 2017, he had doubts about the safety of the automated driving technology it was equipped with. In January 2018, while driving the car with the driver-assistance feature Autopilot engaged, he rear-ended an unoccupied fire truck parked on a California interstate highway.
Questioned by investigators from the National Transportation Safety Board (NTSB), he described his initial feelings about the technology: "You know, it's a little scary to sit there and watch Autopilot operate the car." Over time, however, Geoulla's initial doubts about Autopilot softened. He found Autopilot very reliable when it was tracking the vehicle ahead of him, but he noticed that the system sometimes seemed to become confused in direct sunlight or when the vehicle in front changed lanes.
The NTSB found that Autopilot's design allowed Geoulla to disengage from driving: while the technology was active, his hands were off the steering wheel for nearly all of the roughly 30 minutes it was engaged. The NTSB has urged the National Highway Traffic Safety Administration (NHTSA) to investigate Autopilot's limitations, the potential for driver misuse, and the associated safety risks. A series of accidents involving Autopilot has already occurred, some of them fatal.
"Past incidents show that the focus has consistently been on innovation rather than safety. I hope that situation can be reversed," said NTSB chair Jennifer Homendy, pointing to safety rules such as those addressing driver fatigue and drug and alcohol testing.
Tesla did not respond to a request for comment, but the company says on its website that Autopilot is an advanced driver-assistance feature whose current version does not make the vehicle autonomous. Tesla also says that before enabling the system, drivers must agree to keep their hands on the steering wheel and maintain control of the vehicle.
Limited visibility
Tesla began introducing the Autopilot system in 2015. NHTSA officials are now investigating 12 accidents involving Autopilot as part of the agency's most in-depth inquiry into the system; Geoulla's 2018 crash is one of them.
NHTSA statements, NTSB documents, and police reports show that most of the crashes under investigation occurred after dark or in conditions of limited visibility, such as glaring sunlight. Autopilot experts said this raises questions about the system's capabilities in challenging driving conditions.
"NHTSA has broad enforcement powers and authority. When we find an unreasonable risk to public safety, we will take action," an NHTSA spokesperson said in a statement.
Since 2016, the U.S. auto safety regulator has separately dispatched 33 special crash investigation teams to review Tesla crashes involving 11 deaths in which the use of advanced driver-assistance systems was suspected. NHTSA has ruled out Autopilot use in three of those non-fatal crashes.
NHTSA's ongoing Autopilot investigation has rekindled the debate over whether the technology is safe. It is the latest major challenge for Tesla chief executive Elon Musk, whose efforts have helped make the company the world's most valuable automaker.
Tesla charges customers up to $10,000 for its driver-assistance features, including automatic lane changes, and promises to eventually deliver autonomous driving capability using only cameras and advanced software. Other automakers and self-driving companies use not only cameras but also more expensive hardware, including radar and lidar, in their current and upcoming vehicles.
Musk has said that Tesla cars equipped with eight cameras will be far safer than human drivers. However, experts and industry executives say camera technology is more affected by darkness and sun glare, as well as bad weather such as heavy rain, snow, and fog. "Today's computer vision is far from perfect, and will remain so for the foreseeable future," said Raj Rajkumar, professor of electrical and computer engineering at Carnegie Mellon University.
In 2016, a Tesla was involved in the company's first fatal crash linked to its semi-automated driving technology, west of Williston, Florida. The company said that against the brightly lit sky, neither the driver nor Autopilot could see the white side of a tractor-trailer. Instead of braking, the Tesla collided with the 18-wheel truck.
Driver errors, failed braking
According to available documents, NHTSA closed the Autopilot investigation prompted by that fatal crash in January 2017, finding no defect in Autopilot's performance after some contentious exchanges with Tesla representatives.
As part of that investigation, the regulator issued a special order to Tesla in December 2016 asking the company to detail how it had responded to any internal safety concerns raised about Autopilot, including the potential for driver misuse or abuse.
After an NHTSA lawyer found Tesla's initial response insufficient, Todd Maron, Tesla's general counsel at the time, tried again. According to the documents, Maron told regulators that the request was "too broad" for the company to catalog every concern raised during Autopilot's development.

Maron nevertheless told regulators that Tesla wanted to cooperate. He said that during Autopilot's development, company employees or contractors had raised concerns that Tesla was addressing, such as unexpected or failed braking and acceleration, unexpected or failed steering, and certain kinds of driver misuse and abuse, but he did not provide further details.
Maron did not respond to requests for comment, and it is unclear how regulators responded. A former U.S. official said Tesla generally cooperated with the investigation and promptly produced the requested materials. On the eve of former U.S. President Trump's inauguration, regulators closed the investigation, concluding that Autopilot performed as designed and that Tesla had taken steps to prevent its misuse.
NHTSA leadership vacuum
NHTSA has not had a Senate-confirmed head for nearly five years, and U.S. President Joe Biden has yet to nominate anyone to run the agency.
NHTSA documents show that regulators want to know how Tesla vehicles attempt to identify the flashing lights on emergency vehicles, and how they detect fire trucks, ambulances, and police cars in the road. The agency has sought similar information from 12 of Tesla's competitors. "Tesla has been asked to produce and validate data, along with its interpretation of that data. NHTSA will independently verify and analyze all information," the agency said.
Musk has consistently defended Autopilot against criticism from detractors and regulators. Tesla has used Autopilot's ability to update vehicle software over the air to sidestep the traditional vehicle recall process.
Musk has repeatedly promoted Autopilot's capabilities, even though the owner's manual warns otherwise, telling drivers to stay attentive and outlining the technology's limitations. Critics say Tesla has at times misled customers into believing its vehicles can drive themselves. Musk has also continued to roll out a beta version of what Tesla calls "Full Self-Driving" (FSD) via over-the-air software updates.
"Some manufacturers will sell cars however they wish. It is time to rein them in," said NTSB chair Homendy. (Reviewed by Tencent Technology / Jinlu)