Mondo News

All of the latest tech and science news from all over the world.

Tesla’s Autopilot Technology Faces Fresh Scrutiny

Federal regulators are investigating 23 recent accidents in which drivers were, or may have been, using the automatic steering and braking system.

“We need to see the results of the investigations first, but these incidents are the latest examples that show these advanced cruise-control features Tesla has are not very good at detecting and then stopping for a vehicle that is stopped in a highway circumstance,” said Jason Levine, executive director of the Center for Auto Safety, a group created in the 1970s by Consumers Union and Ralph Nader.

Bryant Walker Smith, a professor at the University of South Carolina who has advised the federal government on automated driving, said it was important to develop advanced technologies to reduce traffic fatalities, which now number about 40,000 a year. But he said he had concerns about Autopilot, and how the name and Tesla’s marketing imply drivers can safely turn their attention away from the road.

“There is an incredible disconnect between what the company and its founder are saying and letting people believe, and what their system is actually capable of,” he said.

Tesla, which disbanded its public relations department and generally does not respond to inquiries from reporters, did not return phone calls or emails seeking comment. Elon Musk, the company's chief executive, did not respond to questions sent to him on Twitter.

The company has not publicly addressed the recent crashes. While it can determine if Autopilot was on at the time of accidents because its cars constantly send data to the company, it has not said if the system was in use.

The company has argued that its cars are very safe, claiming that its own data shows that Teslas are in fewer accidents per mile driven and even fewer when Autopilot is in use. It has also said it tells drivers that they must pay close attention to the road when using Autopilot and should always be ready to retake control of their cars.

A federal investigation of a fatal 2016 crash in Florida found that Autopilot had failed to recognize a white semi trailer against a bright sky, and that the driver had been able to use it even though he wasn't on a highway. Autopilot continued operating the car at 74 miles per hour even as the driver, Joshua Brown, ignored several warnings to keep his hands on the steering wheel.

A second fatal incident took place in Florida in 2019 under similar circumstances — a Tesla crashed into a tractor-trailer when Autopilot was engaged. Investigators determined that the driver had not had his hands on the steering wheel before impact.

By comparison, a similar G.M. system, Super Cruise, monitors a driver’s eyes and switches off if the person looks away from the road for more than a few seconds. That system can be used only on major highways.

The new administration in Washington could take a firmer line on safety. The Trump administration did not seek to impose many regulations on autonomous vehicles and sought to ease other rules the auto industry did not like, including fuel-economy standards. By contrast, President Biden has appointed an acting NHTSA administrator, Steven Cliff, who worked at the California Air Resources Board, which frequently clashed with the Trump administration on regulations.

Despite their names, Autopilot and Full Self-Driving have big limitations. Their software and sensors cannot control cars in many situations, which is why drivers have to keep their eyes on the road and hands on or close to the wheel.

The system is “not capable of recognizing or responding” to certain “circumstances and events,” Eric C. Williams, Tesla’s associate general counsel, wrote. “These include static objects and road debris, emergency vehicles, construction zones, large uncontrolled intersections with multiple incoming ways, occlusions, adverse weather, complicated or adversarial vehicles in the driving paths, unmapped roads.”

Mr. Levine of the Center for Auto Safety has complained to federal regulators that the names Autopilot and Full Self-Driving are misleading at best and could be encouraging some drivers to be reckless.

“Autopilot suggests the car can drive itself and, more importantly, stop itself,” he said. “And they doubled down with Full Self-Driving, and again that leads consumers to believe the vehicle is capable of doing things it is not capable of doing.”

Category: Technology

Source: New York Times
