Some drivers have posted videos on YouTube showing that the camera sometimes fails to notice when drivers look away from the road and that it can be fooled if they cover the lens. When the camera notices a Tesla driver looking away from the road, it sounds a warning chime but does not turn Autopilot off.

G.M. and Ford systems use infrared cameras to monitor drivers’ eyes. If drivers look away for more than two or three seconds, warnings remind them to look straight ahead. If drivers fail to comply, the G.M. and Ford systems will shut off and tell drivers to take control of the car.

Ms. Benavides emigrated from Cuba in 2016 and lived with her mother in Miami. She worked at a Walgreens pharmacy and a clothing store while attending community college. An older sister, Neima, 34, who is executor of the estate, said Naibel had been working to improve her English in hopes of getting a college degree.

“She was always laughing and making people laugh,” Neima Benavides said. “Her favorite thing was to go to the beach. She would go almost every day and hang out with friends or just sit by herself and read.”

Neima Benavides said she hoped the lawsuit would prod Tesla into making Autopilot safer. “Maybe something can change so other people don’t have to go through this.”

Ms. Benavides had just started dating Mr. Angulo when they went fishing on Key Largo. That afternoon, she sent her sister a text message indicating she was having a good time. At 9 p.m., Ms. Benavides called her mother from Mr. Angulo’s phone to say she was on the way home. She had lost her phone that day.

On the 911 call, Mr. McGee reported that a man was on the ground, unconscious and bleeding from the mouth. Several times Mr. McGee said, “Oh, my God,” and shouted “Help!” When an emergency operator asked if the man was the only injured person, Mr. McGee replied, “Yes, he’s the only passenger.”

Mr. Angulo was airlifted to a hospital. He later told investigators that he had no recollection of the accident or why they had stopped at the intersection.

An emergency medical technician spotted a woman’s sandal under the Tahoe and called on others to start searching the area for another victim. “Please tell me no,” Mr. McGee can be heard saying in the police video. “Please tell me no.”

Ms. Benavides’s body was found about 25 yards away.

Tesla’s Autopilot Technology Faces Fresh Scrutiny

Tesla faced numerous questions about its Autopilot technology after a Florida driver was killed in 2016 when the system of sensors and cameras failed to see and brake for a tractor-trailer crossing a road.

Now the company is facing more scrutiny than it has in the last five years for Autopilot, which Tesla and its chief executive, Elon Musk, have long maintained makes its cars safer than other vehicles. Federal officials are looking into a series of recent accidents involving Teslas that either were using Autopilot or might have been using it.

The National Highway Traffic Safety Administration confirmed last week that it was investigating 23 such crashes. In one accident this month, a Tesla Model Y rear-ended a police car that had stopped on a highway near Lansing, Mich. The driver, who was not seriously injured, had been using Autopilot, the police said.

In February in Detroit, under circumstances similar to the 2016 Florida accident, a Tesla drove beneath a tractor-trailer that was crossing the road, tearing the roof off the car. The driver and a passenger were seriously injured. Officials have not said whether the driver had turned on Autopilot.

Federal officials are also examining a crash near Houston in which a Tesla ran into a stopped police vehicle on a highway. It is not clear if the driver was using Autopilot. The car did not appear to slow before the impact, the police said.

Autopilot is a computerized system that uses radar and cameras to detect lane markings, other vehicles and objects in the road. It can steer, brake and accelerate automatically with little input from the driver. Tesla has said it should be used only on divided highways, but videos on social media show drivers using Autopilot on various kinds of roads.

“We need to see the results of the investigations first, but these incidents are the latest examples that show these advanced cruise-control features Tesla has are not very good at detecting and then stopping for a vehicle that is stopped in a highway circumstance,” said Jason Levine, executive director of the Center for Auto Safety, a group created in the 1970s by Consumers Union and Ralph Nader.

This renewed scrutiny arrives at a critical time for Tesla. After reaching a record high this year, its share price has fallen about 20 percent amid signs that the company’s electric cars are losing market share to traditional automakers. Ford Motor’s Mustang Mach E and the Volkswagen ID.4 recently arrived in showrooms and are considered serious challengers to the Model Y.

The outcome of the current investigations is important not only for Tesla but for other technology and auto companies that are working on autonomous cars. While Mr. Musk has frequently suggested the widespread use of these vehicles is near, Ford, General Motors and Waymo, a division of Google’s parent, Alphabet, have said that moment could be years or even decades away.

The National Transportation Safety Board concluded that Autopilot “played a major role” in the 2016 Florida accident. It also said the technology lacked safeguards to prevent drivers from taking their hands off the steering wheel or looking away from the road. The safety board reached similar conclusions when it investigated a 2018 accident in California.

By comparison, a similar G.M. system, Super Cruise, monitors a driver’s eyes and switches off if the person looks away from the road for more than a few seconds. That system can be used only on major highways.

In a Feb. 1 letter, the chairman of the National Transportation Safety Board, Robert Sumwalt, criticized NHTSA for not doing more to evaluate Autopilot and require Tesla to add safeguards that prevent drivers from misusing the system.

The new administration in Washington could take a firmer line on safety. The Trump administration did not seek to impose many regulations on autonomous vehicles and sought to ease other rules the auto industry did not like, including fuel-economy standards. By contrast, President Biden has appointed an acting NHTSA administrator, Steven Cliff, who worked at the California Air Resources Board, which frequently clashed with the Trump administration on regulations.

Concerns about Autopilot could dissuade some car buyers from paying Tesla for a more advanced version, Full Self-Driving, which the company sells for $10,000. Many customers have paid for it in the expectation of being able to use it in the future; Tesla made the option operational on about 2,000 cars in a “beta” or test version starting late last year, and Mr. Musk recently said the company would soon make it available to more cars. Full Self-Driving is supposed to be able to operate Tesla cars in cities and on local roads where driving conditions are made more complex by oncoming traffic, intersections, traffic lights, pedestrians and cyclists.

Despite their names, Autopilot and Full Self-Driving have big limitations. Their software and sensors cannot control cars in many situations, which is why drivers have to keep their eyes on the road and hands on or close to the wheel.

In a November letter to California’s Department of Motor Vehicles that recently became public, a Tesla lawyer acknowledged that Full Self-Driving struggled to react to a wide range of driving situations and should not be considered a fully autonomous driving system.

The system is “not capable of recognizing or responding” to certain “circumstances and events,” Eric C. Williams, Tesla’s associate general counsel, wrote. “These include static objects and road debris, emergency vehicles, construction zones, large uncontrolled intersections with multiple incoming ways, occlusions, adverse weather, complicated or adversarial vehicles in the driving paths, unmapped roads.”

Mr. Levine of the Center for Auto Safety has complained to federal regulators that the names Autopilot and Full Self-Driving are misleading at best and could be encouraging some drivers to be reckless.

“Autopilot suggests the car can drive itself and, more importantly, stop itself,” he said. “And they doubled down with Full Self-Driving, and again that leads consumers to believe the vehicle is capable of doing things it is not capable of doing.”
