posted videos on YouTube showing that the camera sometimes fails to notice when drivers look away from the road and that it can be fooled if they cover the lens. When the camera notices a Tesla driver looking away from the road, it sounds a warning chime but does not turn Autopilot off.

G.M. and Ford systems use infrared cameras to monitor drivers’ eyes. If drivers look away for more than two or three seconds, warnings remind them to look straight ahead. If drivers fail to comply, the G.M. and Ford systems will shut off and tell drivers to take control of the car.
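The escalation described above amounts to a simple timer-driven state machine. A minimal sketch, assuming a 10 Hz eye-tracking signal and illustrative two- and five-second thresholds (not the actual G.M. or Ford parameters):

```python
# Sketch of an eye-off-road driver monitor: warn after a sustained glance
# away from the road, then disengage assistance if the driver ignores it.
# Thresholds are illustrative, not the real G.M./Ford values.

WARN_AFTER = 20        # consecutive off-road samples before warning (2 s at 10 Hz)
DISENGAGE_AFTER = 50   # consecutive off-road samples before handing back control (5 s)

def monitor(samples):
    """samples: iterable of booleans (True = eyes on road), sampled at 10 Hz.
    Returns the ordered list of events the system would raise."""
    events = []
    off = 0
    warned = False
    for eyes_on_road in samples:
        if eyes_on_road:
            off, warned = 0, False    # any glance back resets the escalation
            continue
        off += 1
        if off >= WARN_AFTER and not warned:
            events.append("warn")
            warned = True
        if off >= DISENGAGE_AFTER:
            events.append("disengage")
            break
    return events

# A driver who looks away for six seconds gets a chime, then a forced handoff.
print(monitor([True] * 10 + [False] * 60))  # ['warn', 'disengage']
```

A real system would also debounce blinks and factor in speed and road type, but the escalation pattern, warn first and then hand control back, is the core idea.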

Ms. Benavides emigrated from Cuba in 2016 and lived with her mother in Miami. She worked at a Walgreens pharmacy and a clothing store while attending community college. An older sister, Neima, 34, who is executor of the estate, said Naibel had been working to improve her English in hopes of getting a college degree.

“She was always laughing and making people laugh,” Neima Benavides said. “Her favorite thing was to go to the beach. She would go almost every day and hang out with friends or just sit by herself and read.”

Neima Benavides said she hoped the lawsuit would prod Tesla into making Autopilot safer. “Maybe something can change so other people don’t have to go through this.”

Ms. Benavides had just started dating Mr. Angulo when they went fishing on Key Largo. That afternoon, she sent her sister a text message indicating she was having a good time. At 9 p.m., Ms. Benavides called her mother from Mr. Angulo’s phone to say she was on the way home. She had lost her phone that day.

On the 911 call, Mr. McGee reported that a man was on the ground, unconscious and bleeding from the mouth. Several times Mr. McGee said, “Oh, my God,” and shouted “Help!” When an emergency operator asked if the man was the only injured person, Mr. McGee replied, “Yes, he’s the only passenger.”

Mr. Angulo was airlifted to a hospital. He later told investigators that he had no recollection of the accident or why they had stopped at the intersection.

An emergency medical technician spotted a woman’s sandal under the Tahoe and called on others to start searching the area for another victim. “Please tell me no,” Mr. McGee can be heard saying in the police video. “Please tell me no.”

Ms. Benavides’s body was found about 25 yards away.


The Costly Pursuit of Self-Driving Cars Continues On. And On. And On.

It was seven years ago that Waymo discovered that spring blossoms made its self-driving cars get twitchy on the brakes. So did soap bubbles. And road flares.

Years of testing revealed more and more distractions for the driverless cars. Their road skills improved, but matching the competence of human drivers was elusive. The cluttered roads of America, it turned out, were a daunting place for a robot.

The wizards of Silicon Valley said people would be commuting to work in self-driving cars by now. Instead, there have been court fights, injuries and deaths, and tens of billions of dollars spent on a frustratingly fickle technology that some researchers say is still years from becoming the industry’s next big thing.

Now the pursuit of autonomous cars is undergoing a reset. Companies like Uber and Lyft, worried about blowing through their cash in pursuit of autonomous technology, have tapped out. Only the most deep-pocketed outfits are managing to stay in the game: Waymo, which is a subsidiary of Google’s parent company, Alphabet; auto industry giants; and a handful of start-ups.

Elon Musk, Tesla’s chief executive, said that fully functional self-driving cars were just two years away. More than five years later, Tesla cars offered simpler autonomy designed solely for highway driving. Even that has been tinged with controversy after several fatal crashes (which the company blamed on misuse of the technology).

Perhaps no company experienced the turbulence of driverless car development more than Uber. After poaching 40 robotics experts from Carnegie Mellon University and acquiring a self-driving truck start-up for $680 million in stock, the ride-hailing company settled a lawsuit from Waymo, which was followed by a guilty plea from a former executive accused of stealing intellectual property. A pedestrian in Arizona was also killed in a crash with one of its driverless cars. In the end, Uber essentially paid Aurora to acquire its self-driving unit.

But for the most deep-pocketed companies, the science, they hope, continues to advance one improved ride at a time. In October, Waymo reached a notable milestone: It launched the world’s first “fully autonomous” taxi service. In the suburbs of Phoenix, Ariz., anyone can now ride in a minivan with no driver behind the wheel. But that does not mean the company will immediately deploy its technology in other parts of the country.

Dmitri Dolgov, who recently took over as Waymo’s co-chief executive after the departure of John Krafcik, an automobile industry veteran, said the company considers its Arizona service a test case. Based on what it has learned in Arizona, he said, Waymo is building a new version of its self-driving technology that it will eventually deploy in other geographies and other kinds of vehicles, including long-haul trucks.

The suburbs of Phoenix are particularly well suited to driverless cars. Streets are wide, pedestrians are few and there is almost no rain or snow. Waymo supports its autonomous vehicles with remote technicians and roadside assistance crews who can help get cars out of a tight spot, either via the internet or in person.

“Autonomous vehicles can be deployed today, in certain situations,” said Elliot Katz, a former lawyer who counseled many of the big autonomous vehicle companies before launching a start-up, Phantom Auto, that provides software for remotely assisting and operating self-driving vehicles when they get stuck in difficult positions. “But you still need a human in the loop.”

Self-driving technology is not yet nimble enough to reliably handle the variety of situations human drivers encounter each day. Autonomous cars can usually handle suburban Phoenix, but they can’t duplicate the human chutzpah needed for merging into the Lincoln Tunnel in New York or dashing for an offramp on Highway 101 in Los Angeles.

“You have to peel back every layer before you can see the next layer” of challenges for the technology, said Nathaniel Fairfield, a Waymo software engineer who has worked on the project since 2009, in describing some of the distractions faced by the cars. “Your car has to be pretty good at driving before you can really get it into the situations where it handles the next most challenging thing.”

Like Waymo, Aurora is now developing autonomous trucks as well as passenger vehicles. No company has deployed trucks without safety drivers behind the wheel, but Chris Urmson, Aurora’s chief executive, and others argue that autonomous trucks will make it to market faster than anything designed to transport regular consumers.

Long-haul trucking does not involve passengers who might not be forgiving of twitchy brakes. The routes are also simpler. Once you master one stretch of highway, Mr. Urmson said, it is easier to master another. But even driving down a long, relatively straight highway is extraordinarily difficult. Delivering dinner orders across a small neighborhood is an even greater challenge.

“This is one of the biggest technical challenges of our generation,” said Dave Ferguson, another early engineer on the Google team who is now president of Nuro, a company focused on delivering groceries, pizzas and other goods.

Mr. Ferguson said that many thought self-driving technology would improve like an internet service or a smartphone app. But robotics has proved a lot more challenging, and expecting otherwise, he said, was a mistake.

“If you look at almost every industry that is trying to solve really, really difficult technical challenges, the folks that tend to be involved are a little bit crazy and a little bit optimistic,” he said. “You need to have that optimism to get up every day and bang your head against the wall to try to solve a problem that has never been solved, and it’s not guaranteed that it ever will be solved.”

Uber and Lyft aren’t entirely giving up on driverless cars. Even though it may not help the bottom line for a long time, they still want to deploy autonomous vehicles by partnering with the companies that are still working on the technology. Lyft now says autonomous rides could arrive by 2023.

“These cars will be able to operate on a limited set of streets under a limited set of weather conditions at certain speeds,” said Jody Kelman, an executive at Lyft. “We will very safely be able to deploy these cars, but they won’t be able to go that many places.”


The Robot Surgeon Will See You Now

Sitting on a stool several feet from a long-armed robot, Dr. Danyal Fer wrapped his fingers around two metal handles near his chest.

As he moved the handles — up and down, left and right — the robot mimicked each small motion with its own two arms. Then, when he pinched his thumb and forefinger together, one of the robot’s tiny claws did much the same. This is how surgeons like Dr. Fer have long used robots when operating on patients. They can remove a prostate from a patient while sitting at a computer console across the room.

But after this brief demonstration, Dr. Fer and his fellow researchers at the University of California, Berkeley, showed how they hope to advance the state of the art. Dr. Fer let go of the handles, and a new kind of computer software took over. As he and the other researchers looked on, the robot started to move entirely on its own.

With one claw, the machine lifted a tiny plastic ring from an equally tiny peg on the table, passed the ring from one claw to the other, moved it across the table and gingerly hooked it onto a new peg. Then the robot did the same with several more rings, completing the task as quickly as it had when guided by Dr. Fer.

Transferring rings between pegs is a standard exercise used to test how surgeons learn to operate robots like the one in Berkeley. Now, an automated robot performing the test can match or even exceed a human in dexterity, precision and speed, according to a new research paper from the Berkeley team.

The project is a part of a much wider effort to bring artificial intelligence into the operating room. Using many of the same technologies that underpin self-driving cars, autonomous drones and warehouse robots, researchers are working to automate surgical robots too. These methods are still a long way from everyday use, but progress is accelerating.

The goal is to assist surgeons where there is room for improvement, by automating particular phases of surgery.

In recent years, advances in artificial intelligence have significantly improved the power of computer vision, which could allow robots to perform surgical tasks on their own, without such markers.

The change is driven by what are called neural networks, mathematical systems that can learn skills by analyzing vast amounts of data. By analyzing thousands of cat photos, for instance, a neural network can learn to recognize a cat. In much the same way, a neural network can learn from images captured by surgical robots.
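The learning process can be illustrated with a toy single-neuron classifier, the simplest possible "neural network." This is a deliberately minimal sketch: the two-dimensional synthetic points stand in for image data, and the real systems described here use deep networks with millions of parameters.

```python
import numpy as np

# Toy single-neuron classifier trained by gradient descent. The synthetic
# 2-D points stand in for image data; one cluster plays the role of "cat"
# photos, the other "not cat." Real systems use deep networks over pixels.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2, 1, (100, 2)),    # class 1, clustered near (2, 2)
               rng.normal(-2, 1, (100, 2))])  # class 0, clustered near (-2, -2)
y = np.array([1] * 100 + [0] * 100)

w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(200):
    p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted probability of class 1
    w -= lr * X.T @ (p - y) / len(y)     # nudge weights down the loss gradient
    b -= lr * np.mean(p - y)

preds = (1 / (1 + np.exp(-(X @ w + b))) > 0.5).astype(int)
print("training accuracy:", np.mean(preds == y))   # close to 1.0
```

The "skill" here lives entirely in the learned numbers `w` and `b`; the same loop, scaled up enormously, is how a network learns from thousands of labeled surgical images.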

Researchers at Worcester Polytechnic Institute are working to give robots a role in tasks like inserting a needle for a cancer biopsy or burning into the brain to remove a tumor.

“It is like a car where the lane-following is autonomous but you still control the gas and the brake,” said Greg Fischer, one of the Worcester researchers.

Many obstacles lie ahead, scientists note. Moving plastic pegs is one thing; cutting, moving and suturing flesh is another. “What happens when the camera angle changes?” said Ann Majewicz Fey, an associate professor at the University of Texas, Austin. “What happens when smoke gets in the way?”

For the foreseeable future, automation will be something that works alongside surgeons rather than replaces them. But even that could have profound effects, Dr. Fer said. For instance, doctors could perform surgery across distances far greater than the width of the operating room — from miles or more away, perhaps, helping wounded soldiers on distant battlefields.

The signal lag is too great to make that possible currently. But if a robot could handle at least some of the tasks on its own, long-distance surgery could become viable, Dr. Fer said: “You could send a high-level plan and then the robot could carry it out.”

The same technology would be essential to remote surgery across even longer distances. “When we start operating on people on the moon,” he said, “surgeons will need entirely new tools.”


Police Investigate Fatal Tesla Crash Near Houston

Federal safety officials and the Texas police are investigating a fatal crash of a Tesla vehicle that had no one behind the wheel, the authorities said Tuesday, as the company comes under heightened scrutiny over its automatic steering and braking system.

The National Transportation Safety Board sent two investigators to Texas on Monday to focus on the vehicle’s operation and a fire that followed the crash on Saturday, said Keith Holloway, a spokesman. The police in Precinct 4 of Harris County, Texas, are also investigating, according to Constable Mark Herman.

Constable investigators were working with the N.T.S.B., the National Highway Traffic Safety Administration and Tesla, which was “helping with our investigation,” Constable Herman said in a statement. “At this time, we will refrain from making any additional statements as the investigation continues to progress,” he said.

On Monday, Elon Musk, Tesla’s chief executive, wrote on Twitter that recovered data logs showed the vehicle had not enabled Autopilot.

Investigators are calling that claim into question as they investigate Saturday’s crash and more than 20 other recent accidents in which drivers were, or may have been, using the system. Tesla vehicles are not self-driving — they require “active driver supervision,” the company says on its website — but Autopilot can steer, accelerate and brake automatically within a lane.

In the crash on Saturday night, which occurred north of Houston, physical evidence from the scene and interviews with witnesses led officials to believe that neither of the men was driving, according to Constable Herman.

The vehicle, a 2019 Model S, was moving at a “high rate of speed” around a curve when it veered off the road and hit a tree, Constable Herman said. He also said that it had taken the authorities four hours to put out the fire. The N.T.S.B. said last year in a report that batteries used in electric vehicles can pose safety risks to emergency responders.

Two men, 59 and 69 years old, were killed in the crash. One was in the front passenger seat and one in the rear seat, officials said. “It is very early in the investigation,” said Mr. Holloway, the N.T.S.B. spokesman.

The National Highway Traffic Safety Administration is also looking into a February crash near Houston in which a Tesla ran into a stopped police vehicle on a highway. It was not clear whether the driver was using Autopilot. In another incident in February in Detroit, a Tesla drove beneath a tractor-trailer that was crossing the road, seriously injuring the driver and a passenger. Officials have not said whether the driver had turned on Autopilot.

Mr. Musk had earlier highlighted a safety report from the company, writing on Twitter that “Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle.”

Tesla did not respond to a request for comment.

Bryan Pietsch contributed reporting.


2 Killed in Driverless Tesla Car Crash, Officials Say

Mitchell Weston, chief investigator at the Harris County Fire Marshal’s Office, said that while the batteries are “generally safe,” impacts at high speeds can result in “thermal runaway,” which causes an “uncontrolled contact” between different materials in the batteries.

Thermal runaway can lead to fires, as well as “battery reignition,” even after an initial fire is put out, the safety board warned in its report. Mitsubishi Electric warns that “thermal runaway can lead to catastrophic results, including fire, explosion, sudden system failure, costly damage to equipment, and possibly personal injury.”

The fire marshal’s office was investigating the fire in the crash, a spokeswoman said. Constable Herman said his department was working with the federal authorities to investigate.

He said that law enforcement officials had been in contact with Tesla on Saturday for “guidance on a few things” but declined to discuss the nature of the conversations.

Tesla, which has disbanded its public relations team, did not respond to a request for comment.

Elon Musk, Tesla’s chief executive, earlier on Saturday had promoted a recent safety report from the company, writing on Twitter that “Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle.”

Tesla, which on its website calls Autopilot the “future of driving,” says the feature allows its vehicles to “steer, accelerate and brake automatically within its lane.” However, it warns that “current Autopilot features require active driver supervision and do not make the vehicle autonomous.”

In 2016, a driver in Florida was killed in a Tesla Model S that was in Autopilot mode and failed to brake for a tractor-trailer that made a left turn in front of it.


Shares of TuSimple, a developer of autonomous trucks, fall sharply after I.P.O. before recovering.

Shares in TuSimple, a developer of autonomous trucks that is backed by Volkswagen and UPS, fell sharply on Thursday after its initial public offering, suggesting that investors have doubts about the company’s promise of putting its technology on the road by 2024.

The start-up, which is based in San Diego, raised more than $1 billion in an I.P.O. that valued it at nearly $8.5 billion. Shares started trading on the Nasdaq under the symbol TSP at $40 each around noon, but quickly fell as much as 19 percent before recovering those losses by the time the market closed.

TuSimple and other companies working on autonomous vehicles believe that long-haul trucks are particularly suited for self-driving technology. Routes along highways that trucks travel repeatedly are easier to map and present fewer challenges than local roads, where self-driving systems have to deal with unpredictable stop-and-go traffic, pedestrians and cyclists.

TuSimple’s self-driving technology relies on several sensors but is centered on long-range cameras, which it says can locate objects to within five centimeters and see as far as 1,000 meters. The company has a fleet of about 70 trucks, with 50 in the United States and 20 in Europe and Asia. As of late March, the company said it had more than 5,700 reservations for vehicles, which typically require a deposit of just $500.

Walmart said it was investing in Cruise, the autonomous vehicle division of General Motors.


The C.E.O. of the self-driving car company Waymo will step down after more than 5 years.

Waymo, the autonomous car unit of Google’s parent company, Alphabet, said John Krafcik is stepping down as chief executive after five and a half years at the helm.

In a statement, Waymo said the chief executive duties will be divided between two current company executives — Tekedra Mawakana and Dmitri Dolgov. Ms. Mawakana was Waymo’s chief operating officer, and Mr. Dolgov was the company’s chief technology officer before the promotion.

In a blog post announcing the move, Mr. Krafcik, 59, did not give a reason for stepping down now, other than to say he was pursuing “new adventures.” Waymo said it was Mr. Krafcik’s decision and that he plans to remain an adviser to the company.

Mr. Krafcik, a longtime auto industry executive who oversaw Hyundai Motor’s U.S. operations, joined Waymo in 2015 when it was still part of Google. During his tenure, Google spun out Waymo into a separate subsidiary of Alphabet, and the company raised more than $3 billion from outside investors in a move that signaled a greater independence from its parent company.

Google and Waymo have pursued self-driving car technology for more than a decade. Waymo has launched its own autonomous taxi service in the greater Phoenix area called Waymo One, and the company has struck partnerships with a handful of car manufacturers, including Volvo and Jaguar Land Rover, to build its self-driving technology into their vehicles.

Ms. Mawakana joined Waymo four years ago as the global head of policy and has been the company’s operating chief for the last two years. Before that she worked in policy positions at eBay, Yahoo and AOL.

Mr. Dolgov is one of the original employees who started Google’s self-driving car project in 2009 and is considered one of the leading technical experts in autonomous vehicle technology.


Tesla’s Autopilot Technology Faces Fresh Scrutiny

Tesla faced numerous questions about its Autopilot technology after a Florida driver was killed in 2016 when the system of sensors and cameras failed to see and brake for a tractor-trailer crossing a road.

Now the company is facing more scrutiny over Autopilot than at any time in the last five years. Tesla and its chief executive, Elon Musk, have long maintained that the system makes its cars safer than other vehicles, but federal officials are looking into a series of recent accidents involving Teslas that either were using Autopilot or might have been using it.

The National Highway Traffic Safety Administration confirmed last week that it was investigating 23 such crashes. In one accident this month, a Tesla Model Y rear-ended a police car that had stopped on a highway near Lansing, Mich. The driver, who was not seriously injured, had been using Autopilot, the police said.

In February in Detroit, under circumstances similar to the 2016 Florida accident, a Tesla drove beneath a tractor-trailer that was crossing the road, tearing the roof off the car. The driver and a passenger were seriously injured. Officials have not said whether the driver had turned on Autopilot.

The agency is also looking into a February crash near Houston in which a Tesla ran into a stopped police vehicle on a highway. It is not clear if the driver was using Autopilot. The car did not appear to slow before the impact, the police said.

Autopilot is a computerized system that uses radar and cameras to detect lane markings, other vehicles and objects in the road. It can steer, brake and accelerate automatically with little input from the driver. Tesla has said it should be used only on divided highways, but videos on social media show drivers using Autopilot on various kinds of roads.
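At its core, the lane-centering behavior described here is feedback control. The sketch below is a generic proportional-derivative steering loop; the gains, limits and toy vehicle dynamics are invented for illustration, and none of it is Tesla's implementation.

```python
# Generic lane-centering sketch (not Tesla's code): a proportional-derivative
# controller computes a steering command from the car's lateral offset, as a
# camera-based perception system might report it. All gains are invented.

def steering_command(offset_m, offset_rate, kp=1.0, kd=2.0, limit=5.0):
    """Steering command that nudges the car back toward the lane center.
    Positive offset means the car has drifted right; output is clamped."""
    cmd = -(kp * offset_m + kd * offset_rate)
    return max(-limit, min(limit, cmd))

# Toy simulation: a car starts 1 m right of center; the loop steers it back.
offset, rate, dt = 1.0, 0.0, 0.1
for _ in range(100):                 # 10 seconds at 10 Hz
    cmd = steering_command(offset, rate)
    rate += cmd * dt                 # toy dynamics: command changes drift rate
    offset += rate * dt

print(round(offset, 4))              # essentially back at the lane center
```

The hard part in practice is perception, reliably extracting the lane offset and obstacles from camera and radar data; the control loop itself is comparatively simple, which is one reason stationary objects that perception misses are so dangerous.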

“We need to see the results of the investigations first, but these incidents are the latest examples that show these advanced cruise-control features Tesla has are not very good at detecting and then stopping for a vehicle that is stopped in a highway circumstance,” said Jason Levine, executive director of the Center for Auto Safety, a group created in the 1970s by Consumers Union and Ralph Nader.

This renewed scrutiny arrives at a critical time for Tesla. After reaching a record high this year, its share price has fallen about 20 percent amid signs that the company’s electric cars are losing market share to traditional automakers. Ford Motor’s Mustang Mach E and the Volkswagen ID.4 recently arrived in showrooms and are considered serious challengers to the Model Y.

The outcome of the current investigations is important not only for Tesla but for other technology and auto companies that are working on autonomous cars. While Mr. Musk has frequently suggested the widespread use of these vehicles is near, Ford, General Motors and Waymo, a division of Google’s parent, Alphabet, have said that moment could be years or even decades away.

The National Transportation Safety Board concluded that Autopilot “played a major role” in the 2016 Florida accident. It also said the technology lacked safeguards to prevent drivers from taking their hands off the steering wheel or looking away from the road. The safety board reached similar conclusions when it investigated a 2018 accident in California.

By comparison, a similar G.M. system, Super Cruise, monitors a driver’s eyes and switches off if the person looks away from the road for more than a few seconds. That system can be used only on major highways.

In a Feb. 1 letter, the chairman of the National Transportation Safety Board, Robert Sumwalt, criticized NHTSA for not doing more to evaluate Autopilot and require Tesla to add safeguards that prevent drivers from misusing the system.

The new administration in Washington could take a firmer line on safety. The Trump administration did not seek to impose many regulations on autonomous vehicles and sought to ease other rules the auto industry did not like, including fuel-economy standards. By contrast, President Biden has appointed an acting NHTSA administrator, Steven Cliff, who worked at the California Air Resources Board, which frequently clashed with the Trump administration on regulations.

Concerns about Autopilot could dissuade some car buyers from paying Tesla for a more advanced version, Full Self-Driving, which the company sells for $10,000. Many customers have paid for it in the expectation of being able to use it in the future; Tesla made the option operational on about 2,000 cars in a “beta” or test version starting late last year, and Mr. Musk recently said the company would soon make it available to more cars. Full Self-Driving is supposed to be able to operate Tesla cars in cities and on local roads where driving conditions are made more complex by oncoming traffic, intersections, traffic lights, pedestrians and cyclists.

Despite their names, Autopilot and Full Self-Driving have big limitations. Their software and sensors cannot control cars in many situations, which is why drivers have to keep their eyes on the road and hands on or close to the wheel.

In a November letter to California’s Department of Motor Vehicles that recently became public, a Tesla lawyer acknowledged that Full Self-Driving struggled to react to a wide range of driving situations and should not be considered a fully autonomous driving system.

The system is “not capable of recognizing or responding” to certain “circumstances and events,” Eric C. Williams, Tesla’s associate general counsel, wrote. “These include static objects and road debris, emergency vehicles, construction zones, large uncontrolled intersections with multiple incoming ways, occlusions, adverse weather, complicated or adversarial vehicles in the driving paths, unmapped roads.”

Mr. Levine of the Center for Auto Safety has complained to federal regulators that the names Autopilot and Full Self-Driving are misleading at best and could be encouraging some drivers to be reckless.

“Autopilot suggests the car can drive itself and, more importantly, stop itself,” he said. “And they doubled down with Full Self-Driving, and again that leads consumers to believe the vehicle is capable of doing things it is not capable of doing.”
