Tesla Expected to Report Record Earnings for First Quarter

Tesla on Monday is expected to report solid earnings for the first quarter of 2021, a result driven by continuing increases in sales and production around the world.

Analysts expect the automaker to report earnings of about 75 cents per share. That would be a significant increase from the 24 cents per share it reported for the fourth quarter of 2020. A year ago, it earned just 2 cents per share in the first quarter as the coronavirus pandemic hurt sales and forced the shutdown of its plant in Fremont, Calif.

Earlier this month, Tesla said it delivered a record 184,800 cars in the first three months of the year, more than double the total from the comparable period in 2020.

“Tesla continues to see growing pent-up demand throughout China and Europe,” Dan Ives, a Wedbush analyst, wrote in a report to investors. In the United States, the Biden administration’s push to reduce greenhouse gas emissions and support sales of electric vehicles is likely to help sustain demand for Tesla’s cars, Mr. Ives added.

The report comes as the company faces scrutiny over a crash in Texas on Saturday in which two men died after the Tesla Model S they were riding in crashed into a tree on a residential street and burst into flames. Local police said one man was found in the passenger seat and the other in the rear seat, and that no one was at the steering wheel when the crash occurred.

The National Transportation Safety Board and the National Highway Traffic Safety Administration have sent teams to investigate the crash and see whether the men had relied on Autopilot to drive the car. A Texas Republican, Rep. Kevin Brady, has written to Elon Musk, Tesla’s chief executive, urging him to cooperate with safety regulators.

A week ago Mr. Musk posted a message on Twitter saying data from the car “so far” showed Autopilot was not enabled.

Tesla has also come under scrutiny in China, where authorities have looked into reports from consumers about battery fires and sudden acceleration by Tesla vehicles.

Police Investigate Fatal Tesla Crash Near Houston

Federal safety officials and the Texas police are investigating a fatal crash of a Tesla vehicle that had no one behind the wheel, the authorities said Tuesday, as the company comes under heightened scrutiny over its automatic steering and braking system.

The National Transportation Safety Board sent two investigators to Texas on Monday to focus on the vehicle’s operation and a fire that followed the crash on Saturday, said Keith Holloway, a spokesman. The police in Precinct 4 of Harris County, Texas, are also investigating, according to Constable Mark Herman.

Constable investigators were working with the N.T.S.B., the National Highway Traffic Safety Administration and Tesla, which was “helping with our investigation,” Constable Herman said in a statement. “At this time, we will refrain from making any additional statements as the investigation continues to progress,” he said.

On Monday, Elon Musk, Tesla’s chief executive, wrote on Twitter that recovered data logs showed the vehicle had not enabled Autopilot.

But investigators are calling that claim into question as they examine Saturday’s crash and more than 20 other recent accidents in which drivers were, or may have been, using the system. Tesla vehicles are not self-driving — they require “active driver supervision,” the company says on its website — but Autopilot can steer, accelerate and brake automatically within a lane.

In the crash on Saturday night, which occurred north of Houston, physical evidence from the scene and interviews with witnesses led officials to believe that neither of the men was driving, according to Constable Herman.

The vehicle, a 2019 Model S, was moving at a “high rate of speed” around a curve when it veered off the road and hit a tree, Constable Herman said. He also said that it had taken the authorities four hours to put out the fire. The N.T.S.B. said last year in a report that batteries used in electric vehicles can pose safety risks to emergency responders.

Two men, 59 and 69 years old, were killed in the crash. One was in the front passenger seat and one in the rear seat, officials said. “It is very early in the investigation,” said Mr. Holloway, the N.T.S.B. spokesman.

The National Highway Traffic Safety Administration is also looking into a February crash near Houston in which a Tesla ran into a stopped police vehicle on a highway. It was not clear whether the driver was using Autopilot. In another incident in February in Detroit, a Tesla drove beneath a tractor-trailer that was crossing the road, seriously injuring the driver and a passenger. Officials have not said whether the driver had turned on Autopilot.

Earlier on Saturday, Mr. Musk had highlighted a safety report from the company, writing on Twitter that “Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle.”

Tesla did not respond to a request for comment.

Bryan Pietsch contributed reporting.

2 Killed in Driverless Tesla Car Crash, Officials Say

Mitchell Weston, chief investigator at the Harris County Fire Marshal’s Office, said that while the batteries are “generally safe,” impacts at high speeds can result in “thermal runaway,” which causes an “uncontrolled contact” between different materials in the batteries.

Thermal runaway can lead to fires, as well as “battery reignition,” even after an initial fire is put out, the safety board warned in its report. Mitsubishi Electric warns that “thermal runaway can lead to catastrophic results, including fire, explosion, sudden system failure, costly damage to equipment, and possibly personal injury.”

The fire marshal’s office was investigating the fire in the crash, a spokeswoman said. Constable Herman said his department was working with the federal authorities to investigate.

He said that law enforcement officials had been in contact with Tesla on Saturday for “guidance on a few things” but declined to discuss the nature of the conversations.

Tesla, which has disbanded its public relations team, did not respond to a request for comment.

Elon Musk, Tesla’s chief executive, earlier on Saturday had promoted a recent safety report from the company, writing on Twitter that “Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle.”

Tesla, which on its website calls Autopilot the “future of driving,” says the feature allows its vehicles to “steer, accelerate and brake automatically within its lane.” However, it warns that “current Autopilot features require active driver supervision and do not make the vehicle autonomous.”

In 2016, a driver in Florida was killed in a Tesla Model S that was in Autopilot mode and failed to brake for a tractor-trailer that made a left turn in front of it.

Tesla’s Autopilot Technology Faces Fresh Scrutiny

Tesla faced numerous questions about its Autopilot technology after a Florida driver was killed in 2016 when the system of sensors and cameras failed to see and brake for a tractor-trailer crossing a road.

Now the company is facing more scrutiny than it has in the last five years for Autopilot, which Tesla and its chief executive, Elon Musk, have long maintained makes its cars safer than other vehicles. Federal officials are looking into a series of recent accidents involving Teslas that either were using Autopilot or might have been using it.

The National Highway Traffic Safety Administration confirmed last week that it was investigating 23 such crashes. In one accident this month, a Tesla Model Y rear-ended a police car that had stopped on a highway near Lansing, Mich. The driver, who was not seriously injured, had been using Autopilot, the police said.

In February in Detroit, under circumstances similar to the 2016 Florida accident, a Tesla drove beneath a tractor-trailer that was crossing the road, tearing the roof off the car. The driver and a passenger were seriously injured. Officials have not said whether the driver had turned on Autopilot.

The agency is also looking into a February crash near Houston in which a Tesla ran into a stopped police vehicle on a highway. It is not clear if the driver was using Autopilot. The car did not appear to slow before the impact, the police said.

Autopilot is a computerized system that uses radar and cameras to detect lane markings, other vehicles and objects in the road. It can steer, brake and accelerate automatically with little input from the driver. Tesla has said it should be used only on divided highways, but videos on social media show drivers using Autopilot on various kinds of roads.

“We need to see the results of the investigations first, but these incidents are the latest examples that show these advanced cruise-control features Tesla has are not very good at detecting and then stopping for a vehicle that is stopped in a highway circumstance,” said Jason Levine, executive director of the Center for Auto Safety, a group created in the 1970s by Consumers Union and Ralph Nader.

This renewed scrutiny arrives at a critical time for Tesla. After reaching a record high this year, its share price has fallen about 20 percent amid signs that the company’s electric cars are losing market share to traditional automakers. Ford Motor’s Mustang Mach E and the Volkswagen ID.4 recently arrived in showrooms and are considered serious challengers to the Model Y.

The outcome of the current investigations is important not only for Tesla but for other technology and auto companies that are working on autonomous cars. While Mr. Musk has frequently suggested the widespread use of these vehicles is near, Ford, General Motors and Waymo, a division of Google’s parent, Alphabet, have said that moment could be years or even decades away.

The National Transportation Safety Board concluded that Autopilot “played a major role” in the 2016 Florida accident. It also said the technology lacked safeguards to prevent drivers from taking their hands off the steering wheel or looking away from the road. The safety board reached similar conclusions when it investigated a 2018 accident in California.

By comparison, a similar G.M. system, Super Cruise, monitors a driver’s eyes and switches off if the person looks away from the road for more than a few seconds. That system can be used only on major highways.

In a Feb. 1 letter, the chairman of the National Transportation Safety Board, Robert Sumwalt, criticized NHTSA for not doing more to evaluate Autopilot and require Tesla to add safeguards that prevent drivers from misusing the system.

The new administration in Washington could take a firmer line on safety. The Trump administration did not seek to impose many regulations on autonomous vehicles and sought to ease other rules the auto industry did not like, including fuel-economy standards. By contrast, President Biden has appointed an acting NHTSA administrator, Steven Cliff, who worked at the California Air Resources Board, which frequently clashed with the Trump administration on regulations.

Concerns about Autopilot could dissuade some car buyers from paying Tesla for a more advanced version, Full Self-Driving, which the company sells for $10,000. Many customers have paid for it in the expectation of being able to use it in the future; Tesla made the option operational on about 2,000 cars in a “beta” or test version starting late last year, and Mr. Musk recently said the company would soon make it available to more cars. Full Self-Driving is supposed to be able to operate Tesla cars in cities and on local roads, where driving conditions are made more complex by oncoming traffic, intersections, traffic lights, pedestrians and cyclists.

Despite their names, Autopilot and Full Self-Driving have big limitations. Their software and sensors cannot control cars in many situations, which is why drivers have to keep their eyes on the road and hands on or close to the wheel.

In a November letter to California’s Department of Motor Vehicles that recently became public, a Tesla lawyer acknowledged that Full Self-Driving struggled to react to a wide range of driving situations and should not be considered a fully autonomous driving system.

The system is “not capable of recognizing or responding” to certain “circumstances and events,” Eric C. Williams, Tesla’s associate general counsel, wrote. “These include static objects and road debris, emergency vehicles, construction zones, large uncontrolled intersections with multiple incoming ways, occlusions, adverse weather, complicated or adversarial vehicles in the driving paths, unmapped roads.”

Mr. Levine of the Center for Auto Safety has complained to federal regulators that the names Autopilot and Full Self-Driving are misleading at best and could be encouraging some drivers to be reckless.

“Autopilot suggests the car can drive itself and, more importantly, stop itself,” he said. “And they doubled down with Full Self-Driving, and again that leads consumers to believe the vehicle is capable of doing things it is not capable of doing.”

Carmakers Strive to Stay Ahead of Hackers

“Human life is involved, so cybersecurity is our top priority,” said Kevin Tierney, General Motors’ vice president for global cybersecurity. The company, which has 90 engineers working full time on cybersecurity, practices what it calls “defense in depth,” removing unneeded software and creating rules that allow vehicle systems to communicate with one another only when necessary.

It’s a practice also followed by Volkswagen, said Maj-Britt Peters, a spokeswoman for the company’s software and technology group. She noted that Volkswagen’s sensitive vehicle control systems are kept in separate domains.

Continental, a major supplier of electronic parts to automakers, employs an intrusion detection and prevention system to thwart attacks. “If the throttle position sensor is talking to the airbag, that is not planned,” Mr. Smoly said. “We can stop this, but we wouldn’t do so while the vehicle was moving.”

Still, determined hackers will eventually find a way in. To date, vehicle cybersecurity has been a patchwork effort, with no international standards or regulations. But that is about to change.

This year, a United Nations regulation on vehicle cybersecurity came into force, obligating manufacturers to perform various risk assessments and report on intrusion attempts to certify cybersecurity readiness. The regulation will take effect for all vehicles sold in Europe from July 2024 and in Japan and South Korea in 2022.

While the United States is not among the 54 signatories, vehicles sold in America are unlikely to be built to cybersecurity standards different from those of cars sold elsewhere.

“The U.N. regulation is a global standard, and we have to meet global standards,” Mr. Tierney of G.M. said.
