Tesla moving ahead on self-driving cars

Technology not yet ready, critics say

Inside the Tesla Model 3, the dashboard is mostly contained in a touchscreen in the center of the front console. There's no information display directly behind the steering wheel. (Washington Post photo by Jhaan Elker)

SAN FRANCISCO -- This week, a group of Tesla drivers was selected to receive a software update that downloaded automatically into their cars, enabling the vehicles to better steer and accelerate without human hands and feet.

According to Tesla, hundreds of thousands of its cars will be able to drive themselves as soon as this year, probably making them the first large fleet of vehicles billed as autonomous to be owned by ordinary consumers.

Tesla is forging ahead despite skepticism among some safety advocates about whether Tesla's technology is ready – and whether the rest of the world is ready for cars that drive themselves. An industry coalition consisting of General Motors' Cruise, Ford, Uber and Waymo, among others, this week criticized the move by Tesla, saying its vehicles are not truly autonomous because they still require an active driver.

Self-driving is lightly regulated in the United States, and Tesla does not need permission to launch the new feature.

A point of contention among Tesla's critics is that the company is moving ahead without a key piece of hardware. Nearly all self-driving car makers have embraced light detection and ranging (lidar) sensors, which are placed on the outside of vehicles and can detect the precise size, shape and depth of objects in real time, even in bad weather.

Instead, Tesla is trying to achieve full self-driving with a suite of cameras and a type of radar that are all constantly connected to an advanced neural network. Tesla's technology can detect vehicles and pedestrians in the road and some objects such as trees, but it cannot always see the true shape or depth of the obstacles it encounters, according to some safety experts. The system might not, for example, be able to distinguish a box truck from a semi as the car approached the rig from behind.

Tesla Chief Executive Officer Elon Musk has decried light detection and ranging as "expensive," redundant and "a fool's errand," calling anyone who relied on it "doomed."

'SLOW AND CAUTIOUS'

Tesla did not respond to requests for comment. The company has said it will not activate full self-driving until it receives regulatory approval, though it remains unknown exactly what certification would be needed. Musk said on Twitter that the self-driving beta rollout would be "extremely slow & cautious, as it should."

Demonstrating the challenges, in one recent update some Tesla cars could detect red lights and stop signs but would not proceed through an intersection until the driver confirmed via the accelerator or a steering-wheel stalk that the light was green, according to Tesla.

"The fundamental challenge of neural nets is achieving sufficient reliability to use in a safety-critical system," said Edward Niedermeyer, communications director for the Partners for Automated Vehicle Education campaign, a coalition of nonprofits seeking to help the public better understand driverless technology.

"I'm puzzled as to where the confidence came from almost four years ago that they'd be able to do this," said Niedermeyer, who wrote the 2019 book "Ludicrous: The Unvarnished Story of Tesla Motors." "The reason you do these things is because it's an extremely hard problem, and it's not realistic to solve this problem with some cameras."

SOFTWARE IS KEY

In essence, Tesla is aiming to compensate for its hardware limitations by supercharging its software, in effect creating virtual light detection and ranging from Tesla's existing suite of cameras, said Eshak Mir, a former Tesla Autopilot engineer who reviewed and worked with data aimed at training Tesla's neural network.

"They're trying to combine all the feeds from the cameras into one full video and label it in real time," Mir said. "With that, you'll be able to pick up a full sense of depth."

There is no true industry hardware standard for a self-driving car. But before Tesla came along, there was little question that a sophisticated sensor in the vein of light detection and ranging was necessary for the redundancy and complex image processing required of self-driving vehicles. Some experts continue to hold that view.

Overcast skies, rain, snowstorms and especially bright sunlight can challenge cameras' perception. "In normal daylight conditions, the cameras work perfectly fine," Mir said.

"Just from my experience, cameras are very dependable, but at the same time there can be a challenge when there's harsh conditions," added Mir, who supports Tesla's current approach.

Tesla on Wednesday posted stronger-than-expected net earnings for the third quarter.

The electric car and solar panel maker said Wednesday that it made $331 million, or 27 cents per share, for its fifth-straight profitable quarter.

Excluding special items such as stock-based compensation, Tesla made 76 cents per share, beating Wall Street estimates of 57 cents. Revenue from July through September was $8.77 billion, also passing analysts' expectations of $6.3 billion, according to FactSet.

But as in previous quarters, the company might have lost money were it not for the $397 million it earned from selling electric vehicle credits to other automakers, which buy them to meet government fuel-economy and pollution regulations.

Information for this article was contributed by staff members of The Associated Press.
