
‘Full Self-Driving’ clips show owners of Teslas fighting for control, and experts see deep flaws

The Washington Post verified footage posted by beta testers and had it reviewed by a panel of experts

Tesla estimates there are 60,000 of its cars, like this Model Y, using its "Full Self-Driving" beta software on public roads today. (TWP)
February 10, 2022 | Updated February 10, 2022 at 9:57 a.m. EST

SAN FRANCISCO — In one video, a Tesla tries to drive down some light-rail tracks. In another, a Tesla fails to stop for a pedestrian in a crosswalk. And at one point, the most advanced driver-assistance product available to consumers appears to slam into a bike lane bollard at 11 mph.

Each of these moments — captured on video by a Tesla owner and posted online — reveals a fundamental weakness in Tesla’s “Full Self-Driving” technology, according to a panel of experts assembled by The Washington Post and asked to examine the videos. These are problems with no easy fix, the experts said, where patching one issue might introduce new complications, or where the nearly infinite array of possible real-life scenarios is simply too much for Tesla’s algorithms to master.


The footage includes a scene in which a driver appears to be fighting for control with the advanced driver-assistance software, as well as clips showing cars failing to properly interpret critical road markings, signs and ordinary pedestrian behavior.

The Post asked experts to analyze videos of Tesla beta software, and reporters Faiz Siddiqui and Reed Albergotti tested the car’s performance firsthand. (Jonathan Baran/The Washington Post)

The Post selected six videos from a large array posted on YouTube and contacted the people who shot them to confirm their authenticity. The Post then recruited a half-dozen experts to conduct a frame-by-frame analysis.

The experts include academics who study self-driving vehicles; industry executives and technical staff who work in autonomous-vehicle safety analysis; and self-driving vehicle developers. None work in capacities that put them in competition with Tesla, and several said they did not fault Tesla for its approach. Two spoke on the condition of anonymity to avoid angering Tesla, its fans or future clients.

Their analysis suggests that, as currently designed, Full Self-Driving (FSD) could be dangerous on public roadways, according to several of the experts. Some defects appear to plague multiple versions of Tesla’s software, such as an inability to recognize light-rail tracks: One video shows a driver shifting into reverse after traveling too far onto the tracks.

(AI Addict)

“The video [footage] shows different scenarios where the automated driving system was not able to detect and/or cope with relevant features of its Operational Design Domain,” or the conditions under which the system is expected to safely operate, said Nicola Croce, technical program manager at Deepen AI, which helps companies deploy driver-assistance and autonomous-driving systems. Tesla is not one of its clients.

Lapses within the design domain, Croce said, are “considered a failure to follow the safety expectations.”

Tesla did not respond to repeated requests for comment. The company disbanded its public relations department in 2020 and does not typically answer media requests.

(Dirty Tesla)

Several drivers who spoke to The Post about the videos defended the technology. While they acknowledged that miscues happen, they said they were able to safely “disengage” the software before a more serious incident occurred.

“I’m not going to put anybody in danger. I’m mindful of the cars around me,” said Chris, a driver from Fenton, Mich., who spoke on condition that he be identified only by his first name out of concern for his privacy.

Full Self-Driving is one of two driver-assistance technologies available on Teslas. The other is “Autopilot,” a system primarily designed for highway use with an attentive driver behind the wheel.

When using Autopilot and Full Self-Driving, drivers must agree to “keep your hands on the steering wheel at all times” and always “maintain control and responsibility for your car,” according to Tesla’s website.

The company has fiercely defended the safety record of Autopilot, with chief executive Elon Musk calling it “unequivocally safer” than regular driving based on crash data. However, the National Highway Traffic Safety Administration is investigating whether Autopilot played a role in about a dozen crashes involving parked emergency vehicles. Last fall, a California driver was charged with vehicular manslaughter after striking another vehicle while Autopilot was activated, killing two people.

But the company has staked its autonomy ambitions on FSD, which brings automated capabilities to city and residential streets. FSD is only available in the form of a software beta, a type of pilot that serves as an advanced testing stage before eventual wide release. Tesla recently said that nearly 60,000 vehicles in the United States are now equipped with it.

(AI Addict)

Full Self-Driving uses Tesla’s suite of eight surround cameras to stitch together a view of the world outside the car. The images are fed into Tesla’s software, which the company intends to leverage to help its vehicles learn. The cameras are supplemented by 12 ultrasonic sensors that detect objects around the vehicle.
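As a rough illustration of that sensor layout, the Python sketch below merges longer-range, labeled camera detections with short-range ultrasonic readings into a single obstacle list. The field names and values are hypothetical stand-ins for illustration; nothing here reflects Tesla’s actual software.

```python
from dataclasses import dataclass, field


@dataclass
class SensorSuite:
    # Camera detections: longer-range, labeled objects (illustrative values)
    camera_detections: list = field(default_factory=list)
    # Twelve ultrasonic readings in meters, one per sensor position
    ultrasonic_ranges: list = field(default_factory=list)

    def nearby_obstacles(self, ultrasonic_max_m=2.0):
        """Merge labeled camera detections with short-range ultrasonic hits."""
        obstacles = list(self.camera_detections)
        obstacles += [
            {"label": "unknown", "sensor_index": i, "range_m": r}
            for i, r in enumerate(self.ultrasonic_ranges)
            if r < ultrasonic_max_m
        ]
        return obstacles


suite = SensorSuite(
    camera_detections=[{"label": "pedestrian", "range_m": 25.0}],
    ultrasonic_ranges=[5.0] * 11 + [0.8],  # one sensor reports something close
)
print(suite.nearby_obstacles())
```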

Tesla has issued multiple recalls of the Full Self-Driving Beta, sending remote updates after the software raised concerns with federal auto safety regulators. In October, the company recalled the software for about 12,000 vehicles after an update led cars to begin braking abruptly at highway speeds. Tesla remotely issued a fix.

In late January, Tesla notified regulators it would update the Full Self-Driving Beta to eliminate a “rolling stop” function that allowed cars to proceed through stop signs without fully halting. Last week, The Post reported that owner complaints of unexpected braking, a phenomenon known as “phantom braking,” surged in the period after Tesla eliminated the use of radar to aid its vehicles’ perception.


To further understand how the technology operates, The Post turned to videos showing the system in action. In interviews, most of the drivers who posted the videos said they did so to showcase the car’s cutting-edge capabilities. The car’s mistakes, they said, serve not as evidence of insurmountable limitations, but instead as mile markers to document progress.

(Marc Hoag)

Some drivers said they have run their own experiments to test and improve the software. Kevin Smith, who uses FSD on his Tesla Model Y in Murfreesboro, Tenn., said he identified 13 locations near his hometown that stumped his car and created a route that hit all of them. “Each time, it gets a little bit better,” he said.

While some experts in AI are critical of Tesla’s decision to release Full Self-Driving before it’s ready for the road, many say they appreciate the ability to analyze and learn from videos posted by Tesla drivers. Most show the screen in the car’s center console, offering clues about how the software is interpreting data from the real world.

“The value [of Tesla’s experiment] to society, I think, is transparency,” said Mohammad Musa, founder of Deepen AI.

“Whatever you see from anyone else is what they want you to see,” he said of Tesla’s competitors. “It might actually fire back at [Tesla] and become a PR nightmare. … For better or for worse, they are opening up about things.”

(AI Addict)
First recorded crash

On a clear day in early February, a Tesla in FSD Beta makes a right turn through a San Jose intersection at about 15 mph. A bike lane flanks the inner side of the road. Suddenly, the car approaches a set of green-and-white protective bollards at a sharp angle.

It slams into the first bollard after the crosswalk at about 11 mph.

The car suffered only minor scrapes, but FSD testers and experts who analyzed the video say it is the first publicly released footage of a crash involving the software. And it revealed flaws.

“The bollard issue is both mapping and perception. As permanent bollards rather than temporary cones, they should be on a map,” said Brad Templeton, a longtime self-driving-car developer and consultant who worked on Google’s self-driving car. That way, he said, “the car would know that nobody ever drives through these.”

“As to why the perception missed them until too late, this is an issue with computer vision. Perhaps it never got trained on these unusually shaped and [colored] bollards,” said Templeton, who owns a Tesla and has described himself as a “fan.”

Tesla’s ultrasonic sensors might be expected to detect such hazards, but their locations in places such as the front bumper can be a weakness. “Sparse, thin things like posts may not be seen by these,” Templeton said.
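A minimal sketch of the map cross-check Templeton describes might look like the following Python, in which a planned turn is rejected if any waypoint passes too close to a mapped permanent obstacle. The coordinates, names and clearance value are assumptions for illustration only, not anyone’s production code.

```python
from dataclasses import dataclass
from math import hypot


@dataclass
class Obstacle:
    x: float          # map coordinates, in meters
    y: float
    radius: float     # physical footprint
    permanent: bool   # bollards and posts: True; temporary cones: False


def path_is_clear(path_points, mapped_obstacles, clearance=0.5):
    """Return False if any waypoint passes within `clearance` meters
    of a permanent, mapped obstacle."""
    for px, py in path_points:
        for obs in mapped_obstacles:
            if obs.permanent and hypot(px - obs.x, py - obs.y) < obs.radius + clearance:
                return False
    return True


# A right-turn path that clips a mapped bollard would be rejected before
# perception ever has to spot the post in camera frames.
bollard = Obstacle(x=12.0, y=3.5, radius=0.15, permanent=True)
turn_path = [(10.0, 2.0), (11.8, 3.2), (13.0, 4.5)]
print(path_is_clear(turn_path, [bollard]))  # False -> replan the turn
```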

(AI Addict)
A pedestrian near miss

On an overcast December day in San Jose, one video shows a routine right turn at a green light leading to a close call with a pedestrian. Traveling at about 12 miles an hour, the Tesla is proceeding across light-rail tracks when a woman steps off the sidewalk and into the crosswalk.

The woman stops abruptly when she sees the Tesla heading toward her. The Tesla appears to slow down, but only after traveling through most of the crosswalk.

After analyzing the video and others like it, The Post’s panel of experts said FSD does not appear to recognize pedestrian walk signs, or anticipate that a stationary pedestrian might venture into the street.

“It’s unclear whether the car reacted or not to [the pedestrian’s] presence, but clearly the driver is shaken,” said Andrew Maynard, a professor at Arizona State University, who is director of its Risk Innovation Lab.

The driver, who confirmed the veracity of the footage, declined to comment further.

Hod Finkelstein, chief research and development officer for AEye, a company that sells lidar technology to automakers, said he does not believe cameras alone are good enough to detect pedestrian intent in all conditions, in part because they aren’t good at measuring the distance of faraway objects and can be blinded by car headlights and the sun. Traditional manufacturers of autonomous vehicles have used a combination of cameras, lidar, traditional radar and even ultrasonic sensors for close range.


That the Tesla keeps going after seeing a pedestrian near a crosswalk offers insight into the type of software Tesla uses, known as “machine learning.” This type of software is capable of deciphering large sets of data and forming correlations that allow it, in essence, to learn on its own.

Tesla’s software uses a combination of machine-learning software and simpler software “rules,” such as “always stop at stop signs and red lights.” But as one researcher pointed out, machine-learning algorithms invariably learn lessons they shouldn’t. It’s possible that if the software were told to “never hit pedestrians,” it could take away the wrong lesson: that pedestrians will move out of the way if they are about to be hit, one expert said.

Software developers could create a “rule” that the car must slow down or stop for pedestrians. But that fix could paralyze the software in urban environments, where pedestrians are everywhere.
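To make the trade-off concrete, here is a hedged Python sketch of that hybrid approach: a learned planner proposes a speed, and a hand-written rule caps it when a pedestrian is near the planned path. Every name and threshold is a hypothetical stand-in, not Tesla’s implementation; the point is that widening the rule’s definition of “near” helps at crosswalks but risks stalling the car on crowded streets.

```python
def learned_policy(scene):
    """Stand-in for a machine-learned planner: returns a target speed (m/s)."""
    return scene.get("suggested_speed", 5.0)


def pedestrian_rule(scene, slow_speed=1.5):
    """Hard rule: cap speed when a pedestrian is near the planned path.
    Too narrow a definition of 'near' misses the crosswalk case;
    too broad a definition stalls the car on a busy city block."""
    near = [p for p in scene["pedestrians"] if p["distance_to_path_m"] < 3.0]
    return slow_speed if near else None


def choose_speed(scene):
    cap = pedestrian_rule(scene)
    speed = learned_policy(scene)
    return min(speed, cap) if cap is not None else speed


scene = {
    "suggested_speed": 5.4,  # roughly 12 mph
    "pedestrians": [{"distance_to_path_m": 2.0}],  # stepping into the crosswalk
}
print(choose_speed(scene))  # 1.5 -> the hand-written rule overrides the learned plan
```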

Maynard said the early-February crash with a bollard may reveal characteristics of how Tesla’s system learns.

“[It] shows that FSD beta is still fazed by edge cases that it hasn’t learned to navigate, yet most human drivers would handle with ease,” he said. “One question it raises is whether Tesla are teaching FSD by brute force — exposing the algorithms to every conceivable scenario — or whether they are teaching it to learn and problem solve like a human driver. The latter is what makes humans so adaptable on the road, and yet is exceptionally hard to emulate in a machine.”

(AI Addict)
Optical illusions

In another clip from early December recorded by the same driver, the Tesla appears to stop for a pedestrian crossing the road outside a crosswalk. The Tesla begins to stop long before the pedestrian approaches the curb. Many human drivers would have kept on driving.

The video suggests Teslas may be programmed to slow down for pedestrians if they are moving in the direction of the road, the experts said. But one expert suggested another possibility: The car may have stopped because of an optical illusion.

A red sign between the Tesla and the pedestrian briefly lines up with a tree on the sidewalk, for a moment creating an image generally resembling a stop sign. A later video uploaded in February demonstrated the same phenomenon, suggesting the stop sign illusion was indeed tricking the car.

Were Tesla’s software to be confused by a phantom stop sign, that would highlight a key difference with many of its competitors, which use detailed maps showing the precise location of stop signs and other obstacles and road markings.

Struggle for control

In another instance, the same Tesla is passing a UPS truck stopped on a narrow street with parked vehicles on either side. Unsure what to do, the car’s software prompts the driver to take over. But the driver struggles to gain control of the vehicle, swinging the steering wheel dramatically from side to side.

“I am taking over,” the driver says, as the wheel turns erratically. “I’m — I’m trying.”

(AI Addict)

Experts say the incident illustrates a fundamental challenge with Tesla’s decision to release software that requires regular intervention by humans. Other companies have bypassed this stage, instead releasing cars that aim to do away with the human driver entirely.

In the case of the UPS truck, both the computer system and the human were attempting to drive the car through a tight spot with very little wiggle room to the left or right. In most cases, the driver takes over by yanking the steering wheel in the opposite direction from the one the software is steering toward. That movement wasn’t possible under these circumstances, leaving it unclear whether the car or the human was in control. Because the maneuver involved no sharp turn, the driver couldn’t simply crank the wheel to wrest steering back from the software, which amplified the struggle for control.
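One way to picture that ambiguity is a simplified takeover check like the Python sketch below, in which steering assistance yields only when the driver’s torque clearly opposes what the system commands. The threshold and signal names are assumptions, not Tesla’s logic; the sketch only shows why a driver steering in roughly the same direction as the software, as in this tight squeeze, can leave the handover unresolved.

```python
def driver_has_taken_over(driver_torque_nm, commanded_torque_nm, threshold_nm=2.5):
    """Disengage only when the driver clearly fights the commanded steering."""
    opposing = driver_torque_nm * commanded_torque_nm < 0
    return opposing and abs(driver_torque_nm) > threshold_nm


print(driver_has_taken_over(-3.0, 1.5))  # True: driver yanks against the system
print(driver_has_taken_over(3.0, 1.5))   # False: both steer the same way -> ambiguous handover
```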

“It’s unclear who exactly is in control at that moment,” Maynard said. “There’s an odd glitch here where there seems to be a short fight for control between the driver and the car. It appears there are scenarios where both driver and car potentially lose control at some points.”

Maynard called the incident an “important” moment that reveals a glitch not in the car’s judgment, “but in the ability of the human driver to ensure safety.”

(Dirty Tesla)
Learning to read

Another video The Post analyzed was posted in November by Chris, the Fenton, Mich., driver. The video shows the car failing to react to a “Stop here on red” sign, forcing Chris to apply the brakes.

An autonomous-driving researcher said such signs, which are ubiquitous on American roadways, can create vexing problems for Tesla engineers. Unless the car’s cameras recognize the letters on the sign, the computer would have to look for other clues, like an arrow or a thin white line painted across the road. But that could create problems in other scenarios, prompting the car to stop erroneously when it sees a line on the road or a similar-looking arrow.
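A hedged sketch of that fallback, in Python: when the sign’s text cannot be read, the software leans on secondary cues such as a painted stop line, and those cues can misfire. The field names and confidence threshold are illustrative assumptions, not a description of Tesla’s code.

```python
def should_stop_here(detections, light_state):
    """Decide whether to hold at this point when the light is red."""
    if light_state != "red":
        return False
    for d in detections:
        if d["kind"] == "sign_text" and "stop here on red" in d["text"].lower():
            return True                      # the sign was actually read
        if d["kind"] == "stop_line" and d["confidence"] > 0.8:
            return True                      # fallback cue: a painted line
    return False


# The same fallback can misfire: an unrelated white line detected with
# high confidence would also trigger a stop.
detections = [{"kind": "stop_line", "confidence": 0.9}]
print(should_stop_here(detections, "red"))  # True, even though no sign was read
```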


Many of Tesla’s competitors use high-definition maps to take the guesswork out of where to stop and turn, the experts said. But that strategy raises other issues, including whether any map can keep pace with precise conditions on the nation’s ever-changing network of roads.

“Most of the problems here are solved if you have maps,” Templeton said. “If you use maps, you can drive decently in your service area,” he wrote in an email to The Post. “Without maps, you can crash on any street in the country.”

After driving with FSD for about a year, Chris said he thinks it will be a decade before the cars can reliably drive themselves.

The experts who spoke with The Post agreed with that timeline.

“The last mile of safety is really the hardest part,” said Musa. “It’s like launching aviation in the early 1900s: They didn’t get the first plane right in the first go. They just kept improving every time something bad happens.”

