Technology

Experts warn of serious flaws in Tesla’s Full Self-Driving (FSD) system: what is happening?


Tesla electric vehicles equipped with the Full Self-Driving (FSD) software, currently available only in beta, could pose a danger if they appear en masse on public roads, since the program’s problems are not easy to solve.

That is the warning from a group of experts interviewed by The Washington Post about numerous videos, circulating on social networks, in which the high-tech cars fail to distinguish streetcar tracks or bike lanes, or fail to stop when someone is crossing at a crosswalk.

The newspaper says in its article, published February 10, that it verified the authenticity of the recordings; the experts consulted include both academics who study autonomous vehicles and technical staff who analyze the safety of this technology, among others. The article documents several of the studied situations that expose the system’s shortcomings.


Minor obstacles

In one of the videos examined, a Tesla is seen turning right at an intersection in San Jose, California, and colliding with a pole separating the road from a bike lane. In the incident, which occurred in early February, the vehicle sustained minor damage to the front bumper.

The experts interviewed pointed to FSD as the cause of the impact. “As for why the automatic detection didn’t see it until it was too late, it’s a computer vision problem. Perhaps the system was never trained to detect posts or bollards of unusual shapes and colors,” said Brad Templeton, a self-driving car developer. The specialist noted that Tesla’s ultrasonic sensors are capable of detecting such hazards, but they are sometimes mounted in places on the car from which they cannot ‘see’ these obstacles in time.

Pedestrian problems

Another recording, filmed in the same city last December, shows another FSD-equipped vehicle making a right turn without braking, forcing a woman to stop abruptly to avoid being run over.

Commenting on this mishap, the experts pointed out that the software might not recognize crosswalk signs or anticipate that a person is about to cross the street. For his part, Hod Finkelstein, lead researcher at the specialized company AEye, stressed that cameras alone are insufficient to detect pedestrians under certain conditions, since they can be blinded by headlights or the sun.

The fact that Tesla is still working out how to reliably detect a pedestrian reveals details about the company’s software, which relies on machine learning: it can digest large volumes of data and form correlations that allow it to “learn for itself”, the newspaper explains. The automaker combines this software with simpler programmed rules, such as “always stop at red lights”.
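To illustrate the pattern the newspaper describes, here is a minimal sketch in Python of how a learned perception model can be layered under hand-written rules. All names, thresholds, and structures here are hypothetical, chosen only to show the general approach; none of it is taken from Tesla’s actual software.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object reported by the perception model (hypothetical)."""
    label: str         # e.g. "red_light", "pedestrian", "bollard"
    confidence: float  # model score between 0.0 and 1.0

def perceive(camera_frame) -> list[Detection]:
    """Stand-in for the machine-learning component: a network trained
    on large volumes of driving data would produce detections here."""
    raise NotImplementedError("placeholder for a trained model")

def plan_speed(detections: list[Detection], current_speed: float) -> float:
    """Hand-written rules layered on top of the learned output."""
    # A fixed rule of the kind the article mentions: always stop at red lights.
    if any(d.label == "red_light" and d.confidence > 0.5 for d in detections):
        return 0.0
    # An obstacle the model was never trained to label (an oddly shaped
    # bollard, say) produces no detection at all, so no rule ever fires --
    # the edge-case problem the experts describe.
    return current_speed

# Example: a confident red-light detection forces a stop.
print(plan_speed([Detection("red_light", 0.9)], current_speed=15.0))  # 0.0
```

The sketch also shows why such hybrids are brittle: the hard-coded rules can only act on what the learned model reports, so anything outside its training data silently slips past them.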


One expert warned that machine learning algorithms fail to assimilate certain scenarios that they should. According to Andrew Maynard, a professor at Arizona State University, “the FSD beta is still affected by edge cases that it hasn’t learned how to handle, even though most human drivers would handle them with ease.”

Optical illusions

In a third video, recorded in early December, the electric vehicle stops when it detects that a pedestrian is about to cross at a zebra crossing. However, it begins to brake long before the passerby reaches the curb.

Some experts noted that Teslas are programmed to behave this way when they detect pedestrians heading toward the street. For that reason, said one of those consulted by the newspaper, the car could also stop because of an optical illusion.

“Let’s say a red sign between the Tesla and the pedestrian briefly lines up with a tree on the sidewalk, momentarily creating an image resembling a stop sign,” the paper’s authors explain. Another recording, uploaded to the Internet in February, shows the same phenomenon: a false sign that confuses the car.

Documenting progress

Some of the drivers who spoke to the newspaper about the videos came out in defense of FSD, maintaining that the system can be safely disengaged before a situation escalates. In addition, most of them said they post the errors not to expose the program’s limitations but to document its progress.


Although some analysts criticize the decision to launch FSD before it was fully refined, others believe this is how Tesla demonstrates transparency, while the videos help show how the software interprets the information it collects.

And what does Tesla say about it?

FSD is one of two major driver-assistance technologies developed by the company; the other is Autopilot, designed for highway use. Tesla warns that in either case drivers should “keep their hands on the wheel at all times”, ready to take control if necessary.

As for Autopilot, Tesla CEO Elon Musk has defended its reliability, citing accident data. However, the US National Highway Traffic Safety Administration is investigating whether the system failed in various incidents. One case that drew attention came to light last fall, when a driver in California was charged with involuntary manslaughter after crashing into another car and killing two people while Autopilot was on.

For its part, Tesla continues to bet on FSD and to release updates to the beta, although on several occasions it has had to withdraw them after prompting from regulators. At the end of January, the company said that nearly 60,000 cars in the US are equipped with the software.

Meanwhile, the experts consulted stressed that progress in autonomous driving technology is only possible through the analysis of imperfections. “It is like the beginnings of aviation in the early 20th century: they didn’t get the first plane right the first time. They just kept getting better every time something bad happened,” said Mohammad Musa, founder of the company Deepen AI, whose clients include Tesla.


