Tesla’s “Full Self-Driving” Beta Is Just Ridiculously Bad and Potentially Dangerous

A beta version of Tesla’s “Full Self-Driving” Autopilot update has been released to certain users. And man, if you thought “full self-driving” was even close to reality, this video of the system in action will disabuse you of that notion. It is perhaps the best comprehensive illustration of just how morally questionable, technologically limited, and potentially dangerous Autopilot’s “Full Self-Driving” beta program is.


In a 13-minute video posted to YouTube by user AI Addict, we see a Model 3 with FSD Beta 8.2 fumbling its way around Oakland. It appears hapless and utterly confused at all times, never passably imitating a human driver. Early in the video, the front passenger notes the car’s correct decision to drive around a group of double-parked cars rather than wait behind them, but the moment of praise is cut short when the car plants itself right on the center line while trying to get into a left-turn lane.

This is because, like every semi-autonomous system offered for sale today, Tesla’s “Full Self-Driving” and “Autopilot” systems are not actually fully autonomous. They require constant human supervision and split-second intervention. And now that the latest beta version of the software is out in the wild, it demands more attention than ever.

The video quickly escalates from embarrassing mistakes to extremely risky, potentially harmful driving. Operating autonomously, the Tesla commits a variety of traffic violations, starting with a last-second lunge across a solid line to make an illegal lane change. It then attempts a left turn alongside another car, only to give up halfway through the intersection and disengage.

It takes another turn far too wide, drifting into the oncoming lane and requiring driver intervention. Shortly afterward, it again crosses into the oncoming lane on a straight stretch of road shared with cyclists and oncoming traffic. It then stumbles drunkenly through an intersection, once more needing the driver’s intervention to carry it through. Turning left at a stop sign, it hesitates mid-turn and creeps into the path of oncoming cars, which have to brake to avoid hitting it.

The video is not even half over, but the errors keep piling up, beginning with another random disengagement. The Tesla attempts a right turn at a red light where that is prohibited, once again nearly breaking the law and requiring the driver to physically stop it from acting. It halts randomly in the middle of the road, drives straight through a turn-only lane, stops behind a parked car, and nearly slams into a curb while making a turn. After holding up traffic to creep around a stopped car, it confidently drives straight into the oncoming lane before recognizing its mistake and disengaging. Another traffic violation for the books, and another moment where the befuddled car simply gives up and leaves the human driver to clean up the mess.

Tesla’s software is also stumped by cars stopped in its lane and by an intersection where it clearly has the right of way. Then comes another near collision. This time the Tesla arrives at an intersection where it has a stop sign and cross traffic does not. It proceeds anyway with two cars incoming, the first passing just ahead of the car’s front bumper and the trailing car braking hard to avoid T-boning the Model 3. It is absolutely incredible and indefensible that the driver, whose entire job is to monitor the car and ensure safe operation, did not intervene there. It is even more astonishing that this software is available to the public.

But that’s not the end of the video. To round things out, the Model 3 nearly slams into a Camry that has the right of way while trying to negotiate a jog in the road. Once it is through that intersection, it drives straight at a fence, nearly plowing directly into it. Both incidents required driver intervention to avoid a crash.

To be sure, no one has solved autonomous driving. It is a challenging problem that, experts say, will only be solved with highly advanced artificial intelligence. Tesla’s software clearly does a good job of identifying cars, stop signs, pedestrians, bicycles, traffic lights, and other basic obstacles. Yet it is ludicrous to suggest that this is anything close to “full self-driving.” There is no shame in falling short of that goal, but Tesla stands alone in its unwillingness to acknowledge its own shortcomings.

When a technology is immature, the natural response is to keep working on it until the flaws are ironed out. Tesla has eschewed that strategy here, choosing instead to sell software it knows is incomplete, charging a substantial premium for it, and hoping that those who buy it have a nuanced, technical understanding of its limitations, along with the ability and the sense of responsibility to jump in and save it when it is inevitably stumped. In short, every Tesla owner who buys “Full Self-Driving” serves as an unpaid safety supervisor, conducting research on Tesla’s behalf. Perhaps more damning, the company takes no responsibility for the software’s actions, leaving it to drivers’ discretion to decide when and where to test it.


That leads to videos like this one, in which early adopters conduct uncontrolled tests on city streets, with pedestrians, cyclists, and other drivers unaware that they are part of the experiment. If even one of these Tesla drivers slips up, the consequences can be deadly.

All of this testing is carried out on public roads, for the benefit of the world’s most valuable automaker, at essentially no cost. We reached out to Tesla for comment on the video, but the company has no press office and does not typically respond to inquiries.

