This scary video of some bad Tesla FSD driving might actually help make drivers safer


Screenshot: YouTube

As usual, Tesla is being very generous in giving us plenty to talk about, especially when it comes to its well-known Level 2 driver-assistance systems, confusingly named Autopilot and/or Full Self-Driving (FSD). Yesterday there was a crash involving a Tesla on Autopilot hitting a police car, and now a video of a largely Autopilot-assisted drive through Oakland is making the rounds, getting a lot of attention for the often confusing and/or just plain bad decisions the car makes. Oddly enough, though, it’s the flawed nature of the system’s performance that might just help people use it more safely.

It all comes on the heels of a letter from the National Transportation Safety Board (NTSB) to the U.S. Department of Transportation (USDOT) regarding the National Highway Traffic Safety Administration’s (NHTSA) “advance notice of proposed rulemaking” (ANPRM), in which the NTSB effectively asks what the fuck (WTF) we are doing about autonomous vehicle testing on public roads.

From that letter:

Since NHTSA does not set any requirements, manufacturers can operate and test vehicles virtually anywhere, even where the limits of the AV control system are exceeded. For example, Tesla recently released a beta version of its Level 2 Autopilot system, described as having full self-driving capability. By releasing the system, Tesla is testing a highly automated AV technology on public roads, but with limited oversight or reporting requirements.

Although Tesla includes a disclaimer that “currently enabled features require active driver supervision and do not make the vehicle autonomous,” NHTSA’s hands-off approach to overseeing AV testing poses a potential risk to motorists and other road users.

At this stage, the NTSB/NHTSA/USDOT letter does not offer any solutions; it merely highlights something we have been seeing for years now: Tesla and other companies are testing self-driving car software on public roads, surrounded by other drivers and pedestrians who never agreed to participate in any test, and failures of this beta software can result in literal crashes.

All of this provides great context for the Autopilot-in-Oakland video, the highlights of which can be seen in this tweet…

… and the full 13 and a half minute video can be watched here:

There’s a lot in this video that’s worth watching if you’re interested in Tesla’s Autopilot/FSD system. I believe this video uses the latest version of the FSD beta, version 8.2, of which there are plenty of other driving videos available online.

The system is technologically impressive; that it can do any of this at all is a tremendous achievement, and Tesla’s engineers should be proud.

At the same time, however, it is not nearly as good as a human driver, at least in many contexts, and yes, as a Level 2 semi-automated system, it requires a human supervisor to remain vigilant and ready to take over at any moment, a task humans are notoriously bad at, and the reason I think any L2 system is inherently flawed.

Although many FSD videos show the system being used on highways, where the overall driving environment is much more predictable and easier to navigate, this video is interesting precisely because city driving carries a much higher degree of difficulty.

It’s also interesting because the man in the passenger seat is such a constant and unflappable apologist, to the point where if the Tesla had mowed down a litter of kittens, he’d likely have praised its excellent ability to detect small targets.

Over the course of the Oakland drive, there are plenty of places where the Tesla performs well. There are also places where it makes genuinely terrible decisions: driving in the oncoming-traffic lane, turning the wrong way onto a one-way street, weaving around like a drunk robot, cutting curbs, or just stopping, for no apparent reason, right in the middle of the road.

In fact, the video is helpfully divided into chapters based on these notable events:

0:00 Introduction

0:42 Double-parked cars (1)

1:15 Pedestrian in crosswalk

1:47 Crossing solid lines

2:05 Disengagement

2:15 Chinatown

3:13 Driver avoidance

3:48 Unprotected left (1)

4:23 Right turn into the wrong lane

5:02 Near head-on incident

5:37 Acting drunk

6:08 Unprotected left (2)

6:46 Disengagement

7:09 No right on red

7:26 “Take over immediately”

8:09 Wrong lane; behind parked cars

8:41 Double-parked truck

9:08 Bus-only lane

9:39 Close call (curb)

10:04 Left turn; lane blocked

10:39 Wrong way!!!

10:49 Double-parked cars (2)

11:13 Stop sign delay

11:36 Hesitant left turn

11:59 Near collision (1)

12:20 Near collision (2)

12:42 Close call (wall/fence)

12:59 Verbal review of drive/beta

It reads like the playlist of a very strange concept album.

Nothing in this video, objectively impressive as it is, suggests that this machine drives better than a human. If a human driver had done the things seen here, you’d have been asking out loud, over and over, what the hell they were thinking.

Some situations are clearly things the software has not been programmed to understand, such as how double-parked cars with their hazard lights on are obstacles that need to be carefully driven around. Other situations come down to the system’s poor interpretation of camera data, or overcorrection, or just problems processing the environment.

Some of the common defenses of the video only highlight the bigger issues:

The argument that there are many, many more human-caused accidents on any given day is very misleading. Of course there are many more, but there are also many, many more humans driving cars, and even if the numbers were proportionally equal, no carmaker is trying to sell that shitty human driving as a product.

On top of that, the reminders that FSD is a beta just bring us back to that NTSB letter with all the acronyms: should we be letting companies beta test self-driving car software in public, without oversight?

Tesla’s FSD is not yet safer than a normal human driver, which is why videos like this one, showing plenty of alarming FSD driving events, are so important and could actually save lives. These videos erode confidence in FSD, which is exactly what needs to happen if this beta software is going to be tested safely.

Blind faith in any L2 system is how you end up crashing and possibly dying. Because L2 systems give little to no warning when the human needs to take over, a driver who doesn’t fully trust the system is far more likely to be ready to take control.

I’m not the only one suggesting this, either:

The paradox here is that the better a Level 2 system gets, the more likely the people behind the wheel are to trust it, which means the less attention they will pay, leaving them less able to take control when the system actually needs them to.

This is why most Autopilot wrecks happen on highways, where the combination of Autopilot’s generally good performance and high speeds leads to poor driver attention and reduced reaction time, which can end in disaster.

All Level 2 systems, not just Autopilot, suffer from this, which is why they are all rubbish.

Although this video clearly shows that FSD’s basic driving skills still need work, that should not be Tesla’s focus; instead, it should be developing safe, manageable failover procedures, so that immediate attention from the driver is not required.

Until then, the best-case scenario for the safe use of Autopilot, FSD, Super Cruise, or any other Level 2 system is for people to watch all these videos of the systems screwing up, lose some confidence in them, and stay a little tense and vigilant while the machine is driving.

I know this is not what anyone wants from autonomous vehicles, but the truth is they are just not finished yet. It’s time to accept that and treat them that way if we ever really want to make progress.

Getting defensive and trying to cover for struggling machinery does not help anyone.

So if you like your Tesla and like Autopilot and FSD, you should watch the video carefully. Appreciate the good parts, but really accept the bad parts. Do not try to make excuses. Watch, learn, and keep these fuckups in the back of your mind as you sit behind a wheel you are not really steering.

It’s not fun, but this stage of any technology like this always requires work, and work is not always fun.
