Tesla’s Autopilot drives a Model 3 into the path of an oncoming train while the “Full Self-Driving” option is engaged

Tesla’s Autopilot system, even with its “Full Self-Driving” option, is not what its name suggests: it is neither a true autopilot nor a fully autonomous driving system. Instead, it is an advanced driver assistance system that can reduce the driver’s workload on highways and on well-marked city streets. It is also far from infallible, as shown in a new video from the YouTube channel Beta Tech OG, in which a Model 3 nearly drives into the path of an oncoming train.

In the video, the driver tests the Autopilot system with the Full Self-Driving (FSD) Beta option on the streets of Denver, Colorado. Over the course of the nearly 18-minute video, the Model 3 almost hits several obstacles, leading the driver to give Tesla’s software a low score. Two of the system’s mistakes stand out from the rest, and one of them nearly saw the Tesla struck by an oncoming train as it tried to turn left across the tracks.

Before the encounter with the train, the Model 3 was stopped at a red light with its left turn signal on. This told the car that the driver wanted to turn left and that, once the light turned green, it would need to wait until the way was clear before turning. Instead of waiting, however, the Tesla either calculated that it had enough space and time to complete the turn before the train arrived, or simply misjudged the train. The irony is that the Autopilot visualization on the infotainment screen, which displays the objects the vehicle has recognized, actually rendered the train: you can see it on the screen before the car starts to turn. Why the Tesla decided to turn left in front of the train despite having detected it is unknown; the driver was forced to take over and steer the vehicle out of the way. If the car genuinely thought it could complete the turn in time, its software is far too aggressive. Even the train’s operator sounded the horn, apparently also judging the turn a bad decision in that situation.

Later in the video, near the end, the Model 3 attempts an ordinary left turn, but takes it too wide and nearly hits two pedestrians standing on the corner. In the end, the driver says he is “very disappointed with Tesla”, which is understandable given his expectations and the poor results. In Tesla’s defense, there is a button for reporting any issue that arises while Autopilot is in use. The catch is that reporting consists of pressing a single button; the driver is not asked to add any details about the situation, so it is unclear how useful such reports actually are to Tesla.

There is a much broader debate about the ethics of Tesla testing its Full Self-Driving option on public roads, using customers rather than trained professionals to do so. Tesla is not the only company taking advanced driver assistance systems into the city, but some brands are more cautious in their approach and do not release such software until it has been thoroughly tested by experts in a controlled environment. Fortunately, no one was injured this time, and the driver was able to steer his car away from the train and the pedestrians, but that does not mean everyone will manage to react in time in the future.

And these two incidents are far from the only ones; driving toward an oncoming train is just one of many alarming moments in this video:

  • the Tesla nearly crashed into a barrier indicating that the road was closed (7:06);
  • the Tesla chose the wrong lane and was visibly confused (as shown on the dashboard display, 11:06);
  • the Tesla tried to run a red light while cross traffic was still moving (12:12);
  • the Tesla stopped in the middle of the roadway for no reason (13:09);
  • the Tesla chose the wrong lane for a left turn (13:25);
  • the Tesla repeatedly switched its left turn signal on and off for no reason, in a place where a left turn was not even allowed (15:02);
  • the Tesla failed to make a left turn properly and almost hit pedestrians (17:13).

All of this happened during a drive that lasted perhaps 30 minutes (parts of the video are sped up). And these are only the most glaring cases; in between, the Tesla does not drive particularly well either. It changes lanes far too often (even in intersections) for no reason and creeps forward at red lights, surprisingly close to cross traffic, for no reason.

Overall, the driver judges this to be a very poor performance.

One Internet user wondered: “I do not understand how anyone can hand control to a system that is, by its own admission, still in beta. The cost of failure is far too high to take that risk.”

“Don’t be a Tesla crash test dummy”

The electric car, which generally comes with a driver assistance system, has become increasingly popular, and a large number of new players have entered the market over the past decade, some of them offshoots of large groups already established in combustion vehicles. This has led to increased competition, and companies now seem ready to use every means available to promote their products. That may be what happened in early January, when Dan O’Dowd published a scathing critique of Tesla’s Full Self-Driving (FSD) software in an advertisement in the New York Times.

Dan is the co-founder and CEO of Green Hills Software, a private company that develops operating systems and software tools for embedded systems but which, with the rise of the electric car, has also moved into advanced driver assistance systems (ADAS). Titled “Don’t be a Tesla crash test dummy”, Dan’s ad claims that, in its current version, FSD would kill millions of people every day if it were installed in every car.

Dan based his criticism on a study of videos published online showing Tesla owners using the Full Self-Driving feature, which Tesla says is in beta and permits only limited use under the driver’s supervision. According to his analysis, the videos show FSD committing a “serious driving error” roughly every eight minutes and an “unforced error” about every 36 minutes that “would likely cause a collision”. In the ad, Dan calls FSD the “worst commercial software” he has ever seen and believes it is still at an alpha stage of quality.

For this reason, he believes FSD should be tested by Tesla’s internal test drivers rather than by Tesla owners. “Software that drives self-driving cars on which millions of lives will depend must be the best software,” he said. Although a restricted version of FSD is available to anyone who owns a Tesla, owners can also apply to become testers of the latest beta build if they have a sufficiently high safety score, as measured by their car’s software. Dan is, in fact, campaigning for Tesla’s FSD to be banned.

He said he placed the ad under the auspices of the Dawn Project, the pressure group conducting this campaign, an organization that describes itself as “dedicated to making computers safer for human beings”. Some critics see Dan’s ad, at least in part, as a promotion for his own business: Green Hills Software had announced earlier that month that its technology was being used to develop driver assistance software for the all-electric BMW iX, a sporty SUV that BMW showcased at CES 2022.

In response, Elon Musk, Tesla’s CEO, attacked Green Hills Software’s products. Musk tweeted that “Green Hills software is a pile of trash” and endorsed the view that FSD critics tend to have a large financial stake in a competing solution. For his part, Dan argued that the best sources of information about a product are its competitors. “They tear it to pieces, they figure out what it does right, they figure out what it does wrong. They know it better than anyone, and they will tell you. The vendor will never tell you these things,” he said.

He also claimed that the original version of Tesla’s Autopilot, the predecessor to FSD, was built using Green Hills Software technology. “I pulled out of the project and said: I don’t know if this is right, if this is how we should be doing it, it won’t work,” Dan said. Tesla, which no longer operates a press office, did not comment on Dan’s claims of a connection between Green Hills Software’s driver assistance technology and FSD.

However, some online commenters have called it “absolutely ridiculous” to use other people’s YouTube videos, rather than testing the technology directly, as “evidence” that FSD would kill millions of people every day if it were installed in every car in the world. Another red flag is that the ad never uses the full name of the FSD Beta program: it drops the “Beta” qualifier, which may lead some readers to believe that FSD is a finished product when it is still under development.

Source: video in text

And you?

What do you think?
What do you think of the view of the Internet user who argues that, since FSD is still in beta, drivers should not hand control over to it?
What do you think of Tesla’s approach of letting its customers try out the beta version, as opposed to competitors whose cars are tested only by experts while the software is still in beta?