A highly anticipated update has arrived for Teslas without ultrasonic sensors (USS), promising better parking assistance with Tesla Vision. However, in practice, the camera-based solution, named Tesla Vision Parking Assist, is far from perfect. Let’s discuss what works and what doesn’t.
The long-awaited update that restores parking assistance for Teslas without ultrasonic sensors has finally arrived in France. Version 2023.6.9 brings a feature we have been waiting for since September 2022, a wait of a little over six months.
Some had high hopes for this functionality, believing that the camera-based solution would outperform proximity sensors. Of course, we could anticipate some physical limitations, such as reduced visibility. But even when external conditions are favorable, can we trust Tesla Vision to park our cars?
We’ll delve into what Tesla promised, what is currently offered, and how the ultrasonic sensors, which were not perfect either, used to work. Finally, we’ll explore whether Tesla can drastically improve Tesla Vision for parking, or whether we are stuck with the current imperfections for good.
Tesla Vision Park Assist: A Promise Made Six Months Ago
On Tesla’s website, we can still read the following statement:
In the near future, once these features have reached the same performance levels as current vehicles, they will be restored via a series of remote software updates.
Since the 2023.6.9 update brings parking assistance via Tesla Vision, the manufacturer evidently believes it has reached the same performance level as vehicles equipped with ultrasonic sensors.
“Park Assist” is not the only feature Tesla removed from vehicles delivered without ultrasonic sensors, but it is the only one to have returned so far. Autopark, Summon, and Smart Summon are still unavailable, even on vehicles whose owners paid for the corresponding options.
While initial tests of Park Assist via Tesla Vision seemed promising, now that we have it on our vehicles, it’s time to examine its performance in real-world situations.
Tesla Vision’s Theory vs. Physical Limitations
The reliability of a parking assist system is crucial. How can we trust a system that provides inaccurate information, either due to overestimated distances or undetected obstacles? This is the challenge facing Tesla Vision, as physical limitations indeed exist.
The camera placement on Teslas means that some areas near the vehicle are invisible, making it impossible to determine the location of surrounding objects without movement. More specifically, anything within one meter of the front bumper and less than fifty centimeters high is completely invisible to the three cameras at the top of the windshield.
The eagerly awaited Hardware 4 could address this blind spot if cameras are installed on the front bumper. However, as it stands, Tesla cannot overcome the visibility issues caused by the current camera placement.
Tesla Vision’s theory relies on a camera-based occupancy grid to identify objects, differentiate them, and refine the vehicle’s positioning in space.
In other words, it should measure the distance to surrounding objects and determine the Tesla’s position relative to its surroundings. Let’s see if this promise holds up in practice in different situations.
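To make the idea concrete, here is a minimal, purely illustrative sketch of how a top-down occupancy grid can turn "which cells are occupied" into a distance readout. This is not Tesla's implementation; the 10 cm cell size, the coordinate convention, and the function name are all assumptions for the example.

```python
# Minimal illustration of a 2-D occupancy grid (NOT Tesla's actual system):
# the car sits at the origin of a top-down grid, cells inferred as occupied
# are marked, and the nearest occupied cell yields the distance estimate.
import math

CELL_SIZE = 0.1  # assumed resolution: each cell covers 10 cm x 10 cm

def nearest_obstacle_distance(occupied_cells, car_pos=(0.0, 0.0)):
    """Return the distance in meters from the car to the closest
    occupied cell, or None if the grid holds no obstacles."""
    if not occupied_cells:
        return None
    cx, cy = car_pos
    return min(
        math.hypot(ix * CELL_SIZE - cx, iy * CELL_SIZE - cy)
        for ix, iy in occupied_cells
    )

# A hedge detected ~0.8 m behind the car (negative x = rearward, assumed convention)
occupied = {(-8, 0), (-8, 1), (-9, 0)}
print(f"{nearest_obstacle_distance(occupied):.2f} m")  # -> 0.80 m
```

The hard part, of course, is not this geometry but filling the grid correctly from camera images alone, which is exactly where the failures described below occur.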
Parking Assistance: Useful Feature or Gimmick?
The first test to determine if we can trust Tesla Vision for parking is simple: does it accurately and reliably detect the environment? Under normal daylight conditions, the answer is generally yes. For example, you can see below how a Tesla Model Y parked in reverse against a hedge, between two cars, is displayed.
The lines accurately represent the vehicle’s immediate surroundings, and the 83-centimeter forward reading is fairly reliable: through the cameras, a standing person can be seen at the orange line. However, a first issue appears at the rear: the red line sits inside the vehicle’s outline, as if it were already backed against the hedge, when in reality more than 15 centimeters remain before contact.
The rear camera remains your best ally for reversing maneuvers in a Tesla because it allows for better obstacle visualization at the rear than the lines drawn by Tesla Vision on the left side of the screen. Of course, this is only true when the camera is clean; otherwise, it is of no help. But in that case, a message will appear on the screen, indicating that parking assistance (which relies on cameras) is also unavailable.
This issue doesn’t really arise with ultrasonic sensors, as they are much less sensitive to dirt.
We understand that parking assistance will sometimes be unavailable, but when it is available, can we really trust it? We tested various situations and, unfortunately, it is relatively easy to find faults in the Tesla Vision system. Incidentally, if the famous, annoying, and not very useful “beep beep” of this parking assistance bothers you, you can disable it in the corresponding menu.
The vision-based occupancy grid tries to guess not only the position of a curb but also its height once it leaves the cameras’ field of view. As a result, when parking nose-in against a 15-centimeter curb, Tesla Vision may imagine a wall several tens of centimeters high, much closer than the curb actually is.
A Few Tests Show the Current Limitations of Tesla Vision
We experienced these issues during our tests, as you can see in the tweet below, where we decided to trust the parking assistance when it indicated not to get closer to the present obstacle.
I stopped when it told me “STOP.”
Highly sophisticated and infallible system. pic.twitter.com/giCcQHOPUm
— Bob Jouy (@bobjouy) March 28, 2023
Unlike reversing, where the rear camera offers a second check, when driving forward we have no way of seeing what is in front of the bumper other than from the driver’s seat. Of course, without parking assistance we would have crept much closer than in the photo, but what do you do when the vehicle instructs you to stop to avoid hitting an obstacle?
Our trust in Tesla Vision for parking assistance is greatly shaken after a few days of testing, but let’s acknowledge one thing: for now, Tesla Vision seems to be overly cautious rather than the opposite.
Indeed, the opposite failure would be far more serious: an undetected obstacle could lead us to believe we can keep advancing because the vehicle reports no surrounding danger, until we hit an invisible object. Unfortunately, although less frequent, this scenario does occur in some situations, so be cautious about what the dashboard displays.
As it stands, in both forward and reverse, stopping when the car suggests it will leave you so far from an ideal position that you may have to get out and move the vehicle again. In practice, distance indications stop at 30 centimeters, after which the anxiety-inducing “STOP” indication appears. But this is also the case on Tesla vehicles equipped with ultrasonic sensors.
If you try to trust your car while backing up to a Supercharger, you’ll probably end up too far from the charging station to connect, which is quite amusing. However, this might not be an issue if you’re using a V4 charger with its much longer cable.
Here’s what it looks like in a video: pic.twitter.com/wyJc1D5ibt
— Bob Jouy (@bobjouy) March 29, 2023
In this everyday situation where a Tesla Model 3 is driven into a garage, the ultrasonic sensors don’t provide any useful information, as they regularly display an incorrect distance on the left side of the car’s screen.
However, we can still observe numerous cases where ultrasonic sensors are currently better than Tesla Vision. First, ultrasonic sensors are not dependent on lighting or weather conditions. This means that when cameras are wet or dirty, parking assistance via Tesla Vision is unavailable, while ultrasonic sensors operate as expected.
Additionally, when performing more conventional maneuvers, such as approaching a wall forward or in reverse, ultrasonic sensors provide greater precision than Tesla Vision and its approximate display. This is easy to understand: on one side, distance is estimated from camera images; on the other, it is measured directly by dedicated sensors. One similarity between the two solutions is that there is no measurement below 30 centimeters. In other words, if you want to approach an obstacle closer than 30 centimeters, you will only see the “STOP” indication on the screen, with or without ultrasonic sensors.
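The display behavior described above (a numeric readout that stops at 30 centimeters, then a blanket “STOP”) can be sketched in a few lines. The 30 cm threshold and the “STOP” label come from the article; the function name and the exact formatting are illustrative assumptions.

```python
STOP_THRESHOLD_M = 0.30  # per the article: below 30 cm, both systems just show "STOP"

def parking_readout(distance_m):
    """Return the string a Tesla-style display might show for a given
    distance to the nearest obstacle. Purely illustrative logic."""
    if distance_m is None:
        return ""           # no obstacle detected: nothing shown
    if distance_m < STOP_THRESHOLD_M:
        return "STOP"       # no numeric value under 30 cm
    return f"{round(distance_m * 100)} cm"

print(parking_readout(0.83))  # -> 83 cm
print(parking_readout(0.15))  # -> STOP
```

This also makes the Supercharger complaint above easy to see: the moment the readout collapses to “STOP”, the driver loses all distance information exactly when fine positioning matters most.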
While we can hope for improvements in Tesla Vision over time, it’s important to manage expectations to avoid disappointment. Indeed, it’s highly likely that the current implementation won’t change much, as physical limitations prevent cameras from seeing correctly under all circumstances. The famous Hardware 4 will likely resolve this issue for future Tesla models, but for now, we must make do with this imperfect solution.