Tesla also drops the front radar in the Model S and X, entrusting everything to Tesla Vision

The progressive rollout of the Tesla Vision system continues: the sensors that let the car “see” the outside world are limited to 360-degree cameras and 12 ultrasonic sensors, making a front radar superfluous in this design. The Model 3 and Model Y already stopped including it in spring 2021; now it is the turn of the Model S and Model X produced since February.

The change applies to all cars built in the United States, both for the domestic market and for export. The Tesla website no longer mentions a front radar for the higher-end models, nor does one appear in the diagrams. The same goes for the Model 3 and Model Y arriving on the Spanish market, which come from China and do not carry a front radar either.

In theory, the front cameras’ “vision” exceeds that of the radar even in range, reaching 250 meters. However, the Autosteer (automatic steering) system is limited to 130 km/h, and the following distance to the vehicle ahead is increased. The IIHS took almost half a year to approve this technology in the Model 3 and Model Y, and the NHTSA still views it unfavorably.

Front cameras and radars perceive the world very differently. Behind the wheel, human beings rely on their eyes for roughly 90% of their perception. Although eyes, like cameras, capture only visible light, we are better at interpreting what is in front of us; artificial intelligence does it faster, but with less reasoning capacity.

Radars, by contrast, detect objects physically, even ones that are effectively invisible to cameras, by emitting waves that bounce off bodies and return to the receiver. They determine not only an object’s presence but also its position. For other manufacturers, the safer approach is a mix of machine vision and radar/LiDAR: what one sensor misses, another catches. The problem is managing the discrepancies between them.
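The two ideas above, echo-based ranging and discrepancy handling, can be sketched in a few lines. This is a back-of-envelope illustration, not any manufacturer's actual implementation; the function names and the 5-meter tolerance are made up for the example.

```python
# Radar ranging: time a radio pulse's round trip and convert to distance.
C = 299_792_458.0  # speed of light in m/s

def radar_range_m(round_trip_s: float) -> float:
    """Distance to a target from the echo's round-trip time (out and back)."""
    return C * round_trip_s / 2

def flag_discrepancy(camera_m: float, radar_m: float, tol_m: float = 5.0) -> bool:
    """A naive fusion rule: flag when two sensors disagree beyond a tolerance.
    Real systems must then decide which sensor to trust -- the hard part."""
    return abs(camera_m - radar_m) > tol_m

echo = 1e-6  # a 1-microsecond round trip
print(radar_range_m(echo))                           # ~149.9 m
print(flag_discrepancy(250.0, radar_range_m(echo)))  # True: sensors disagree
```

The awkward case is exactly the one flagged here: when the camera reports one distance and the radar another, the fusion logic has to arbitrate, which is the discrepancy-management problem the article mentions.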

With the Tesla Vision (or Pure Vision) system there are no discrepancies: the cameras do almost everything, although the artificial intelligence carries a heavier burden in estimating distances. Ultrasonic sensors are effective only at very short range, so they are used only in the vehicle’s immediate vicinity.
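A rough physical intuition for why ultrasound stays short-range: sound travels about a million times slower than radio waves, so echoes from far targets take far too long (and arrive far too weak). The numbers below are an illustrative calculation, not sensor specifications.

```python
# Round-trip echo times for an ultrasonic ping (speed of sound in air ~20 C).
SPEED_OF_SOUND = 343.0  # m/s

def echo_time_s(distance_m: float, wave_speed: float) -> float:
    """Round-trip time for an echo off a target at distance_m."""
    return 2 * distance_m / wave_speed

# At parking distances (~2 m) the ping returns in about 12 ms...
print(echo_time_s(2.0, SPEED_OF_SOUND))    # ~0.012 s
# ...but at the 250 m the cameras cover, the echo would take ~1.5 s,
# far too slow for highway speeds, so ultrasound stays near the bumpers.
print(echo_time_s(250.0, SPEED_OF_SOUND))  # ~1.46 s
```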
