"Sixth Sense": A Russian-Chinese System for 4D Radars Teaches Unmanned Vehicles to "See" in Any Weather

RadarSFEMOS Makes Autopilot Resistant to Fog, Snow, and Rain — and Works Without GPS and Lidars

Researchers from MIPT, together with Chinese scientists, have developed a data processing system for 4D radars used in unmanned vehicles. The technology, called RadarSFEMOS, can accurately determine the location and movement of objects even in conditions where cameras and lidars lose effectiveness: in rain, snow, and fog. The system is planned for deployment in Russian unmanned trucks and urban robotaxis, according to the university's press service.

4D radars are advanced radar sensors that determine not only an object's distance, azimuth, and elevation angle, but also its radial velocity (i.e., how fast it is approaching or receding). This distinguishes them from classic 2D and 3D radars and makes them especially useful in poor visibility, where cameras and lidars often lose their "sight".
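
To make the difference concrete, here is a minimal sketch of what a single 4D radar detection contains and how it maps to a Cartesian point plus a Doppler measurement. The function, names, and values are illustrative assumptions, not the output format of any specific sensor.

```python
import numpy as np

def radar_point_to_cartesian(rng, azimuth, elevation, radial_velocity):
    """Convert one 4D radar detection (range [m], azimuth [rad], elevation [rad],
    radial velocity [m/s]) into a Cartesian point plus the Doppler measurement."""
    x = rng * np.cos(elevation) * np.cos(azimuth)
    y = rng * np.cos(elevation) * np.sin(azimuth)
    z = rng * np.sin(elevation)
    # The radial velocity is the projection of the object's velocity onto the
    # line of sight; negative values mean the object is approaching.
    return np.array([x, y, z, radial_velocity])

# Example: a target 40 m ahead, slightly to the left, approaching at 3 m/s.
point = radar_point_to_cartesian(40.0, np.deg2rad(5.0), np.deg2rad(1.0), -3.0)
print(point)  # [x, y, z, v_r]
```

A 3D radar would stop at the first three numbers; the fourth value, radial velocity, is what the methods described below build on.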

RadarSFEMOS is a self-learning system for 4D radars that cleans radar data of interference and determines the direction and speed of objects without additional manual annotation or odometry (i.e., data on the vehicle's own motion from the speedometer or GPS). To achieve high accuracy, the researchers combined several approaches.

First, the system uses a diffusion-based noise-reduction model that filters radar data within milliseconds, increasing perception accuracy. Second, a transformer-based analyzer with an adaptive architecture can recognize objects even at extremely low data density, only 5–10 points per square meter, whereas lidars require at least 100 points over the same area for comparable accuracy.
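
The sketch below illustrates both ideas in strongly simplified form: a small learned denoiser that subtracts predicted noise from radar points (a full diffusion model would iterate such a step), and a transformer encoder that attends over a sparse point set. The module names, layer sizes, and two-class head are assumptions for illustration and do not reproduce the actual RadarSFEMOS architecture.

```python
import torch
import torch.nn as nn

class PointDenoiser(nn.Module):
    """Predicts a per-point noise estimate and subtracts it; a diffusion-style
    model would repeat this refinement over several steps."""
    def __init__(self, dim=4, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, points):
        # points: (N, 4) -- x, y, z, radial velocity
        return points - self.net(points)

class SparsePointTransformer(nn.Module):
    """A small transformer encoder over a sparse point set; self-attention lets
    every point aggregate context from all others, which matters when only a
    handful of points fall on each object."""
    def __init__(self, dim=4, d_model=64, nhead=4, num_layers=2, num_classes=2):
        super().__init__()
        self.embed = nn.Linear(dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, num_classes)  # e.g. moving vs. static

    def forward(self, points):
        # points: (B, N, 4), where N may be only a few dozen
        x = self.encoder(self.embed(points))
        return self.head(x)  # per-point class logits

# Usage on a deliberately sparse frame: ~8 points on a vehicle-sized patch.
frame = torch.randn(8, 4)
denoised = PointDenoiser()(frame)
logits = SparsePointTransformer()(denoised.unsqueeze(0))
print(logits.shape)  # torch.Size([1, 8, 2])
```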

The algorithm accounts for the car's own motion and excludes it from the analysis, leaving only the movement of surrounding objects in the calculations. An additional advantage is the 4D radar's ability to measure radial velocity, that is, how quickly an object is approaching or moving away from the car. This makes it possible to reliably distinguish static objects from moving ones, ensuring more dependable navigation.
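
The sketch below shows the underlying geometry: for a stationary object, the measured radial velocity is fully explained by the car's own motion, so subtracting the expected ego-motion component leaves a near-zero residual for static points and a clear residual for moving ones. The threshold and example values are assumptions for illustration.

```python
import numpy as np

def classify_points(points_xyz, radial_velocities, ego_velocity, threshold=0.5):
    """points_xyz: (N, 3) positions in the vehicle frame [m].
    radial_velocities: (N,) Doppler measurements [m/s], positive = receding.
    ego_velocity: (3,) vehicle velocity in the same frame [m/s]."""
    # Unit line-of-sight vectors from the radar to each point.
    los = points_xyz / np.linalg.norm(points_xyz, axis=1, keepdims=True)
    # Radial velocity a *static* point would show due to ego-motion alone:
    # relative to the moving car, a fixed point appears to move at -ego_velocity.
    expected_static = -los @ ego_velocity
    residual = radial_velocities - expected_static
    return np.abs(residual) > threshold  # True = moving object

# Car driving straight ahead at 10 m/s (along the x axis).
pts = np.array([[30.0, 0.0, 0.0],   # return straight ahead
                [20.0, 5.0, 0.0]])  # return ahead and to the side
vr = np.array([-10.0,               # explained by ego-motion alone -> static
               -13.0])              # extra approach speed -> moving
print(classify_points(pts, vr, np.array([10.0, 0.0, 0.0])))  # [False  True]
```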

Artificial intelligence makes the system truly adaptive: RadarSFEMOS requires no manual labeling. As data accumulates, it learns to distinguish noise from real motion and gradually improves its accuracy. To do this, the system analyzes two consecutive frames of radar data and classifies objects as moving or stationary.
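
As a toy illustration of how such label-free training can work: a placeholder flow network predicts per-point motion between two consecutive frames, and a nearest-neighbor (Chamfer-style) distance after warping serves as the loss, so no manual annotation is needed. This is a generic self-supervised scene-flow recipe, not the published RadarSFEMOS training procedure.

```python
import torch

def chamfer_distance(a, b):
    """Mean nearest-neighbor distance from each point of a to the point set b."""
    d = torch.cdist(a, b)             # (Na, Nb) pairwise distances
    return d.min(dim=1).values.mean()

# Placeholder per-point flow predictor (an assumption, not the real model).
flow_model = torch.nn.Sequential(
    torch.nn.Linear(3, 64), torch.nn.ReLU(), torch.nn.Linear(64, 3))
optimizer = torch.optim.Adam(flow_model.parameters(), lr=1e-3)

frame_t = torch.randn(50, 3)          # radar points at time t
frame_t1 = frame_t + 0.1              # points at time t+1 (toy rigid shift)

flow = flow_model(frame_t)            # predicted per-point motion
loss = chamfer_distance(frame_t + flow, frame_t1)  # no labels required
loss.backward()
optimizer.step()
print(float(loss))
```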

"Our system determines the movement of objects around the car, divides them into moving and static ones, and does so in any weather. It is as if the vehicle has gained a sixth sense. This is not just a scientific paper: the algorithm is already ready to run on production radars, which are 50 times cheaper than lidars."
Stepan Andreev, Director of the Scientific and Technological Center for Telecommunications at MIPT

The development was tested on two popular datasets, View-of-Delft (VoD) and TJ4DRadSet. The results showed that RadarSFEMOS significantly reduces the number of false positives on "phantom" objects and raises the accuracy of detecting real objects to 89%.
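
For context, the sketch below shows one simple way such figures can be computed: predicted detections are matched to ground-truth objects by center distance, unmatched detections count as "phantom" false positives, and the share of ground-truth objects found gives the detection accuracy. The 2 m matching rule is an assumption; the VoD and TJ4DRadSet benchmarks define their own official protocols.

```python
import numpy as np

def detection_stats(pred_centers, gt_centers, max_dist=2.0):
    """Match predictions to ground truth by center distance (in meters)."""
    pred = np.asarray(pred_centers, dtype=float)
    gt = np.asarray(gt_centers, dtype=float)
    d = np.linalg.norm(pred[:, None, :] - gt[None, :, :], axis=-1)
    found = int((d.min(axis=0) <= max_dist).sum())     # real objects detected
    phantoms = int((d.min(axis=1) > max_dist).sum())   # detections matching nothing
    return {"detection_rate": found / len(gt), "false_positives": phantoms}

print(detection_stats([[10, 0], [25, 3], [40, -8]],    # predicted centers (x, y)
                      [[10, 0.5], [25, 2.5]]))         # ground-truth centers
```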

In the future, scientists plan to adapt the system to more complex movement scenarios and teach it to predict the trajectories of objects with even greater accuracy. This will pave the way for the creation of safer and more reliable unmanned vehicles that can confidently navigate in all weather conditions.
