Around-view monitor for transportable flight simulator created at MAI

October 19, 2022

MAI has developed an around-view monitor for a transportable flight simulator. The work was carried out by fourth- and fifth-year students of Department 301 "Automatic and Intelligent Control Systems" Sergey Melyukov, Ivan Antonov and Semen Nogtev, together with MAI graduate Borislav Ivanov, under the supervision of Vladimir Borisovich Chemodanov, head of NIO-301, Candidate of Technical Sciences, Associate Professor.

The MAI team won a third-degree diploma for this project at the competition of scientific and technical works of students and postgraduates, held at the MAI facility in Alushta, Crimea, as part of the XXXI International Scientific and Technical Conference "Modern technologies in control tasks, automation and information processing" in September 2022.

– The system can be used to provide an around view in small-sized simulators, which significantly reduces cost by eliminating the need for additional screens and makes the simulator more mobile. Many simulators are stationary, and moving them takes considerable effort, – says project participant Ivan Antonov.

The system will also be useful for assessing the psychophysiological state of pilots or operators of potentially hazardous facilities. The developers have already integrated the system into a simulator for testing; in the future, they plan to improve the head-rotation tracking algorithm using a neural network.

Another distinctive feature of the development is its use of tracking methods: a virtual reality technology that determines the position and orientation of a real object in a virtual environment using special sensors and markers.

At the moment, the system exists in two versions.

The first is implemented with a tracker carrying three infrared LEDs, a camera and a program that monitors the tracker's position in space. A simple search algorithm analyzes the image from the camera and finds the positions of the LEDs.
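A minimal sketch of this idea is shown below: the brightest spots in a camera frame are thresholded and their centroids are taken as LED positions. The threshold value, camera index and LED count are assumptions for illustration; the article does not describe the actual search algorithm.

```python
import cv2

cap = cv2.VideoCapture(0)          # tracking camera (index assumed)
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # IR LEDs appear as the brightest spots; keep only near-saturated pixels
    _, mask = cv2.threshold(gray, 240, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    # The centroid of each blob gives the image position of one LED
    leds = []
    for c in sorted(contours, key=cv2.contourArea, reverse=True)[:3]:
        m = cv2.moments(c)
        if m["m00"] > 0:
            leds.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    print("LED positions (px):", leds)
```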

The second option requires no tracker: a regular webcam is used instead. The program receives the webcam image and, with the help of a trained neural network, estimates head movements.
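A sketch of this variant, assuming a pre-trained head-pose network exported to ONNX (the file name "head_pose.onnx", its input size and its output layout are hypothetical), might look like this:

```python
import cv2

# Hypothetical pre-trained model that regresses head pose angles
net = cv2.dnn.readNetFromONNX("head_pose.onnx")

cap = cv2.VideoCapture(0)  # regular webcam
ok, frame = cap.read()
cap.release()

if ok:
    # Resize and normalize the frame into the network's input tensor
    blob = cv2.dnn.blobFromImage(frame, scalefactor=1 / 255.0,
                                 size=(224, 224), swapRB=True)
    net.setInput(blob)
    yaw, pitch, roll = net.forward().flatten()[:3]
    print(f"yaw={yaw:.1f}, pitch={pitch:.1f}, roll={roll:.1f}")
```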
