MAI will test new control system for space robots
Alexey Koltovsky, a third-year student of Aerospace Institute No. 6, is developing a software package for unmanned, autonomous control of a space robot. The software determines the device's position and the distance it has traveled from camera data.
The technology is expected to increase the speed of such vehicles severalfold when exploring celestial bodies by simplifying assessment of the environment.
– A signal from a device on the surface of Mars takes about 15 minutes to reach Earth, and just as long to send a response back, while assessment of the environment is often hampered by the poor quality of images from the control cameras, – says Alexey Koltovsky. – With the new technology, it becomes possible to assess the situation around the device in a three-dimensional representation. At the same time, the robot will be able to switch to full autonomy in decision-making and route planning.
The software package is based on visual odometry and simultaneous localization and mapping (SLAM).
– The complex will allow the robot to build a map of the terrain from visual data alone, without an accelerometer or other measuring instruments, to visually detect and overcome obstacles, and to return to the point of the last signal reception if the signal is lost, – says the MAI resident.
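The article does not describe the package's internals, but the dead-reckoning step of visual odometry (tracking position and distance traveled, as mentioned above) can be sketched in a few lines. This is a minimal 2D illustration assuming per-frame relative motion estimates (heading change plus forward step) have already been extracted from the camera images by an upstream estimator; it is not the author's actual implementation.

```python
import numpy as np

def accumulate_poses(relative_motions):
    """Chain per-frame relative motions (heading change, forward step)
    into a global trajectory, as a visual-odometry back end would.

    relative_motions: list of (dtheta, step) pairs, assumed to come
    from a hypothetical frame-to-frame motion estimator.
    Returns final (x, y) position, heading, and distance traveled.
    """
    x = y = theta = dist = 0.0
    for dtheta, step in relative_motions:
        theta += dtheta                # update heading first
        x += step * np.cos(theta)      # advance along current heading
        y += step * np.sin(theta)
        dist += abs(step)              # odometric distance traveled
    return (x, y), theta, dist

# Example: four 90-degree left turns of 1 m each trace out a square,
# so the robot ends up (numerically) back at the origin.
pos, heading, dist = accumulate_poses([(np.pi / 2, 1.0)] * 4)
```

In a real pipeline the per-frame motions would come from feature matching between consecutive images, and 3D poses would be composed as rotation matrices rather than a single heading angle; the accumulation logic stays the same.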
To implement this, he uses Python with machine learning and computer vision libraries. Since the on-board computers of autonomous vehicles are often limited in computing power, Alexey is now optimizing the algorithms and datasets to improve the program's performance.
– For example, without optimization the OpenCV-based pipeline processes about 0.82 frames per second, while optimization makes it possible to reach 8–10 frames per second, – he says. – If successful, the technology could be applied in many areas, but first of all I am focused on increasing the autonomy of research spacecraft.
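The quoted figures imply a speedup of roughly 11x and a much tighter per-frame time budget. The sketch below works through that arithmetic and illustrates one generic optimization often used on weak on-board computers, downscaling frames before processing, which cuts pixel count quadratically. The article does not say which optimizations Alexey applies; the downscale is a hypothetical example.

```python
import numpy as np

# Frame rates quoted in the article
fps_before = 0.82
fps_after = 9.0                       # midpoint of the 8-10 range

budget_before = 1.0 / fps_before      # ~1.22 s available per frame
budget_after = 1.0 / fps_after        # ~0.11 s per frame
speedup = fps_after / fps_before      # ~11x

# Hypothetical optimization: stride-2 downscale of a mock grayscale
# frame, leaving 4x fewer pixels for downstream feature extraction.
frame = np.zeros((480, 640), dtype=np.uint8)
small = frame[::2, ::2]
pixel_ratio = frame.size / small.size   # 4.0
```

Real gains would also come from limiting feature counts, reusing buffers, and pruning datasets, but the budget calculation above is the starting point for any such tuning.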
For Russia, the project is arguably unique in adapting such software systems to space tasks and to the limited computing resources of on-board devices.
– The ability to visually identify and overcome obstacles was partially implemented in the Perseverance rover, launched by NASA in 2020. Images from its cameras were interpreted into three-dimensional surface maps, which made it possible to plan routes more efficiently. This technology helped increase the rover's driving speed sixfold, from 20 m/h to 120 m/h, – notes Alexey.
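Once camera images have been turned into a map, route planning reduces to a graph search. As a minimal sketch, the breadth-first search below finds a route around an obstacle on a small occupancy grid (0 = free, 1 = blocked); rover planners use far richer cost maps, but the principle is the same. This example is illustrative and not drawn from either Perseverance's or Alexey's software.

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first route search on an occupancy grid.
    Returns the list of cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                 # reconstruct the route
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None                          # goal unreachable

# A wall down the middle forces the route around the bottom.
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
route = plan_route(grid, (0, 0), (0, 2))
```

Swapping BFS for A* with a terrain-cost heuristic is the usual next step when cells have different traversal costs.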
The first tests of the development on a six-wheeled robot are scheduled for spring.