
Proceedings of the Southwest State University


Virtual Interface Technology in the Process of Simulation of Complex Functional Modules of Control Systems for Industrial Robots and Multi-Axis Mechatronic Systems

https://doi.org/10.21869/2223-1560-2022-26-1-92-115

Abstract

Purpose of research. Development of a tool for debugging the algorithms of intelligent control systems, including the development of a vision system and the planning of a programmed trajectory for an industrial robot.
Methods. To achieve this goal, a review of existing simulation tools was carried out. A protocol of contactless human-robot interaction is presented. An algorithm for recognizing gesture commands, based on differences of three-dimensional binary micro-blocks and on the construction of a skeleton of the human body, has been developed. An example of using the ROBOGuid software tool to simulate the motion of an industrial robot during the development and debugging of custom control methods oriented towards real objects is presented.
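
The fragment below is a minimal illustrative sketch of the spatio-temporal binary micro-block idea mentioned above, in the spirit of volume local binary patterns [23] and dense micro-block differences [26]. The block size, the neighbour scheme and the chi-square comparison are assumptions made only for illustration, not the authors' exact algorithm.

# Sketch (assumed, not the authors' implementation): a gesture descriptor built
# from binarized differences of three-dimensional (spatio-temporal) micro-blocks.
import numpy as np

def micro_block_means(volume, block=(4, 8, 8)):
    """Average intensity of each non-overlapping t x h x w micro-block of a clip."""
    t, h, w = block
    T, H, W = volume.shape
    T, H, W = T - T % t, H - H % h, W - W % w          # crop to a whole number of blocks
    v = volume[:T, :H, :W].reshape(T // t, t, H // h, h, W // w, w)
    return v.mean(axis=(1, 3, 5))                       # shape: (T//t, H//h, W//w)

def binary_microblock_difference(volume, block=(4, 8, 8)):
    """Binarized differences between neighbouring micro-blocks along time, height, width."""
    m = micro_block_means(volume, block)
    bits = []
    for axis in range(3):                               # temporal, vertical, horizontal neighbours
        diff = np.diff(m, axis=axis)
        bits.append((diff > 0).astype(np.uint8).ravel())
    bits = np.concatenate(bits)
    packed = np.packbits(bits[: bits.size - bits.size % 8])   # 8-bit codes
    hist, _ = np.histogram(packed, bins=256, range=(0, 256), density=True)
    return hist                                          # fixed-length clip descriptor

# Usage: descriptors of two gesture clips compared by a chi-square distance.
clip_a = np.random.rand(32, 64, 64)                     # stand-in for a grayscale/depth clip
clip_b = np.random.rand(32, 64, 64)
d_a, d_b = binary_microblock_difference(clip_a), binary_microblock_difference(clip_b)
chi2 = 0.5 * np.sum((d_a - d_b) ** 2 / (d_a + d_b + 1e-9))
print(f"chi-square distance between gesture descriptors: {chi2:.4f}")

In the described approach such appearance-based descriptors would be combined with skeleton keypoints of the human body before classification; that fusion step is omitted here.
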
Results. The use of digital twins of technological equipment to simulate and display real technological processes in a virtual environment, in the context of the emerging Industry 4.0 concept and the sixth technological paradigm, makes it possible to improve the main and auxiliary production processes, as well as to analyze, study and evaluate the economic efficiency of new technological and technical solutions. Simulation also allows developing ergonomic ways for humans to interact with mechatronic objects. The solution proposed in this work was tested on the example of tracing a complex spatial contour that simulates the milling of a part. Experimental studies of the proposed gesture command recognition algorithm were carried out on the publicly available UCF101 dataset, and the results were compared with known approaches to human action recognition.
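
The sketch below shows the kind of evaluation reported above: top-1 recognition accuracy over an official UCF101 test split [28]. The file paths and the classify() call are hypothetical placeholders; the authors' actual experimental protocol may differ.

# Sketch (assumed protocol): top-1 accuracy on a UCF101 split list, where the
# class name is the directory prefix of each clip path, e.g.
# "ApplyEyeMakeup/v_ApplyEyeMakeup_g08_c01.avi".
from pathlib import Path

def load_split(split_file: Path) -> dict:
    """Map clip file name -> class name from an official UCF101 split list."""
    labels = {}
    for line in split_file.read_text().splitlines():
        rel = line.split()[0]                    # drop the optional numeric label column
        labels[Path(rel).name] = rel.split("/")[0]
    return labels

def top1_accuracy(predictions: dict, ground_truth: dict) -> float:
    """Fraction of clips whose predicted class matches the ground-truth class."""
    common = [clip for clip in predictions if clip in ground_truth]
    correct = sum(predictions[clip] == ground_truth[clip] for clip in common)
    return correct / max(len(common), 1)

# Usage (hypothetical paths and classifier):
# gt = load_split(Path("ucfTrainTestlist/testlist01.txt"))
# preds = {name: classify(name) for name in gt}   # classify() would wrap the descriptor above
# print(f"top-1 accuracy: {top1_accuracy(preds, gt):.3f}")
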
Conclusion. The developed interface module was tested on the example of tracing a complex spatial contour that simulates the milling of a part; the proposed contactless robot control method demonstrated its effectiveness and the relevance of further work in this direction.

About the Authors

A. A. Zelensky
Moscow State University of Technology "STANKIN"
Russian Federation

Alexander A. Zelensky, Cand. of Sci. (Engineering), Associate Professor, Director of the Institute of Digital Intelligent Systems

1 Vadkovsky lane, Moscow 127055

Researcher ID: AAG-2201-2019



M. M. Zhdanova
Moscow State University of Technology "STANKIN"
Russian Federation

Marina M. Zhdanova, Junior Researcher

1 Vadkovsky lane, Moscow 127055

Researcher ID: A-2068-2014



T. Kh. Abdullin
Moscow State University of Technology "STANKIN"
Russian Federation

Tagir Kh. Abdullin, Leading Engineer, Lecturer at the Department of Industrial Electronics and Intelligent Digital Systems

1 Vadkovsky lane, Moscow 127055



V. V. Voronin
Moscow State University of Technology "STANKIN"
Russian Federation

Viacheslav V. Voronin, Cand. of Sci. (Engineering), Associate Professor, Deputy Director of the Center for Cognitive Technologies and Machine Vision

1 Vadkovsky lane, Moscow 127055

Researcher ID: H-7334-2013



References

1. Gribkov A. A., Pivkin P. M., Zelenskii A. A. Industriya 4.0 v stankostroenii [State Industrial Policy and the Machine-Tool Industry]. Russian Engineering Research. 2021, 41(4): 342-346. https://doi.org/10.3103/S1068798X21040092

2. SinuTrain for SINUMERIK Operate. Siemens; 2021 [cited November 10, 2021]. Access: https://new.siemens.com/global/en/markets/machinebuilding/machine-tools/cnc4you/sinutrainuebersicht.html.

3. ShopTurn; 2021 [cited November 10, 2021]. Access: https://docviewer.yandex.ru/?tm=1634583248&tld=ru&lang=ru&name=TUST_0404_ru.pdf&text=ShopTurn&url=https%3A//support.industry.siemens.com/dl/files/019/28803019/att_32871/v1/TUST_0404_ru.pdf&lr=213&mime=pdf&l10n=ru&sign=d0d5accb52e1d8a43751b49e7d44e6ce&keyno=0.

4. ShopMill; 2021 [cited November 10, 2021]. Access: https://docviewer.yandex.ru/?tm=1634583319&tld=ru&lang=ru&name=TUSM_1209_ru_ru-RU.pdf&text=ShopMill&url=https%3A//support.industry.siemens.com/dl/files/295/41131295/att_80704/v1/TUSM_1209_ru_ru-RU.pdf&lr=213&mime=pdf&l10n=ru&sign=89ed36f9c3c96b52801a3f382b11df71&keyno=0.

5. Vericut; 2021 [cited November 10, 2021]. Access: http://vericut.ru/index.php/products/products

6. Simulate Robot Applications. RoboDK; 2021 [cited November 10, 2021]. Access: https://robodk.com/index.

7. ROBOGuid; 2021 [cited November 10, 2021]. Access: https://www.fanuc.eu/ru/ru/роботы/принадлежности/roboguide.

8. Zelensky A.A., Kharkov M.A., Ivanovsky S.P., Abdullin T.Kh. Vysokoproizvoditel'naya sistema chislovogo programmnogo upravleniya na baze programmiruyemykh logicheskikh integral'nykh skhem [High-performance numerical control system based on programmable logic integrated circuits]. Vestnik Voronezhskogo gosudarstvennogo tekhnicheskogo universiteta = Bulletin of Voronezh State Technical University. 2018, 14(5): 8–12. (In Russ.)

9. Zelenskii A.A., Abdullin T.Kh., Ilyukhin Yu.V., Kharkov M.A. Vysokoproizvoditel'naya tsifrovaya sistema na osnove PLIS dlya upravleniya dvizheniyem mnogokoordinatnykh stankov i promyshlennykh robotov [High performance FPGA based digital motion control system for multi-axis machine tools and industrial robots]. STIN=STIN. 2019, (8): 5-8. (In Russ.)

10. Zelenskii A.A., Abdullin T.Kh., Alepko A.V. Osobennosti postroyeniya v real'nom vremeni s-obraznoy krivoy razgona/tormozheniya pri kusochno-lineynoy interpolyatsii poverkhnostey slozhnoy formy [Features of real-time construction of an S-shaped acceleration/deceleration curve with piecewise-linear interpolation of complex-shaped surfaces]. Robototekhnika i tekhnicheskaya kibernetika = Robotics and technical cybernetics. 2021, 9(3): 17-26. (In Russ.)

11. Zhao H., Zhu L.M., Ding H. A real-time look-ahead interpolation methodology with curvature-continuous B-spline transition scheme for CNC machining of short line segments. International Journal of Machine Tools & Manufacture. 2013, 65: 88-98. https://doi.org/10.1016/j.ijmachtools.2012.10.005

12. Zelensky A., Semenishchev E., Alepko A., Abdullin T., Ilyukhin Y., Voronin V. Using neuro-accelerators on FPGAs in collaborative robotics tasks. Proceedings Volume 11876, Optical Instrument Science, Technology, and Applications II. 2021, 118760O. https://doi.org/10.1117/12.2600582

13. Zelensky A., Alepko A., Dubovskov V., Kuptsov V. Heterogeneous neuromorphic processor based on RISC-V architecture for real-time robotics tasks. Proceedings of SPIE - The International Society for Optical Engineering. Vol. 2, Ser. "Artificial Intelligence and Machine Learning in Defense Applications II". 2020, 115430L. https://doi.org/10.1117/12.2574470

14. Karel. Wikipedia; 2021 [updated February 14, 2021; cited November 10, 2021]. Access: https://ru.wikipedia.org/wiki/Karel.

15. TCP/IP. Wikipedia; 2021 [updated October 28, 2021; cited November 10, 2021]. Access: https://ru.wikipedia.org/wiki/TCP/IP.

16. Chen C., Jafari R., Kehtarnavaz N. UTD-MHAD: A Multimodal Dataset for Human Action Recognition Utilizing a Depth Camera and a Wearable Inertial Sensor. Proceedings of IEEE International Conference on Image Processing, 2015. https://doi.org/10.1109/ICIP.2015.7350781.

17. Serrano-Cuerda J., Fernández-Caballero A., López M. Selection of a visible-light vs. thermal infrared sensor in dynamic environments based on confidence measures. Applied Sciences. 2014, 4(3): 331-350. https://doi.org/10.3390/app4030331

18. Zhdanova M.M., Voronin V.V., Sizyakin R.A., Gapon N.V., Balabaeva O.S. Model' ob"yedineniya izobrazheniy, poluchennykh s datchikov razlichnoy prirody [A model for combining images obtained from sensors of different nature]. Dinamika tekhnicheskikh sistem «DTS-2019»: sb. tr. XV mezhdunar. nauch.-tekhn. Konf.= Proc. Dynamics of technical systems "DTS-2019". 2019, 81-84. (In Russ.)

19. Voronin V., Zhdanova M., Semenishchev E., Zelensky A., Tokareva O. Fusion of color and depth information for human actions recognition. Signal Processing, Sensor/Information Fusion, and Target Recognition XXIX. 2020, Vol. 11423. https://doi.org/10.1117/12.2560130

20. Solmaz B., Assari S. M., Shah M. Classifying web videos using a global video descriptor. Machine vision and applications. 2013, 24(7): 1473-1485. https://doi.org/10.1007/s00138-012-0449-x

21. Zhdanova M., Voronin V., Semenishchev E., Ilyukhin Y., Zelensky A. Human activity recognition for efficient human-robot collaboration. Proc. International Society for Optics and Photonics. 2020, 115430K. https://doi.org/10.1117/12.2574133.

22. Wan Q. et al. Face description using anisotropic gradient: thermal infrared to visible face recognition. Mobile Multimedia/Image Processing, Security, and Applications. 2018, 106680V. https://doi.org/10.1117/12.2304898

23. Zhao G., Pietikäinen M. Dynamic texture recognition using volume local binary patterns. Springer. 2006, 165-177. https://doi.org/10.1007/978-3-540-70932-9_13

24. Maenpaa T. The Local Binary Pattern Approach to Texture Analysis: Extensions and Applications. 2004. https://www.elibrary.ru/item.asp?id=8860330.

25. Zhai D., Liu X., Chang H., Zhen Y., Chen X., Guo M., Gao W. Parametric local multiview hamming distance metric learning. Pattern Recognition. 2018, 75: 250-262. https://doi.org/10.1016/j.patcog.2017.06.018

26. Mehta R., Eguiazarian K. E. Texture classification using dense micro-block difference. IEEE Transactions on Image Processing. 2016, 25(4): 1604-1616. https://doi.org/10.1109/TIP.2016.2526898

27. Belagiannis V., Zisserman A. Recurrent human pose estimation. 12th IEEE International Conference on Automatic Face & Gesture Recognition. 2017: 468-475. https://doi.org/10.1109/FG.2017.64

28. Soomro K., Zamir A. R., Shah M. UCF101: A Dataset of 101 Human Action Classes From Videos in The Wild. CRCV-TR-12-01. 2012. https://arxiv.org/abs/1212.0402

29. Peng X., Wang L., Wang X., Qiao Y. Bag of visual words and fusion methods for action recognition: comprehensive study and good practice. Computer Vision and Image Understanding. 2016, 150: 109-125. https://doi.org/10.1016/j.cviu.2016.03.013



For citations:


Zelensky A.A., Zhdanova M.M., Abdullin T.Kh., Voronin V.V. Virtual Interface Technology in the Process of Simulation of Complex Functional Modules of Control Systems for Industrial Robots and Multi-Axis Mechatronic Systems. Proceedings of the Southwest State University. 2022;26(1):92-115. (In Russ.) https://doi.org/10.21869/2223-1560-2022-26-1-92-115



This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 2223-1560 (Print)
ISSN 2686-6757 (Online)