
Mobile manipulation robot for international robotics competition

Authors: Anisimov R.O., Bakaev V.S., Bakhov T.B., Goloburdin N.V., Marchuk A.M., Mostakov N.A.
Published in issue: #11(52)/2020
DOI: 10.18698/2541-8009-2020-11-656


Category: Mechanical Engineering and Machine Science | Chapter: Robots, Mechatronics, and Robotic Systems

Keywords: mobile manipulation robot, RoboCup, RoboCup@Work, Bauman Robotics Club, state machine, navigation, technical vision, RealSense, manipulator
Published: 08.12.2020

The article describes the experience of a BMSTU student team participating in the RoboCup robotics championship in the RoboCup@Work league. This championship is one of the most prestigious student robotics events, attracting university teams from all over the world. The league simulates the operation of a robot in a warehouse, so the main tasks lie in the areas of navigation and object manipulation. The main subsystems of the mobile manipulation robot are described: the navigation system, the vision system based on a combination of deep learning algorithms and classical image-processing methods, the manipulation system, and the finite state machine.
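The article itself does not include source code. Purely as an illustration of the task-level finite state machine mentioned in the abstract, the following sketch shows one common way such a machine is structured; the specific states, events, and transition table are hypothetical and not taken from the team's implementation:

```python
from enum import Enum, auto

class State(Enum):
    """Hypothetical task-level states for a warehouse pick-and-place run."""
    IDLE = auto()
    NAVIGATE = auto()   # drive to the workstation
    DETECT = auto()     # locate the object with the vision system
    GRASP = auto()      # pick and place with the manipulator
    DONE = auto()

# Transition table: (current state, event) -> next state.
# Any (state, event) pair not listed leaves the machine in its current state.
TRANSITIONS = {
    (State.IDLE, "task_received"): State.NAVIGATE,
    (State.NAVIGATE, "goal_reached"): State.DETECT,
    (State.DETECT, "object_found"): State.GRASP,
    (State.GRASP, "object_placed"): State.DONE,
}

def step(state: State, event: str) -> State:
    """Advance the machine by one event; unknown events are ignored."""
    return TRANSITIONS.get((state, event), state)
```

A table-driven design like this keeps the competition logic declarative: adding a recovery behavior means adding rows to `TRANSITIONS` rather than rewriting control flow.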


References

[1] Carstensen T., Carstensen J., Dick A., et al. Staying on top at RoboCup@Work 2016. RoboCup 2016: Robot World Cup XX. Springer, 2016, pp. 601–612.

[2] Norouzi A., Schnieders B., Zug S., et al. RoboCup@Work rulebook. RoboCup, 2019.

[3] Thrun S., Burgard W., Fox D. Probabilistic robotics. The MIT Press, 2005.

[4] Dellaert F., Fox D., Burgard W., et al. Monte Carlo localization for mobile robots. Proc. IEEE Int. Conf. Robot. Automat., 1999. DOI: https://doi.org/10.1109/ROBOT.1999.772544

[5] Fox D., Burgard W., Thrun S. The dynamic window approach to collision avoidance. IEEE Robot. Autom. Mag., 1997, vol. 4, no. 1, pp. 23–33. DOI: https://doi.org/10.1109/100.580977

[6] Dynamixel workbench. 1. Introduction. emanual.robotis.com: website. URL: http://emanual.robotis.com/docs/en/software/dynamixel/dynamixel_workbench/ (accessed: 15.09.2019).

[7] Zenkevich S.L., Yushchenko A.S., eds. Osnovy upravleniya manipulyatsionnymi robotami [Control basics of manipulation robots]. Moscow, Bauman MSTU Publ., 2004 (in Russ.).

[8] Redmon J., Divvala S., Girshick R., et al. You only look once: unified, real-time object detection. IEEE CVPR, 2016. DOI: https://doi.org/10.1109/CVPR.2016.91

[9] Deng J., Dong W., Socher R., et al. ImageNet: a large-scale hierarchical image database. IEEE CVPR, 2009. DOI: https://doi.org/10.1109/CVPR.2009.5206848

[10] Canny J. A computational approach to edge detection. IEEE Trans. Pattern Anal. Mach. Intell., 1986, vol. 8, no. 6, pp. 679–698. DOI: https://doi.org/10.1109/TPAMI.1986.4767851

[11] He K., Zhang X., Ren S., et al. Deep residual learning for image recognition. IEEE CVPR, 2016. DOI: https://doi.org/10.1109/CVPR.2016.90