
Motion planning of radioactive source grasping robot based on memory reasoning

Journal of Shenzhen University Science and Engineering [ISSN: 1000-2618 / CN: 44-1401/N]

Issue:
2022, Vol. 39, No. 3 (237-362)
Page:
343-348
Research Field:
Electronics and Information Science

Info

Title:
Motion planning of radioactive source grasping robot based on memory reasoning
Author(s):
NAN Wenhu 1), XU Fumin 1) and YE Bosheng 2)
1) School of Mechanical and Electrical Engineering, Lanzhou University of Technology, Lanzhou 730050, Gansu Province, P. R. China
2) The National CNC Engineering Center, Huazhong University of Science and Technology, Wuhan 430074, Hubei Province, P. R. China
Keywords:
intelligent robot; force feedback; historical grasping data; memory reasoning decision; autonomous grasping; Gazebo simulator; radioactive source grasping
CLC number:
TP24
DOI:
10.3724/SP.J.1249.2022.03343
Abstract:
Machine vision is difficult to apply to radioactive source grasping because of the semi-closed, strongly radioactive environment inside the lead can. To address this problem, we propose a reinforcement learning grasping method based on memory reasoning. The kinematics model of the intelligent robot grasping system is constructed based on machine vision, and the interaction between the intelligent robot and the internal environment of the lead can is realized by force feedback. Through memory reasoning decisions over historical grasping data, autonomous grasping of radioactive sources is achieved. Using the Gazebo simulator in the robot operating system (ROS), the Monte Carlo sampling method and the memory-reasoning-based reinforcement learning grasping method are simulated, respectively. The results show that the reinforcement learning grasping method based on memory reasoning achieves an average grasping efficiency of 84.67%, higher than that of the Monte Carlo sampling method, which demonstrates that the method can effectively solve the problem of autonomous grasping of radioactive sources in lead cans.
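As a rough illustration of the memory-reasoning decision described in the abstract, the sketch below keeps a memory of past grasp attempts (force-feedback reading, grasp pose, outcome) and, given a new force reading, reuses the pose of the most similar successful attempt, otherwise falling back to random exploration. All names (GraspRecord, GraspMemory, choose_grasp), the Euclidean similarity measure and the epsilon-greedy fallback are illustrative assumptions, not the implementation used in the paper.

import math
import random
from dataclasses import dataclass, field

@dataclass
class GraspRecord:
    """One historical grasping attempt kept in memory."""
    force_reading: tuple   # force-feedback features observed before grasping
    grasp_pose: tuple      # commanded gripper pose, e.g. (x, y, z, yaw)
    success: bool          # whether the source was lifted out of the lead can

@dataclass
class GraspMemory:
    """Stores historical grasping data and retrieves similar past states."""
    records: list = field(default_factory=list)

    def add(self, record: GraspRecord) -> None:
        self.records.append(record)

    def recall(self, force_reading, k=5):
        # k stored attempts whose force readings are closest (Euclidean) to the current one
        return sorted(self.records,
                      key=lambda r: math.dist(r.force_reading, force_reading))[:k]

def choose_grasp(memory: GraspMemory, force_reading, candidate_poses, epsilon=0.1):
    """Memory-reasoning decision: reuse poses that succeeded under similar
    force-feedback states; explore randomly with probability epsilon."""
    if not memory.records or random.random() < epsilon:
        return random.choice(candidate_poses)
    successes = [r for r in memory.recall(force_reading) if r.success]
    if not successes:
        return random.choice(candidate_poses)
    best_past = successes[0]  # most similar past success
    return min(candidate_poses,
               key=lambda p: math.dist(p, best_past.grasp_pose))

In such a scheme, the random fallback plays the role of exploration in reinforcement learning: early attempts populate the memory, and later decisions increasingly reuse force-feedback states that previously led to successful grasps.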

References:

[1] 余玉琴,魏国亮,王永雄.基于改进YOLO v2的无标定3D机械臂自主抓取方法[J].计算机应用研究,2020,37(5):1450-1455.
YU Yuqin, WEI Guoliang, WANG Yongxiong. 3D uncalibrated robotic grasping method based on improved YOLO v2 [J]. Application Research of Computers, 2020, 37(5): 1450-1455.(in Chinese)
[2] 童磊.面向机器人抓取的零件识别与定位方法研究[D].厦门:华侨大学,2018.
TONG Lei. Research on parts identification and positioning for robot crawling [D]. Xiamen: Huaqiao University, 2018.(in Chinese)
[3] 何涛.基于视觉定位的单向器自动装配系统设计与实现[D].杭州:浙江工业大学,2015.
HE Tao. The design and realization of starter driver’s automatic assembly based on visual detection [D]. Hangzhou: Zhejiang University of Technology, 2015.(in Chinese)
[4] YU Kuanting, BAUZA M, FAZELI N, et al. More than a million ways to be pushed: a high-fidelity experimental dataset of planar pushing [C]// IEEE/RSJ International Conference on Intelligent Robots and Systems. Daejeon, Korea (South): IEEE, 2016: 30-37.
[5] 张森彦,田国会,张营,等.一种先验知识引导的基于二阶段渐进网络的自主抓取策略[J].机器人,2020,42(5):513-524.
ZHANG Senyan, TIAN Guohui, ZHANG Ying, et al. An autonomous grasping strategy based on two-stage progressive network guided by prior knowledge [J]. Robot, 2020, 42(5): 513-524.(in Chinese)
[6] 周祺杰,刘满禄,李新茂,等.基于深度强化学习的固体放射性废物抓取方法研究[J].计算机应用研究,2020,37(11):3363-3367.
ZHOU Qijie, LIU Manlu, LI Xinmao, et al. Research on solid radioactive waste grasping method based on deep reinforcement learning [J]. Application Research of Computers, 2020, 37(11): 3363-3367.(in Chinese)
[7] 薛腾,刘文海,潘震宇,等.基于视觉感知和触觉先验知识学习的机器人稳定抓取[J].机器人,2021,43(1):1-8.
XUE Teng, LIU Wenhai, PAN Zhenyu, et al. Stable robotic grasp based on visual perception and prior tactile knowledge learning [J]. Robot, 2021, 43(1): 1-8.(in Chinese)
[8] 崔少伟,魏俊杭,王睿,等.基于视触融合的机器人抓取滑动检测[J].华中科技大学学报自然科学版,2020,48(1):98-102.
CUI Shaowei, WEI Junhang, WANG Rui, et al. Robotic grasp slip detection based on visual-tactile fusion [J]. Journal of Huazhong University of Science and Technology (Natural Science Edition), 2020, 48(1): 98-102.(in Chinese)
[9] 惠文珊,李会军,陈萌,等.基于CNN-LSTM的机器人触觉识别与自适应抓取控制[J].仪器仪表学报,2019,40(1):211-218.
HUI Wenshan, LI Huijun, CHEN Meng, et al. Robotic tactile recognition and adaptive grasping control based on CNN-LSTM [J]. Chinese Journal of Scientific Instrument, 2019, 40(1): 211-218.(in Chinese)
[10] FALLAHINIA N, MASCARO S A. Comparison of constrained and unconstrained human grasp forces using fingernail imaging and visual servoing [C]// IEEE International Conference on Robotics and Automation. Piscataway: IEEE, 2020: 1-7.
[11] 张磊,张华希,方正,等.一种基于自适应蒙特卡罗定位的机器人定位方法:CN104180799A[P]. 2014.
ZHANG Lei, ZHANG Huaxi, FANG Zheng, et al. A robot localization method based on adaptive Monte Carlo localization: CN104180799A [P]. 2014.(in Chinese)
