Adaptive automated mobile manipulators in agile manufacturing
A complete solution was implemented only for the MiR100+UR3e mobile manipulator; for the SEIT100+YuMi IRB 14000 mobile manipulator, the image recognition unit and the pick-and-place operation were developed and tested but were not combined with the SEIT100's guided mission into a complete solution.

Two 3D markers were created (both physically and in the MiR100's API) and used to guide the MiR100+UR3e mobile manipulator into the correct position for the assigned pick-and-place tasks. For calibrating the UR3e robot arm and gripper, an orange tag attached to the gripper was used. The tag was recognized by the image recognition unit (a camera and a Jetson Nano), and the coordinate transformation between the camera coordinate frame and the UR3e's base coordinate frame was computed and sent to the mobile manipulator, ensuring that the gripper reaches the workpiece accurately during the pick-and-place operation.

Since the image processing unit and the pick-and-place operation of the SEIT100+YuMi IRB 14000 mobile manipulator were already created and tested, development of a complete solution for that platform can be continued in the future if needed.
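The camera-to-base calibration described above can be sketched as follows. This is a minimal illustration, not the project's actual implementation: all poses are hypothetical placeholder values, and in practice the tag pose in the camera frame would come from the image recognition unit while its pose in the UR3e base frame would come from the arm's forward kinematics.

```python
import numpy as np

# Hypothetical pose of the orange tag in the camera frame
# (4x4 homogeneous transform T_cam_tag); in the real system this
# comes from the tag detection on the Jetson Nano.
T_cam_tag = np.array([
    [1.0, 0.0, 0.0, 0.20],
    [0.0, 1.0, 0.0, 0.05],
    [0.0, 0.0, 1.0, 0.60],
    [0.0, 0.0, 0.0, 1.0],
])

# Hypothetical pose of the same tag in the UR3e base frame
# (T_base_tag); in the real system this comes from the arm's
# forward kinematics, since the tag sits on the gripper.
T_base_tag = np.array([
    [0.0, -1.0, 0.0, 0.30],
    [1.0,  0.0, 0.0, 0.10],
    [0.0,  0.0, 1.0, 0.25],
    [0.0,  0.0, 0.0, 1.0],
])

# Camera-to-base transform: T_base_cam = T_base_tag @ inv(T_cam_tag).
T_base_cam = T_base_tag @ np.linalg.inv(T_cam_tag)

def camera_to_base(p_cam):
    """Map a 3D point from camera coordinates to UR3e base coordinates."""
    p = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous point
    return (T_base_cam @ p)[:3]

# A workpiece position detected by the camera can now be expressed
# in the arm's base frame for the pick-and-place motion.
workpiece_base = camera_to_base([0.25, 0.00, 0.55])
```

Once `T_base_cam` is known, every workpiece position reported by the camera can be converted into the arm's base frame, which is what allows the gripper to reach the workpiece accurately after the mobile platform has docked at a marker.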
Public software repository: https://version.aalto.fi/gitlab/afof/adaptive-automated-mobile-manipulators-in-agile-manufacturing