Adaptive automated mobile manipulators in agile manufacturing
Basic Information
Project ID: AEE-2021
Students: David Wargh, Marcus Myllyviita, Rudolf Stråhlmann
Project manager: Rudolf Stråhlmann
Instructor: Ronal Bejarano Rodriguez
Other advisors: -
Starting date: 12.1.2021
Completion date: 3.6.2021
Agile manufacturing is an approach to manufacturing that leverages flexibility, bottom-up innovation, and augmentation to adapt quickly, through an iterative process, to changing conditions such as customer needs and market shifts while still controlling costs and quality. The term is applied to an organization that has put in place the processes, tools, and training needed to do so.
Integrating collaborative robots with automated guided vehicles (AGVs) is a new way to promote agile manufacturing, but it also brings many technological challenges. In such a setup, every system on the factory floor is expected to remain resilient under unexpected changes and conditions. One example of such a requirement is how automated mobile manipulators respond to deviations in the physical location of workpieces and tools in the environment. This project aims to implement and review methods that facilitate the docking and manipulation of the autonomous mobile manipulators (MiR100+UR3e and SEIT100+YuMi) available in the Aalto Factory of the Future (AFoF).
A complete solution was implemented only for the MiR100+UR3e mobile manipulator; the image recognition unit and pick-and-place operation of the YuMi robot were developed and tested but not yet combined with the SEIT100's guided mission into a complete solution. A couple of 3D markers were created (both physically and in the MiR100's API) and used to guide the MiR100+UR3e mobile manipulator into the correct position for the assigned pick-and-place tasks. For calibrating the UR3e robot arm and gripper, an orange tag attached to the gripper was used. The tag was recognized by the image recognition unit (a camera and a Jetson Nano), and the coordinate transformation between the camera coordinate frame and the UR3e's base coordinate frame was calculated and sent to the mobile manipulator so that the gripper would reach the workpiece accurately during the pick-and-place operation. The solution for the SEIT100+YuMi IRB 14000 mobile manipulator thus remains partially completed, as its image recognition unit and pick-and-place operation were created and tested, leaving the option of continuing its development in the future if the need arises.
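The marker-guided docking step can be illustrated with a short sketch. The Python snippet below queues a pre-defined docking mission on the MiR100 through its REST interface; the host address, authorization token, and mission name are placeholders rather than values from the project, and the mission itself (referencing the 3D marker) would be created beforehand in the MiR web interface.

```python
# Sketch: queue a marker-docking mission on the MiR100 over its REST API.
# Host, credentials and mission name are assumptions, not project values.
import requests

MIR_HOST = "http://192.168.1.100/api/v2.0.0"   # robot IP is a placeholder
HEADERS = {
    "Authorization": "Basic <token from MiR web interface>",  # placeholder
    "Content-Type": "application/json",
}

def queue_mission(mission_name):
    """Look up a mission by name and append it to the MiR mission queue."""
    missions = requests.get(f"{MIR_HOST}/missions", headers=HEADERS).json()
    mission = next((m for m in missions if m["name"] == mission_name), None)
    if mission is None:
        raise ValueError(f"Mission '{mission_name}' not found on the robot")
    resp = requests.post(
        f"{MIR_HOST}/mission_queue",
        headers=HEADERS,
        json={"mission_id": mission["guid"]},
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Drive the MiR100 to the marker-guided docking position before the
    # UR3e starts its pick-and-place sequence.
    print(queue_mission("dock_to_assembly_marker"))
```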
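The calibration step relied on the image recognition unit finding the orange tag in the camera image. A minimal sketch of such a detector, assuming a plain OpenCV colour-threshold approach; the HSV bounds and camera index are illustrative values, not the project's actual configuration.

```python
# Sketch: locate the orange calibration tag in a camera frame by HSV
# thresholding. Threshold values and camera index are assumptions.
import cv2
import numpy as np

def find_orange_tag(frame_bgr):
    """Return the pixel centroid (u, v) of the largest orange blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough HSV range for orange; would need tuning for the real tag/lighting.
    lower, upper = np.array([5, 120, 120]), np.array([20, 255, 255])
    mask = cv2.inRange(hsv, lower, upper)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # sub-pixel centroid

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)          # camera index is an assumption
    ok, frame = cap.read()
    if ok:
        print("Tag centroid:", find_orange_tag(frame))
    cap.release()
```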
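The frame change itself amounts to a single homogeneous transformation: a workpiece position measured in the camera frame is mapped into the UR3e base frame using the camera-to-base transform obtained from the tag calibration. A minimal numpy sketch, with all numeric values as placeholders:

```python
# Sketch: express a camera-frame measurement in the UR3e base frame using a
# 4x4 homogeneous transform. All numeric values below are placeholders.
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Placeholder calibration result: camera frame expressed in the UR3e base frame.
T_base_cam = make_transform(
    rotation=np.array([[0, -1, 0],
                       [-1, 0, 0],
                       [0, 0, -1]], dtype=float),   # camera looking down (assumed)
    translation=np.array([0.40, 0.10, 0.55]),        # metres, assumed
)

# Workpiece position measured by the vision unit, in camera coordinates (metres),
# written in homogeneous coordinates.
p_cam = np.array([0.02, -0.05, 0.48, 1.0])

# Same point in the UR3e base frame: this is the pick target sent to the manipulator.
p_base = T_base_cam @ p_cam
print("Pick target in UR3e base frame [m]:", p_base[:3])
```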
Public software repository: https://version.aalto.fi/gitlab/afof/adaptive-automated-mobile-manipulators-in-agile-manufacturing