Adaptive automated mobile manipulators in agile manufacturing

...

Agile manufacturing is an approach to manufacturing that leverages flexibility, bottom-up innovation, and augmentation to adapt quickly, through an iterative process, to changing conditions such as customer needs and market shifts, while still controlling costs and quality. The term is applied to an organization that has created the processes, tools, and training needed to do so.

Using collaborative robots integrated into automated guided vehicles (AGVs) is a new way to promote agile manufacturing, but it also brings many technological challenges. In this setup, every system on the factory floor must be resilient to unexpected changes and conditions; one example of such a requirement is how automated mobile manipulators respond to deviations in the physical location of workpieces and environment tools. This project aimed to implement and review methods that facilitate the docking and manipulation of the autonomous mobile manipulators (MiR100+UR3e and SEIT100+YuMI) available in the Aalto Factory of the Future (AFoF).

Using 3D markers as docking points in the immediate environment of the mobile manipulator proved to be challenging, as the robot's final location often deviated significantly relative to the EnAS workbench with the conveyor belts on which the workpieces were located. Image processing was also needed to find the workpieces on the production line. Additionally, a coordinate frame transformation was needed: the image recognition unit (web camera + Jetson Nano mini computer) interpreted the location of the workpiece in the camera's own coordinate frame, while the UR3e robot needed the location of the workpiece in its own base frame in order for the gripper to reach it.
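
To make the transformation step concrete, the sketch below maps a workpiece position from the camera frame into the UR3e base frame with a homogeneous transform. All numbers and frame poses here are placeholders rather than values measured in the project; the actual transform would come from the calibration described below.

```python
import numpy as np

# Hypothetical hand-eye calibration result: pose of the camera frame
# expressed in the UR3e base frame (placeholder numbers, not project data).
R_base_cam = np.array([[ 0.0, -1.0,  0.0],
                       [-1.0,  0.0,  0.0],
                       [ 0.0,  0.0, -1.0]])   # camera looking down at the workbench
t_base_cam = np.array([0.40, 0.10, 0.55])     # metres

# Assemble the 4x4 homogeneous transform from camera frame to base frame.
T_base_cam = np.eye(4)
T_base_cam[:3, :3] = R_base_cam
T_base_cam[:3, 3] = t_base_cam

# Workpiece position as reported by the image recognition unit,
# i.e. expressed in the camera's own coordinate frame.
p_cam = np.array([0.05, -0.02, 0.48, 1.0])    # homogeneous coordinates

# The same point expressed in the UR3e base frame, ready for motion planning.
p_base = T_base_cam @ p_cam
print(p_base[:3])
```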

A complete solution was implemented only for the MiR100+UR3e mobile manipulator, while the image recognition unit and pick-and-place operation of the YuMI robot were developed and tested without being combined with the SEIT100's guided mission into a complete solution. A couple of 3D markers were created (both physically and in the MiR100's API) and used to guide the MiR100+UR3e mobile manipulator into the correct position for the assigned pick-and-place tasks. Because the docking location of the mobile manipulator varied unexpectedly, a robust calibration method was needed for the UR3e robot arm and gripper. The developed method utilized an orange tag attached to the gripper: the image recognition unit (camera + Jetson Nano) recognized the tag, located the robot arm in the picture frame, and calculated the distance between the gripper and the workpiece from that same picture. The appropriate coordinate transformation (between the camera coordinate frame and the UR3e's base coordinate frame) was then calculated and sent to the mobile manipulator so that the gripper would reach the workpiece accurately in the pick-and-place operation. For communication with the robots, a REST API, web services, and sockets were used.
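
As an illustration of the tag-based calibration, a minimal OpenCV sketch of finding the orange tag in the camera image is shown below. The HSV bounds and camera index are assumptions that would have to be tuned for the actual lighting and hardware in the AFoF cell.

```python
import cv2
import numpy as np

# Rough HSV range for an orange tag; these bounds are guesses and would
# need tuning for the lighting conditions in the cell.
ORANGE_LO = np.array([5, 120, 120])
ORANGE_HI = np.array([20, 255, 255])

def find_tag_centroid(frame):
    """Return the (u, v) pixel centroid of the orange tag, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, ORANGE_LO, ORANGE_HI)
    m = cv2.moments(mask)
    if m["m00"] == 0:                  # no orange pixels found
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)              # web camera index on the Jetson Nano (assumed)
ok, frame = cap.read()
cap.release()
if ok and (tag := find_tag_centroid(frame)) is not None:
    # The pixel offset between this tag (i.e. the gripper) and the detected
    # workpiece in the same image yields the calibration correction.
    print("gripper tag at pixel", tag)
```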
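
On the communication side, queueing a predefined docking mission on the MiR100 over its REST interface could look roughly like the following. The robot address, credentials, and mission GUID are placeholders, and the endpoint paths follow the MiR REST API v2.0.0 naming; verify them against the robot's own API documentation.

```python
import requests

MIR_API = "http://192.168.12.20/api/v2.0.0"        # placeholder robot address
HEADERS = {
    "Content-Type": "application/json",
    "Authorization": "Basic <token>",              # placeholder auth token
}

# GUID of a mission (defined in the MiR web interface) that drives the
# robot to the 3D marker and docks; the GUID below is a placeholder.
DOCK_MISSION_ID = "00000000-0000-0000-0000-000000000000"

# Append the docking mission to the robot's mission queue.
resp = requests.post(f"{MIR_API}/mission_queue",
                     json={"mission_id": DOCK_MISSION_ID},
                     headers=HEADERS, timeout=5)
resp.raise_for_status()

# Read back the robot state, e.g. "Ready" or "Executing".
status = requests.get(f"{MIR_API}/status", headers=HEADERS, timeout=5).json()
print(status.get("state_text"))
```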

Finally, a complete solution was developed for the MiR100+UR3e mobile manipulator for red and green circular workpieces, while the solution for the SEIT100+YuMI IRB 14000 mobile manipulator was only partially completed: the image processing unit and pick-and-place operation were created and tested. Development of the SEIT100+YuMI IRB 14000 solution can therefore be continued in the future if there is a need for it.
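
As a sketch of how the red and green circular workpieces could be detected, the snippet below combines a Hough circle transform with a simple colour check. All detector parameters are assumptions to be tuned on real conveyor images, not values from the project.

```python
import cv2
import numpy as np

def detect_workpieces(frame):
    """Find circular workpieces and classify each one as 'red' or 'green'."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)                 # suppress sensor noise
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=40, param1=100, param2=30,
                               minRadius=10, maxRadius=60)
    found = []
    if circles is None:
        return found
    for u, v, r in np.round(circles[0]).astype(int):
        # Average the colour inside the circle to tell red from green.
        mask = np.zeros(gray.shape, dtype=np.uint8)
        cv2.circle(mask, (u, v), r, 255, -1)
        blue, green, red, _ = cv2.mean(frame, mask=mask)
        found.append(((u, v, r), "red" if red > green else "green"))
    return found
```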

...