Projects:2021s2-63332 A collaborative solution for advancing Australian manufacturers

Revision as of 18:18, 8 June 2022 by A1755037 (talk | contribs) (Method)

Abstract

Introduction

Collaborative technology is the frontier innovation for taking advanced autonomous systems that already work intelligently, such as those assisting in search and rescue, bushfire fighting and military missions, and making them even more effective by having them work collaboratively. Human-machine collaboration is a more effective use of autonomous and robotic systems because it combines human creativity and cognition with machine accuracy and efficiency. Particularly in the manufacturing sector, humans and machines working interactively side by side are more productive than either working alone.

Project team

Project students

  • Liangchen Shu
  • Yuzuo Zhu
  • Guangzu Shi

Supervisors

  • Prof. Peng Shi
  • A/Prof. Rini Akmeliawati

Advisors

  • Xin Yuan
  • Yang Fei

Objectives

The aim of the project is to design a human-robot collaboration approach that improves Australian manufacturing. The project focuses on one scenario in which the human-robot collaboration method is applied, in order to prove the feasibility of the design. To achieve this goal, a specific human-robot collaboration innovation must be confirmed for further design, and robotics and autonomous control theory are integrated into the design to implement the scenario.

There are three objectives to achieve the project aim:

* Identify innovations to which collaborative technologies could be applied.
* Build a practical scenario and develop a cobot prototype applicable to Australian manufacturers.
* Prove the feasibility of the designed human-robot collaboration system.

Background

Method

Virtual Platform Design

Construction

To implement the aircraft-parts inspection scenario on a virtual platform, CoppeliaSim (formerly V-REP) is used as the simulation environment. Its built-in API functions can be integrated with MATLAB to realise the designed system functions, and several existing models inside the software can be used directly to create the scene. The robotic arm is the Universal Robots UR5e, the most suitable choice because it is the largest of the available arms: "The UR5e is a lightweight, adaptable collaborative industrial robot that tackles medium-duty applications with ultimate flexibility. The UR5e is designed for seamless integration into a wide range of applications." Its payload (5 kg) and relatively long reach (850 mm) give a wide enough operating range to carry the vision sensor and scan multiple sides of an aircraft part. The vision sensor is a perspective-projection type, whose trapezoidal field of view is well suited to camera-type sensing, so it can scan the entire part and generate images showing every detail. The last component is the mobile base, a KUKA omniRob. This large omnidirectional rover carries the robotic arm and its control box (the actual physical structure) to each destination with a high payload (170 kg) and achieves a positioning accuracy of up to +/- 5 mm, even in the tightest spaces.

Scanning Path

The scanning path is designed to cover each side of the given object, as the figure shows. A target dummy is created to follow the path, and the robotic arm's end effector tracks the dummy, so the arm follows the path as well. On each side the path runs from the bottom of the object to the top; to stay within the UR5e's operating range, the vertical travel is limited to 0.58827 units in the virtual environment. The up-down segment is a circular arc so that it fits a variety of scanned objects. After completing one bottom-to-top motion, the target moves 0.3 units to the left and executes the next motion from top to bottom. In this way the target sweeps around the whole object, generating the complete scanning path that leads the robotic arm through the scanning process.

Human-robot Collaboration

The human-robot collaboration is developed with App Designer in MATLAB, which provides the GUI; App Designer is programmed to add buttons and attach functionality to them. Three sections implement the collaboration. The first provides arrow keys with which the human manually operates the robotic arm; the keys are linked to CoppeliaSim API functions that change the arm's end-effector [x, y, z] position. The second does the same for the omnidirectional rover's position control, and also displays the vision sensor's images: a window transmits images of the object surface from CoppeliaSim to the App Designer GUI at 15 frames per second, which is sufficient for the operator to validate details. The last section handles message transmission: the vision sensor measures the distance to the object surface and identifies potential cracks or dents, and the computed values are sent back to the GUI so the human can validate them and eliminate errors or failed detections.
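The serpentine motion described under Scanning Path can be sketched in pure Python. In the project itself the target dummy is driven from MATLAB through the CoppeliaSim API; this standalone version is only illustrative, and it simplifies the circular up-down arc of the real design to a straight vertical sweep. The function name and the number of sample points per sweep are assumptions; the 0.3-unit column spacing and 0.58827-unit vertical limit are the values given above.

```python
from typing import List, Tuple

def serpentine_scan_path(
    num_columns: int,
    column_spacing: float = 0.3,      # leftward step between sweeps (scene units)
    vertical_range: float = 0.58827,  # bottom-to-top travel limit (scene units)
    points_per_sweep: int = 10,
) -> List[Tuple[float, float]]:
    """Generate (y, z) waypoints for a boustrophedon scan of one object side.

    Columns alternate bottom-to-top and top-to-bottom, matching the
    target-dummy motion described above.  The circular up-down arc of the
    actual design is simplified here to a straight vertical sweep.
    """
    path: List[Tuple[float, float]] = []
    for col in range(num_columns):
        y = -col * column_spacing  # move left after each completed sweep
        heights = [vertical_range * i / (points_per_sweep - 1)
                   for i in range(points_per_sweep)]
        if col % 2 == 1:           # odd columns run top-to-bottom
            heights.reverse()
        path.extend((y, z) for z in heights)
    return path
```

Because consecutive columns alternate direction, the end of one sweep and the start of the next share the same height, so the target never makes a long vertical jump between columns.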
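The arrow-key control in the GUI can likewise be sketched as a small dispatch table. In the real system each key callback invokes a CoppeliaSim API call to move the arm's end effector; here the position update is shown as pure Python. The key names, function names, and step size are illustrative assumptions, not taken from the project code.

```python
# Hypothetical sketch of the arrow-key teleoperation logic: each key press
# nudges the end-effector target position by a fixed step, analogous to the
# App Designer button callbacks described above.

STEP = 0.01  # metres per key press (assumed value)

KEY_DELTAS = {
    "up":    (0.0,  STEP, 0.0),
    "down":  (0.0, -STEP, 0.0),
    "left":  (-STEP, 0.0, 0.0),
    "right": ( STEP, 0.0, 0.0),
    "raise": (0.0, 0.0,  STEP),
    "lower": (0.0, 0.0, -STEP),
}

def apply_key(position, key):
    """Return the new (x, y, z) end-effector target after one key press."""
    dx, dy, dz = KEY_DELTAS[key]
    x, y, z = position
    return (x + dx, y + dy, z + dz)
```

Keeping the key-to-delta mapping in a table separates the interface (which keys exist) from the motion logic, so adding rover controls, as the second GUI section does, only requires a second table.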

Physical Platform Design

Results

Conclusion

References
