Projects:2015s1-26 Autonomous robotics using NI MyRIO

Introduction

This project proposes the construction of an autonomous robot designed around the National Instruments (NI) myRIO processing unit. The robot will be entered into the National Instruments Autonomous Robotics Competition (NIARC) and will be designed to autonomously navigate an environment with obstacles, collect cargo and deliver it to target terminals. Areas of specific focus include obstacle avoidance, navigation and localisation. Obstacle avoidance will be implemented through ultrasonic sensors, while navigation and localisation will use the A* (A star) algorithm and the Kinect sensor. The project will also exercise project management skills such as budget estimation and risk analysis. Overall, the expected outcome is a demonstration that the proposed methods for obstacle avoidance, localisation and navigation work on the competition track. The final robot is due for completion by 16 September 2015.

Supervisors

Dr Hong Gunn Chew

A/Prof Cheng-Chew Lim

Honours Students

Undergraduate Group:

Adam Mai

Adrian Mac

Song Chen

Postgraduate Group:

Bin Gao

Xin Liu

Yiyi Wang

Project Details

2.1. Aims and Objectives

The aim of this project is to develop an autonomous robot using the National Instruments myRIO together with LabVIEW, National Instruments' graphical programming environment. The robot requires intelligence in the fields of obstacle avoidance, navigation and pathfinding, localisation and object handling [14] in order to traverse the environment depicted in Figure 2.4.1. Implementation of these modules will be divided between two groups: the undergraduate group and the postgraduate group. The majority of the software responsibilities will be carried out by the undergraduates, while the postgraduates will deal with the mechanical aspects of the robot. In regards to the implementation, this project aims to address three hypotheses:


1. Ultrasonic sensors, either alone or combined with the Kinect sensor, can accomplish obstacle avoidance for both dynamic and static obstacles.

2. The A* algorithm can be implemented in such a way that the robot can move freely in any direction.

3. Odometry used in conjunction with filters and sensors will be able to localise the robot accurately in a defined environment.


With a total budget of $750, this project further aims to minimise costs and produce a robot that is cost effective and reliable. The final product should be able to avoid both static and dynamic obstacles, determine its own route and transport goods to target locations. As this transportation process is timed during the competition, maximising time efficiency will be as important as ensuring the safety of the robot's environment.

2.2. Significance of Competition

This project exhibits the capabilities of the National Instruments myRIO as well as the CompactRIO (refer to Appendix F for images). The CompactRIO is a real-time embedded industrial controller that National Instruments offers to industry [15]. The myRIO serves a similar purpose and acts as a lighter version of this controller for students to use in smaller applications [16]. The competition therefore markets these capabilities for National Instruments while allowing students to work with robots. The project illustrates the potential of autonomous robots as a means of cargo transportation in a shipyard and, like previous National Instruments competitions, ultimately displays the benefits of autonomous robots for industrial applications. The world has already seen the power of autonomous robotics within domestic households, where robotic vacuums such as the Roomba have become increasingly popular [2]. Although more primitive than the competition robots, they embody the same fundamental idea of demonstrating the power of autonomous robotics in industry. Furthermore, displaying the skills of future engineers emphasises what the world can expect from the field of robotics in the years to come.

2.3. Motivation for Competition

There are three fundamental motivations behind this project. Firstly, on a national scale, the competition allows aspiring students from universities around Australia and New Zealand to apply themselves in the field of robotics. Consequently, it encourages growth and innovation within this field [1] and demonstrates the capabilities of this generation's engineers. Secondly, as the competition entry is funded by the university, successful completion of the project allows universities to showcase not only the capabilities of their students but also the quality of their education. Lastly, on a personal level, the undergraduate students benefit from the knowledge and experience gained by completing the competition. It familiarises them with the integration of different systems working in conjunction with each other, as well as with project management processes. Ultimately, it gives students an understanding of the requirements of a professional engineer.

2.4. Introductory Background

Figure 2.4.1: Map of the National Instruments competition track (sourced from [14])

National Instruments (NI) is a company responsible for producing automated test equipment and virtual instrument software. Every year, it sponsors the National Instruments Autonomous Robotics Competition (NIARC) to allow tertiary students to challenge their capabilities in the robotics field. The theme of the competition changes annually; for 2015, the theme is "Transport and Roll-Out!" [1]. This theme simulates the use of an autonomous robot to collect and transport cargo in a shipyard. As such, this project will require the development of an autonomous robot built around National Instruments' myRIO device, pictured in Appendix F. The following describes the procedure the robot will need to follow in the competition. It has been sourced from the NIARC Task and Rules [14], and Figure 2.4.1 illustrates the overall course to be traversed. Firstly, the robot will be required to navigate itself from the starting location to the cargo collection area. Between these zones will be a maximum of three static obstacles (shipping containers), which the robot will need to avoid. Once it reaches the collection area, a team member will simulate a crane and load the cargo onto the robot. Afterwards, the robot will unload the cargo at any of the cargo terminals while avoiding any obstacles on its way. Here, the robot will need to deal with walls, speed bumps and a dynamically moving obstacle (another vehicle) in the hazardous area. Once it has no more cargo to unload, it will navigate to the finish area. The robot has a three minute time limit to complete all of these tasks.

2.5. Technical Background of Robot System

The autonomous robot system will utilise a range of techniques and components in order to implement its various features. The following is a list of the components, techniques and algorithms that will be used within this project. It also includes any relevant theories that apply to the robot system’s implementation.

Image/Ultrasonic/Infrared Sensors: The Xbox Kinect sensor is an image sensor intended for use with the Xbox gaming console. It consists of an RGB (red-green-blue) camera and a pair of infrared optical components (a projector and a camera), which it uses to track human movement and implement depth imaging by triangulation [4]. Ultrasonic sensors are range detection sensors that work by transmitting high frequency sound pulses; distance is calculated from the time it takes for reflections to return to the sensor. This calculation uses the speed of sound, approximately 340 m/s at sea level, although this value varies with conditions such as temperature and humidity. Infrared sensors are another form of range detection sensor; they measure infrared light radiating from objects in their field of view, and can also be used to detect colour and calculate the distance of objects. Because they measure the intensity of the light they sense, other light sources such as sunlight can affect their readings.
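
As a concrete illustration of the ultrasonic time-of-flight calculation above, the following is a minimal Python sketch (the team's actual implementation will be in LabVIEW). It assumes an HC-SR04-style sensor [17], whose echo pin stays high for the pulse's round-trip time; names and values are illustrative.

```python
# Minimal sketch: converting an ultrasonic echo pulse width to distance,
# as for an HC-SR04-style sensor [17] whose echo pin stays high for the
# pulse's round trip. The speed-of-sound constant is the nominal sea-level
# value quoted above; a real system might correct it for temperature.

SPEED_OF_SOUND = 340.0  # m/s, approximate value at sea level

def echo_to_distance(echo_high_time_s: float) -> float:
    """Convert the measured echo pulse duration (seconds) to range (metres)."""
    # The sound travels to the obstacle and back, so halve the round trip.
    return echo_high_time_s * SPEED_OF_SOUND / 2.0

print(echo_to_distance(0.005))  # a 5 ms round trip is about 0.85 m
```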

Data Fusion: Data fusion is a theory and set of techniques applied to a system in order to synthesise all of its raw sensor data into a common form [12]. It makes use of conversion and transformation formulas in order to do this. Its implementation becomes particularly important when a system uses multiple sensors and must translate all of their information into a decision. Data fusion ultimately gives the output values more significance, as it improves the meaning of the data [12].
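
As a minimal sketch of one simple fusion rule, the following combines two range readings of the same obstacle (say, one ultrasonic and one from the Kinect depth image) by inverse-variance weighting. This is an illustrative technique, not the specific method of [12], and the variance values are assumptions.

```python
# Minimal sketch of one simple fusion rule: an inverse-variance weighted
# average of two range estimates of the same obstacle, e.g. one from an
# ultrasonic sensor and one from the Kinect depth image. The variances are
# illustrative assumptions, not measured sensor characteristics.

def fuse_ranges(r1, var1, r2, var2):
    """Fuse two range estimates; returns (fused_range, fused_variance)."""
    w1, w2 = 1.0 / var1, 1.0 / var2      # weight each reading by its certainty
    fused = (w1 * r1 + w2 * r2) / (w1 + w2)
    return fused, 1.0 / (w1 + w2)        # the fused estimate is more certain

# Ultrasonic reads 1.02 m (noisier); Kinect depth reads 0.95 m (more precise).
print(fuse_ranges(1.02, 0.01, 0.95, 0.0025))  # fused value sits nearer 0.95
```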

Image Registration: Image registration is the process of aligning multiple images such that corresponding data between the images are mapped to the same coordinates [13]. This technique enables image stitching of multiple photographs, 3D imaging and depth imaging. It is most notably used in computer vision systems and medical imaging.
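
The heart of registration is estimating the transformation that maps one image's coordinates onto another's. The sketch below estimates a 2D rigid transform (rotation plus translation) by least squares from known point correspondences; it is illustrative only and assumes the correspondences have already been found.

```python
# Minimal sketch: least-squares estimate of a 2D rigid transform (rotation
# theta plus translation (tx, ty)) that maps points `src` onto corresponding
# points `dst`. Assumes correspondences are already known; illustrative only.
import math

def estimate_rigid_2d(src, dst):
    """Return (theta, tx, ty) such that dst ~= R(theta) * src + t."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    # Cross-covariance of the centred point sets gives the optimal rotation.
    sxx = sum((s[0] - csx) * (d[0] - cdx) + (s[1] - csy) * (d[1] - cdy)
              for s, d in zip(src, dst))
    sxy = sum((s[0] - csx) * (d[1] - cdy) - (s[1] - csy) * (d[0] - cdx)
              for s, d in zip(src, dst))
    theta = math.atan2(sxy, sxx)
    # The translation maps the rotated source centroid onto the target centroid.
    tx = cdx - (csx * math.cos(theta) - csy * math.sin(theta))
    ty = cdy - (csx * math.sin(theta) + csy * math.cos(theta))
    return theta, tx, ty

src = [(0, 0), (1, 0), (0, 1)]
dst = [(2, 1), (2, 2), (1, 1)]  # src rotated 90 degrees then shifted by (2, 1)
theta, tx, ty = estimate_rigid_2d(src, dst)
print(round(math.degrees(theta)), round(tx, 2), round(ty, 2))  # 90 2.0 1.0
```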

Path Planning and Path Following Algorithms: Path planning algorithms determine and plan paths that the robot can traverse to reach a target destination. These include the A* algorithm, used to solve for the shortest path [10], as well as the tangent bug algorithm, a local method which requires the direction of the target to operate [11]. These are used in conjunction with path following algorithms to move the robot to a target point.
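
A minimal sketch of A* on a 4-connected occupancy grid is shown below, using a Manhattan-distance heuristic. The grid, unit move costs and connectivity are illustrative assumptions rather than the competition map.

```python
# Minimal sketch of A* pathfinding on a 4-connected occupancy grid.
# 0 = free cell, 1 = obstacle; each move costs 1, so the Manhattan-distance
# heuristic is admissible for this connectivity. Illustrative only.
import heapq
import itertools

def a_star(grid, start, goal):
    """Return the shortest path from start to goal as a list of cells, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # heuristic
    tie = itertools.count()                  # tie-breaker for heap comparisons
    frontier = [(h(start), next(tie), 0, start, None)]
    best_g = {start: 0}
    parent = {}
    while frontier:
        _, _, g, cell, prev = heapq.heappop(frontier)
        if cell in parent:
            continue                         # already expanded via a cheaper route
        parent[cell] = prev
        if cell == goal:                     # walk parents back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(frontier, (ng + h(nxt), next(tie), ng, nxt, cell))
    return None  # no path exists

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(a_star(grid, (0, 0), (2, 0)))  # routes around the wall of 1s
```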

Pose: The pose of the robot refers to its position and orientation within an area [21]. It expresses this information as coordinates (x, y) together with an angle (°) measured anticlockwise relative to the origin of the map (0, 0).

Odometry: Odometry refers to the use of encoders to measure a robot's wheel rotations so that information about its pose can be derived [23].
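
A minimal sketch of a differential-drive odometry update is given below: encoder tick increments from the two wheels are converted into a new pose estimate. The wheel radius, track width and encoder resolution are assumed values for illustration, not the robot's actual parameters.

```python
# Minimal sketch of a differential-drive odometry update: wheel encoder tick
# increments are converted into a new pose (x, y, theta). Wheel radius, track
# width and encoder resolution are assumed values, not the robot's parameters.
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float      # metres from the map origin (0, 0)
    y: float
    theta: float  # radians, anticlockwise from the map x-axis

WHEEL_RADIUS = 0.03   # m (assumed)
TRACK_WIDTH = 0.20    # m between the two drive wheels (assumed)
TICKS_PER_REV = 360   # encoder ticks per wheel revolution (assumed)

def update_pose(pose, left_ticks, right_ticks):
    """Advance the pose given encoder ticks counted since the last update."""
    m_per_tick = 2 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = left_ticks * m_per_tick
    d_right = right_ticks * m_per_tick
    d_centre = (d_left + d_right) / 2            # distance moved by the centre
    d_theta = (d_right - d_left) / TRACK_WIDTH   # change in heading
    # Midpoint approximation: assume the heading changes smoothly over the step.
    mid = pose.theta + d_theta / 2
    return Pose(pose.x + d_centre * math.cos(mid),
                pose.y + d_centre * math.sin(mid),
                pose.theta + d_theta)

pose = Pose(0.0, 0.0, 0.0)
pose = update_pose(pose, 100, 120)  # a faster right wheel curves the robot left
print(pose)
```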

Simultaneous Localisation and Mapping (SLAM) Algorithms: SLAM algorithms are used to localise a robot and map its surrounding area at the same time. They are generally used with visual systems to create 3D environment models [22].

Feature Extraction: Feature extraction refers to the classification of objects in an image by extracting specific features of the image and comparing them with a pattern classifier [21][25]. It allows a robot to distinguish, for example, between a wall and an obstacle.
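
As a toy illustration of the idea, the following sketch reduces a detected region to a two-element feature vector and labels it with a nearest-prototype rule. The features and prototype values are invented for illustration and are not the classifier described in [21][25].

```python
# Toy sketch of feature extraction plus a nearest-prototype classifier:
# a detected region is reduced to (aspect ratio, fill ratio) and matched to
# the closest labelled prototype. Features and prototypes are invented for
# illustration, not the classifier of [21][25].
import math

def features(width, height, area):
    """Feature vector for a region: bounding-box aspect ratio and fill ratio."""
    return (width / height, area / (width * height))

# Assumed prototypes: walls look long, thin and nearly solid;
# container-like obstacles look squarer and less filled.
PROTOTYPES = {"wall": (8.0, 0.9), "obstacle": (1.2, 0.7)}

def classify(feat):
    """Label the feature vector by minimum Euclidean distance to a prototype."""
    return min(PROTOTYPES, key=lambda label: math.dist(feat, PROTOTYPES[label]))

print(classify(features(200, 30, 5500)))  # long, thin region -> "wall"
```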

2.6. Project and Competition Constraints

Several project constraints have been set by National Instruments and the university that will affect the design and production of the autonomous robot. Firstly, the system must be designed with the myRIO as the main or only processing unit. Secondly, the majority of the programming must be accomplished in LabVIEW. A further restriction on the undergraduate group is that only 40% of the myRIO's FPGA resources may be used. There are also budget restrictions, with a total of $750 provided to purchase all the electrical components for the system; as the team aims to minimise project spending, component options will be more limited still. The size of the robot is constrained by the height of the cargo terminal housings as well as by the width of the taped areas [14]. Lastly, time is a major constraint, as missing milestones relating to the National Instruments competition will result in a forfeit. This constraint must therefore be managed properly to ensure task dates are met.

Competition Planning

3.1. Work Breakdown

This project requires the completion of two sets of tasks: university milestones and National Instruments milestones. The majority of university milestones follow a waterfall-type structure where each deliverable builds upon the next. In contrast, the National Instruments milestones have more concurrency, as there is little dependence between deliverables. As such, the approaches taken for each set of deliverables will differ. Tasks have been assigned to each undergraduate member as follows:

Song Chen:

• Implement navigation and localisation
• Assist in implementation of obstacle avoidance
• Research and implement path smoothing for the robot
• Filter implementation for localisation

Adrian Mac:

• Implement obstacle avoidance for the robot system
• Assist in navigation and localisation
• Corner detection using Kinect

Adam Mai:

• Assist in implementation of obstacle avoidance
• Implementation of navigation using A* algorithm
• Implementation of localisation system using pose estimation and feature extraction
• Colour detection using Kinect

3.2. Timeline

The National Instruments milestones define the overall success of the project, so this paper concentrates on them more than on the university deliverables. There are a total of eight National Instruments milestones, the most important being the competition finals. The milestones are spaced one month apart, with the first on 16 March. Due to the independence of each deliverable, they will be undertaken concurrently from 19 March to ensure there is enough time for research and implementation. In comparison, the university deliverables will be completed in a waterfall sequence beginning on 26 March and continuing until 30 October. The most important deliverable is the honours thesis, on which work is scheduled to commence on 27 July.

3.3. Task Allocation of Robot Modules

Each member of the undergraduate and postgraduate groups has been allocated a specific system module. Obstacle avoidance has been assigned to Adrian Mac, while Adam Mai and Song Chen will handle navigation and localisation. As the navigation and localisation module is the foundation of the system, two members have been assigned to ensure its successful completion. The postgraduate group will be responsible for motor movement, obstacle handling and construction of the robot. They will also be responsible for purchasing materials for, and constructing, a test course for the robot.

3.4. Management Strategy

Various services and strategies will be used to manage this project. The main forms of communication with stakeholders will be email and weekly meetings. Between the postgraduate and undergraduate groups, communication will be maintained through a messaging application, email and monthly meetings. Documentation will be shared through Google Drive and file exchange services. There are currently two Google Drive folders being shared for this project: one between the undergraduate members only, and another between the undergraduate and postgraduate groups. These files can be accessed via hyperlinks to the folders. Additionally, at the request of National Instruments, video recordings of each milestone will be stored on YouTube. All links for project data storage are listed in Appendix C. Google Drive was selected in preference to other services as it is free to use and allows members to simultaneously access and edit files. Finally, the budget will be managed through a budget estimate and by actively tracking all project spending.

3.5. Budget Estimate

The total budget available to the undergraduate group is $750, with $250 provided for each group member. The current budget estimate is $320, which includes a twenty percent overhead. This covers the costs of all sensors and electronic components required for the system. The list was composed with cost minimisation as its main aim and does not include mechanical components such as motors, wheels, batteries or the frame of the robot; these materials will be covered by the postgraduate group.

3.6. Risk Analysis

Three risks with severe impacts on the project have been identified: the undergraduate and/or postgraduate group missing milestones, damage to key components, and delays in hardware orders.

The most important risk is missing milestones: if a single National Instruments milestone is missed, the team will be forced to forfeit the competition. Additionally, any intermediate milestones between groups that are missed will cause delays in the project, which in turn affects whether the National Instruments milestones can be met on time. Hence, monthly meetings will be held to review the progress of both teams and to evaluate problems which may hinder future work. If work falls behind schedule, it will be redistributed amongst members to ensure its completion. Furthermore, weekly meetings will be held with supervisors so that members can seek advice and report progress. The Gantt chart will also be followed closely and adapted as the situation changes.

The risk of damage cannot be completely eliminated due to the unpredictable nature of delivery services and transportation. This risk will therefore be mitigated by a set of rules for members to follow when transporting and storing components. Additionally, to limit the movement of components and the myRIO, a locker will be hired to store them.

Hardware delays are a very common problem during a project, and they can severely impact the project's status when the delayed item is essential to completing a milestone. To minimise this risk, hardware components should be ordered at least two weeks before they are needed, or even earlier depending on the expected delivery time. Where delays do occur, members should seek components that can be borrowed from the university or other groups.

Deliverables and Milestones

This project has two sets of milestones: one for the university and one for National Instruments. As the project requires collaboration between the undergraduate and postgraduate groups, there will be two sets of milestone tables for the university.

Figure: University and National Instruments project milestones (Milestones.png)

References

[1] National Instruments 2015, NI Autonomous Robotics Competition 2015, National Instruments, viewed 15 April 2015, <http://australia.ni.com/ni-arc>.

[2] Tribelhorn, B & Dodds, Z 2007, "Evaluating the Roomba: A low-cost, ubiquitous platform for robotics research and education", paper presented to IEEE International Conference on Robotics and Automation, Roma, 10-14 April, viewed 14 April 2015.

[3] N. Ali, "Human Tracking and Robot Navigation using Multiple Sensors", University of Adelaide, Adelaide, SA, 2013.

[4] Z. Islam, "Robot Navigation using Multiple Depth Sensors", University of Adelaide, Adelaide, SA, 2013.

[5] Y. Pang and Y. Wang, "Autonomous Robot Navigation using a Movable Kinect 3D Sensor", University of Adelaide, Adelaide, SA, 2014.

[6] G. Csaba and Z. Vámossy, "Fuzzy Based Obstacle Avoidance for Mobil Robots with Kinect Sensor", in 4th IEEE International Symposium on Logistics and Industrial Informatics, Smolenice, Slovakia, 2012.

[7] P. Benavidez, "Mobile Robot Navigation and Target Tracking System", in 6th International Conference on System of Systems Engineering, Albuquerque, New Mexico, USA, 2011, pp. 299-304.

[8] D. Correa et al., "Mobile Robots Navigation in Indoor Environments Using Kinect Sensor", in Second Brazilian Conference on Critical Embedded Systems, IEEE, 2012, pp. 36-41.

[9] J. Borenstein and Y. Koren, "Real-Time Obstacle Avoidance for Fast Mobile Robots", IEEE Trans. Syst. Man Cybern., vol. 19, no. 5, 1989, pp. 1179-1187.

[10] X. Cui and H. Shi, "A*-based Pathfinding in Modern Computer Games", Victoria University, Melbourne, Victoria, 2011.

[11] J. Ng, "An Analysis of Mobile Robot Navigation Algorithms in Unknown Environments", University of Western Australia, Perth, WA, 2010.

[12] H.B. Mitchell and M. Batya, "Common Representational Format," in Data Fusion: Concepts and Ideas, 2nd ed., Springer, Berlin, Germany, 2012, pp. 51-52.

[13] A. Goshtasby, "Introduction", in Image Registration: Principles, Tools and Methods, 1st ed., Springer, London, England, 2012, ch. 1, pp. 1-3.

[14] National Instruments 2015, National Instruments Autonomous Robotics Competition: 2015 Competition Task and Rules, National Instruments, viewed 16 May 2015, <http://australia.ni.com/sites/default/files/NI%20ARC%202015%20Task%20and%20Rules%20Documentation_0.pdf>.

[15] National Instruments 2015, NI CompactRIO, National Instruments, viewed 15 April 2015, <http://www.ni.com/compactrio/>.

[16] National Instruments 2015, NI myRIO, National Instruments, viewed 16 April 2015, <http://www.ni.com/myrio/>.

[17] Cytron Technologies 2013, HC-SR04 User's Manual, Cytron Technologies, viewed 3 May 2015, <https://docs.google.com/document/d/1Y-yZnNhMYy7rwhAgyL_pfa39RsB-x2qR4vP8saG73rE/edit>.

[18] Robot Electronics 2015, SRF05 – Ultra-Sonic Ranger, Robot Electronics, viewed 3 May 2015, <http://www.robot-electronics.co.uk/htm/srf05tech.htm>.

[19] MaxBotix 2014, LV-MaxSonar-EZ Series High Performance Sonar Range Finder, MaxBotix, viewed 3 May 2015, <http://maxbotix.com/documents/LV-MaxSonar-EZ_Datasheet.pdf>.

[20] N. Ganganath and H. Leung, "Mobile Robot Localisation using Odometry and Kinect Sensor," in IEEE International Conference on Emerging Signal Processing Applications, Las Vegas, Nevada, 2012, pp. 91-94.

[21] X. Smith, "Robot Localisation, Navigation and Control using National Instruments myRIO," Flinders University, Adelaide, SA, 2014.

[22] Y. Zou et al., "Indoor Localization and 3D Scene Reconstruction for Mobile Robots Using the Microsoft Kinect Sensor," in IEEE International Conference on Industrial Informatics, Beijing, China, 2012, pp. 1182-1187.

[23] P. Jensfelt, "Approaches to Mobile Robot Localization in Indoor Environments," Ph.D. dissertation, Dept. Signals, Sensors and Systems, Royal Institute of Technology, Stockholm, Sweden, 2001.

[24] M. Likhachev et al., "Anytime Dynamic A*: An Anytime, Replanning Algorithm," Carnegie Mellon University, Pittsburgh, USA, 2005.

[25] F. Shih, "Feature Extraction," in Image Processing and Pattern Recognition, 1st ed., John Wiley and Sons, New Jersey, USA, 2010, ch. 8, pp. 269-305.