<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://projectswiki.eleceng.adelaide.edu.au/projects/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=A1723574</id>
	<title>Projects - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://projectswiki.eleceng.adelaide.edu.au/projects/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=A1723574"/>
	<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php/Special:Contributions/A1723574"/>
	<updated>2026-04-21T12:56:21Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.31.4</generator>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11330</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11330"/>
		<updated>2018-10-17T09:03:25Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Introduction ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of its reconfigurable processor-and-FPGA products. In 2018, the competition focused&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;: robots had to perform various tasks autonomously on a track incorporating hazardous terrain and unforeseen obstacles.&lt;br /&gt;
This project investigated the use of the NI myRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and decision making. The live final took place in September, when university teams from across Australia, New Zealand, and Asia competed for the grand prize.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|700px|centre|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|700px|centre|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
[[File:myrio.jpg|300px|left|NI MyRIO - 1900]] &amp;lt;br /&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Environment Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; the track, obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the locations of surrounding elements. &lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; lidar (VL53L0X time-of-flight sensors)&lt;br /&gt;
* Motor rotation sensors: motor encoder package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A Logitech C922 webcam was used for image and video acquisition.&lt;br /&gt;
The webcam is capable of full-HD recording with a 78-degree field of view. It was aimed at the floor in front of the robot chassis, with the top of the image ending near the horizon. &lt;br /&gt;
[[File:Webcamc922.png|300px|left|Logitech C922 webcam]]&lt;br /&gt;
 &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The team decided to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries were marked on the floor with 75mm wide yellow tape, and the RGB images were processed to extract them.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was prototyped in MATLAB using its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation allows us to easily isolate particular colours (hue range), select colour intensity (saturation range), and select brightness (value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
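The masking, erosion, and dilation steps above can be sketched in plain Python. This is an illustrative stand-in for the MATLAB/LabVIEW implementation, not the project code; the hue window and the 3x3 kernel are placeholder values rather than the tuned parameters used on the robot.

```python
import colorsys

def hsv_mask(pixels, hue_lo, hue_hi, s_min, v_min):
    """Steps 2-3: convert RGB to HSV and mask pixels inside the target ranges.
    pixels is a list of rows of (r, g, b) tuples with channels in 0-255."""
    mask = []
    for row in pixels:
        out_row = []
        for r, g, b in row:
            h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
            hue_ok = sorted([hue_lo, h, hue_hi])[1] == h  # h lies between lo and hi
            sat_ok = max(s, s_min) == s                   # s is at least s_min
            val_ok = max(v, v_min) == v                   # v is at least v_min
            out_row.append(1 if hue_ok and sat_ok and val_ok else 0)
        mask.append(out_row)
    return mask

def _pad(mask):
    """Surround the mask with a zero border so 3x3 windows need no bounds checks."""
    w = len(mask[0])
    zero_row = [0] * (w + 2)
    return [zero_row] + [[0] + list(row) + [0] for row in mask] + [zero_row]

def erode(mask):
    """Step 4: a pixel survives only if its whole 3x3 neighbourhood is set,
    so isolated noise pixels are reduced to nothing."""
    p = _pad(mask)
    return [[1 if all(p[y + dy][x + dx] for dy in (0, 1, 2) for dx in (0, 1, 2)) else 0
             for x in range(len(mask[0]))] for y in range(len(mask))]

def dilate(mask):
    """Step 5: a pixel is set if any pixel in its 3x3 neighbourhood is set,
    growing the surviving regions back toward their original size."""
    p = _pad(mask)
    return [[1 if any(p[y + dy][x + dx] for dy in (0, 1, 2) for dx in (0, 1, 2)) else 0
             for x in range(len(mask[0]))] for y in range(len(mask))]
```

For pure yellow, colorsys places the hue at one sixth, so a window of roughly 0.12 to 0.21 selects the tape while the erode/dilate pair removes single-pixel noise.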
&lt;br /&gt;
The produced line equations were converted to boundary locations referenced to the robot&amp;#039;s frame, so that path planning could treat the marked boundaries just as it would physical obstacles.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11329</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11329"/>
		<updated>2018-10-17T08:53:54Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Introduction ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of its reconfigurable processor-and-FPGA products. In 2018, the competition focused&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;: robots had to perform various tasks autonomously on a track incorporating hazardous terrain and unforeseen obstacles.&lt;br /&gt;
This project investigated the use of the NI myRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and decision making. The live final took place in September, when university teams from across Australia, New Zealand, and Asia competed for the grand prize.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|700px|centre|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|700px|centre|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
[[File:myrio.jpg|300px|left|NI MyRIO - 1900]] &amp;lt;br /&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Environment Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; the track, obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the locations of surrounding elements. &lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; lidar (VL53L0X time-of-flight sensors)&lt;br /&gt;
* Motor rotation sensors: motor encoder package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A Logitech C922 webcam was used for image and video acquisition. &lt;br /&gt;
[[File:Webcamc922.png|300px|left|Logitech C922 webcam]]&lt;br /&gt;
 &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The team decided to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm and 75mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing is taxing on the processor; fortunately, the myRIO contains an FPGA that could be used to offload it.&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries were marked on the floor with 75mm wide yellow tape, and the RGB images were processed to extract them.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was prototyped in MATLAB using its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation allows us to easily isolate particular colours (hue range), select colour intensity (saturation range), and select brightness (value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
The produced line equations were converted to boundary locations referenced to the robot&amp;#039;s frame, so that path planning could treat the marked boundaries just as it would physical obstacles.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11328</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11328"/>
		<updated>2018-10-17T08:52:01Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Introduction ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of its reconfigurable processor-and-FPGA products. In 2018, the competition focused&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;: robots had to perform various tasks autonomously on a track incorporating hazardous terrain and unforeseen obstacles.&lt;br /&gt;
This project investigated the use of the NI myRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and decision making. The live final took place in September, when university teams from across Australia, New Zealand, and Asia competed for the grand prize.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|700px|centre|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|700px|centre|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
[[File:myrio.jpg|300px|left|NI MyRIO - 1900]] &amp;lt;br /&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Environment Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; the track, obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the locations of surrounding elements. &lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; lidar (VL53L0X time-of-flight sensors)&lt;br /&gt;
* Motor rotation sensors: motor encoder package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A Logitech C922 webcam was used for image and video acquisition. &lt;br /&gt;
[[File:Webcamc922.png|300px|left|Logitech C922 webcam]]&lt;br /&gt;
 &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The team decided to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm and 75mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing is taxing on the processor; fortunately, the myRIO contains an FPGA that could be used to offload it.&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries were marked on the floor with 75mm wide yellow tape, and the RGB images were processed to extract them.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was prototyped in MATLAB using its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation allows us to easily isolate particular colours (hue range), select colour intensity (saturation range), and select brightness (value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
The produced line equations can be converted to boundary locations referenced to the robot&amp;#039;s frame, so that path planning can treat the marked boundaries just as it would physical obstacles.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11327</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11327"/>
		<updated>2018-10-17T08:51:10Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Introduction ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of its reconfigurable processor-and-FPGA products. In 2018, the competition focused&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;: robots had to perform various tasks autonomously on a track incorporating hazardous terrain and unforeseen obstacles.&lt;br /&gt;
This project investigated the use of the NI myRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and decision making. The live final took place in September, when university teams from across Australia, New Zealand, and Asia competed for the grand prize.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|700px|centre|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|700px|centre|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
[[File:myrio.jpg|300px|left|NI MyRIO - 1900]] &amp;lt;br /&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Environment Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; the track, obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the locations of surrounding elements. &lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; lidar (VL53L0X time-of-flight sensors)&lt;br /&gt;
* Motor rotation sensors: motor encoder package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A Logitech C922 webcam was used for image and video acquisition. &lt;br /&gt;
[[File:Webcamc922.png|300px|left|Logitech C922 webcam]]&lt;br /&gt;
 &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The team decided to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm and 75mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing is taxing on the processor; fortunately, the myRIO contains an FPGA that could be used to offload it.&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries were marked on the floor with 75mm wide yellow tape, and the RGB images were processed to extract them.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was prototyped in MATLAB using its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation allows us to easily isolate particular colours (hue range), select colour intensity (saturation range), and select brightness (value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
The produced line equations can be converted to boundary locations referenced to the robot&amp;#039;s frame, so that path planning can treat the marked boundaries just as it would physical obstacles.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11326</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11326"/>
		<updated>2018-10-17T08:50:18Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Introduction ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of its reconfigurable processor-and-FPGA products. In 2018, the competition focused&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;: robots had to perform various tasks autonomously on a track incorporating hazardous terrain and unforeseen obstacles.&lt;br /&gt;
This project investigated the use of the NI myRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and decision making. The live final took place in September, when university teams from across Australia, New Zealand, and Asia competed for the grand prize.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|700px|centre|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|700px|centre|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
[[File:myrio.jpg|300px|left|NI MyRIO - 1900]] &amp;lt;br /&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Environment Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; the track, obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the locations of surrounding elements. &lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; lidar (VL53L0X time-of-flight sensors)&lt;br /&gt;
* Motor rotation sensors: motor encoder package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A Logitech C922 webcam was used for image and video acquisition. &lt;br /&gt;
[[File:Webcamc922.png|200px|left|Logitech C922 webcam]]&lt;br /&gt;
 &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The team decided to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm and 75mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing is taxing on the processor; fortunately, the myRIO contains an FPGA that could be used to offload it.&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries were marked on the floor with 75mm wide yellow tape, and the RGB images were processed to extract them.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was prototyped in MATLAB using its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation allows us to easily isolate particular colours (hue range), select colour intensity (saturation range), and select brightness (value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
The produced line equations can be converted to boundary locations referenced to the robot&amp;#039;s frame, so that path planning can treat the marked boundaries just as it would physical obstacles.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11325</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11325"/>
		<updated>2018-10-17T08:49:32Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Introduction ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of its reconfigurable processor-and-FPGA products. In 2018, the competition focused&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;: robots had to perform various tasks autonomously on a track incorporating hazardous terrain and unforeseen obstacles.&lt;br /&gt;
This project investigated the use of the NI myRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and decision making. The live final took place in September, when university teams from across Australia, New Zealand, and Asia competed for the grand prize.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|500px|left|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|500px|right|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
[[File:myrio.jpg|300px|left|NI MyRIO - 1900]] &amp;lt;br /&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Environment Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; the track, obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the locations of surrounding elements. &lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; lidar (VL53L0X time-of-flight sensors)&lt;br /&gt;
* Motor rotation sensors: motor encoder package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A Logitech C922 webcam was used for image and video acquisition. &lt;br /&gt;
[[File:Webcamc922.png|200px|left|Logitech C922 webcam]]&lt;br /&gt;
 &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The team decided to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm and 75mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing is taxing on the processor; fortunately, the myRIO contains an FPGA that could be used to offload it.&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries were marked on the floor with 75mm wide yellow tape, and the RGB images were processed to extract them.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was prototyped in MATLAB using its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
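The colour-isolation steps above (RGB to HSV conversion and masking) can be sketched in Python. This is an illustrative version only; the yellow thresholds shown are assumed values, not the calibrated ones used on the robot.&lt;br /&gt;

```python
import colorsys

# Minimal sketch of steps 2-3 of the pipeline: convert each RGB pixel to HSV
# and build a binary mask of pixels inside a target colour range.
# The thresholds below are illustrative guesses for yellow tape, not the
# calibrated values used in the project.
def colour_mask(image, hue_lo=0.10, hue_hi=0.20, sat_min=0.4, val_min=0.4):
    """image: list of rows of (r, g, b) tuples in 0-255. Returns a 0/1 mask."""
    mask = []
    for row in image:
        mask_row = []
        for r, g, b in row:
            h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
            # Keep a pixel only if its hue falls in the target band and the
            # colour is saturated and bright enough.
            keep = h >= hue_lo and hue_hi >= h and s >= sat_min and v >= val_min
            mask_row.append(1 if keep else 0)
        mask.append(mask_row)
    return mask

# One yellow-ish pixel and one grey pixel:
print(colour_mask([[(230, 200, 30), (120, 120, 120)]]))  # [[1, 0]]
```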
&lt;br /&gt;
The resulting line equations can be converted to boundary locations referenced to the robot. Path planning can then make decisions based on these locations, and the robot can avoid the boundaries just as it would avoid physical obstacles.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11324</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11324"/>
		<updated>2018-10-17T08:49:00Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Introduction ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their reconfigurable processor and FPGA products. In 2018, the competition focused&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;, where robots had to perform various tasks on a track that incorporated hazardous terrain and unforeseen obstacles to be avoided autonomously.&lt;br /&gt;
This project investigated the use of the NI MyRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and autonomous decision making. The live final took place in September, where university teams from across Australia, New Zealand, and Asia competed for the grand prize.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|500px|right|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|500px|right|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
[[File:myrio.jpg|300px|left|NI MyRIO - 1900]] &amp;lt;br /&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Environment Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; the track, obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the locations of surrounding elements.&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; Lidar (VL53L0X time-of-flight sensors)&lt;br /&gt;
* Motor rotation sensors: motor encoders package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A Logitech C922 was used for image or video acquisition. &lt;br /&gt;
[[File:Webcamc922.png|200px|left|Logitech C922 webcam]]&lt;br /&gt;
 &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The team decided to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm and 75mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing would be taxing on the processor; fortunately, the myRIO contains an FPGA that could be used to offload it.&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries were marked on the floor with 75mm wide yellow tape. The RGB images were processed to extract useful information.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was determined using Matlab and its image processing toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
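The erode/dilate steps above can be sketched in pure Python on a small binary mask. This is an illustrative version, not the LabVIEW implementation used on the robot.&lt;br /&gt;

```python
def erode(mask):
    """3x3 erosion: keep a pixel only if it and all 8 neighbours are set.
    Border pixels are cleared, which is acceptable for noise removal here."""
    rows, cols = len(mask), len(mask[0])
    out = [[0] * cols for _ in range(rows)]
    for y in range(1, rows - 1):
        for x in range(1, cols - 1):
            out[y][x] = min(mask[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
    return out

def dilate(mask):
    """3x3 dilation: set a pixel if it or any neighbour is set."""
    rows, cols = len(mask), len(mask[0])
    out = [[0] * cols for _ in range(rows)]
    for y in range(rows):
        for x in range(cols):
            window = [mask[ny][nx]
                      for ny in range(max(0, y - 1), min(rows, y + 2))
                      for nx in range(max(0, x - 1), min(cols, x + 2))]
            out[y][x] = max(window)
    return out

# A solid 3x3 blob survives erode+dilate; the single-pixel noise speck does not.
mask = [[0, 0, 0, 0, 1],
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 0, 0]]
print(dilate(erode(mask)))
```

Erosion removes any region smaller than the structuring element; the following dilation restores the surviving regions to roughly their original size.&lt;br /&gt;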
&lt;br /&gt;
The resulting line equations can be converted to boundary locations referenced to the robot. Path planning can then make decisions based on these locations, and the robot can avoid the boundaries just as it would avoid physical obstacles.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11323</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11323"/>
		<updated>2018-10-17T08:48:27Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Introduction ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their reconfigurable processor and FPGA products. In 2018, the competition focused&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;, where robots had to perform various tasks on a track that incorporated hazardous terrain and unforeseen obstacles to be avoided autonomously.&lt;br /&gt;
This project investigated the use of the NI MyRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and autonomous decision making. The live final took place in September, where university teams from across Australia, New Zealand, and Asia competed for the grand prize.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|500px|right|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|500px|right|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
[[File:myrio.jpg|300px|left|NI MyRIO - 1900]] &amp;lt;br /&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Environment Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; the track, obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the locations of surrounding elements.&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; Lidar (VL53L0X time-of-flight sensors)&lt;br /&gt;
* Motor rotation sensors: motor encoders package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A Logitech C922 was used for image or video acquisition. &lt;br /&gt;
[[File:Webcamc922.png|200px|left|Logitech C922 webcam]]&lt;br /&gt;
 &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The team decided to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm and 75mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing would be taxing on the processor; fortunately, the myRIO contains an FPGA that could be used to offload it.&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries were marked on the floor with 75mm wide yellow tape. The RGB images were processed to extract useful information.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was determined using Matlab and its image processing toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
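The final step, fitting line equations through the mask edges, can be sketched as a least-squares fit in Python. This is an assumed illustration rather than the method used on the robot, and it ignores the vertical-edge case a full implementation would handle.&lt;br /&gt;

```python
def fit_line(points):
    """Least-squares fit of y = m*x + c through edge-pixel coordinates.
    points: list of (x, y) pairs. Assumes the edge is not vertical in
    image coordinates (a vertical edge would need x = m*y + c instead)."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    # Standard closed-form least-squares slope and intercept.
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    c = (sy - m * sx) / n
    return m, c

# Edge pixels lying exactly on y = 2x + 1:
print(fit_line([(0, 1.0), (1, 3.0), (2, 5.0), (3, 7.0)]))  # (2.0, 1.0)
```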
&lt;br /&gt;
The resulting line equations can be converted to boundary locations referenced to the robot. Path planning can then make decisions based on these locations, and the robot can avoid the boundaries just as it would avoid physical obstacles.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11322</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11322"/>
		<updated>2018-10-17T08:47:59Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Introduction ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their reconfigurable processor and FPGA products. In 2018, the competition focused&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;, where robots had to perform various tasks on a track that incorporated hazardous terrain and unforeseen obstacles to be avoided autonomously.&lt;br /&gt;
This project investigated the use of the NI MyRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and autonomous decision making. The live final took place in September, where university teams from across Australia, New Zealand, and Asia competed for the grand prize.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|500px|right|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|500px|right|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&amp;lt;br clear=&amp;quot;all&amp;quot; /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
[[File:myrio.jpg|300px|left|NI MyRIO - 1900]] &amp;lt;br /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Environment Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; the track, obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the locations of surrounding elements.&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; Lidar (VL53L0X time-of-flight sensors)&lt;br /&gt;
* Motor rotation sensors: motor encoders package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A Logitech C922 was used for image or video acquisition. &lt;br /&gt;
[[File:Webcamc922.png|200px|left|Logitech C922 webcam]]&lt;br /&gt;
 &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The team decided to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm and 75mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing would be taxing on the processor; fortunately, the myRIO contains an FPGA that could be used to offload it.&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries were marked on the floor with 75mm wide yellow tape. The RGB images were processed to extract useful information.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was determined using Matlab and its image processing toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
The resulting line equations can be converted to boundary locations referenced to the robot. Path planning can then make decisions based on these locations, and the robot can avoid the boundaries just as it would avoid physical obstacles.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11321</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11321"/>
		<updated>2018-10-17T08:42:51Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Introduction ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their reconfigurable processor and FPGA products. In 2018, the competition focused&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;, where robots had to perform various tasks on a track that incorporated hazardous terrain and unforeseen obstacles to be avoided autonomously.&lt;br /&gt;
This project investigated the use of the NI MyRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and autonomous decision making. The live final took place in September, where university teams from across Australia, New Zealand, and Asia competed for the grand prize.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|500px|right|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|500px|right|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
[[File:myrio.jpg|300px|left|NI MyRIO - 1900]] &amp;lt;br /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Environment Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; the track, obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the locations of surrounding elements.&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; Lidar (VL53L0X time-of-flight sensors)&lt;br /&gt;
* Motor rotation sensors: motor encoders package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A Logitech C922 was used for image or video acquisition. &lt;br /&gt;
[[File:Webcamc922.png|200px|left|Logitech C922 webcam]]&lt;br /&gt;
 &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The team decided to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm and 75mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing would be taxing on the processor; fortunately, the myRIO contains an FPGA that could be used to offload it.&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries were marked on the floor with 75mm wide yellow tape. The RGB images were processed to extract useful information.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was determined using Matlab and its image processing toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
The resulting line equations can be converted to boundary locations referenced to the robot. Path planning can then make decisions based on these locations, and the robot can avoid the boundaries just as it would avoid physical obstacles.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Webcamc922.png&amp;diff=11320</id>
		<title>File:Webcamc922.png</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Webcamc922.png&amp;diff=11320"/>
		<updated>2018-10-17T08:39:05Z</updated>

		<summary type="html">&lt;p&gt;A1723574: NIARC 2018, Project #122&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;NIARC 2018, Project #122&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11319</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11319"/>
		<updated>2018-10-17T08:25:42Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their reconfigurable processor and FPGA products. In 2018, the competition focused&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;, where robots had to perform various tasks on a track that incorporated hazardous terrain and unforeseen obstacles to be avoided autonomously.&lt;br /&gt;
This project investigated the use of the NI MyRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and autonomous decision making. The live final took place in September, where university teams from across Australia, New Zealand, and Asia competed for the grand prize.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|500px|right|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|500px|right|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
[[File:myrio.jpg|300px|left|NI MyRIO - 1900]] &amp;lt;br /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Robot Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; track obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the positions of surrounding elements.&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; VL53L0X time-of-flight sensors&lt;br /&gt;
* Motor rotation sensors: motor encoders package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A camera is used as one of the sensors for determining accurate position and target location estimates. The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision because of its capabilities, availability, and cost.&lt;br /&gt;
This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect 01.jpg]] &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
[[File:Kinect.png]]&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The competition requires the robot to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing would be taxing on the processor; fortunately, the myRIO contains an FPGA that can be used instead.&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries are marked on the floor with 50mm wide coloured tape. The RGB images are processed to extract useful information.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was determined using Matlab and its image processing toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
The resulting line equations can be converted to boundary locations referenced to the robot. Path planning can then make decisions based on these locations, and the robot can avoid the boundaries just as it would avoid physical obstacles.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11318</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11318"/>
		<updated>2018-10-17T08:25:12Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their reconfigurable processor and FPGA products. In 2018, the competition focused&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;, where robots had to perform various tasks on a track that incorporated hazardous terrain and unforeseen obstacles to be avoided autonomously.&lt;br /&gt;
This project investigated the use of the NI MyRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and autonomous decision making. The live final took place in September, where university teams from across Australia, New Zealand, and Asia competed for the grand prize.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|500px|right|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|500px|right|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
[[File:myrio.jpg|300px|thumb|left|NI MyRIO - 1900]] &amp;lt;br /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Robot Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; track obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the positions of surrounding elements.&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; VL53L0X time-of-flight sensors&lt;br /&gt;
* Motor rotation sensors: motor encoders package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A camera is used as one of the sensors for determining accurate position and target location estimates. The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision because of its capabilities, availability, and cost.&lt;br /&gt;
This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect 01.jpg]] &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
[[File:Kinect.png]]&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The competition requires the robot to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing is taxing on the processor; fortunately, the myRIO contains an FPGA that can perform it instead.&lt;br /&gt;
&lt;br /&gt;
The RGB images are processed to extract the locations of the track boundaries, which are marked on the floor with 50mm-wide coloured tape.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was prototyped using Matlab and its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
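The masking and morphology steps of the pipeline (steps 2-6) can be sketched as follows. This is an illustrative Python/NumPy sketch, not the team's Matlab or LabVIEW code; the toy image, blue-tape hue thresholds, and 3x3 structuring element are all invented for the example.

```python
import numpy as np

def rgb_to_hsv(img):
    """Vectorised RGB->HSV for a float image in [0, 1], shape (H, W, 3)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    v = img.max(axis=-1)                      # Value = max channel
    c = v - img.min(axis=-1)                  # chroma
    s = np.where(v > 0, c / np.where(v > 0, v, 1.0), 0.0)  # Saturation
    safe_c = np.where(c > 0, c, 1.0)
    h = np.where(v == r, ((g - b) / safe_c) % 6,
        np.where(v == g, (b - r) / safe_c + 2,
                         (r - g) / safe_c + 4)) / 6.0      # Hue in [0, 1)
    return np.where(c > 0, h, 0.0), s, v

def erode(mask):
    """3x3 binary erosion: a pixel survives only if its whole 3x3
    neighbourhood is set, so isolated noise pixels vanish."""
    p = np.pad(mask, 1, constant_values=False)
    out = np.ones_like(mask)
    h, w = mask.shape
    for dy in range(3):
        for dx in range(3):
            out &= p[dy:dy + h, dx:dx + w]
    return out

def dilate(mask):
    """3x3 binary dilation: grows the surviving regions back to
    roughly their original size after erosion."""
    p = np.pad(mask, 1, constant_values=False)
    out = np.zeros_like(mask)
    h, w = mask.shape
    for dy in range(3):
        for dx in range(3):
            out |= p[dy:dy + h, dx:dx + w]
    return out

# Toy frame: white floor, a blue "tape" stripe on rows 3-5, and one
# isolated blue noise pixel at (0, 0).
img = np.ones((9, 9, 3))
img[3:6, :] = (0.1, 0.2, 0.9)   # tape stripe
img[0, 0] = (0.1, 0.2, 0.9)     # noise

hue, sat, _ = rgb_to_hsv(img)
mask = (hue > 0.5) & (hue < 0.8) & (sat > 0.5)   # step 4: blue mask
clean = dilate(erode(mask))                      # steps 5-6

print(clean[0, 0])        # noise pixel removed: False
print(clean[3:6].all())   # tape stripe retained: True
```

Edge extraction and line fitting (steps 7-8) would then run on `clean`, e.g. with a Hough transform, which is omitted here.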
&lt;br /&gt;
The resulting line equations can be converted to boundary locations referenced to the robot&amp;#039;s frame. Path planning can then treat these boundaries like obstacles and steer the robot away from them.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11317</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11317"/>
		<updated>2018-10-17T08:24:12Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their reconfigurable processor and FPGA products. In 2018, the competition focuses&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;, where robots must perform various tasks on a track that incorporates hazardous terrain and unforeseen obstacles that must be avoided autonomously.&lt;br /&gt;
This project investigates the use of the NI MyRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and decision making. The live final took place in September, where university teams from Australia, New Zealand, and Asia competed against each other for the grand prize.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|500px|right|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|500px|right|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;br /&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
[[File:myrio.jpg|300px|thumb|left|NI MyRIO - 1900]]&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM Cortex-A9 processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Robot Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; track obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the positions of surrounding elements.&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; VL53L0X time-of-flight sensors&lt;br /&gt;
* Motor rotation sensors: motor encoder package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A camera is used to determine accurate estimates of the robot&amp;#039;s position and of target locations. The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision because of its capabilities, availability, and cost.&lt;br /&gt;
This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect 01.jpg]] &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
[[File:Kinect.png]]&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The competition requires the robot to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing is taxing on the processor; fortunately, the myRIO contains an FPGA that can perform it instead.&lt;br /&gt;
&lt;br /&gt;
The RGB images are processed to extract the locations of the track boundaries, which are marked on the floor with 50mm-wide coloured tape.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was prototyped using Matlab and its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
The resulting line equations can be converted to boundary locations referenced to the robot&amp;#039;s frame. Path planning can then treat these boundaries like obstacles and steer the robot away from them.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11316</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11316"/>
		<updated>2018-10-17T08:22:56Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their reconfigurable processor and FPGA products. In 2018, the competition focuses&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;, where robots must perform various tasks on a track that incorporates hazardous terrain and unforeseen obstacles that must be avoided autonomously.&lt;br /&gt;
This project investigates the use of the NI MyRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and decision making. The live final took place in September, where university teams from Australia, New Zealand, and Asia competed against each other for the grand prize.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|500px|left|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|500px|left|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
[[File:myrio.jpg|300px|thumb|left|NI MyRIO - 1900]]&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM Cortex-A9 processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Robot Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; track obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the positions of surrounding elements.&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; VL53L0X time-of-flight sensors&lt;br /&gt;
* Motor rotation sensors: motor encoder package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A camera is used to determine accurate estimates of the robot&amp;#039;s position and of target locations. The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision because of its capabilities, availability, and cost.&lt;br /&gt;
This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect 01.jpg]] &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
[[File:Kinect.png]]&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The competition requires the robot to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing is taxing on the processor; fortunately, the myRIO contains an FPGA that can perform it instead.&lt;br /&gt;
&lt;br /&gt;
The RGB images are processed to extract the locations of the track boundaries, which are marked on the floor with 50mm-wide coloured tape.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was prototyped using Matlab and its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
The resulting line equations can be converted to boundary locations referenced to the robot&amp;#039;s frame. Path planning can then treat these boundaries like obstacles and steer the robot away from them.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11315</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11315"/>
		<updated>2018-10-17T08:14:38Z</updated>

		<summary type="html">&lt;p&gt;A1723574: /* Robot Sensors */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their reconfigurable processor and FPGA products. In 2018, the competition focuses&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;, where robots must perform various tasks on a track that incorporates hazardous terrain and unforeseen obstacles that must be avoided autonomously.&lt;br /&gt;
This project investigates the use of the NI MyRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and decision making. The live final took place in September, where university teams from Australia, New Zealand, and Asia competed against each other for the grand prize.&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
[[File:myrio.jpg|300px|thumb|left|NI MyRIO - 1900]]&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM Cortex-A9 processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Robot Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; track obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the positions of surrounding elements.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|500px|left|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|500px|left|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; VL53L0X time-of-flight sensors&lt;br /&gt;
* Motor rotation sensors: motor encoder package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A camera is used to determine accurate estimates of the robot&amp;#039;s position and of target locations. The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision because of its capabilities, availability, and cost.&lt;br /&gt;
This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect 01.jpg]] &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
[[File:Kinect.png]]&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The competition requires the robot to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing is taxing on the processor; fortunately, the myRIO contains an FPGA that can perform it instead.&lt;br /&gt;
&lt;br /&gt;
The RGB images are processed to extract the locations of the track boundaries, which are marked on the floor with 50mm-wide coloured tape.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was prototyped using Matlab and its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
The resulting line equations can be converted to boundary locations referenced to the robot&amp;#039;s frame. Path planning can then treat these boundaries like obstacles and steer the robot away from them.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11314</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11314"/>
		<updated>2018-10-17T08:13:25Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their reconfigurable processor and FPGA products. In 2018, the competition focuses&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;, where robots must perform various tasks on a track that incorporates hazardous terrain and unforeseen obstacles that must be avoided autonomously.&lt;br /&gt;
This project investigates the use of the NI MyRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and decision making. The live final took place in September, where university teams from Australia, New Zealand, and Asia competed against each other for the grand prize.&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
[[File:myrio.jpg|300px|thumb|left|NI MyRIO - 1900]]&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM Cortex-A9 processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Robot Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; track obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the positions of surrounding elements.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG|300px|left|Competition Track]]&lt;br /&gt;
&amp;lt;!--[[File:Map.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
[[File:MapDimensions.PNG|300px|thumb|left|Map Dimensions]]&lt;br /&gt;
&amp;lt;!--[[File:MapDimensions.PNG]] --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; VL53L0X time-of-flight sensors&lt;br /&gt;
* Motor rotation sensors: motor encoder package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A camera is used to determine accurate estimates of the robot&amp;#039;s position and of target locations. The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision because of its capabilities, availability, and cost.&lt;br /&gt;
This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect 01.jpg]] &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
[[File:Kinect.png]]&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The competition requires the robot to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing is taxing on the processor; fortunately, the myRIO contains an FPGA that can perform it instead.&lt;br /&gt;
&lt;br /&gt;
The RGB images are processed to extract the locations of the track boundaries, which are marked on the floor with 50mm-wide coloured tape.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was prototyped using Matlab and its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
The resulting line equations can be converted to boundary locations referenced to the robot&amp;#039;s frame. Path planning can then treat these boundaries like obstacles and steer the robot away from them.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11313</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11313"/>
		<updated>2018-10-17T08:11:42Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their reconfigurable processor and FPGA products. In 2018, the competition focuses&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;, where robots must perform various tasks on a track that incorporates hazardous terrain and unforeseen obstacles that must be avoided autonomously.&lt;br /&gt;
This project investigates the use of the NI MyRIO-1900 platform to achieve autonomous localisation, path planning, environmental awareness, and decision making. The live final took place in September, where university teams from Australia, New Zealand, and Asia competed against each other for the grand prize.&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
[[File:myrio.jpg|300px|thumb|left|NI MyRIO - 1900]]&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM Cortex-A9 processor and a Xilinx Zynq FPGA.&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA were programmed using LabVIEW 2017, a graphical programming environment. Additional LabVIEW modules, such as the &amp;#039;Vision Development Module&amp;#039; and the &amp;#039;Control Design and Simulation Module&amp;#039;, were used to process the received sensor information.&lt;br /&gt;
&lt;br /&gt;
== Robot Sensors ==&lt;br /&gt;
The robot required a variety of sensors so that it could &amp;quot;see&amp;quot; track obstacles, boundaries, and the surrounding environment, and so that it could calculate its own position and the positions of surrounding elements.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.png|300px|thumb|left|Competition Track]]&lt;br /&gt;
[[File:Map.PNG]]&lt;br /&gt;
[[File:MapDimensions.PNG|300px|thumb|left|Map Dimensions]]&lt;br /&gt;
[[File:MapDimensions.PNG]]&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; VL53L0X time-of-flight sensors&lt;br /&gt;
* Motor rotation sensors: motor encoder package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A camera is used to determine accurate estimates of the robot&amp;#039;s position and of target locations. The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision because of its capabilities, availability, and cost.&lt;br /&gt;
This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect 01.jpg]] &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
[[File:Kinect.png]]&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The competition requires the robot to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing is taxing on the processor; fortunately, the myRIO contains an FPGA that can perform it instead.&lt;br /&gt;
&lt;br /&gt;
The RGB images are processed to extract the locations of the track boundaries, which are marked on the floor with 50mm-wide coloured tape.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was prototyped using Matlab and its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
The resulting line equations can be converted to boundary locations referenced to the robot&amp;#039;s frame. Path planning can then treat these boundaries like obstacles and steer the robot away from them.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11312</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11312"/>
		<updated>2018-10-17T07:58:54Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their reconfigurable processor and FPGA products. In 2018, the competition focuses&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;, where robots must perform various tasks on a track that incorporates hazardous terrain and unforeseen obstacles that must be avoided autonomously.&lt;br /&gt;
This project investigates the use of the NI MyRIO-1900 platform to achieve autonomous localisation, path planning, and environmental awareness. The live final took place in September, where university teams from Australia, New Zealand, and Asia competed against each other.&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
The NI myRIO-1900 contains a dual-core ARM Cortex-A9 processor and a Xilinx Zynq FPGA.&lt;br /&gt;
[[File:myrio.jpg|200px|thumb|left|NI MyRIO - 1900]]&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA are programmed using the LabVIEW graphical programming environment.&lt;br /&gt;
&lt;br /&gt;
== Robot Sensors ==&lt;br /&gt;
The robot requires a variety of sensors so that it may &amp;quot;see&amp;quot; track obstacles and boundaries, and so the robot may know its location within the track.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG]]&lt;br /&gt;
[[File:MapDimensions.PNG]]&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; VL53L0X time-of-flight sensors&lt;br /&gt;
* Motor rotation sensors: motor encoders package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A camera is used to obtain accurate position and target location estimates. The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision because of its capabilities, availability, and cost.&lt;br /&gt;
This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
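As a rough illustration of how a single depth pixel can be turned into an obstacle position referenced to the robot, the sketch below (in Python rather than LabVIEW) uses the pixel column to estimate a bearing and the pixel value as the range. The 57-degree horizontal field of view and 640-pixel image width are illustrative Kinect-like figures, not project measurements, and a small-angle pinhole approximation is assumed.&lt;br /&gt;

```python
import math

def depth_pixel_to_xy(column, depth_mm, image_width=640, hfov_deg=57.0):
    """Return (forward, lateral) obstacle offsets in mm, robot-referenced."""
    # Bearing of this column away from the optical axis, using a small-angle
    # pinhole approximation; the pixel value is treated as range along the ray.
    offset = column - image_width / 2
    angle = math.radians(hfov_deg) * offset / image_width
    return depth_mm * math.cos(angle), depth_mm * math.sin(angle)

# A pixel on the optical axis (column 320) maps straight ahead:
print(depth_pixel_to_xy(320, 1000))  # (1000.0, 0.0)
```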
[[File:Kinect 01.jpg]] &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
[[File:Kinect.png]]&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The competition requires the robot to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing is taxing on the processor. Fortunately, the myRIO contains an FPGA that can take on this work instead.&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries are marked on the floor with 50mm wide coloured tape. The RGB images are processed to extract useful information.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was determined using Matlab and its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
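The HSV conversion and colour masking steps can be sketched in plain Python using only the standard library; the hue band below is a hypothetical calibration value for blue tape, and the actual pipeline used the equivalent Matlab toolbox functions.&lt;br /&gt;

```python
import colorsys

def hsv_mask(rgb_image, hue_range, min_sat=0.4, min_val=0.3):
    """Binary mask: 1 where a pixel falls inside the target colour band."""
    lo, hi = hue_range
    mask = []
    for row in rgb_image:
        mask_row = []
        for r, g, b in row:
            # colorsys works on channels in [0, 1], so 8-bit values are scaled.
            h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
            mask_row.append(int(lo <= h <= hi and s >= min_sat and v >= min_val))
        mask.append(mask_row)
    return mask

# 1x3 test image: pure blue (hue 0.667), pure red (hue 0), dark grey (no saturation).
image = [[(0, 0, 255), (255, 0, 0), (40, 40, 40)]]
print(hsv_mask(image, hue_range=(0.55, 0.75)))  # [[1, 0, 0]]
```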
The produced line equations can be converted to obstacle locations referenced to the robot. Path planning can then treat these boundary locations like any other obstacle and steer the robot around them.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11311</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=11311"/>
		<updated>2018-10-17T07:57:53Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their reconfigurable processor and FPGA products. In 2018, the competition focuses&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039;, where robots must autonomously perform various tasks on a track that incorporates hazardous terrain and unforeseen obstacles.&lt;br /&gt;
This project investigates the use of the NI MyRIO-1900 platform to achieve autonomous localisation, path planning, and environmental awareness. The live final took place in September, where university teams from across Australia, New Zealand, and Asia competed against each other.&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
The myRIO-1900 contains a dual-core ARM processor and a Xilinx Zynq FPGA.&lt;br /&gt;
[[File:myrio.jpg]]&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO processor and FPGA are programmed using the LabVIEW graphical programming environment.&lt;br /&gt;
&lt;br /&gt;
== Robot Sensors ==&lt;br /&gt;
The robot requires a variety of sensors so that it may &amp;quot;see&amp;quot; track obstacles and boundaries, and so the robot may know its location within the track.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG]]&lt;br /&gt;
[[File:MapDimensions.PNG]]&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Logitech C922 webcam&lt;br /&gt;
* Range sensors: ultrasonic &amp;amp; VL53L0X time-of-flight sensors&lt;br /&gt;
* Motor rotation sensors: motor encoders package&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A camera is used to obtain accurate position and target location estimates. The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision because of its capabilities, availability, and cost.&lt;br /&gt;
This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect 01.jpg]] &amp;lt;!-- Replace with Logitech webcam images--&amp;gt;&lt;br /&gt;
[[File:Kinect.png]]&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The competition requires the robot to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing is taxing on the processor. Fortunately, the myRIO contains an FPGA that can take on this work instead.&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries are marked on the floor with 50mm wide coloured tape. The RGB images are processed to extract useful information.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was determined using Matlab and its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
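The erode-then-dilate pair (a morphological opening) can be illustrated with a minimal pure-Python sketch using a 3x3 square structuring element; the pipeline itself used the Matlab toolbox equivalents.&lt;br /&gt;

```python
def _neighbourhood(mask, y, x):
    """Values in the 3x3 window around (y, x), clipped at the image border."""
    h, w = len(mask), len(mask[0])
    return [mask[j][i]
            for j in range(max(0, y - 1), min(h, y + 2))
            for i in range(max(0, x - 1), min(w, x + 2))]

def erode(mask):
    # A pixel survives only if its whole neighbourhood is set: specks vanish.
    return [[int(min(_neighbourhood(mask, y, x)) == 1) for x in range(len(mask[0]))]
            for y in range(len(mask))]

def dilate(mask):
    # A pixel is set if anything in its neighbourhood is set: regions grow back.
    return [[int(max(_neighbourhood(mask, y, x)) == 1) for x in range(len(mask[0]))]
            for y in range(len(mask))]

# A solid 3x3 block survives the erode/dilate round trip; the lone noise
# pixel in the top-right corner does not.
noisy = [[0, 0, 0, 0, 1],
         [0, 1, 1, 1, 0],
         [0, 1, 1, 1, 0],
         [0, 1, 1, 1, 0],
         [0, 0, 0, 0, 0]]
opened = dilate(erode(noisy))
```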
The produced line equations can be converted to obstacle locations referenced to the robot. Path planning can then treat these boundary locations like any other obstacle and steer the robot around them.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Myrio.jpg&amp;diff=11310</id>
		<title>File:Myrio.jpg</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Myrio.jpg&amp;diff=11310"/>
		<updated>2018-10-17T07:57:40Z</updated>

		<summary type="html">&lt;p&gt;A1723574: NIARC 2018, Project #122&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;NIARC 2018, Project #122&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10870</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10870"/>
		<updated>2018-08-29T07:23:35Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their microcontroller and FPGA products. In 2018, the competition focuses&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039; where team robots will perform various tasks on a track that incorporates off-road and slippery terrain, and unforeseen obstacles to be overcome or avoided autonomously.&lt;br /&gt;
This project investigates the use of the NI MyRIO-1900 FPGA/LabVIEW platform for autonomous vehicles. The group will apply for&lt;br /&gt;
the competition in March 2018, with the competition taking place in September 2018.&lt;br /&gt;
&lt;br /&gt;
== Processing Platform ==&lt;br /&gt;
MyRIO-1900&lt;br /&gt;
Contains a CPU and an FPGA for processing&lt;br /&gt;
&lt;br /&gt;
== Programming Environment ==&lt;br /&gt;
The myRIO is programmed using the LabVIEW environment&lt;br /&gt;
&lt;br /&gt;
== Robot Sensors ==&lt;br /&gt;
The robot requires a variety of sensors so that it may &amp;quot;see&amp;quot; track obstacles and boundaries, and so the robot may know its location within the track.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG]]&lt;br /&gt;
[[File:MapDimensions.PNG]]&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
* Image sensor: Microsoft Kinect&lt;br /&gt;
* Range sensors: ultrasonic, time of flight&lt;br /&gt;
* Motor rotation sensors: motor encoders&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A camera is used to obtain accurate position and target location estimates. The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision because of its capabilities, availability, and cost.&lt;br /&gt;
This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect 01.jpg]]&lt;br /&gt;
[[File:Kinect.png]]&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The competition requires the robot to use the colour images for the following purposes:&lt;br /&gt;
*Identify boundaries on the floor that are marked with 50mm wide coloured tape&lt;br /&gt;
*Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
Such processing is taxing on the processor. Fortunately, the myRIO contains an FPGA that can take on this work instead.&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries are marked on the floor with 50mm wide coloured tape. The RGB images are processed to extract useful information.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was determined using Matlab and its Image Processing Toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
#Import/read captured RGB image&lt;br /&gt;
#Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
#*The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
#Produce mask around desired colour&lt;br /&gt;
#Erode mask to reduce noise regions to nothing&lt;br /&gt;
#Dilate mask to return mask to original size&lt;br /&gt;
#Isolate edges of mask&lt;br /&gt;
#Calculate equations of the lines that run through the edges&lt;br /&gt;
&lt;br /&gt;
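The last step, calculating the equations of the lines that run through the edge pixels, amounts to a line fit. An ordinary least-squares fit is one simple way to do it, sketched here in Python; the pipeline itself computed the line equations in Matlab.&lt;br /&gt;

```python
def fit_line(points):
    """Least-squares slope and intercept for (x, y) edge-pixel coordinates."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Edge pixels lying exactly on y = 2x + 1 recover slope 2 and intercept 1:
print(fit_line([(0, 1), (1, 3), (2, 5), (3, 7)]))  # (2.0, 1.0)
```

Note that a perfectly vertical boundary (all edge pixels in one column) cannot be represented in this slope/intercept form; such a case would need the axes swapped before fitting.&lt;br /&gt;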
&lt;br /&gt;
&lt;br /&gt;
==== Depth Image Processing ====&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10869</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10869"/>
		<updated>2018-08-29T07:12:41Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their microcontroller and FPGA products. In 2018, the competition focuses&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039; where team robots will perform various tasks on a track that incorporates off-road and slippery terrain, and unforeseen obstacles to be overcome or avoided autonomously.&lt;br /&gt;
This project investigates the use of the NI MyRIO-1900 FPGA/LabVIEW platform for autonomous vehicles. The group will apply for&lt;br /&gt;
the competition in March 2018, with the competition taking place in September 2018.&lt;br /&gt;
&lt;br /&gt;
== Robot Sensors ==&lt;br /&gt;
The robot requires a variety of sensors so that it may &amp;quot;see&amp;quot; track obstacles and boundaries, and so the robot may know its location within the track.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG]]&lt;br /&gt;
[[File:MapDimensions.PNG]]&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
- Image sensor: Microsoft Kinect&lt;br /&gt;
- Range sensors: ultrasonic, time of flight&lt;br /&gt;
- Motor rotation sensors: motor encoders&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A camera is used to obtain accurate position and target location estimates. The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision because of its capabilities, availability, and cost.&lt;br /&gt;
This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect 01.jpg]]&lt;br /&gt;
[[File:Kinect.png]]&lt;br /&gt;
&lt;br /&gt;
==== RGB Image Processing ====&lt;br /&gt;
The competition requires the robot to use the colour images for the following purposes:&lt;br /&gt;
1. Identify boundaries on the floor that are marked with 50mm wide coloured tape&lt;br /&gt;
2. Identify wall boundaries&lt;br /&gt;
&lt;br /&gt;
The competition track boundaries are marked on the floor with 50mm wide coloured tape. The RGB images are processed to extract useful information.&lt;br /&gt;
Before any image processing was attempted on the myRIO, the pipeline was determined using Matlab and its processing toolbox.&lt;br /&gt;
&lt;br /&gt;
==== Determining Image Processing Pipeline in Matlab ====&lt;br /&gt;
&lt;br /&gt;
An overview of the pipeline is as follows:&lt;br /&gt;
1. Import/read captured RGB image&lt;br /&gt;
2. Convert RGB (Red-Green-Blue) to HSV (Hue-Saturation-Value)&lt;br /&gt;
The HSV representation of images allows us to easily isolate particular colours (Hue range), select colour intensity (Saturation range), and select brightness (Value range)&lt;br /&gt;
3. Produce mask around desired colour&lt;br /&gt;
4. Erode mask to reduce noise regions to nothing&lt;br /&gt;
5. Dilate mask to return mask to correct size&lt;br /&gt;
6. Isolate edges of masked colour region&lt;br /&gt;
7. Calculate equations of remaining lines and line endpoints&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Depth Image Processing ====&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10868</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10868"/>
		<updated>2018-08-29T06:43:14Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their microcontroller and FPGA products. In 2018, the competition focuses&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039; where team robots will perform various tasks on a track that incorporates off-road and slippery terrain, and unforeseen obstacles to be overcome or avoided autonomously.&lt;br /&gt;
This project investigates the use of the NI MyRIO-1900 FPGA/LabVIEW platform for autonomous vehicles. The group will apply for&lt;br /&gt;
the competition in March 2018, with the competition taking place in September 2018.&lt;br /&gt;
&lt;br /&gt;
== Robot Sensors ==&lt;br /&gt;
The robot requires a variety of sensors so that it may &amp;quot;see&amp;quot; track obstacles and boundaries, and so the robot may know its location within the competition track.&lt;br /&gt;
&lt;br /&gt;
[[File:Map.PNG]]&lt;br /&gt;
[[File:MapDimensions.PNG]]&lt;br /&gt;
&lt;br /&gt;
The sensors used:&lt;br /&gt;
- Image sensor: Microsoft Kinect&lt;br /&gt;
- Range sensors: ultrasonic, time of flight&lt;br /&gt;
- Motor rotation sensors: motor encoders&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
A camera is used to obtain accurate position and target location estimates. The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision because of its capabilities, availability, and cost.&lt;br /&gt;
This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect 01.jpg]]&lt;br /&gt;
[[File:Kinect.png]]&lt;br /&gt;
&lt;br /&gt;
=== RGB Image Processing ===&lt;br /&gt;
The competition requires the robot to use the colour images for the following purposes:&lt;br /&gt;
1. Identify boundaries on the floor that are marked with 50mm wide coloured tape&lt;br /&gt;
2. Identify wall boundaries&lt;br /&gt;
3. Identify &lt;br /&gt;
&lt;br /&gt;
The competition track requires the robot to identify boundaries on the floor that are marked with 50mm wide coloured tape. The RGB images are processed to extract this information.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10867</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10867"/>
		<updated>2018-08-29T06:36:27Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their microcontroller and FPGA products. In 2018, the competition focuses&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039; where team robots will perform various tasks on a track that incorporates off-road and slippery terrain, and unforeseen obstacles to be overcome or avoided autonomously.&lt;br /&gt;
This project investigates the use of the NI MyRIO-1900 FPGA/LabVIEW platform for autonomous vehicles. The group will apply for&lt;br /&gt;
the competition in March 2018, with the competition taking place in September 2018.&lt;br /&gt;
&lt;br /&gt;
== Robot Sensors ==&lt;br /&gt;
The robot uses a variety of sensors so that it may &amp;quot;see&amp;quot; obstacles and boundaries ahead and around the chassis as the track is navigated.&lt;br /&gt;
[[File:Map.PNG]]&lt;br /&gt;
[[File:MapDimensions.PNG]]&lt;br /&gt;
&lt;br /&gt;
A camera is used as one of the various sensors for the purpose of determining accurate position and target location estimations.&lt;br /&gt;
&lt;br /&gt;
=== Image Sensor ===&lt;br /&gt;
The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision because of its capabilities, availability, and cost.&lt;br /&gt;
This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect 01.jpg]]&lt;br /&gt;
[[File:Kinect.png]]&lt;br /&gt;
&lt;br /&gt;
=== RGB Image Processing ===&lt;br /&gt;
The competition requires the robot to use the colour images for the following purposes:&lt;br /&gt;
1. Identify boundaries on the floor that are marked with 50mm wide coloured tape&lt;br /&gt;
2. Identify wall boundaries&lt;br /&gt;
3. Identify &lt;br /&gt;
&lt;br /&gt;
The competition track requires the robot to identify boundaries on the floor that are marked with 50mm wide coloured tape. The RGB images are processed to extract this information.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:MapDimensions.PNG&amp;diff=10866</id>
		<title>File:MapDimensions.PNG</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:MapDimensions.PNG&amp;diff=10866"/>
		<updated>2018-08-29T06:36:01Z</updated>

		<summary type="html">&lt;p&gt;A1723574: NIARC 2018 map with dimensions&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;NIARC 2018 map with dimensions&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Map.PNG&amp;diff=10865</id>
		<title>File:Map.PNG</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Map.PNG&amp;diff=10865"/>
		<updated>2018-08-29T06:35:24Z</updated>

		<summary type="html">&lt;p&gt;A1723574: NIARC 2018 Map&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;NIARC 2018 Map&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10864</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10864"/>
		<updated>2018-08-29T06:15:46Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their microcontroller and FPGA products. In 2018, the competition focuses&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039; where team robots will perform various tasks on a track that incorporates off-road and slippery terrain, and unforeseen obstacles to be overcome or avoided autonomously.&lt;br /&gt;
This project investigates the use of the NI MyRIO-1900 FPGA/LabVIEW platform for autonomous vehicles. The group will apply for&lt;br /&gt;
the competition in March 2018, with the competition taking place in September 2018.&lt;br /&gt;
&lt;br /&gt;
== Image Processing ==&lt;br /&gt;
A camera will be used as one of the various sensors for the purpose of determining accurate position and target location estimations.&lt;br /&gt;
&lt;br /&gt;
The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision. This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect 01.jpg]]&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10863</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10863"/>
		<updated>2018-08-29T06:15:12Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by&lt;br /&gt;
building autonomous robots using one of their microcontroller and FPGA products. In 2018, the competition focuses&lt;br /&gt;
on the theme &amp;#039;Fast Track to the Future&amp;#039; where team robots will perform various tasks on a track that incorporates off-road and slippery terrain, and unforeseen obstacles to be overcome or avoided autonomously.&lt;br /&gt;
This project investigates the use of the NI MyRIO-1900 FPGA/LabVIEW platform for autonomous vehicles. The group will apply for&lt;br /&gt;
the competition in March 2018, with the competition taking place in September 2018.&lt;br /&gt;
&lt;br /&gt;
== Image Processing ==&lt;br /&gt;
A camera will be used as one of the various sensors for the purpose of determining accurate position and target location estimations.&lt;br /&gt;
&lt;br /&gt;
The Microsoft Xbox 360 Kinect Sensor USB 2.0 camera is an ideal candidate for the robot&amp;#039;s vision. This camera produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images are represented with one channel, with each pixel value representing the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect.jpg]]&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10862</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10862"/>
		<updated>2018-08-29T06:14:56Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by building autonomous robots using one of their microcontroller and FPGA products. In 2018 the competition focuses on the theme &amp;#039;Fast Track to the Future&amp;#039;, in which team robots must autonomously perform various tasks on a track that incorporates off-road and slippery terrain, as well as unforeseen obstacles to be overcome or avoided.&lt;br /&gt;
This project investigates the use of the NI myRIO-1900 FPGA/LabVIEW platform for autonomous vehicles. The group will apply for the competition in March 2018, with the competition itself taking place in September 2018.&lt;br /&gt;
&lt;br /&gt;
== Image Processing ==&lt;br /&gt;
A camera will be used as one of the robot&amp;#039;s sensors to obtain accurate estimates of its position and of target locations.&lt;br /&gt;
&lt;br /&gt;
The Microsoft Xbox 360 Kinect sensor, a USB 2.0 camera, is an ideal candidate for the robot&amp;#039;s vision system. It produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images consist of a single channel, in which each pixel value represents the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect.png]]&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10861</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10861"/>
		<updated>2018-08-29T06:13:13Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by building autonomous robots using one of their microcontroller and FPGA products. In 2018 the competition focuses on the theme &amp;#039;Fast Track to the Future&amp;#039;, in which team robots must autonomously perform various tasks on a track that incorporates off-road and slippery terrain, as well as unforeseen obstacles to be overcome or avoided.&lt;br /&gt;
This project investigates the use of the NI myRIO-1900 FPGA/LabVIEW platform for autonomous vehicles. The group will apply for the competition in March 2018, with the competition itself taking place in September 2018.&lt;br /&gt;
&lt;br /&gt;
== Image Processing ==&lt;br /&gt;
A camera will be used as one of the robot&amp;#039;s sensors to obtain accurate estimates of its position and of target locations.&lt;br /&gt;
&lt;br /&gt;
The Microsoft Xbox 360 Kinect sensor, a USB 2.0 camera, is an ideal candidate for the robot&amp;#039;s vision system. It produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images consist of a single channel, in which each pixel value represents the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Kinect.jpg]]&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Kinect_01.jpg&amp;diff=10860</id>
		<title>File:Kinect 01.jpg</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Kinect_01.jpg&amp;diff=10860"/>
		<updated>2018-08-29T06:12:32Z</updated>

		<summary type="html">&lt;p&gt;A1723574: Microsoft, (2018), Microsoft Kinect [ONLINE]. Available at: https://www.gamesmen.com.au/p-o-xbox-360-kinect-sensor-bnd [Accessed 29 August 2018].&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Microsoft, (2018), Microsoft Kinect [ONLINE]. Available at: https://www.gamesmen.com.au/p-o-xbox-360-kinect-sensor-bnd [Accessed 29 August 2018].&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10859</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=10859"/>
		<updated>2018-08-29T06:06:33Z</updated>

		<summary type="html">&lt;p&gt;A1723574: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew&lt;br /&gt;
&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by building autonomous robots using one of their microcontroller and FPGA products. In 2018 the competition focuses on the theme &amp;#039;Fast Track to the Future&amp;#039;, in which team robots must autonomously perform various tasks on a track that incorporates off-road and slippery terrain, as well as unforeseen obstacles to be overcome or avoided.&lt;br /&gt;
This project investigates the use of the NI myRIO-1900 FPGA/LabVIEW platform for autonomous vehicles. The group will apply for the competition in March 2018, with the competition itself taking place in September 2018.&lt;br /&gt;
&lt;br /&gt;
== Image Processing ==&lt;br /&gt;
A camera will be used as one of the robot&amp;#039;s sensors to obtain accurate estimates of its position and of target locations.&lt;br /&gt;
&lt;br /&gt;
The Microsoft Xbox 360 Kinect sensor, a USB 2.0 camera, is an ideal candidate for the robot&amp;#039;s vision system. It produces two types of images: RGB images and depth (range) images. The RGB images consist of three channels: red, green, and blue. The depth images consist of a single channel, in which each pixel value represents the distance from the camera to the first obstruction encountered.&lt;br /&gt;
&lt;br /&gt;
[[File:Example.jpg]]&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=9849</id>
		<title>Projects:2018s1-122 NI Autonomous Robotics Competition</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-122_NI_Autonomous_Robotics_Competition&amp;diff=9849"/>
		<updated>2018-04-03T07:51:51Z</updated>

		<summary type="html">&lt;p&gt;A1723574: /* Supervisors */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Supervisors ==&lt;br /&gt;
Dr Hong Gunn Chew,&lt;br /&gt;
Dr Braden Phillips&lt;br /&gt;
&lt;br /&gt;
== Honours Students ==&lt;br /&gt;
Alexey Havrilenko&lt;br /&gt;
&lt;br /&gt;
Bradley Thompson&lt;br /&gt;
&lt;br /&gt;
Joseph Lawrie&lt;br /&gt;
&lt;br /&gt;
Michael Prendergast&lt;br /&gt;
&lt;br /&gt;
== Project Description ==&lt;br /&gt;
Each year, National Instruments (NI) sponsors a competition to showcase the robotics capabilities of students by building autonomous robots using one of their microcontroller and FPGA products. In 2018 the competition focuses on the theme &amp;#039;Fast Track to the Future&amp;#039;, in which team robots must autonomously perform various tasks on a track that incorporates off-road and slippery terrain, as well as unforeseen obstacles to be overcome or avoided.&lt;br /&gt;
This project investigates the use of the NI FPGA/LabVIEW platform for autonomous vehicles. The group will apply for the competition in March 2018, with the competition to take place in September 2018.&lt;/div&gt;</summary>
		<author><name>A1723574</name></author>
		
	</entry>
</feed>