<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://projectswiki.eleceng.adelaide.edu.au/projects/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=A1631858</id>
	<title>Projects - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://projectswiki.eleceng.adelaide.edu.au/projects/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=A1631858"/>
	<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php/Special:Contributions/A1631858"/>
	<updated>2026-05-03T20:12:29Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.31.4</generator>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1744</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1744"/>
		<updated>2014-10-29T13:53:46Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to extend the functionality and navigation performance of an existing robot platform by adding a movable 3D Kinect sensor. The robot will be able to navigate through the environment and perform collision avoidance whilst it searches for an object of interest.&lt;br /&gt;
&lt;br /&gt;
Objectives:&lt;br /&gt;
&lt;br /&gt;
--Collision avoidance&lt;br /&gt;
&lt;br /&gt;
--Object recognition&lt;br /&gt;
&lt;br /&gt;
--Path planning&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:-31Robot.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
  &lt;br /&gt;
The approach has two parts: a hardware part and a software part.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Hardware Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The hardware structure is based on the robot structure of the previous project. The existing platform comprises eight ultrasonic sensors, one Mux-Shield board, one Wild Thumper board and four motors. To implement object recognition and path planning, a Kinect sensor and a servo motor are added to the existing platform.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:31Hardware.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Software Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The Kinect image processing code and the path planning &amp;amp; collision avoidance code are written in C++. The Kinect image processing program outputs not only object information but also wide-range environment information. This output data is then used by the decision-making program.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:31Software.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Object Recognition:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Object recognition is performed by the 3D Kinect sensor. In our project, the robot can recognize a simple object of a single color. An example is shown below: a yellow ball is recognized.&lt;br /&gt;
&lt;br /&gt;
[[File:31Object.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Path planning:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
In this project, the path planning method is &amp;#039;find the open space&amp;#039;. An example is shown below: the red lines indicate the detected open spaces.&lt;br /&gt;
&lt;br /&gt;
[[File:31Openspace.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Collision avoidance:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Collision avoidance is performed by the acoustic sensors. The avoidance logic is shown below.&lt;br /&gt;
&lt;br /&gt;
[[File:31Avoidance.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Testing:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
A test result is shown below. It shows that the robot is able to detect an object, find open space for navigation, and avoid collisions.&lt;br /&gt;
&lt;br /&gt;
[[File:31Test.PNG]]&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1743</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1743"/>
		<updated>2014-10-29T13:53:06Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to extend the functionality and navigation performance of an existing robot platform by adding a movable 3D Kinect sensor. The robot will be able to navigate through the environment and perform collision avoidance whilst it searches for an object of interest.&lt;br /&gt;
Objectives:&lt;br /&gt;
&lt;br /&gt;
--Collision avoidance&lt;br /&gt;
&lt;br /&gt;
--Object recognition&lt;br /&gt;
&lt;br /&gt;
--Path planning&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:-31Robot.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
  &lt;br /&gt;
The approach has two parts: a hardware part and a software part.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Hardware Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The hardware structure is based on the robot structure of the previous project. The existing platform comprises eight ultrasonic sensors, one Mux-Shield board, one Wild Thumper board and four motors. To implement object recognition and path planning, a Kinect sensor and a servo motor are added to the existing platform.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:31Hardware.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Software Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The Kinect image processing code and the path planning &amp;amp; collision avoidance code are written in C++. The Kinect image processing program outputs not only object information but also wide-range environment information. This output data is then used by the decision-making program.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:31Software.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Object Recognition:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Object recognition is performed by the 3D Kinect sensor. In our project, the robot can recognize a simple object of a single color. An example is shown below: a yellow ball is recognized.&lt;br /&gt;
&lt;br /&gt;
[[File:31Object.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Path planning:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
In this project, the path planning method is &amp;#039;find the open space&amp;#039;. An example is shown below: the red lines indicate the detected open spaces.&lt;br /&gt;
&lt;br /&gt;
[[File:31Openspace.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Collision avoidance:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Collision avoidance is performed by the acoustic sensors. The avoidance logic is shown below.&lt;br /&gt;
&lt;br /&gt;
[[File:31Avoidance.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Testing:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
A test result is shown below. It shows that the robot is able to detect an object, find open space for navigation, and avoid collisions.&lt;br /&gt;
&lt;br /&gt;
[[File:31Test.PNG]]&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1742</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1742"/>
		<updated>2014-10-29T13:52:15Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to extend the functionality and navigation performance of an existing robot platform by adding a movable 3D Kinect sensor. The robot will be able to navigate through the environment and perform collision avoidance whilst it searches for an object of interest.&lt;br /&gt;
Objectives:&lt;br /&gt;
&lt;br /&gt;
--Collision avoidance&lt;br /&gt;
&lt;br /&gt;
--Object recognition&lt;br /&gt;
&lt;br /&gt;
--Path planning&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:-31Robot.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
  &lt;br /&gt;
The approach has two parts: a hardware part and a software part.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Hardware Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The hardware structure is based on the robot structure of the previous project. The existing platform comprises eight ultrasonic sensors, one Mux-Shield board, one Wild Thumper board and four motors. To implement object recognition and path planning, a Kinect sensor and a servo motor are added to the existing platform.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:31Hardware.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Software Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The Kinect image processing code and the path planning &amp;amp; collision avoidance code are written in C++. The Kinect image processing program outputs not only object information but also wide-range environment information. This output data is then used by the decision-making program.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:31Software.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Object Recognition:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
Object recognition is performed by the 3D Kinect sensor. In our project, the robot can recognize a simple object of a single color. An example is shown below: a yellow ball is recognized.&lt;br /&gt;
&lt;br /&gt;
[[File:31Object.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Path planning:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
In this project, the path planning method is &amp;#039;find the open space&amp;#039;. An example is shown below: the red lines indicate the detected open spaces.&lt;br /&gt;
&lt;br /&gt;
[[File:31Openspace.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Collision avoidance:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
Collision avoidance is performed by the acoustic sensors. The avoidance logic is shown below.&lt;br /&gt;
&lt;br /&gt;
[[File:31Avoidance.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Testing:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
A test result is shown below. It shows that the robot is able to detect an object, find open space for navigation, and avoid collisions.&lt;br /&gt;
&lt;br /&gt;
[[File:31Test.PNG]]&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:31Test.PNG&amp;diff=1741</id>
		<title>File:31Test.PNG</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:31Test.PNG&amp;diff=1741"/>
		<updated>2014-10-29T13:52:04Z</updated>

		<summary type="html">&lt;p&gt;A1631858: Test result&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Test result&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:31Avoidance.PNG&amp;diff=1739</id>
		<title>File:31Avoidance.PNG</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:31Avoidance.PNG&amp;diff=1739"/>
		<updated>2014-10-29T13:48:06Z</updated>

		<summary type="html">&lt;p&gt;A1631858: Collision avoidance logic&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Collision avoidance logic&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1738</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1738"/>
		<updated>2014-10-29T13:47:00Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to extend the functionality and navigation performance of an existing robot platform by adding a movable 3D Kinect sensor. The robot will be able to navigate through the environment and perform collision avoidance whilst it searches for an object of interest.&lt;br /&gt;
Objectives:&lt;br /&gt;
--Collision avoidance&lt;br /&gt;
--Object recognition&lt;br /&gt;
--Path planning&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:-31Robot.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
  &lt;br /&gt;
The approach has two parts: a hardware part and a software part.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Hardware Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The hardware structure is based on the robot structure of the previous project. The existing platform comprises eight ultrasonic sensors, one Mux-Shield board, one Wild Thumper board and four motors. To implement object recognition and path planning, a Kinect sensor and a servo motor are added to the existing platform.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:31Hardware.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Software Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The Kinect image processing code and the path planning &amp;amp; collision avoidance code are written in C++. The Kinect image processing program outputs not only object information but also wide-range environment information. This output data is then used by the decision-making program.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:31Software.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Object Recognition:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
Object recognition is performed by the 3D Kinect sensor. In our project, the robot can recognize a simple object of a single color. An example is shown below: a yellow ball is recognized.&lt;br /&gt;
&lt;br /&gt;
[[File:31Object.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Path planning:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
In this project, the path planning method is &amp;#039;find the open space&amp;#039;. An example is shown below: the red lines indicate the detected open spaces.&lt;br /&gt;
&lt;br /&gt;
[[File:31Openspace.PNG]]&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:31Openspace.PNG&amp;diff=1737</id>
		<title>File:31Openspace.PNG</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:31Openspace.PNG&amp;diff=1737"/>
		<updated>2014-10-29T13:45:45Z</updated>

		<summary type="html">&lt;p&gt;A1631858: Openspace example&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Openspace example&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1733</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1733"/>
		<updated>2014-10-29T13:41:37Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to extend the functionality and navigation performance of an existing robot platform by adding a movable 3D Kinect sensor. The robot will be able to navigate through the environment and perform collision avoidance whilst it searches for an object of interest.&lt;br /&gt;
Objectives:&lt;br /&gt;
--Collision avoidance&lt;br /&gt;
--Object recognition&lt;br /&gt;
--Path planning&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:-31Robot.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
  &lt;br /&gt;
The approach has two parts: a hardware part and a software part.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Hardware Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The hardware structure is based on the robot structure of the previous project. The existing platform comprises eight ultrasonic sensors, one Mux-Shield board, one Wild Thumper board and four motors. To implement object recognition and path planning, a Kinect sensor and a servo motor are added to the existing platform.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:31Hardware.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Software Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The Kinect image processing code and the path planning &amp;amp; collision avoidance code are written in C++. The Kinect image processing program outputs not only object information but also wide-range environment information. This output data is then used by the decision-making program.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:31Software.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Object Recognition:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
[[File:31Object.PNG]]&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:31Object.PNG&amp;diff=1731</id>
		<title>File:31Object.PNG</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:31Object.PNG&amp;diff=1731"/>
		<updated>2014-10-29T13:40:56Z</updated>

		<summary type="html">&lt;p&gt;A1631858: Group 31 Object&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Group 31 Object&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1723</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1723"/>
		<updated>2014-10-29T13:19:33Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to connect a Kinect 3D sensor to the existing robot, which currently has 8 acoustic sensors.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:-31Robot.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
  &lt;br /&gt;
The approach has two parts: a hardware part and a software part.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Hardware Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
For the hardware approach, connecting the Kinect 3D sensor and the 8 acoustic sensors to the board correctly is essential. Secondly, the code uploaded to the microcontroller contains the wheel control commands and the commands that activate the ultrasonic sensors and the Kinect sensor.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:31Hardware.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Software Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
For the software approach, serial port communication is addressed first of all, because it has been a big problem during the project. The second concern is the data types, and the third is the Kinect sensor programming. After that, we write the logic code, then test and modify it.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:31Software.PNG]]&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:31Hardware.PNG&amp;diff=1722</id>
		<title>File:31Hardware.PNG</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:31Hardware.PNG&amp;diff=1722"/>
		<updated>2014-10-29T13:19:06Z</updated>

		<summary type="html">&lt;p&gt;A1631858: Group31 Hardware approach&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Group31 Hardware approach&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1721</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1721"/>
		<updated>2014-10-29T13:15:59Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to connect a Kinect 3D sensor to the existing robot, which currently has 8 acoustic sensors.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:-31Robot.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
  &lt;br /&gt;
The approach has two parts: a hardware part and a software part.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Hardware Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
For the hardware approach, connecting the Kinect 3D sensor and the 8 acoustic sensors to the board correctly is essential. Secondly, the code uploaded to the microcontroller contains the wheel control commands and the commands that activate the ultrasonic sensors and the Kinect sensor.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Hardware.png]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Software Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
For the software approach, serial port communication is addressed first of all, because it has been a big problem during the project. The second concern is the data types, and the third is the Kinect sensor programming. After that, we write the logic code, then test and modify it.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:31Software.PNG]]&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1720</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1720"/>
		<updated>2014-10-29T13:15:10Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to connect a Kinect 3D sensor to the existing robot, which currently has 8 acoustic sensors.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:-31Robot.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
  &lt;br /&gt;
The approach has two parts: a hardware part and a software part.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Hardware Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
For the hardware approach, connecting the Kinect 3D sensor and the 8 acoustic sensors to the board correctly is essential. Secondly, the code uploaded to the microcontroller contains the wheel control commands and the commands that activate the ultrasonic sensors and the Kinect sensor.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Hardware.png]]&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Software Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
For the software approach, serial port communication is addressed first of all, because it has been a big problem during the project. The second concern is the data types, and the third is the Kinect sensor programming. After that, we write the logic code, then test and modify it.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:31Software.png]]&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1719</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1719"/>
		<updated>2014-10-29T13:11:22Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to connect a Kinect 3D sensor to the existing robot, which currently has 8 acoustic sensors.&lt;br /&gt;
[[File:-31Robot.PNG]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
  &lt;br /&gt;
The approach has two parts: a hardware part and a software part.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Hardware Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
For the hardware approach, connecting the Kinect 3D sensor and the 8 acoustic sensors to the board correctly is essential. Secondly, the code uploaded to the microcontroller contains the wheel control commands and the commands that activate the ultrasonic sensors and the Kinect sensor.&lt;br /&gt;
[[File:Hardware.png]]&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Software Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
For the software approach, serial port communication was addressed first and treated most seriously, since it had been a major problem earlier in the project. The second concern was the data types, and the third was the Kinect sensor programming. After that, we wrote the logic code, tested it and made modifications. &lt;br /&gt;
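The data-type step above amounts to parsing the raw bytes read over the serial port into usable sensor values. A minimal sketch (in Python for brevity; the 8-byte reply layout and the 0xFF "no echo" marker are assumptions, not the project's actual format):&lt;br /&gt;

```python
# Hypothetical reply layout: one distance byte per ultrasonic sensor,
# in centimetres, with 0xFF meaning the sensor received no echo.
NO_ECHO = 0xFF

def parse_scan(reply):
    """Convert an 8-byte sensor reply into distances; None marks no echo."""
    if len(reply) != 8:
        raise ValueError("expected 8 sensor bytes, got %d" % len(reply))
    return [None if b == NO_ECHO else b for b in reply]

def nearest_obstacle(distances):
    """Return the smallest valid distance, or None if nothing was detected."""
    valid = [d for d in distances if d is not None]
    return min(valid) if valid else None
```

Converting to a fixed in-memory type at the serial boundary keeps the avoidance logic independent of the wire format.&lt;br /&gt;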
[[File:31Software.png]]&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:31Software.PNG&amp;diff=1718</id>
		<title>File:31Software.PNG</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:31Software.PNG&amp;diff=1718"/>
		<updated>2014-10-29T13:10:54Z</updated>

		<summary type="html">&lt;p&gt;A1631858: Software diagram&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Software diagram&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:-31Robot.PNG&amp;diff=1716</id>
		<title>File:-31Robot.PNG</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:-31Robot.PNG&amp;diff=1716"/>
		<updated>2014-10-29T13:08:56Z</updated>

		<summary type="html">&lt;p&gt;A1631858: -31Robot&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;-31Robot&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1703</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1703"/>
		<updated>2014-10-29T12:46:56Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to connect a Kinect 3D sensor to the existing robot, which currently navigates using 8 acoustic sensors.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
  &lt;br /&gt;
The approach has two parts: a hardware part and a software part. &lt;br /&gt;
 [[File:Hardware.png]]&lt;br /&gt;
For the hardware approach, correctly connecting the Kinect 3D sensor and the 8 acoustic sensors to the board is essential. Secondly, the code uploaded to the microcontroller contains the wheel control commands, the ultrasonic sensor commands and the Kinect sensor activation command.  &lt;br /&gt;
 &lt;br /&gt;
For the software approach, serial port communication was addressed first and treated most seriously, since it had been a major problem earlier in the project. The second concern was the data types, and the third was the Kinect sensor programming. After that, we wrote the logic code, tested it and made modifications. Finally, we arranged the whole process in the following order: &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Hardware Approach:&amp;#039;&amp;#039;&amp;#039;&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1701</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1701"/>
		<updated>2014-10-29T12:46:15Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to connect a Kinect 3D sensor to the existing robot, which currently navigates using 8 acoustic sensors.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
  &lt;br /&gt;
The approach has two parts: a hardware part and a software part. &lt;br /&gt;
 [[File:Hardware.PNG]]&lt;br /&gt;
For the hardware approach, correctly connecting the Kinect 3D sensor and the 8 acoustic sensors to the board is essential. Secondly, the code uploaded to the microcontroller contains the wheel control commands, the ultrasonic sensor commands and the Kinect sensor activation command.  &lt;br /&gt;
 &lt;br /&gt;
For the software approach, serial port communication was addressed first and treated most seriously, since it had been a major problem earlier in the project. The second concern was the data types, and the third was the Kinect sensor programming. After that, we wrote the logic code, tested it and made modifications. Finally, we arranged the whole process in the following order: &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Hardware Approach:&amp;#039;&amp;#039;&amp;#039;&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Hardware.png&amp;diff=1699</id>
		<title>File:Hardware.png</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Hardware.png&amp;diff=1699"/>
		<updated>2014-10-29T12:44:59Z</updated>

		<summary type="html">&lt;p&gt;A1631858: A1631858 uploaded a new version of &amp;amp;quot;File:Hardware.png&amp;amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#31 Hardware approach&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1696</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=1696"/>
		<updated>2014-10-29T12:42:24Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to connect a Kinect 3D sensor to the existing robot, which currently navigates using 8 acoustic sensors.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
  &lt;br /&gt;
The approach has two parts: a hardware part and a software part. &lt;br /&gt;
 [[File:Hardware.png]]&lt;br /&gt;
For the hardware approach, correctly connecting the Kinect 3D sensor and the 8 acoustic sensors to the board is essential. Secondly, the code uploaded to the microcontroller contains the wheel control commands, the ultrasonic sensor commands and the Kinect sensor activation command.  &lt;br /&gt;
 &lt;br /&gt;
For the software approach, serial port communication was addressed first and treated most seriously, since it had been a major problem earlier in the project. The second concern was the data types, and the third was the Kinect sensor programming. After that, we wrote the logic code, tested it and made modifications. Finally, we arranged the whole process in the following order: &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Hardware Approach:&amp;#039;&amp;#039;&amp;#039;&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Hardware.png&amp;diff=1695</id>
		<title>File:Hardware.png</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Hardware.png&amp;diff=1695"/>
		<updated>2014-10-29T12:41:44Z</updated>

		<summary type="html">&lt;p&gt;A1631858: #31 Hardware approach&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#31 Hardware approach&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=413</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=413"/>
		<updated>2014-10-05T15:06:29Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to connect a Kinect 3D sensor to the existing robot, which currently navigates using 8 acoustic sensors.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
  &lt;br /&gt;
The approach has two parts: the Wild-thumper board and the C++ coding.  &lt;br /&gt;
For the Wild-thumper board part, correctly connecting the Kinect 3D sensor and the 8 acoustic sensors to the board is essential. Secondly, the code uploaded to the microcontroller contains the wheel control commands, the ultrasonic sensor commands and the Kinect sensor activation command.   &lt;br /&gt;
For the C++ coding part, serial port communication was addressed first and treated most seriously, since it had been a major problem earlier in the project. The second concern was the data types, and the third was the Kinect sensor programming. After that, we wrote the logic code, tested it and made modifications. Finally, we arranged the whole process in the following order: &lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=412</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=412"/>
		<updated>2014-10-05T14:49:34Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to connect a Kinect 3D sensor to the existing robot, which currently navigates using 8 acoustic sensors.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
  &lt;br /&gt;
The approach has two parts: the Wild-thumper board and the C++ coding.&lt;br /&gt;
For the Wild-thumper board part, correctly connecting the Kinect 3D sensor and the 8 acoustic sensors to the board is essential.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=411</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=411"/>
		<updated>2014-10-05T14:35:50Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to connect a Kinect 3D sensor to the existing robot, which currently navigates using 8 acoustic sensors.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;  &lt;br /&gt;
    &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039; &lt;br /&gt;
     &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=410</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=410"/>
		<updated>2014-10-05T14:35:29Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to connect a Kinect 3D sensor to the existing robot, which currently navigates using 8 acoustic sensors.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;   &lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;   &lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=409</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=409"/>
		<updated>2014-10-05T14:35:11Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to connect a Kinect 3D sensor to the existing robot, which currently navigates using 8 acoustic sensors.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
   &lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=408</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=408"/>
		<updated>2014-10-05T14:34:41Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to connect a Kinect 3D sensor to the existing robot, which currently navigates using 8 acoustic sensors.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
Yinzia Pang&lt;br /&gt;
   &lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
Danny Gibbins&lt;br /&gt;
   &lt;br /&gt;
Braden Phillips&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=407</id>
		<title>Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2014S1-31_Autonomous_Robot_Navigation_using_a_Movable_Kinect_3D_sensor&amp;diff=407"/>
		<updated>2014-10-05T14:34:17Z</updated>

		<summary type="html">&lt;p&gt;A1631858: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Project Introduction:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
The goal of this project is to connect a Kinect 3D sensor to the existing robot, which currently navigates using 8 acoustic sensors.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Overall Approach:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Team Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
Yinzia Pang&lt;br /&gt;
Yingzheng Wang&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
Danny Gibbins&lt;br /&gt;
Braden Phillips&lt;/div&gt;</summary>
		<author><name>A1631858</name></author>
		
	</entry>
</feed>