<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://projectswiki.eleceng.adelaide.edu.au/projects/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=A1661336</id>
	<title>Projects - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://projectswiki.eleceng.adelaide.edu.au/projects/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=A1661336"/>
	<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php/Special:Contributions/A1661336"/>
	<updated>2026-05-02T15:38:15Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.31.4</generator>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12737</id>
		<title>Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12737"/>
		<updated>2019-06-17T05:08:44Z</updated>

		<summary type="html">&lt;p&gt;A1661336: /* Final Results */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
This project introduces a distributed system that can handle multiple agents&lt;br /&gt;
at once. Each agent in the system carries its own camera and makes its own&lt;br /&gt;
decisions. However, those decisions also depend on communication between the&lt;br /&gt;
agents: the exchanged data include the coordinates of the current positions,&lt;br /&gt;
the wheel angles, the velocities, and the distances between positions. By&lt;br /&gt;
integrating this information, the system computes the best solution for the&lt;br /&gt;
current scenario and commands each agent to move to its desired position.&lt;br /&gt;
&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Group Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Liang Xu&lt;br /&gt;
&lt;br /&gt;
Fan Yang&lt;br /&gt;
&lt;br /&gt;
Qiao Sang&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Advisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Xin Yuan&lt;br /&gt;
&lt;br /&gt;
Yutong Liu&lt;br /&gt;
&lt;br /&gt;
Yuan Sun&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Prof. Peng Shi&lt;br /&gt;
&lt;br /&gt;
Prof. Cheng-Chew Lim&lt;br /&gt;
&lt;br /&gt;
== Objectives ==&lt;br /&gt;
&lt;br /&gt;
Based on what we know so far, the objectives of this project in the first&lt;br /&gt;
semester can be summarised as follows:&lt;br /&gt;
&lt;br /&gt;
1. Vision-based sensing.&lt;br /&gt;
&lt;br /&gt;
2. Route planning.&lt;br /&gt;
&lt;br /&gt;
3. Formation control.&lt;br /&gt;
&lt;br /&gt;
4. Develop a distributed control system.&lt;br /&gt;
&lt;br /&gt;
5. Integrate all members’ work into the virtual environment and test it.&lt;br /&gt;
&lt;br /&gt;
6. Check for more possible formations.&lt;br /&gt;
&lt;br /&gt;
7. Obstacle avoidance after formation.&lt;br /&gt;
&lt;br /&gt;
8. Transfer from the virtual environment to the physical platform and test.&lt;br /&gt;
&lt;br /&gt;
== System Design ==&lt;br /&gt;
&lt;br /&gt;
Starting from the highest level of the multi-agent system: the system contains four agents and is distributed, which means each agent can make its own decisions. Even so, the four agents still need to communicate their velocities, current positions, destinations, and heading angles to each other. Exchanging these data allows them to reach an agreement so that they can achieve the desired formation. This part was done by Fan.&lt;br /&gt;
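&lt;br /&gt;
As an illustration only (not the actual project code), the state each agent might broadcast to its peers could look like the sketch below; the field names and units are assumptions.&lt;br /&gt;
&lt;pre&gt;
# Hypothetical sketch of the per-agent state exchanged over the network.
# Field names and units are assumptions, not the actual project code.
from dataclasses import dataclass

@dataclass
class AgentState:
    agent_id: int
    x: float          # current position (m)
    y: float
    heading: float    # wheel/body angle (rad)
    velocity: float   # current speed (m/s)
    dest_x: float     # destination agreed with the other agents (m)
    dest_y: float

def distance(a: AgentState, b: AgentState) -&gt; float:
    """Inter-agent distance, one of the quantities shared between agents."""
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5
&lt;/pre&gt;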
&lt;br /&gt;
Limited by the tools available to us, we had to forgo some more convenient routes to the final result, such as an ultrasonic sensor or an infrared rangefinder. Instead, we use a camera to construct a coordinate system and measure the distances between objects and the agents. We also use the camera for image recognition, so that each agent can recognise objects in its “sight”. This information is passed to the processor, where our algorithm produces an optimised path for the agent; the agent then sends commands to its motors. The description above applies to a single agent.&lt;br /&gt;
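&lt;br /&gt;
As a hedged illustration of single-camera ranging (one common approach, not necessarily the one used here): assume a flat ground plane and a known camera height, and recover the distance from the image row of a detected object. All parameter values below are assumptions.&lt;br /&gt;
&lt;pre&gt;
# Illustrative pinhole-camera sketch, assuming a level camera at a known
# height above a flat ground plane. Parameter values are placeholders.
import math

CAM_HEIGHT = 0.20   # camera height above the ground (m), assumed
FOCAL_PX   = 600.0  # focal length in pixels, assumed
CY         = 240.0  # image row of the optical centre, assumed

def ground_distance(pixel_row):
    """Distance along the ground to a point seen at row pixel_row.
    Valid only for points below the horizon (pixel_row larger than CY)."""
    angle_below_horizon = math.atan2(pixel_row - CY, FOCAL_PX)
    return CAM_HEIGHT / math.tan(angle_below_horizon)
&lt;/pre&gt;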
&lt;br /&gt;
To test whether our algorithm works, I implemented a test scene in V-REP using the built-in Pioneer 3-DX model, a two-wheeled rover that obeys basic physical laws. On top of that, we developed a motor controller to drive the Pioneer 3-DX. When Fan’s control algorithm executes and outputs a set of trajectory coordinates, V-REP takes those coordinates in and feeds them to the rovers respectively; the motor controller then makes the wheels spin by calculating the velocity and angle required between the starting position and the destination.&lt;br /&gt;
After validation in the virtual environment, testing is transferred from V-REP to a physical platform developed by Anh. Just as V-REP runs the output of Fan’s algorithm, each real rover also takes in a pair of coordinates; the Arduino controller on board then drives the rover from one point to the desired position and eventually completes the formation.&lt;br /&gt;
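&lt;br /&gt;
Below is a minimal sketch of one standard go-to-goal step for a two-wheeled (differential-drive) rover such as the Pioneer 3-DX: turn toward the target while driving forward. The wheel geometry and gains are placeholders, not the values used in the project.&lt;br /&gt;
&lt;pre&gt;
# Sketch of a proportional go-to-goal controller for a two-wheeled rover.
# WHEEL_BASE and the gains are placeholders, not the project values.
import math

WHEEL_BASE = 0.33        # distance between the two wheels (m), assumed
K_LIN, K_ANG = 0.5, 1.5  # proportional gains, assumed

def wheel_speeds(x, y, theta, gx, gy):
    """Left/right wheel speeds to move pose (x, y, theta) toward (gx, gy)."""
    dist = math.hypot(gx - x, gy - y)
    heading_to_goal = math.atan2(gy - y, gx - x)
    # wrap the heading error into [-pi, pi]
    err = (heading_to_goal - theta + math.pi) % (2 * math.pi) - math.pi
    v = K_LIN * dist  # forward speed
    w = K_ANG * err   # turn rate
    return v - w * WHEEL_BASE / 2, v + w * WHEEL_BASE / 2
&lt;/pre&gt;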
&lt;br /&gt;
== Final Results ==&lt;br /&gt;
&lt;br /&gt;
We successfully used four rovers to form multiple formations, as shown below.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[File:Real rect after.PNG]]&lt;br /&gt;
&lt;br /&gt;
[[File:Real line after.PNG]]&lt;br /&gt;
&lt;br /&gt;
[[File:Real star after.PNG]]&lt;br /&gt;
&lt;br /&gt;
[[File:Real tri after.PNG]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
We found that the formations available with only four rovers were very limited, so we expanded the number of rovers to six, which let us come up with more formations.&lt;br /&gt;
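&lt;br /&gt;
As an illustration only (not the project code), target coordinates for simple formations can be generated as in the sketch below; the spacing and radius values are placeholders.&lt;br /&gt;
&lt;pre&gt;
# Hypothetical generator of target coordinates for simple formations.
import math

def line_formation(n, spacing=0.5):
    """Targets for n rovers in a straight line, spacing metres apart."""
    return [(i * spacing, 0.0) for i in range(n)]

def polygon_formation(n, radius=1.0):
    """Targets for n rovers on a regular polygon (triangle, square, ...)."""
    return [(radius * math.cos(2 * math.pi * i / n),
             radius * math.sin(2 * math.pi * i / n)) for i in range(n)]

# Six rovers admit more shapes than four, e.g. a regular hexagon:
print(polygon_formation(6))
&lt;/pre&gt;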
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
At the end of this project, we have an algorithm capable of formation control: it takes in the desired coordinates and outputs the moving trajectories of the rovers. We demonstrated this in both the virtual environment and the real world. Although the algorithm is real-time and decentralised, we could not make the real rovers run in real time because of time constraints and schedule delays. However, we were able to run the real-time simulation in the virtual environment, which indicates that our algorithm is also feasible in the real world.&lt;br /&gt;
In both the virtual and the physical environment, we were able to control four rovers to form the different shapes we wanted. We also expanded the number of rovers to nine in the virtual environment and to six in the physical environment.&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Real_tri_after.PNG&amp;diff=12733</id>
		<title>File:Real tri after.PNG</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Real_tri_after.PNG&amp;diff=12733"/>
		<updated>2019-06-17T05:02:48Z</updated>

		<summary type="html">&lt;p&gt;A1661336: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Real_star_after.PNG&amp;diff=12731</id>
		<title>File:Real star after.PNG</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Real_star_after.PNG&amp;diff=12731"/>
		<updated>2019-06-17T05:01:53Z</updated>

		<summary type="html">&lt;p&gt;A1661336: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Real_line_after.PNG&amp;diff=12729</id>
		<title>File:Real line after.PNG</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Real_line_after.PNG&amp;diff=12729"/>
		<updated>2019-06-17T05:00:57Z</updated>

		<summary type="html">&lt;p&gt;A1661336: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Real_rect_after.PNG&amp;diff=12726</id>
		<title>File:Real rect after.PNG</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Real_rect_after.PNG&amp;diff=12726"/>
		<updated>2019-06-17T04:56:53Z</updated>

		<summary type="html">&lt;p&gt;A1661336: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Star_after.png&amp;diff=12725</id>
		<title>File:Star after.png</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=File:Star_after.png&amp;diff=12725"/>
		<updated>2019-06-17T04:55:10Z</updated>

		<summary type="html">&lt;p&gt;A1661336: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12724</id>
		<title>Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12724"/>
		<updated>2019-06-17T04:53:31Z</updated>

		<summary type="html">&lt;p&gt;A1661336: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
This project introduces a distributed system that is able to handle multiple agents&lt;br /&gt;
at once. Each individual of this system will carry a camera alone and make own&lt;br /&gt;
decisions. However, the decisions are also depend on the communication between&lt;br /&gt;
the individuals, the data through communication includes the coordinates of&lt;br /&gt;
current positions, the wheel angles, velocities, and the distances from one position&lt;br /&gt;
to another. By integrating these information together, the system will come up&lt;br /&gt;
with a best solution for current scenario and command the each agent to move to&lt;br /&gt;
the desired position.&lt;br /&gt;
&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Group Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Liang Xu&lt;br /&gt;
&lt;br /&gt;
Fan Yang&lt;br /&gt;
&lt;br /&gt;
Qiao Sang&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Advisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Xin Yuan&lt;br /&gt;
&lt;br /&gt;
Yutong Liu&lt;br /&gt;
&lt;br /&gt;
Yuan Sun&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Prof. Peng Shi&lt;br /&gt;
&lt;br /&gt;
Prof. Cheng-Chew Lim&lt;br /&gt;
&lt;br /&gt;
== Objectives ==&lt;br /&gt;
&lt;br /&gt;
Based on the information we know so far, the objectives of this project in the first&lt;br /&gt;
semester can be concluded as below:&lt;br /&gt;
&lt;br /&gt;
1. Vision-based sensing.&lt;br /&gt;
&lt;br /&gt;
2. Route planning.&lt;br /&gt;
&lt;br /&gt;
3. Formation control.&lt;br /&gt;
&lt;br /&gt;
4. Develop a distributed control system.&lt;br /&gt;
&lt;br /&gt;
5. Integrating all member’s work into the virtual environment and test.&lt;br /&gt;
&lt;br /&gt;
6. Check for more possible formations.&lt;br /&gt;
&lt;br /&gt;
7. Obstacle avoidance after formation.&lt;br /&gt;
&lt;br /&gt;
8. Transfer from virtual environment to the physical platform and test.&lt;br /&gt;
&lt;br /&gt;
== System Design ==&lt;br /&gt;
&lt;br /&gt;
Starting from the highest level of the multi-agent system. This system will be containing four agents, it is a distributed system, which means each agent in this system can make their own decision. But even with this ability, the four agents still need to communicate with each other about the velocity, the current position, the destination and the angle of the agent. The exchange of these data allows them to make an agreement so that they can achieve the desired formation. This part was done by Fan. &lt;br /&gt;
&lt;br /&gt;
Limited by the tools we have in reality, we have to skip some convenient way to approach final result, such as the ultra-sonic sensor, or the infrared distance meter. In this case, we use the camera instead to produce a coordinate system to measure the distances between objects and the agents. We also use the camera to apply image recognition so that the agents are able to recognize the object in its “sight”. By passing these information into processor, our algorithm will provide an optimized path to the agent, the agent then give commands to the motors. The description above is only for one single agent.&lt;br /&gt;
&lt;br /&gt;
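The camera pipeline is described only at a high level above; under a simple pinhole-camera assumption, the distance to a point seen on the ground can be estimated from its pixel row, as in the hypothetical sketch below (the focal length, camera height, and tilt are placeholder values, not calibrated figures from our rovers).&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
# Hypothetical sketch: pixel row to ground distance for a forward-facing
# camera mounted at a known height, assuming flat ground and a pinhole model.
import math

def ground_distance(pixel_row, image_height=480, focal_px=500.0,
                    cam_height=0.25, tilt_rad=0.1):
    # pixel_row is counted from the top of the image; rows below the
    # horizon correspond to points on the floor in front of the rover.
    ray = math.atan2(pixel_row - image_height / 2.0, focal_px)
    dip = tilt_rad + ray  # how far this ray points below horizontal
    if dip &lt;= 0.0:
        raise ValueError("pixel is at or above the horizon")
    return cam_height / math.tan(dip)
&lt;/pre&gt;
&lt;br /&gt;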
To test whether our algorithm works, I implemented a scene like the figure shown here in V-REP using the built-in model called Pioneer3dx, a two-wheel rover that obeys basic physical laws. On top of that, we developed a motor controller to manipulate the movement of the Pioneer3dx. When Fan’s control algorithm executes and outputs a set of trajectory coordinates, V-REP takes those coordinates in and feeds them to the rovers respectively; the motor controller then makes the wheels spin by calculating the velocity and angle needed between the starting position and the destination.&lt;br /&gt;
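Our actual motor controller lives in the V-REP scripts; as a rough outline only, the velocity-and-angle calculation for a two-wheel rover can be sketched with a standard differential-drive go-to-goal rule (the gains and wheel base below are illustrative, not our tuned values).&lt;br /&gt;
&lt;br /&gt;
&lt;pre&gt;
# Hypothetical sketch: steer toward the goal while driving, then convert the
# (linear, angular) command into left/right wheel speeds.
import math

def wheel_speeds(x, y, theta, gx, gy,
                 v_max=0.3, k_turn=1.5, wheel_base=0.33):
    heading = math.atan2(gy - y, gx - x)           # angle to the goal
    error = math.atan2(math.sin(heading - theta),  # heading error wrapped
                       math.cos(heading - theta))  # into [-pi, pi]
    v = v_max * math.cos(error)                    # slow, even reverse, when misaligned
    w = k_turn * error                             # proportional turn rate
    return v - w * wheel_base / 2.0, v + w * wheel_base / 2.0
&lt;/pre&gt;
Driving the left and right wheels at these two speeds makes the rover arc toward the goal and straighten out as the heading error shrinks.&lt;br /&gt;
&lt;br /&gt;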
After validation in the virtual environment, the testing is transferred from V-REP to the physical model developed by Anh. Just as V-REP runs the output of Fan’s algorithm, the real rovers also take in a set of coordinates; given two different points, the on-board Arduino controller drives the rover to the desired position and eventually completes the formation.&lt;br /&gt;
&lt;br /&gt;
== Final Results ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
At the end of this project, we have an algorithm capable of formation control: it takes in the desired coordinates and outputs the moving trajectories of the rovers. We demonstrated this in both the virtual environment and the real world. Although the algorithm is real-time and decentralised, we could not make the real rovers run in real time as well, owing to the time cost and schedule delays, but we were able to run the real-time simulation in the virtual environment, which indicates that our algorithm is also feasible in the real world. &lt;br /&gt;
In both the virtual and the physical environments, we were able to control four rovers to form the different shapes we want. We also expanded the number of rovers to nine in the virtual environment and to six in the physical environment.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12723</id>
		<title>Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12723"/>
		<updated>2019-06-17T04:50:54Z</updated>

		<summary type="html">&lt;p&gt;A1661336: /* System Design */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
This project introduces a distributed system that is able to handle multiple&lt;br /&gt;
agents at once. Each agent in this system carries its own camera and makes its&lt;br /&gt;
own decisions. However, those decisions also depend on communication between&lt;br /&gt;
the agents: the data exchanged include the coordinates of the current&lt;br /&gt;
positions, the wheel angles, the velocities, and the distances from one&lt;br /&gt;
position to another. By integrating this information, the system computes the&lt;br /&gt;
best solution for the current scenario and commands each agent to move to its&lt;br /&gt;
desired position.&lt;br /&gt;
&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Group Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Liang Xu&lt;br /&gt;
&lt;br /&gt;
Fan Yang&lt;br /&gt;
&lt;br /&gt;
Qiao Sang&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Advisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Xin Yuan&lt;br /&gt;
&lt;br /&gt;
Yutong Liu&lt;br /&gt;
&lt;br /&gt;
Yuan Sun&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Prof. Peng Shi&lt;br /&gt;
&lt;br /&gt;
Prof. Cheng-Chew Lim&lt;br /&gt;
&lt;br /&gt;
== Objectives ==&lt;br /&gt;
&lt;br /&gt;
Based on what we know so far, the objectives of this project for the first&lt;br /&gt;
semester can be summarised as follows:&lt;br /&gt;
&lt;br /&gt;
1. Vision-based sensing.&lt;br /&gt;
&lt;br /&gt;
2. Route planning.&lt;br /&gt;
&lt;br /&gt;
3. Formation control.&lt;br /&gt;
&lt;br /&gt;
4. Develop a distributed control system.&lt;br /&gt;
&lt;br /&gt;
5. Integrate all members’ work into the virtual environment and test it.&lt;br /&gt;
&lt;br /&gt;
6. Check for more possible formations.&lt;br /&gt;
&lt;br /&gt;
7. Obstacle avoidance after formation.&lt;br /&gt;
&lt;br /&gt;
8. Transfer from the virtual environment to the physical platform and test.&lt;br /&gt;
&lt;br /&gt;
== System Design ==&lt;br /&gt;
&lt;br /&gt;
Starting from the highest level of the multi-agent system: the system contains four agents and is distributed, which means each agent can make its own decisions. Even with this ability, the four agents still need to communicate with each other about their velocities, current positions, destinations, and heading angles. Exchanging these data allows them to reach an agreement so that they can achieve the desired formation. This part was done by Fan.&lt;br /&gt;
&lt;br /&gt;
Limited by the tools available to us, we had to forgo some more convenient routes to the final result, such as an ultrasonic sensor or an infrared rangefinder. Instead, we use the camera to produce a coordinate system for measuring the distances between objects and the agents. We also use the camera for image recognition, so that each agent is able to recognize the objects in its “sight”. By passing this information to the processor, our algorithm provides an optimized path to the agent, and the agent then sends commands to the motors. The description above covers only a single agent.&lt;br /&gt;
&lt;br /&gt;
To test whether our algorithm works, I implemented a scene like the figure shown here in V-REP using the built-in model called Pioneer3dx, a two-wheel rover that obeys basic physical laws. On top of that, we developed a motor controller to manipulate the movement of the Pioneer3dx. When Fan’s control algorithm executes and outputs a set of trajectory coordinates, V-REP takes those coordinates in and feeds them to the rovers respectively; the motor controller then makes the wheels spin by calculating the velocity and angle needed between the starting position and the destination.&lt;br /&gt;
After validation in the virtual environment, the testing is transferred from V-REP to the physical model developed by Anh. Just as V-REP runs the output of Fan’s algorithm, the real rovers also take in a set of coordinates; given two different points, the on-board Arduino controller drives the rover to the desired position and eventually completes the formation.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
At the end of this project, we have an algorithm capable of formation control: it takes in the desired coordinates and outputs the moving trajectories of the rovers. We demonstrated this in both the virtual environment and the real world. Although the algorithm is real-time and decentralised, we could not make the real rovers run in real time as well, owing to the time cost and schedule delays, but we were able to run the real-time simulation in the virtual environment, which indicates that our algorithm is also feasible in the real world. &lt;br /&gt;
In both the virtual and the physical environments, we were able to control four rovers to form the different shapes we want. We also expanded the number of rovers to nine in the virtual environment and to six in the physical environment.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12722</id>
		<title>Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12722"/>
		<updated>2019-06-17T04:49:52Z</updated>

		<summary type="html">&lt;p&gt;A1661336: /* System Design */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
This project introduces a distributed system that is able to handle multiple&lt;br /&gt;
agents at once. Each agent in this system carries its own camera and makes its&lt;br /&gt;
own decisions. However, those decisions also depend on communication between&lt;br /&gt;
the agents: the data exchanged include the coordinates of the current&lt;br /&gt;
positions, the wheel angles, the velocities, and the distances from one&lt;br /&gt;
position to another. By integrating this information, the system computes the&lt;br /&gt;
best solution for the current scenario and commands each agent to move to its&lt;br /&gt;
desired position.&lt;br /&gt;
&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Group Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Liang Xu&lt;br /&gt;
&lt;br /&gt;
Fan Yang&lt;br /&gt;
&lt;br /&gt;
Qiao Sang&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Advisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Xin Yuan&lt;br /&gt;
&lt;br /&gt;
Yutong Liu&lt;br /&gt;
&lt;br /&gt;
Yuan Sun&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Prof. Peng Shi&lt;br /&gt;
&lt;br /&gt;
Prof. Cheng-Chew Lim&lt;br /&gt;
&lt;br /&gt;
== Objectives ==&lt;br /&gt;
&lt;br /&gt;
Based on what we know so far, the objectives of this project for the first&lt;br /&gt;
semester can be summarised as follows:&lt;br /&gt;
&lt;br /&gt;
1. Vision-based sensing.&lt;br /&gt;
&lt;br /&gt;
2. Route planning.&lt;br /&gt;
&lt;br /&gt;
3. Formation control.&lt;br /&gt;
&lt;br /&gt;
4. Develop a distributed control system.&lt;br /&gt;
&lt;br /&gt;
5. Integrate all members’ work into the virtual environment and test it.&lt;br /&gt;
&lt;br /&gt;
6. Check for more possible formations.&lt;br /&gt;
&lt;br /&gt;
7. Obstacle avoidance after formation.&lt;br /&gt;
&lt;br /&gt;
8. Transfer from the virtual environment to the physical platform and test.&lt;br /&gt;
&lt;br /&gt;
== System Design ==&lt;br /&gt;
&lt;br /&gt;
Starting from the highest level of the multi-agent system: the system contains four agents and is distributed, which means each agent can make its own decisions. Even with this ability, the four agents still need to communicate with each other about their velocities, current positions, destinations, and heading angles. Exchanging these data allows them to reach an agreement so that they can achieve the desired formation. This part was done by Fan.&lt;br /&gt;
Limited by the tools available to us, we had to forgo some more convenient routes to the final result, such as an ultrasonic sensor or an infrared rangefinder. Instead, we use the camera to produce a coordinate system for measuring the distances between objects and the agents. We also use the camera for image recognition, so that each agent is able to recognize the objects in its “sight”. By passing this information to the processor, our algorithm provides an optimized path to the agent, and the agent then sends commands to the motors. The description above covers only a single agent.&lt;br /&gt;
To test whether our algorithm works, I implemented a scene like the figure shown here in V-REP using the built-in model called Pioneer3dx, a two-wheel rover that obeys basic physical laws. On top of that, we developed a motor controller to manipulate the movement of the Pioneer3dx. When Fan’s control algorithm executes and outputs a set of trajectory coordinates, V-REP takes those coordinates in and feeds them to the rovers respectively; the motor controller then makes the wheels spin by calculating the velocity and angle needed between the starting position and the destination.&lt;br /&gt;
After validation in the virtual environment, the testing is transferred from V-REP to the physical model developed by Anh. Just as V-REP runs the output of Fan’s algorithm, the real rovers also take in a set of coordinates; given two different points, the on-board Arduino controller drives the rover to the desired position and eventually completes the formation.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
At the end of this project, we have an algorithm capable of formation control: it takes in the desired coordinates and outputs the moving trajectories of the rovers. We demonstrated this in both the virtual environment and the real world. Although the algorithm is real-time and decentralised, we could not make the real rovers run in real time as well, owing to the time cost and schedule delays, but we were able to run the real-time simulation in the virtual environment, which indicates that our algorithm is also feasible in the real world. &lt;br /&gt;
In both the virtual and the physical environments, we were able to control four rovers to form the different shapes we want. We also expanded the number of rovers to nine in the virtual environment and to six in the physical environment.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12721</id>
		<title>Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12721"/>
		<updated>2019-06-17T04:49:12Z</updated>

		<summary type="html">&lt;p&gt;A1661336: /* Objectives */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
This project introduces a distributed system that is able to handle multiple&lt;br /&gt;
agents at once. Each agent in this system carries its own camera and makes its&lt;br /&gt;
own decisions. However, those decisions also depend on communication between&lt;br /&gt;
the agents: the data exchanged include the coordinates of the current&lt;br /&gt;
positions, the wheel angles, the velocities, and the distances from one&lt;br /&gt;
position to another. By integrating this information, the system computes the&lt;br /&gt;
best solution for the current scenario and commands each agent to move to its&lt;br /&gt;
desired position.&lt;br /&gt;
&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Group Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Liang Xu&lt;br /&gt;
&lt;br /&gt;
Fan Yang&lt;br /&gt;
&lt;br /&gt;
Qiao Sang&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Advisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Xin Yuan&lt;br /&gt;
&lt;br /&gt;
Yutong Liu&lt;br /&gt;
&lt;br /&gt;
Yuan Sun&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Prof. Peng Shi&lt;br /&gt;
&lt;br /&gt;
Prof. Cheng-Chew Lim&lt;br /&gt;
&lt;br /&gt;
== Objectives ==&lt;br /&gt;
&lt;br /&gt;
Based on what we know so far, the objectives of this project for the first&lt;br /&gt;
semester can be summarised as follows:&lt;br /&gt;
&lt;br /&gt;
1. Vision-based sensing.&lt;br /&gt;
&lt;br /&gt;
2. Route planning.&lt;br /&gt;
&lt;br /&gt;
3. Formation control.&lt;br /&gt;
&lt;br /&gt;
4. Develop a distributed control system.&lt;br /&gt;
&lt;br /&gt;
5. Integrate all members’ work into the virtual environment and test it.&lt;br /&gt;
&lt;br /&gt;
6. Check for more possible formations.&lt;br /&gt;
&lt;br /&gt;
7. Obstacle avoidance after formation.&lt;br /&gt;
&lt;br /&gt;
8. Transfer from the virtual environment to the physical platform and test.&lt;br /&gt;
&lt;br /&gt;
== System Design ==&lt;br /&gt;
&lt;br /&gt;
Starting from the highest level of the multi-agent system: the system contains&lt;br /&gt;
four agents and is distributed, which means each agent can make its own&lt;br /&gt;
decisions. Even with this ability, the four agents still need to communicate&lt;br /&gt;
with each other about their velocities, current positions, destinations, and&lt;br /&gt;
heading angles. Exchanging these data allows them to reach an agreement so that&lt;br /&gt;
they can achieve the desired formation. This part was done by Fan.&lt;br /&gt;
Limited by the tools available to us, we had to forgo some more convenient&lt;br /&gt;
routes to the final result, such as an ultrasonic sensor or an infrared&lt;br /&gt;
rangefinder. Instead, we use the camera to produce a coordinate system for&lt;br /&gt;
measuring the distances between objects and the agents. We also use the camera&lt;br /&gt;
for image recognition, so that each agent is able to recognize the objects in&lt;br /&gt;
its “sight”. By passing this information to the processor, our algorithm&lt;br /&gt;
provides an optimized path to the agent, and the agent then sends commands to&lt;br /&gt;
the motors. The description above covers only a single agent.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
At the end of this project, we have an algorithm capable of formation control: it takes in the desired coordinates and outputs the moving trajectories of the rovers. We demonstrated this in both the virtual environment and the real world. Although the algorithm is real-time and decentralised, we could not make the real rovers run in real time as well, owing to the time cost and schedule delays, but we were able to run the real-time simulation in the virtual environment, which indicates that our algorithm is also feasible in the real world. &lt;br /&gt;
In both the virtual and the physical environments, we were able to control four rovers to form the different shapes we want. We also expanded the number of rovers to nine in the virtual environment and to six in the physical environment.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12720</id>
		<title>Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12720"/>
		<updated>2019-06-17T04:47:42Z</updated>

		<summary type="html">&lt;p&gt;A1661336: /* Conclusion */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
This project introduces a distributed system that is able to handle multiple&lt;br /&gt;
agents at once. Each agent in this system carries its own camera and makes its&lt;br /&gt;
own decisions. However, those decisions also depend on communication between&lt;br /&gt;
the agents: the data exchanged include the coordinates of the current&lt;br /&gt;
positions, the wheel angles, the velocities, and the distances from one&lt;br /&gt;
position to another. By integrating this information, the system computes the&lt;br /&gt;
best solution for the current scenario and commands each agent to move to its&lt;br /&gt;
desired position.&lt;br /&gt;
&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Group Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Liang Xu&lt;br /&gt;
&lt;br /&gt;
Fan Yang&lt;br /&gt;
&lt;br /&gt;
Qiao Sang&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Advisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Xin Yuan&lt;br /&gt;
&lt;br /&gt;
Yutong Liu&lt;br /&gt;
&lt;br /&gt;
Yuan Sun&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Prof. Peng Shi&lt;br /&gt;
&lt;br /&gt;
Prof. Cheng-Chew Lim&lt;br /&gt;
&lt;br /&gt;
== Objectives ==&lt;br /&gt;
&lt;br /&gt;
Based on what we know so far, the objectives of this project for the first&lt;br /&gt;
semester can be summarised as follows:&lt;br /&gt;
&lt;br /&gt;
1. Vision-based sensing.&lt;br /&gt;
&lt;br /&gt;
2. Route planning.&lt;br /&gt;
&lt;br /&gt;
3. Formation control.&lt;br /&gt;
&lt;br /&gt;
4. Develop a distributed control system.&lt;br /&gt;
&lt;br /&gt;
== System Design ==&lt;br /&gt;
&lt;br /&gt;
Starting from the highest level of the multi-agent system: the system contains&lt;br /&gt;
four agents and is distributed, which means each agent can make its own&lt;br /&gt;
decisions. Even with this ability, the four agents still need to communicate&lt;br /&gt;
with each other about their velocities, current positions, destinations, and&lt;br /&gt;
heading angles. Exchanging these data allows them to reach an agreement so that&lt;br /&gt;
they can achieve the desired formation. This part was done by Fan.&lt;br /&gt;
Limited by the tools available to us, we had to forgo some more convenient&lt;br /&gt;
routes to the final result, such as an ultrasonic sensor or an infrared&lt;br /&gt;
rangefinder. Instead, we use the camera to produce a coordinate system for&lt;br /&gt;
measuring the distances between objects and the agents. We also use the camera&lt;br /&gt;
for image recognition, so that each agent is able to recognize the objects in&lt;br /&gt;
its “sight”. By passing this information to the processor, our algorithm&lt;br /&gt;
provides an optimized path to the agent, and the agent then sends commands to&lt;br /&gt;
the motors. The description above covers only a single agent.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
At the end of this project, we have an algorithm capable of formation control: it takes in the desired coordinates and outputs the moving trajectories of the rovers. We demonstrated this in both the virtual environment and the real world. Although the algorithm is real-time and decentralised, we could not make the real rovers run in real time as well, owing to the time cost and schedule delays, but we were able to run the real-time simulation in the virtual environment, which indicates that our algorithm is also feasible in the real world. &lt;br /&gt;
In both the virtual and the physical environments, we were able to control four rovers to form the different shapes we want. We also expanded the number of rovers to nine in the virtual environment and to six in the physical environment.&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12371</id>
		<title>Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12371"/>
		<updated>2019-03-22T05:17:09Z</updated>

		<summary type="html">&lt;p&gt;A1661336: /* Objectives */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
This project introduces a distributed system that is able to handle multiple&lt;br /&gt;
agents at once. Each agent in this system carries its own camera and makes its&lt;br /&gt;
own decisions. However, those decisions also depend on communication between&lt;br /&gt;
the agents: the data exchanged include the coordinates of the current&lt;br /&gt;
positions, the wheel angles, the velocities, and the distances from one&lt;br /&gt;
position to another. By integrating this information, the system computes the&lt;br /&gt;
best solution for the current scenario and commands each agent to move to its&lt;br /&gt;
desired position.&lt;br /&gt;
&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Group Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Liang Xu&lt;br /&gt;
&lt;br /&gt;
Fan Yang&lt;br /&gt;
&lt;br /&gt;
Qiao Sang&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Advisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Xin Yuan&lt;br /&gt;
&lt;br /&gt;
Yutong Liu&lt;br /&gt;
&lt;br /&gt;
Yuan Sun&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Prof. Peng Shi&lt;br /&gt;
&lt;br /&gt;
Prof. Cheng-Chew Lim&lt;br /&gt;
&lt;br /&gt;
== Objectives ==&lt;br /&gt;
&lt;br /&gt;
Based on what we know so far, the objectives of this project for the first&lt;br /&gt;
semester can be summarised as follows:&lt;br /&gt;
&lt;br /&gt;
1. Vision-based sensing.&lt;br /&gt;
&lt;br /&gt;
2. Route planning.&lt;br /&gt;
&lt;br /&gt;
3. Formation control.&lt;br /&gt;
&lt;br /&gt;
4. Develop a distributed control system.&lt;br /&gt;
&lt;br /&gt;
== System Design ==&lt;br /&gt;
&lt;br /&gt;
Starting from the highest level of the multi-agent system: the system contains&lt;br /&gt;
four agents and is distributed, which means each agent can make its own&lt;br /&gt;
decisions. Even with this ability, the four agents still need to communicate&lt;br /&gt;
with each other about their velocities, current positions, destinations, and&lt;br /&gt;
heading angles. Exchanging these data allows them to reach an agreement so that&lt;br /&gt;
they can achieve the desired formation. This part was done by Fan.&lt;br /&gt;
Limited by the tools available to us, we had to forgo some more convenient&lt;br /&gt;
routes to the final result, such as an ultrasonic sensor or an infrared&lt;br /&gt;
rangefinder. Instead, we use the camera to produce a coordinate system for&lt;br /&gt;
measuring the distances between objects and the agents. We also use the camera&lt;br /&gt;
for image recognition, so that each agent is able to recognize the objects in&lt;br /&gt;
its “sight”. By passing this information to the processor, our algorithm&lt;br /&gt;
provides an optimized path to the agent, and the agent then sends commands to&lt;br /&gt;
the motors. The description above covers only a single agent.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12370</id>
		<title>Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12370"/>
		<updated>2019-03-22T05:16:44Z</updated>

		<summary type="html">&lt;p&gt;A1661336: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
This project introduces a distributed system that is able to handle multiple&lt;br /&gt;
agents at once. Each agent in this system carries its own camera and makes its&lt;br /&gt;
own decisions. However, those decisions also depend on communication between&lt;br /&gt;
the agents: the data exchanged include the coordinates of the current&lt;br /&gt;
positions, the wheel angles, the velocities, and the distances from one&lt;br /&gt;
position to another. By integrating this information, the system computes the&lt;br /&gt;
best solution for the current scenario and commands each agent to move to its&lt;br /&gt;
desired position.&lt;br /&gt;
&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Group Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Liang Xu&lt;br /&gt;
&lt;br /&gt;
Fan Yang&lt;br /&gt;
&lt;br /&gt;
Qiao Sang&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Advisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Xin Yuan&lt;br /&gt;
&lt;br /&gt;
Yutong Liu&lt;br /&gt;
&lt;br /&gt;
Yuan Sun&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Prof. Peng Shi&lt;br /&gt;
&lt;br /&gt;
Prof. Cheng-Chew Lim&lt;br /&gt;
&lt;br /&gt;
== Objectives ==&lt;br /&gt;
&lt;br /&gt;
Based on what we know so far, the objectives of this project for the first&lt;br /&gt;
semester can be summarised as follows:&lt;br /&gt;
1. Vision-based sensing.&lt;br /&gt;
2. Route planning.&lt;br /&gt;
3. Formation control.&lt;br /&gt;
4. Develop a distributed control system.&lt;br /&gt;
&lt;br /&gt;
== System Design ==&lt;br /&gt;
&lt;br /&gt;
Starting from the highest level of the multi-agent system: the system contains&lt;br /&gt;
four agents and is distributed, which means each agent can make its own&lt;br /&gt;
decisions. Even with this ability, the four agents still need to communicate&lt;br /&gt;
with each other about their velocities, current positions, destinations, and&lt;br /&gt;
heading angles. Exchanging these data allows them to reach an agreement so that&lt;br /&gt;
they can achieve the desired formation. This part was done by Fan.&lt;br /&gt;
Limited by the tools available to us, we had to forgo some more convenient&lt;br /&gt;
routes to the final result, such as an ultrasonic sensor or an infrared&lt;br /&gt;
rangefinder. Instead, we use the camera to produce a coordinate system for&lt;br /&gt;
measuring the distances between objects and the agents. We also use the camera&lt;br /&gt;
for image recognition, so that each agent is able to recognize the objects in&lt;br /&gt;
its “sight”. By passing this information to the processor, our algorithm&lt;br /&gt;
provides an optimized path to the agent, and the agent then sends commands to&lt;br /&gt;
the motors. The description above covers only a single agent.&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12369</id>
		<title>Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=12369"/>
		<updated>2019-03-22T05:14:20Z</updated>

		<summary type="html">&lt;p&gt;A1661336: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
This project introduces a distributed system that is able to handle multiple&lt;br /&gt;
agents at once. Each agent in this system carries its own camera and makes its&lt;br /&gt;
own decisions. However, those decisions also depend on communication between&lt;br /&gt;
the agents: the data exchanged include the coordinates of the current&lt;br /&gt;
positions, the wheel angles, the velocities, and the distances from one&lt;br /&gt;
position to another. By integrating this information, the system computes the&lt;br /&gt;
best solution for the current scenario and commands each agent to move to its&lt;br /&gt;
desired position.&lt;br /&gt;
&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Group Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Liang Xu&lt;br /&gt;
&lt;br /&gt;
Fan Yang&lt;br /&gt;
&lt;br /&gt;
Qiao Sang&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Advisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Xin Yuan&lt;br /&gt;
&lt;br /&gt;
Yutong Liu&lt;br /&gt;
&lt;br /&gt;
Yuan Sun&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Prof. Peng Shi&lt;br /&gt;
&lt;br /&gt;
Prof. Cheng-Chew Lim&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Objectives:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Based on what we know so far, the objectives of this project for the first&lt;br /&gt;
semester can be summarised as follows:&lt;br /&gt;
1. Vision-based sensing.&lt;br /&gt;
2. Route planning.&lt;br /&gt;
3. Formation control.&lt;br /&gt;
4. Develop a distributed control system.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;System Design:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Starting from the highest level of the multi-agent system: the system contains&lt;br /&gt;
four agents and is distributed, which means each agent can make its own&lt;br /&gt;
decisions. Even with this ability, the four agents still need to communicate&lt;br /&gt;
with each other about their velocities, current positions, destinations, and&lt;br /&gt;
heading angles. Exchanging these data allows them to reach an agreement so that&lt;br /&gt;
they can achieve the desired formation. This part was done by Fan.&lt;br /&gt;
Limited by the tools available to us, we had to forgo some more convenient&lt;br /&gt;
routes to the final result, such as an ultrasonic sensor or an infrared&lt;br /&gt;
rangefinder. Instead, we use the camera to produce a coordinate system for&lt;br /&gt;
measuring the distances between objects and the agents. We also use the camera&lt;br /&gt;
for image recognition, so that each agent is able to recognize the objects in&lt;br /&gt;
its “sight”. By passing this information to the processor, our algorithm&lt;br /&gt;
provides an optimized path to the agent, and the agent then sends commands to&lt;br /&gt;
the motors. The description above covers only a single agent.&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=10571</id>
		<title>Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=10571"/>
		<updated>2018-08-12T07:59:37Z</updated>

		<summary type="html">&lt;p&gt;A1661336: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
This project aims to develop control algorithms for the Qbot platform, which is manufactured by Quanser. Our algorithms allow a group of Qbots to move in a desired motion, to avoid obstacles in their way, and to maintain a formation while moving.&lt;br /&gt;
&lt;br /&gt;
== To be continued ==&lt;br /&gt;
&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Group Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Liang Xu&lt;br /&gt;
&lt;br /&gt;
Fan Yang&lt;br /&gt;
&lt;br /&gt;
Tailin Song&lt;br /&gt;
&lt;br /&gt;
Qiao Sang&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Advisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Xin Yuan&lt;br /&gt;
&lt;br /&gt;
Yutong Liu&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Prof. Peng Shi&lt;br /&gt;
&lt;br /&gt;
Prof. Cheng-Chew Lim&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=10570</id>
		<title>Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=10570"/>
		<updated>2018-08-12T07:56:19Z</updated>

		<summary type="html">&lt;p&gt;A1661336: /* Abstract */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
This project aims to develop control algorithms for the Qbot platform, which is manufactured by Quanser. Our algorithms allow a group of Qbots to move in a desired motion, to avoid obstacles in their way, and to maintain a formation while moving.&lt;br /&gt;
&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Group Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Liang Xu&lt;br /&gt;
&lt;br /&gt;
Fan Yang&lt;br /&gt;
&lt;br /&gt;
Tailin Song&lt;br /&gt;
&lt;br /&gt;
Qiao Sang&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Advisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Xin Yuan&lt;br /&gt;
&lt;br /&gt;
Yutong Liu&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Prof. Peng Shi&lt;br /&gt;
&lt;br /&gt;
Prof. Cheng-Chew Lim&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=10569</id>
		<title>Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=10569"/>
		<updated>2018-08-12T07:37:27Z</updated>

		<summary type="html">&lt;p&gt;A1661336: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Group Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Liang Xu&lt;br /&gt;
&lt;br /&gt;
Fan Yang&lt;br /&gt;
&lt;br /&gt;
Tailin Song&lt;br /&gt;
&lt;br /&gt;
Qiao Sang&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Advisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Xin Yuan&lt;br /&gt;
&lt;br /&gt;
Yutong Liu&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Prof. Peng Shi&lt;br /&gt;
&lt;br /&gt;
Prof. Cheng-Chew Lim&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=10568</id>
		<title>Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s2-270_Autonomous_Ground_Vehicles_Self-Guided_Formation_Control&amp;diff=10568"/>
		<updated>2018-08-12T07:36:46Z</updated>

		<summary type="html">&lt;p&gt;A1661336: Created page with &amp;quot; == Summary ==    == Project Team == &amp;#039;&amp;#039;&amp;#039;Group Members:&amp;#039;&amp;#039;&amp;#039;  Liang Xu  Fan Yang  Tailin Song  Qiao Sang  &amp;#039;&amp;#039;&amp;#039;Advisors:&amp;#039;&amp;#039;&amp;#039;  Xin Yuan  Yutong Liu  &amp;#039;&amp;#039;&amp;#039;Supervisors:&amp;#039;&amp;#039;&amp;#039;  Prof. Peng Sh...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Summary ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Group Members:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Liang Xu&lt;br /&gt;
&lt;br /&gt;
Fan Yang&lt;br /&gt;
&lt;br /&gt;
Tailin Song&lt;br /&gt;
&lt;br /&gt;
Qiao Sang&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Advisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Xin Yuan&lt;br /&gt;
&lt;br /&gt;
Yutong Liu&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors:&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Prof. Peng Shi&lt;br /&gt;
&lt;br /&gt;
Prof. Cheng-Chew Lim&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Main_Page&amp;diff=10567</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Main_Page&amp;diff=10567"/>
		<updated>2018-08-12T07:32:34Z</updated>

		<summary type="html">&lt;p&gt;A1661336: /* Masters Projects(mid year) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Projects ==&lt;br /&gt;
=== 2018 ===&lt;br /&gt;
==== Ingenuity 2018 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 22-23 October 2018&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2018s1-100 Automated Person Identification with Multiple Sensors]]&lt;br /&gt;
* [[Projects:2018s1-101 Classification of Network Traffic Flows using Deep and Transfer Learning]]&lt;br /&gt;
* [[Projects:2018s1-102 HF Radio Automated Link Establishment (ALE) Model]]&lt;br /&gt;
* [[Projects:2018s1-103 Improving Usability and User Interaction with KALDI Open-Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2018s1-105 Cyber security - Car Hacking]]&lt;br /&gt;
* [[Projects:2018s1-107 Evolution of Spiking Neural Networks for UAV Control]]&lt;br /&gt;
* [[Projects:2018s1-108 Machine Learning Multi-Spectral Simulation]]&lt;br /&gt;
* [[Projects:2018s1-109 High-Resolution Change Prediction using Sparse Spatio-temporal Data]]&lt;br /&gt;
* [[Projects:2018s1-110 Future Submarine Project]]&lt;br /&gt;
* [[Projects:2018s1-111 IoT Connectivity Investigation]]&lt;br /&gt;
* [[Projects:2018s1-112 Automate the 3D Design and Manufacture of Electrical Control Panels using Advanced Digital Technologies]]&lt;br /&gt;
* [[Projects:2018s1-113 AVR Test Rig]]&lt;br /&gt;
* [[Projects:2018s1-115 Passive Radar in the High Frequency Band using Civil Transmissions]]&lt;br /&gt;
* [[Projects:2018s1-116 Data Analytics]]&lt;br /&gt;
* [[Projects:2018s1-119 Design of Calibration Platform for Medical Sensing]]&lt;br /&gt;
* [[Projects:2018s1-121 In-Memory Semantic Processing Using Hyperdimensional Computing]]&lt;br /&gt;
* [[Projects:2018s1-122 NI Autonomous Robotics Competition]]&lt;br /&gt;
* [[Projects:2018s1-128 Software Tool for Fitting Statistical Models to Sea Clutter Data]]&lt;br /&gt;
* [[Projects:2018s1-135 A Low Cost Impedance and Transfer Function Analyser Part 2]]&lt;br /&gt;
* [[Projects:2018s1-140 Energy Storage Requirements for the SA Grid]]&lt;br /&gt;
* [[Projects:2018s1-141 CSI Adelaiide:  Who killed the Somerton Man?]]&lt;br /&gt;
* [[Projects:2018s1-142 Modelling the Dynamics of Cryptocurrency Market]]&lt;br /&gt;
* [[Projects:2018s1-145 Simplified Indoor UAV Operations]]&lt;br /&gt;
* [[Projects:2018s1-151 Raspberry Pi as a Core Device for Efficient Biological Field Survey Data Collection]]&lt;br /&gt;
* [[Projects:2018s1-155 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2018s1-157 Designing Airway Pressure Control Technology for Sleep Apnea Treatment]]&lt;br /&gt;
* [[Projects:2018s1-160 UAV Platform for Cognitive AI Agent]]&lt;br /&gt;
* [[Projects:2018s1-164 Private but Public on the Blockchain]]&lt;br /&gt;
* [[Projects:2018s1-165 Dual IP Stack Exfiltration - Methods and Defences]]&lt;br /&gt;
* [[Projects:2018s1-167 Security Assessment of Watchem and Moochies Watches]]&lt;br /&gt;
* [[Projects:2018s1-168 Penetration Testing of the SpaceTalk Tracking Watch]]&lt;br /&gt;
* [[Projects:2018s1-169 A Better Security Framework for Wearable Devices]]&lt;br /&gt;
* [[Projects:2018s1-175 Split-ring resonators for measuring spatially-distributed complex permittivity at microwave frequencies]]&lt;br /&gt;
* [[Projects:2018s1-181 BMW Autonomous Vehicle]]&lt;br /&gt;
* [[Projects:2018s1-182 Inertia Characterisation and Modelling in a Renewable Energy and Battery Based Microgrid]]&lt;br /&gt;
* [[Projects:2018s1-191 Quasi-Linear Circuit Theory]]&lt;br /&gt;
* [[Projects:2018s1-192 Karplus-Strong Synthesis of Sound]]&lt;br /&gt;
* [[Projects:2018s1-195 Novel Flexible Materials for Wearable Antennas]]&lt;br /&gt;
* [[Projects:2018s1-196 Concealed Wearable Antennas]]&lt;br /&gt;
* [[Projects:2018s1-136UG Rate of Change of Mains Frequency Detection]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2018s1-118 Design of Wireless Sensors for Sleep Apnea Detection]]&lt;br /&gt;
* [[Projects:2018s1-170 Intelligent Parking Control for Autonomous Ground Vehicles]]&lt;br /&gt;
* [[Projects:2018s1-177 Radio astronomy with software-defined radio]]&lt;br /&gt;
* [[Projects:2018s1-178 Creating microwave antennas with 3D printing]]&lt;br /&gt;
* [[Projects:2018s1-180 Development and Control of a Standalone Power Source for Residential Dwellings and Small Businesses]]&lt;br /&gt;
* [[Projects:2018s1-186 Calculation and Optimisation of Energy Usage of Electric Vehicles]]&lt;br /&gt;
* [[Projects:2018s1-190 Dynamical Modelling of Synchronous Machines]]&lt;br /&gt;
* [[Projects:2018s1-136PG Rate of Change of Mains Frequency Detection]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid year) ====&lt;br /&gt;
* [[Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control]]&lt;br /&gt;
&lt;br /&gt;
=== 2017 ===&lt;br /&gt;
==== Ingenuity 2017 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 30-31 October 2017&lt;br /&gt;
* Prizes&lt;br /&gt;
** Best EEE Wiki: Classifying Network Traffic Flows with Deep-Learning by Kyle Thornton, Clinton Page, Daniel Smit&lt;br /&gt;
** Best EEE Exhibit: Face Recognition using 3D Data by Orbille Piol, Michael Sadler, Jesse Willsmore&lt;br /&gt;
[[File:Ingenuity 2017.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2017s1-100 Face Recognition using 3D Data]]&lt;br /&gt;
* [[Projects:2017s1-101 Classifying Network Traffic Flows with Deep-Learning]]&lt;br /&gt;
* [[Projects:2017s1-102 HF Radio Automated Link Establishment (ALE) Model]]&lt;br /&gt;
* [[Projects:2017s1-103 Improving Usability and User Interaction with KALDI Open- Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2017s1-105 “CARLOS TC” Tow Bar Testing Facility]]&lt;br /&gt;
* [[Projects:2017s1-106 Inertia Characterisation and Modelling in a Renewable Energy-based Microgrid]]&lt;br /&gt;
* [[Projects:2017s1-107 Protection of a Convoy of Ships Under Attack]]&lt;br /&gt;
* [[Projects:2017s1-108 Stability and Control of 3-D Formations]]&lt;br /&gt;
* [[Projects:2017s1-109 Dynamically Forming Formations ]]&lt;br /&gt;
* [[Projects:2017s1-110 ‘Real-Time’ FPGA Based Object Recognition &amp;amp; Threat Detection in Hardware]]&lt;br /&gt;
* [[Projects:2017s1-111 OTHR Alternative Computing Architecture]]&lt;br /&gt;
* [[Projects:2017s1-120 Hardware Realisation of the Unum 2.0 Number Format]]&lt;br /&gt;
* [[Projects:2017s1-121 Learning Procedural Knowledge using Random Forests]]&lt;br /&gt;
* [[Projects:2017s1-125 Drone Imaging and Classification using Radar]]&lt;br /&gt;
* [[Projects:2017s1-127 Sound Trilateration for Positioning of the Sound Source]]&lt;br /&gt;
* [[Projects:2017s1-135 A Low Cost Impedance Analyser]]&lt;br /&gt;
* [[Projects:2017s1-140 Energy Storage Requirements for the SA Grid]]&lt;br /&gt;
* [[Projects:2017s1-155 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2017s1-156 Interrogating a Glucose Monitor]]&lt;br /&gt;
* [[Projects:2017s1-157 Automated Classification of Brain Activity during Sleep]]&lt;br /&gt;
* [[Projects:2017s1-160 Cooperating Autonomous Vehicles]]&lt;br /&gt;
* [[Projects:2017s1-165 Forensic Investigation of Fitness Devices]]&lt;br /&gt;
* [[Projects:2017s1-167a Applications of Blockchain to Equity Fund Raising]] &lt;br /&gt;
* [[Projects:2017s1-167b Real Time Video Steam Substitution]]&lt;br /&gt;
* [[Projects:2017s1-167c Smart Grid Security]]&lt;br /&gt;
* [[Projects:2017s1-167d Twitterbots]]&lt;br /&gt;
* [[Projects:2017s1-175 Split-Ring Resonators for Measuring Spatially-Distributed Complex Permittivity at Microwave Frequencies]]&lt;br /&gt;
* [[Projects:2017s1-176 Smart Mirror with Raspberry Pi]]&lt;br /&gt;
* [[Projects:2017s1-177 Radio Astronomy with Software-Defined Radio]]&lt;br /&gt;
* [[Projects:2017s1-180 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2017s1-181 BMW Autonomous Vehicle Project Camera Based Lane Detection in a Road Vehicle for Autonomous Driving]]&lt;br /&gt;
* [[Projects:2017s1-182 BMW Autonomous Vehicle Project Development of Vehicle Control Algorithm]]&lt;br /&gt;
* [[Projects:2017s1-185 BMW Autonomous Vehicle Project Implementation of a Steering Angle Controller on a Lab Test Bench]]&lt;br /&gt;
* [[Projects:2017s1-186 Playing Music Through a Tesla Coil]]&lt;br /&gt;
* [[Projects:2017s1-190 Modelling and Validation for Synchronous Generators]]&lt;br /&gt;
* [[Projects:2017s1-191 Power Electronics for Inductive Power Transfer (IPT)]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2017s1-122 On-Chip Learning]]&lt;br /&gt;
* [[Projects:2017s1-150 Statistical Natural Language Processing]]&lt;br /&gt;
* [[Projects:2017s1-158 Electromyographic Signal Processing for Controlling an Exoskeleton]]&lt;br /&gt;
* [[Projects:2017s1-159 Detecting Penguin’s Heart Sounds]]&lt;br /&gt;
* [[Projects:2017s1-166 Development of TV Ad Blocker]]&lt;br /&gt;
* [[Projects:2017s1-170 Formation Control and Obstacle Avoidance for Heterogeneous Multi-Agent Systems (Unmanned Aerial Vehicles and Robots)]]&lt;br /&gt;
* [[Projects:2017s1-183 BMW Autonomous Vehicle Project Development of a sensor fusion algorithm to determine the current vehicle position in a local tangential plane ]]&lt;br /&gt;
* [[Projects:2017s1-184 BMW Autonomous Vehicle Project Implement the Longitudinal Control Algorithm of the Vehicle]]&lt;br /&gt;
* [[Projects:2017s1-195 Solar Aquaponics ]]&lt;br /&gt;
* [[Projects:2017s1-196 New Computational Methods for the Super Smart Grid]]&lt;br /&gt;
* [[Projects:2017s2-245 Novel textile antennas for wearable wireless communications]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2017 ====&lt;br /&gt;
* EM205&lt;br /&gt;
* 5 June 2018&lt;br /&gt;
* 11:30-13:00&lt;br /&gt;
&lt;br /&gt;
[[File:MidyearExpo_2018.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2017s2-201 Detection and Classification in an Indoor Environment Using WiFi]]&lt;br /&gt;
* [[Projects:2017s2-205 Multi-Profile Parallel Speech-to Text Transcriber]]&lt;br /&gt;
* [[Projects:2017s2-220 Alternative Approaches to AI for the Soccer Table]]&lt;br /&gt;
* [[Projects:2017s2-225 Digital Microphone Array using MEMS Microphones]]&lt;br /&gt;
* [[Projects:2017s2-275 Creating Microwave Antennas with 3D Printing]]&lt;br /&gt;
* [[Projects:2017s2-290 The Magnetorquer]]&lt;br /&gt;
* [[Projects:2017s2-291 Measurement of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2017s2-292 Wide-Area Sun Sensor]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2017s2-215 Wireless Power Transfer]]&lt;br /&gt;
* [[Projects:2017s2-235 An On-line 10 kHz to 1 MHz Impedance Analyser]]&lt;br /&gt;
* [[Projects:2017s2-245 Novel Textile Antennas for Wearable Wireless Communications]]&lt;br /&gt;
* [[Projects:2017s2-270 Reconfiguration on Multi-Agent Systems (Robots Systems)]]&lt;br /&gt;
* [[Projects:2017s2-285 Short-term Solutions for the South Australian Electric Power System]]&lt;br /&gt;
* [[Projects:2017s2-295 Feral Animal Detection using IR Thermal Imagery]]&lt;br /&gt;
&lt;br /&gt;
=== 2016 ===&lt;br /&gt;
==== Ingenuity 2016 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 27-28 October 2016&lt;br /&gt;
&lt;br /&gt;
[[File:Ingenuity_2016.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2016s1-101 Predicting Power Outages from Weather Patterns]]&lt;br /&gt;
* [[Projects:2016s1-102 Classifying Internet Applications and Detecting Malicious Traffic from Network Communications]]&lt;br /&gt;
* [[Projects:2016s1-105 Non-Contact Photoplethysmogram]]&lt;br /&gt;
* [[Projects:2016s1-106 Airborne Antenna Measurement Platform]]&lt;br /&gt;
* [[Projects:2016s1-109 Development, Characterisation and Modelling of Renewable Energy-Based Microgrid]]&lt;br /&gt;
* [[Projects:2016s1-122 A Complete Model for a Synchronous Machine]]&lt;br /&gt;
* [[Projects:2016s1-126 A Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2016s1-128 Evaluating Programming Languages for Educational Robotics Kits]]&lt;br /&gt;
* [[Projects:2016s1-132 RF Transceiver Design for a Portable Radar]]&lt;br /&gt;
* [[Projects:2016s1-145 Indoor localisation using Bluetooth LE for Event Advertising]]&lt;br /&gt;
* [[Projects:2016s1-146 Antonomous Robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2016s1-160a Cyber Security - IoT and CAN Bus Security]]&lt;br /&gt;
* [[Projects:2016s1-160b Cyber Security - e-Government and Network Security]]&lt;br /&gt;
* [[Projects:2016s1-160c Cyber Security - Personal Networks and Devices]]&lt;br /&gt;
* [[Projects:2016s1-171 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2016s1-172 Computer Aided Testing of Batteries for Energy Storage Applications]]&lt;br /&gt;
* [[Projects:2016s1-187 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2016s1-196 Wireless Power Transfer]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2016s1-120 Attacking Cancer with Signal Processing]]&lt;br /&gt;
* [[Projects:2016s1-121 Measurement of  Transformer Parameters]]&lt;br /&gt;
* [[Projects:2016s1-131 ECG Enhancement with Advanced Signal Processing]]&lt;br /&gt;
* [[Projects:2016s1-141 Cracking the Voynich manuscript code]]&lt;br /&gt;
* [[Projects:2016s1-142 Code Cracking: Who Murdered The Somerton Man?]]&lt;br /&gt;
* [[Projects:2016s1-175 Environment Exploring Based on Inertia Measurement Unit and Computer Vision ]]&lt;br /&gt;
* [[Projects:2016s1-180 New Computational Methods for the Super Smart Grid]]&lt;br /&gt;
* [[Projects:2016s1-181 Solar Aquaponics]]&lt;br /&gt;
* [[Projects:2016s1-190 Inductive Power Transfer ]]&lt;br /&gt;
* [[Projects:2016s1-197 Sound Triangulation for Invisible Keyboards]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2016 ====&lt;br /&gt;
* Adelaide University, Engineering and Maths Building, EM205&lt;br /&gt;
* Tuesday 6 June 2017&lt;br /&gt;
&lt;br /&gt;
[[File:MidyearExpo_2017.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2016s2-216 GPS Receiver Location and Atmosphere Characterisation]]&lt;br /&gt;
* [[Projects:2016s2-235 Personal Radar for Safer Walk &amp;amp; Text]]&lt;br /&gt;
* [[Projects:2016s2-236 Electronic Controller for Spatial Microwave Modulator]]&lt;br /&gt;
* [[Projects:2016s2-255 Solar Aquaponics]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2016s2-215 Bhutan Power System Islanding and Special Protection Devices]]&lt;br /&gt;
* [[Projects:2016s2-220 Path Planning and Collision Avoidance for Aduino Robots]]&lt;br /&gt;
* [[Projects:2016s2-230 New Materials for Wearable Antennas in Flexible Electronics]]&lt;br /&gt;
* [[Projects:2016s2-240 Electromyographic Signal Processing for Controlling an Exoskeleton]]&lt;br /&gt;
* [[Projects:2016s2-245 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2016s2-246 Feral Cat Detector]]&lt;br /&gt;
* [[Projects:2016s2-250 On-Line Mains Power Cable Time Domain Reflectometry]]&lt;br /&gt;
&lt;br /&gt;
=== 2015 ===&lt;br /&gt;
==== Ingenuity 2015 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 26-27 October 2015&lt;br /&gt;
[[File:Ingenuity_2015.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2015s1-01 LaunchBox]]&lt;br /&gt;
* [[Projects:2015s1-04 Detecting Cyber Malicious Command-Control (C2) Network Traffic Communications]]&lt;br /&gt;
* [[Projects:2015s1-05 Multi-Profile Parallel Transcriber]]&lt;br /&gt;
* [[Projects:2015s1-06 Performance Evaluation of KALDI Open Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2015s1-07 Remote AVR Control for Embedded Generation]]&lt;br /&gt;
* [[Projects:2015s1-08 Developing a Home Energy Management System]]&lt;br /&gt;
* [[Projects:2015s1-10 Lagrangian Modelling of Synchronous Machines]]&lt;br /&gt;
* [[Projects:2015s1-11 Estimation of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2015s1-12 An Open-Source Local Area Network (LAN)]]&lt;br /&gt;
* [[Projects:2015s1-13 A One-Time Pad Generator]]&lt;br /&gt;
* [[Projects:2015s1-15 AI for a Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2015s1-16 System Engineering a Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2015s1-17 Analysis of Electrical and Software Design in the Effectiveness of Robotics STEM Outreach Programs]]&lt;br /&gt;
* [[Projects:2015s1-18 ARM Processor For Digital Systems Practicals]]&lt;br /&gt;
* [[Projects:2015s1-21 Inexpensive Portable Radar System]]&lt;br /&gt;
* [[Projects:2015s1-25 Indoor localisation using Bluetooth LE for event advertising]]&lt;br /&gt;
* [[Projects:2015s1-26 Autonomous robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2015s1-28 Wireless Rotation Detector]]&lt;br /&gt;
* [[Projects:2015s1-31 Cracking the Voynich manuscript code]]&lt;br /&gt;
* [[Projects:2015s1-32 Code Cracking: Who Murdered The Somerton Man?]]&lt;br /&gt;
* [[Projects:2015s1-36 Heartbeat Perception App]]&lt;br /&gt;
* [[Projects:2015s1-40 Flexible ad-hoc Network A: Physical Layer]]&lt;br /&gt;
* [[Projects:2015s1-42 Rule-based AI Agent Development: Tic Tac Toe]]&lt;br /&gt;
* [[Projects:2015s1-46 Channel Measurements for Search &amp;amp; Rescue]]&lt;br /&gt;
* [[Projects:2015s1-45 Analysis and Visualisation of Packet Data for Cyber-Security Purposes]]&lt;br /&gt;
* [[Projects:2015s1-50 Tracking, Herding and Routing by Autonomous Smart Cars (UG)]]&lt;br /&gt;
* [[Projects:2015s1-56 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2015s1-61 Computer Aided Measurement and Analysis of Equal Efficiency Characteristics of Electrical Machines]]&lt;br /&gt;
* [[Projects:2015s1-70 Design of Power Line Communication Coupler for Single-Wire Earth Return Lines]]&lt;br /&gt;
* [[Projects:2015s1-73 Improved Electric Micro-Bus Design for Nepal]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2015s1-09 Development of a Deadman Switch for Tramline Traction Simulation Tool]]&lt;br /&gt;
* [[Projects:2015s1-22 Automatic Sorter using Computer Vision]]&lt;br /&gt;
* [[Projects:2015s1-26 Autonomous robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2015s1-35 Brain computer interface control for biomedical applications]]&lt;br /&gt;
* [[Projects:2015s1-50 Tracking, Herding and Routing by Autonomous Smart Cars (PG)]]&lt;br /&gt;
* [[Projects:2015s1-58 Design And Development Of A New Respiratory Monitor For Detection Of Sleep Apnoea]]&lt;br /&gt;
* [[Projects:2015s1-71 Inductive Power Transfers]]&lt;br /&gt;
* [[Projects:2015s1-72 Wind Turbine Control Simulator]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2016 ====&lt;br /&gt;
* Adelaide University, Engineering and Maths Building, EM205&lt;br /&gt;
* 31 May 2016&lt;br /&gt;
[[File:MidyearExpo_2016.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2015s2-201 Development of Energy Storage Knowledge Bank]]&lt;br /&gt;
* [[Projects:2015s2-210 Automated Classification of Heartbeats in Long-Term ECG]]&lt;br /&gt;
* [[Projects:2015s2-211 Health Visa]]&lt;br /&gt;
* [[Projects:2015s2-212 TV Control and Monitoring]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2015s2-202 An On-line PLC frequency Impedance Analyser]]&lt;br /&gt;
* [[Projects:2015s2-203 Analysis of Heart Sound Signals using the Wavelet Transform]]&lt;br /&gt;
* [[Projects:2015s2-204 Unbalanced Operation of Permanent Magnet Generators]]&lt;br /&gt;
* [[Projects:2015s2-206 Solar Aquaponics]]&lt;br /&gt;
* [[Projects:2015s2-207 Tracking, Herding and Routing by Autonomous Smart Cars]]&lt;br /&gt;
* [[Projects:2015s2-209 Automated Classification of Brain Activity During Sleep]]&lt;br /&gt;
* [[Projects:2015s2-216 Feral Cat Detector]]&lt;br /&gt;
&lt;br /&gt;
=== 2014 ===&lt;br /&gt;
==== Ingenuity 2014 ====&lt;br /&gt;
Ingenuity 2014 was held at the Adelaide Convention Centre on Thursday 30 October. It showcased 40 of the school&amp;#039;s final-year honours and masters projects.&lt;br /&gt;
[[File:Ingenuity_2014_group_shot.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Final Year Projects ====&lt;br /&gt;
* [[Projects:2014S1-01 Development of Fully Automated Educational and Training Tool for Wind and Solar Energy using National Instruments’ ELVIS Based System]]&lt;br /&gt;
* [[Projects:2014S1-04 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2014S1-06 Bell Ringing Robot: Hawkear]]&lt;br /&gt;
* [[Projects:2014S1-10 Development of Machine Learning Techniques for Analysing Network Communications]]&lt;br /&gt;
* [[Projects:2014S1-11 Wireless Rotation Detector for Sport Equipment]]&lt;br /&gt;
* [[Projects:2014S1-12 Exploring RF Energy Harvesting for Wearable Sensors]]&lt;br /&gt;
* [[Projects:2014S1-13 S-band Communication for Small Satellite]]&lt;br /&gt;
* [[Projects:2014S1-15 Inexpensive Portable Radar System]]&lt;br /&gt;
* [[Projects:2014S1-16 Automatic Sorter using Computer Vision]]&lt;br /&gt;
* [[Projects:2014S1-21 Design And Development of a New Respiratory Monitor for Detection of Sleep Apnoea]]&lt;br /&gt;
* [[Projects:2014S1-23 Real-Time Adaptive Filters]]&lt;br /&gt;
* [[Projects:2014S1-24 AI Agent Development for an Autonomous Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2014S1-26 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2014S1-29 Measurement and Estimation of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2014S1-33 Software-Defined Radio for VLF Transmission]]&lt;br /&gt;
* [[Projects:2014S1-35 Human Activity Recognition to Support Independent Living]]&lt;br /&gt;
* [[Projects:2014S1-36 What are Social Appliances? Building your Tomorrow Today…]]&lt;br /&gt;
* [[Projects:2014S1-37 Wireless Monitoring and Control of Wine Fermentation Process]]&lt;br /&gt;
* [[Projects:2014S1-42 Current-Voltage Tracer Experiment]]&lt;br /&gt;
* [[Projects:2014S1-44 Cracking the Voynich Manuscript Code]]&lt;br /&gt;
* [[Projects:2014S1-45 Is Secure Communication Possible?]]&lt;br /&gt;
* [[Projects:2014S1-47 Robotic Arm for Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2014S1-48 FPGA-based Software GPS Receiver]]&lt;br /&gt;
* [[Projects:2014S1-49 Can Solar PV cells be used as Telecommunications Receivers?]]&lt;br /&gt;
* [[Projects:2014S1-50 Exploiting HF Emitters of Opportunity for OTH Radar]]&lt;br /&gt;
* [[Projects:2014S1-51 Heart Signal Processing Software for Evaluating Pacemaker Effectiveness]]&lt;br /&gt;
* [[Projects:2014S1-53 Object Profiling for Custom Wheelchair Seating and Pressure Care]]&lt;br /&gt;
* [[Projects:2014S1-56 Inter-Satellite Links for CubeSats]]&lt;br /&gt;
* [[Projects:2014S1-57 Autonomous Vehicle Technologies]]&lt;br /&gt;
* [[Projects:2014s2-74 Autonomous Robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2014s2-76 Teletraffic Modelling and Analysis of the New Britannia Roundabout]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2014S1-02 Network Optimisation in Distributed Generation Systems]]&lt;br /&gt;
* [[Projects:2014S1-03 Design of a Mobile Energy Storage System for Grid Integration]]&lt;br /&gt;
* [[Projects:2014S1-14 Wearable RFID Antennas]]&lt;br /&gt;
* [[Projects:2014S1-19 Analysis Of Heart Sound Signals using the Wavelet Transform]]&lt;br /&gt;
* [[Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor]]&lt;br /&gt;
* [[Projects:2014S1-38 Semi-Passive Wearable Sensors]]&lt;br /&gt;
* [[Projects:2014S1-39 Tell your Robot where to go with RFID (Improving Autonomous Navigation)]]&lt;br /&gt;
* [[Projects:2014S1-41 Inductive Power Transfer]]&lt;br /&gt;
* [[Projects:2014S1-43 Inverter Drive Experiment]]&lt;br /&gt;
* [[Projects:2014S1-54 Engineering of a CubeSat Power System]]&lt;br /&gt;
* [[Projects:2014s2-71 Calorimetry and Modelling of Lithium-Ion Chemical Batteries]]&lt;br /&gt;
* [[Projects:2014s2-72 Accurate Measurement and Modelling of a Switched-Mode Power Supply]]&lt;br /&gt;
* [[Projects:2014s2-75 Formation Control of Two Autonomous Smart Cars]]&lt;br /&gt;
* [[Projects:2014s2-78 Investigating the Design and Development of a Miniature Specific Gravity Sensor]]&lt;br /&gt;
* [[Projects:2014s2-79 FPGA-based Hardware Implementation of Machine-Learning Methods for Handwriting and Speech Recognition]]&lt;br /&gt;
* [[Projects:2014s2-80 Swinging Crane Project]]&lt;br /&gt;
* [[Projects:2014s2-82 Grid Integration of Solar PV Embedded Generation]]&lt;br /&gt;
* [[Projects:2014s2-83 A Testing and Characterising Device for batteries of various chemistries]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[:Category:Projects]]&lt;br /&gt;
&lt;br /&gt;
== Projects Allocation ==&lt;br /&gt;
=== 2014S1 ===&lt;br /&gt;
[[Projects_Allocation_2014S1]]&lt;br /&gt;
=== 2013S2 ===&lt;br /&gt;
[[Projects_Allocation_2013S2]]&lt;br /&gt;
&lt;br /&gt;
== School Resources ==&lt;br /&gt;
The school operates a number of teaching laboratories.&lt;br /&gt;
* [[Projects Lab]]&lt;br /&gt;
* [[Electronics Teaching Labs]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Store ==&lt;br /&gt;
&lt;br /&gt;
* [[Electronics Store (EM316)]]&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Main_Page&amp;diff=10566</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Main_Page&amp;diff=10566"/>
		<updated>2018-08-12T07:31:33Z</updated>

		<summary type="html">&lt;p&gt;A1661336: /* 2018 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Projects ==&lt;br /&gt;
=== 2018 ===&lt;br /&gt;
==== Ingenuity 2018 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 22-23 October 2018&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2018s1-100 Automated Person Identification with Multiple Sensors]]&lt;br /&gt;
* [[Projects:2018s1-101 Classification of Network Traffic Flows using Deep and Transfer Learning]]&lt;br /&gt;
* [[Projects:2018s1-102 HF Radio Automated Link Establishment (ALE) Model]]&lt;br /&gt;
* [[Projects:2018s1-103 Improving Usability and User Interaction with KALDI Open-Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2018s1-105 Cyber security - Car Hacking]]&lt;br /&gt;
* [[Projects:2018s1-107 Evolution of Spiking Neural Networks for UAV Control]]&lt;br /&gt;
* [[Projects:2018s1-108 Machine Learning Multi-Spectral Simulation]]&lt;br /&gt;
* [[Projects:2018s1-109 High-Resolution Change Prediction using Sparse Spatio-temporal Data]]&lt;br /&gt;
* [[Projects:2018s1-110 Future Submarine Project]]&lt;br /&gt;
* [[Projects:2018s1-111 IoT Connectivity Investigation]]&lt;br /&gt;
* [[Projects:2018s1-112 Automate the 3D Design and Manufacture of Electrical Control Panels using Advanced Digital Technologies]]&lt;br /&gt;
* [[Projects:2018s1-113 AVR Test Rig]]&lt;br /&gt;
* [[Projects:2018s1-115 Passive Radar in the High Frequency Band using Civil Transmissions]]&lt;br /&gt;
* [[Projects:2018s1-116 Data Analytics]]&lt;br /&gt;
* [[Projects:2018s1-119 Design of Calibration Platform for Medical Sensing]]&lt;br /&gt;
* [[Projects:2018s1-121 In-Memory Semantic Processing Using Hyperdimensional Computing]]&lt;br /&gt;
* [[Projects:2018s1-122 NI Autonomous Robotics Competition]]&lt;br /&gt;
* [[Projects:2018s1-128 Software Tool for Fitting Statistical Models to Sea Clutter Data]]&lt;br /&gt;
* [[Projects:2018s1-135 A Low Cost Impedance and Transfer Function Analyser Part 2]]&lt;br /&gt;
* [[Projects:2018s1-140 Energy Storage Requirements for the SA Grid]]&lt;br /&gt;
* [[Projects:2018s1-141 CSI Adelaide: Who killed the Somerton Man?]]&lt;br /&gt;
* [[Projects:2018s1-142 Modelling the Dynamics of the Cryptocurrency Market]]&lt;br /&gt;
* [[Projects:2018s1-145 Simplified Indoor UAV Operations]]&lt;br /&gt;
* [[Projects:2018s1-151 Raspberry Pi as a Core Device for Efficient Biological Field Survey Data Collection]]&lt;br /&gt;
* [[Projects:2018s1-155 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2018s1-157 Designing Airway Pressure Control Technology for Sleep Apnea Treatment]]&lt;br /&gt;
* [[Projects:2018s1-160 UAV Platform for Cognitive AI Agent]]&lt;br /&gt;
* [[Projects:2018s1-164 Private but Public on the Blockchain]]&lt;br /&gt;
* [[Projects:2018s1-165 Dual IP Stack Exfiltration - Methods and Defences]]&lt;br /&gt;
* [[Projects:2018s1-167 Security Assessment of Watchem and Moochies Watches]]&lt;br /&gt;
* [[Projects:2018s1-168 Penetration Testing of the SpaceTalk Tracking Watch]]&lt;br /&gt;
* [[Projects:2018s1-169 A Better Security Framework for Wearable Devices]]&lt;br /&gt;
* [[Projects:2018s1-175 Split-ring resonators for measuring spatially-distributed complex permittivity at microwave frequencies]]&lt;br /&gt;
* [[Projects:2018s1-181 BMW Autonomous Vehicle]]&lt;br /&gt;
* [[Projects:2018s1-182 Inertia Characterisation and Modelling in a Renewable Energy and Battery Based Microgrid]]&lt;br /&gt;
* [[Projects:2018s1-191 Quasi-Linear Circuit Theory]]&lt;br /&gt;
* [[Projects:2018s1-192 Karplus-Strong Synthesis of Sound]]&lt;br /&gt;
* [[Projects:2018s1-195 Novel Flexible Materials for Wearable Antennas]]&lt;br /&gt;
* [[Projects:2018s1-196 Concealed Wearable Antennas]]&lt;br /&gt;
* [[Projects:2018s1-136UG Rate of Change of Mains Frequency Detection]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2018s1-118 Design of Wireless Sensors for Sleep Apnea Detection]]&lt;br /&gt;
* [[Projects:2018s1-170 Intelligent Parking Control for Autonomous Ground Vehicles]]&lt;br /&gt;
* [[Projects:2018s1-177 Radio astronomy with software-defined radio]]&lt;br /&gt;
* [[Projects:2018s1-178 Creating microwave antennas with 3D printing]]&lt;br /&gt;
* [[Projects:2018s1-180 Development and Control of a Standalone Power Source for Residential Dwellings and Small Businesses]]&lt;br /&gt;
* [[Projects:2018s1-186 Calculation and Optimisation of Energy Usage of Electric Vehicles]]&lt;br /&gt;
* [[Projects:2018s1-190 Dynamical Modelling of Synchronous Machines]]&lt;br /&gt;
* [[Projects:2018s1-136PG Rate of Change of Mains Frequency Detection]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control]]&lt;br /&gt;
&lt;br /&gt;
=== 2017 ===&lt;br /&gt;
==== Ingenuity 2017 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 30-31 October 2017&lt;br /&gt;
* Prizes&lt;br /&gt;
** Best EEE Wiki: Classifying Network Traffic Flows with Deep-Learning by Kyle Thornton, Clinton Page, Daniel Smit&lt;br /&gt;
** Best EEE Exhibit: Face Recognition using 3D Data by Orbille Piol, Michael Sadler, Jesse Willsmore&lt;br /&gt;
[[File:Ingenuity 2017.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2017s1-100 Face Recognition using 3D Data]]&lt;br /&gt;
* [[Projects:2017s1-101 Classifying Network Traffic Flows with Deep-Learning]]&lt;br /&gt;
* [[Projects:2017s1-102 HF Radio Automated Link Establishment (ALE) Model]]&lt;br /&gt;
* [[Projects:2017s1-103 Improving Usability and User Interaction with KALDI Open-Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2017s1-105 “CARLOS TC” Tow Bar Testing Facility]]&lt;br /&gt;
* [[Projects:2017s1-106 Inertia Characterisation and Modelling in a Renewable Energy-based Microgrid]]&lt;br /&gt;
* [[Projects:2017s1-107 Protection of a Convoy of Ships Under Attack]]&lt;br /&gt;
* [[Projects:2017s1-108 Stability and Control of 3-D Formations]]&lt;br /&gt;
* [[Projects:2017s1-109 Dynamically Forming Formations]]&lt;br /&gt;
* [[Projects:2017s1-110 ‘Real-Time’ FPGA Based Object Recognition &amp;amp; Threat Detection in Hardware]]&lt;br /&gt;
* [[Projects:2017s1-111 OTHR Alternative Computing Architecture]]&lt;br /&gt;
* [[Projects:2017s1-120 Hardware Realisation of the Unum 2.0 Number Format]]&lt;br /&gt;
* [[Projects:2017s1-121 Learning Procedural Knowledge using Random Forests]]&lt;br /&gt;
* [[Projects:2017s1-125 Drone Imaging and Classification using Radar]]&lt;br /&gt;
* [[Projects:2017s1-127 Sound Trilateration for Positioning of the Sound Source]]&lt;br /&gt;
* [[Projects:2017s1-135 A Low Cost Impedance Analyser]]&lt;br /&gt;
* [[Projects:2017s1-140 Energy Storage Requirements for the SA Grid]]&lt;br /&gt;
* [[Projects:2017s1-155 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2017s1-156 Interrogating a Glucose Monitor]]&lt;br /&gt;
* [[Projects:2017s1-157 Automated Classification of Brain Activity during Sleep]]&lt;br /&gt;
* [[Projects:2017s1-160 Cooperating Autonomous Vehicles]]&lt;br /&gt;
* [[Projects:2017s1-165 Forensic Investigation of Fitness Devices]]&lt;br /&gt;
* [[Projects:2017s1-167a Applications of Blockchain to Equity Fund Raising]]&lt;br /&gt;
* [[Projects:2017s1-167b Real-Time Video Stream Substitution]]&lt;br /&gt;
* [[Projects:2017s1-167c Smart Grid Security]]&lt;br /&gt;
* [[Projects:2017s1-167d Twitterbots]]&lt;br /&gt;
* [[Projects:2017s1-175 Split-Ring Resonators for Measuring Spatially-Distributed Complex Permittivity at Microwave Frequencies]]&lt;br /&gt;
* [[Projects:2017s1-176 Smart Mirror with Raspberry Pi]]&lt;br /&gt;
* [[Projects:2017s1-177 Radio Astronomy with Software-Defined Radio]]&lt;br /&gt;
* [[Projects:2017s1-180 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2017s1-181 BMW Autonomous Vehicle Project Camera Based Lane Detection in a Road Vehicle for Autonomous Driving]]&lt;br /&gt;
* [[Projects:2017s1-182 BMW Autonomous Vehicle Project Development of Vehicle Control Algorithm]]&lt;br /&gt;
* [[Projects:2017s1-185 BMW Autonomous Vehicle Project Implementation of a Steering Angle Controller on a Lab Test Bench]]&lt;br /&gt;
* [[Projects:2017s1-186 Playing Music Through a Tesla Coil]]&lt;br /&gt;
* [[Projects:2017s1-190 Modelling and Validation for Synchronous Generators]]&lt;br /&gt;
* [[Projects:2017s1-191 Power Electronics for Inductive Power Transfer (IPT)]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2017s1-122 On-Chip Learning]]&lt;br /&gt;
* [[Projects:2017s1-150 Statistical Natural Language Processing]]&lt;br /&gt;
* [[Projects:2017s1-158 Electromyographic Signal Processing for Controlling an Exoskeleton]]&lt;br /&gt;
* [[Projects:2017s1-159 Detecting Penguin’s Heart Sounds]]&lt;br /&gt;
* [[Projects:2017s1-166 Development of TV Ad Blocker]]&lt;br /&gt;
* [[Projects:2017s1-170 Formation Control and Obstacle Avoidance for Heterogeneous Multi-Agent Systems (Unmanned Aerial Vehicles and Robots)]]&lt;br /&gt;
* [[Projects:2017s1-183 BMW Autonomous Vehicle Project Development of a sensor fusion algorithm to determine the current vehicle position in a local tangential plane]]&lt;br /&gt;
* [[Projects:2017s1-184 BMW Autonomous Vehicle Project Implement the Longitudinal Control Algorithm of the Vehicle]]&lt;br /&gt;
* [[Projects:2017s1-195 Solar Aquaponics]]&lt;br /&gt;
* [[Projects:2017s1-196 New Computational Methods for the Super Smart Grid]]&lt;br /&gt;
* [[Projects:2017s2-245 Novel textile antennas for wearable wireless communications]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2018 ====&lt;br /&gt;
* Adelaide University, Engineering and Maths Building, EM205&lt;br /&gt;
* 5 June 2018&lt;br /&gt;
* 11:30-13:00&lt;br /&gt;
&lt;br /&gt;
[[File:MidyearExpo_2018.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2017s2-201 Detection and Classification in an Indoor Environment Using WiFi]]&lt;br /&gt;
* [[Projects:2017s2-205 Multi-Profile Parallel Speech-to-Text Transcriber]]&lt;br /&gt;
* [[Projects:2017s2-220 Alternative Approaches to AI for the Soccer Table]]&lt;br /&gt;
* [[Projects:2017s2-225 Digital Microphone Array using MEMS Microphones]]&lt;br /&gt;
* [[Projects:2017s2-275 Creating Microwave Antennas with 3D Printing]]&lt;br /&gt;
* [[Projects:2017s2-290 The Magnetorquer]]&lt;br /&gt;
* [[Projects:2017s2-291 Measurement of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2017s2-292 Wide-Area Sun Sensor]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2017s2-215 Wireless Power Transfer]]&lt;br /&gt;
* [[Projects:2017s2-235 An On-line 10 kHz to 1 MHz Impedance Analyser]]&lt;br /&gt;
* [[Projects:2017s2-245 Novel Textile Antennas for Wearable Wireless Communications]]&lt;br /&gt;
* [[Projects:2017s2-270 Reconfiguration of Multi-Agent Systems (Robot Systems)]]&lt;br /&gt;
* [[Projects:2017s2-285 Short-term Solutions for the South Australian Electric Power System]]&lt;br /&gt;
* [[Projects:2017s2-295 Feral Animal Detection using IR Thermal Imagery]]&lt;br /&gt;
&lt;br /&gt;
=== 2016 ===&lt;br /&gt;
==== Ingenuity 2016 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 27-28 October 2016&lt;br /&gt;
&lt;br /&gt;
[[File:Ingenuity_2016.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2016s1-101 Predicting Power Outages from Weather Patterns]]&lt;br /&gt;
* [[Projects:2016s1-102 Classifying Internet Applications and Detecting Malicious Traffic from Network Communications]]&lt;br /&gt;
* [[Projects:2016s1-105 Non-Contact Photoplethysmogram]]&lt;br /&gt;
* [[Projects:2016s1-106 Airborne Antenna Measurement Platform]]&lt;br /&gt;
* [[Projects:2016s1-109 Development, Characterisation and Modelling of Renewable Energy-Based Microgrid]]&lt;br /&gt;
* [[Projects:2016s1-122 A Complete Model for a Synchronous Machine]]&lt;br /&gt;
* [[Projects:2016s1-126 A Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2016s1-128 Evaluating Programming Languages for Educational Robotics Kits]]&lt;br /&gt;
* [[Projects:2016s1-132 RF Transceiver Design for a Portable Radar]]&lt;br /&gt;
* [[Projects:2016s1-145 Indoor localisation using Bluetooth LE for Event Advertising]]&lt;br /&gt;
* [[Projects:2016s1-146 Autonomous Robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2016s1-160a Cyber Security - IoT and CAN Bus Security]]&lt;br /&gt;
* [[Projects:2016s1-160b Cyber Security - e-Government and Network Security]]&lt;br /&gt;
* [[Projects:2016s1-160c Cyber Security - Personal Networks and Devices]]&lt;br /&gt;
* [[Projects:2016s1-171 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2016s1-172 Computer Aided Testing of Batteries for Energy Storage Applications]]&lt;br /&gt;
* [[Projects:2016s1-187 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2016s1-196 Wireless Power Transfer]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2016s1-120 Attacking Cancer with Signal Processing]]&lt;br /&gt;
* [[Projects:2016s1-121 Measurement of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2016s1-131 ECG Enhancement with Advanced Signal Processing]]&lt;br /&gt;
* [[Projects:2016s1-141 Cracking the Voynich manuscript code]]&lt;br /&gt;
* [[Projects:2016s1-142 Code Cracking: Who Murdered The Somerton Man?]]&lt;br /&gt;
* [[Projects:2016s1-175 Environment Exploring Based on Inertia Measurement Unit and Computer Vision]]&lt;br /&gt;
* [[Projects:2016s1-180 New Computational Methods for the Super Smart Grid]]&lt;br /&gt;
* [[Projects:2016s1-181 Solar Aquaponics]]&lt;br /&gt;
* [[Projects:2016s1-190 Inductive Power Transfer]]&lt;br /&gt;
* [[Projects:2016s1-197 Sound Triangulation for Invisible Keyboards]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2017 ====&lt;br /&gt;
* Adelaide University, Engineering and Maths Building, EM205&lt;br /&gt;
* Tuesday 6 June 2017&lt;br /&gt;
&lt;br /&gt;
[[File:MidyearExpo_2017.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2016s2-216 GPS Receiver Location and Atmosphere Characterisation]]&lt;br /&gt;
* [[Projects:2016s2-235 Personal Radar for Safer Walk &amp;amp; Text]]&lt;br /&gt;
* [[Projects:2016s2-236 Electronic Controller for Spatial Microwave Modulator]]&lt;br /&gt;
* [[Projects:2016s2-255 Solar Aquaponics]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2016s2-215 Bhutan Power System Islanding and Special Protection Devices]]&lt;br /&gt;
* [[Projects:2016s2-220 Path Planning and Collision Avoidance for Arduino Robots]]&lt;br /&gt;
* [[Projects:2016s2-230 New Materials for Wearable Antennas in Flexible Electronics]]&lt;br /&gt;
* [[Projects:2016s2-240 Electromyographic Signal Processing for Controlling an Exoskeleton]]&lt;br /&gt;
* [[Projects:2016s2-245 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2016s2-246 Feral Cat Detector]]&lt;br /&gt;
* [[Projects:2016s2-250 On-Line Mains Power Cable Time Domain Reflectometry]]&lt;br /&gt;
&lt;br /&gt;
=== 2015 ===&lt;br /&gt;
==== Ingenuity 2015 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 26-27 October 2015&lt;br /&gt;
[[File:Ingenuity_2015.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2015s1-01 LaunchBox]]&lt;br /&gt;
* [[Projects:2015s1-04 Detecting Cyber Malicious Command-Control (C2) Network Traffic Communications]]&lt;br /&gt;
* [[Projects:2015s1-05 Multi-Profile Parallel Transcriber]]&lt;br /&gt;
* [[Projects:2015s1-06 Performance Evaluation of KALDI Open Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2015s1-07 Remote AVR Control for Embedded Generation]]&lt;br /&gt;
* [[Projects:2015s1-08 Developing a Home Energy Management System]]&lt;br /&gt;
* [[Projects:2015s1-10 Lagrangian Modelling of Synchronous Machines]]&lt;br /&gt;
* [[Projects:2015s1-11 Estimation of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2015s1-12 An Open-Source Local Area Network (LAN)]]&lt;br /&gt;
* [[Projects:2015s1-13 A One-Time Pad Generator]]&lt;br /&gt;
* [[Projects:2015s1-15 AI for a Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2015s1-16 System Engineering a Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2015s1-17 Analysis of Electrical and Software Design in the Effectiveness of Robotics STEM Outreach Programs]]&lt;br /&gt;
* [[Projects:2015s1-18 ARM Processor For Digital Systems Practicals]]&lt;br /&gt;
* [[Projects:2015s1-21 Inexpensive Portable Radar System]]&lt;br /&gt;
* [[Projects:2015s1-25 Indoor localisation using Bluetooth LE for event advertising]]&lt;br /&gt;
* [[Projects:2015s1-26 Autonomous robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2015s1-28 Wireless Rotation Detector]]&lt;br /&gt;
* [[Projects:2015s1-31 Cracking the Voynich manuscript code]]&lt;br /&gt;
* [[Projects:2015s1-32 Code Cracking: Who Murdered The Somerton Man?]]&lt;br /&gt;
* [[Projects:2015s1-36 Heartbeat Perception App]]&lt;br /&gt;
* [[Projects:2015s1-40 Flexible ad-hoc Network A: Physical Layer]]&lt;br /&gt;
* [[Projects:2015s1-42 Rule-based AI Agent Development: Tic Tac Toe]]&lt;br /&gt;
* [[Projects:2015s1-46 Channel Measurements for Search &amp;amp; Rescue]]&lt;br /&gt;
* [[Projects:2015s1-45 Analysis and Visualisation of Packet Data for Cyber-Security Purposes]]&lt;br /&gt;
* [[Projects:2015s1-50 Tracking, Herding and Routing by Autonomous Smart Cars (UG)]]&lt;br /&gt;
* [[Projects:2015s1-56 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2015s1-61 Computer Aided Measurement and Analysis of Equal Efficiency Characteristics of Electrical Machines]]&lt;br /&gt;
* [[Projects:2015s1-70 Design of Power Line Communication Coupler for Single-Wire Earth Return Lines]]&lt;br /&gt;
* [[Projects:2015s1-73 Improved Electric Micro-Bus Design for Nepal]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2015s1-09 Development of a Deadman Switch for Tramline Traction Simulation Tool]]&lt;br /&gt;
* [[Projects:2015s1-22 Automatic Sorter using Computer Vision]]&lt;br /&gt;
* [[Projects:2015s1-26 Autonomous robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2015s1-35 Brain computer interface control for biomedical applications]]&lt;br /&gt;
* [[Projects:2015s1-50 Tracking, Herding and Routing by Autonomous Smart Cars (PG)]]&lt;br /&gt;
* [[Projects:2015s1-58 Design And Development Of A New Respiratory Monitor For Detection Of Sleep Apnoea]]&lt;br /&gt;
* [[Projects:2015s1-71 Inductive Power Transfers]]&lt;br /&gt;
* [[Projects:2015s1-72 Wind Turbine Control Simulator]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2016 ====&lt;br /&gt;
* Adelaide University, Engineering and Maths Building, EM205&lt;br /&gt;
* 31 May 2016&lt;br /&gt;
[[File:MidyearExpo_2016.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2015s2-201 Development of Energy Storage Knowledge Bank]]&lt;br /&gt;
* [[Projects:2015s2-210 Automated Classification of Heartbeats in Long-Term ECG]]&lt;br /&gt;
* [[Projects:2015s2-211 Health Visa]]&lt;br /&gt;
* [[Projects:2015s2-212 TV Control and Monitoring]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2015s2-202 An On-line PLC frequency Impedance Analyser]]&lt;br /&gt;
* [[Projects:2015s2-203 Analysis of Heart Sound Signals using the Wavelet Transform]]&lt;br /&gt;
* [[Projects:2015s2-204 Unbalanced Operation of Permanent Magnet Generators]]&lt;br /&gt;
* [[Projects:2015s2-206 Solar Aquaponics]]&lt;br /&gt;
* [[Projects:2015s2-207 Tracking, Herding and Routing by Autonomous Smart Cars]]&lt;br /&gt;
* [[Projects:2015s2-209 Automated Classification of Brain Activity During Sleep]]&lt;br /&gt;
* [[Projects:2015s2-216 Feral Cat Detector]]&lt;br /&gt;
&lt;br /&gt;
=== 2014 ===&lt;br /&gt;
==== Ingenuity 2014 ====&lt;br /&gt;
Ingenuity 2014 was held at the Adelaide Convention Centre on Thursday 30 October. It showcased 40 of the school&amp;#039;s final-year honours and masters projects.&lt;br /&gt;
[[File:Ingenuity_2014_group_shot.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Final Year Projects ====&lt;br /&gt;
* [[Projects:2014S1-01 Development of Fully Automated Educational and Training Tool for Wind and Solar Energy using National Instruments’ ELVIS Based System]]&lt;br /&gt;
* [[Projects:2014S1-04 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2014S1-06 Bell Ringing Robot: Hawkear]]&lt;br /&gt;
* [[Projects:2014S1-10 Development of Machine Learning Techniques for Analysing Network Communications]]&lt;br /&gt;
* [[Projects:2014S1-11 Wireless Rotation Detector for Sport Equipment]]&lt;br /&gt;
* [[Projects:2014S1-12 Exploring RF Energy Harvesting for Wearable Sensors]]&lt;br /&gt;
* [[Projects:2014S1-13 S-band Communication for Small Satellite]]&lt;br /&gt;
* [[Projects:2014S1-15 Inexpensive Portable Radar System]]&lt;br /&gt;
* [[Projects:2014S1-16 Automatic Sorter using Computer Vision]]&lt;br /&gt;
* [[Projects:2014S1-21 Design And Development of a New Respiratory Monitor for Detection of Sleep Apnoea]]&lt;br /&gt;
* [[Projects:2014S1-23 Real-Time Adaptive Filters]]&lt;br /&gt;
* [[Projects:2014S1-24 AI Agent Development for an Autonomous Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2014S1-26 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2014S1-29 Measurement and Estimation of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2014S1-33 Software-Defined Radio for VLF Transmission]]&lt;br /&gt;
* [[Projects:2014S1-35 Human Activity Recognition to Support Independent Living]]&lt;br /&gt;
* [[Projects:2014S1-36 What are Social Appliances? Building your Tomorrow Today…]]&lt;br /&gt;
* [[Projects:2014S1-37 Wireless Monitoring and Control of Wine Fermentation Process]]&lt;br /&gt;
* [[Projects:2014S1-42 Current-Voltage Tracer Experiment]]&lt;br /&gt;
* [[Projects:2014S1-44 Cracking the Voynich Manuscript Code]]&lt;br /&gt;
* [[Projects:2014S1-45 Is Secure Communication Possible?]]&lt;br /&gt;
* [[Projects:2014S1-47 Robotic Arm for Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2014S1-48 FPGA-based Software GPS Receiver]]&lt;br /&gt;
* [[Projects:2014S1-49 Can Solar PV cells be used as Telecommunications Receivers?]]&lt;br /&gt;
* [[Projects:2014S1-50 Exploiting HF Emitters of Opportunity for OTH Radar]]&lt;br /&gt;
* [[Projects:2014S1-51 Heart Signal Processing Software for Evaluating Pacemaker Effectiveness]]&lt;br /&gt;
* [[Projects:2014S1-53 Object Profiling for Custom Wheelchair Seating and Pressure Care]]&lt;br /&gt;
* [[Projects:2014S1-56 Inter-Satellite Links for CubeSats]]&lt;br /&gt;
* [[Projects:2014S1-57 Autonomous Vehicle Technologies]]&lt;br /&gt;
* [[Projects:2014s2-74 Autonomous Robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2014s2-76 Teletraffic Modelling and Analysis of the New Britannia Roundabout]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2014S1-02 Network Optimisation in Distributed Generation Systems]]&lt;br /&gt;
* [[Projects:2014S1-03 Design of a Mobile Energy Storage System for Grid Integration]]&lt;br /&gt;
* [[Projects:2014S1-14 Wearable RFID Antennas]]&lt;br /&gt;
* [[Projects:2014S1-19 Analysis Of Heart Sound Signals using the Wavelet Transform]]&lt;br /&gt;
* [[Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor]]&lt;br /&gt;
* [[Projects:2014S1-38 Semi-Passive Wearable Sensors]]&lt;br /&gt;
* [[Projects:2014S1-39 Tell your Robot where to go with RFID (Improving Autonomous Navigation)]]&lt;br /&gt;
* [[Projects:2014S1-41 Inductive Power Transfer]]&lt;br /&gt;
* [[Projects:2014S1-43 Inverter Drive Experiment]]&lt;br /&gt;
* [[Projects:2014S1-54 Engineering of a CubeSat Power System]]&lt;br /&gt;
* [[Projects:2014s2-71 Calorimetry and Modelling of Lithium-Ion Chemical Batteries]]&lt;br /&gt;
* [[Projects:2014s2-72 Accurate Measurement and Modelling of a Switched-Mode Power Supply]]&lt;br /&gt;
* [[Projects:2014s2-75 Formation Control of Two Autonomous Smart Cars]]&lt;br /&gt;
* [[Projects:2014s2-78 Investigating the Design and Development of a Miniature Specific Gravity Sensor]]&lt;br /&gt;
* [[Projects:2014s2-79 FPGA-based Hardware Implementation of Machine-Learning Methods for Handwriting and Speech Recognition]]&lt;br /&gt;
* [[Projects:2014s2-80 Swinging Crane Project]]&lt;br /&gt;
* [[Projects:2014s2-82 Grid Integration of Solar PV Embedded Generation]]&lt;br /&gt;
* [[Projects:2014s2-83 A Testing and Characterising Device for batteries of various chemistries]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[:Category:Projects]]&lt;br /&gt;
&lt;br /&gt;
== Projects Allocation ==&lt;br /&gt;
=== 2014S1 ===&lt;br /&gt;
[[Projects_Allocation_2014S1]]&lt;br /&gt;
=== 2013S2 ===&lt;br /&gt;
[[Projects_Allocation_2013S2]]&lt;br /&gt;
&lt;br /&gt;
== School Resources ==&lt;br /&gt;
The school operates a number of teaching laboratories.&lt;br /&gt;
* [[Projects Lab]]&lt;br /&gt;
* [[Electronics Teaching Labs]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Store ==&lt;br /&gt;
&lt;br /&gt;
* [[Electronics Store (EM316)]]&lt;/div&gt;</summary>
		<author><name>A1661336</name></author>
		
	</entry>
</feed>