<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://projectswiki.eleceng.adelaide.edu.au/projects/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=A1687420</id>
	<title>Projects - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://projectswiki.eleceng.adelaide.edu.au/projects/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=A1687420"/>
	<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php/Special:Contributions/A1687420"/>
	<updated>2026-04-30T00:04:31Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.31.4</generator>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Submarine_Optronics_System:_Contact_Detection&amp;diff=12236</id>
		<title>Projects:2018s1-110 Submarine Optronics System: Contact Detection</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Submarine_Optronics_System:_Contact_Detection&amp;diff=12236"/>
		<updated>2018-10-21T14:47:51Z</updated>

		<summary type="html">&lt;p&gt;A1687420: Created page with &amp;quot; == Project Team ==  &amp;#039;&amp;#039;&amp;#039;Students&amp;#039;&amp;#039;&amp;#039;  Tharidu Maliduwa Arachchige   Jacob Parker  &amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;  Dr. Danny Gibbins   Igor Dzeba (SAAB)  == Abstract == This project involves...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Students&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Tharidu Maliduwa Arachchige &lt;br /&gt;
&lt;br /&gt;
Jacob Parker&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Dr. Danny Gibbins &lt;br /&gt;
&lt;br /&gt;
Igor Dzeba (SAAB)&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
This project involves the research into and development of a Contact Detection method to be used with a Submarine Optronics System. The aims of this project include:&lt;br /&gt;
 &lt;br /&gt;
# Detecting the presence of a ship at long range and, if possible, estimating its range and aspect&lt;br /&gt;
# At closer range, detecting the ship or target, identifying/verifying its type, and determining its range and aspect/orientation. &lt;br /&gt;
# Certain challenges must be overcome, such as:&lt;br /&gt;
## Accumulating a sufficient database storing ship models, for the large number of different ship types that would need to be recognised by the system.&lt;br /&gt;
## Limited viewing conditions, as:&lt;br /&gt;
### The submarine&amp;#039;s periscope often reaches a height of only 60 cm above the surface of the water&lt;br /&gt;
### Tall waves and the curvature of the Earth may mean that ships are only partially visible&lt;br /&gt;
### In bad weather conditions, a clear horizon line may not be visible, and ships/targets blend in with the background&lt;br /&gt;
## There may be insignificant anomalies in the obtained image that are not ships/targets and should be ignored by the system (e.g. landmasses, birds or sea animals on the surface, infrastructure on the surface of the water).&lt;br /&gt;
&lt;br /&gt;
The project will involve a thorough literature review of object detection methods using image processing techniques in maritime applications, followed by software development and testing. This is an industry project, sponsored by SAAB Australia.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
&lt;br /&gt;
===Aim===&lt;br /&gt;
The project aimed to fulfil several goals, including:&lt;br /&gt;
#	Research and review existing image processing techniques on the topic of object and/or ship detection and recognition through optical images. An understanding of existing techniques is important to determine whether and how they can be incorporated into any designs produced as the output of the project.&lt;br /&gt;
#	Develop a system that: &lt;br /&gt;
##	At close range to a detected ship, can recognise the ship type, estimate its range from the submarine and estimate its orientation. &lt;br /&gt;
##	At long range to a detected ship, can provide guidance as to what the ship might be (type verification) and estimate its range and orientation. &lt;br /&gt;
&lt;br /&gt;
===Motivation===&lt;br /&gt;
&lt;br /&gt;
The project was sponsored by Saab Australia, a major influence in the Australian Defence industry. With the upcoming submarine research and development plans, Saab Australia is highly motivated to conduct research into the different technologies that may be incorporated into a submarine.&lt;br /&gt;
&lt;br /&gt;
For more than a century, submarine operators relied solely on direct-view optical periscopes to gain insight into the environment above the surface of the water [1]. For much of this time, operators relied on black and white images of ship silhouettes to identify any vessels viewed through the periscope. With the evolution of technology, however, electronic periscopes have been developed, providing many forms of assistance to operators. The motivation for the research conducted in this project is to develop and test image processing techniques for ship and threat detection in the optical images obtained through the electronic periscope of a submarine, providing guidance to operators in identifying the environment around them.&lt;br /&gt;
&lt;br /&gt;
== Research Methodology ==&lt;br /&gt;
&lt;br /&gt;
Three horizon detection methods and two target detection methods were researched and implemented as a result of this research methodology. Of these, two horizon detection methods and one target detection method, with possible enhancements, are discussed in this thesis. An overview and some results of the methods investigated by the project partner are included as a comparison to the methods described in detail in this document. Those approaches can be found in the thesis document produced by the project partner, which is listed in the references. &lt;br /&gt;
&lt;br /&gt;
For testing the produced algorithms, a large data set of imagery was required. This was initially acquired using imagery available on the internet. An issue with this method of data collection lay in the fact that the sources of the imagery were inconsistent, resulting in inconsistency in the quality and relevance of the imagery to the application. Thus, further imagery was obtained first hand by recording footage of a ship dock, where there was plenty of movement of different ships arriving and departing. Single frames were then extracted from the capture to test the algorithms.&lt;br /&gt;
&lt;br /&gt;
== Horizon Detection Methods ==&lt;br /&gt;
&lt;br /&gt;
===Hough Transform Method===&lt;br /&gt;
A horizon detection algorithm using the Hough Transform was implemented. The algorithm takes in a colour image but performs its processing steps only on the luminance component, so the image is first converted to greyscale. &lt;br /&gt;
&lt;br /&gt;
An outline of the developed algorithm follows:&lt;br /&gt;
#	Load input image in colour and extract luminance component.&lt;br /&gt;
#	Apply a Gaussian filter to smooth the image.&lt;br /&gt;
#	Execute a Canny edge detector to produce a binary edge map of the image.&lt;br /&gt;
#	Compute the Hough Transform of the binary edge image.&lt;br /&gt;
#	Find the peak in the Hough Transform data that corresponds to the most dominant line in the image.&lt;br /&gt;
#	Plot the found line using its ρ and θ components.&lt;br /&gt;
&lt;br /&gt;
MATLAB’s in-built functions for the Canny edge detector and the Hough-related operations were used for this implementation. As discussed in section 2.1, the Canny edge detector is optimal when the input image is pre-processed to remove image noise. The Canny detector used in the implementation does not incorporate this smoothing, so a Gaussian filter with a standard deviation of 1.4 [5] is applied to achieve the noise reduction. &lt;br /&gt;
The implementation makes use of two Hough transform related functions, hough() and houghpeaks(). The former is used to form and plot the Hough transform of the edge map produced by the Canny edge detector, and the latter is used to find its peak, representing the most dominant line in the input image. By default, the hough() function is calibrated around vertical lines, such that a vertical line is read at an angle of 0° and horizontal lines at ±90°. As the Hough transform is plotted between −90° and 90°, horizontal lines appear at the edges of the plot, causing inaccuracies in the determination of peaks. This was resolved by rotating the input image by 90° and searching for a now-vertical line in the image as the horizon. This also allows the window of the Hough transform plot to be narrowed to between −15° and 15°, reducing the computational load. &lt;br /&gt;
&lt;br /&gt;
Once the Hough transform plot is obtained, the houghpeaks() function is used to locate its peak, and the corresponding ρ and θ values are extracted. Compensating for the rotation, these two values can be used to plot a line on the original image with the equation:&lt;br /&gt;
y = (ρ - x sin(θ)) / cos(θ).&lt;br /&gt;
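The outlined steps can be sketched as follows. This is a minimal Python stand-in for the MATLAB implementation (hough()/houghpeaks()): a precomputed binary edge map replaces the Canny detector output, and the narrow angle window around the horizontal plays the role of the rotation trick described above.&lt;br /&gt;

```python
import numpy as np

def hough_horizon(edge_map, window_deg=15):
    """Find the most dominant near-horizontal line in a binary edge map.

    Builds a Hough accumulator over a narrow angle window around the
    horizontal, then takes its peak. Convention used here:
    rho = x*cos(theta) + y*sin(theta), with theta near 90 deg for
    horizontal lines.
    """
    ys, xs = np.nonzero(edge_map)
    h, w = edge_map.shape
    diag = int(np.ceil(np.hypot(h, w)))          # largest possible |rho|
    thetas = np.deg2rad(np.arange(90 - window_deg, 91 + window_deg))
    acc = np.zeros((2 * diag + 1, len(thetas)), dtype=int)
    for t_idx, theta in enumerate(thetas):
        # Each edge pixel votes for the (rho, theta) lines through it.
        rhos = np.round(xs * np.cos(theta) + ys * np.sin(theta)).astype(int)
        for rho in rhos:
            acc[rho + diag, t_idx] += 1
    r_idx, t_idx = np.unravel_index(np.argmax(acc), acc.shape)
    return r_idx - diag, thetas[t_idx]

# Synthetic edge map with a horizontal "horizon" along row 20.
edges = np.zeros((40, 60), dtype=bool)
edges[20, :] = True
rho, theta = hough_horizon(edges)
```

In this convention the detected line is y = (ρ - x cos(θ))/sin(θ); the formula in the text swaps sin and cos, consistent with the 90° rotation applied before the transform.&lt;br /&gt;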
&lt;br /&gt;
&lt;br /&gt;
===DCT-based Method===&lt;br /&gt;
This section discusses a target detection algorithm using the characteristics of the Discrete Cosine Transform (DCT) coefficients. The first stage of the algorithm is a horizon detection method using these characteristics; the sea surface is then modelled with a Gaussian Mixture Model (GMM) for ship or object detection. &lt;br /&gt;
&lt;br /&gt;
This horizon detection process involves decomposing the luminance component of an input image into 8x8 blocks and applying the DCT to each block. Each 8x8 DCT block is then labelled with a t-score, which is the ratio of the mean (Ā) of that block to the maximum mean over the entire image (Ā_max); that is, t = Ā/Ā_max. Ideally, the t-scores obtained from the sea segment will have a different range from those obtained from the sky regions of the image, producing a bimodal distribution of values ranging from 0 to 1. Thus, a threshold can be selected such that t values below it belong to sky regions and values above it represent blocks in the sea. The reference paper claims, with 95% confidence, that the ideal thresholding value lies in the interval between 0.065 and 0.135. However, it was discovered that this was not consistently true for the test data available, and the optimal threshold varied widely with different input images. This was solved by using a function that automatically calculates the threshold by analysing the distribution of the data in question. The function graythresh() is an in-built MATLAB function, and assumes that there is a clear bimodal separation in the data. &lt;br /&gt;
Using this thresholding, an initial segmentation of sea and sky in the image is obtained. Another variation from the implementation described in the paper lies in the determination of the horizon line. Zhang draws the horizon line approximately, using the central points of all bottommost blocks in the sky region. The adapted implementation instead applies the Hough Transform method discussed in section 4.1.1 to the segmented binary image to find the location of the horizon line. &lt;br /&gt;
&lt;br /&gt;
An outline of the DCT-based horizon detector follows:&lt;br /&gt;
#	Load input image in colour and extract luminance component.&lt;br /&gt;
#	Resize the image to dimensions divisible by 8.&lt;br /&gt;
#	For each 8x8 block of pixels in the image:&lt;br /&gt;
##	Calculate its DCT coefficients.&lt;br /&gt;
##	Calculate the mean of the DCT coefficients, disregarding the DC element, for that block (Ā).&lt;br /&gt;
#	Obtain the t-score for each block by dividing its mean by the maximum of the means, t = Ā/Ā_max.&lt;br /&gt;
#	Obtain the threshold value for all t-scores. This is the value that separates t-scores belonging to the sky from those belonging to the sea. Using this value, test each t-score to determine whether the 8x8 block from which it was derived belongs to the sky or the sea component of the image, and obtain a sea-sky segmentation of the image.&lt;br /&gt;
#	Input the segmented image to the Hough Transform line detector outlined in section 4.1 to find the horizon using the DCT-based segmentation. &lt;br /&gt;
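As a sketch of this block-DCT segmentation (a hypothetical Python stand-in for the MATLAB implementation: the Otsu between-class-variance rule below plays the role of graythresh(), and the image is synthetic):&lt;br /&gt;

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis, so a block's 2-D DCT is C @ block @ C.T."""
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0, :] *= 1 / np.sqrt(2)
    return C * np.sqrt(2 / n)

def t_scores(gray):
    """Per-block t-score: mean of |DCT coefficients| with the DC element
    disregarded, normalised by the maximum mean over the image."""
    C = dct_matrix(8)
    h, w = gray.shape
    h8, w8 = (h // 8) * 8, (w // 8) * 8      # crop to dimensions divisible by 8
    means = np.zeros((h8 // 8, w8 // 8))
    for i in range(0, h8, 8):
        for j in range(0, w8, 8):
            coeffs = C @ gray[i:i + 8, j:j + 8] @ C.T
            coeffs[0, 0] = 0.0               # disregard the DC element
            means[i // 8, j // 8] = np.abs(coeffs).mean()
    return means / means.max()

def otsu_threshold(values, bins=64):
    """Stand-in for MATLAB's graythresh(): maximise between-class variance."""
    hist, edges = np.histogram(values, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    omega = np.cumsum(p)                     # class-0 probability
    mu = np.cumsum(p * edges[:-1])           # class-0 cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    return edges[np.nanargmax(sigma_b) + 1]

# Synthetic image: smooth "sky" on top, textured "sea" below.
rng = np.random.default_rng(0)
img = np.full((64, 64), 200.0)
img[32:, :] += rng.normal(0, 40, size=(32, 64))
t = t_scores(img)
thr = otsu_threshold(t.ravel())
sea = t > thr    # textured blocks score high -> sea; the rest is sky
```

Smooth sky blocks have almost no AC energy, so their t-scores cluster near 0, while textured sea blocks score high; the automatically chosen threshold separates the two modes.&lt;br /&gt;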
&lt;br /&gt;
== Target Detection Method ==&lt;br /&gt;
&lt;br /&gt;
===DCT and GMM Based Target Method===&lt;br /&gt;
This section describes the primary target detection method developed in this project, which uses the DCT coefficients as obtained in the DCT-based horizon detector outlined in section 4.1.2. The approach taken in this implementation is to split the image into two components, sea and sky, using a horizon line detector. The algorithm from the literature uses the DCT-based horizon detector to find the line between sea and sky; however, the algorithm adapted for this project uses the Hough Transform horizon line detector outlined in section 4.1.1 for a more accurate detection. &lt;br /&gt;
The target detection process uses the average of the DCT coefficients in each 8x8 block to segment the entire image into two components, sky and not-sky. This is achieved by taking the DCT coefficients in each 8x8 block of image pixels and computing the normalised mean, as in the DCT-based horizon detection algorithm (4.1.2). Similarly, a threshold value that best separates the DCT averages of sky regions from those of other regions is found. All blocks with a DCT average less than this threshold value are classified as sky, while the remaining blocks are classified as not-sky. This process achieves a ‘sky’ and ‘not-sky’ segmentation of the image.&lt;br /&gt;
While this is an effective way to eliminate the sky background from the image, complex textures on the sea surface, such as waves and wakes, mean that some regions in the sea will pass the threshold as significant foreground. Thus, the DCT average threshold testing is only applied to the area of the image that resides above the horizon line, which is found using the Hough Transform horizon line detector. &lt;br /&gt;
For the sea region of the image, texture-based features are extracted as three region energies from each 8x8 block of DCT coefficients, as explained in the literature (section 2.2), and listed in a feature vector, X. The feature vector is used to train a Gaussian Mixture Model representing a sea background model. The sea pixels used for training must therefore be taken from sufficiently far below the horizon line to ensure that pixels from potential targets do not contaminate the GMM training data. &lt;br /&gt;
Once the GMM is trained, a Mahalanobis distance is calculated for each of the sea-training feature vectors to measure the degree of matching between the feature vectors and the Gaussian mixtures, and a threshold is defined as the maximum distance from a feature vector to the Gaussian mixture model centre. Once this threshold is set, feature vectors are extracted from the image for all pixels below the horizon line, including any belonging to potential targets. The Mahalanobis distance between each vector and the sea-background GMM is then calculated and compared with the defined threshold. All distances less than the threshold are classified as belonging to sea regions, while the others may be sky or an anomaly. This produces another segmentation of ‘sea’ and ‘not-sea’.&lt;br /&gt;
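The distance test can be sketched as follows (a hypothetical Python stand-in: a single Gaussian replaces the full Gaussian Mixture Model, and the 3-column feature vectors are synthetic rather than DCT region energies):&lt;br /&gt;

```python
import numpy as np

def fit_gaussian(X):
    """Fit the sea-background model: mean and inverse covariance.
    (Single-Gaussian simplification of the GMM described above.)"""
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    return mu, np.linalg.inv(cov)

def mahalanobis(X, mu, cov_inv):
    """Mahalanobis distance of each row of X from the model centre."""
    d = X - mu
    return np.sqrt(np.einsum("ij,jk,ik->i", d, cov_inv, d))

rng = np.random.default_rng(1)
# Hypothetical 3-column feature vectors (region energies) for sea-training blocks.
X_train = rng.normal(0.0, 1.0, size=(500, 3))
mu, cov_inv = fit_gaussian(X_train)
# Threshold T: maximum distance observed on the sea-training vectors.
T = mahalanobis(X_train, mu, cov_inv).max()
# Feature vectors below the horizon, including one obvious outlier (a "target").
X = np.vstack([rng.normal(0.0, 1.0, size=(50, 3)), [[12.0, 12.0, 12.0]]])
anomaly = mahalanobis(X, mu, cov_inv) > T    # True where not sea background
```

In the full GMM version the distance would be measured against the trained mixture rather than a single Gaussian, with the same maximum-training-distance threshold.&lt;br /&gt;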
By subtracting the sea segmentation from the sky segmentation, a binary mask can be obtained where the two background components, sea and sky, are 0s and potential targets are 1s. This mask will, however, contain false detections due to outliers in the image data in both the sea and sky regions, which can be cleaned up using morphological operations such as erosion and dilation. &lt;br /&gt;
Finally, the detected target is displayed by applying bounding boxes to the remaining regions of 1s in the final binary mask, overlaid on the original input image. An outline of the algorithm follows:&lt;br /&gt;
#	Load input image in colour and extract luminance component.&lt;br /&gt;
#	Resize the image to dimensions divisible by 8.&lt;br /&gt;
#	For each 8x8 block of pixels in the image:&lt;br /&gt;
##	Calculate its DCT coefficients.&lt;br /&gt;
##	Calculate the mean of the DCT coefficients, disregarding the DC element, for that block (Ā).&lt;br /&gt;
#	Obtain the t-score for each block by dividing its mean by the maximum of the means, t = Ā/Ā_max.&lt;br /&gt;
#	Obtain the threshold value for all t-scores. This is the value that separates t-scores belonging to the sky from those belonging to the sea. &lt;br /&gt;
#	Apply the Hough Transform Horizon Detector to find the horizon line.&lt;br /&gt;
#	Using the threshold value, test each t-score from regions above the horizon line to determine whether the 8x8 block belongs to the sky or an anomaly, and produce a binary image where sky background pixels have value 0 and areas of interest have value 1. &lt;br /&gt;
#	Extract the region energies for DCT blocks located at least 16 pixels below the horizon and compile the energies into a feature vector of 3 columns, X_train.&lt;br /&gt;
#	Fit a Gaussian Mixture Model to the feature vector.&lt;br /&gt;
#	Compute the Mahalanobis distance between the GMM and each feature vector. Use the maximum distance found as a threshold, T. &lt;br /&gt;
#	Extract the region energies for all DCT blocks below the horizon line, and compile a feature vector, X.&lt;br /&gt;
#	Calculate the Mahalanobis distance between the GMM and each feature vector.&lt;br /&gt;
#	Use the threshold T to determine whether each block belongs to the sea background or an anomaly on the sea surface.&lt;br /&gt;
#	Combine the two segmentations in order to achieve a binary mask where only regions of interest have high value.&lt;br /&gt;
#	Eliminate any detected regions located sufficiently far above or below the horizon line to reduce false detections. &lt;br /&gt;
#	Perform a morphological dilation to increase the detection size and apply a bounding box around the detection. The dilation will help ensure the entire object is bound within the applied box.&lt;/div&gt;</summary>
		<author><name>A1687420</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Main_Page&amp;diff=12235</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Main_Page&amp;diff=12235"/>
		<updated>2018-10-21T14:47:43Z</updated>

		<summary type="html">&lt;p&gt;A1687420: /* Honours Projects */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Projects ==&lt;br /&gt;
=== 2018 ===&lt;br /&gt;
==== Ingenuity 2018 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 22-23 October 2018&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2018s1-100 Automated Person Identification with Multiple Sensors]]&lt;br /&gt;
* [[Projects:2018s1-101 Classification of Network Traffic Flows using Deep and Transfer Learning]]&lt;br /&gt;
* [[Projects:2018s1-102 HF Radio Automated Link Establishment (ALE) Model]]&lt;br /&gt;
* [[Projects:2018s1-103 Improving Usability and User Interaction with KALDI Open-Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2018s1-105 Cyber security - Car Hacking]]&lt;br /&gt;
* [[Projects:2018s1-107 Evolution of Spiking Neural Networks for UAV Control]]&lt;br /&gt;
* [[Projects:2018s1-108 Machine Learning Multi-Spectral Simulation]]&lt;br /&gt;
* [[Projects:2018s1-109 High-Resolution Change Prediction using Sparse Spatio-temporal Data]]&lt;br /&gt;
* [[Projects:2018s1-110 Submarine Optronics System: Contact Detection]]&lt;br /&gt;
* [[Projects:2018s1-111 IoT Connectivity Investigation]]&lt;br /&gt;
* [[Projects:2018s1-112 Automate the 3D Design and Manufacture of Electrical Control Panels using Advanced Digital Technologies]]&lt;br /&gt;
* [[Projects:2018s1-113 AVR Test Rig]]&lt;br /&gt;
* [[Projects:2018s1-115 Passive Radar in the High Frequency Band using Civil Transmissions]]&lt;br /&gt;
* [[Projects:2018s1-116 Data Analytics]]&lt;br /&gt;
* [[Projects:2018s1-119 Design of Calibration Platform for Medical Sensing]]&lt;br /&gt;
* [[Projects:2018s1-121 In-Memory Semantic Processing Using Hyperdimensional Computing]]&lt;br /&gt;
* [[Projects:2018s1-122 NI Autonomous Robotics Competition]]&lt;br /&gt;
* [[Projects:2018s1-128 Software Tool for Fitting Statistical Models to Sea Clutter Data]]&lt;br /&gt;
* [[Projects:2018s1-135 A Low Cost Impedance and Transfer Function Analyser Part 2]]&lt;br /&gt;
* [[Projects:2018s1-136UG Rate of Change of Mains Frequency Detection]]&lt;br /&gt;
* [[Projects:2018s1-140 Energy Storage Requirements for the SA Grid]]&lt;br /&gt;
* [[Projects:2018s1-141 CSI Adelaide:  Who killed the Somerton Man?]]&lt;br /&gt;
* [[Projects:2018s1-142 Modelling the Dynamics of Cryptocurrency Market]]&lt;br /&gt;
* [[Projects:2018s1-145 Simplified Indoor UAV Operations]]&lt;br /&gt;
* [[Projects:2018s1-151 Raspberry Pi as a Core Device for Efficient Biological Field Survey Data Collection]]&lt;br /&gt;
* [[Projects:2018s1-155 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2018s1-157 Designing Airway Pressure Control Technology for Sleep Apnea Treatment]]&lt;br /&gt;
* [[Projects:2018s1-160 UAV Platform for Cognitive AI Agent]]&lt;br /&gt;
* [[Projects:2018s1-164 Private but Public on the Blockchain]]&lt;br /&gt;
* [[Projects:2018s1-165 Dual IP Stack Exfiltration - Methods and Defences]]&lt;br /&gt;
* [[Projects:2018s1-167 Security Assessment of Watchem and Moochies Watches]]&lt;br /&gt;
* [[Projects:2018s1-168 Penetration Testing of the SpaceTalk Tracking Watch]]&lt;br /&gt;
* [[Projects:2018s1-169 A Better Security Framework for Wearable Devices]]&lt;br /&gt;
* [[Projects:2018s1-175 Split-ring resonators for measuring spatially-distributed complex permittivity at microwave frequencies]]&lt;br /&gt;
* [[Projects:2018s1-181 BMW Autonomous Vehicle]]&lt;br /&gt;
* [[Projects:2018s1-182 Inertia Characterisation and Modelling in a Renewable Energy and Battery Based Microgrid]]&lt;br /&gt;
* [[Projects:2018s1-191 Quasi-Linear Circuit Theory]]&lt;br /&gt;
* [[Projects:2018s1-192 Karplus-Strong Synthesis of Sound]]&lt;br /&gt;
* [[Projects:2018s1-195 Novel Flexible Materials for Wearable Antennas]]&lt;br /&gt;
* [[Projects:2018s1-196 Concealed Wearable Antennas]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2018s1-118 Design of Wireless Sensors for Sleep Apnea Detection]]&lt;br /&gt;
* [[Projects:2018s1-136PG Rate of Change of Mains Frequency Detection]]&lt;br /&gt;
* [[Projects:2018s1-170 Intelligent Parking Control for Autonomous Ground Vehicles]]&lt;br /&gt;
* [[Projects:2018s1-177 Radio astronomy with software-defined radio]]&lt;br /&gt;
* [[Projects:2018s1-178 Creating microwave antennas with 3D printing]]&lt;br /&gt;
* [[Projects:2018s1-180 Development and Control of a Standalone Power Source for Residential Dwellings and Small Businesses]]&lt;br /&gt;
* [[Projects:2018s1-186 Calculation and Optimisation of Energy Usage of Electric Vehicles]]&lt;br /&gt;
* [[Projects:2018s1-190 Dynamical Modelling of Synchronous Machines]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2018 ====&lt;br /&gt;
* EM205&lt;br /&gt;
* June 2019&lt;br /&gt;
* 11:30-13:00&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2018s2-235UG PMU Test generator]]&lt;br /&gt;
* [[Projects:2018s2-226 Software Library for Inverse Synthetic Aperture Radar]]&lt;br /&gt;
* [[Projects:2018s2-279 High-Gain Antennas for Terahertz Communications]]&lt;br /&gt;
* [[Projects:2018s2-280 Assessment of Port Pirie for Higher Renewable PV Energy Integration]]&lt;br /&gt;
* [[Projects:2018s2-293 Detailed Analysis of Ferromagnetism in the Periodic Domain]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid year) ====&lt;br /&gt;
* [[Projects:2018s2-235PG PMU Test generator]]&lt;br /&gt;
* [[Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control]]&lt;br /&gt;
* [[Projects:2018s2-285 SA Power System Modelling and Analysis]]&lt;br /&gt;
* [[Projects:2018s2-297 Wireless Rotation Detector]]&lt;br /&gt;
&lt;br /&gt;
=== 2017 ===&lt;br /&gt;
==== Ingenuity 2017 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 30-31 October 2017&lt;br /&gt;
* Prizes&lt;br /&gt;
** Best EEE Wiki: Classifying Network Traffic Flows with Deep-Learning by Kyle Thornton, Clinton Page, Daniel Smit&lt;br /&gt;
** Best EEE Exhibit: Face Recognition using 3D Data by Orbille Piol, Michael Sadler, Jesse Willsmore&lt;br /&gt;
[[File:Ingenuity 2017.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2017s1-100 Face Recognition using 3D Data]]&lt;br /&gt;
* [[Projects:2017s1-101 Classifying Network Traffic Flows with Deep-Learning]]&lt;br /&gt;
* [[Projects:2017s1-102 HF Radio Automated Link Establishment (ALE) Model]]&lt;br /&gt;
* [[Projects:2017s1-103 Improving Usability and User Interaction with KALDI Open- Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2017s1-105 “CARLOS TC” Tow Bar Testing Facility]]&lt;br /&gt;
* [[Projects:2017s1-106 Inertia Characterisation and Modelling in a Renewable Energy-based Microgrid]]&lt;br /&gt;
* [[Projects:2017s1-107 Protection of a Convoy of Ships Under Attack]]&lt;br /&gt;
* [[Projects:2017s1-108 Stability and Control of 3-D Formations]]&lt;br /&gt;
* [[Projects:2017s1-109 Dynamically Forming Formations ]]&lt;br /&gt;
* [[Projects:2017s1-110 ‘Real-Time’ FPGA Based Object Recognition &amp;amp; Threat Detection in Hardware]]&lt;br /&gt;
* [[Projects:2017s1-111 OTHR Alternative Computing Architecture]]&lt;br /&gt;
* [[Projects:2017s1-120 Hardware Realisation of the Unum 2.0 Number Format]]&lt;br /&gt;
* [[Projects:2017s1-121 Learning Procedural Knowledge using Random Forests]]&lt;br /&gt;
* [[Projects:2017s1-125 Drone Imaging and Classification using Radar]]&lt;br /&gt;
* [[Projects:2017s1-127 Sound Trilateration for Positioning of the Sound Source]]&lt;br /&gt;
* [[Projects:2017s1-135 A Low Cost Impedance Analyser]]&lt;br /&gt;
* [[Projects:2017s1-140 Energy Storage Requirements for the SA Grid]]&lt;br /&gt;
* [[Projects:2017s1-155 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2017s1-156 Interrogating a Glucose Monitor]]&lt;br /&gt;
* [[Projects:2017s1-157 Automated Classification of Brain Activity during Sleep]]&lt;br /&gt;
* [[Projects:2017s1-160 Cooperating Autonomous Vehicles]]&lt;br /&gt;
* [[Projects:2017s1-165 Forensic Investigation of Fitness Devices]]&lt;br /&gt;
* [[Projects:2017s1-167a Applications of Blockchain to Equity Fund Raising]] &lt;br /&gt;
* [[Projects:2017s1-167b Real Time Video Steam Substitution]]&lt;br /&gt;
* [[Projects:2017s1-167c Smart Grid Security]]&lt;br /&gt;
* [[Projects:2017s1-167d Twitterbots]]&lt;br /&gt;
* [[Projects:2017s1-175 Split-Ring Resonators for Measuring Spatially-Distributed Complex Permittivity at Microwave Frequencies]]&lt;br /&gt;
* [[Projects:2017s1-176 Smart Mirror with Raspberry Pi]]&lt;br /&gt;
* [[Projects:2017s1-177 Radio Astronomy with Software-Defined Radio]]&lt;br /&gt;
* [[Projects:2017s1-180 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2017s1-181 BMW Autonomous Vehicle Project Camera Based Lane Detection in a Road Vehicle for Autonomous Driving]]&lt;br /&gt;
* [[Projects:2017s1-182 BMW Autonomous Vehicle Project Development of Vehicle Control Algorithm]]&lt;br /&gt;
* [[Projects:2017s1-185 BMW Autonomous Vehicle Project Implementation of a Steering Angle Controller on a Lab Test Bench]]&lt;br /&gt;
* [[Projects:2017s1-186 Playing Music Through a Tesla Coil]]&lt;br /&gt;
* [[Projects:2017s1-190 Modelling and Validation for Synchronous Generators]]&lt;br /&gt;
* [[Projects:2017s1-191 Power Electronics for Inductive Power Transfer (IPT)]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2017s1-122 On-Chip Learning]]&lt;br /&gt;
* [[Projects:2017s1-150 Statistical Natural Language Processing]]&lt;br /&gt;
* [[Projects:2017s1-158 Electromyographic Signal Processing for Controlling an Exoskeleton]]&lt;br /&gt;
* [[Projects:2017s1-159 Detecting Penguin’s Heart Sounds]]&lt;br /&gt;
* [[Projects:2017s1-166 Development of TV Ad Blocker]]&lt;br /&gt;
* [[Projects:2017s1-170 Formation Control and Obstacle Avoidance for Heterogeneous Multi-Agent Systems (Unmanned Aerial Vehicles and Robots)]]&lt;br /&gt;
* [[Projects:2017s1-183 BMW Autonomous Vehicle Project Development of a sensor fusion algorithm to determine the current vehicle position in a local tangential plane ]]&lt;br /&gt;
* [[Projects:2017s1-184 BMW Autonomous Vehicle Project Implement the Longitudinal Control Algorithm of the Vehicle]]&lt;br /&gt;
* [[Projects:2017s1-195 Solar Aquaponics ]]&lt;br /&gt;
* [[Projects:2017s1-196 New Computational Methods for the Super Smart Grid]]&lt;br /&gt;
* [[Projects:2017s2-245 Novel textile antennas for wearable wireless communications]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2017 ====&lt;br /&gt;
* EM205&lt;br /&gt;
* 5 June 2018&lt;br /&gt;
* 11:30-13:00&lt;br /&gt;
&lt;br /&gt;
[[File:MidyearExpo_2018.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2017s2-201 Detection and Classification in an Indoor Environment Using WiFi]]&lt;br /&gt;
* [[Projects:2017s2-205 Multi-Profile Parallel Speech-to Text Transcriber]]&lt;br /&gt;
* [[Projects:2017s2-220 Alternative Approaches to AI for the Soccer Table]]&lt;br /&gt;
* [[Projects:2017s2-225 Digital Microphone Array using MEMS Microphones]]&lt;br /&gt;
* [[Projects:2017s2-275 Creating Microwave Antennas with 3D Printing]]&lt;br /&gt;
* [[Projects:2017s2-290 The Magnetorquer]]&lt;br /&gt;
* [[Projects:2017s2-291 Measurement of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2017s2-292 Wide-Area Sun Sensor]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2017s2-215 Wireless Power Transfer]]&lt;br /&gt;
* [[Projects:2017s2-235 An On-line 10 kHz to 1 MHz Impedance Analyser]]&lt;br /&gt;
* [[Projects:2017s2-245 Novel Textile Antennas for Wearable Wireless Communications]]&lt;br /&gt;
* [[Projects:2017s2-270 Reconfiguration on Multi-Agent Systems (Robots Systems)]]&lt;br /&gt;
* [[Projects:2017s2-285 Short-term Solutions for the South Australian Electric Power System]]&lt;br /&gt;
* [[Projects:2017s2-295 Feral Animal Detection using IR Thermal Imagery]]&lt;br /&gt;
&lt;br /&gt;
=== 2016 ===&lt;br /&gt;
==== Ingenuity 2016 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 27-28 October 2016&lt;br /&gt;
&lt;br /&gt;
[[File:Ingenuity_2016.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2016s1-101 Predicting Power Outages from Weather Patterns]]&lt;br /&gt;
* [[Projects:2016s1-102 Classifying Internet Applications and Detecting Malicious Traffic from Network Communications]]&lt;br /&gt;
* [[Projects:2016s1-105 Non-Contact Photoplethysmogram]]&lt;br /&gt;
* [[Projects:2016s1-106 Airborne Antenna Measurement Platform]]&lt;br /&gt;
* [[Projects:2016s1-109 Development, Characterisation and Modelling of Renewable Energy-Based Microgrid]]&lt;br /&gt;
* [[Projects:2016s1-122 A Complete Model for a Synchronous Machine]]&lt;br /&gt;
* [[Projects:2016s1-126 A Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2016s1-128 Evaluating Programming Languages for Educational Robotics Kits]]&lt;br /&gt;
* [[Projects:2016s1-132 RF Transceiver Design for a Portable Radar]]&lt;br /&gt;
* [[Projects:2016s1-145 Indoor localisation using Bluetooth LE for Event Advertising]]&lt;br /&gt;
* [[Projects:2016s1-146 Antonomous Robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2016s1-160a Cyber Security - IoT and CAN Bus Security]]&lt;br /&gt;
* [[Projects:2016s1-160b Cyber Security - e-Government and Network Security]]&lt;br /&gt;
* [[Projects:2016s1-160c Cyber Security - Personal Networks and Devices]]&lt;br /&gt;
* [[Projects:2016s1-171 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2016s1-172 Computer Aided Testing of Batteries for Energy Storage Applications]]&lt;br /&gt;
* [[Projects:2016s1-187 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2016s1-196 Wireless Power Transfer]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2016s1-120 Attacking Cancer with Signal Processing]]&lt;br /&gt;
* [[Projects:2016s1-121 Measurement of  Transformer Parameters]]&lt;br /&gt;
* [[Projects:2016s1-131 ECG Enhancement with Advanced Signal Processing]]&lt;br /&gt;
* [[Projects:2016s1-141 Cracking the Voynich manuscript code]]&lt;br /&gt;
* [[Projects:2016s1-142 Code Cracking: Who Murdered The Somerton Man?]]&lt;br /&gt;
* [[Projects:2016s1-175 Environment Exploring Based on Inertia Measurement Unit and Computer Vision ]]&lt;br /&gt;
* [[Projects:2016s1-180 New Computational Methods for the Super Smart Grid]]&lt;br /&gt;
* [[Projects:2016s1-181 Solar Aquaponics]]&lt;br /&gt;
* [[Projects:2016s1-190 Inductive Power Transfer ]]&lt;br /&gt;
* [[Projects:2016s1-197 Sound Triangulation for Invisible Keyboards]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2017 ====&lt;br /&gt;
* Adelaide University, Engineering and Maths Building, EM205&lt;br /&gt;
* Tuesday 6 June 2017&lt;br /&gt;
&lt;br /&gt;
[[File:MidyearExpo_2017.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2016s2-216 GPS Receiver Location and Atmosphere Characterisation]]&lt;br /&gt;
* [[Projects:2016s2-235 Personal Radar for Safer Walk &amp;amp; Text]]&lt;br /&gt;
* [[Projects:2016s2-236 Electronic Controller for Spatial Microwave Modulator]]&lt;br /&gt;
* [[Projects:2016s2-255 Solar Aquaponics]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2016s2-215 Bhutan Power System Islanding and Special Protection Devices]]&lt;br /&gt;
* [[Projects:2016s2-220 Path Planning and Collision Avoidance for Aduino Robots]]&lt;br /&gt;
* [[Projects:2016s2-230 New Materials for Wearable Antennas in Flexible Electronics]]&lt;br /&gt;
* [[Projects:2016s2-240 Electromyographic Signal Processing for Controlling an Exoskeleton]]&lt;br /&gt;
* [[Projects:2016s2-245 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2016s2-246 Feral Cat Detector]]&lt;br /&gt;
* [[Projects:2016s2-250 On-Line Mains Power Cable Time Domain Reflectometry]]&lt;br /&gt;
&lt;br /&gt;
=== 2015 ===&lt;br /&gt;
==== Ingenuity 2015 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 26-27 October 2015&lt;br /&gt;
[[File:Ingenuity_2015.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2015s1-01 LaunchBox]]&lt;br /&gt;
* [[Projects:2015s1-04 Detecting Cyber Malicious Command-Control (C2) Network Traffic Communications]]&lt;br /&gt;
* [[Projects:2015s1-05 Multi-Profile Parallel Transcriber]]&lt;br /&gt;
* [[Projects:2015s1-06 Performance Evaluation of KALDI Open Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2015s1-07 Remote AVR Control for Embedded Generation]]&lt;br /&gt;
* [[Projects:2015s1-08 Developing a Home Energy Management System]]&lt;br /&gt;
* [[Projects:2015s1-10 Lagrangian Modelling of Synchronous Machines]]&lt;br /&gt;
* [[Projects:2015s1-11 Estimation of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2015s1-12 An Open-Source Local Area Network (LAN)]]&lt;br /&gt;
* [[Projects:2015s1-13 A One-Time Pad Generator]]&lt;br /&gt;
* [[Projects:2015s1-15 AI for a Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2015s1-16 System Engineering a Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2015s1-17 Analysis of Electrical and Software Design in the Effectiveness of Robotics STEM Outreach Programs]]&lt;br /&gt;
* [[Projects:2015s1-18 ARM Processor For Digital Systems Practicals]]&lt;br /&gt;
* [[Projects:2015s1-21 Inexpensive Portable Radar System]]&lt;br /&gt;
* [[Projects:2015s1-25 Indoor localisation using Bluetooth LE for event advertising]]&lt;br /&gt;
* [[Projects:2015s1-26 Autonomous robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2015s1-28 Wireless Rotation Detector]]&lt;br /&gt;
* [[Projects:2015s1-31 Cracking the Voynich manuscript code]]&lt;br /&gt;
* [[Projects:2015s1-32 Code Cracking: Who Murdered The Somerton Man?]]&lt;br /&gt;
* [[Projects:2015s1-36 Heartbeat Perception App]]&lt;br /&gt;
* [[Projects:2015s1-40 Flexible ad-hoc Network A:  Physical Layer]]&lt;br /&gt;
* [[Projects:2015s1-42 Rule-based AI Agent Development: Tic Tac Toe]]&lt;br /&gt;
* [[Projects:2015s1-46 Channel Measurements for Search &amp;amp; Rescue]]&lt;br /&gt;
* [[Projects:2015s1-45 Analysis and Visualisation of Packet Data for Cyber-Security Purposes]]&lt;br /&gt;
* [[Projects:2015s1-50 Tracking, Herding and Routing by Autonomous Smart Cars (UG)]]&lt;br /&gt;
* [[Projects:2015s1-56 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2015s1-61 Computer Aided Measurement and Analysis of Equal Efficiency Characteristics of Electrical Machines]]&lt;br /&gt;
* [[Projects:2015s1-70 Design of Power Line Communication Coupler for Single-Wire Earth Return Lines]]&lt;br /&gt;
* [[Projects:2015s1-73 Improved Electric Micro-Bus Design for Nepal]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2015s1-09 Development of a Deadman Switch for Tramline Traction Simulation Tool]]&lt;br /&gt;
* [[Projects:2015s1-22 Automatic Sorter using Computer Vision]]&lt;br /&gt;
* [[Projects:2015s1-26 Autonomous robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2015s1-35 Brain computer interface control for biomedical applications]]&lt;br /&gt;
* [[Projects:2015s1-50 Tracking, Herding and Routing by Autonomous Smart Cars (PG)]]&lt;br /&gt;
* [[Projects:2015s1-58 Design And Development Of A New Respiratory Monitor For Detection Of Sleep Apnoea]]&lt;br /&gt;
* [[Projects:2015s1-71 Inductive Power Transfers]]&lt;br /&gt;
* [[Projects:2015s1-72 Wind Turbine Control Simulator]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2016 ====&lt;br /&gt;
* Adelaide University, Engineering and Maths Building, EM205&lt;br /&gt;
* 31 May 2016&lt;br /&gt;
[[File:MidyearExpo_2016.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2015s2-201 Development of Energy Storage Knowledge Bank]]&lt;br /&gt;
* [[Projects:2015s2-210 Automated Classification of Heartbeats in Long-Term ECG]]&lt;br /&gt;
* [[Projects:2015s2-211 Health Visa]]&lt;br /&gt;
* [[Projects:2015s2-212 TV Control and Monitoring]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2015s2-202 An On-line PLC frequency Impedance Analyser]]&lt;br /&gt;
* [[Projects:2015s2-203 Analysis of Heart Sound Signals using the Wavelet Transform]]&lt;br /&gt;
* [[Projects:2015s2-204 Unbalanced Operation of Permanent Magnet Generators]]&lt;br /&gt;
* [[Projects:2015s2-206 Solar Aquaponics]]&lt;br /&gt;
* [[Projects:2015s2-207 Tracking, Herding and Routing by Autonomous Smart Cars]]&lt;br /&gt;
* [[Projects:2015s2-209 Automated Classification of Brain Activity During Sleep]]&lt;br /&gt;
* [[Projects:2015s2-216 Feral Cat Detector]]&lt;br /&gt;
&lt;br /&gt;
=== 2014 ===&lt;br /&gt;
==== Ingenuity 2014 ====&lt;br /&gt;
Ingenuity 2014 was held at the Adelaide Convention Centre on Thursday 30 October. It showcased 40 of the school&amp;#039;s final-year honours and masters projects.&lt;br /&gt;
[[File:Ingenuity_2014_group_shot.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Final Year Projects ====&lt;br /&gt;
* [[Projects:2014S1-01 Development of Fully Automated Educational and Training Tool for Wind and Solar Energy using National Instruments’ ELVIS Based System]]&lt;br /&gt;
* [[Projects:2014S1-04 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2014S1-06 Bell Ringing Robot: Hawkear]]&lt;br /&gt;
* [[Projects:2014S1-10 Development of Machine Learning Techniques for Analysing Network Communications]]&lt;br /&gt;
* [[Projects:2014S1-11 Wireless Rotation Detector for Sport Equipment]]&lt;br /&gt;
* [[Projects:2014S1-12 Exploring RF Energy Harvesting for Wearable Sensors]]&lt;br /&gt;
* [[Projects:2014S1-13 S-band Communication for Small Satellite]]&lt;br /&gt;
* [[Projects:2014S1-15 Inexpensive Portable Radar System]]&lt;br /&gt;
* [[Projects:2014S1-16 Automatic Sorter using Computer Vision]]&lt;br /&gt;
* [[Projects:2014S1-21 Design And Development of a New Respiratory Monitor for Detection of Sleep Apnoea]]&lt;br /&gt;
* [[Projects:2014S1-23 Real-Time Adaptive Filters]]&lt;br /&gt;
* [[Projects:2014S1-24 AI Agent Development for an Autonomous Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2014S1-26 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2014S1-29 Measurement and Estimation of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2014S1-33 Software-Defined Radio for VLF Transmission]]&lt;br /&gt;
* [[Projects:2014S1-35 Human Activity Recognition to Support Independent Living]]&lt;br /&gt;
* [[Projects:2014S1-36 What are Social Appliances? Building your Tomorrow Today…]]&lt;br /&gt;
* [[Projects:2014S1-37 Wireless Monitoring and Control of Wine Fermentation Process]]&lt;br /&gt;
* [[Projects:2014S1-42 Current-Voltage Tracer Experiment]]&lt;br /&gt;
* [[Projects:2014S1-44 Cracking the Voynich Manuscript Code]]&lt;br /&gt;
* [[Projects:2014S1-45 Is Secure Communication Possible?]]&lt;br /&gt;
* [[Projects:2014S1-47 Robotic Arm for Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2014S1-48 FPGA-based Software GPS Receiver]]&lt;br /&gt;
* [[Projects:2014S1-49 Can Solar PV cells be used as Telecommunications Receivers?]]&lt;br /&gt;
* [[Projects:2014S1-50 Exploiting HF Emitters of Opportunity for OTH Radar]]&lt;br /&gt;
* [[Projects:2014S1-51 Heart Signal Processing Software for Evaluating Pacemaker Effectiveness]]&lt;br /&gt;
* [[Projects:2014S1-53 Object Profiling for Custom Wheelchair Seating and Pressure Care]]&lt;br /&gt;
* [[Projects:2014S1-56 Inter-Satellite Links for CubeSats]]&lt;br /&gt;
* [[Projects:2014S1-57 Autonomous Vehicle Technologies]]&lt;br /&gt;
* [[Projects:2014s2-74 Antonomous Robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2014s2-76 Teletraffic Modelling and Analysis of the New Britannia Roundabout]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2014S1-02 Network Optimisation in Distributed Generation Systems]]&lt;br /&gt;
* [[Projects:2014S1-03 Design of a Mobile Energy Storage System for Grid Integration]]&lt;br /&gt;
* [[Projects:2014S1-14 Wearable RFID Antennas]]&lt;br /&gt;
* [[Projects:2014S1-19 Analysis Of Heart Sound Signals using the Wavelet Transform]]&lt;br /&gt;
* [[Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor]]&lt;br /&gt;
* [[Projects:2014S1-38 Semi-Passive Wearable Sensors]]&lt;br /&gt;
* [[Projects:2014S1-39 Tell your Robot where to go with RFID (Improving Autonomous Navigation)]]&lt;br /&gt;
* [[Projects:2014S1-41 Inductive Power Transfer]]&lt;br /&gt;
* [[Projects:2014S1-43 Inverter Drive Experiment]]&lt;br /&gt;
* [[Projects:2014S1-54 Engineering of a CubeSat Power System]]&lt;br /&gt;
* [[Projects:2014s2-71 Calorimetry and Modelling of Lithium-Ion Chemical Batteries]]&lt;br /&gt;
* [[Projects:2014s2-72 Accurate Measurement and Modelling of a Switched-Mode Power Supply]]&lt;br /&gt;
* [[Projects:2014s2-75 Formation Control of Two Autonomous Smart Cars]]&lt;br /&gt;
* [[Projects:2014s2-78 Investigation the Design and Development of Miniature Specific Gravity Sensor]]&lt;br /&gt;
* [[Projects:2014s2-79 FPGA-base Hardware Iimplementation of Machine-Learning Methods for Handwriting and Speech Recognition]]&lt;br /&gt;
* [[Projects:2014s2-80 Swinging Crane Project]]&lt;br /&gt;
* [[Projects:2014s2-82 Grid Integration of Solar PV Embedded Generation]]&lt;br /&gt;
* [[Projects:2014s2-83 A Testing and Characterising Device for batteries of various chemistries]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[:Category:Projects]]&lt;br /&gt;
&lt;br /&gt;
== Resources ==&lt;br /&gt;
=== Project Resources ===&lt;br /&gt;
This section contains resources that may be useful to project groups, such as manuals for commonly used software tools and links to suppliers.&lt;br /&gt;
* [[Resources:General Advice]]&lt;br /&gt;
* School links&lt;br /&gt;
** [https://app.smartsheet.com/b/form/bc56c18eb51141649ec3819d5e0d712a Material purchase requests]&lt;br /&gt;
** [https://app.smartsheet.com/b/form/e0e915c66f7445f183aabf34843fac5d Technical support requests]&lt;br /&gt;
** [https://www.eleceng.adelaide.edu.au/projlab/index.php Projects Lab booking system]&lt;br /&gt;
** [https://www.eleceng.adelaide.edu.au/facilitiesbooking/index.php EEE meeting room booking system (only for meetings with supervisors)]&lt;br /&gt;
** [https://eleceng.adelaide.edu.au/intranet/undergraduate/infrastructure-request/ Project bench request form]&lt;br /&gt;
** [https://www.adelaide.edu.au/technology/yourservices/software/personal/ University Licensed software]&lt;br /&gt;
** [https://universityofadelaide.box.com/v/FYPResources FYP resource files]&lt;br /&gt;
* [[Resources:Professional skills]]&lt;br /&gt;
* [[Resources:Project management tools]]&lt;br /&gt;
* [[Resources:Wiki writing resources]]&lt;br /&gt;
* [[Resources:Presentation resources]]&lt;br /&gt;
* [[Resources:Writing resources]]&lt;br /&gt;
* [[Resources:Meeting resources]]&lt;br /&gt;
* [[Resources:Research resources]]&lt;br /&gt;
* [[Resources:Programming language resources]]&lt;br /&gt;
* [[Resources:Electronics suppliers]]&lt;br /&gt;
&lt;br /&gt;
=== School Resources ===&lt;br /&gt;
The school operates a number of teaching laboratories.&lt;br /&gt;
* [[Projects Lab]]&lt;br /&gt;
* [[Electronics Teaching Labs]]&lt;br /&gt;
&lt;br /&gt;
=== Store ===&lt;br /&gt;
&lt;br /&gt;
* [[Electronics Store (EM316)]]&lt;/div&gt;</summary>
		<author><name>A1687420</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Main_Page&amp;diff=12234</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Main_Page&amp;diff=12234"/>
		<updated>2018-10-21T14:47:07Z</updated>

		<summary type="html">&lt;p&gt;A1687420: /* Honours Projects */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Projects ==&lt;br /&gt;
=== 2018 ===&lt;br /&gt;
==== Ingenuity 2018 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 22-23 October 2018&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2018s1-100 Automated Person Identification with Multiple Sensors]]&lt;br /&gt;
* [[Projects:2018s1-101 Classification of Network Traffic Flows using Deep and Transfer Learning]]&lt;br /&gt;
* [[Projects:2018s1-102 HF Radio Automated Link Establishment (ALE) Model]]&lt;br /&gt;
* [[Projects:2018s1-103 Improving Usability and User Interaction with KALDI Open-Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2018s1-105 Cyber security - Car Hacking]]&lt;br /&gt;
* [[Projects:2018s1-107 Evolution of Spiking Neural Networks for UAV Control]]&lt;br /&gt;
* [[Projects:2018s1-108 Machine Learning Multi-Spectral Simulation]]&lt;br /&gt;
* [[Projects:2018s1-109 High-Resolution Change Prediction using Sparse Spatio-temporal Data]]&lt;br /&gt;
* [[Projects:2018s1-110 Future Submarine Project]]&lt;br /&gt;
* [[Projects:2018s1-111 IoT Connectivity Investigation]]&lt;br /&gt;
* [[Projects:2018s1-112 Automate the 3D Design and Manufacture of Electrical Control Panels using Advanced Digital Technologies]]&lt;br /&gt;
* [[Projects:2018s1-113 AVR Test Rig]]&lt;br /&gt;
* [[Projects:2018s1-115 Passive Radar in the High Frequency Band using Civil Transmissions]]&lt;br /&gt;
* [[Projects:2018s1-116 Data Analytics]]&lt;br /&gt;
* [[Projects:2018s1-119 Design of Calibration Platform for Medical Sensing]]&lt;br /&gt;
* [[Projects:2018s1-121 In-Memory Semantic Processing Using Hyperdimensional Computing]]&lt;br /&gt;
* [[Projects:2018s1-122 NI Autonomous Robotics Competition]]&lt;br /&gt;
* [[Projects:2018s1-128 Software Tool for Fitting Statistical Models to Sea Clutter Data]]&lt;br /&gt;
* [[Projects:2018s1-135 A Low Cost Impedance and Transfer Function Analyser Part 2]]&lt;br /&gt;
* [[Projects:2018s1-136UG Rate of Change of Mains Frequency Detection]]&lt;br /&gt;
* [[Projects:2018s1-140 Energy Storage Requirements for the SA Grid]]&lt;br /&gt;
* [[Projects:2018s1-141 CSI Adelaide:  Who killed the Somerton Man?]]&lt;br /&gt;
* [[Projects:2018s1-142 Modelling the Dynamics of Cryptocurrency Market]]&lt;br /&gt;
* [[Projects:2018s1-145 Simplified Indoor UAV Operations]]&lt;br /&gt;
* [[Projects:2018s1-151 Raspberry Pi as a Core Device for Efficient Biological Field Survey Data Collection]]&lt;br /&gt;
* [[Projects:2018s1-155 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2018s1-157 Designing Airway Pressure Control Technology for Sleep Apnea Treatment]]&lt;br /&gt;
* [[Projects:2018s1-160 UAV Platform for Cognitive AI Agent]]&lt;br /&gt;
* [[Projects:2018s1-164 Private but Public on the Blockchain]]&lt;br /&gt;
* [[Projects:2018s1-165 Dual IP Stack Exfiltration - Methods and Defences]]&lt;br /&gt;
* [[Projects:2018s1-167 Security Assessment of Watchem and Moochies Watches]]&lt;br /&gt;
* [[Projects:2018s1-168 Penetration Testing of the SpaceTalk Tracking Watch]]&lt;br /&gt;
* [[Projects:2018s1-169 A Better Security Framework for Wearable Devices]]&lt;br /&gt;
* [[Projects:2018s1-175 Split-ring resonators for measuring spatially-distributed complex permittivity at microwave frequencies]]&lt;br /&gt;
* [[Projects:2018s1-181 BMW Autonomous Vehicle]]&lt;br /&gt;
* [[Projects:2018s1-182 Inertia Characterisation and Modelling in a Renewable Energy and Battery Based Microgrid]]&lt;br /&gt;
* [[Projects:2018s1-191 Quasi-Linear Circuit Theory]]&lt;br /&gt;
* [[Projects:2018s1-192 Karplus-Strong Synthesis of Sound]]&lt;br /&gt;
* [[Projects:2018s1-195 Novel Flexible Materials for Wearable Antennas]]&lt;br /&gt;
* [[Projects:2018s1-196 Concealed Wearable Antennas]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2018s1-118 Design of Wireless Sensors for Sleep Apnea Detection]]&lt;br /&gt;
* [[Projects:2018s1-136PG Rate of Change of Mains Frequency Detection]]&lt;br /&gt;
* [[Projects:2018s1-170 Intelligent Parking Control for Autonomous Ground Vehicles]]&lt;br /&gt;
* [[Projects:2018s1-177 Radio astronomy with software-defined radio]]&lt;br /&gt;
* [[Projects:2018s1-178 Creating microwave antennas with 3D printing]]&lt;br /&gt;
* [[Projects:2018s1-180 Development and Control of a Standalone Power Source for Residential Dwellings and Small Businesses]]&lt;br /&gt;
* [[Projects:2018s1-186 Calculation and Optimisation of Energy Usage of Electric Vehicles]]&lt;br /&gt;
* [[Projects:2018s1-190 Dynamical Modelling of Synchronous Machines]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2019 ====&lt;br /&gt;
* EM205&lt;br /&gt;
* June 2019&lt;br /&gt;
* 11:30-13:00&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2018s2-235UG PMU Test generator]]&lt;br /&gt;
* [[Projects:2018s2-226 Software Library for Inverse Synthetic Aperture Radar]]&lt;br /&gt;
* [[Projects:2018s2-279 High-Gain Antennas for Terahertz Communications]]&lt;br /&gt;
* [[Projects:2018s2-280 Assessment of Port Pirie for Higher Renewable PV Energy Integration]]&lt;br /&gt;
* [[Projects:2018s2-293 Detailed Analysis of Ferromagnetism in the Periodic Domain]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2018s2-235PG PMU Test generator]]&lt;br /&gt;
* [[Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control]]&lt;br /&gt;
* [[Projects:2018s2-285 SA Power System Modelling and Analysis]]&lt;br /&gt;
* [[Projects:2018s2-297 Wireless Rotation Detector]]&lt;br /&gt;
&lt;br /&gt;
=== 2017 ===&lt;br /&gt;
==== Ingenuity 2017 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 30-31 October 2017&lt;br /&gt;
* Prizes&lt;br /&gt;
** Best EEE Wiki: Classifying Network Traffic Flows with Deep-Learning by Kyle Thornton, Clinton Page, Daniel Smit&lt;br /&gt;
** Best EEE Exhibit: Face Recognition using 3D Data by Orbille Piol, Michael Sadler, Jesse Willsmore&lt;br /&gt;
[[File:Ingenuity 2017.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2017s1-100 Face Recognition using 3D Data]]&lt;br /&gt;
* [[Projects:2017s1-101 Classifying Network Traffic Flows with Deep-Learning]]&lt;br /&gt;
* [[Projects:2017s1-102 HF Radio Automated Link Establishment (ALE) Model]]&lt;br /&gt;
* [[Projects:2017s1-103 Improving Usability and User Interaction with KALDI Open- Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2017s1-105 “CARLOS TC” Tow Bar Testing Facility]]&lt;br /&gt;
* [[Projects:2017s1-106 Inertia Characterisation and Modelling in a Renewable Energy-based Microgrid]]&lt;br /&gt;
* [[Projects:2017s1-107 Protection of a Convoy of Ships Under Attack]]&lt;br /&gt;
* [[Projects:2017s1-108 Stability and Control of 3-D Formations]]&lt;br /&gt;
* [[Projects:2017s1-109 Dynamically Forming Formations ]]&lt;br /&gt;
* [[Projects:2017s1-110 ‘Real-Time’ FPGA Based Object Recognition &amp;amp; Threat Detection in Hardware]]&lt;br /&gt;
* [[Projects:2017s1-111 OTHR Alternative Computing Architecture]]&lt;br /&gt;
* [[Projects:2017s1-120 Hardware Realisation of the Unum 2.0 Number Format]]&lt;br /&gt;
* [[Projects:2017s1-121 Learning Procedural Knowledge using Random Forests]]&lt;br /&gt;
* [[Projects:2017s1-125 Drone Imaging and Classification using Radar]]&lt;br /&gt;
* [[Projects:2017s1-127 Sound Trilateration for Positioning of the Sound Source]]&lt;br /&gt;
* [[Projects:2017s1-135 A Low Cost Impedance Analyser]]&lt;br /&gt;
* [[Projects:2017s1-140 Energy Storage Requirements for the SA Grid]]&lt;br /&gt;
* [[Projects:2017s1-155 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2017s1-156 Interrogating a Glucose Monitor]]&lt;br /&gt;
* [[Projects:2017s1-157 Automated Classification of Brain Activity during Sleep]]&lt;br /&gt;
* [[Projects:2017s1-160 Cooperating Autonomous Vehicles]]&lt;br /&gt;
* [[Projects:2017s1-165 Forensic Investigation of Fitness Devices]]&lt;br /&gt;
* [[Projects:2017s1-167a Applications of Blockchain to Equity Fund Raising]] &lt;br /&gt;
* [[Projects:2017s1-167b Real Time Video Steam Substitution]]&lt;br /&gt;
* [[Projects:2017s1-167c Smart Grid Security]]&lt;br /&gt;
* [[Projects:2017s1-167d Twitterbots]]&lt;br /&gt;
* [[Projects:2017s1-175 Split-Ring Resonators for Measuring Spatially-Distributed Complex Permittivity at Microwave Frequencies]]&lt;br /&gt;
* [[Projects:2017s1-176 Smart Mirror with Raspberry Pi]]&lt;br /&gt;
* [[Projects:2017s1-177 Radio Astronomy with Software-Defined Radio]]&lt;br /&gt;
* [[Projects:2017s1-180 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2017s1-181 BMW Autonomous Vehicle Project Camera Based Lane Detection in a Road Vehicle for Autonomous Driving]]&lt;br /&gt;
* [[Projects:2017s1-182 BMW Autonomous Vehicle Project Development of Vehicle Control Algorithm]]&lt;br /&gt;
* [[Projects:2017s1-185 BMW Autonomous Vehicle Project Implementation of a Steering Angle Controller on a Lab Test Bench]]&lt;br /&gt;
* [[Projects:2017s1-186 Playing Music Through a Tesla Coil]]&lt;br /&gt;
* [[Projects:2017s1-190 Modelling and Validation for Synchronous Generators]]&lt;br /&gt;
* [[Projects:2017s1-191 Power Electronics for Inductive Power Transfer (IPT)]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2017s1-122 On-Chip Learning]]&lt;br /&gt;
* [[Projects:2017s1-150 Statistical Natural Language Processing]]&lt;br /&gt;
* [[Projects:2017s1-158 Electromyographic Signal Processing for Controlling an Exoskeleton]]&lt;br /&gt;
* [[Projects:2017s1-159 Detecting Penguin’s Heart Sounds]]&lt;br /&gt;
* [[Projects:2017s1-166 Development of TV Ad Blocker]]&lt;br /&gt;
* [[Projects:2017s1-170 Formation Control and Obstacle Avoidance for Heterogeneous Multi-Agent Systems (Unmanned Aerial Vehicles and Robots)]]&lt;br /&gt;
* [[Projects:2017s1-183 BMW Autonomous Vehicle Project Development of a sensor fusion algorithm to determine the current vehicle position in a local tangential plane ]]&lt;br /&gt;
* [[Projects:2017s1-184 BMW Autonomous Vehicle Project Implement the Longitudinal Control Algorithm of the Vehicle]]&lt;br /&gt;
* [[Projects:2017s1-195 Solar Aquaponics ]]&lt;br /&gt;
* [[Projects:2017s1-196 New Computational Methods for the Super Smart Grid]]&lt;br /&gt;
* [[Projects:2017s2-245 Novel textile antennas for wearable wireless communications]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2018 ====&lt;br /&gt;
* EM205&lt;br /&gt;
* 5 June 2018&lt;br /&gt;
* 11:30-13:00&lt;br /&gt;
&lt;br /&gt;
[[File:MidyearExpo_2018.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2017s2-201 Detection and Classification in an Indoor Environment Using WiFi]]&lt;br /&gt;
* [[Projects:2017s2-205 Multi-Profile Parallel Speech-to Text Transcriber]]&lt;br /&gt;
* [[Projects:2017s2-220 Alternative Approaches to AI for the Soccer Table]]&lt;br /&gt;
* [[Projects:2017s2-225 Digital Microphone Array using MEMS Microphones]]&lt;br /&gt;
* [[Projects:2017s2-275 Creating Microwave Antennas with 3D Printing]]&lt;br /&gt;
* [[Projects:2017s2-290 The Magnetorquer]]&lt;br /&gt;
* [[Projects:2017s2-291 Measurement of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2017s2-292 Wide-Area Sun Sensor]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2017s2-215 Wireless Power Transfer]]&lt;br /&gt;
* [[Projects:2017s2-235 An On-line 10 kHz to 1 MHz Impedance Analyser]]&lt;br /&gt;
* [[Projects:2017s2-245 Novel Textile Antennas for Wearable Wireless Communications]]&lt;br /&gt;
* [[Projects:2017s2-270 Reconfiguration on Multi-Agent Systems (Robots Systems)]]&lt;br /&gt;
* [[Projects:2017s2-285 Short-term Solutions for the South Australian Electric Power System]]&lt;br /&gt;
* [[Projects:2017s2-295 Feral Animal Detection using IR Thermal Imagery]]&lt;br /&gt;
&lt;br /&gt;
=== 2016 ===&lt;br /&gt;
==== Ingenuity 2016 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 27-28 October 2016&lt;br /&gt;
&lt;br /&gt;
[[File:Ingenuity_2016.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2016s1-101 Predicting Power Outages from Weather Patterns]]&lt;br /&gt;
* [[Projects:2016s1-102 Classifying Internet Applications and Detecting Malicious Traffic from Network Communications]]&lt;br /&gt;
* [[Projects:2016s1-105 Non-Contact Photoplethysmogram]]&lt;br /&gt;
* [[Projects:2016s1-106 Airborne Antenna Measurement Platform]]&lt;br /&gt;
* [[Projects:2016s1-109 Development, Characterisation and Modelling of Renewable Energy-Based Microgrid]]&lt;br /&gt;
* [[Projects:2016s1-122 A Complete Model for a Synchronous Machine]]&lt;br /&gt;
* [[Projects:2016s1-126 A Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2016s1-128 Evaluating Programming Languages for Educational Robotics Kits]]&lt;br /&gt;
* [[Projects:2016s1-132 RF Transceiver Design for a Portable Radar]]&lt;br /&gt;
* [[Projects:2016s1-145 Indoor localisation using Bluetooth LE for Event Advertising]]&lt;br /&gt;
* [[Projects:2016s1-146 Antonomous Robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2016s1-160a Cyber Security - IoT and CAN Bus Security]]&lt;br /&gt;
* [[Projects:2016s1-160b Cyber Security - e-Government and Network Security]]&lt;br /&gt;
* [[Projects:2016s1-160c Cyber Security - Personal Networks and Devices]]&lt;br /&gt;
* [[Projects:2016s1-171 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2016s1-172 Computer Aided Testing of Batteries for Energy Storage Applications]]&lt;br /&gt;
* [[Projects:2016s1-187 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2016s1-196 Wireless Power Transfer]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2016s1-120 Attacking Cancer with Signal Processing]]&lt;br /&gt;
* [[Projects:2016s1-121 Measurement of  Transformer Parameters]]&lt;br /&gt;
* [[Projects:2016s1-131 ECG Enhancement with Advanced Signal Processing]]&lt;br /&gt;
* [[Projects:2016s1-141 Cracking the Voynich manuscript code]]&lt;br /&gt;
* [[Projects:2016s1-142 Code Cracking: Who Murdered The Somerton Man?]]&lt;br /&gt;
* [[Projects:2016s1-175 Environment Exploring Based on Inertia Measurement Unit and Computer Vision ]]&lt;br /&gt;
* [[Projects:2016s1-180 New Computational Methods for the Super Smart Grid]]&lt;br /&gt;
* [[Projects:2016s1-181 Solar Aquaponics]]&lt;br /&gt;
* [[Projects:2016s1-190 Inductive Power Transfer ]]&lt;br /&gt;
* [[Projects:2016s1-197 Sound Triangulation for Invisible Keyboards]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2017 ====&lt;br /&gt;
* Adelaide University, Engineering and Maths Building, EM205&lt;br /&gt;
* Tuesday 6 June 2017&lt;br /&gt;
&lt;br /&gt;
[[File:MidyearExpo_2017.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2016s2-216 GPS Receiver Location and Atmosphere Characterisation]]&lt;br /&gt;
* [[Projects:2016s2-235 Personal Radar for Safer Walk &amp;amp; Text]]&lt;br /&gt;
* [[Projects:2016s2-236 Electronic Controller for Spatial Microwave Modulator]]&lt;br /&gt;
* [[Projects:2016s2-255 Solar Aquaponics]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2016s2-215 Bhutan Power System Islanding and Special Protection Devices]]&lt;br /&gt;
* [[Projects:2016s2-220 Path Planning and Collision Avoidance for Aduino Robots]]&lt;br /&gt;
* [[Projects:2016s2-230 New Materials for Wearable Antennas in Flexible Electronics]]&lt;br /&gt;
* [[Projects:2016s2-240 Electromyographic Signal Processing for Controlling an Exoskeleton]]&lt;br /&gt;
* [[Projects:2016s2-245 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2016s2-246 Feral Cat Detector]]&lt;br /&gt;
* [[Projects:2016s2-250 On-Line Mains Power Cable Time Domain Reflectometry]]&lt;br /&gt;
&lt;br /&gt;
=== 2015 ===&lt;br /&gt;
==== Ingenuity 2015 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 26-27 October 2015&lt;br /&gt;
[[File:Ingenuity_2015.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2015s1-01 LaunchBox]]&lt;br /&gt;
* [[Projects:2015s1-04 Detecting Cyber Malicious Command-Control (C2) Network Traffic Communications]]&lt;br /&gt;
* [[Projects:2015s1-05 Multi-Profile Parallel Transcriber]]&lt;br /&gt;
* [[Projects:2015s1-06 Performance Evaluation of KALDI Open Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2015s1-07 Remote AVR Control for Embedded Generation]]&lt;br /&gt;
* [[Projects:2015s1-08 Developing a Home Energy Management System]]&lt;br /&gt;
* [[Projects:2015s1-10 Lagrangian Modelling of Synchronous Machines]]&lt;br /&gt;
* [[Projects:2015s1-11 Estimation of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2015s1-12 An Open-Source Local Area Network (LAN)]]&lt;br /&gt;
* [[Projects:2015s1-13 A One-Time Pad Generator]]&lt;br /&gt;
* [[Projects:2015s1-15 AI for a Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2015s1-16 System Engineering a Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2015s1-17 Analysis of Electrical and Software Design in the Effectiveness of Robotics STEM Outreach Programs]]&lt;br /&gt;
* [[Projects:2015s1-18 ARM Processor For Digital Systems Practicals]]&lt;br /&gt;
* [[Projects:2015s1-21 Inexpensive Portable Radar System]]&lt;br /&gt;
* [[Projects:2015s1-25 Indoor localisation using Bluetooth LE for event advertising]]&lt;br /&gt;
* [[Projects:2015s1-26 Autonomous robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2015s1-28 Wireless Rotation Detector]]&lt;br /&gt;
* [[Projects:2015s1-31 Cracking the Voynich manuscript code]]&lt;br /&gt;
* [[Projects:2015s1-32 Code Cracking: Who Murdered The Somerton Man?]]&lt;br /&gt;
* [[Projects:2015s1-36 Heartbeat Perception App]]&lt;br /&gt;
* [[Projects:2015s1-40 Flexible ad-hoc Network A:  Physical Layer]]&lt;br /&gt;
* [[Projects:2015s1-42 Rule-based AI Agent Development: Tic Tac Toe]]&lt;br /&gt;
* [[Projects:2015s1-46 Channel Measurements for Search &amp;amp; Rescue]]&lt;br /&gt;
* [[Projects:2015s1-45 Analysis and Visualisation of Packet Data for Cyber-Security Purposes]]&lt;br /&gt;
* [[Projects:2015s1-50 Tracking, Herding and Routing by Autonomous Smart Cars (UG)]]&lt;br /&gt;
* [[Projects:2015s1-56 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2015s1-61 Computer Aided Measurement and Analysis of Equal Efficiency Characteristics of Electrical Machines]]&lt;br /&gt;
* [[Projects:2015s1-70 Design of Power Line Communication Coupler for Single-Wire Earth Return Lines]]&lt;br /&gt;
* [[Projects:2015s1-73 Improved Electric Micro-Bus Design for Nepal]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2015s1-09 Development of a Deadman Switch for Tramline Traction Simulation Tool]]&lt;br /&gt;
* [[Projects:2015s1-22 Automatic Sorter using Computer Vision]]&lt;br /&gt;
* [[Projects:2015s1-26 Autonomous robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2015s1-35 Brain computer interface control for biomedical applications]]&lt;br /&gt;
* [[Projects:2015s1-50 Tracking, Herding and Routing by Autonomous Smart Cars (PG)]]&lt;br /&gt;
* [[Projects:2015s1-58 Design And Development Of A New Respiratory Monitor For Detection Of Sleep Apnoea]]&lt;br /&gt;
* [[Projects:2015s1-71 Inductive Power Transfers]]&lt;br /&gt;
* [[Projects:2015s1-72 Wind Turbine Control Simulator]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2016 ====&lt;br /&gt;
* Adelaide University, Engineering and Maths Building, EM205&lt;br /&gt;
* 31 May 2016&lt;br /&gt;
[[File:MidyearExpo_2016.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2015s2-201 Development of Energy Storage Knowledge Bank]]&lt;br /&gt;
* [[Projects:2015s2-210 Automated Classification of Heartbeats in Long-Term ECG]]&lt;br /&gt;
* [[Projects:2015s2-211 Health Visa]]&lt;br /&gt;
* [[Projects:2015s2-212 TV Control and Monitoring]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2015s2-202 An On-line PLC frequency Impedance Analyser]]&lt;br /&gt;
* [[Projects:2015s2-203 Analysis of Heart Sound Signals using the Wavelet Transform]]&lt;br /&gt;
* [[Projects:2015s2-204 Unbalanced Operation of Permanent Magnet Generators]]&lt;br /&gt;
* [[Projects:2015s2-206 Solar Aquaponics]]&lt;br /&gt;
* [[Projects:2015s2-207 Tracking, Herding and Routing by Autonomous Smart Cars]]&lt;br /&gt;
* [[Projects:2015s2-209 Automated Classification of Brain Activity During Sleep]]&lt;br /&gt;
* [[Projects:2015s2-216 Feral Cat Detector]]&lt;br /&gt;
&lt;br /&gt;
=== 2014 ===&lt;br /&gt;
==== Ingenuity 2014 ====&lt;br /&gt;
Ingenuity 2014 was held at the Adelaide Convention Centre on Thursday 30 October. It showcased 40 of the school&amp;#039;s final-year honours and masters projects nearing completion.&lt;br /&gt;
[[File:Ingenuity_2014_group_shot.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Final Year Projects ====&lt;br /&gt;
* [[Projects:2014S1-01 Development of Fully Automated Educational and Training Tool for Wind and Solar Energy using National Instruments’ ELVIS Based System]]&lt;br /&gt;
* [[Projects:2014S1-04 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2014S1-06 Bell Ringing Robot: Hawkear]]&lt;br /&gt;
* [[Projects:2014S1-10 Development of Machine Learning Techniques for Analysing Network Communications]]&lt;br /&gt;
* [[Projects:2014S1-11 Wireless Rotation Detector for Sport Equipment]]&lt;br /&gt;
* [[Projects:2014S1-12 Exploring RF Energy Harvesting for Wearable Sensors]]&lt;br /&gt;
* [[Projects:2014S1-13 S-band Communication for Small Satellite]]&lt;br /&gt;
* [[Projects:2014S1-15 Inexpensive Portable Radar System]]&lt;br /&gt;
* [[Projects:2014S1-16 Automatic Sorter using Computer Vision]]&lt;br /&gt;
* [[Projects:2014S1-21 Design And Development of a New Respiratory Monitor for Detection of Sleep Apnoea]]&lt;br /&gt;
* [[Projects:2014S1-23 Real-Time Adaptive Filters]]&lt;br /&gt;
* [[Projects:2014S1-24 AI Agent Development for an Autonomous Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2014S1-26 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2014S1-29 Measurement and Estimation of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2014S1-33 Software-Defined Radio for VLF Transmission]]&lt;br /&gt;
* [[Projects:2014S1-35 Human Activity Recognition to Support Independent Living]]&lt;br /&gt;
* [[Projects:2014S1-36 What are Social Appliances? Building your Tomorrow Today…]]&lt;br /&gt;
* [[Projects:2014S1-37 Wireless Monitoring and Control of Wine Fermentation Process]]&lt;br /&gt;
* [[Projects:2014S1-42 Current-Voltage Tracer Experiment]]&lt;br /&gt;
* [[Projects:2014S1-44 Cracking the Voynich Manuscript Code]]&lt;br /&gt;
* [[Projects:2014S1-45 Is Secure Communication Possible?]]&lt;br /&gt;
* [[Projects:2014S1-47 Robotic Arm for Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2014S1-48 FPGA-based Software GPS Receiver]]&lt;br /&gt;
* [[Projects:2014S1-49 Can Solar PV cells be used as Telecommunications Receivers?]]&lt;br /&gt;
* [[Projects:2014S1-50 Exploiting HF Emitters of Opportunity for OTH Radar]]&lt;br /&gt;
* [[Projects:2014S1-51 Heart Signal Processing Software for Evaluating Pacemaker Effectiveness]]&lt;br /&gt;
* [[Projects:2014S1-53 Object Profiling for Custom Wheelchair Seating and Pressure Care]]&lt;br /&gt;
* [[Projects:2014S1-56 Inter-Satellite Links for CubeSats]]&lt;br /&gt;
* [[Projects:2014S1-57 Autonomous Vehicle Technologies]]&lt;br /&gt;
* [[Projects:2014s2-74 Antonomous Robotics using NI MyRIO|Projects:2014s2-74 Autonomous Robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2014s2-76 Teletraffic Modelling and Analysis of the New Britannia Roundabout]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2014S1-02 Network Optimisation in Distributed Generation Systems]]&lt;br /&gt;
* [[Projects:2014S1-03 Design of a Mobile Energy Storage System for Grid Integration]]&lt;br /&gt;
* [[Projects:2014S1-14 Wearable RFID Antennas]]&lt;br /&gt;
* [[Projects:2014S1-19 Analysis Of Heart Sound Signals using the Wavelet Transform]]&lt;br /&gt;
* [[Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor]]&lt;br /&gt;
* [[Projects:2014S1-38 Semi-Passive Wearable Sensors]]&lt;br /&gt;
* [[Projects:2014S1-39 Tell your Robot where to go with RFID (Improving Autonomous Navigation)]]&lt;br /&gt;
* [[Projects:2014S1-41 Inductive Power Transfer]]&lt;br /&gt;
* [[Projects:2014S1-43 Inverter Drive Experiment]]&lt;br /&gt;
* [[Projects:2014S1-54 Engineering of a CubeSat Power System]]&lt;br /&gt;
* [[Projects:2014s2-71 Calorimetry and Modelling of Lithium-Ion Chemical Batteries]]&lt;br /&gt;
* [[Projects:2014s2-72 Accurate Measurement and Modelling of a Switched-Mode Power Supply]]&lt;br /&gt;
* [[Projects:2014s2-75 Formation Control of Two Autonomous Smart Cars]]&lt;br /&gt;
* [[Projects:2014s2-78 Investigation the Design and Development of Miniature Specific Gravity Sensor|Projects:2014s2-78 Investigating the Design and Development of a Miniature Specific Gravity Sensor]]&lt;br /&gt;
* [[Projects:2014s2-79 FPGA-base Hardware Iimplementation of Machine-Learning Methods for Handwriting and Speech Recognition|Projects:2014s2-79 FPGA-based Hardware Implementation of Machine-Learning Methods for Handwriting and Speech Recognition]]&lt;br /&gt;
* [[Projects:2014s2-80 Swinging Crane Project]]&lt;br /&gt;
* [[Projects:2014s2-82 Grid Integration of Solar PV Embedded Generation]]&lt;br /&gt;
* [[Projects:2014s2-83 A Testing and Characterising Device for batteries of various chemistries]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[:Category:Projects]]&lt;br /&gt;
&lt;br /&gt;
== Resources ==&lt;br /&gt;
=== Project Resources ===&lt;br /&gt;
This section contains resources that may be useful to project groups, such as manuals for commonly used software tools and links to useful suppliers.&lt;br /&gt;
* [[Resources:General Advice]]&lt;br /&gt;
* School links&lt;br /&gt;
** [https://app.smartsheet.com/b/form/bc56c18eb51141649ec3819d5e0d712a Material purchase requests]&lt;br /&gt;
** [https://app.smartsheet.com/b/form/e0e915c66f7445f183aabf34843fac5d Technical support requests]&lt;br /&gt;
** [https://www.eleceng.adelaide.edu.au/projlab/index.php Projects Lab booking system]&lt;br /&gt;
** [https://www.eleceng.adelaide.edu.au/facilitiesbooking/index.php EEE meeting room booking system (only for meetings with supervisors)]&lt;br /&gt;
** [https://eleceng.adelaide.edu.au/intranet/undergraduate/infrastructure-request/ Project bench request form]&lt;br /&gt;
** [https://www.adelaide.edu.au/technology/yourservices/software/personal/ University Licensed software]&lt;br /&gt;
** [https://universityofadelaide.box.com/v/FYPResources FYP resource files]&lt;br /&gt;
* [[Resources:Professional skills]]&lt;br /&gt;
* [[Resources:Project management tools]]&lt;br /&gt;
* [[Resources:Wiki writing resources]]&lt;br /&gt;
* [[Resources:Presentation resources]]&lt;br /&gt;
* [[Resources:Writing resources]]&lt;br /&gt;
* [[Resources:Meeting resources]]&lt;br /&gt;
* [[Resources:Research resources]]&lt;br /&gt;
* [[Resources:Programming language resources]]&lt;br /&gt;
* [[Resources:Electronics suppliers]]&lt;br /&gt;
&lt;br /&gt;
=== School Resources ===&lt;br /&gt;
The school operates a number of teaching laboratories.&lt;br /&gt;
* [[Projects Lab]]&lt;br /&gt;
* [[Electronics Teaching Labs]]&lt;br /&gt;
&lt;br /&gt;
=== Store ===&lt;br /&gt;
&lt;br /&gt;
* [[Electronics Store (EM316)]]&lt;/div&gt;</summary>
		<author><name>A1687420</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Main_Page&amp;diff=12233</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Main_Page&amp;diff=12233"/>
		<updated>2018-10-21T14:45:59Z</updated>

		<summary type="html">&lt;p&gt;A1687420: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Projects ==&lt;br /&gt;
=== 2018 ===&lt;br /&gt;
==== Ingenuity 2018 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 22-23 October 2018&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2018s1-100 Automated Person Identification with Multiple Sensors]]&lt;br /&gt;
* [[Projects:2018s1-101 Classification of Network Traffic Flows using Deep and Transfer Learning]]&lt;br /&gt;
* [[Projects:2018s1-102 HF Radio Automated Link Establishment (ALE) Model]]&lt;br /&gt;
* [[Projects:2018s1-103 Improving Usability and User Interaction with KALDI Open-Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2018s1-105 Cyber security - Car Hacking]]&lt;br /&gt;
* [[Projects:2018s1-107 Evolution of Spiking Neural Networks for UAV Control]]&lt;br /&gt;
* [[Projects:2018s1-108 Machine Learning Multi-Spectral Simulation]]&lt;br /&gt;
* [[Projects:2018s1-109 High-Resolution Change Prediction using Sparse Spatio-temporal Data]]&lt;br /&gt;
* [[Projects:2018s1-110 Submarine Optronics System: Contact Detection]]&lt;br /&gt;
* [[Projects:2018s1-111 IoT Connectivity Investigation]]&lt;br /&gt;
* [[Projects:2018s1-112 Automate the 3D Design and Manufacture of Electrical Control Panels using Advanced Digital Technologies]]&lt;br /&gt;
* [[Projects:2018s1-113 AVR Test Rig]]&lt;br /&gt;
* [[Projects:2018s1-115 Passive Radar in the High Frequency Band using Civil Transmissions]]&lt;br /&gt;
* [[Projects:2018s1-116 Data Analytics]]&lt;br /&gt;
* [[Projects:2018s1-119 Design of Calibration Platform for Medical Sensing]]&lt;br /&gt;
* [[Projects:2018s1-121 In-Memory Semantic Processing Using Hyperdimensional Computing]]&lt;br /&gt;
* [[Projects:2018s1-122 NI Autonomous Robotics Competition]]&lt;br /&gt;
* [[Projects:2018s1-128 Software Tool for Fitting Statistical Models to Sea Clutter Data]]&lt;br /&gt;
* [[Projects:2018s1-135 A Low Cost Impedance and Transfer Function Analyser Part 2]]&lt;br /&gt;
* [[Projects:2018s1-136UG Rate of Change of Mains Frequency Detection]]&lt;br /&gt;
* [[Projects:2018s1-140 Energy Storage Requirements for the SA Grid]]&lt;br /&gt;
* [[Projects:2018s1-141 CSI Adelaide:  Who killed the Somerton Man?]]&lt;br /&gt;
* [[Projects:2018s1-142 Modelling the Dynamics of Cryptocurrency Market]]&lt;br /&gt;
* [[Projects:2018s1-145 Simplified Indoor UAV Operations]]&lt;br /&gt;
* [[Projects:2018s1-151 Raspberry Pi as a Core Device for Efficient Biological Field Survey Data Collection]]&lt;br /&gt;
* [[Projects:2018s1-155 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2018s1-157 Designing Airway Pressure Control Technology for Sleep Apnea Treatment]]&lt;br /&gt;
* [[Projects:2018s1-160 UAV Platform for Cognitive AI Agent]]&lt;br /&gt;
* [[Projects:2018s1-164 Private but Public on the Blockchain]]&lt;br /&gt;
* [[Projects:2018s1-165 Dual IP Stack Exfiltration - Methods and Defences]]&lt;br /&gt;
* [[Projects:2018s1-167 Security Assessment of Watchem and Moochies Watches]]&lt;br /&gt;
* [[Projects:2018s1-168 Penetration Testing of the SpaceTalk Tracking Watch]]&lt;br /&gt;
* [[Projects:2018s1-169 A Better Security Framework for Wearable Devices]]&lt;br /&gt;
* [[Projects:2018s1-175 Split-ring resonators for measuring spatially-distributed complex permittivity at microwave frequencies]]&lt;br /&gt;
* [[Projects:2018s1-181 BMW Autonomous Vehicle]]&lt;br /&gt;
* [[Projects:2018s1-182 Inertia Characterisation and Modelling in a Renewable Energy and Battery Based Microgrid]]&lt;br /&gt;
* [[Projects:2018s1-191 Quasi-Linear Circuit Theory]]&lt;br /&gt;
* [[Projects:2018s1-192 Karplus-Strong Synthesis of Sound]]&lt;br /&gt;
* [[Projects:2018s1-195 Novel Flexible Materials for Wearable Antennas]]&lt;br /&gt;
* [[Projects:2018s1-196 Concealed Wearable Antennas]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2018s1-118 Design of Wireless Sensors for Sleep Apnea Detection]]&lt;br /&gt;
* [[Projects:2018s1-136PG Rate of Change of Mains Frequency Detection]]&lt;br /&gt;
* [[Projects:2018s1-170 Intelligent Parking Control for Autonomous Ground Vehicles]]&lt;br /&gt;
* [[Projects:2018s1-177 Radio astronomy with software-defined radio]]&lt;br /&gt;
* [[Projects:2018s1-178 Creating microwave antennas with 3D printing]]&lt;br /&gt;
* [[Projects:2018s1-180 Development and Control of a Standalone Power Source for Residential Dwellings and Small Businesses]]&lt;br /&gt;
* [[Projects:2018s1-186 Calculation and Optimisation of Energy Usage of Electric Vehicles]]&lt;br /&gt;
* [[Projects:2018s1-190 Dynamical Modelling of Synchronous Machines]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2019 ====&lt;br /&gt;
* EM205&lt;br /&gt;
* June 2019&lt;br /&gt;
* 11:30-13:00&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2018s2-235UG PMU Test generator]]&lt;br /&gt;
* [[Projects:2018s2-226 Software Library for Inverse Synthetic Aperture Radar]]&lt;br /&gt;
* [[Projects:2018s2-279 High-Gain Antennas for Terahertz Communications]]&lt;br /&gt;
* [[Projects:2018s2-280 Assessment of Port Pirie for Higher Renewable PV Energy Integration]]&lt;br /&gt;
* [[Projects:2018s2-293 Detailed Analysis of Ferromagnetism in the Periodic Domain]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2018s2-235PG PMU Test generator]]&lt;br /&gt;
* [[Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control]]&lt;br /&gt;
* [[Projects:2018s2-285 SA Power System Modelling and Analysis]]&lt;br /&gt;
* [[Projects:2018s2-297 Wireless Rotation Detector]]&lt;br /&gt;
&lt;br /&gt;
=== 2017 ===&lt;br /&gt;
==== Ingenuity 2017 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 30-31 October 2017&lt;br /&gt;
* Prizes&lt;br /&gt;
** Best EEE Wiki: Classifying Network Traffic Flows with Deep-Learning by Kyle Thornton, Clinton Page, Daniel Smit&lt;br /&gt;
** Best EEE Exhibit: Face Recognition using 3D Data by Orbille Piol, Michael Sadler, Jesse Willsmore&lt;br /&gt;
[[File:Ingenuity 2017.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2017s1-100 Face Recognition using 3D Data]]&lt;br /&gt;
* [[Projects:2017s1-101 Classifying Network Traffic Flows with Deep-Learning]]&lt;br /&gt;
* [[Projects:2017s1-102 HF Radio Automated Link Establishment (ALE) Model]]&lt;br /&gt;
* [[Projects:2017s1-103 Improving Usability and User Interaction with KALDI Open- Source Speech Recogniser|Projects:2017s1-103 Improving Usability and User Interaction with KALDI Open-Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2017s1-105 “CARLOS TC” Tow Bar Testing Facility]]&lt;br /&gt;
* [[Projects:2017s1-106 Inertia Characterisation and Modelling in a Renewable Energy-based Microgrid]]&lt;br /&gt;
* [[Projects:2017s1-107 Protection of a Convoy of Ships Under Attack]]&lt;br /&gt;
* [[Projects:2017s1-108 Stability and Control of 3-D Formations]]&lt;br /&gt;
* [[Projects:2017s1-109 Dynamically Forming Formations ]]&lt;br /&gt;
* [[Projects:2017s1-110 ‘Real-Time’ FPGA Based Object Recognition &amp;amp; Threat Detection in Hardware]]&lt;br /&gt;
* [[Projects:2017s1-111 OTHR Alternative Computing Architecture]]&lt;br /&gt;
* [[Projects:2017s1-120 Hardware Realisation of the Unum 2.0 Number Format]]&lt;br /&gt;
* [[Projects:2017s1-121 Learning Procedural Knowledge using Random Forests]]&lt;br /&gt;
* [[Projects:2017s1-125 Drone Imaging and Classification using Radar]]&lt;br /&gt;
* [[Projects:2017s1-127 Sound Trilateration for Positioning of the Sound Source]]&lt;br /&gt;
* [[Projects:2017s1-135 A Low Cost Impedance Analyser]]&lt;br /&gt;
* [[Projects:2017s1-140 Energy Storage Requirements for the SA Grid]]&lt;br /&gt;
* [[Projects:2017s1-155 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2017s1-156 Interrogating a Glucose Monitor]]&lt;br /&gt;
* [[Projects:2017s1-157 Automated Classification of Brain Activity during Sleep]]&lt;br /&gt;
* [[Projects:2017s1-160 Cooperating Autonomous Vehicles]]&lt;br /&gt;
* [[Projects:2017s1-165 Forensic Investigation of Fitness Devices]]&lt;br /&gt;
* [[Projects:2017s1-167a Applications of Blockchain to Equity Fund Raising]] &lt;br /&gt;
* [[Projects:2017s1-167b Real Time Video Steam Substitution|Projects:2017s1-167b Real-Time Video Stream Substitution]]&lt;br /&gt;
* [[Projects:2017s1-167c Smart Grid Security]]&lt;br /&gt;
* [[Projects:2017s1-167d Twitterbots]]&lt;br /&gt;
* [[Projects:2017s1-175 Split-Ring Resonators for Measuring Spatially-Distributed Complex Permittivity at Microwave Frequencies]]&lt;br /&gt;
* [[Projects:2017s1-176 Smart Mirror with Raspberry Pi]]&lt;br /&gt;
* [[Projects:2017s1-177 Radio Astronomy with Software-Defined Radio]]&lt;br /&gt;
* [[Projects:2017s1-180 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2017s1-181 BMW Autonomous Vehicle Project Camera Based Lane Detection in a Road Vehicle for Autonomous Driving]]&lt;br /&gt;
* [[Projects:2017s1-182 BMW Autonomous Vehicle Project Development of Vehicle Control Algorithm]]&lt;br /&gt;
* [[Projects:2017s1-185 BMW Autonomous Vehicle Project Implementation of a Steering Angle Controller on a Lab Test Bench]]&lt;br /&gt;
* [[Projects:2017s1-186 Playing Music Through a Tesla Coil]]&lt;br /&gt;
* [[Projects:2017s1-190 Modelling and Validation for Synchronous Generators]]&lt;br /&gt;
* [[Projects:2017s1-191 Power Electronics for Inductive Power Transfer (IPT)]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2017s1-122 On-Chip Learning]]&lt;br /&gt;
* [[Projects:2017s1-150 Statistical Natural Language Processing]]&lt;br /&gt;
* [[Projects:2017s1-158 Electromyographic Signal Processing for Controlling an Exoskeleton]]&lt;br /&gt;
* [[Projects:2017s1-159 Detecting Penguin’s Heart Sounds]]&lt;br /&gt;
* [[Projects:2017s1-166 Development of TV Ad Blocker]]&lt;br /&gt;
* [[Projects:2017s1-170 Formation Control and Obstacle Avoidance for Heterogeneous Multi-Agent Systems (Unmanned Aerial Vehicles and Robots)]]&lt;br /&gt;
* [[Projects:2017s1-183 BMW Autonomous Vehicle Project Development of a sensor fusion algorithm to determine the current vehicle position in a local tangential plane ]]&lt;br /&gt;
* [[Projects:2017s1-184 BMW Autonomous Vehicle Project Implement the Longitudinal Control Algorithm of the Vehicle]]&lt;br /&gt;
* [[Projects:2017s1-195 Solar Aquaponics ]]&lt;br /&gt;
* [[Projects:2017s1-196 New Computational Methods for the Super Smart Grid]]&lt;br /&gt;
* [[Projects:2017s2-245 Novel textile antennas for wearable wireless communications]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2018 ====&lt;br /&gt;
* EM205&lt;br /&gt;
* 5 June 2018&lt;br /&gt;
* 11:30-13:00&lt;br /&gt;
&lt;br /&gt;
[[File:MidyearExpo_2018.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2017s2-201 Detection and Classification in an Indoor Environment Using WiFi]]&lt;br /&gt;
* [[Projects:2017s2-205 Multi-Profile Parallel Speech-to Text Transcriber]]&lt;br /&gt;
* [[Projects:2017s2-220 Alternative Approaches to AI for the Soccer Table]]&lt;br /&gt;
* [[Projects:2017s2-225 Digital Microphone Array using MEMS Microphones]]&lt;br /&gt;
* [[Projects:2017s2-275 Creating Microwave Antennas with 3D Printing]]&lt;br /&gt;
* [[Projects:2017s2-290 The Magnetorquer]]&lt;br /&gt;
* [[Projects:2017s2-291 Measurement of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2017s2-292 Wide-Area Sun Sensor]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2017s2-215 Wireless Power Transfer]]&lt;br /&gt;
* [[Projects:2017s2-235 An On-line 10 kHz to 1 MHz Impedance Analyser]]&lt;br /&gt;
* [[Projects:2017s2-245 Novel Textile Antennas for Wearable Wireless Communications]]&lt;br /&gt;
* [[Projects:2017s2-270 Reconfiguration on Multi-Agent Systems (Robots Systems)]]&lt;br /&gt;
* [[Projects:2017s2-285 Short-term Solutions for the South Australian Electric Power System]]&lt;br /&gt;
* [[Projects:2017s2-295 Feral Animal Detection using IR Thermal Imagery]]&lt;br /&gt;
&lt;br /&gt;
=== 2016 ===&lt;br /&gt;
==== Ingenuity 2016 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 27-28 October 2016&lt;br /&gt;
&lt;br /&gt;
[[File:Ingenuity_2016.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2016s1-101 Predicting Power Outages from Weather Patterns]]&lt;br /&gt;
* [[Projects:2016s1-102 Classifying Internet Applications and Detecting Malicious Traffic from Network Communications]]&lt;br /&gt;
* [[Projects:2016s1-105 Non-Contact Photoplethysmogram]]&lt;br /&gt;
* [[Projects:2016s1-106 Airborne Antenna Measurement Platform]]&lt;br /&gt;
* [[Projects:2016s1-109 Development, Characterisation and Modelling of Renewable Energy-Based Microgrid]]&lt;br /&gt;
* [[Projects:2016s1-122 A Complete Model for a Synchronous Machine]]&lt;br /&gt;
* [[Projects:2016s1-126 A Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2016s1-128 Evaluating Programming Languages for Educational Robotics Kits]]&lt;br /&gt;
* [[Projects:2016s1-132 RF Transceiver Design for a Portable Radar]]&lt;br /&gt;
* [[Projects:2016s1-145 Indoor localisation using Bluetooth LE for Event Advertising]]&lt;br /&gt;
* [[Projects:2016s1-146 Antonomous Robotics using NI MyRIO|Projects:2016s1-146 Autonomous Robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2016s1-160a Cyber Security - IoT and CAN Bus Security]]&lt;br /&gt;
* [[Projects:2016s1-160b Cyber Security - e-Government and Network Security]]&lt;br /&gt;
* [[Projects:2016s1-160c Cyber Security - Personal Networks and Devices]]&lt;br /&gt;
* [[Projects:2016s1-171 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2016s1-172 Computer Aided Testing of Batteries for Energy Storage Applications]]&lt;br /&gt;
* [[Projects:2016s1-187 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2016s1-196 Wireless Power Transfer]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2016s1-120 Attacking Cancer with Signal Processing]]&lt;br /&gt;
* [[Projects:2016s1-121 Measurement of  Transformer Parameters]]&lt;br /&gt;
* [[Projects:2016s1-131 ECG Enhancement with Advanced Signal Processing]]&lt;br /&gt;
* [[Projects:2016s1-141 Cracking the Voynich manuscript code]]&lt;br /&gt;
* [[Projects:2016s1-142 Code Cracking: Who Murdered The Somerton Man?]]&lt;br /&gt;
* [[Projects:2016s1-175 Environment Exploring Based on Inertia Measurement Unit and Computer Vision ]]&lt;br /&gt;
* [[Projects:2016s1-180 New Computational Methods for the Super Smart Grid]]&lt;br /&gt;
* [[Projects:2016s1-181 Solar Aquaponics]]&lt;br /&gt;
* [[Projects:2016s1-190 Inductive Power Transfer ]]&lt;br /&gt;
* [[Projects:2016s1-197 Sound Triangulation for Invisible Keyboards]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2017 ====&lt;br /&gt;
* Adelaide University, Engineering and Maths Building, EM205&lt;br /&gt;
* Tuesday 6 June 2017&lt;br /&gt;
&lt;br /&gt;
[[File:MidyearExpo_2017.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2016s2-216 GPS Receiver Location and Atmosphere Characterisation]]&lt;br /&gt;
* [[Projects:2016s2-235 Personal Radar for Safer Walk &amp;amp; Text]]&lt;br /&gt;
* [[Projects:2016s2-236 Electronic Controller for Spatial Microwave Modulator]]&lt;br /&gt;
* [[Projects:2016s2-255 Solar Aquaponics]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2016s2-215 Bhutan Power System Islanding and Special Protection Devices]]&lt;br /&gt;
* [[Projects:2016s2-220 Path Planning and Collision Avoidance for Aduino Robots|Projects:2016s2-220 Path Planning and Collision Avoidance for Arduino Robots]]&lt;br /&gt;
* [[Projects:2016s2-230 New Materials for Wearable Antennas in Flexible Electronics]]&lt;br /&gt;
* [[Projects:2016s2-240 Electromyographic Signal Processing for Controlling an Exoskeleton]]&lt;br /&gt;
* [[Projects:2016s2-245 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2016s2-246 Feral Cat Detector]]&lt;br /&gt;
* [[Projects:2016s2-250 On-Line Mains Power Cable Time Domain Reflectometry]]&lt;br /&gt;
&lt;br /&gt;
=== 2015 ===&lt;br /&gt;
==== Ingenuity 2015 ====&lt;br /&gt;
* Adelaide Convention Centre&lt;br /&gt;
* 26-27 October 2015&lt;br /&gt;
[[File:Ingenuity_2015.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects ====&lt;br /&gt;
* [[Projects:2015s1-01 LaunchBox]]&lt;br /&gt;
* [[Projects:2015s1-04 Detecting Cyber Malicious Command-Control (C2) Network Traffic Communications]]&lt;br /&gt;
* [[Projects:2015s1-05 Multi-Profile Parallel Transcriber]]&lt;br /&gt;
* [[Projects:2015s1-06 Performance Evaluation of KALDI Open Source Speech Recogniser]]&lt;br /&gt;
* [[Projects:2015s1-07 Remote AVR Control for Embedded Generation]]&lt;br /&gt;
* [[Projects:2015s1-08 Developing a Home Energy Management System]]&lt;br /&gt;
* [[Projects:2015s1-10 Lagrangian Modelling of Synchronous Machines]]&lt;br /&gt;
* [[Projects:2015s1-11 Estimation of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2015s1-12 An Open-Source Local Area Network (LAN)]]&lt;br /&gt;
* [[Projects:2015s1-13 A One-Time Pad Generator]]&lt;br /&gt;
* [[Projects:2015s1-15 AI for a Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2015s1-16 System Engineering a Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2015s1-17 Analysis of Electrical and Software Design in the Effectiveness of Robotics STEM Outreach Programs]]&lt;br /&gt;
* [[Projects:2015s1-18 ARM Processor For Digital Systems Practicals]]&lt;br /&gt;
* [[Projects:2015s1-21 Inexpensive Portable Radar System]]&lt;br /&gt;
* [[Projects:2015s1-25 Indoor localisation using Bluetooth LE for event advertising]]&lt;br /&gt;
* [[Projects:2015s1-26 Autonomous robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2015s1-28 Wireless Rotation Detector]]&lt;br /&gt;
* [[Projects:2015s1-31 Cracking the Voynich manuscript code]]&lt;br /&gt;
* [[Projects:2015s1-32 Code Cracking: Who Murdered The Somerton Man?]]&lt;br /&gt;
* [[Projects:2015s1-36 Heartbeat Perception App]]&lt;br /&gt;
* [[Projects:2015s1-40 Flexible ad-hoc Network A:  Physical Layer]]&lt;br /&gt;
* [[Projects:2015s1-42 Rule-based AI Agent Development: Tic Tac Toe]]&lt;br /&gt;
* [[Projects:2015s1-46 Channel Measurements for Search &amp;amp; Rescue]]&lt;br /&gt;
* [[Projects:2015s1-45 Analysis and Visualisation of Packet Data for Cyber-Security Purposes]]&lt;br /&gt;
* [[Projects:2015s1-50 Tracking, Herding and Routing by Autonomous Smart Cars (UG)]]&lt;br /&gt;
* [[Projects:2015s1-56 RFID in a Light Bulb]]&lt;br /&gt;
* [[Projects:2015s1-61 Computer Aided Measurement and Analysis of Equal Efficiency Characteristics of Electrical Machines]]&lt;br /&gt;
* [[Projects:2015s1-70 Design of Power Line Communication Coupler for Single-Wire Earth Return Lines]]&lt;br /&gt;
* [[Projects:2015s1-73 Improved Electric Micro-Bus Design for Nepal]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2015s1-09 Development of a Deadman Switch for Tramline Traction Simulation Tool]]&lt;br /&gt;
* [[Projects:2015s1-22 Automatic Sorter using Computer Vision]]&lt;br /&gt;
* [[Projects:2015s1-26 Autonomous robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2015s1-35 Brain computer interface control for biomedical applications]]&lt;br /&gt;
* [[Projects:2015s1-50 Tracking, Herding and Routing by Autonomous Smart Cars (PG)]]&lt;br /&gt;
* [[Projects:2015s1-58 Design And Development Of A New Respiratory Monitor For Detection Of Sleep Apnoea]]&lt;br /&gt;
* [[Projects:2015s1-71 Inductive Power Transfers]]&lt;br /&gt;
* [[Projects:2015s1-72 Wind Turbine Control Simulator]]&lt;br /&gt;
&lt;br /&gt;
==== Mid-year Expo 2016 ====&lt;br /&gt;
* Adelaide University, Engineering and Maths Building, EM205&lt;br /&gt;
* 31 May 2016&lt;br /&gt;
[[File:MidyearExpo_2016.jpg|600px|center]]&lt;br /&gt;
&lt;br /&gt;
==== Honours Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2015s2-201 Development of Energy Storage Knowledge Bank]]&lt;br /&gt;
* [[Projects:2015s2-210 Automated Classification of Heartbeats in Long-Term ECG]]&lt;br /&gt;
* [[Projects:2015s2-211 Health Visa]]&lt;br /&gt;
* [[Projects:2015s2-212 TV Control and Monitoring]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects (mid-year) ====&lt;br /&gt;
* [[Projects:2015s2-202 An On-line PLC frequency Impedance Analyser]]&lt;br /&gt;
* [[Projects:2015s2-203 Analysis of Heart Sound Signals using the Wavelet Transform]]&lt;br /&gt;
* [[Projects:2015s2-204 Unbalanced Operation of Permanent Magnet Generators]]&lt;br /&gt;
* [[Projects:2015s2-206 Solar Aquaponics]]&lt;br /&gt;
* [[Projects:2015s2-207 Tracking, Herding and Routing by Autonomous Smart Cars]]&lt;br /&gt;
* [[Projects:2015s2-209 Automated Classification of Brain Activity During Sleep]]&lt;br /&gt;
* [[Projects:2015s2-216 Feral Cat Detector]]&lt;br /&gt;
&lt;br /&gt;
=== 2014 ===&lt;br /&gt;
==== Ingenuity 2014 ====&lt;br /&gt;
Ingenuity 2014 was held at the Adelaide Convention Centre on Thursday 30 October. It showcased 40 of the school&amp;#039;s final-year honours and masters projects nearing completion.&lt;br /&gt;
[[File:Ingenuity_2014_group_shot.jpg|1000px|center]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Final Year Projects ====&lt;br /&gt;
* [[Projects:2014S1-01 Development of Fully Automated Educational and Training Tool for Wind and Solar Energy using National Instruments’ ELVIS Based System]]&lt;br /&gt;
* [[Projects:2014S1-04 All Electric Vehicle for City Use]]&lt;br /&gt;
* [[Projects:2014S1-06 Bell Ringing Robot: Hawkear]]&lt;br /&gt;
* [[Projects:2014S1-10 Development of Machine Learning Techniques for Analysing Network Communications]]&lt;br /&gt;
* [[Projects:2014S1-11 Wireless Rotation Detector for Sport Equipment]]&lt;br /&gt;
* [[Projects:2014S1-12 Exploring RF Energy Harvesting for Wearable Sensors]]&lt;br /&gt;
* [[Projects:2014S1-13 S-band Communication for Small Satellite]]&lt;br /&gt;
* [[Projects:2014S1-15 Inexpensive Portable Radar System]]&lt;br /&gt;
* [[Projects:2014S1-16 Automatic Sorter using Computer Vision]]&lt;br /&gt;
* [[Projects:2014S1-21 Design And Development of a New Respiratory Monitor for Detection of Sleep Apnoea]]&lt;br /&gt;
* [[Projects:2014S1-23 Real-Time Adaptive Filters]]&lt;br /&gt;
* [[Projects:2014S1-24 AI Agent Development for an Autonomous Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2014S1-26 Brain Computer Interface Control for Biomedical Applications]]&lt;br /&gt;
* [[Projects:2014S1-29 Measurement and Estimation of Transformer Parameters]]&lt;br /&gt;
* [[Projects:2014S1-33 Software-Defined Radio for VLF Transmission]]&lt;br /&gt;
* [[Projects:2014S1-35 Human Activity Recognition to Support Independent Living]]&lt;br /&gt;
* [[Projects:2014S1-36 What are Social Appliances? Building your Tomorrow Today…]]&lt;br /&gt;
* [[Projects:2014S1-37 Wireless Monitoring and Control of Wine Fermentation Process]]&lt;br /&gt;
* [[Projects:2014S1-42 Current-Voltage Tracer Experiment]]&lt;br /&gt;
* [[Projects:2014S1-44 Cracking the Voynich Manuscript Code]]&lt;br /&gt;
* [[Projects:2014S1-45 Is Secure Communication Possible?]]&lt;br /&gt;
* [[Projects:2014S1-47 Robotic Arm for Trash Collecting Robot]]&lt;br /&gt;
* [[Projects:2014S1-48 FPGA-based Software GPS Receiver]]&lt;br /&gt;
* [[Projects:2014S1-49 Can Solar PV cells be used as Telecommunications Receivers?]]&lt;br /&gt;
* [[Projects:2014S1-50 Exploiting HF Emitters of Opportunity for OTH Radar]]&lt;br /&gt;
* [[Projects:2014S1-51 Heart Signal Processing Software for Evaluating Pacemaker Effectiveness]]&lt;br /&gt;
* [[Projects:2014S1-53 Object Profiling for Custom Wheelchair Seating and Pressure Care]]&lt;br /&gt;
* [[Projects:2014S1-56 Inter-Satellite Links for CubeSats]]&lt;br /&gt;
* [[Projects:2014S1-57 Autonomous Vehicle Technologies]]&lt;br /&gt;
* [[Projects:2014s2-74 Antonomous Robotics using NI MyRIO]]&lt;br /&gt;
* [[Projects:2014s2-76 Teletraffic Modelling and Analysis of the New Britannia Roundabout]]&lt;br /&gt;
&lt;br /&gt;
==== Masters Projects ====&lt;br /&gt;
* [[Projects:2014S1-02 Network Optimisation in Distributed Generation Systems]]&lt;br /&gt;
* [[Projects:2014S1-03 Design of a Mobile Energy Storage System for Grid Integration]]&lt;br /&gt;
* [[Projects:2014S1-14 Wearable RFID Antennas]]&lt;br /&gt;
* [[Projects:2014S1-19 Analysis Of Heart Sound Signals using the Wavelet Transform]]&lt;br /&gt;
* [[Projects:2014S1-31 Autonomous Robot Navigation using a Movable Kinect 3D sensor]]&lt;br /&gt;
* [[Projects:2014S1-38 Semi-Passive Wearable Sensors]]&lt;br /&gt;
* [[Projects:2014S1-39 Tell your Robot where to go with RFID (Improving Autonomous Navigation)]]&lt;br /&gt;
* [[Projects:2014S1-41 Inductive Power Transfer]]&lt;br /&gt;
* [[Projects:2014S1-43 Inverter Drive Experiment]]&lt;br /&gt;
* [[Projects:2014S1-54 Engineering of a CubeSat Power System]]&lt;br /&gt;
* [[Projects:2014s2-71 Calorimetry and Modelling of Lithium-Ion Chemical Batteries]]&lt;br /&gt;
* [[Projects:2014s2-72 Accurate Measurement and Modelling of a Switched-Mode Power Supply]]&lt;br /&gt;
* [[Projects:2014s2-75 Formation Control of Two Autonomous Smart Cars]]&lt;br /&gt;
* [[Projects:2014s2-78 Investigation the Design and Development of Miniature Specific Gravity Sensor]]&lt;br /&gt;
* [[Projects:2014s2-79 FPGA-base Hardware Iimplementation of Machine-Learning Methods for Handwriting and Speech Recognition]]&lt;br /&gt;
* [[Projects:2014s2-80 Swinging Crane Project]]&lt;br /&gt;
* [[Projects:2014s2-82 Grid Integration of Solar PV Embedded Generation]]&lt;br /&gt;
* [[Projects:2014s2-83 A Testing and Characterising Device for batteries of various chemistries]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[:Category:Projects]]&lt;br /&gt;
&lt;br /&gt;
== Resources ==&lt;br /&gt;
=== Project Resources ===&lt;br /&gt;
This contains resources that may be useful for some project groups. For example, manuals for commonly used software tools, or links to useful suppliers may be found here.&lt;br /&gt;
* [[Resources:General Advice]]&lt;br /&gt;
* School links&lt;br /&gt;
** [https://app.smartsheet.com/b/form/bc56c18eb51141649ec3819d5e0d712a Material purchase requests]&lt;br /&gt;
** [https://app.smartsheet.com/b/form/e0e915c66f7445f183aabf34843fac5d Technical support requests]&lt;br /&gt;
** [https://www.eleceng.adelaide.edu.au/projlab/index.php Projects Lab booking system]&lt;br /&gt;
** [https://www.eleceng.adelaide.edu.au/facilitiesbooking/index.php EEE meeting room booking system (only for meetings with supervisors)]&lt;br /&gt;
** [https://eleceng.adelaide.edu.au/intranet/undergraduate/infrastructure-request/ Project bench request form]&lt;br /&gt;
** [https://www.adelaide.edu.au/technology/yourservices/software/personal/ University Licensed software]&lt;br /&gt;
** [https://universityofadelaide.box.com/v/FYPResources FYP resource files]&lt;br /&gt;
* [[Resources:Professional skills]]&lt;br /&gt;
* [[Resources:Project management tools]]&lt;br /&gt;
* [[Resources:Wiki writing resources]]&lt;br /&gt;
* [[Resources:Presentation resources]]&lt;br /&gt;
* [[Resources:Writing resources]]&lt;br /&gt;
* [[Resources:Meeting resources]]&lt;br /&gt;
* [[Resources:Research resources]]&lt;br /&gt;
* [[Resources:Programming language resources]]&lt;br /&gt;
* [[Resources:Electronics suppliers]]&lt;br /&gt;
&lt;br /&gt;
=== School Resources ===&lt;br /&gt;
The school operates a number of teaching laboratories.&lt;br /&gt;
* [[Projects Lab]]&lt;br /&gt;
* [[Electronics Teaching Labs]]&lt;br /&gt;
&lt;br /&gt;
=== Store ===&lt;br /&gt;
&lt;br /&gt;
* [[Electronics Store (EM316)]]&lt;/div&gt;</summary>
		<author><name>A1687420</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=12232</id>
		<title>Projects:2018s1-110 Future Submarine Project</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=12232"/>
		<updated>2018-10-21T14:43:45Z</updated>

		<summary type="html">&lt;p&gt;A1687420: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Students&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Tharidu Maliduwa Arachchige &lt;br /&gt;
&lt;br /&gt;
Jacob Parker&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Dr. Danny Gibbins &lt;br /&gt;
&lt;br /&gt;
Igor Dzeba (SAAB)&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
This project involves the research into and development of a Contact Detection method to be used with a Submarine Optronics System. The aims of this project include:&lt;br /&gt;
 &lt;br /&gt;
# Detecting the presence of a ship at long range and, if possible, estimating its range and aspect.&lt;br /&gt;
# At closer range, detecting the ship or target, identifying/verifying the ship type, and determining its range and aspect/orientation. &lt;br /&gt;
# Certain challenges must be overcome, such as:&lt;br /&gt;
## Accumulating a sufficient database storing ship models, for the large number of different ship types that would need to be recognised by the system.&lt;br /&gt;
## Limited viewing conditions, as:&lt;br /&gt;
### The submarine&amp;#039;s periscope often reaches a height of only 60 cm above the surface of the water&lt;br /&gt;
### Tall waves and the curvature of the Earth may mean that ships are only partially visible&lt;br /&gt;
### Bad weather conditions mean that a clear horizon line may not be visible, and ships/targets blend in with the background&lt;br /&gt;
## There may be insignificant anomalies in the obtained image that are not ships/targets and should be ignored by the system (e.g. landmasses, birds or sea animals on the surface, infrastructure on the surface of the water).&lt;br /&gt;
&lt;br /&gt;
The project will involve a thorough literature review of object detection methods using image processing techniques in maritime applications, followed by software development and testing. This is an industry project, sponsored by SAAB Australia.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
&lt;br /&gt;
===Aim===&lt;br /&gt;
The project aimed to fulfil several goals, including:&lt;br /&gt;
#	Research and review existing image processing techniques for object and/or ship detection and recognition in optical images. An understanding of existing techniques is important to determine whether, and how, they can be incorporated into the designs produced by the project.&lt;br /&gt;
#	Develop a system that: &lt;br /&gt;
##	at close range to a detected ship, can recognise the ship type and estimate its range from the submarine and its orientation; &lt;br /&gt;
##	at long range to a detected ship, can provide guidance as to what the ship might be (type verification) and an estimate of its range and orientation. &lt;br /&gt;
&lt;br /&gt;
===Motivation===&lt;br /&gt;
&lt;br /&gt;
The project was sponsored by Saab Australia, who have a major influence in the Australian Defence industry. With the upcoming submarine research and development plans, Saab Australia is highly motivated to conduct research into the different technologies that may be incorporated into a submarine.&lt;br /&gt;
&lt;br /&gt;
For more than a century, submarine operators relied solely on direct-view optical periscopes to gain insight into the environment above the surface of the water [1]. For much of this period, operators relied on black-and-white images of ship silhouettes to identify any vessels viewed through the periscope. However, with the evolution of technology, electronic periscopes have been developed, providing many forms of assistance to operators. The motivation for the research conducted in this project is to develop and test image processing techniques for ship and threat detection in the optical images obtained through a submarine&amp;#039;s electronic periscope, providing guidance to operators in identifying the environment around them.&lt;br /&gt;
&lt;br /&gt;
== Research Methodology ==&lt;br /&gt;
&lt;br /&gt;
Three horizon detection methods and two target detection methods were researched and implemented. Of these, two horizon detection methods and one target detection method, with possible enhancements, are discussed in this thesis. An overview and some results of the methods investigated by the project partner are included as a comparison to the methods described in detail in this document. Those approaches can be found in the thesis document produced by the project partner, which is listed in the references. &lt;br /&gt;
&lt;br /&gt;
For testing the produced algorithms, a large data set of imagery was required. This was initially acquired using imagery available from the internet. An issue with this method of data collection lay in the fact that the sources of the imagery were inconsistent, resulting in inconsistent quality and relevance of the imagery to the application. Thus, further imagery was obtained first hand by recording footage of a ship dock, where there was plenty of movement, with different ships arriving at and departing the dock. Single frames were then extracted from the capture to test the algorithms.&lt;br /&gt;
&lt;br /&gt;
== Horizon Detection Methods ==&lt;br /&gt;
&lt;br /&gt;
===Hough Transform Method===&lt;br /&gt;
A horizon detection algorithm using the Hough Transform was implemented. The algorithm takes in a colour image but performs its processing steps only on the luminance component, so the image is converted to greyscale. &lt;br /&gt;
&lt;br /&gt;
An outline of the developed algorithm follows:&lt;br /&gt;
#	Load input image in colour and extract luminance component.&lt;br /&gt;
#	Apply a Gaussian filter to smooth the image.&lt;br /&gt;
#	Execute a Canny edge detector to produce a binary edge map of the image.&lt;br /&gt;
#	Compute the Hough Transform of the binary edge image.&lt;br /&gt;
#	Find the peak in the Hough Transform data that corresponds to the most dominant line in the image.&lt;br /&gt;
#	Plot the found line using its ‘Rho’ and ‘Theta’ components.&lt;br /&gt;
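The accumulate-and-peak step above can be sketched in pure NumPy (an illustrative stand-in only; the project itself used MATLAB's hough() and houghpeaks(), and the helper name hough_dominant_line is ours):

```python
import numpy as np

def hough_dominant_line(edge_map, n_theta=180):
    """Vote in (rho, theta) space and return the parameters of the
    most dominant line in a binary edge map."""
    h, w = edge_map.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.deg2rad(np.arange(n_theta) - 90)   # -90 deg .. 89 deg
    acc = np.zeros((2 * diag + 1, n_theta), dtype=np.int64)
    ys, xs = np.nonzero(edge_map)
    for theta_idx, th in enumerate(thetas):
        # rho = x*cos(theta) + y*sin(theta), shifted so indices are non-negative
        rhos = np.round(xs * np.cos(th) + ys * np.sin(th)).astype(int) + diag
        np.add.at(acc, (rhos, np.full_like(rhos, theta_idx)), 1)
    rho_idx, theta_idx = np.unravel_index(np.argmax(acc), acc.shape)
    return rho_idx - diag, np.rad2deg(thetas[theta_idx])

# A synthetic "horizon": a horizontal edge across row 20.
edges = np.zeros((40, 60), dtype=bool)
edges[20, :] = True
rho, theta = hough_dominant_line(edges)   # recovers the horizontal edge
```

All 60 edge pixels vote for the same (rho, theta) bin, so the horizontal line dominates the accumulator; narrowing `thetas` to a small window around the expected horizon angle is what reduces the computational load, as described above.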
&lt;br /&gt;
MATLAB’s in-built functions for the Canny edge detector and the Hough-related functions were used for this implementation. As discussed in section 2.1, the Canny edge detector performs best when the input image is pre-processed with smoothing to remove image noise. The Canny detector used in the implementation does not incorporate this, so a Gaussian filter with a standard deviation of 1.4 [5] is applied to achieve this noise reduction. &lt;br /&gt;
The implementation makes use of two Hough transform related functions - hough() and houghpeaks(). The former is used to form and plot the Hough transform of the edge map produced by the Canny edge detector, and the latter is used to find its peak, representing the most dominant line in the input image. By default, the hough() function is calibrated around vertical lines, such that a vertical line is read at an angle of 0° and horizontal lines at ±90°. As the plot of the Hough transform spans -90° to 90°, horizontal lines are plotted at the edges of the graph, causing inaccuracies in the determination of peaks. This was resolved by rotating the input image by 90° and searching for a now-vertical line in the image as the horizon. This also makes it possible to narrow the window of the Hough transform plot to between -15° and 15°, reducing the computational load. &lt;br /&gt;
&lt;br /&gt;
Once the Hough transform plot is obtained, the houghpeaks() function is used to locate the peak of the Hough transform plot. For this peak the corresponding ρ and θ values are extracted. Compensating for the rotation, these two values can be used to plot a line, on the original image, with equation:&lt;br /&gt;
y = (ρ - x·sin(θ)) / cos(θ).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
===DCT-based Method===&lt;br /&gt;
This section discusses a target detection algorithm using the characteristics of the Discrete Cosine Transform (DCT) coefficients. The first step of the algorithm is a horizon detection algorithm using these characteristics, before modelling the sea surface with a Gaussian Mixture Model (GMM) for ship or object detection. &lt;br /&gt;
&lt;br /&gt;
This horizon detection process involves decomposing the luminance component of an input image into 8x8 blocks in order to apply the DCT to these blocks. Each 8x8 DCT block is then labelled with a t-score, which is the ratio of the mean (Ā) of each block to the maximum mean over the entire image (Ā_max); that is, t = Ā/Ā_max. Ideally, the t-scores obtained from the sea segment will have a different range to those obtained from the sky regions of the image, producing a bimodal distribution of these values ranging from 0 to 1. A threshold can thus be selected such that t values above the threshold belong to sky regions and values below the threshold represent blocks in the sea. The reference paper claims, with 95% confidence, that the ideal threshold value lies in the interval between 0.065 and 0.135. However, it was discovered that this was not consistently true for the test data available, and the optimal threshold varied greatly with varying input images. This was solved by using a function that automatically calculates the threshold by analysing the distribution of the data in question. The function graythresh() is an in-built function in MATLAB, and assumes that there is a clear bimodal separation in the data. &lt;br /&gt;
Using this thresholding, an initial segmentation of sea and sky in the image is obtained. Another variation from the implementation described in the paper lies in the determination of the horizon line. Zhang draws a horizon line approximately using the central points of all bottommost blocks in the sky region. The adapted implementation, however, applies the Hough Transform method discussed in section 4.1.1 to the segmented binary image to find the location of the horizon line. &lt;br /&gt;
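The automatic threshold selection that graythresh() performs is Otsu's method; a minimal NumPy version can be sketched as follows (the function name otsu_threshold is ours, not MATLAB's):

```python
import numpy as np

def otsu_threshold(values, nbins=64):
    """Pick the threshold maximising between-class variance (Otsu's
    method, as MATLAB's graythresh() does), assuming values in [0, 1]."""
    hist, edges = np.histogram(values, bins=nbins, range=(0.0, 1.0))
    p = hist / hist.sum()
    omega = np.cumsum(p)              # probability of class 0 up to each bin
    mu = np.cumsum(p * edges[:-1])    # cumulative mean of class 0
    mu_t = mu[-1]                     # global mean
    # Between-class variance; the small constant avoids division by zero.
    sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega) + 1e-12)
    return edges[np.argmax(sigma_b)]

# Bimodal t-scores: one cluster near 0.1, another near 0.8.
rng = np.random.default_rng(2)
scores = np.clip(np.concatenate([rng.normal(0.1, 0.02, 200),
                                 rng.normal(0.8, 0.02, 200)]), 0.0, 1.0)
th = otsu_threshold(scores)   # lands in the valley between the clusters
```

Because Otsu's method only assumes a bimodal histogram, it adapts to each input image, which is why it sidesteps the fixed 0.065-0.135 interval issue noted above.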
&lt;br /&gt;
An outline of the DCT-based horizon detector follows:&lt;br /&gt;
#	Load input image in colour and extract luminance component.&lt;br /&gt;
#	Resize the image to dimensions divisible by 8.&lt;br /&gt;
#	For each 8x8 block of pixels in the image:&lt;br /&gt;
##	Calculate its DCT coefficients.&lt;br /&gt;
##	Calculate the mean of the DCT coefficients, disregarding the first (DC) element, for that block (Ā).&lt;br /&gt;
#	Obtain the t-score for each block by dividing its mean by the maximum of the means, t = Ā/Ā_max.&lt;br /&gt;
#	Obtain the threshold value for all t-scores. This is the value that separates t-scores belonging to the sky from those belonging to the sea. Using this value, test each t-score to determine whether the 8x8 block from which the score was derived belongs to the sky or sea component of the image, and obtain a sea-sky segmentation of the image.&lt;br /&gt;
#	Input the segmented image to the Hough Transform line detector outlined in section 4.1 to find the horizon using the DCT-based segmentation. &lt;br /&gt;
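The block-wise t-score computation above can be sketched in Python/NumPy as a stand-in for the MATLAB implementation (dct_matrix and t_scores are our names; note that with this particular definition of the block mean, smooth sky blocks score low, whereas the reference paper's definition may place sky above the threshold instead):

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix, rows indexed by frequency."""
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0, :] *= 1 / np.sqrt(2)
    return C * np.sqrt(2 / n)

def t_scores(gray):
    """Mean absolute DCT coefficient (DC excluded) per 8x8 block,
    normalised by the maximum over the whole image."""
    C = dct_matrix()
    h, w = gray.shape                 # assumed divisible by 8
    means = np.zeros((h // 8, w // 8))
    for i in range(h // 8):
        for j in range(w // 8):
            block = gray[8*i:8*i + 8, 8*j:8*j + 8]
            coeffs = C @ block @ C.T  # 2-D DCT of the block
            coeffs[0, 0] = 0.0        # drop the DC term
            means[i, j] = np.abs(coeffs).mean()
    return means / means.max()

# Smooth "sky" (constant) over textured "sea" (random): constant blocks
# carry no AC energy, so their t-scores are 0.
rng = np.random.default_rng(0)
img = np.zeros((16, 16))
img[8:, :] = rng.uniform(0.0, 255.0, size=(8, 16))
t = t_scores(img)
```

The bimodal spread of these scores is what the subsequent thresholding step exploits to separate sea from sky.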
&lt;br /&gt;
== Target Detection Method ==&lt;br /&gt;
&lt;br /&gt;
===DCT and GMM Based Target Method===&lt;br /&gt;
This section describes the primary target detection method developed in this project, which uses the DCT coefficients as obtained in the DCT-based horizon detector outlined in section 4.1.2. The approach taken in this implementation is to split the image into two components, sea and sky, using a horizon line detector. In the literature, the algorithm uses the DCT-based horizon detector to find the line between sea and sky. However, the algorithm adapted for this project uses the Hough Transform horizon line detector outlined in section 4.1.1 for a more accurate detection. &lt;br /&gt;
The target detection process uses the average of the DCT coefficients in each 8x8 block to segment the entire image into two components - sky and not-sky. This is achieved by taking the DCT coefficients in each 8x8 block of image pixels and taking the normalised mean, as in the DCT-based horizon detection algorithm (4.1.2). Similarly, a threshold value that best separates the DCT averages of sky regions from those of other regions is found. All blocks with a DCT average less than this threshold value are classified as sky, while the remaining blocks are classified as not-sky. This process achieves a ‘sky’ and ‘not-sky’ segmentation of the image.&lt;br /&gt;
While this is an effective way to eliminate the sky background from the image, the complex textures on the sea surface, such as waves and wakes, mean that some regions of the sea will also pass the threshold as significant foreground pixels. Thus, the DCT average threshold testing is only applied to the area of the image that lies above the horizon line, which is found using the Hough Transform horizon line detector. &lt;br /&gt;
For the sea region of the image, texture-based features are extracted as three regions from each 8x8 block of DCT coefficients, as explained in the literature (section 2.2), and listed in a feature vector, X. The feature vector is used to train a Gaussian Mixture Model representing a sea background model. Hence, the sea pixels used for the training must be taken sufficiently far below the horizon line to ensure that pixels from potential targets do not contaminate the GMM training data. &lt;br /&gt;
Once the GMM is trained, a Mahalanobis distance is calculated for each of the sea-training feature vectors to measure the degree of matching between the feature vectors and the Gaussian mixtures, and a threshold is defined as the maximum distance from a feature vector to the centre of the Gaussian mixture model. Once this threshold is set, feature vectors are extracted from the image for all pixels below the horizon line, including any belonging to potential targets. The Mahalanobis distances between each vector and the GMM for the sea background are then calculated and compared with the defined threshold. All calculated distances less than the threshold are classified as belonging to sea regions, while the others may be sky or an anomaly. This produces another segmentation, of ‘sea’ and ‘not-sea’.&lt;br /&gt;
By subtracting the sea segmentation from the sky segmentation, a binary mask can be obtained in which the two background components – sea and sky – are 0s and potential targets are 1s. The binary mask will, however, contain false detections due to outliers in the image data in both the sea and sky regions, which can be cleaned up using morphological operations such as erosion and dilation. &lt;br /&gt;
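The erosion and dilation mentioned above can be sketched with plain NumPy (function names ours); eroding then dilating (a morphological opening) removes isolated false detections while preserving larger targets:

```python
import numpy as np

def dilate(mask, k=3):
    """Binary dilation: OR over a k x k square neighbourhood."""
    h, w = mask.shape
    pad = k // 2
    padded = np.pad(mask, pad)   # pads with False
    out = np.zeros_like(mask)
    for dy in range(k):
        for dx in range(k):
            out = np.logical_or(out, padded[dy:dy + h, dx:dx + w])
    return out

def erode(mask, k=3):
    """Binary erosion, dilation's dual: AND over the neighbourhood."""
    h, w = mask.shape
    pad = k // 2
    padded = np.pad(mask, pad, constant_values=True)
    out = np.ones_like(mask)
    for dy in range(k):
        for dx in range(k):
            out = np.logical_and(out, padded[dy:dy + h, dx:dx + w])
    return out

mask = np.zeros((9, 9), dtype=bool)
mask[2:5, 2:5] = True    # a plausible 3x3 target region
mask[7, 7] = True        # an isolated false detection
opened = dilate(erode(mask))   # the singleton vanishes; the blob survives
```

The final step of the method uses dilation on its own to grow surviving detections so that a bounding box encloses the whole object.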
Finally, the detected target is displayed by applying bounding boxes on the remaining regions of 1s in the final binary mask and overlaid on the original input image. An outline of the algorithm follows:&lt;br /&gt;
#	Load input image in colour and extract luminance component.&lt;br /&gt;
#	Resize the image to dimensions divisible by 8.&lt;br /&gt;
#	For each 8x8 block of pixels in the image:&lt;br /&gt;
##	Calculate its DCT coefficients.&lt;br /&gt;
##	Calculate the mean of the DCT coefficients, disregarding the first (DC) element, for that block (Ā).&lt;br /&gt;
#	Obtain the t-score for each block by dividing its mean by the maximum of the means, t = Ā/Ā_max.&lt;br /&gt;
#	Obtain the threshold value for all t-scores. This is the value that separates t-scores belonging to the sky from those belonging to the sea pixels. &lt;br /&gt;
#	Apply the Hough Transform Horizon Detector to find the horizon line.&lt;br /&gt;
#	Using the threshold value, test each t-score from regions above the horizon line to determine whether the 8x8 block belongs to the sky or an anomaly and produce a binary image where the sky background pixels have value 0 and areas of interest have value 1. &lt;br /&gt;
#	Extract the region energies for DCT blocks located at least 16 pixels below the horizon and compile the energies into a feature vector of 3 columns, X_train.&lt;br /&gt;
#	Fit a Gaussian Mixture Model to the feature vector.&lt;br /&gt;
#	Compute the Mahalanobis distance between the GMM and each feature vector. Use the maximum distance found as a threshold, T. &lt;br /&gt;
#	Extract the region energies for all DCT blocks below the horizon line, and compile a feature vector, X.&lt;br /&gt;
#	Calculate the Mahalanobis distance between the GMM and each feature vector.&lt;br /&gt;
#	Use the threshold T to determine whether each block belongs to the sea background or an anomaly on the sea surface.&lt;br /&gt;
#	Combine the two segmentations in order to achieve a binary mask where only regions of interest have high value.&lt;br /&gt;
#	Eliminate any detected regions located too far above or below the horizon line to reduce false detections. &lt;br /&gt;
#	Perform a morphological dilation to increase the detection size and apply a bounding box around the detection. The dilation will help ensure the entire object is bound within the applied box.&lt;/div&gt;</summary>
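The model-fitting and Mahalanobis-thresholding steps above can be sketched as follows. This is a deliberately simplified single-Gaussian stand-in (the project fits a full Gaussian Mixture Model; fit_gaussian and mahalanobis are hypothetical helper names):

```python
import numpy as np

def fit_gaussian(X):
    """Fit one Gaussian to the rows of X - a one-component stand-in
    for the sea-background Gaussian Mixture Model."""
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    return mu, cov_inv

def mahalanobis(X, mu, cov_inv):
    """Mahalanobis distance of each row of X from the model centre."""
    d = X - mu
    return np.sqrt(np.sum((d @ cov_inv) * d, axis=1))

# Train on "sea" feature vectors, take the maximum training distance as
# the threshold T, then flag vectors beyond T as potential targets.
rng = np.random.default_rng(1)
sea_train = rng.normal(0.0, 1.0, size=(500, 3))   # stand-in region energies
mu, cov_inv = fit_gaussian(sea_train)
T = mahalanobis(sea_train, mu, cov_inv).max()
candidates = np.vstack([sea_train[:5], np.full((1, 3), 25.0)])
is_target = np.greater(mahalanobis(candidates, mu, cov_inv), T)
```

Because T is the maximum distance seen during training, every training vector is classified as sea, while the far-off candidate row exceeds T and is flagged as an anomaly, mirroring the ‘sea’/‘not-sea’ split described above.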
		<author><name>A1687420</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=12231</id>
		<title>Projects:2018s1-110 Future Submarine Project</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=12231"/>
		<updated>2018-10-21T14:42:38Z</updated>

		<summary type="html">&lt;p&gt;A1687420: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Students&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Tharidu Maliduwa Arachchige &lt;br /&gt;
&lt;br /&gt;
Jacob Parker&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Dr. Danny Gibbins &lt;br /&gt;
&lt;br /&gt;
Igor Dzeba (SAAB)&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
This project involves the research into and development of a Contact Detection method to be used with a Submarine Optronics System. The aims of this project include:&lt;br /&gt;
 &lt;br /&gt;
# Detecting the presence of a ship at long range and, if possible, estimating its range and aspect.&lt;br /&gt;
# At closer range, detecting the ship or target, identifying/verifying the ship type, and determining its range and aspect/orientation. &lt;br /&gt;
# Certain challenges must be overcome, such as:&lt;br /&gt;
## Accumulating a sufficient database storing ship models, for the large number of different ship types that would need to be recognised by the system.&lt;br /&gt;
## Limited viewing conditions, as:&lt;br /&gt;
### The submarine&amp;#039;s periscope often reaches a height of only 60 cm above the surface of the water&lt;br /&gt;
### Tall waves and the curvature of the Earth may mean that ships are only partially visible&lt;br /&gt;
### Bad weather conditions mean that a clear horizon line may not be visible, and ships/targets blend in with the background&lt;br /&gt;
## There may be insignificant anomalies in the obtained image that are not ships/targets and should be ignored by the system (e.g. landmasses, birds or sea animals on the surface, infrastructure on the surface of the water).&lt;br /&gt;
&lt;br /&gt;
The project will involve a thorough literature review of object detection methods using image processing techniques in maritime applications, followed by software development and testing. This is an industry project, sponsored by SAAB Australia.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
&lt;br /&gt;
===Aim===&lt;br /&gt;
The project aimed to fulfil several goals, including:&lt;br /&gt;
#	Research and review existing image processing techniques for object and/or ship detection and recognition in optical images. An understanding of existing techniques is important to determine whether, and how, they can be incorporated into the designs produced by the project.&lt;br /&gt;
#	Develop a system that: &lt;br /&gt;
##	at close range to a detected ship, can recognise the ship type and estimate its range from the submarine and its orientation; &lt;br /&gt;
##	at long range to a detected ship, can provide guidance as to what the ship might be (type verification) and an estimate of its range and orientation. &lt;br /&gt;
&lt;br /&gt;
===Motivation===&lt;br /&gt;
&lt;br /&gt;
The project was sponsored by Saab Australia, who have a major influence in the Australian Defence industry. With the upcoming submarine research and development plans, Saab Australia is highly motivated to conduct research into the different technologies that may be incorporated into a submarine.&lt;br /&gt;
&lt;br /&gt;
For more than a century, submarine operators relied solely on direct-view optical periscopes to gain insight into the environment above the surface of the water [1]. For much of this period, operators relied on black-and-white images of ship silhouettes to identify any vessels viewed through the periscope. However, with the evolution of technology, electronic periscopes have been developed, providing many forms of assistance to operators. The motivation for the research conducted in this project is to develop and test image processing techniques for ship and threat detection in the optical images obtained through a submarine&amp;#039;s electronic periscope, providing guidance to operators in identifying the environment around them.&lt;br /&gt;
&lt;br /&gt;
== Research Methodology ==&lt;br /&gt;
&lt;br /&gt;
Three horizon detection methods and two target detection methods were researched and implemented. Of these, two horizon detection methods and one target detection method, with possible enhancements, are discussed in this thesis. An overview and some results of the methods investigated by the project partner are included as a comparison to the methods described in detail in this document. Those approaches can be found in the thesis document produced by the project partner, which is listed in the references. &lt;br /&gt;
&lt;br /&gt;
For testing the produced algorithms, a large data set of imagery was required. This was initially acquired using imagery available from the internet. An issue with this method of data collection lay in the fact that the sources of the imagery were inconsistent, resulting in inconsistent quality and relevance of the imagery to the application. Thus, further imagery was obtained first hand by recording footage of a ship dock, where there was plenty of movement, with different ships arriving at and departing the dock. Single frames were then extracted from the capture to test the algorithms.&lt;br /&gt;
&lt;br /&gt;
== Horizon Detection Methods ==&lt;br /&gt;
&lt;br /&gt;
===Hough Transform Method===&lt;br /&gt;
A horizon detection algorithm using the Hough Transform was implemented. The algorithm takes in a colour image but performs its processing steps only on the luminance component, so the image is converted to greyscale. &lt;br /&gt;
&lt;br /&gt;
An outline of the developed algorithm follows:&lt;br /&gt;
#	Load input image in colour and extract luminance component.&lt;br /&gt;
#	Apply a Gaussian filter to smooth the image.&lt;br /&gt;
#	Execute a Canny edge detector to produce a binary edge map of the image.&lt;br /&gt;
#	Compute the Hough Transform of the binary edge image.&lt;br /&gt;
#	Find the peak in the Hough Transform data that corresponds to the most dominant line in the image.&lt;br /&gt;
#	Plot the found line using its ‘Rho’ and ‘Theta’ components.&lt;br /&gt;
&lt;br /&gt;
MATLAB’s in-built functions for the Canny edge detector and the Hough-related functions were used for this implementation. As discussed in section 2.1, the Canny edge detector performs best when the input image is pre-processed with smoothing to remove image noise. The Canny detector used in the implementation does not incorporate this, so a Gaussian filter with a standard deviation of 1.4 [5] is applied to achieve this noise reduction. &lt;br /&gt;
The implementation makes use of two Hough transform related functions - hough() and houghpeaks(). The former is used to form and plot the Hough transform of the edge map produced by the Canny edge detector, and the latter is used to find its peak, representing the most dominant line in the input image. By default, the hough() function is calibrated around vertical lines, such that a vertical line is read at an angle of 0° and horizontal lines at ±90°. As the plot of the Hough transform spans -90° to 90°, horizontal lines are plotted at the edges of the graph, causing inaccuracies in the determination of peaks. This was resolved by rotating the input image by 90° and searching for a now-vertical line in the image as the horizon. This also makes it possible to narrow the window of the Hough transform plot to between -15° and 15°, reducing the computational load. &lt;br /&gt;
&lt;br /&gt;
Once the Hough transform is obtained, the houghpeaks() function is used to locate its peak, and the corresponding ρ and θ values are extracted. Compensating for the rotation, these two values can be used to plot a line on the original image with equation:&lt;br /&gt;
y = (ρ - x·sin(θ)) / cos(θ).&lt;br /&gt;
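The voting-and-peak logic of this detector can be sketched as follows. The project itself used MATLAB’s hough() and houghpeaks(); this is a minimal pure-NumPy equivalent, and the function dominant_line() and the toy edge map are illustrative assumptions, not code from the thesis.&lt;br /&gt;

```python
# Minimal Hough-transform sketch: find the single most dominant line
# (rho, theta) in a binary edge map by voting into an accumulator.
import numpy as np

def dominant_line(edges, n_theta=181):
    """Return (rho, theta) of the strongest line in a binary edge map."""
    ys, xs = np.nonzero(edges)                      # edge-pixel coordinates
    thetas = np.linspace(-np.pi / 2, np.pi / 2, n_theta)
    diag = int(np.ceil(np.hypot(*edges.shape)))     # largest possible |rho|
    # rho = x*cos(theta) + y*sin(theta), one column per theta sample
    rhos = np.outer(xs, np.cos(thetas)) + np.outer(ys, np.sin(thetas))
    rho_idx = np.round(rhos).astype(int) + diag     # shift so indices start at 0
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    theta_idx = np.broadcast_to(np.arange(n_theta), rho_idx.shape)
    np.add.at(acc, (rho_idx.ravel(), theta_idx.ravel()), 1)  # Hough voting
    r, t = np.unravel_index(np.argmax(acc), acc.shape)       # peak cell
    return r - diag, thetas[t]

# Toy edge map: a perfectly horizontal "horizon" at row 5 of a 10x10 image.
edges = np.zeros((10, 10), dtype=bool)
edges[5, :] = True
rho, theta = dominant_line(edges)
```

With a Canny edge map of the (rotated) greyscale frame as input, the returned (ρ, θ) pair identifies the horizon candidate; MATLAB’s hough() additionally returns the full accumulator for plotting.&lt;br /&gt;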
&lt;br /&gt;
&lt;br /&gt;
===DCT-based Method===&lt;br /&gt;
This section discusses a target detection algorithm using the characteristics of the Discrete Cosine Transform (DCT) coefficients. The first stage of the algorithm is horizon detection using these characteristics, before the sea surface is modelled with a Gaussian Mixture Model (GMM) for ship or object detection. &lt;br /&gt;
&lt;br /&gt;
This horizon detection process involves decomposing the luminance component of an input image into 8x8 blocks and applying the DCT to each block. Each 8x8 DCT block is then labelled with a t-score, which is the ratio of the mean (Ā) of that block to the maximum mean over the entire image (Ā_max); that is, t = Ā/Ā_max. Ideally, the t-scores obtained from the sea segment will have a different range to those obtained from the sky regions of the image, producing a bimodal distribution of values ranging from 0 to 1. A threshold can therefore be selected that separates t-scores belonging to sky blocks from those belonging to sea blocks. The reference paper claims, with 95% confidence, that the ideal threshold lies in the interval 0.065 to 0.135. However, this was not consistently true for the test data available, and the optimal threshold varied widely between input images. This was solved by using a function that automatically calculates the threshold by analysing the distribution of the data in question: MATLAB’s in-built graythresh() function, which assumes a clear bimodal separation in the data. &lt;br /&gt;
Using this threshold, an initial segmentation of sea and sky in the image is obtained. Another variation from the implementation described in the paper lies in the determination of the horizon line. Zhang approximates the horizon line using the central points of the bottommost blocks in the sky region. The adapted implementation, however, applies the Hough Transform method discussed in section 4.1.1 to the segmented binary image to find the location of the horizon line. &lt;br /&gt;
&lt;br /&gt;
An outline of the DCT-based horizon detector follows:&lt;br /&gt;
#	Load the input image in colour and extract its luminance component.&lt;br /&gt;
#	Resize the image to dimensions divisible by 8.&lt;br /&gt;
#	For each 8x8 block of pixels in the image:&lt;br /&gt;
##	Calculate its DCT coefficients.&lt;br /&gt;
##	Calculate the mean of the DCT coefficients for that block (Ā), disregarding the DC element.&lt;br /&gt;
#	Obtain the t-score for each block by dividing each mean by the maximum of the means, t = Ā/Ā_max.&lt;br /&gt;
#	Obtain the threshold value for all t-scores. This is the value that separates t-scores belonging to the sky from those belonging to the sea. Using this value, test each t-score to determine whether the 8x8 block from which the score was derived belongs to the sky or the sea component of the image, obtaining a sea-sky segmentation of the image.&lt;br /&gt;
#	Input the segmented image to the Hough Transform line detector outlined in section 4.1.1 to find the horizon using the DCT-based segmentation. &lt;br /&gt;
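The t-score and thresholding stages above can be sketched as follows. This is a hedged NumPy re-creation: otsu() stands in for MATLAB’s graythresh() (both implement Otsu’s method), and block_tscores() plus the toy bimodal image are illustrative assumptions, not the thesis code.&lt;br /&gt;

```python
# Sketch of the DCT t-score computation and automatic threshold selection.
import numpy as np

def dct2_matrix(n=8):
    """Orthonormal DCT-II basis matrix, so dct2(B) = C @ B @ C.T."""
    k = np.arange(n)[:, None]
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * k.T + 1) * k / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

def block_tscores(gray):
    """t-score per 8x8 block: mean abs DCT coefficient (DC excluded), normalised."""
    h, w = (gray.shape[0] // 8) * 8, (gray.shape[1] // 8) * 8
    c = dct2_matrix()
    blocks = gray[:h, :w].reshape(h // 8, 8, w // 8, 8).swapaxes(1, 2)
    coeffs = np.abs(c @ blocks @ c.T)        # per-block 2-D DCT magnitudes
    coeffs[..., 0, 0] = 0.0                  # disregard the DC element
    means = coeffs.sum(axis=(-2, -1)) / 63.0  # mean over the 63 AC coefficients
    return means / means.max()               # t = A_bar / A_bar_max

def otsu(t, bins=64):
    """Threshold maximising between-class variance (graythresh analogue)."""
    hist, bin_edges = np.histogram(t, bins=bins, range=(0.0, 1.0))
    hist = hist.astype(float)
    centers = 0.5 * (bin_edges[:-1] + bin_edges[1:])
    w0 = np.cumsum(hist)                     # class-0 weight for each cut point
    w1 = hist.sum() - w0
    m0 = np.divide(np.cumsum(hist * centers), w0,
                   out=np.zeros(bins), where=w0.astype(bool))
    m1 = np.divide((hist * centers).sum() - np.cumsum(hist * centers), w1,
                   out=np.zeros(bins), where=w1.astype(bool))
    sigma_b = w0 * w1 * (m0 - m1) ** 2       # between-class variance
    return centers[np.argmax(sigma_b)]

# Toy bimodal image: smooth "sky" half on top, noisy "sea" half below.
rng = np.random.default_rng(0)
img = np.vstack([np.full((32, 64), 0.8),
                 0.5 + 0.3 * rng.standard_normal((32, 64))])
t = block_tscores(img)
thresh = otsu(t.ravel())
```

Blocks whose t-score falls on one side of the returned threshold are assigned to the sky class and the remainder to the sea class, giving the initial sea-sky segmentation fed to the Hough line detector.&lt;br /&gt;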
&lt;br /&gt;
== Target Detection System ==&lt;br /&gt;
&lt;br /&gt;
===DCT and GMM Based Target Method===&lt;br /&gt;
This section describes the primary target detection method developed in this project, which uses the DCT coefficients obtained as in the DCT-based horizon detector outlined in section 4.1.2. The approach taken in this implementation is to split the image into two components, sea and sky, using a horizon line detector. The algorithm from the literature uses the DCT-based horizon detector to find the line between sea and sky; however, the algorithm adapted for this project uses the Hough Transform horizon line detector outlined in section 4.1.1 for a more accurate detection. &lt;br /&gt;
The target detection process uses the average of the DCT coefficients in each 8x8 block to segment the entire image into two components, sky and not-sky. This is achieved by taking the DCT coefficients in each 8x8 block of image pixels and computing the normalised mean, as in the DCT-based horizon detection algorithm (4.1.2). Similarly, a threshold value that best separates the DCT averages of sky regions from those of other regions is found. All blocks with a DCT average less than this threshold are classified as sky, while the remaining blocks are classified as not-sky. This process achieves a ‘sky’ and ‘not-sky’ segmentation of the image.&lt;br /&gt;
While this is an effective way to eliminate the sky background from the image, complex textures on the sea surface, such as waves and wakes, mean that some regions of the sea will also pass the threshold as significant foreground pixels. Thus, the DCT-average threshold test is only applied to the area of the image above the horizon line, which is found using the Hough Transform horizon line detector. &lt;br /&gt;
For the sea region of the image, texture-based features are extracted as three regions from each 8x8 block of DCT coefficients, as explained in the literature (section 2.2), and listed in a feature vector, X. The feature vector is used to train a Gaussian Mixture Model representing the sea background. The sea pixels used for training must therefore be taken sufficiently far below the horizon line, to ensure that pixels from potential targets do not contaminate the GMM training data. &lt;br /&gt;
Once the GMM is trained, the Mahalanobis distance is calculated for each of the sea-training feature vectors to measure how well each vector matches the Gaussian mixtures, and a threshold is defined as the maximum distance from a feature vector to the centre of the Gaussian mixture model. Once this threshold is set, feature vectors are extracted for all pixels below the horizon line, including any belonging to potential targets. The Mahalanobis distances between each vector and the sea-background GMM are then calculated and compared with the defined threshold. All distances less than the threshold are classified as belonging to sea regions, while the others may be sky or an anomaly. This produces another segmentation, of ‘sea’ and ‘not-sea’.&lt;br /&gt;
By subtracting the sea segmentation from the sky segmentation, a binary mask can be obtained in which the two background components, sea and sky, are 0s and potential targets are 1s. The mask will, however, contain false detections due to outliers in the image data in both the sea and sky regions; these can be cleaned up using morphological operations such as erosion and dilation. &lt;br /&gt;
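The sea-background modelling and Mahalanobis thresholding described above can be sketched as follows. For a self-contained example this fits a single Gaussian rather than a full Gaussian Mixture Model (a deliberate simplification; the thresholding logic is the same with one mixture component). The helper names fit_background() and mahalanobis(), and the toy three-column features, are illustrative assumptions, not the thesis code.&lt;br /&gt;

```python
# Sketch: model the sea background, set T as the maximum training distance,
# then flag feature vectors whose distance exceeds T as potential targets.
import numpy as np

def fit_background(X_train):
    """Fit mean/inverse-covariance to sea-training feature vectors (rows)."""
    mu = X_train.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X_train, rowvar=False))
    return mu, cov_inv

def mahalanobis(X, mu, cov_inv):
    """Mahalanobis distance of each row of X from the background model."""
    d = X - mu
    return np.sqrt(((d @ cov_inv) * d).sum(axis=1))  # per-row quadratic form

rng = np.random.default_rng(1)
# Toy 3-column texture features for sea-training blocks (X_train).
sea = rng.normal([1.0, 2.0, 3.0], 0.1, size=(200, 3))
mu, cov_inv = fit_background(sea)
T = mahalanobis(sea, mu, cov_inv).max()      # threshold = max training distance

# All blocks below the horizon: more sea plus a far-off-model "ship" anomaly.
X = np.vstack([rng.normal([1.0, 2.0, 3.0], 0.1, size=(50, 3)),
               np.full((5, 3), 5.0)])
is_sea = np.less_equal(mahalanobis(X, mu, cov_inv), T)   # True where sea-like
```

Inverting is_sea gives the ‘not-sea’ mask that is combined with the ‘not-sky’ mask before the morphological clean-up.&lt;br /&gt;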
Finally, detected targets are displayed by applying bounding boxes to the remaining regions of 1s in the final binary mask, overlaid on the original input image. An outline of the algorithm follows:&lt;br /&gt;
#	Load the input image in colour and extract its luminance component.&lt;br /&gt;
#	Resize the image to dimensions divisible by 8.&lt;br /&gt;
#	For each 8x8 block of pixels in the image:&lt;br /&gt;
##	Calculate its DCT coefficients.&lt;br /&gt;
##	Calculate the mean of the DCT coefficients for that block (Ā), disregarding the DC element.&lt;br /&gt;
#	Obtain the t-score for each block by dividing each mean by the maximum of the means, t = Ā/Ā_max.&lt;br /&gt;
#	Obtain the threshold value for all t-scores. This is the value that separates t-scores belonging to the sky from those belonging to the sea. &lt;br /&gt;
#	Apply the Hough Transform Horizon Detector to find the horizon line.&lt;br /&gt;
#	Using the threshold value, test each t-score from the region above the horizon line to determine whether the 8x8 block belongs to the sky or an anomaly, and produce a binary image in which sky background pixels have value 0 and areas of interest have value 1. &lt;br /&gt;
#	Extract the region energies for DCT blocks located at least 16 pixels below the horizon and compile the energies into a feature vector of 3 columns, X_train.&lt;br /&gt;
#	Fit a Gaussian Mixture Model to the feature vector.&lt;br /&gt;
#	Compute the Mahalanobis distance between the GMM and each feature vector. Use the maximum distance found as a threshold, T. &lt;br /&gt;
#	Extract the region energies for all DCT blocks below the horizon line, and compile a feature vector, X.&lt;br /&gt;
#	Calculate the Mahalanobis distance between the GMM and each feature vector.&lt;br /&gt;
#	Use the threshold T to determine whether each block belongs to the sea background or an anomaly on the sea surface.&lt;br /&gt;
#	Combine the two segmentations in order to achieve a binary mask where only regions of interest have high value.&lt;br /&gt;
#	Eliminate any detected regions located sufficiently far above or below the horizon line to reduce false detections. &lt;br /&gt;
#	Perform a morphological dilation to increase the detection size and apply a bounding box around each detection. The dilation helps ensure the entire object is bound within the applied box.&lt;/div&gt;</summary>
		<author><name>A1687420</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=12230</id>
		<title>Projects:2018s1-110 Future Submarine Project</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=12230"/>
		<updated>2018-10-21T14:36:31Z</updated>

		<summary type="html">&lt;p&gt;A1687420: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Students&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Tharidu Maliduwa Arachchige &lt;br /&gt;
&lt;br /&gt;
Jacob Parker&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Dr. Danny Gibbins &lt;br /&gt;
&lt;br /&gt;
Igor Dzeba (SAAB)&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
This project involves the research into and development of a Contact Detection method to be used with a Submarine Optronics System. The aims of this project include:&lt;br /&gt;
 &lt;br /&gt;
# Detecting the presence of a ship at long range and, if possible, estimating its range and aspect.&lt;br /&gt;
# At closer range, the ship or target should be detected, the type of ship identified/verified, and its range and aspect/orientation determined. &lt;br /&gt;
# Certain challenges must be overcome, such as:&lt;br /&gt;
## Accumulating a sufficient database storing ship models, for the large number of different ship types that would need to be recognised by the system.&lt;br /&gt;
## Limited viewing conditions, as:&lt;br /&gt;
### The submarine&amp;#039;s periscope often only reaches a height of 60 cm above the surface of the water&lt;br /&gt;
### Tall waves and the curvature of the Earth may mean that ships are only partially visible&lt;br /&gt;
### Bad weather conditions may mean that a clear horizon line is not visible, and ships/targets blend in with the background&lt;br /&gt;
## There may be non-significant anomalies in the obtained image that are not ships/targets and should be ignored by the system (e.g. landmasses, birds or sea animals on the surface, infrastructure on the surface of the water).&lt;br /&gt;
&lt;br /&gt;
The project will involve a thorough literature review of object detection methods using image processing techniques in maritime applications, along with software development and testing. This is an industry-sponsored project, sponsored by SAAB Australia.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Aim&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The project aimed to fulfil several goals, including:&lt;br /&gt;
#	Research and review existing image processing techniques on the topic of object and/or ship detection and recognition through optical images. An understanding of existing techniques is important to determine whether and how they can be incorporated into any designs produced as the output of the project.&lt;br /&gt;
#	Develop a system that: &lt;br /&gt;
##	at close range to a detected ship, can recognise the ship type, estimate its range from the submarine and estimate its orientation; &lt;br /&gt;
##	at long range to a detected ship, can provide guidance as to what the ship might be (type verification) and provide an estimate of its range and orientation. &lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Motivation&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The project was sponsored by Saab Australia, who have a major influence in the Australian Defence industry. With the upcoming submarine research and development plans, Saab Australia is highly motivated to conduct research into the different technologies that may be incorporated in a submarine.&lt;br /&gt;
For more than a century, submarine operators relied solely on direct-view optical periscopes to gain an insight into the environment above the surface of the water [1]. For much of this time, operators relied on black-and-white images of ship silhouettes to identify any vessels viewed through the periscope. However, with the evolution of technology, electronic periscopes have been developed, providing many forms of assistance to operators. The motivation for the research conducted in this project is to develop and test image processing techniques for ship and threat detection in the optical images obtainable through a submarine&amp;#039;s electronic periscope, providing guidance to operators in identifying the environment around them.&lt;br /&gt;
&lt;br /&gt;
== Literature Review ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
== Research Methodology ==&lt;br /&gt;
&lt;br /&gt;
Three horizon detection methods and two target detection methods were researched and implemented following this research methodology. Of these, two horizon detection methods and one target detection method, with possible enhancements, are discussed in this thesis. An overview and some results of the methods investigated by the project partner are included as a comparison to the methods described in detail in this document. Those approaches can be found in the thesis document produced by the project partner, which is listed in the references. &lt;br /&gt;
&lt;br /&gt;
For testing the produced algorithms, a large data set of imagery was required. This was initially acquired using imagery available from the internet. An issue with this method of data collection lay in the fact that the sources of the imagery were inconsistent, resulting in inconsistent quality and relevance of the imagery to the application. Thus, further imagery was obtained first-hand by recording footage of a ship dock, where there was plenty of movement as different ships arrived at and departed the docks. Single frames were then extracted from the captured footage to test the algorithms.&lt;br /&gt;
&lt;br /&gt;
== Horizon Detection Methods ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Hough Transform Method&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
A horizon detection algorithm using the Hough Transform was implemented. The algorithm takes a colour image as input but performs its processing steps only on the luminance component, so the image is first converted to greyscale. &lt;br /&gt;
&lt;br /&gt;
An outline of the developed algorithm follows:&lt;br /&gt;
#	Load the input image in colour and extract its luminance component.&lt;br /&gt;
#	Apply Gaussian smoothing to the image.&lt;br /&gt;
#	Execute a Canny edge detector to produce a binary edge map of the image.&lt;br /&gt;
#	Compute the Hough Transform of the binary edge image.&lt;br /&gt;
#	Find the peak in the Hough Transform data that corresponds to the most dominant line in the image.&lt;br /&gt;
#	Plot the found line using its ‘rho’ (ρ) and ‘theta’ (θ) components.&lt;br /&gt;
&lt;br /&gt;
MATLAB’s in-built functions for the Canny edge detector and the Hough Transform were used for this implementation. As discussed in section 2.1, the Canny edge detector performs best when the input image is pre-smoothed to remove image noise. The Canny detector used in the implementation does not incorporate this step, so a Gaussian filter with a standard deviation of 1.4 [5] is applied first to achieve this noise reduction. &lt;br /&gt;
The implementation makes use of two Hough-related functions, hough() and houghpeaks(). The former is used to form and plot the Hough transform of the edge map produced by the Canny edge detector, and the latter is used to find its peak, representing the most dominant line in the input image. By default, the hough() function is calibrated around vertical lines, such that a vertical line is read at an angle of 0° and horizontal lines at ±90°. As the Hough transform is plotted between -90° and 90°, horizontal lines appear at the edges of the plot, causing inaccuracies in peak determination. This was resolved by rotating the input image by 90° and searching for a now-vertical line in the image as the horizon. This also allows the window of the Hough transform to be narrowed to between -15° and 15°, reducing the computational load. &lt;br /&gt;
&lt;br /&gt;
Once the Hough transform is obtained, the houghpeaks() function is used to locate its peak, and the corresponding ρ and θ values are extracted. Compensating for the rotation, these two values can be used to plot a line on the original image with equation:&lt;br /&gt;
y = (ρ - x·sin(θ)) / cos(θ).&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;DCT-based Method&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
This section discusses a target detection algorithm using the characteristics of the Discrete Cosine Transform (DCT) coefficients. The first stage of the algorithm is horizon detection using these characteristics, before the sea surface is modelled with a Gaussian Mixture Model (GMM) for ship or object detection. &lt;br /&gt;
&lt;br /&gt;
This horizon detection process involves decomposing the luminance component of an input image into 8x8 blocks and applying the DCT to each block. Each 8x8 DCT block is then labelled with a t-score, which is the ratio of the mean (Ā) of that block to the maximum mean over the entire image (Ā_max); that is, t = Ā/Ā_max. Ideally, the t-scores obtained from the sea segment will have a different range to those obtained from the sky regions of the image, producing a bimodal distribution of values ranging from 0 to 1. A threshold can therefore be selected that separates t-scores belonging to sky blocks from those belonging to sea blocks. The reference paper claims, with 95% confidence, that the ideal threshold lies in the interval 0.065 to 0.135. However, this was not consistently true for the test data available, and the optimal threshold varied widely between input images. This was solved by using a function that automatically calculates the threshold by analysing the distribution of the data in question: MATLAB’s in-built graythresh() function, which assumes a clear bimodal separation in the data. &lt;br /&gt;
Using this threshold, an initial segmentation of sea and sky in the image is obtained. Another variation from the implementation described in the paper lies in the determination of the horizon line. Zhang approximates the horizon line using the central points of the bottommost blocks in the sky region. The adapted implementation, however, applies the Hough Transform method discussed in section 4.1.1 to the segmented binary image to find the location of the horizon line. &lt;br /&gt;
&lt;br /&gt;
An outline of the DCT-based horizon detector follows:&lt;br /&gt;
#	Load the input image in colour and extract its luminance component.&lt;br /&gt;
#	Resize the image to dimensions divisible by 8.&lt;br /&gt;
#	For each 8x8 block of pixels in the image:&lt;br /&gt;
##	Calculate its DCT coefficients.&lt;br /&gt;
##	Calculate the mean of the DCT coefficients for that block (Ā), disregarding the DC element.&lt;br /&gt;
#	Obtain the t-score for each block by dividing each mean by the maximum of the means, t = Ā/Ā_max.&lt;br /&gt;
#	Obtain the threshold value for all t-scores. This is the value that separates t-scores belonging to the sky from those belonging to the sea. Using this value, test each t-score to determine whether the 8x8 block from which the score was derived belongs to the sky or the sea component of the image, obtaining a sea-sky segmentation of the image.&lt;br /&gt;
#	Input the segmented image to the Hough Transform line detector outlined in section 4.1.1 to find the horizon using the DCT-based segmentation. &lt;br /&gt;
&lt;br /&gt;
== Target Detection System ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;/div&gt;</summary>
		<author><name>A1687420</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=12229</id>
		<title>Projects:2018s1-110 Future Submarine Project</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=12229"/>
		<updated>2018-10-21T14:30:01Z</updated>

		<summary type="html">&lt;p&gt;A1687420: /* Research Methodology */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Students&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Tharidu Maliduwa Arachchige &lt;br /&gt;
&lt;br /&gt;
Jacob Parker&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Dr. Danny Gibbins &lt;br /&gt;
&lt;br /&gt;
Igor Dzeba (SAAB)&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
This project involves the research into and development of a Contact Detection method to be used with a Submarine Optronics System. The aims of this project include:&lt;br /&gt;
 &lt;br /&gt;
# Detecting the presence of a ship at long range and, if possible, estimating its range and aspect.&lt;br /&gt;
# At closer range, the ship or target should be detected, the type of ship identified/verified, and its range and aspect/orientation determined. &lt;br /&gt;
# Certain challenges must be overcome, such as:&lt;br /&gt;
## Accumulating a sufficient database storing ship models, for the large number of different ship types that would need to be recognised by the system.&lt;br /&gt;
## Limited viewing conditions, as:&lt;br /&gt;
### The submarine&amp;#039;s periscope often only reaches a height of 60 cm above the surface of the water&lt;br /&gt;
### Tall waves and the curvature of the Earth may mean that ships are only partially visible&lt;br /&gt;
### Bad weather conditions may mean that a clear horizon line is not visible, and ships/targets blend in with the background&lt;br /&gt;
## There may be non-significant anomalies in the obtained image that are not ships/targets and should be ignored by the system (e.g. landmasses, birds or sea animals on the surface, infrastructure on the surface of the water).&lt;br /&gt;
&lt;br /&gt;
The project will involve a thorough literature review of object detection methods using image processing techniques in maritime applications, along with software development and testing. This is an industry-sponsored project, sponsored by SAAB Australia.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Aim&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The project aimed to fulfil several goals, including:&lt;br /&gt;
#	Research and review existing image processing techniques on the topic of object and/or ship detection and recognition through optical images. An understanding of existing techniques is important to determine whether and how they can be incorporated into any designs produced as the output of the project.&lt;br /&gt;
#	Develop a system that: &lt;br /&gt;
##	at close range to a detected ship, can recognise the ship type, estimate its range from the submarine and estimate its orientation; &lt;br /&gt;
##	at long range to a detected ship, can provide guidance as to what the ship might be (type verification) and provide an estimate of its range and orientation. &lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Motivation&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The project was sponsored by Saab Australia, who have a major influence in the Australian Defence industry. With the upcoming submarine research and development plans, Saab Australia is highly motivated to conduct research into the different technologies that may be incorporated in a submarine.&lt;br /&gt;
For more than a century, submarine operators relied solely on direct-view optical periscopes to gain an insight into the environment above the surface of the water [1]. For much of this time, operators relied on black-and-white images of ship silhouettes to identify any vessels viewed through the periscope. However, with the evolution of technology, electronic periscopes have been developed, providing many forms of assistance to operators. The motivation for the research conducted in this project is to develop and test image processing techniques for ship and threat detection in the optical images obtainable through a submarine&amp;#039;s electronic periscope, providing guidance to operators in identifying the environment around them.&lt;br /&gt;
&lt;br /&gt;
== Literature Review ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
== Research Methodology ==&lt;br /&gt;
&lt;br /&gt;
Three horizon detection methods and two target detection methods were researched and implemented following this research methodology. Of these, two horizon detection methods and one target detection method, with possible enhancements, are discussed in this thesis. An overview and some results of the methods investigated by the project partner are included as a comparison to the methods described in detail in this document. Those approaches can be found in the thesis document produced by the project partner, which is listed in the references. &lt;br /&gt;
&lt;br /&gt;
For testing the produced algorithms, a large data set of imagery was required. This was initially acquired using imagery available from the internet. An issue with this method of data collection lay in the fact that the sources of the imagery were inconsistent, resulting in inconsistent quality and relevance of the imagery to the application. Thus, further imagery was obtained first-hand by recording footage of a ship dock, where there was plenty of movement as different ships arrived at and departed the docks. Single frames were then extracted from the captured footage to test the algorithms.&lt;br /&gt;
&lt;br /&gt;
== Results ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;/div&gt;</summary>
		<author><name>A1687420</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=12228</id>
		<title>Projects:2018s1-110 Future Submarine Project</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=12228"/>
		<updated>2018-10-21T14:24:32Z</updated>

		<summary type="html">&lt;p&gt;A1687420: /* Introduction */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Students&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Tharidu Maliduwa Arachchige &lt;br /&gt;
&lt;br /&gt;
Jacob Parker&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Dr. Danny Gibbins &lt;br /&gt;
&lt;br /&gt;
Igor Dzeba (SAAB)&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
This project involves the research into and development of a Contact Detection method to be used with a Submarine Optronics System. The aims of this project include:&lt;br /&gt;
 &lt;br /&gt;
# Detecting the presence of a ship at long range and, if possible, estimating its range and aspect.&lt;br /&gt;
# At closer range, the ship or target should be detected, the type of ship identified/verified, and its range and aspect/orientation determined. &lt;br /&gt;
# Certain challenges must be overcome, such as:&lt;br /&gt;
## Accumulating a sufficient database storing ship models, for the large number of different ship types that would need to be recognised by the system.&lt;br /&gt;
## Limited viewing conditions, as:&lt;br /&gt;
### The submarine&amp;#039;s periscope often only reaches a height of 60 cm above the surface of the water&lt;br /&gt;
### Tall waves and the curvature of the Earth may mean that ships are only partially visible&lt;br /&gt;
### Bad weather conditions may mean that a clear horizon line is not visible, and ships/targets blend in with the background&lt;br /&gt;
## There may be non-significant anomalies in the obtained image that are not ships/targets and should be ignored by the system (e.g. landmasses, birds or sea animals on the surface, infrastructure on the surface of the water).&lt;br /&gt;
&lt;br /&gt;
The project will involve a thorough literature review of object detection methods using image processing techniques in maritime applications, along with software development and testing. This is an industry-sponsored project, sponsored by SAAB Australia.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Aim&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The project aimed to fulfil several goals, including:&lt;br /&gt;
# Research and review existing image processing techniques for object and/or ship detection and recognition in optical images. An understanding of existing techniques is important for determining whether, and how, they can be incorporated into the designs produced for the project.&lt;br /&gt;
# Develop a system that&lt;br /&gt;
## At close range to a detected ship, recognises the ship type and estimates its range from the submarine and its orientation.&lt;br /&gt;
## At long range to a detected ship, provides guidance as to what the ship might be (type verification) and an estimate of its range and orientation.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Motivation&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The project was sponsored by Saab Australia, which has a major influence on the Australian defence industry. With the upcoming submarine research and development plans, Saab Australia is highly motivated to conduct research into the different technologies that may be incorporated in a submarine.&lt;br /&gt;
For more than a century, submarine operators relied solely on direct-view optical periscopes to gain insight into the environment above the surface of the water [1]. For much of this time, operators relied on black-and-white images of ship silhouettes to identify any vessels viewed through the periscope. With the evolution of technology, however, electronic periscopes have been developed, providing many forms of assistance to operators. The motivation for the research conducted in this project is to develop and test image processing techniques for ship and threat detection in the optical images obtained through a submarine&amp;#039;s electronic periscope, providing guidance to operators in identifying the environment around them.&lt;br /&gt;
&lt;br /&gt;
== Literature Review ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
== Research Methodology ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
== Results ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;/div&gt;</summary>
		<author><name>A1687420</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=12227</id>
		<title>Projects:2018s1-110 Future Submarine Project</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=12227"/>
		<updated>2018-10-21T14:21:52Z</updated>

		<summary type="html">&lt;p&gt;A1687420: /* Introduction */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Students&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Tharidu Maliduwa Arachchige &lt;br /&gt;
&lt;br /&gt;
Jacob Parker&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Dr. Danny Gibbins &lt;br /&gt;
&lt;br /&gt;
Igor Dzeba (SAAB)&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
This project involves research into and the development of a Contact Detection method to be used with a Submarine Optronics System. The aims of this project include:&lt;br /&gt;
 &lt;br /&gt;
# Detecting the presence of a ship at long range and, if possible, estimating its range and aspect&lt;br /&gt;
# At closer range, detecting the ship or target, identifying/verifying its type, and determining its range and aspect/orientation.&lt;br /&gt;
# Certain challenges must be overcome, such as:&lt;br /&gt;
## Accumulating a sufficient database storing ship models, for the large number of different ship types that would need to be recognised by the system.&lt;br /&gt;
## Limited viewing conditions:&lt;br /&gt;
### The submarine&amp;#039;s periscope often reaches a height of only 60 cm above the surface of the water&lt;br /&gt;
### Tall waves and the curvature of the Earth may mean that ships are only partially visible&lt;br /&gt;
### Bad weather conditions mean that a clear horizon line may not be visible and that ships/targets blend into the background&lt;br /&gt;
## There may be anomalies in the obtained image that are not ships/targets and should be ignored by the system (e.g. landmasses, birds or sea animals on the surface, infrastructure on the water).&lt;br /&gt;
&lt;br /&gt;
The project will involve a thorough literature review of object detection methods that use image processing techniques in maritime applications, together with software development and testing. This is an industry project sponsored by SAAB Australia.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Aim&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The project aimed to fulfil several goals, including:&lt;br /&gt;
# Research and review existing image processing techniques for object and/or ship detection and recognition in optical images. An understanding of existing techniques is important for determining whether, and how, they can be incorporated into the designs produced for the project.&lt;br /&gt;
# Develop a system that&lt;br /&gt;
## At close range to a detected ship, recognises the ship type and estimates its range from the submarine and its orientation.&lt;br /&gt;
## At long range to a detected ship, provides guidance as to what the ship might be (type verification) and an estimate of its range and orientation.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Motivation&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The project was sponsored by Saab Australia, which has a major influence on the Australian defence industry. With the upcoming submarine research and development plans, Saab Australia is highly motivated to conduct research into the different technologies that may be incorporated in a submarine.&lt;br /&gt;
For more than a century, submarine operators relied solely on direct-view optical periscopes to gain insight into the environment above the surface of the water [1]. For much of this time, operators relied on black-and-white images of ship silhouettes to identify any vessels viewed through the periscope. With the evolution of technology, however, electronic periscopes have been developed, providing many forms of assistance to operators. The motivation for the research conducted in this project is to develop and test image processing techniques for ship and threat detection in the optical images obtained through a submarine&amp;#039;s electronic periscope, providing guidance to operators in identifying the environment around them.&lt;br /&gt;
&lt;br /&gt;
== Literature Review ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
== Research Methodology ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
== Results ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;/div&gt;</summary>
		<author><name>A1687420</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=12226</id>
		<title>Projects:2018s1-110 Future Submarine Project</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=12226"/>
		<updated>2018-10-21T14:17:11Z</updated>

		<summary type="html">&lt;p&gt;A1687420: /* Introduction */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Students&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Tharidu Maliduwa Arachchige &lt;br /&gt;
&lt;br /&gt;
Jacob Parker&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Dr. Danny Gibbins &lt;br /&gt;
&lt;br /&gt;
Igor Dzeba (SAAB)&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
This project involves research into and the development of a Contact Detection method to be used with a Submarine Optronics System. The aims of this project include:&lt;br /&gt;
 &lt;br /&gt;
# Detecting the presence of a ship at long range and, if possible, estimating its range and aspect&lt;br /&gt;
# At closer range, detecting the ship or target, identifying/verifying its type, and determining its range and aspect/orientation.&lt;br /&gt;
# Certain challenges must be overcome, such as:&lt;br /&gt;
## Accumulating a sufficient database storing ship models, for the large number of different ship types that would need to be recognised by the system.&lt;br /&gt;
## Limited viewing conditions:&lt;br /&gt;
### The submarine&amp;#039;s periscope often reaches a height of only 60 cm above the surface of the water&lt;br /&gt;
### Tall waves and the curvature of the Earth may mean that ships are only partially visible&lt;br /&gt;
### Bad weather conditions mean that a clear horizon line may not be visible and that ships/targets blend into the background&lt;br /&gt;
## There may be anomalies in the obtained image that are not ships/targets and should be ignored by the system (e.g. landmasses, birds or sea animals on the surface, infrastructure on the water).&lt;br /&gt;
&lt;br /&gt;
The project will involve a thorough literature review of object detection methods that use image processing techniques in maritime applications, together with software development and testing. This is an industry project sponsored by SAAB Australia.&lt;br /&gt;
&lt;br /&gt;
== Introduction ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Aim&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The project aimed to fulfil several goals, including:&lt;br /&gt;
# Research and review existing image processing techniques for object and/or ship detection and recognition in optical images. An understanding of existing techniques is important for determining whether, and how, they can be incorporated into the designs produced for the project.&lt;br /&gt;
# Develop a system that&lt;br /&gt;
## At close range to a detected ship, recognises the ship type and estimates its range from the submarine and its orientation.&lt;br /&gt;
## At long range to a detected ship, provides guidance as to what the ship might be (type verification) and an estimate of its range and orientation.&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Motivation&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
The project was sponsored by Saab Australia, which has a major influence on the Australian defence industry. With the upcoming submarine research and development plans, Saab Australia is highly motivated to conduct research into the different technologies that may be incorporated in a submarine.&lt;br /&gt;
For more than a century, submarine operators relied solely on direct-view optical periscopes to gain insight into the environment above the surface of the water [1]. For much of this time, operators relied on black-and-white images of ship silhouettes to identify any vessels viewed through the periscope. With the evolution of technology, however, electronic periscopes have been developed, providing many forms of assistance to operators. The motivation for the research conducted in this project is to develop and test image processing techniques for ship and threat detection in the optical images obtained through a submarine&amp;#039;s electronic periscope, providing guidance to operators in identifying the environment around them.&lt;br /&gt;
&lt;br /&gt;
== Literature Review ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
== Research Methodology ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
== Results ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
== Conclusion ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;...&amp;#039;&amp;#039;&amp;#039;&lt;/div&gt;</summary>
		<author><name>A1687420</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=9971</id>
		<title>Projects:2018s1-110 Future Submarine Project</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=9971"/>
		<updated>2018-04-13T06:40:27Z</updated>

		<summary type="html">&lt;p&gt;A1687420: /* Project Team */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Students&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Tharidu Maliduwa Arachchige &lt;br /&gt;
&lt;br /&gt;
Jacob Parker&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Dr. Danny Gibbins &lt;br /&gt;
&lt;br /&gt;
Igor Dzeba (SAAB)&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
This project involves research into and the development of a Contact Detection method to be used with a Submarine Optronics System. The aims of this project include:&lt;br /&gt;
 &lt;br /&gt;
# Detecting the presence of a ship at long range and, if possible, estimating its range and aspect&lt;br /&gt;
# At closer range, detecting the ship or target, identifying/verifying its type, and determining its range and aspect/orientation.&lt;br /&gt;
# Certain challenges must be overcome, such as:&lt;br /&gt;
## Accumulating a sufficient database storing ship models, for the large number of different ship types that would need to be recognised by the system.&lt;br /&gt;
## Limited viewing conditions:&lt;br /&gt;
### The submarine&amp;#039;s periscope often reaches a height of only 60 cm above the surface of the water&lt;br /&gt;
### Tall waves and the curvature of the Earth may mean that ships are only partially visible&lt;br /&gt;
### Bad weather conditions mean that a clear horizon line may not be visible and that ships/targets blend into the background&lt;br /&gt;
## There may be anomalies in the obtained image that are not ships/targets and should be ignored by the system (e.g. landmasses, birds or sea animals on the surface, infrastructure on the water).&lt;br /&gt;
&lt;br /&gt;
The project will involve a thorough literature review of object detection methods that use image processing techniques in maritime applications, together with software development and testing. This is an industry project sponsored by SAAB Australia.&lt;/div&gt;</summary>
		<author><name>A1687420</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=9970</id>
		<title>Projects:2018s1-110 Future Submarine Project</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=9970"/>
		<updated>2018-04-13T06:40:04Z</updated>

		<summary type="html">&lt;p&gt;A1687420: /* Project Team */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Students&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Tharidu Maliduwa Arachchige&lt;br /&gt;
Jacob Parker&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Dr. Danny Gibbins &lt;br /&gt;
Igor Dzeba (SAAB)&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
This project involves research into and the development of a Contact Detection method to be used with a Submarine Optronics System. The aims of this project include:&lt;br /&gt;
 &lt;br /&gt;
# Detecting the presence of a ship at long range and, if possible, estimating its range and aspect&lt;br /&gt;
# At closer range, detecting the ship or target, identifying/verifying its type, and determining its range and aspect/orientation.&lt;br /&gt;
# Certain challenges must be overcome, such as:&lt;br /&gt;
## Accumulating a sufficient database storing ship models, for the large number of different ship types that would need to be recognised by the system.&lt;br /&gt;
## Limited viewing conditions:&lt;br /&gt;
### The submarine&amp;#039;s periscope often reaches a height of only 60 cm above the surface of the water&lt;br /&gt;
### Tall waves and the curvature of the Earth may mean that ships are only partially visible&lt;br /&gt;
### Bad weather conditions mean that a clear horizon line may not be visible and that ships/targets blend into the background&lt;br /&gt;
## There may be anomalies in the obtained image that are not ships/targets and should be ignored by the system (e.g. landmasses, birds or sea animals on the surface, infrastructure on the water).&lt;br /&gt;
&lt;br /&gt;
The project will involve a thorough literature review of object detection methods that use image processing techniques in maritime applications, together with software development and testing. This is an industry project sponsored by SAAB Australia.&lt;/div&gt;</summary>
		<author><name>A1687420</name></author>
		
	</entry>
	<entry>
		<id>https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=9969</id>
		<title>Projects:2018s1-110 Future Submarine Project</title>
		<link rel="alternate" type="text/html" href="https://projectswiki.eleceng.adelaide.edu.au/projects/index.php?title=Projects:2018s1-110_Future_Submarine_Project&amp;diff=9969"/>
		<updated>2018-04-13T06:39:26Z</updated>

		<summary type="html">&lt;p&gt;A1687420: Created page&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
== Project Team ==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Students&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Tharidu Maliduwa Arachchige&lt;br /&gt;
&lt;br /&gt;
Jacob Parker&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Supervisors&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
Dr. Danny Gibbins &lt;br /&gt;
&lt;br /&gt;
Igor Dzeba (SAAB)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Abstract ==&lt;br /&gt;
This project involves research into and the development of a Contact Detection method to be used with a Submarine Optronics System. The aims of this project include:&lt;br /&gt;
 &lt;br /&gt;
# Detecting the presence of a ship at long range and, if possible, estimating its range and aspect&lt;br /&gt;
# At closer range, detecting the ship or target, identifying/verifying its type, and determining its range and aspect/orientation.&lt;br /&gt;
# Certain challenges must be overcome, such as:&lt;br /&gt;
## Accumulating a sufficient database storing ship models, for the large number of different ship types that would need to be recognised by the system.&lt;br /&gt;
## Limited viewing conditions:&lt;br /&gt;
### The submarine&amp;#039;s periscope often reaches a height of only 60 cm above the surface of the water&lt;br /&gt;
### Tall waves and the curvature of the Earth may mean that ships are only partially visible&lt;br /&gt;
### Bad weather conditions mean that a clear horizon line may not be visible and that ships/targets blend into the background&lt;br /&gt;
## There may be anomalies in the obtained image that are not ships/targets and should be ignored by the system (e.g. landmasses, birds or sea animals on the surface, infrastructure on the water).&lt;br /&gt;
&lt;br /&gt;
The project will involve a thorough literature review of object detection methods that use image processing techniques in maritime applications, together with software development and testing. This is an industry project sponsored by SAAB Australia.&lt;/div&gt;</summary>
		<author><name>A1687420</name></author>
		
	</entry>
</feed>