Projects:2014S1-16 Automatic Sorter using Computer Vision

The aim of this project is to integrate computer vision with autonomous robotics to sort playing cards. The project is primarily a demonstration tool for potential electronic engineering students to interactively show technology capabilities.

Completed Project.

Project information

The objectives identified at the beginning of the project were as follows:

  • Sort a full deck of standard playing cards
  • Use computer vision to differentiate between cards
  • Perform the following sorts:
    • Full Sort
    • Suit Sort
    • Colour Sort
    • Value Sort
  • Focus on electrical engineering, particularly image processing, and reduce the mechanical requirements


This project was tackled by breaking it into four subsections.

  • Computer Vision
  • Robotics
  • Card Sorting Algorithms
  • Graphical User Interface

Project Breakdown

Image Processing

Image Processing Block Diagram.

The purpose of the image processing software of this project is to distinguish between different cards.

This is done using a four-step method (a sketch of these steps appears after the list):

  • Find the outline of the playing card on the black background
  • Crop and warp the playing card so that it is a perfect rectangle
  • Crop the suit and value images from the top left corner
  • Run Optical Character Recognition software on the suit and value images
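
A minimal sketch of these four steps is given below, assuming MATLAB with the Image Processing and Computer Vision System toolboxes; the 200 x 300 output size, the corner crop windows and the OCR character set are illustrative guesses rather than the project's actual parameters.

  % Step 0: snapshot from the webcam, thresholded so the card is white on black
  img  = imread('card_snapshot.jpg');
  gray = rgb2gray(img);
  bw   = im2bw(gray, graythresh(gray));

  % Step 1: outline of the card = boundary of the largest white blob
  stats  = regionprops(bw, 'Area', 'PixelIdxList');
  [~, k] = max([stats.Area]);
  mask   = false(size(bw));
  mask(stats(k).PixelIdxList) = true;
  B        = bwboundaries(mask);
  boundary = B{1};                                  % [row, col] points of the outline

  % Step 2: estimate the four corners and warp the card to a perfect rectangle
  x = boundary(:, 2);  y = boundary(:, 1);
  [~, iTL] = min(x + y);  [~, iBR] = max(x + y);    % corner heuristics
  [~, iTR] = min(y - x);  [~, iBL] = max(y - x);
  corners = [x(iTL) y(iTL); x(iTR) y(iTR); x(iBR) y(iBR); x(iBL) y(iBL)];
  target  = [1 1; 200 1; 200 300; 1 300];           % upright 200 x 300 card
  tform   = fitgeotrans(corners, target, 'projective');
  card    = imwarp(gray, tform, 'OutputView', imref2d([300 200]));

  % Step 3: crop the value and suit patches from the top-left corner
  valuePatch = imcrop(card, [5 5 35 45]);
  suitPatch  = imcrop(card, [5 55 35 40]);

  % Step 4: run OCR on the cropped patches (shown here for the value patch only)
  result = ocr(valuePatch, 'CharacterSet', 'A23456789JQK10');
  value  = strtrim(result.Text);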

Refer to the linked pages for details of the digital image processing techniques used in each of the above steps.

Image Processing Steps Graphically Represented. (a) Original Image; (b) Outline of Card; (c) Corners of Card; (d) Suit and Value of Card Identified to be Cropped

Robotics and Kinematics

Arduino Program

  • Connect with MATLAB via USB
  • Receive from MATLAB the new target angle for each servo
  • Report back to MATLAB when the robotic arm has finished moving (the MATLAB side of this exchange is sketched below)
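
A minimal sketch of the MATLAB side of this exchange is given below, using the classic serial interface; the COM port, baud rate and comma-separated message format are assumptions rather than the project's actual protocol.

  s = serial('COM3', 'BaudRate', 9600);         % Botboarduino appears as a USB serial port
  fopen(s);

  angles = [90 45 120 60 30 10];                % one target angle per servo, in degrees
  msg = sprintf('%d,%d,%d,%d,%d,%d', angles);
  fprintf(s, msg);                              % send the new pose; terminator appended by default

  reply = fgetl(s);                             % block until the arm reports that it has finished
  if strcmp(strtrim(reply), 'DONE')
      disp('Move complete, safe to continue');
  end

  fclose(s);
  delete(s);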

MATLAB Program

  • Decide which set of movements to use depending on where a card is picked from and where it is placed
  • Use inverse kinematics to determine the servo angles of the robotic arm, depending on where the card is to be placed and how high the stack is (a simplified sketch follows this list)
  • Apply small calibration corrections to the inverse kinematic solution
  • Include movements to ensure the robotic arm does not bump into anything
  • Include movements to ensure the robotic arm does not pick up two cards stuck together by electrostatic force
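
The inverse kinematics can be illustrated with a simplified planar two-link sketch: the base angle is taken from the horizontal direction to the target, and the shoulder and elbow angles from the reach and height. The link lengths below are placeholders rather than the AL5D's real dimensions, and the wrist, gripper and calibration corrections are omitted.

  function [baseDeg, shoulderDeg, elbowDeg] = cardIK(x, y, z)
      % Target (x, y, z) is measured in millimetres from the shoulder joint
      L1 = 150;  L2 = 185;                      % placeholder upper arm / forearm lengths

      baseDeg = atan2d(y, x);                   % rotate the base toward the card
      r = hypot(x, y);                          % reach in the horizontal plane

      % Two-link planar IK in the vertical plane through the target
      cosElbow = (r^2 + z^2 - L1^2 - L2^2) / (2 * L1 * L2);
      elbowDeg = acosd(min(max(cosElbow, -1), 1));
      shoulderDeg = atan2d(z, r) - ...
          atan2d(L2 * sind(elbowDeg), L1 + L2 * cosd(elbowDeg));
  end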

Card Sorting Algorithms

Bucket sort.

The cards are sorted via one of the following methods chosen using the GUI:

  • Separate Colours
  • Separate Suits
  • Separate Values
  • Select-A-Card (pick cards you want the robot to find)
  • Full Sort (back to a brand new deck order)

The full sort algorithm is based on a bucket/postman sort combined with a selection sort and occurs in three stages (a sketch of the logic follows this list):

  • Cards are partitioned into buckets dependent on their value, as shown in the image to the right
  • Each bucket is emptied out progressively onto the board
  • Cards are selected from the emptied-out buckets in order and placed in sorted stacks dependent on their suit
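
A minimal sketch of the bucket-then-selection logic on card labels alone is given below (the real system interleaves these stages with robot moves); the deck representation and suit order are assumptions.

  suits  = {'Clubs', 'Diamonds', 'Hearts', 'Spades'};   % suit order assumed for the final stacks
  values = 1:13;                                        % Ace .. King

  % A shuffled deck: each row is [suitIndex valueIndex]
  [S, V] = ndgrid(1:4, values);
  deck   = [S(:) V(:)];
  deck   = deck(randperm(52), :);

  % Stage 1: partition the deck into 13 buckets keyed by value
  buckets = cell(1, 13);
  for i = 1:52
      v = deck(i, 2);
      buckets{v}(end+1, :) = deck(i, :);
  end

  % Stages 2 and 3: work through the buckets in value order, pulling out one
  % suit at a time, which rebuilds the four sorted stacks of a fresh deck
  sorted = zeros(0, 2);
  for s = 1:4
      for v = values
          b = buckets{v};
          sorted(end+1, :) = b(b(:, 1) == s, :);
      end
  end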

Graphical User Interface (GUI)

The GUI was implemented to enhance the project's interactivity. It shows the viewer the image processing as it happens, with live snapshots of the images and the live decisions made by the computer vision on the value of each card. Similarly, because the table shows the entire data structure of cards that have been scanned and sorted, the user can not only track the sorting process but also ‘see’ what is below the top card in a given stack.
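
A minimal sketch of this kind of live update is given below, assuming a plain MATLAB figure with an axes for the snapshot and a uitable for the card data structure; the layout, variable names and table columns are illustrative rather than the project's actual GUI code.

  % Placeholder inputs standing in for the live snapshot and the vision decision
  cardImage = zeros(300, 200);
  value = 'K';  suit = 'Hearts';  stackName = 'Unsorted';  stackHeight = 12;

  fig = figure('Name', 'Automatic Card Sorter');
  ax  = axes('Parent', fig, 'Units', 'normalized', 'Position', [0.05 0.35 0.90 0.60]);
  tbl = uitable(fig, 'Units', 'normalized', 'Position', [0.05 0.05 0.90 0.25], ...
                'ColumnName', {'Stack', 'Position', 'Value', 'Suit'}, 'Data', cell(0, 4));

  % After each card is scanned, push the snapshot and the decision to the screen
  imshow(cardImage, 'Parent', ax);
  title(ax, sprintf('Detected: %s of %s', value, suit));
  set(tbl, 'Data', [get(tbl, 'Data'); {stackName, stackHeight, value, suit}]);
  drawnow;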

Project Significance

This project acts as a proof of concept for combining computer vision and robotics. It demonstrates that, with more time and more advanced hardware, the combination of the two could produce systems with great potential. Industries that could benefit from these types of systems include manufacturing, the medical sciences, the military and artificial intelligence, among others. The project also acts as a demonstration to entice future engineering students and to show off the possibilities of electrical engineering.

Team

Group members

  • Mr Daniel Currie
  • Mr Daniel Pacher
  • Mr Jonathan Petrinolis

Supervisors

  • Dr Brian Ng
  • Dr Braden Phillips

Team Member Responsibilities

The project responsibilities are allocated as follows:

  • Mr Daniel Currie - Image Processing
  • Mr Daniel Pacher - Sorting Algorithms/GUI/Hardware Selection
  • Mr Jonathan Petrinolis - Kinematics/Robotic Arm

Resources

  • Bench 16 in Projects Lab
  • Lynxmotion AL5D Robotic Arm
  • Arduino Botboarduino Microcontroller
  • Microsoft Lifecam Camera
  • MATLAB
  • Computer