Projects:2018s2-270 Autonomous Ground Vehicles Self-Guided Formation Control
Abstract
This project introduces a distributed system that can handle multiple agents at once. Each agent in the system carries its own camera and makes its own decisions. However, those decisions also depend on communication between the agents: the exchanged data include the coordinates of each agent's current position, its wheel angle, its velocity, and the distances between positions. By integrating this information, the system arrives at the best solution for the current scenario and commands each agent to move to its desired position.
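As a minimal sketch of the data each agent could broadcast, the Python fragment below defines an illustrative state message with the fields named in the abstract (position coordinates, wheel angle, velocity) and a helper for the inter-agent distances. The class and field names are assumptions for illustration, not the team's actual message format.

<pre>
from dataclasses import dataclass

@dataclass
class AgentState:
    """Illustrative inter-agent message (field names are assumptions)."""
    agent_id: int
    x: float            # current position, x coordinate (m)
    y: float            # current position, y coordinate (m)
    wheel_angle: float  # steering/wheel angle (rad)
    velocity: float     # forward speed (m/s)

def distance(a: AgentState, b: AgentState) -> float:
    """Euclidean distance between two agents' reported positions."""
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5
</pre>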
Project Team
Group Members:
Liang Xu
Fan Yang
Qiao Sang
Advisors:
Xin Yuan
Yutong Liu
Yuan Sun
Supervisors:
Prof. Peng Shi
Prof. Cheng-Chew Lim
Objectives:
Based on the information we have so far, the objectives of this project for the first semester can be summarised as follows:

1. Vision-based sensing.

2. Route planning.

3. Formation control.

4. Development of a distributed control system.
System Design:
Starting from the highest level, the multi-agent system contains four agents. It is a distributed system, which means each agent can make its own decisions. Even with this ability, the four agents still need to communicate with each other about their velocities, current positions, destinations and heading angles. Exchanging these data allows the agents to reach an agreement so that they can achieve the desired formation. This part was done by Fan.

Limited by the tools available to us, we have to forgo some more convenient ways of measuring distance, such as ultrasonic sensors or infrared range finders. Instead, we use the camera to build a coordinate system for measuring the distances between objects and the agents. We also use the camera for image recognition, so that each agent can recognise the objects in its "sight". By passing this information to the processor, our algorithm provides an optimised path to the agent, and the agent then issues commands to the motors. The description above covers only a single agent; a sketch of how one agent could combine neighbour states and camera measurements is given below.
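The sketch below illustrates the two ideas described above for a single agent: a pinhole-camera range estimate in place of ultrasonic or infrared sensing, and one step of a consensus-based displacement formation law driven by the states received from the other agents. The calibration numbers, function names and the square formation are assumptions for illustration only; they are not the project's actual algorithms or parameters.

<pre>
# --- Vision: pinhole-model range estimate (assumed calibration values) ---
FOCAL_LENGTH_PX = 700.0   # assumed camera focal length in pixels
MARKER_WIDTH_M = 0.20     # assumed real-world width of a detected object (m)

def estimate_distance(pixel_width: float) -> float:
    """Approximate distance to a detected object from its apparent width.

    Uses the pinhole-camera relation distance = f * W / w, where f is the
    focal length in pixels, W the real width and w the width in the image.
    """
    return FOCAL_LENGTH_PX * MARKER_WIDTH_M / pixel_width

# --- Formation control: one consensus step for a single agent ---
def formation_step(my_pos, my_offset, neighbour_states, gain=0.5):
    """Compute a velocity command that drives this agent towards its slot.

    my_pos           : (x, y) current position of this agent
    my_offset        : (x, y) desired offset of this agent in the formation
    neighbour_states : list of ((x, y) position, (x, y) offset) received
                       from the other agents over the network
    Returns a (vx, vy) velocity command (displacement-based consensus law).
    """
    vx, vy = 0.0, 0.0
    for (nx, ny), (ox, oy) in neighbour_states:
        # error between the actual relative position and the desired one
        vx -= gain * ((my_pos[0] - nx) - (my_offset[0] - ox))
        vy -= gain * ((my_pos[1] - ny) - (my_offset[1] - oy))
    return vx, vy

# Example: four agents asked to form a 1 m square
if __name__ == "__main__":
    offsets = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
    positions = [(0.2, -0.1), (1.3, 0.4), (0.8, 1.2), (-0.3, 0.9)]
    me = 0
    neighbours = [(positions[j], offsets[j]) for j in range(4) if j != me]
    print(formation_step(positions[me], offsets[me], neighbours))
    print(estimate_distance(pixel_width=140.0))  # ~1 m for the assumed values
</pre>

In a law of this kind, repeating the step while every agent rebroadcasts its updated position drives the group towards the desired shape, provided the communication graph between the four agents stays connected and the gain is kept small.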