Friday, 12 February 2016

Team Pi Over Four

 
Introduction:
The following test demonstrates the three key areas of the Lampbotics sprint 1, which are:
  • Servo movements based on the current position obtained from face tracking
  • Face tracking through OpenCV to obtain X, Y and Z coordinates
  • Sound, also based on the current position obtained from face tracking
The basic design of the tests below uses threads to communicate the values obtained from the face-detection section of the design, which initiate predefined sound and servo actions based on the values of X, Y and Z.
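The thread layout described above can be sketched as follows. This is an illustrative minimal sketch, not the project's actual code: the names (`face_x`, `vision_thread`, `servo_thread`) and the stand-in detection values are assumptions. A shared variable holds the latest X position written by the vision thread, and the servo thread polls it under a lock.

```python
import threading
import time

face_x = 90                        # shared X position, 0-180 degrees
lock = threading.Lock()

def vision_thread():
    """Stand-in for the face-detection thread: publishes X positions."""
    global face_x
    for x in (90, 120, 60):        # placeholder for detection output
        with lock:
            face_x = x
        time.sleep(0.1)

def servo_thread(log):
    """Stand-in for the servo thread: reads the shared X position."""
    for _ in range(3):
        with lock:
            x = face_x
        log.append(x)              # placeholder for "move servo to x"
        time.sleep(0.1)

log = []
v = threading.Thread(target=vision_thread)
s = threading.Thread(target=servo_thread, args=(log,))
v.start(); s.start()
v.join(); s.join()
```

In the real design the sound thread would read the same shared value in the same way, so all consumers act on whatever position the vision thread published most recently.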

Hardware:
The hardware setup uses the Pi camera mounted on a servo bracket; for demonstration purposes only one of the position values, the X position, was implemented.
The wiring diagram for the servo setup is shown below. The servo connects to the Raspberry Pi through the following pin connections:
  • Servo Red wire connected to 3.3V pin 2 on raspberry pi  
  • Servo Black wire connected to GND pin 6 on raspberry pi   
  • Servo White wire connected to Signal pin 11 on raspberry pi 
The diagram below also shows that the servos used are limited to 180 degrees of rotation, which means a person can only be tracked within a 180-degree range.

Also shown below is the GPIO pinout, which was used to locate the pins for connecting the servos to the Raspberry Pi:
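As a hedged sketch of how the servo on pin 11 could be driven, the conversion below maps a 0-180 degree angle onto a PWM duty cycle, assuming a common hobby-servo pulse range of 1-2 ms at 50 Hz (this mapping is an assumption, not taken from the project code). The commented-out lines show how the duty cycle would be applied with the `RPi.GPIO` library on the Pi itself.

```python
def angle_to_duty(angle):
    """Map 0-180 degrees onto a 5-10 % duty cycle (1-2 ms pulse at 50 Hz)."""
    angle = max(0, min(180, angle))    # clamp to the servo's range
    return 5.0 + (angle / 180.0) * 5.0

# On the Raspberry Pi (not run here):
#   import RPi.GPIO as GPIO
#   GPIO.setmode(GPIO.BOARD)           # use board pin numbering
#   GPIO.setup(11, GPIO.OUT)           # signal pin from the wiring list
#   pwm = GPIO.PWM(11, 50)             # 50 Hz servo signal
#   pwm.start(angle_to_duty(90))       # centre the servo
```

Because the servo only covers 0-180 degrees, clamping the angle before conversion keeps the duty cycle inside the range the servo can actually reach.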
Test Demonstration:
The following video clip demonstrates the operation of capturing an image, processing the face position, relaying the position information to the servo and sound threads, and acting upon the values, which for demonstration purposes included only the X position.
 
 Results:
  • Vision: The vision section of the design demonstrates the vision capabilities of the Raspberry Pi through Python code, which was able to process an image and identify human features, in this case a person's frontal face. The values obtained from processing the image could then be shared globally with the other threads. Because the detection is based on the frontal features of a human face, the face is lost as the camera moves.
  • Sound: The sound thread received the position value shared globally by the vision thread. If the received value indicated an adequate change in position, the code would have the Raspberry Pi play a saved MP3 file.
  • Motors: The motor thread would receive the value from the vision thread. It would then take this value, compute where it was to move to, and send this through the GPIO pin to move the motor to the new position. The motor setup used in this demonstration was a simple version; for future work the XMOS board being developed by Filipe Terra would be used to control multiple servos.
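One step the results above rely on is turning a detected face's position in the image into an angle for the servo. The sketch below is illustrative (the function name and frame width are assumptions): it maps the bounding box that an OpenCV Haar-cascade detection (`cv2.CascadeClassifier(...).detectMultiScale(gray)`) returns as `(x, y, w, h)` pixel values onto the 0-180 degree range the servo expects.

```python
def face_to_angle(x, w, frame_width):
    """Map a face bounding box (x offset, width, in pixels) to 0-180 degrees."""
    centre = x + w / 2.0               # horizontal centre of the face
    return (centre / frame_width) * 180.0

# A face centred in a 640-pixel-wide frame maps to the servo midpoint:
# face_to_angle(280, 80, 640) -> 90.0
```

A face at the left edge of the frame maps near 0 degrees and one at the right edge near 180 degrees, matching the servo's 180-degree tracking range noted in the hardware section.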

Reflection
From this test we were able to better understand the flow of the code and how all the different threads interact. When the servos moved to a new position the position change was not recorded, which led to sporadic movements. To eliminate this type of bug, the code shown in the appendix was developed, which could then be implemented to solve it.

Authors:
Philip O Connor
David O Mahony
Team Pi Over Four, entry 2
 
Appendix:
# Servo range: 0 to 180 degrees
past = 90                        # last commanded servo angle

Xposition = 0                    # value taken from face detection

if Xposition > past + 10:        # face has moved far enough one way
    move = past + 10
    # motor control code using move value
    past = past + 10
elif Xposition < past - 10:      # face has moved far enough the other way
    move = past - 10
    # motor control code using move value
    past = past - 10
else:
    pass                         # motors hold last value
