The face tracking was coded by Sean in Python. David and Herve worked on the threads.
I produced the code that received the values sent from Sean's Python code to the Mbed. After installing the Windows driver from the Mbed site, I used a few lines of code to read the data from the serial port. I then used case statements to call a thread, or threads, depending on the number received. As each value arrived, the direction of the servos was shown on the LCD.
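The case-statement dispatch described above can be sketched as follows. The actual code ran on the Mbed in its C-like language; this is an illustrative Python version of the same idea, and the value-to-direction mapping is an assumption, not the values we actually used.

```python
def dispatch(value):
    """Map a received serial value to a servo action, the way the case
    statements on the Mbed selected a thread for each number received.
    The specific numbers and direction names here are hypothetical."""
    directions = {
        1: "pan left",
        2: "pan right",
        3: "tilt up",
        4: "tilt down",
    }
    # Unrecognised values fall through to "hold", like a default case.
    return directions.get(value, "hold")
```

In practice the returned action would trigger the corresponding servo thread and be echoed to the LCD.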
Once everything was combined, the face tracking moved the servos reasonably smoothly to bring the face to the centre point.
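Centring of this kind can be sketched as a simple proportional correction: nudge the servo in proportion to how far the face is from the centre of the frame, and hold still once it is close enough. This is an illustrative sketch, not our actual control code; the frame width, gain, and deadband values are assumptions.

```python
def centre_step(face_x, frame_width=320, gain=0.05, deadband=10):
    """Return a small servo correction that moves the face toward the
    centre of the frame. Inside the deadband the servo holds still,
    which avoids jitter around the centre point."""
    error = face_x - frame_width / 2
    if abs(error) <= deadband:
        return 0.0
    # Negative sign: if the face is to the right of centre, pan left
    # (in this assumed sign convention) to bring it back.
    return -gain * error
```

Repeating this correction on every tracking update is what makes the servos settle smoothly on the centre rather than overshooting.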
Although we worked as a team, Sean and I worked on getting the Mbed to link up with and receive data from RoboRealm using Python. I have very little working knowledge of Python programming, but watching Sean use it, it proved to be a powerful scripting language working in parallel with RoboRealm. I had very little input into RoboRealm and left it to Sean's expertise; my comfort zone is the physical world of electronics rather than in-depth programming. I feel I missed an opportunity to get a more hands-on grounding in computer programming from Sean rather than sticking with the Mbed.
I initially used RoboRealm for face tracking but didn't know where to begin linking it with the comms. I spent my time getting the servos operating smoothly on the chassis, and on the initial code to receive data, first from HyperTerminal and then from RoboRealm through the face tracking.
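The smooth servo operation mentioned above comes down to not jumping straight to a new target position but stepping toward it in bounded increments. A minimal sketch of that idea, with an assumed step size in degrees:

```python
def step_toward(current, target, max_step=2.0):
    """Move the servo position one bounded increment toward the target.
    Limiting each update to max_step degrees keeps the motion smooth
    instead of slamming the servo straight to the new position."""
    delta = target - current
    if abs(delta) <= max_step:
        return target
    return current + max_step if delta > 0 else current - max_step
```

Calling this once per update loop walks the servo gradually from its current angle to the commanded angle.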
It can be frustrating to listen to and watch how others can quickly have a working model while I was still trying to install the Windows driver (many thanks to Ali for pointing me in the right direction). Over the three weeks the objectives were met, with the face tracking and the servos operating.
With the merging of the two disciplines of computer programming and applied electronics, the main emphasis at the moment seems to be on programming rather than on applying it to electronic circuits.