Friday, 22 April 2016

Heng - Sprint 3: Final Reflection

Author: Heng Zhen Jing
Date: 22/4/2016
Title: Final Reflection

So, it has come to the end of the sprint. This reflection gives an overview of my opinions on the personal, technical, teamwork and project management sides of the module. First, a brief intro on what was completed in this final sprint: the Kivy interface has now been incorporated into the main threading system and can be successfully triggered by running the main thread. On the Kivy interface, three buttons were created: one to perform Color Detection, another to perform Face Recognition, and an exit button. When one of the buttons is pressed, a flag is sent and recognized by the associated function, which then runs after receiving its flag = True. More details can be found in my last blog post.

Personal Reflection
In short, this module is my favorite among all the others, mainly because of the practical working environment and the friendly, helpful teammates. Over these three sprints, I gained lots of knowledge about GUI frameworks, especially Kivy, which is the main GUI framework we use in Lampbotics. Besides that, in writing code for the MPG123 player to play sound files during Sprint 1, I learned about the different ways to play sound on the Raspberry Pi, such as Pygame, Omxplayer and MPG123. All these experiences boosted my familiarity and coding ability in Python on the Raspberry Pi 2. I am very grateful that this course enabled me to learn more about Python, as it is a very popular programming language in real working environments. Besides all the knowledge and skills I gained, a better relationship has been forged among all of us. We were constantly working together towards the same aims and goals, and we often communicated and cooperated with each other; to me, this is the crucial part of this module - teamwork. As this is the only teamwork-based project we have, Lampbotics served as a platform for us to practice our teamwork skills and spirit. I am very glad that we were all able to exhibit excellent teamwork and cooperation throughout the project.

Technical Reflection
In the beginning, I had zero knowledge of Python and the Raspberry Pi. With the help of others, I was able to get familiar with them. For us electronics-based students, assembly and C are the languages we deal with all the time, so Python sounded a little alien to me at first, but it turned out to be pretty simple to learn the basics. The whole module was divided into three sprints, and each person was assigned a specific task and goals for each sprint. Here are the tasks I was assigned throughout the sprints:

Sprint 1 - Sound effect of Lampbotics
In this sprint, I gained knowledge about playing sound on the Raspberry Pi 2. Different methods were researched, such as Pygame, OmxPlayer and MPG123, and we finally chose MPG123 to play the sound effects as it is easy to implement. Some issues were encountered and solved throughout the process, for instance configuring the sound settings to force the audio output to the 3.5mm headphone jack. All this researching, designing and debugging of code enhanced my knowledge of playing sound on the Raspberry Pi 2.
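As a rough illustration of that approach (not the actual project code - the function names and the -q flag here are my own assumptions), forcing the jack and playing a file boil down to two shell commands, which can be wrapped from Python like this:

```python
import subprocess

def force_headphone_jack_cmd():
    # On the Raspberry Pi, amixer numid=3 selects the audio route:
    # 0 = auto, 1 = 3.5mm headphone jack, 2 = HDMI
    return ["amixer", "cset", "numid=3", "1"]

def play_sound_cmd(path):
    # mpg123 plays an MP3 file from the command line; -q keeps it quiet
    return ["mpg123", "-q", path]

# On a Pi, these two calls route the audio and play the effect:
#   subprocess.call(force_headphone_jack_cmd())
#   subprocess.call(play_sound_cmd("/home/pi/sounds/effect.mp3"))
```

Building the commands as lists and handing them to `subprocess.call` avoids shell quoting issues with file paths.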

Sprint 2 - Graphical User Interface of Lampbotics
I was then assigned to design a GUI for Lampbotics so it can interact with users. A Raspberry Pi 2 with a 7-inch Touch Display was used, running an open-source GUI framework named Kivy, to let users choose the desired mode, such as Face Recognition, Color Detection or Shape Detection. First, a basic Kivy program was written that brings a camera view up on the screen, captures an image and displays it. This can be referred to in my blog entry #6 (Kivy with Camera).
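The actual program is in entry #6, but as a hedged sketch, a camera screen of this kind can be described in a few lines of Kivy's kv layout language (the resolution, button size and capture filename below are my own assumptions, not the project's):

```kv
# Minimal layout sketch: a live camera view with a capture button
BoxLayout:
    orientation: "vertical"
    Camera:
        id: camera
        resolution: (640, 480)
        play: True                # start the camera feed immediately
    Button:
        text: "Capture"
        size_hint_y: None
        height: "48dp"
        # export_to_png saves whatever the widget is showing to a file
        on_press: camera.export_to_png("capture.png")
```

Loading a file like this with Kivy's Builder gives the camera-plus-button screen without any widget-construction code in Python.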

Sprint 3 - Integration of Kivy
In the final sprint, Kivy was incorporated into the main threading design. Instead of being a stand-alone program, it now functions as part of the main system. Different flags are sent according to the button pressed, and these flags are recognized by the associated function. For instance, when the "Color" button is pressed, a flag indicating color = True is sent and received by the Color Detection program, which then runs.
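The flag mechanism described above can be sketched with Python's threading.Event; to be clear, the real design is Kamil's, and the names below are illustrative assumptions only:

```python
import threading

# Shared flags, one per mode (illustrative names, not the project's code)
flags = {"color": threading.Event(),
         "face": threading.Event(),
         "exit": threading.Event()}

def on_button_press(mode):
    # Kivy button callback: clear the other mode flags, then set the chosen one
    for name in ("color", "face"):
        flags[name].clear()
    flags[mode].set()

def color_detection_loop():
    # Worker thread: only performs Color Detection while its flag is set
    while not flags["exit"].is_set():
        if flags["color"].wait(timeout=0.1):
            pass  # one step of Color Detection would run here
```

Each detection program sits in its own thread and waits on its own event, so pressing a button wakes exactly the matching function.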

In Sprints 2 and 3, I gained lots of knowledge about Kivy and other GUI frameworks. To start with, several GUI frameworks were researched to decide which was best, and from this the pros and cons of each framework were determined. Kivy was found to be the most suitable, as it is user friendly and its interface is the most elegant of them all. After incorporating the Kivy interface into the main thread, bugs and errors were found, and they were all successfully overcome. All the issues faced and changes made can be found in blog entry #8 (Integration of Kivy). Throughout this, I acquired better familiarity and skill in writing Kivy code.

Threads & IPC
In order to integrate all the pieces into one whole system, a thread design by Kamil was used. Generally, a thread is an independent sequence of execution within a program: each service (GUI, Color Detection, Face Recognition) runs in its own thread, and the program can serve several of them concurrently, switching between threads as each one needs attention. Using Kamil's thread design, outputs (flags) are assigned in my program and then sent to the other programs via IPC. IPC, inter-process communication, refers to the mechanisms an operating system provides to allow the processes it manages to share data. For instance, the following callback function, called when the button associated with Face Recognition is pressed, outputs flags indicating face = True and color = False. If the flag is set for a particular function, the program within that function is run.

from PIL import Image

# Callback function for the photo button
def photo_callback(obj):
        # Send output flags to trigger the Face Recognition function
        face = True
        color = False
        # Define the thumbnail filename
        photoName = "thumbnail.jpg"
        # Resize the high-res photo to create the thumbnail (hiResPhoto,
        # photoPath and photoResize are placeholders defined elsewhere)
        Image.open(hiResPhoto).resize(photoResize, Image.ANTIALIAS).save(photoPath + photoName)

Team Reflection
In the very first sprint, we were divided into several small groups, and I was assigned to work with Luke, Shaun and Kamil under the team name "Bohemian Raspbian". We all come from different backgrounds, so we brought different ideas and opinions, and we communicated often to make sure everyone understood each other's code and programs. In Sprint 2, everyone was split into ESD and Mechanical teams. I was assigned to the ESD side, to provide a GUI for Lampbotics. Effective communication and cooperation could be seen within the ESD team, as each individual program had to be merged into one main system at the end. Furthermore, we also communicated often with the Mechanical side, as the servos and the Face Recognition function had to be in sync. A threading workshop was also held by Kamil to standardize our individual stand-alone programs so they could be incorporated into the main threading system. On top of that, everyone was very helpful and willing to explain their code to the others. This teamwork built a better rapport among us and led us to design a better Lampbotics.

Project Management Reflection
In the first sprint, I was assigned to the group "Bohemian Raspbian". We all agreed on deliverable deadlines each Wednesday, as we only have classes on Monday, Wednesday and Thursday. The double class on Monday was therefore used to integrate our individual programs into one, and for debugging and testing, while on Wednesday and Thursday we worked on our own tasks. To me, the project timeline was well managed, as each of us contributed to the project effectively. From Sprint 2 onwards, we often had meetings on Monday to explain our ongoing individual programs, the aims to be achieved and the time frame to hit those goals. During Sprint 3, quick meetings were held frequently so that each of us understood the others' progress, as this sprint was about integrating all the individual pieces into one whole system. These meetings really helped a lot in terms of project management: they gave us clear goals, and work could then be done to achieve those goals within the specified timelines.

Thoughts on the next generation
As the camera has to be connected to the same Pi as the 7-inch Touch Display, which sits in the bottom base, the camera itself has to be placed at the base as well. In terms of user-friendliness and design, this is not the best arrangement: the camera (the eye) is usually placed in the lamp head, where the rotation and tilting movement of the head would give it a much wider field of view. Because of this constraint, the camera was fixed at the base, limiting its view in the X and Y coordinates. It would therefore be nice to figure out a way for a camera Pi to communicate with the Touch Display's Pi; this would avoid needing a ribbon cable to connect the camera to the display on a single Pi. The Kivy interface could also be explored further, as there are still a lot of widgets, such as sliders and checkboxes, that have not been implemented.
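Purely as a thought experiment for that two-Pi idea (nothing here is from the actual project - the port, message format and function names are all assumptions), the camera Pi could push a mode flag to the display Pi over a plain TCP socket:

```python
import socket

def send_flag(host, port, flag):
    # Camera Pi side: connect to the display Pi and send one
    # newline-terminated flag, e.g. "face" or "color"
    with socket.create_connection((host, port)) as conn:
        conn.sendall((flag + "\n").encode())

def receive_one_flag(port):
    # Display Pi side: accept a single connection and return its flag
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            return conn.makefile().readline().strip()
```

A real build would also need to stream frames or detection results, but even this one-message scheme would free the camera from the display Pi's ribbon cable.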

Lastly, I would like to express my greatest gratitude to all the members involved in this project for helping me throughout the journey. Also, thanks to Jason for keeping us motivated to move forward and encouraging us to be positive all the time. (Smile!)

Below are the links to each of my blog entries:
Blog 1
Blog 2
Blog 3
Blog 4
Blog 5
Blog 6
Blog 7
Blog 8
