Thursday, 9 February 2017

RTOS Case Study - David Walsh



Introduction

The first part of our module in Applied Embedded Operating Systems was to carry out a case study into how a Real-Time Operating System (RTOS) works. The platform the RTOS was studied on is the Mbed development board. The advantage of using the Mbed platform is that it comes with a free online compiler and has plenty of support and example programs to verify the functionality of the board.

What is an RTOS?



An operating system is a computer program designed so that a computer can carry out its functions, and it provides services to the other programs that run on the computer. The operating system manages all the software and hardware on a computer: when programs need to access the CPU, memory or storage, it is the operating system that is in charge of giving them access. It seems as though many programs are executing at the same time on a computer, but the processor can only run a single thread at any given point. A big part of the operating system's job is scheduling the programs that run, and switching between programs by priority gives the sense of everything happening at the same time. To understand how an RTOS works, four parts were studied in our case study: threads, mutexes, semaphores, and queues with memory pools.

Threads




Threads are the RTOS's way of scheduling the program code. Each thread has its own program counter that keeps track of which instruction to execute next. The RTOS can leave one thread and go to another, with the previous thread added back to the queue; when the RTOS comes back to that thread, it starts again from where it left off the previous time. When executing a thread, if it reaches a part of the program with a wait/delay (in our study this was Thread::wait(500), for example), the operating system does not sit and wait; it goes to the next priority in the queue. The advantage of using threads is greater efficiency of the operating system, as multiple tasks can be carried out. If the RTOS did not use threads, it would be less efficient, as it would be blocked from doing other work whenever a delay was used.
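As a concrete illustration, here is a minimal sketch of two threads each blinking their own LED, along the lines of the standard Mbed RTOS blink example using the classic mbed-rtos API (led2_thread and the LED pin names are just placeholders, not the exact case-study code):

#include "mbed.h"
#include "rtos.h"

DigitalOut led1(LED1);
DigitalOut led2(LED2);

// Second thread: toggles LED2 independently of main.
void led2_thread(void const *args) {
    while (true) {
        led2 = !led2;
        Thread::wait(1000);   // this thread yields; the RTOS runs other threads
    }
}

int main() {
    Thread thread(led2_thread);   // start the second thread

    while (true) {
        led1 = !led1;
        Thread::wait(500);        // main thread also yields while it waits
    }
}

While either thread is sitting in Thread::wait(), the scheduler is free to run the other one, which is why both LEDs appear to blink at the same time.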

Mutex
The best way to describe a mutex is with the key analogy: say you're in a bar and there is only one key for the toilet. The toilet is a shared resource, so if it is in use the key is not available, and you must wait in a queue for the key to be handed back to the barman before you can get access. It's the same in code: the call stdio_mutex.lock() blocks any other task that tries to take the same mutex until the task that holds it releases it with stdio_mutex.unlock().
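A minimal sketch of that idea, based on the standard Mbed mutex example (the notify helper and the thread names are placeholders, not the exact case-study code):

#include "mbed.h"
#include "rtos.h"

Mutex stdio_mutex;   // the single "key" protecting the shared stdio resource

// Each thread prints through this helper so the output never interleaves.
void notify(const char *name, int state) {
    stdio_mutex.lock();                      // take the key; other threads wait here
    printf("%s: state %d\n", name, state);
    stdio_mutex.unlock();                    // hand the key back
}

void test_thread(void const *name) {
    while (true) {
        notify((const char *)name, 0);
        Thread::wait(1000);
        notify((const char *)name, 1);
        Thread::wait(1000);
    }
}

int main(void) {
    Thread t2(test_thread, (void *)"Th 2");
    Thread t3(test_thread, (void *)"Th 3");

    test_thread((void *)"Th 1");             // main acts as the third thread
}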










Semaphores
Semaphores are like mutexes in that they can also be looked at through the key analogy, but with a semaphore there can be more than one key available for the operating system to hand out to tasks that need to run, and these are granted to the highest priority waiting in the queue. When using semaphores, it's all about how many slots you make available for the RTOS, which decides who gets to run and who has to wait in the queue. When the task that holds a semaphore slot has completed its work, it must release the slot back to the operating system, which will assign it to the next task in the queue. The lines that do this in the code we ran for the case study are:
Semaphore two_slots(2);
two_slots.wait();
two_slots.release();


From the picture above, you can get an idea of how semaphores work: once one thread runs release(), the next waiting thread can run.
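A minimal sketch putting those three lines together, loosely following the standard Mbed semaphore example (the thread names and timings are placeholders, not the exact case-study code):

#include "mbed.h"
#include "rtos.h"

Semaphore two_slots(2);   // two keys: at most two threads in the section at once

void test_thread(void const *name) {
    while (true) {
        two_slots.wait();                    // take a slot (blocks if none are free)
        printf("%s\n", (const char *)name);
        Thread::wait(1000);
        two_slots.release();                 // give the slot back to the RTOS
    }
}

int main(void) {
    Thread t2(test_thread, (void *)"Th 2");
    Thread t3(test_thread, (void *)"Th 3");

    test_thread((void *)"Th 1");             // main acts as the third thread
}

With three threads but only two slots, one thread is always waiting; as soon as another calls release(), the waiting thread gets the slot and runs.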

Signals
Signals are used to trigger execution states between threads. The signal functions allow you to control or wait for signal flags. In the program run for the case study, the first thread (main) generated the signal and sent signal 0x1 to the second thread, which was waiting for it; when the second thread received the signal, it carried out whatever task it was in charge of.
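A minimal sketch of that behaviour, assuming the classic mbed-rtos signal API (led_thread, the LED pin and the timing are placeholders, not the exact case-study code):

#include "mbed.h"
#include "rtos.h"

DigitalOut led(LED1);

// Waits for signal flag 0x1, then does its job (here, toggling an LED).
void led_thread(void const *args) {
    while (true) {
        Thread::signal_wait(0x1);   // block until main sets flag 0x1 on this thread
        led = !led;
    }
}

int main(void) {
    Thread thread(led_thread);

    while (true) {
        Thread::wait(1000);
        thread.signal_set(0x1);     // wake the waiting thread once per second
    }
}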







Queues and Memory pools
Memory pools are collections of fixed-size blocks of memory that are thread safe. They operate much faster than general heap allocation and do not suffer from fragmentation. Being thread safe, they can be accessed from other threads, and this shared memory is used to exchange information between threads. Three of the important lines from the code run to examine memory pools are:
MemoryPool<message_t, 16> mpool;

message_t *message = mpool.alloc();

mpool.free(message);
The first line creates and initialises a pool of 16 memory blocks. The second line allocates one block of type message_t from the memory pool. The third line frees the block back to the pool; if this line is not included in the code, the pool will run out of blocks after 16 allocations.
Queues are a way the RTOS can schedule tasks that have to be completed; they run on a first-in, first-out basis. To use a queue in your code, the lines are:
queue.put(message);
queue.get();
The first line puts the data into the queue, where it will stay until it is called for; the second line gets the data from the queue so that the task can then operate on it.
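Putting the memory-pool and queue lines together, here is a minimal sketch loosely based on the standard Mbed queue example; the message_t fields and the producer/consumer split are assumptions for illustration, not the exact case-study code:

#include "mbed.h"
#include "rtos.h"

// Assumed message layout, following the standard Mbed queue example.
typedef struct {
    float    voltage;
    float    current;
    uint32_t counter;
} message_t;

MemoryPool<message_t, 16> mpool;   // 16 fixed-size blocks
Queue<message_t, 16> queue;        // FIFO queue of pointers to those blocks

// Producer: allocate a block, fill it in, and put it on the queue.
void send_thread(void const *args) {
    uint32_t i = 0;
    while (true) {
        i += 10;
        message_t *message = mpool.alloc();
        message->voltage = i * 0.1f;
        message->current = i * 0.05f;
        message->counter = i;
        queue.put(message);
        Thread::wait(1000);
    }
}

int main(void) {
    Thread thread(send_thread);

    // Consumer: take each message off the queue, use it, then free the block.
    while (true) {
        osEvent evt = queue.get();
        if (evt.status == osEventMessage) {
            message_t *message = (message_t *)evt.value.p;
            printf("Voltage: %.2f V, Current: %.2f A\n",
                   message->voltage, message->current);
            mpool.free(message);   // return the block so the pool never runs out
        }
    }
}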