S15: Hand Gesture Recognition using IR Sensors



Abstract

The aim of the project is to develop a hand gesture recognition system using a grid of IR proximity sensors. Hand gestures such as swipes and pans can be recognized and used to control different devices or to drive various applications. The system recognizes gestures based on the values received from the IR proximity sensors. A Qt application was developed to demonstrate the working of the project.

Objectives & Introduction

We use hand gestures in our day-to-day life to communicate, whether to explain something to someone or to direct them somewhere. It would be very convenient if the applications running on our computers and the devices around us could understand these hand gestures and respond accordingly. To achieve this, we use a 3-by-3 grid of analog IR proximity sensors connected through multiplexers to the ADC pins of the SJOne board. As a hand moves in front of the grid, the sensor values change in a particular pattern, enabling us to detect the gesture and instruct the application to perform the corresponding action.

Team Members & Responsibilities

  • Harita Parekh
    • Implementing algorithm for gesture recognition
    • Implementation of sensor data filters
  • Shruti Rao
    • Implementing algorithm for gesture recognition
    • Interfacing of sensors, multiplexers and controller
  • Sushant Potdar
    • Implementation of final sensor grid
    • Development of the application module

Schedule

Week# | Start Date | End Date | Task | Status | Actual Completion Date
1 | 3/22/2015 | 3/28/2015 | Research the sensors; order sensors and multiplexers | Completed | 3/28/2015
2 | 3/29/2015 | 4/04/2015 | Read the sensor data sheet and understand its working; test multiplexers | Completed | 4/04/2015
3 | 4/05/2015 | 4/11/2015 | Interface sensors, multiplexers and controller | Completed | 4/15/2015
4 | 4/12/2015 | 4/18/2015 | Implement sensor data filters; implement algorithm to recognize left-to-right movement | Completed | 4/25/2015
5 | 4/19/2015 | 4/25/2015 | Implement final sensor grid; implement algorithms to recognize up-to-down and right-to-left movement | Completed | 5/02/2015
6 | 4/26/2015 | 5/02/2015 | Implement algorithm to recognize down-to-up movement; develop the Qt application | Completed | 5/09/2015
7 | 5/03/2015 | 5/09/2015 | Testing and bug fixes | Completed | 5/15/2015
8 | 5/10/2015 | 5/16/2015 | Testing and final touches | Completed | 5/22/2015
9 | 5/21/2015 | 5/24/2015 | Report completion | Completed | 5/24/2015
10 | 5/25/2015 | 5/25/2015 | Final demo | Scheduled | 5/25/2015

Parts List & Cost

SR# | Component Name | Quantity | Price per Component | Total Price
1 | Sharp Distance Measuring Sensor (GP2Y0A21YK0F) | 9 | $14.95 | $134.55
2 | STMicroelectronics Dual 4-Channel Analog Multiplexer/Demultiplexer (M74HC4052) | 3 | $0.56 | $1.68
3 | SJOne Board | 1 | $80.00 | $80.00
4 | USB-to-UART converter | 1 | $7.00 | $7.00
Total (excluding shipping and taxes): $223.23

Design & Implementation

Hardware Design

The image shows the setup of the project.

S15 244 Grp10 Ges system setup.jpg

Figure : Setup of the project


System Block Diagram:
The system consists of 9 IR proximity sensors arranged in a 3x3 grid. The sensor outputs are fed to the analog-to-digital converter (ADC) on the SJOne board to obtain digital equivalents of the voltages produced by the sensors. Since only 3 ADC channels are exposed on the board's pins, we cannot connect all the sensors directly to the board. For this reason we use three multiplexers, each with 3 sensors connected to its inputs. The outputs of the multiplexers are connected to the ADC channels. The SJOne board is connected to the laptop via a UART-to-USB connection.

S15 244 Grp10 Ges block diagram.png
Figure : System Block Diagram

Proximity Sensor:
This sensor by Sharp measures the distance to an obstacle by bouncing IR rays off it. It can measure distances from 10 to 80 cm and returns an analog voltage corresponding to the distance; the voltage increases as the obstacle approaches the sensor. Depending on which sensors return valid values, the hand position can be determined and validated. No external circuitry is required for this sensor. The recommended operating voltage is 4.5 V to 5.5 V.

S15 244 G10 Ges sensor.jpg

Figure : IR Proximity Sensor

Multiplexer:
The chip used in the project is the M74HC4052 from STMicroelectronics, a dual 4-channel multiplexer/demultiplexer. Because there are fewer ADC pins than sensors, multiplexers are required. Each multiplexer takes inputs from three sensors and passes only one of them to its output; the program logic decides which sensor's output is routed through at any given time. The A and B control signals select one of the four channels. The operating voltage for the multiplexer is 2 V to 6 V.

S15 244 G10 Ges mux.jpg

Figure : 4-Channel Dual Multiplexer
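
As a rough sketch of how a single sensor is read through the multiplexers (the pin assignments, the sensor-to-mux mapping and the adc0_get_reading() helper are assumptions, not taken from the project source):

// Read one of the 9 sensors by driving the shared mux select lines and
// sampling the ADC channel that the corresponding mux feeds.
// Assumed wiring: P2.0/P2.1 are the A/B select lines of all three muxes,
// and mux 0/1/2 outputs go to ADC channels 3/4/5 (P0.26, P1.30, P1.31).
#include <stdint.h>
#include "LPC17xx.h"   // CMSIS register definitions for the LPC1758
#include "adc0.h"      // SJOne ADC driver (assumed): adc0_get_reading(channel)

static const uint8_t mux_adc_channel[3] = { 3, 4, 5 };

uint16_t read_sensor(uint8_t sensor)          // sensor index 0..8
{
    const uint8_t mux    = sensor / 3;        // which multiplexer the sensor is wired to
    const uint8_t select = sensor % 3;        // which input of that multiplexer

    LPC_GPIO2->FIOCLR = 0x3;                  // clear select lines P2.0 (A) and P2.1 (B)
    LPC_GPIO2->FIOSET = (select & 0x3);       // drive the new channel selection

    // A short settling delay for the mux and sensor output would go here.
    return adc0_get_reading(mux_adc_channel[mux]);
}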

USB-to-UART converter:
To communicate with the SJOne board over UART from a PC, one option is a USB-to-serial converter plus a MAX232 circuit to shift the voltage levels to the TTL levels the SJOne board expects. Instead, it is better to use a single USB-to-UART converter and avoid the multiple conversions. We use a converter based on the CP2102 IC, which is similar to an FTDI chip.

S15 244 Grp10 Ges UART to USB.JPG

Figure : USB-to-UART chip

Hardware Interface

Pin connections for IR Sensor to Multiplexer:
S15 244 Grp10 Ges sensor-to-mux.png
Figure : Pin connections for IR Sensor to Multiplexer


Pin connections on SJOne board:
S15 244 Grp10 Ges sjone pinouts.png

Figure : Pin connections on SJOne board


Connections between SJOne board and USB-to-UART Converter:
S15 244 Grp10 Ges SJOne to UART.JPG
Figure : Connections between SJOne board and USB-to-UART Converter

Software Design

Initialization
The SJOne board has 3 ADC pins exposed, on Port 0 (P0.26) and Port 1 (P1.30 and P1.31). To use these pins as ADC inputs, the ADC function must be selected in the PINSEL registers. The GPIO pins on Port 2 are connected to the select pins of the multiplexers and are initialized as outputs. Once initialization is complete, the function for normalizing the sensor values is called.

S15 244 Grp10 Ges sensor init.png

Figure : Flowchart for initialization of ADC and multiplexer
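
A minimal initialization sketch matching the flowchart, assuming P0.26/P1.30/P1.31 are configured as AD0.3/AD0.4/AD0.5 and P2.0/P2.1 are the multiplexer select outputs (register fields follow the LPC17xx user manual; the exact select pins are an assumption):

#include "LPC17xx.h"

void gesture_hw_init(void)
{
    // Select the ADC function on the three exposed ADC pins via PINSEL.
    LPC_PINCON->PINSEL1 |= (1u << 20);               // P0.26 -> AD0.3 (function 01)
    LPC_PINCON->PINSEL3 |= (3u << 28) | (3u << 30);  // P1.30 -> AD0.4, P1.31 -> AD0.5 (function 11)

    // Multiplexer select lines on Port 2 as GPIO outputs, initially low.
    LPC_GPIO2->FIODIR |= (1u << 0) | (1u << 1);
    LPC_GPIO2->FIOCLR  = (1u << 0) | (1u << 1);

    // ADC peripheral power/clock setup (e.g. the SJOne adc0_init()) is assumed
    // to be done elsewhere before the first conversion.
}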



The SJOne board uses UART 3 to communicate with the Qt application. UART 3 is initialized to a baud rate of 9600, with a receive buffer of 0 bytes and a transmit buffer of 32 bytes. Once initialization is complete, the function for processing the sensor values is called.



S15 244 Grp10 Ges process init.png

Figure : Flowchart for initialization of UART and process
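
A sketch of the UART 3 setup described above, assuming the SJOne Uart3 driver and its getInstance()/init()/putline() interface:

#include "uart3.hpp"   // SJOne UART3 driver (assumed)

void comm_init(void)
{
    Uart3 &uart = Uart3::getInstance();
    uart.init(9600, 0, 32);   // 9600 baud, 0-byte receive queue, 32-byte transmit queue
}

// A recognized gesture is later reported to the Qt application as a string:
void send_gesture(const char *gesture)   // e.g. "Left", "Right", "Up", "Down"
{
    Uart3::getInstance().putline(gesture);
}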

Filter Algorithm
The current value of a sensor is fetched by setting the corresponding values on the multiplexer select pins and reading the output of the ADC. A queue of size 5 is maintained per sensor, and each fetched value is inserted at the tail of the queue. The queue is sorted using bubble sort, the median value is checked against 2000 (the sensor's ADC reading exceeds 2000 when a hand is near enough), and the gesture array entry for that sensor is set accordingly.

S15 244 Grp10 Ges filter task.png

Figure : Flowchart for filter algorithm
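
The filter can be sketched as follows; the 2000 threshold and the window size come from the description above, while the names (NUM_SENSORS, gesture[], read_sensor()) are illustrative:

#include <stdint.h>
#include <algorithm>

static const int      NUM_SENSORS    = 9;
static const int      WINDOW         = 5;
static const uint16_t HAND_THRESHOLD = 2000;   // ADC reading above this => hand is close enough

static uint16_t history[NUM_SENSORS][WINDOW];  // per-sensor queue of raw ADC readings
static int      write_idx[NUM_SENSORS];        // tail position of each queue
bool gesture[NUM_SENSORS];                     // flags consumed by the gesture recognition task

uint16_t read_sensor(uint8_t sensor);          // mux select + ADC read (see earlier sketch)

void filter_sensor(uint8_t s)
{
    // Insert the newest reading at the tail of this sensor's queue.
    history[s][write_idx[s]] = read_sensor(s);
    write_idx[s] = (write_idx[s] + 1) % WINDOW;

    // Sort a copy of the queue and take the median; this rejects single-sample spikes.
    uint16_t sorted[WINDOW];
    std::copy(history[s], history[s] + WINDOW, sorted);
    std::sort(sorted, sorted + WINDOW);         // the original uses bubble sort; std::sort here

    gesture[s] = (sorted[WINDOW / 2] > HAND_THRESHOLD);
}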

Gesture Recognition Algorithm
Different sets of sensors are monitored in order to recognize a valid pattern in the sensor outputs and thereby recognize the gesture. (The sensors are numbered 0 through 8, and the flag for each sensor is set by the filter algorithm in the gesture[] array.) A simplified sketch of the general idea follows; Patterns 1 through 4 below apply it starting from different corners of the grid.
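
The sketch below illustrates the basic approach, namely detecting activity that progresses column by column (or row by row) across the grid. It is an illustration of the idea rather than the exact per-corner logic of Patterns 1-4:

extern bool gesture[9];   // gesture[row * 3 + col] is true when a hand covers that sensor

// True if any sensor in the given column currently sees the hand.
static bool column_active(int col)
{
    return gesture[0 * 3 + col] || gesture[1 * 3 + col] || gesture[2 * 3 + col];
}

// Call periodically; returns "Right" once activity has progressed through
// columns 0 -> 1 -> 2 (a left-to-right swipe), otherwise returns nullptr.
// Mirroring the column order gives "Left"; the row equivalents give "Up" and "Down".
const char *detect_left_to_right(void)
{
    static int expected_col = 0;        // next column the hand is expected to reach
    if (column_active(expected_col)) {
        if (++expected_col == 3) {
            expected_col = 0;           // reset for the next gesture
            return "Right";
        }
    }
    return nullptr;
}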

Pattern 1:

S15 244 Grp10 Ges process1.png

Figure : Flowchart for selection of pattern 1


Here the three sensors present at the top left corner are monitored.

  • If the sensor1 value is zero:
    • Check the values of the sensors in the second column. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Check the values of the sensors in the third column. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Send "Right" to UART 3.



S15 244 Grp10 Ges tlright.gif


  • If the sensor3 value is zero:
    • Check the values of the sensors in the second row. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Check the values of the sensors in the third row. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Send "Down" to UART 3.



S15 244 Grp10 Ges tldown.gif

Pattern 2:

S15 244 Grp10 Ges process2.png

Figure : Flowchart for selection of pattern 2


Here the three sensors present at the top right corner are monitored.

  • If the sensor1 value is zero:
    • Check the values of the sensors in the second column. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Check the values of the sensors in the first column. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Send "Left" to UART 3.



S15 244 Grp10 Ges trleft.gif


  • If the sensor5 value is zero:
    • Check the values of the sensors in the second row. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Check the values of the sensors in the third row. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Send "Down" to UART 3.



S15 244 Grp10 Ges trdown.gif

Pattern 3:

S15 244 Grp10 Ges process3.png

Figure : Flowchart for selection of pattern 3


Here the three sensors present at the bottom left corner are monitored.

  • If the sensor7 value is zero:
    • Check the values of the sensors in the second column. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Check the values of the sensors in the third column. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Send "Right" to UART 3.



S15 244 Grp10 Ges blright.gif


  • If the sensor3 value is zero:
    • Check the values of the sensors in the second row. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Check the values of the sensors in the first row. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Send "Up" to UART 3.



S15 244 Grp10 Ges blup.gif

Pattern 4:

S15 244 Grp10 Ges process4.png

Figure : Flowchart for selection of pattern 4


Here the three sensors present at the bottom right corner are monitored.

  • If the sensor7 value is zero:
    • Check the values of the sensors in the second column. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Check the values of the sensors in the first column. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Send "Left" to UART 3.



S15 244 Grp10 Ges brleft.gif


  • If the sensor5 value is zero:
    • Check the values of the sensors in the second row. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Check the values of the sensors in the first row. If the combination of the first two or the last two sensors is 1, go to the next step; else, update the gesture array for these sensors and check again.
    • Send "Up" to UART 3.



S15 244 Grp10 Ges brup.gif

Application Development
Qt Software
Qt is a cross-platform application framework widely used to develop application software that runs on various software and hardware platforms with little or no change to the underlying codebase, while retaining the speed and power of a native application. It is mainly used to build GUI applications, but console and command-line applications can also be developed with it. Many application programmers prefer Qt because it allows GUI applications to be developed in C++, using the standard C++ libraries for the back end. Platforms supported by Qt are:

  • Android
  • Embedded Linux
  • Integrity
  • iOS
  • OS X
  • QNX
  • VxWorks
  • Wayland
  • Windows
  • Windows CE
  • Windows RT
  • X11

Qt applications are highly portable from one platform to another because Qt runs qmake before compiling the source code. qmake is very similar to 'cmake', which is used for cross-platform compilation of arbitrary source code: it auto-generates a makefile based on the operating system and the compiler used for the project. So if a project is ported from Windows to a Linux-based system, qmake auto-generates a new makefile with the arguments and parameters that the g++ compiler expects.

S15 244 Grp10 Ges qtdevices.png

Figure : Devices supporting Qt

S15 244 Grp10 Ges qt-sdk.png

Figure : Qt-SDK

Gesture Recognition application on Qt
Once a gesture is recognized on the SJOne board, a message stating which gesture was sensed is sent over UART 3. The Qt application receives this message from the COM port and either scrolls the images left or right for a left/right gesture, or moves a vertical slider up or down for an up/down gesture, which in turn changes the value shown on an LCD number display.
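
As an illustration of the Qt side, the QtSerialPort module can be used to open the configured port and react to the gesture strings. The widget types and names below are hypothetical stand-ins for the objects listed in Table 2, and for simplicity the sketch assumes each read delivers one complete gesture string:

#include <QObject>
#include <QSerialPort>
#include <QSlider>
#include <QStackedWidget>

// Open the port selected on the Config tab and wire gesture strings to the App tab widgets.
void connectGestures(QSerialPort *serial, QStackedWidget *imageStack, QSlider *slider)
{
    serial->setBaudRate(QSerialPort::Baud9600);
    if (!serial->open(QIODevice::ReadWrite))
        return;   // port-open errors would be reported on the Config tab

    QObject::connect(serial, &QSerialPort::readyRead, [=]() {
        const QString gesture = QString::fromLatin1(serial->readAll()).trimmed();
        if (gesture == "Right")
            imageStack->setCurrentIndex(imageStack->currentIndex() + 1);   // scroll image right
        else if (gesture == "Left")
            imageStack->setCurrentIndex(imageStack->currentIndex() - 1);   // scroll image left
        else if (gesture == "Up")
            slider->setValue(slider->value() + 10);   // the slider drives the LCD number display
        else if (gesture == "Down")
            slider->setValue(slider->value() - 10);
    });
}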

The application opens in a window that has two tabs, Config and App. The Config tab includes the fields required to open the COM port and to test it using a loopback connection.

S15 244 Grp10 Ges qt tab1.JPG

Figure : Configuration Tab

This tab has the following QObjects

S15 244 Grp10 Ges table1.png

Table 1: QObjects used for Configuration Tab

The App tab includes the objects required to change the images and to change the value of the vertical slider and the LCD number display.

S15 244 Grp10 Ges qt tab2.JPG

Figure : Application Tab

This tab has the following QObjects

S15 244 Grp10 Ges table2.png

Table 2: QObjects used for Application Tab


Implementation

This section gives a high-level view of the implementation in terms of the main challenges we faced and how each was resolved.

Challenge #1: The sensor output contains many spikes, giving false positives.

Resolution: To deal with the spikes and false positives, the sensor output is normalized. A circular queue of size 5 is maintained for each sensor, and each value received from the ADC is stored at the end of the queue. The queue is then sorted and only the median value is used for computation. This greatly reduces the false positives.

Challenge #2: Number of sensors used was far greater than the available ADC pins.

Resolution: Even with 9 ADC pins converting the values of 9 sensors, we would still read the sensors one by one. Keeping this in mind, to overcome the shortage of ADC pins we used multiplexers, each of which takes the inputs from 3 sensors and outputs only the selected one. In this way, the output of any sensor can be read at any given time. The multiplexers introduce a small lag, but it is not long enough to hinder the operation of the application.

Challenge #3: Qt was new to all the team members, so learning its programming style and its objects was a challenge.

Challenge #4: Setting up the serial port communication in Qt.

Conclusion

The system recognizes left, right, up and down swipe gestures using a 3x3 grid of IR proximity sensors and uses them to drive a Qt application over UART. Testing showed that the raw sensor output is noisy, and the median filter was essential for removing false positives, while the multiplexers allowed nine sensors to be read with only three ADC channels at the cost of a small scanning lag. Beyond the working demo, the project gave us hands-on experience with ADC and GPIO interfacing on the SJOne board, sensor data filtering, and building a serial-driven GUI application in Qt.

Project Video

Gesture Recognition using IR Proximity Sensors

Project Source Code

References

Acknowledgement

All the components were procured from Amazon, Adafruit and Digi-Key. We are thankful to Preet for his continuous guidance during the project.

References Used

IR Sensor Data Sheet (Sharp GP2Y0A21YK0F)
LPC17xx User Manual (NXP)
Multiplexer Data Sheet (STMicroelectronics M74HC4052)
Qt Software
Filter code referred from the Spring 2014 project "Virtual Dog"