Intelligent Gesture Recognition Device

iGest is a wearable gesture-to-speech device. It tracks the gestures of people with speech impairments and speaks for them, helping people with speech disorders such as cerebral palsy, dysarthria, and cluttering. Typical operation of such devices requires controlled motor interaction, which presumes a level of consistency that cannot be achieved with severe motor impairments, particularly athetosis (constant writhing movements) and ataxia (excessive involuntary movements at the end of a gesture). The problem is compounded when the child is also visually impaired, a condition found in 60% of children with CP, because the lack of visual feedback prevents precise motor control.

Android app
iGest is a kinematic-sensor-based system designed to learn a user's existing motor capacity through a gesture recognition algorithm. Movement is captured with sensors installed in the device; through these sensors, gross movements such as arm movements, head movements, and gestures are recognized and recorded. The system then learns gesture models that are natural to the wearer. These data are transmitted to a phone and associated with a dictionary of sentences or actions. The data can also be used to help the wearer operate a device, or to reinforce motor skills by encouraging arm movement through games. The unique contribution of this system is that it harnesses the existing motor capacity and movement of children to enable communication.
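The pipeline above (capture motion, learn per-user gesture models, map a recognized gesture to a sentence) can be sketched in a few lines. This is a minimal illustration only, not iGest's actual algorithm: it assumes raw (x, y, z) accelerometer samples, summarizes each recording as per-axis mean and standard deviation, and uses a hypothetical nearest-centroid classifier with an invented phrase dictionary. A real system would use richer features and models tuned to each child's movements.

```python
import math

# Hypothetical gesture-to-sentence dictionary (illustrative labels and
# phrases, not iGest's actual vocabulary).
PHRASES = {
    "arm_raise": "I am hungry",
    "head_tilt": "Yes",
}

def features(samples):
    """Summarize a stream of (x, y, z) accelerometer samples as a
    6-dimensional vector: per-axis mean followed by per-axis std dev."""
    n = len(samples)
    means = [sum(s[a] for s in samples) / n for a in range(3)]
    stds = [
        math.sqrt(sum((s[a] - means[a]) ** 2 for s in samples) / n)
        for a in range(3)
    ]
    return means + stds

def train(examples):
    """examples: dict mapping gesture label -> list of recorded streams.
    Learns one centroid feature vector per gesture, per user."""
    centroids = {}
    for label, streams in examples.items():
        vecs = [features(s) for s in streams]
        dim = len(vecs[0])
        centroids[label] = [
            sum(v[i] for v in vecs) / len(vecs) for i in range(dim)
        ]
    return centroids

def recognize(samples, centroids):
    """Classify a new stream by nearest centroid (squared Euclidean
    distance) and return the associated sentence to be spoken."""
    vec = features(samples)
    best = min(
        centroids,
        key=lambda lbl: sum(
            (a - b) ** 2 for a, b in zip(vec, centroids[lbl])
        ),
    )
    return PHRASES[best]
```

On a phone, the recognized label would then be handed to a text-to-speech engine; the training step is what lets the models adapt to whatever movements come naturally to each wearer.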

iGest won the Catalytic Grant for Early Stage Enterprises at the Nasscom Social Innovation Forum 2015.

iGest won the Nina Saxena Excellence in Technology Award for the year 2015 in the area of applying technology to underdeveloped areas and causes.