Description
Gesture-based interfaces provide an intuitive way for users to issue commands and interact with computers; gesture recognition is the task of identifying meaningful expressions of motion made with the hands, arms, and body. In this project we use proximity sensors to recognize gestures, and we concentrate on embedding this capability into a GIS so that users with disabilities can interact with the tool more effectively.

Ultrasonic sensors are among the most effective proximity-sensing devices: they emit a sound pulse and use the speed of sound to compute the distance between the transmitter and an object. An open-source Arduino board processes the readings from these sensors and transmits them to the GIS application, where they are used to evaluate the gestures made by the user. The data sent by the Arduino is then processed on the desktop side, and map events are triggered based on the result of gesture evaluation. The gesture-evaluation algorithms used here are more efficient than vision-based alternatives such as image subtraction.

GIS is a well-established tool widely used for interacting with map objects. This project extends it with a Gesture mode in which the user controls the map through gestures. This is especially valuable for people affected by polio, who often have difficulty using their hands for typing. Proximity-based gesture evaluation also opens the door to desktop games, which can easily build on this tool for further development.

Languages and technologies used: Arduino, APL (the Arduino programming language), Java, the ESRI map package, and the JSSC Java serial connector.
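
The distance computation behind an ultrasonic reading follows directly from the speed of sound: the echo travels to the object and back, so half the round-trip time multiplied by the speed of sound gives the distance. The following is a minimal Java sketch of that conversion, assuming the Arduino forwards raw echo durations in microseconds; the constant and the message format are illustrative assumptions, not taken from the project code.

    // Hypothetical helper: converts the round-trip echo time reported by an
    // ultrasonic sensor into a distance in centimetres. Sound travels at
    // roughly 343 m/s (~0.0343 cm/us) at room temperature; the echo covers
    // the distance twice, so the result is halved.
    public final class UltrasonicMath {

        private static final double SPEED_OF_SOUND_CM_PER_US = 0.0343;

        private UltrasonicMath() { }

        /** @param echoMicros round-trip echo duration in microseconds */
        public static double distanceCm(long echoMicros) {
            return (echoMicros * SPEED_OF_SOUND_CM_PER_US) / 2.0;
        }

        public static void main(String[] args) {
            // Example: a 580 us echo corresponds to roughly 10 cm.
            System.out.println(distanceCm(580));
        }
    }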
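On the desktop side, the JSSC library is used to receive the Arduino's readings over a serial port. A minimal sketch of that link is shown below; the port name, baud rate, and newline-delimited message format are assumptions for illustration rather than the project's actual protocol.

    import jssc.SerialPort;
    import jssc.SerialPortEvent;
    import jssc.SerialPortException;

    // Opens the serial port the Arduino is attached to and listens for
    // incoming readings, printing one line per sensor message.
    public class ArduinoSerialReader {

        public static void main(String[] args) throws SerialPortException {
            SerialPort port = new SerialPort("COM3"); // e.g. "/dev/ttyUSB0" on Linux
            port.openPort();
            port.setParams(SerialPort.BAUDRATE_9600,
                           SerialPort.DATABITS_8,
                           SerialPort.STOPBITS_1,
                           SerialPort.PARITY_NONE);

            StringBuilder buffer = new StringBuilder();
            port.addEventListener((SerialPortEvent event) -> {
                if (!event.isRXCHAR() || event.getEventValue() <= 0) {
                    return;
                }
                try {
                    buffer.append(port.readString(event.getEventValue()));
                    int newline;
                    while ((newline = buffer.indexOf("\n")) >= 0) {
                        String line = buffer.substring(0, newline).trim();
                        buffer.delete(0, newline + 1);
                        System.out.println("Sensor reading: " + line);
                    }
                } catch (SerialPortException e) {
                    e.printStackTrace();
                }
            });
        }
    }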
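Finally, gesture evaluation turns a stream of distance readings into map events. The sketch below is a deliberately simple threshold-based example, not the project's actual algorithm: it assumes two ultrasonic sensors (left and right) and treats a hand crossing from one to the other within a short time window as a horizontal swipe that pans the map. The MapController interface is a hypothetical stand-in for the calls made into the ESRI map package.

    // Minimal threshold-based gesture evaluation over two sensor channels.
    public class GestureEvaluator {

        public enum Gesture { SWIPE_LEFT, SWIPE_RIGHT, NONE }

        /** Hypothetical adapter over the ESRI map package. */
        public interface MapController {
            void panLeft();
            void panRight();
        }

        private static final double HAND_THRESHOLD_CM = 20.0; // hand "present" below this distance
        private static final long SWIPE_WINDOW_MS = 600;      // max gap between the two sensors firing

        private final MapController map;
        private long leftTriggeredAt = -1;
        private long rightTriggeredAt = -1;

        public GestureEvaluator(MapController map) {
            this.map = map;
        }

        /** Feed one pair of distance readings (cm); fires a map event when a swipe is detected. */
        public Gesture evaluate(double leftCm, double rightCm, long nowMs) {
            if (leftCm < HAND_THRESHOLD_CM) leftTriggeredAt = nowMs;
            if (rightCm < HAND_THRESHOLD_CM) rightTriggeredAt = nowMs;

            if (leftTriggeredAt >= 0 && rightTriggeredAt >= 0
                    && Math.abs(leftTriggeredAt - rightTriggeredAt) <= SWIPE_WINDOW_MS) {
                Gesture g = leftTriggeredAt < rightTriggeredAt
                        ? Gesture.SWIPE_RIGHT : Gesture.SWIPE_LEFT;
                leftTriggeredAt = rightTriggeredAt = -1; // reset for the next gesture
                if (g == Gesture.SWIPE_RIGHT) map.panRight(); else map.panLeft();
                return g;
            }
            return Gesture.NONE;
        }
    }

In this sketch the direction of the swipe is inferred from which sensor saw the hand first; a real deployment would add debouncing, more gesture classes (for example zoom in/out from a vertical approach), and calibration of the thresholds to the user's setup.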