CS298 Proposal
Accelerometer-Based Motion Gestures for Mobile Devices
Neel Parikh (neelkparikh@yahoo.com)
Advisor: Dr. Chris Pollett
Committee Members: Dr. Chris Pollett, Dr. Robert Chun and Dr. Mark Stamp
Abstract:
Today's smartphones make extensive use of tiny sensors called accelerometers to provide enhanced user interface control. Accelerometers measure linear acceleration along the x, y, and z axes and are often used to control the orientation of the display screen, to drive motion in various applications, and so on. These sensors reduce the need for dedicated navigation and function keys on a mobile device. The goal of this project is to create accelerometer-based motion gestures such as a shake feature, image orientation, and zooming in/out. The project will also extend the WebKit browser interface on Google's Android platform with gestures such as scrolling for browsing a web page.
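As a rough illustration of the screen-orientation use mentioned above: when the device is at rest the accelerometer reading is dominated by gravity, so comparing the magnitudes of the x and y components gives a coarse portrait/landscape decision. The following is a minimal sketch in plain Java; the class name and logic are illustrative assumptions, not the Android API.

```java
// Sketch: derive a coarse screen orientation from raw accelerometer
// readings (x, y in m/s^2). Illustrative only, not the Android API.
public class OrientationSketch {
    public enum Orientation { PORTRAIT, LANDSCAPE }

    // Gravity pulls along the axis the device is held against:
    // a larger |y| suggests the device is upright (portrait),
    // a larger |x| suggests it is on its side (landscape).
    public static Orientation fromReading(float x, float y) {
        return Math.abs(y) >= Math.abs(x) ? Orientation.PORTRAIT
                                          : Orientation.LANDSCAPE;
    }
}
```

A real implementation would also smooth the readings and apply hysteresis so the display does not flip at the 45-degree boundary.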
CS297 Results
- 1. Created a test ‘Hello World’ program for the Android platform.
- 2. Integrated the Sensor Simulator with the Android emulator so that simulator values are reflected on the emulator screen. Added the ability to automatically play back a set of input values from the simulator for testing purposes.
- 3. Implemented an event that generates alert messages when a phone falls from a resting position and strikes the ground, based on changes in accelerometer values.
- 4. Implemented an accelerometer-based ‘Shake feature’ to erase a written text entry.
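The ‘Shake feature’ above relies on detecting an abrupt change between successive accelerometer samples. A minimal sketch of that detection logic in plain Java follows; the threshold value and class name are assumptions for illustration, not the project's actual code.

```java
// Sketch of shake detection: compare successive accelerometer samples
// and flag a shake when the change in the acceleration vector exceeds
// a threshold. The threshold is an illustrative assumption.
public class ShakeDetector {
    private static final float THRESHOLD = 12.0f; // m/s^2, illustrative
    private float lastX, lastY, lastZ;
    private boolean hasLast = false;

    // Feed one (x, y, z) sample; returns true when a shake is detected.
    public boolean onSample(float x, float y, float z) {
        boolean shaken = false;
        if (hasLast) {
            float dx = x - lastX, dy = y - lastY, dz = z - lastZ;
            float delta = (float) Math.sqrt(dx * dx + dy * dy + dz * dz);
            shaken = delta > THRESHOLD;
        }
        lastX = x; lastY = y; lastZ = z;
        hasLast = true;
        return shaken;
    }
}
```

The same sample-to-sample delta can drive the fall-detection alert in result 3, with a different threshold tuned to free fall followed by impact.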
Proposed Schedule
Week 1:
Aug. 26-Sep. 2 | Make CS 298 proposal |
Week 2 & 3:
Sep. 2-15 | Implement image orientation feature and Deliverable 1 due |
Week 4 & 5:
Sep. 16-30 | Implement zoom feature and Deliverable 2 due |
Week 6 & 7:
Oct. 1-13 | Test the Android browser with WebKit APIs |
Week 8 & 9:
Oct. 14-29 | Integrate the browser application with the Sensor Simulator and implement the scrolling feature |
Week 10 & 11:
Nov. 1-10 | Document the report work and submit to the graduate office and committee |
Week 12 & 13:
Nov. 11-25 | Make final changes in the report and complete all documentation |
Week 14 & 15:
Nov. 26-Dec. 10 | Defense and final presentation |
Key Deliverables:
- Software
- 1. Implement the image orientation feature based on accelerometer inputs.
- 2. Implement the zoom in/out feature driven by changing accelerometer values.
- 3. Implement the scrolling feature in a browser-based application for navigation.
- 4. Build an additional independent application for testing the media capabilities of the Android platform.
- Report
- 1. Detailed description of software deliverables
- 2. Final report and presentation
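For the zoom deliverable, one plausible approach (an assumption for illustration, not the planned implementation) is to map the device's forward/backward tilt, derived from the accelerometer's y and z components, to a clamped zoom factor:

```java
// Sketch: map device tilt (pitch derived from accelerometer y and z
// components) to a zoom factor, clamped to a sensible range. The
// mapping constants are illustrative assumptions, not a final design.
public class ZoomSketch {
    // Pitch in radians, computed from the gravity components along
    // the y and z axes while the device is held roughly still.
    public static double pitch(double y, double z) {
        return Math.atan2(y, z);
    }

    // Linear mapping: flat (pitch 0) -> 1.0x zoom; tilting toward the
    // user zooms in, tilting away zooms out, clamped to [0.5, 3.0].
    public static double zoomFactor(double y, double z) {
        double zoom = 1.0 + pitch(y, z);
        return Math.max(0.5, Math.min(3.0, zoom));
    }
}
```

In practice the raw readings would be low-pass filtered first, since hand tremor would otherwise make the zoom level jitter.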
Innovations and Challenges
- 1. An Android-based mobile device is not yet available in the market. Hence, the idea of using a Sensor Simulator to generate real-time accelerometer values and transmit them to the Android emulator for testing is innovative.
- 2. This idea has been extended with the ability to continuously play back a set of input values along the simulator's x, y, and z axes for real-time testing of motion gestures. This automation avoids manually moving the simulator with the mouse pointer to generate accelerometer values.
- 3. The coding complexity and physics involved in developing these motion gestures are challenging. Also, Android, being a new platform, comes with the constraints of limited resources, unknown bugs, and new releases that change the existing APIs.
References:
Official page of the Android project. http://code.google.com/android/index.html
Majoe, D., Schubiger, S., Clay, A., & Muller, S. (2007). SQEAK: A Mobile Multi Platform Phone and Networks Gesture Sensor. Proceedings of the 2007 IEEE 2nd International Conference on Pervasive Computing and Applications, 699-704.
Jang, I., & Park, W. (2003). Signal Processing of the Accelerometer for Gesture Awareness on Handheld Devices. Proceedings of the 2003 IEEE International Workshop on Robot and Human Interactive Communication, 139-144.