kiwiiii at gmail.com
Mon Mar 24 19:02:34 CET 2008
I'm a student at the French engineering school ENSIMAG, and I would like to
work for OpenMoko during the Google Summer of Code.
I'm interested in the accelerometer features: recognising gestures
is a really important part of the interface between the user and the
phone.
With the two accelerometers in the FreeRunner, I think we can recognise
lots of gestures, not only simple ones like a "click" (which is already
recognised by the accelerometers used in the FreeRunner). The main
difficulty is probably extracting the useful gesture data from the
noise: calibration may take time. The goal is to have an almost
pre-calibrated library (an idea from the wish-list in the Wiki is to
let the user record their own gestures, but I think it's hard to make
that simple for the end user).
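To make the noise problem concrete, here is a rough sketch (in Python;
the names and the ALPHA constant are my own assumptions, not an
existing OpenMoko API) of one classic approach: an exponential
low-pass filter that tracks the slowly-changing gravity baseline, so
the fast-changing remainder is the gesture itself.

    # Sketch: separate gesture motion from the gravity/noise baseline
    # with an exponential low-pass filter. ALPHA is a tuning constant
    # that calibration would have to choose.
    ALPHA = 0.9  # closer to 1.0 = slower-moving baseline

    def make_filter():
        baseline = [0.0, 0.0, 0.0]  # estimated gravity component
        def step(sample):
            # sample is an (x, y, z) acceleration tuple
            motion = []
            for i in range(3):
                baseline[i] = ALPHA * baseline[i] + (1.0 - ALPHA) * sample[i]
                motion.append(sample[i] - baseline[i])
            return motion  # the high-frequency part: the gesture
        return step

The pre-calibration would then mostly amount to shipping sensible
default constants like ALPHA for the FreeRunner's sensors.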
The accelerometers could provide not only recognition of small
gestures (like the ones listed on the Wiki: upside-down, shaking,
flipping, ...), but full 3D-space positioning from a start position
(when the software is started).
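As a sketch of what full positioning means (again Python, with made-up
units; the dt value is an assumption), position can be obtained by
integrating acceleration twice. This also shows why calibration
matters: any bias in the samples is integrated twice, so the estimate
drifts quickly.

    # Sketch: naive 3D positioning by double integration. A constant
    # bias in the samples grows quadratically in the position, so this
    # only works for short gestures unless the data is well calibrated.
    def integrate(samples, dt=0.01):
        # samples: list of (ax, ay, az) with gravity already removed
        velocity = [0.0, 0.0, 0.0]
        position = [0.0, 0.0, 0.0]
        path = []
        for accel in samples:
            for i in range(3):
                velocity[i] += accel[i] * dt     # acceleration -> velocity
                position[i] += velocity[i] * dt  # velocity -> position
            path.append(tuple(position))
        return path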
Then we can imagine lots of uses for the library: improvements in the
control of the phone, and programs specially created to use such
control (little games, for example).
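To show the kind of interface such programs could build on, here is
one hypothetical shape for the library (every name here is invented
for illustration, not an existing OpenMoko interface): a recognizer
that dispatches callbacks when gestures are detected.

    # Hypothetical sketch of the library's application-facing API.
    class GestureRecognizer:
        def __init__(self):
            self.handlers = {}

        def on(self, gesture, callback):
            # Register a callback for a named gesture ("shake", ...)
            self.handlers.setdefault(gesture, []).append(callback)

        def emit(self, gesture):
            # Called by the detection code when a gesture is seen
            for callback in self.handlers.get(gesture, []):
                callback()

    recognizer = GestureRecognizer()
    recognizer.on("shake", lambda: print("shuffle playlist"))
    recognizer.on("flip", lambda: print("mute ringer"))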
The accelerometer gestures could be combined with the touchscreen for
better control. For example, gesture navigation could be activated
only while pressing the screen: if we are viewing a large picture,
zoomed in, we could move through it by moving the phone, but we don't
want it to move all the time.
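A small sketch of that "pan only while pressed" idea (Python; the
class and its inputs are placeholders for whatever the real input
layer provides):

    # Sketch: translate tilt into panning only while the touchscreen
    # is pressed, so the picture does not drift all the time.
    class PanController:
        def __init__(self):
            self.pressed = False
            self.offset_x = 0.0
            self.offset_y = 0.0

        def touch(self, down):
            self.pressed = down  # finger on/off the screen

        def tilt(self, dx, dy):
            if self.pressed:  # ignore motion when not held
                self.offset_x += dx
                self.offset_y += dy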
Other examples given on the Wiki could be implemented using this
library.
I looked at the driver for the accelerometers, and it seems it is not
yet working. I don't think I'm able to work on the driver myself, so I
hope it will be working by this summer.
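Once the driver works, I would expect to read samples roughly like
this (a sketch assuming the driver exposes a standard Linux input
event node; the device path, the EV_ABS axes, and the struct layout
are all assumptions about the final driver):

    # Sketch: read raw accelerometer events from a Linux input node.
    import struct

    EVENT_FORMAT = "llHHi"  # timeval (sec, usec), type, code, value
    EVENT_SIZE = struct.calcsize(EVENT_FORMAT)
    EV_ABS, ABS_X, ABS_Y, ABS_Z = 0x03, 0x00, 0x01, 0x02

    def read_samples(device="/dev/input/event2"):
        with open(device, "rb") as f:
            while True:
                data = f.read(EVENT_SIZE)
                if len(data) < EVENT_SIZE:
                    break
                _s, _us, ev_type, code, value = struct.unpack(EVENT_FORMAT, data)
                if ev_type == EV_ABS and code in (ABS_X, ABS_Y, ABS_Z):
                    yield code, value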
I'm also interested in working on ambient noise detection as a second
choice.
I hope I'll be part of the OpenMoko project this summer.