GSoC Accelerometer-based Gestures Update

Kombipom kombipom at iinet.net.au
Sun Jun 15 12:39:17 CEST 2008


I think that this project is going to be a big help in getting around the
lack of hardware buttons on the FreeRunner. And add wow-factor of
course :-)

I don't think that gestures should be "hardwired" in your code to any
particular behaviour, but should act in the same way as button presses,
which can be linked to various user-defined actions.  If each gesture is
named with a tag, that tag could be linked to a particular behaviour in a
simple UI.  This would mean that the interface will do what seems natural
to each person and will be easily customisable.  Some gestures would act
as key presses (tap on the case for OK) and some could perform more
complex actions (facedown to change the active profile to sleep).  A
large number of gestural inputs could be included with the software, but
most of them left unbound by default to avoid the "why the hell did my
phone just do that" problem.

This would also mean that your software doesn't have to handle anything
other than gesture recognition and emitting simple outputs (to D-Bus?);
event handling would be done by other software, including parts of the
new framework.
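
For instance, the recogniser could just broadcast each detected gesture
as a D-Bus signal and leave the reactions to whoever is listening.  A
sketch using dbus-python; the object path and interface name are made
up, not part of any existing framework:

    import dbus
    import dbus.service
    from dbus.mainloop.glib import DBusGMainLoop

    class GestureEmitter(dbus.service.Object):
        """Broadcasts recognised gesture tags; listeners decide what to do."""

        def __init__(self, bus):
            dbus.service.Object.__init__(self, bus, "/org/example/Gestures")

        @dbus.service.signal("org.example.Gestures", signature="s")
        def GestureDetected(self, tag):
            pass  # dbus-python emits the signal when this method is called

    DBusGMainLoop(set_as_default=True)
    emitter = GestureEmitter(dbus.SessionBus())
    emitter.GestureDetected("tap")  # any interested app can react to this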

Some proposed general gestures:
Positional: facedown, faceup, upright, upsidedown, clocklandscape,
anticlocklandscape
Movement: tap, doubletap, flickback, flickforward, flickclock,
flickanticlock, tiltleftdown, tiltrightdown, tilttopdown, tiltbottomdown,
where a flick is a quick jerk and a tilt is more of a roll (see the
sketch below).
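
The positional ones at least should be cheap to detect.  A rough sketch,
assuming a low-pass-filtered accelerometer sample in m/s^2 and a guessed
axis convention (x right, y top, z out of the screen; the thresholds and
the clockwise/anticlockwise signs are guesses too):

    G = 9.8  # 1 g in m/s^2

    def posture(x, y, z, tol=3.0):
        """Classify a static posture from one smoothed sample."""
        if z >  G - tol:   return "faceup"
        if z < -(G - tol): return "facedown"
        if y >  G - tol:   return "upright"
        if y < -(G - tol): return "upsidedown"
        if x >  G - tol:   return "anticlocklandscape"
        if x < -(G - tol): return "clocklandscape"
        return None  # in motion, or mid-way between postures

    # Flicks vs tilts could then be separated in the movement detector:
    # a flick shows up as a short spike well above 1 g, a tilt as a slow
    # change in which axis the gravity vector sits on.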

Would the CPU and battery drain be too great to run gesture
recognition whenever the screen is on (and just turn it off in standby),
or will it always be limited to situations with an initiator, like an
incoming call or a button press?

kombipom


