GSOC and Accelerometer Gestures Idea

Daniel Willmann daniel at openmoko.org
Mon Mar 24 21:42:25 CET 2008


Hello,

On Wed, 19 Mar 2008 21:49:59 +0200
"Paul-Valentin Borza" <paulvalentin at borza.ro> wrote:

> My name is Paul-Valentin Borza (http://www.borza.ro) and I’m working
> on my Bachelor of Computer Science Thesis – Motion Gestures: An
> approach with continuous density hidden Markov models. I've designed
> and implemented continuous density hidden Markov models in C++ for
> the data measured by a 3-axis ±3G accelerometer (the Nintendo Wii
> Remote over Bluetooth).
> 
> There are several alternatives to motion (accelerometer) gestures
> like:
> 
[...]

it certainly looks like you did your homework already. :-)
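
(A quick aside for readers who have not met continuous-density HMMs:
instead of emitting discrete symbols, each state emits a probability
density, usually a Gaussian or a Gaussian mixture, over the raw 3-axis
acceleration vector. A minimal C++ sketch of a single diagonal-covariance
Gaussian emission; the names Sample and Gaussian3 are mine, not taken
from Paul's code:

#include <array>
#include <cmath>

// One accelerometer sample: acceleration along x, y, z in g.
using Sample = std::array<double, 3>;

// Diagonal-covariance Gaussian used as the emission density of one HMM state.
struct Gaussian3 {
    Sample mean;
    Sample var;  // per-axis variance, must be > 0

    // log N(x | mean, diag(var))
    double logDensity(const Sample& x) const {
        const double kLog2Pi = 1.8378770664093453;  // log(2*pi)
        double logp = 0.0;
        for (int i = 0; i < 3; ++i) {
            const double d = x[i] - mean[i];
            logp += -0.5 * (kLog2Pi + std::log(var[i]) + d * d / var[i]);
        }
        return logp;
    }
};

Real systems usually put a mixture of several such Gaussians on each
state, but the idea is the same.)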

> Once the models are created and trained, there are three types of
> recognitions:
> 
> Isolated recognition (the user presses a button, makes a gesture,
> releases the button and the gesture is recognized) – Viterbi Beam
> Search

This is definitely interesting. It could also be triggered by an event
other than a button press, e.g. an incoming call.
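
Just to illustrate how the trigger fits in: the press/release (or the
call event) only delimits the observation window, and the decoder then
scores that fixed window against each gesture model. A plain log-domain
Viterbi sketch in C++, building on the Sample/Gaussian3 sketch above and
leaving out the beam pruning; again, all names are mine:

#include <algorithm>
#include <limits>
#include <vector>

// Minimal HMM for one gesture: one Gaussian per state plus log-domain
// transition and initial-state probabilities.
struct GestureHmm {
    std::vector<Gaussian3> states;              // emission density per state
    std::vector<std::vector<double>> logTrans;  // logTrans[i][j] = log P(j | i)
    std::vector<double> logInit;                // logInit[j] = log P(start in j)
};

// Log-likelihood of the best state path through the whole observation window.
double viterbiLogScore(const GestureHmm& hmm, const std::vector<Sample>& obs) {
    const double NEG_INF = -std::numeric_limits<double>::infinity();
    if (obs.empty()) return NEG_INF;
    const size_t N = hmm.states.size();

    std::vector<double> delta(N);
    for (size_t j = 0; j < N; ++j)
        delta[j] = hmm.logInit[j] + hmm.states[j].logDensity(obs[0]);

    for (size_t t = 1; t < obs.size(); ++t) {
        std::vector<double> next(N, NEG_INF);
        for (size_t j = 0; j < N; ++j) {
            for (size_t i = 0; i < N; ++i)
                next[j] = std::max(next[j], delta[i] + hmm.logTrans[i][j]);
            next[j] += hmm.states[j].logDensity(obs[t]);
        }
        delta.swap(next);
        // Beam search would additionally drop states whose score falls more
        // than a fixed margin below the current best before the next frame.
    }
    return *std::max_element(delta.begin(), delta.end());
}

Isolated recognition is then just running this once per trained gesture
model over the button-delimited window and picking the best score, with
a rejection threshold so random movement does not match anything.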

> Connected recognition (the user presses a button, makes several
> gestures, releases the button and those several gestures are
> recognized) – 2-Level

I'm not sure how useful this would be in practice.

> Online recognition (the user just makes a gesture as the
> accelerometer is monitored constantly and the gestures are recognized
> on the fly) – this is the one that should be used on mobile devices

That would be the coolest, but I see two main problems: falsely
detecting gestures while you are just moving around, and battery
lifetime. As long as you monitor for gestures the CPU cannot go into
suspend, which will dramatically reduce battery life. Maybe we can
achieve almost the same level of seamless recognition by choosing our
trigger sources wisely.
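
To make the trigger-source idea a bit more concrete, this is the kind
of glue I have in mind. It is not an existing OpenMoko API, and
NamedModel, recognizeOnTrigger and the threshold are hypothetical names
of mine: the accelerometer would only be sampled for a couple of
seconds after an event we are awake for anyway (incoming call, alarm,
screen tap), and that window is scored against the loaded models:

#include <limits>
#include <string>
#include <vector>

// Hypothetical glue, building on the GestureHmm/viterbiLogScore sketch above.
struct NamedModel {
    std::string name;   // e.g. "answer", "hang-up", "silence"
    GestureHmm  hmm;
};

// Score a short post-trigger window against all loaded models and return the
// name of the best match, or "" if nothing clears the rejection threshold.
std::string recognizeOnTrigger(const std::vector<NamedModel>& models,
                               const std::vector<Sample>& window,
                               double threshold) {
    std::string best;
    double bestScore = -std::numeric_limits<double>::infinity();
    for (const NamedModel& m : models) {
        const double s = viterbiLogScore(m.hmm, window);
        if (s > bestScore) { bestScore = s; best = m.name; }
    }
    return bestScore > threshold ? best : std::string();
}

The incoming-call handler could capture two or three seconds of
samples, call this once, and answer the call if it returns "answer";
the rest of the time the CPU stays in suspend.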

> I’ll stop here with the theory. If someone needs further
> clarification, please ask.
> 
> I believe I have the required skills to build the Accelerometer
> Gestures Idea for the Google Summer of Code.
> 
> Gestures will be an innovation in mobile phones. Just imagine the
> scenario where your phone is on the table and it's ringing... You
> pick up the phone, see who's calling and you take your phone to your ear
> to talk (the phone answers the call automatically). The answer mobile
> gesture is exactly this: you take your phone to your ear. Remember
> that this is an involuntary action that you always perform. Pressing
> the green answer button will no longer be needed.

Really cool!

> It's almost like the phone is reading your mind!
> 
> Plus, the user can create his/her own custom gestures.

That's definitely a must.

> I already have a working solution for the Nintendo Wii Remote as
> described earlier and it should be easy to port on OpenMoko.

Cool, how accurate is your detection? Could you, say, hold the Wii/Neo
like a pen and "write" something that would then be digitized?

> What do you think?

By all means, please submit your application to GSoC. The earlier you
publish your application and timeline, the more time we have to give
feedback and help you refine it.

Regards,
Daniel Willmann