Alpha 2 Release of Accelerometer-based Gestures, and Screen Orientation

Paul-Valentin Borza paulvalentin at
Fri Aug 15 15:15:45 CEST 2008

On the sensitivity issue... The time window in which the recognizer considers
that a gesture was made is unfortunately hard-coded with a #define in this
release (sorry for that). However, there's something you can do: you can
train the classifier that detects dynamic acceleration - i.e. that decides
when you're making a gesture - to be more rigid.
You can't do it in GUI mode, but you can use the console:
There are 2 classes: static acceleration (s.class), and dynamic acceleration
(d.class).

Train the dynamic acceleration class:
gesm --neo2 --config /etc/accelges/neo2 --new d.class
gesm --neo2 --config /etc/accelges/neo2 --train d.class

Be aware that this isn't gesture training; it's the creation and training of
a classifier class. So when you shake the phone here, you'll have to press
the screen of the Neo (it doesn't matter where) at all times while you make
the move. When you release the screen, your dynamic class will be saved.
Also, once you do this, all the previously trained gestures can be
considered trash. You'll have to recreate and retrain all the gestures (use
the GUI).
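To make the static/dynamic distinction concrete, here is a toy sketch in Python. This is not the actual accelges classifier (which is trained from your recorded data); the window size and threshold below are made-up values for illustration only:

```python
import math

def is_dynamic(window, threshold=0.5):
    """Classify a window of accelerometer samples as dynamic (a gesture
    is being made) or static (the phone is merely being held).

    window: list of (x, y, z) samples in g units.
    This toy version just thresholds the standard deviation of the
    acceleration magnitude; a trained classifier learns this boundary
    from data instead of using a hard-coded threshold.
    """
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return math.sqrt(var) > threshold

# At rest the magnitude stays near 1 g -> static
still = [(0.0, 0.0, 1.0)] * 10
# A shake swings the magnitude around -> dynamic
shake = [(0.0, 0.0, 1.0), (1.5, 0.5, 2.0), (-1.0, -0.5, 0.2)] * 4

print(is_dynamic(still))  # False
print(is_dynamic(shake))  # True
```

Retraining d.class with more vigorous shakes effectively moves this decision boundary, so gentle handling no longer crosses it.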

Well, acceleration is not direction, unfortunately. A gyroscope can solve
these problems - an accelerometer and a gyroscope together would solve this
kind of issue. I can do something in the next release to correct the problem
where, in landscape, moving upwards is detected as 'right'; I will try that.
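The "factor out gravity" idea from the quoted mail below can be sketched like this. To be clear, accelges does not do this today; this is only an illustration of the suggested technique (track the sustained ~1 g component with a low-pass filter, then subtract it), and the smoothing factor is a made-up value. A full solution would also rotate readings into the gravity frame rather than just subtracting:

```python
ALPHA = 0.98  # smoothing factor: higher = slower-changing gravity estimate

def update_gravity(gravity, sample, alpha=ALPHA):
    """Low-pass filter: track the sustained ~1 g component as 'down'."""
    return tuple(alpha * g + (1 - alpha) * s for g, s in zip(gravity, sample))

def linear_acceleration(gravity, sample):
    """Subtract the gravity estimate to keep only device motion."""
    return tuple(s - g for g, s in zip(gravity, sample))

# Start with gravity along +z and feed some stationary samples
gravity = (0.0, 0.0, 1.0)
for sample in [(0.0, 0.0, 1.0)] * 50:
    gravity = update_gravity(gravity, sample)

# A rightward jerk now shows up on x, with gravity removed from z
jerk = (2.0, 0.0, 1.0)
print(linear_acceleration(gravity, jerk))  # (2.0, 0.0, 0.0)
```

Only a sustained force updates the gravity estimate, so a brief jerk doesn't get mistaken for a new "down" - which is exactly the behavior asked about below.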

No, you can't do that right now, as the duration is hard-coded.

To see what a gesture looks like, run:
gesm --neo2 --config /etc/accelges/neo2 --view up.model
or whatever model you like - it's a continuous-density, left-to-right hidden
Markov model.
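For readers unfamiliar with the term: a left-to-right HMM only allows self-loops and forward transitions, which suits gestures because they progress through their phases in one direction. A minimal sketch of such a transition matrix (the state count and probabilities here are made-up, not taken from the accelges models):

```python
def left_to_right_transitions(n_states, p_stay=0.6):
    """Build an n_states x n_states left-to-right transition matrix:
    each state either stays put or advances to the next state, never
    moves backwards; the final state only loops on itself."""
    A = [[0.0] * n_states for _ in range(n_states)]
    for i in range(n_states):
        if i == n_states - 1:
            A[i][i] = 1.0             # absorbing final state
        else:
            A[i][i] = p_stay          # self-loop: linger in this phase
            A[i][i + 1] = 1 - p_stay  # advance to the next phase
    return A

for row in left_to_right_transitions(4):
    print(row)
```

"Continuous density" means each state emits real-valued acceleration vectors (typically via Gaussians) rather than discrete symbols, which is what training fits to your recorded moves.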


On Thu, Aug 14, 2008 at 11:31 PM, Daniel Benoy <daniel at> wrote:

> Looks good :)
> Here's my experiences, don't know if these are planned for future releases:
> I don't know if something is wrong for me though because it's really
> sensitive.  Handling my phone ordinarily and gently results in a lot of
> 'shake shake'.  Would it be possible to require a constant shaking motion
> for 2 seconds or something before it registers? Also it doesn't seem to
> factor out gravity (I don't know if that is possible?)  For example, if I
> turn my phone upside down, the screen orientation goes with it (Which works
> great by the way!).  If I jerk my phone to the right, up comes 'left'.
>  That's not right.  (Hahaha punny!)  And if I hold my screen perpendicular
> to the ground, and jerk the phone upwards and then downwards it detects
> 'forward, backward' etc etc etc.  You're the expert so correct me if I'm
> wrong, but can we not detect a reasonably consistent 1G force, and then
> apply a rotation matrix or something to every input value so that things are
> relative to that direction (And only change the known gravity direction if
> 1G is sustained in one direction for a long enough period of time) ?
> These training files, how advanced are they?  Would I be able to write one
> that says something like 'If the accelerometer detects between three and
> five sudden changes in direction over #Gs that occur over a period that's no
> less than 2 seconds but no more than 4 seconds?', or stuff that advanced?
> Thanks for the great work so far :)
> On Thursday 14 August 2008 13:01:35 Paul-Valentin Borza wrote:
> > I'm proud to announce that the new release of accelerometer-based
> > gestures, and screen orientation is now available for download.
> > What you've seen in the video from
> > is now available.
> >
> > This release includes:
> > An application with user interface that allows the user to train the
> > gestures for himself/herself;
> > A listener daemon that sends a notification on the screen of the
> > recognized gesture;
> > Automatic switching of screen orientation between the four possible
> > modes (2x portrait, and 2x landscape).
> >
> > Here's the direct link for the release:
> >
> > You can find documentation, installation instructions, screenshots
> > etc. on the Wiki:
> > There's a quick way to install it, and a more detailed way... Read
> >
> >
> > I would suggest carefully reading the instructions, and running the
> > gesture listener as soon as you install the package (i.e. before
> > training).
> > Of course, the gestures were not trained for you (unfortunately I had
> > a limited set of training data - only myself), so you'll have to train
> > them for yourself.
> >
> > Have fun with it!
> >
> > Thanks,
> > Paul
> > --
> >
> >
> > _______________________________________________
> > Openmoko community mailing list
> > community at
> >
> >
> --
> Daniel Benoy
