GSoC 2008

Andy Green andy at
Tue Mar 25 17:52:14 CET 2008

Somebody in the thread at some point said:
> Somebody in the thread at some point said:
>> The raw accelerometer data is predicated around byte data for X Y Z per
>> sample per motion sensor.
>> Ultimately the "gesture recognition" action is about eating 200 3-byte
>> packets of data a second and issuing only one or two bytes per second
>> about any "gesture" that was seen.
> The Spec document for the accelerometers says the refresh rate can be
> chosen: 100Hz, or 400Hz. The three values are stored as 2's complement
> number in one byte for each axis.

Yes.  In GTA02 the sensors are serviced by the CPU via separate
interrupts, so we have to eat the power consumption of 200
interrupts/sec... I figure 800 interrupts/sec might be a bit much, so
currently it works at 100Hz.  Maybe that means higher-frequency
subtleties are lost; I guess we can find out.

> Somebody in the thread at some point said:
>> Or a rotating shake with axis long side: it looks completely different when
>> device is upright, 45°, or flat (in fact this gesture isn't detectable at all
>> in upright position, in the first place).
>> Only correct way is to calculate real heading and velocity vector of device,
>> as accurate as possible. Then accumulate a "route", and this route you may
> With the two accelerometers, we can calculate the position (and the
> velocity) of the two accelerometers from a start point (position and
> velocity). But this is not enough to have the position of the whole
> phone in space: we don't know the rotation movement along the
> axis defined by the two accelerometers.
> As we know that the relative position of the two accelerometers is
> fixed, it could help to detect calculation errors, and maybe correct
> them (a little...).

The deal is they are placed like this:

 /  <-- "top accel" at 45 degree angle at top of board
        to left of transducer

    _  <-- "bottom accel" unrotated to right of mic

Both times "pin 1" is towards the bottom left corner of the PCB.

> Regarding the work on an MPU, if I've understood what I've read in the
> mailing list archives, it's still just an idea, and the FreeRunner
> won't have one, am I right?

Right.  You have to use the main CPU there.

> For the GSoC, I think working on a simple library which uses the CPU
> would already be a good thing (but we can work with the idea in mind
> that the code will need to be ported to an MPU).


> The library could provide two things :
> * the recognition of typical gestures
> * the position and velocity evolutions in time

Maybe it makes sense to put this functionality into the interrupt
service routine.  Because if we stay with raw accel data, the userspace
app blocking on /dev/input/event* can also be woken at ~100Hz or so, I
guess, and that is not a great way to save power.

-Andy
