Nad.Oby at gmail.com
Tue Mar 25 07:12:58 CET 2008
Niluge KiWi wrote:
> With the two accelerometers in the FreeRunner, I think we can recognise
> lots of gestures, not only simple ones like a "click" (which is already
> recognised by the accelerometers used in the FreeRunner). The main
> difficulty is probably to extract the useful data from the gesture
> noise: calibration may take time. The goal is to have an almost
> pre-calibrated library (an idea from the wish-list in the Wiki is to
> allow the user to record their own gestures, but I think it's not easy
> to make this simple for the end-user).
Good idea, but consider storing the calibration data separately. That
would make the library more general: you may want to reuse it on other
devices, and the same goes for the "recorded" gestures.
> The accelerometers could provide not only small gestures recognition
> (like the ones listed on the Wiki: up-side-down, shaking,
> flipping, ...), but full 3D-space positioning from a start position
> (when the software is started).
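Full 3D positioning from a start position basically means dead
reckoning: integrate acceleration twice over time. A rough sketch of
that (trapezoidal integration; the function name and sample format are
illustrative) shows why calibration matters so much here, since any
uncorrected sensor noise or offset makes the computed position drift
quickly:

```python
def integrate(samples, dt):
    """Dead-reckon a final (x, y, z) position from a list of
    (ax, ay, az) acceleration samples taken every dt seconds,
    starting at rest at the origin. Trapezoidal rule, twice:
    acceleration -> velocity -> position."""
    vel = [0.0, 0.0, 0.0]
    pos = [0.0, 0.0, 0.0]
    prev = (0.0, 0.0, 0.0)   # assume the device starts at rest
    for acc in samples:
        for i in range(3):
            v_prev = vel[i]
            vel[i] += 0.5 * (prev[i] + acc[i]) * dt  # integrate accel
            pos[i] += 0.5 * (v_prev + vel[i]) * dt   # integrate velocity
        prev = acc
    return tuple(pos)
```

Note the double integration also doubles the error growth: a constant
offset in acceleration shows up as position error growing with t^2,
which is exactly the drift that per-device calibration has to remove.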
So long, and thanks for all the fish.