some UI and text input thoughts

Christopher Friedt chrisfriedt at gmail.com
Wed Jan 14 04:29:35 CET 2009


Hi everyone,

I wanted to write to the list about two things: 1) zooming in the
browser, photo viewer, etc., and 2) 'swiping' instead of typing.

1)

Today I took a break from writing an i2c driver and thought about how
great multi-touch screens are. Specifically, I really like the ability
of [a certain unnamed device, not that I own one] to zoom in and out
of a page while running the browser. With [said device], the user
places two fingertips on the screen and zooms in by spreading the
fingertips apart. Zooming out is the opposite.

I only know the basics of how touchscreens work, but I do know that
some similar input devices (namely Synaptics-style laptop touchpads)
can recognize different paths that the fingertip traces and generate
programmed events accordingly.

Assuming (well, hoping really) that could also apply to the
touch-screen on the Neo or FR, I thought of these two gestures, which
are analogous to turning a screwdriver:

clockwise circle == zoom in
counter-clockwise circle == zoom out

Are there any hardware engineers who can verify whether the
touch-screen on the Neo or FR could register or generate motion-based
events? If not, is this something that could be done easily in
userspace (e.g. via inputproto)?
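
Just to make the userspace option concrete, here is a rough sketch of
what I mean (Python purely for illustration; a real version would be
C inside whatever layer reads the events). It assumes nothing beyond
a stream of (x, y) samples from the touchscreen, and the threshold is
a made-up tuning value, not anything from an existing API:

  import math

  def circle_direction(points, min_turn=2.0):
      """Classify a finger trace as 'cw', 'ccw', or None.

      points   -- list of (x, y) touchscreen samples
      min_turn -- minimum accumulated turning (radians) before we
                  call it a circle; a made-up tuning value
      """
      if len(points) < 3:
          return None

      total_turn = 0.0
      for i in range(1, len(points) - 1):
          x0, y0 = points[i - 1]
          x1, y1 = points[i]
          x2, y2 = points[i + 1]
          # Consecutive motion vectors along the trace.
          ax, ay = x1 - x0, y1 - y0
          bx, by = x2 - x1, y2 - y1
          # Signed angle between them; the sign carries the turning
          # direction.
          total_turn += math.atan2(ax * by - ay * bx, ax * bx + ay * by)

      # With screen coordinates (y grows downward), a clockwise trace
      # accumulates positive turning; flip the test if your axes differ.
      if total_turn > min_turn:
          return 'cw'    # zoom in
      if total_turn < -min_turn:
          return 'ccw'   # zoom out
      return None

A full clockwise turn accumulates roughly 2*pi of turning, so even a
sloppy three-quarter circle clears the threshold, while a straight
drag does not.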

2)

Swiping (as some people have named it) is an input method for mobile
devices with on-screen keyboards, where a word is spelled out in its
entirety without lifting the finger from the touchscreen.

Several companies have claimed a patent on swiping, including Apple,
the company behind T9, and others. I really don't need to point out
the obvious conflict that the USPTO has created, but I thought I
would anyway.

In any event, there are many places in the world where software is
non-patentable (and hopefully that remains so). So for those of us who
live in those parts of the world, wouldn't it be nice to 'swipe' on
the Neo or FR?

I did some quite similar lab work at university a while ago, which I
think is the underlying mechanism for swiping. The curve that the
finger traces is encoded as a series of corners and points projected
onto the complex plane. The discretized signal is then complex but
1-dimensional (like audio), which is really easy to pack and store in
a database. Pattern matching involves Fourier analysis of the encoded
path. The first component of the Fourier-transformed signal is the
'centre of mass' of the path. The sum of the first two components is
an ellipse that encircles the corners. The sum of successive
components results in a signal that increasingly resembles the path
of the finger.
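
To make that concrete, here is a small sketch of the encoding and
matching I have in mind (Python with numpy, purely illustrative; the
coefficient counts and the resampling step are my own assumptions,
not part of any shipped implementation):

  import numpy as np

  def encode_path(points, n_samples=64, n_coeffs=16):
      """Turn a finger path into a short vector of Fourier descriptors.

      points    -- list of (x, y) touchscreen samples
      n_samples -- path is resampled to this many points so that
                   descriptors from different swipes are comparable
      n_coeffs  -- low-frequency coefficients to keep (tuning choice)
      """
      pts = np.asarray(points, dtype=float)
      # Resample by linear interpolation over the point index.
      idx = np.linspace(0, len(pts) - 1, n_samples)
      x = np.interp(idx, np.arange(len(pts)), pts[:, 0])
      y = np.interp(idx, np.arange(len(pts)), pts[:, 1])
      # The path as a 1-D complex signal: z[k] = x[k] + i*y[k].
      z = x + 1j * y
      Z = np.fft.fft(z)
      # Z[0] / n_samples is the centre of mass of the path; dropping
      # it makes the descriptor independent of position.
      half = n_coeffs // 2
      desc = np.concatenate([Z[1:1 + half], Z[-half:]])
      # Normalising by the first harmonic removes overall scale.
      return desc / np.abs(Z[1])

  def distance(desc_a, desc_b):
      """Smaller is more similar; comparing magnitudes ignores
      rotation and the starting point of the trace."""
      return np.linalg.norm(np.abs(desc_a) - np.abs(desc_b))

Matching a swipe would then be a nearest-neighbour lookup of its
descriptor against a table of descriptors pre-computed from each
dictionary word's ideal path over the keyboard layout. Reconstructing
the path from only the first few coefficients is also a nice way to
see the 'ellipse that encircles the corners' property I described
above.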

Does anyone have any questions / comments about 'swiping'?

Would anyone want to implement it on the Neo / FR?

Cheers,

Chris


