some UI and text input thoughts

Christopher Friedt chrisfriedt at
Fri Feb 6 17:54:26 CET 2009

All of your insight sounds great. Personally, I'm a little bit more
interested in the 'swyping' concept - has anyone here used it? I've
only seen demonstration videos.

I have no time to do this personally, but it would be nice to be able
to use T9 input, an on-screen-keyboard, a full terminal-capable
keyboard, and a 'swyping' keyboard, for different applications. Maybe
some clever programmer will add a 'swyping' input engine to Android
whenever they decide to create the input method framework.



On Wed, Jan 14, 2009 at 8:36 PM, The Rasterman Carsten Haitzler
<raster at> wrote:
> On Thu, 15 Jan 2009 01:19:52 +0100 Marcos Mezo <mmezo at> babbled:
>> On Thursday 15 January 2009 00:42:49 Carsten Haitzler wrote:
>> > On Wed, 14 Jan 2009 10:01:20 +0000 Andy Green <andy at> babbled:
>> >
>> > nothing to do with hardware here - all to do with software stack. the fact
>> > is there is no generic "zoom" control for apps - and apps have no concept
>> > of one. you also need to think about where in the stack you sit - if you go
>> > to /dev/input... you will be doing your own driver work specific to 1
>> > hardware type - if that changes - you are going to have to adapt. the best
>> > bet is to do this higher up at the x level - but here you hit a problem.
>> > mouse events ONLY go to the window they are on (or to who grabbed the
>> > mouse). so in this case every app/toolkit needs to handle these themselves
>> > - OR you use an extension like xevie and we put in an indirector for all
>> > events - this indirector can/will be responsible for:
>> >
>> > 1. filtering (removing events, delaying and getting rid of garbage - if it
>> > wants)
>> > 2. selectively passing along some events and not others possibly with
>> > modifications (eg translate/scale the input/output - needed for a
>> > compositor if you do things like scale the window output in the compositor)
>> > 3. can interpret series of events into gestures and then produce some form
>> > of message (the protocol and standard are yet to be defined) that can be
>> > sent to the root window or the focused window - or possibly just execute
>> > some command etc. etc.
>> >
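The three indirector roles above can be sketched in miniature. Everything here is illustrative only - the function names, thresholds, and the flat (x, y) point-list representation of motion events are invented for the sketch, and none of it is a real X extension API (as the mail says, the protocol and standard are yet to be defined):

```python
import math

def filter_jitter(points, min_dist=2.0):
    """Role 1: filtering - drop garbage by discarding moves smaller
    than min_dist pixels from the last kept point."""
    if not points:
        return []
    kept = [points[0]]
    for p in points[1:]:
        if math.hypot(p[0] - kept[-1][0], p[1] - kept[-1][1]) >= min_dist:
            kept.append(p)
    return kept

def transform(points, scale=1.0, dx=0, dy=0):
    """Role 2: translate/scale coordinates, e.g. to map input back onto a
    window whose output a compositor has scaled."""
    return [(x * scale + dx, y * scale + dy) for x, y in points]

def classify_swipe(points, min_len=40.0):
    """Role 3: interpret a series of points as a gesture "message".
    Returns a direction string, or None if the stroke is too short."""
    if len(points) < 2:
        return None
    x0, y0 = points[0]
    x1, y1 = points[-1]
    ddx, ddy = x1 - x0, y1 - y0
    if math.hypot(ddx, ddy) < min_len:
        return None
    if abs(ddx) > abs(ddy):
        return "swipe-right" if ddx > 0 else "swipe-left"
    return "swipe-down" if ddy > 0 else "swipe-up"
```

In a real indirector these stages would sit between the server's event stream and client windows; here they just show how little machinery each role needs.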
>> I don't really know much about it, but it seems to me that's what tslib
>> does for right-mouse-button emulation.
> tslib does indeed do this - but it does it at a different layer (between x and
> the driver as tslib also serves as a driver emulation layer).
>> Maybe somebody with the knowledge could extend tslib with some more gestures,
>> like for example the said (counter)clockwise circling that could emulate a
>> mouse wheel - not standard, but common for zooming in a lot of photo
>> viewers, for example. It would also be useful for scrolling text... Or maybe
>> something like ^ to emulate the "up" key or "page-up" or...
>> The problem is always that, for example, people drawing on the touchscreen
>> (pypennotes?) or for that matter playing with numptyphysics would not be
>> very happy :-), so if implemented it should at least be easily
>> enabled/disabled, both with a gui and also with something accessible to
>> scripts, to be added on application launchers.
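The (counter)clockwise-circling idea can be detected by summing the signed angle changes of the stroke around its centroid: a full turn accumulates to about ±2π. This is only a sketch under that assumption, not tslib code; note that in screen coordinates (y growing downward) the sign convention flips relative to ordinary math axes:

```python
import math

def detect_circling(points, min_turns=0.75):
    """Sum signed angle deltas about the stroke's centroid. A positive
    total means counterclockwise in math axes (y up); on a screen with
    y growing downward the same sign means clockwise. Returns "ccw",
    "cw", or None if the stroke doesn't complete enough of a turn."""
    if len(points) < 8:
        return None
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    total = 0.0
    prev = math.atan2(points[0][1] - cy, points[0][0] - cx)
    for x, y in points[1:]:
        ang = math.atan2(y - cy, x - cx)
        d = ang - prev
        # unwrap to (-pi, pi] so crossing the +/-pi seam doesn't spike
        while d > math.pi:
            d -= 2 * math.pi
        while d <= -math.pi:
            d += 2 * math.pi
        total += d
        prev = ang
    turns = total / (2 * math.pi)
    if turns >= min_turns:
        return "ccw"
    if turns <= -min_turns:
        return "cw"
    return None
```

Each detected turn (or fraction of one) could then be mapped to a synthetic scroll-wheel click, which is exactly the emulation suggested above.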
> that's why you'd want to do it above tslib - in x. at that point apps can set
> hints on their windows like "do not interpret gestures" as the app may have
> complex interactions and its own gesture handling in specific cases.
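That per-window hint could look roughly like the following, with the indirector consulting the hint before running any recognizer. This is purely schematic: window hints are modeled as a dict, whereas in real X they would be window properties, and the property name "_NO_GESTURES" is invented here (no such standard exists):

```python
def dispatch(events_by_window, window_hints, recognize):
    """Run the gesture recognizer only for windows that did not opt out
    via the (hypothetical) _NO_GESTURES hint; opted-out windows get
    their raw events passed through untouched (None result here)."""
    results = {}
    for win, points in events_by_window.items():
        if window_hints.get(win, {}).get("_NO_GESTURES"):
            results[win] = None  # e.g. a drawing app handling strokes itself
        else:
            results[win] = recognize(points)
    return results
```

So a sketching app or numptyphysics would set the hint and keep raw strokes, while everything else gets gestures for free.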
>> Marcos
>> _______________________________________________
>> devel mailing list
>> devel at
> --
> ------------- Codito, ergo sum - "I code, therefore I am" --------------
> The Rasterman (Carsten Haitzler)    raster at
> _______________________________________________
> devel mailing list
> devel at
