Shortcuts for Pointer Devices in KDE

todd rme toddrme2178 at gmail.com
Wed May 16 12:46:00 UTC 2012


On May 15, 2012 7:02 PM, "Rick Stockton" <rickstockton at reno-computerhelp.com>
wrote:
>
> On Tue, 05/15/2012 at 08:39:17 +0200, todd rme <toddrme2178 at gmail.com>
> wrote:
>>
>> Second, I was not really suggesting we implement support for other
>> devices now.  My point is that we should make the system flexible
>> enough that third parties could write their own backends that provide
>> button events (remote control buttons, joystick buttons, etc.) or
>> button-like events (speech recognition, gestures, network signals from
>> a smartphone).....
>
> Gestures and Speech recognition are not like buttons. Button Events are
> instantaneous singletons, while Gestures and Speech are recognized over a
> period of time. (Gestures are already present and fairly easy to use in Qt
> 5.0, of course.) Speech would need to be implemented like Gestures, and the
> scope of work is large.

Within the gesture or speech recognition system, yes. But at the end of the
gesture or phrase, the recognizer can output a single discrete event
indicating which pre-recorded gesture or phrase was detected.

In principle, a shortcut system doesn't need to care what sort of
processing is done or how long it takes, as long as a discrete event is
produced at the end.
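
To make that concrete, the kind of event I have in mind is something very
small. All the names below are invented just to illustrate the idea; a real
version would of course fit into the existing KDE/Qt types:

    // Sketch of the discrete event a backend would emit once its own
    // processing (however long it takes) has finished. The shortcut
    // system would only ever see this final result, never the raw input.
    #include <string>

    struct TriggerEvent {
        std::string sourceId;   // which backend: "keyboard", "mouse", "gesture", "speech", ...
        std::string triggerId;  // which button, gesture or phrase was recognized
    };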

My suggestion is to make this principle a reality. The shortcut system
would just listen for properly-formatted events provided by plug-ins, and
use those as shortcuts.  These plug-ins could be shipped with KDE, in the
case of the mouse and keyboard ones, but could also be loaded at runtime
from third-party applications.  The shortcut system would neither know nor
care exactly how the events came about; that would all be handled by the
plug-in in whatever way is most suitable for that plug-in.
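
Roughly, I am picturing an interface along these lines. Again, the names
are purely hypothetical and it builds on the TriggerEvent sketch above; a
real implementation would use the usual KDE plugin loading rather than
plain C++:

    #include <functional>
    #include <map>
    #include <string>

    // A backend plug-in: the shortcut system hands it a callback, and the
    // backend invokes that callback whenever it has recognized a button
    // press, gesture, phrase, remote-control signal, etc.
    class ShortcutBackend {
    public:
        virtual ~ShortcutBackend() = default;
        virtual void start(std::function<void(const TriggerEvent &)> emit) = 0;
    };

    // The core of the shortcut system: it only maps (source, trigger)
    // pairs to actions, and neither knows nor cares how the backend
    // produced the event.
    class ShortcutRegistry {
    public:
        void bind(const std::string &sourceId, const std::string &triggerId,
                  std::function<void()> action)
        {
            m_actions[sourceId + "/" + triggerId] = std::move(action);
        }

        void dispatch(const TriggerEvent &ev)
        {
            auto it = m_actions.find(ev.sourceId + "/" + ev.triggerId);
            if (it != m_actions.end())
                it->second();
        }

    private:
        std::map<std::string, std::function<void()>> m_actions;
    };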

I know this would require more substantial changes to how shortcuts are
handled, but it would allow for a much more flexible and future-proof
system that would let us integrate a lot of things that currently use
their own implementations.  It would also provide an abstraction layer
between the underlying button (or button-like) events and the
applications, so applications would neither know nor care what exactly
provided the events they are using.
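
From an application's point of view, using the hypothetical registry
sketched above, binding an action might then look something like this;
whether the trigger ends up coming from the keyboard, an extra mouse
button, a gesture or a spoken phrase would be entirely the plug-in's
business:

    #include <cstdio>

    int main()
    {
        ShortcutRegistry registry;

        // The application only names an action and a default trigger. The
        // user could later rebind the same action to "speech/new document"
        // or "mouse/Button9" without the application changing at all.
        registry.bind("keyboard", "Ctrl+Shift+N",
                      [] { std::puts("new document"); });

        // A backend would normally deliver this through the callback it
        // was given in start(); here we fake one event by hand.
        registry.dispatch(TriggerEvent{"keyboard", "Ctrl+Shift+N"});

        return 0;
    }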

This sort of abstraction is already used throughout KDE; it is the same
sort of principle used in KIO, Phonon, Solid, Plasma dataengines, etc.

-Todd