[Kde-accessibility] FW: ATIA trip report - GNOME presentations & Java tools

Bill Haneman bill.haneman@sun.com
27 Jan 2003 14:07:15 +0000


On Sat, 2003-01-25 at 20:26, Datschge@gmx.net wrote:
> Sounds very nice.
> Especially interesting for me was following quote:
> 
> "Peter also demonstrated the special,
> dynamic features of GOK.  He showed how GOK utilizes the rich accessibility
> architecture of the GNOME platform to enumerate the menu items and toolbar
> elements, and present them on their own dynamic keyboards allowing switch
> and dwell users to directly select them (rather than having to select the
> TAB button on the on-screen keyboard a series of times to get to the toolbar
> entry).  He also demonstrated the special "UI Grab" feature of GOK, which
> creates a special keyboard for dialog boxes and other windows presenting all
> of the controls for direct selection of them."
> 
> How exactly is this going to work, i.e. in what regard does it differ from
> standard keyboard input or a keyboard-controlled pointer? Are on-screen
> elements dynamically mapped to specific keys on the keyboard?

Datschge:

GOK, like most onscreen/virtual keyboards, is designed primarily for use
by people who cannot operate a normal keyboard.  All of GOK's keyboards
are virtual, and most are dynamically created.  GOK supports several
"access methods" or input methods: for instance, direct point-and-click
of onscreen keyboard keys, or "dwell selection", whereby a key on the
virtual keyboard is activated if the pointer dwells within the key's
bounds for longer than some specified period of time.  These methods can
be used, for instance, by users with head pointers or trackers which
follow their eye movements, whether or not the user is able to indicate
a "click" via a blink or a "sip/puff" on a mouth-mounted switch.
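
To make the idea concrete, here's a rough Python sketch of dwell
selection (this isn't GOK's actual code; the key names and timing
values are made up for illustration): track which key the pointer is
over, and activate it once the pointer has stayed inside that key's
bounds for a configurable dwell time.

import time

DWELL_TIME = 1.0  # seconds the pointer must stay on a key (illustrative)

class DwellSelector:
    """Toy model of dwell selection: activate a key once the pointer
    has remained inside its bounds for DWELL_TIME seconds."""

    def __init__(self, activate):
        self.activate = activate   # callback invoked with the selected key
        self.current_key = None    # key the pointer is currently over
        self.enter_time = None     # when the pointer entered that key

    def pointer_moved(self, key_under_pointer, now=None):
        now = now if now is not None else time.time()
        if key_under_pointer != self.current_key:
            # Pointer moved to a different key (or off the keyboard):
            # restart the dwell timer.
            self.current_key = key_under_pointer
            self.enter_time = now
        elif self.current_key is not None and now - self.enter_time >= DWELL_TIME:
            # Pointer has dwelt long enough: treat it as a key activation.
            self.activate(self.current_key)
            self.enter_time = now  # don't re-activate on every motion event

# Feed it simulated pointer positions:
selector = DwellSelector(activate=lambda key: print("activated:", key))
selector.pointer_moved("File menu", now=0.0)
selector.pointer_moved("File menu", now=1.2)  # dwell exceeded, key fires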

For other users who cannot reliably point to the screen, but who are
able to control one or more simple switches, the switches are connected
to simple devices which turn switch events into "button" events from the
point of view of the operating system (for instance, XInput button
events).  In these cases GOK supports various scanning modes: for
instance, a switch press initiates row-by-row scanning of the keyboard,
followed by column-by-column scanning within the chosen row, until the
user reaches the desired key.  This is of course time-consuming, so it's
vital that the user be able to interact with the application using the
minimum number of such "keystrokes".
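
As a rough illustration of scanning (again, not GOK's actual code; the
keyboard layout below is invented for the example), row/column scanning
can be modelled as a small state machine: the first switch press starts
stepping through rows, the second press locks the highlighted row and
starts stepping through its keys, and the third press activates the
highlighted key.

keyboard = [                 # invented layout, purely for illustration
    ["File", "Edit", "View"],
    ["a", "b", "c"],
    ["space", "Enter", "UI Grab"],
]

class RowColumnScanner:
    """Toy row/column scanner: press 1 starts row scanning, press 2
    locks the highlighted row and starts column scanning, press 3
    activates the highlighted key."""

    def __init__(self, rows):
        self.rows = rows
        self.state = "idle"
        self.row = 0
        self.col = 0

    def tick(self):
        # Called periodically (e.g. by a timer) to advance the highlight.
        if self.state == "rows":
            self.row = (self.row + 1) % len(self.rows)
        elif self.state == "cols":
            self.col = (self.col + 1) % len(self.rows[self.row])

    def switch_pressed(self):
        # Called whenever the user's switch produces a button event.
        if self.state == "idle":
            self.state, self.row = "rows", 0
        elif self.state == "rows":
            self.state, self.col = "cols", 0
        else:
            key = self.rows[self.row][self.col]
            self.state = "idle"
            return key                     # the selected key
        return None

scanner = RowColumnScanner(keyboard)
scanner.switch_pressed()          # start scanning rows
scanner.tick(); scanner.tick()    # highlight advances to the third row
scanner.switch_pressed()          # lock the row, start scanning its keys
scanner.tick()                    # highlight advances to "Enter"
print(scanner.switch_pressed())   # prints: Enter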

GOK works by using AT-SPI, the common API for getting accessibility
information from the GNOME desktop; this API is currently proposed (on
this list) for use by KDE applications as well, to enable KDE
applications to work with assistive technologies like GOK and screen
readers just as GNOME applications now do.  In GOK's case, AT-SPI allows
GOK to query the UI of the currently running applications to find menus,
toolbars, and other interface components which the user can activate,
and to map those items dynamically onto "keys" on the virtual keyboard.
This means, for instance, that a GOK user can navigate a dialog and
activate a pushbutton with only one selection action, rather than the
many selections which would be required by ordinary keyboard navigation.
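
To give a feel for what that query looks like in practice, here is a
small sketch using the pyatspi Python bindings to AT-SPI (GOK itself is
a C application using AT-SPI's C client API, so treat this purely as an
illustration of the idea): walk each running application's accessible
tree and collect the components a user could activate, which are the
items GOK would place on a dynamic keyboard.

import pyatspi

# Roles that typically represent something the user can activate.
ACTIVATABLE_ROLES = {
    pyatspi.ROLE_MENU_ITEM,
    pyatspi.ROLE_PUSH_BUTTON,
    pyatspi.ROLE_CHECK_BOX,
    pyatspi.ROLE_RADIO_BUTTON,
}

def collect_activatable(acc, found):
    """Recursively gather accessibles the user could activate."""
    try:
        if acc.getRole() in ACTIVATABLE_ROLES:
            found.append(acc)
        for i in range(acc.childCount):
            collect_activatable(acc.getChildAtIndex(i), found)
    except Exception:
        pass  # applications can come and go while we walk the tree

desktop = pyatspi.Registry.getDesktop(0)
for i in range(desktop.childCount):
    app = desktop.getChildAtIndex(i)
    items = []
    collect_activatable(app, items)
    for item in items:
        # Each of these could become one "key" on a dynamic GOK
        # keyboard; activating that key would invoke the item's
        # Action interface, e.g. item.queryAction().doAction(0).
        print(app.name, "->", item.getRoleName(), item.name)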

If you have more questions about this I encourage you to visit the GOK
site http://www.gok.ca, the GNOME Accessibility GOK site
http://developer.gnome.org/projects/gap/AT/GOK, and also to try GOK
yourself.

best regards,

Bill

> 
> Thanks, Datschge
-- 
Bill Haneman <bill.haneman@sun.com>