[RFC] kscreen and touchscreen rotation

Sebastian Kügler sebas at kde.org
Mon Sep 11 11:14:39 UTC 2017


On Monday, September 11, 2017 12:56:09 PM CEST David Edmundson wrote:
> On Mon, Sep 11, 2017 at 11:58 AM, Sebastian Kügler <sebas at kde.org>
> wrote:
> > Hi all,
> > 
> > One of the things we talked about during Akademy was better support
> > for convertible hardware. I've played around a bit with my Thinkpad
> > X1 Yoga and screen rotation. I can of course rotate the screen
> > "manually" through XRandR (Wayland is another story altogether),
> > but that will screw up the touchscreen. Doing some
> > research, I found out that:
> > 
> > - touchscreen and display are completely separate things, the system
> > 
> >   doesn't know they're sitting on top of each other
> > 
> > - They need to be rotated separately
> > - rotation in X goes through XInput2, on Wayland, I don't know
> 
> On Wayland, touch is libinput->kwin->app (protocol is wl_touch).
> 
> But AFAIK actual screen rotation itself needs to be added to kwin
> wayland, before we can be looking at fixing touch events.
> 
> I'm not 100% sure the input needs transforming when rotated. If you
> look at QtWaylandWindow when the screen rotates, the surface rotates
> too, so if co-ordinates are surface local we shouldn't be changing it?
> It's quite confusing.
> 
> > I'm working on some code that does the rotation from KScreen right
> > now. My idea is to add this to KScreen's API and allow the KScreen
> > user to also rotate the display.
> > 
> > On X, this happens asynchronously, which bears the risk that input
> > ends up on different coordinates during switching (display has
> > already rotated, touchscreen still in process, or some race
> > condition like that -- I don't think we can really prevent that).
> > On Wayland, we should be able to prevent that from happening, as we
> > hopefully can make the input rotation atomic along with the screen
> > rotation itself; the protocol and API of screen management on
> > Wayland allow that.
> > 
> > We probably will need to guess which display has a touchscreen
> > attached, but for some cases, I think we can make some pretty
> > reasonable guesses:
> > - one display, one touchscreen seems like the most common case, and
> >   is clear
> > - one touchscreen and multiple displays: the touchscreen may be on
> >   the internal display
> > The mapping heuristic is kept internally, and we can work on that
> > later, once we've got the basics in place.
> > 
> > Architecture / API design: My idea right now is to add a new,
> > relatively simple class to the KScreen API, Touchscreen, that can
> > be retrieved from an Output, something like:
> > 
> > output->setRotation(Output::Left);
> > TouchscreenList *touchscreens = output->touchscreens();
> > for (auto touchscreen : *touchscreens) {
> >         touchscreen->setRotation(Output::Left);
> > }
> > // ... then setting the new config as usual through
> > // SetConfigOperation()
> 
> I don't understand what this helps with?

Changing both input and display at the same time. This will then go
through either the xrandr or the kwayland backend and use xinput or
wl_touch (as you note) to set the input rotation.

Also, the mapping will be in kscreen then.
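
To make the xinput part concrete: on X, rotating the touchscreen boils
down to setting the device's "Coordinate Transformation Matrix"
property. Here's a rough sketch of the mapping a backend could use;
the helper name is made up and the include may differ, so treat this
purely as illustration, not as the actual implementation:

#include <array>
#include <kscreen/output.h>

using KScreen::Output;

// Map a KScreen rotation to the row-major 3x3 matrix that the
// libinput/evdev X drivers expect in "Coordinate Transformation
// Matrix". The backend would push this to the touchscreen device via
// the XInput2 property API (what `xinput set-prop` does by hand).
std::array<float, 9> coordinateTransformationMatrix(Output::Rotation rotation)
{
    switch (rotation) {
    case Output::Left:     // 90 degrees counter-clockwise
        return { 0, -1, 1,
                 1,  0, 0,
                 0,  0, 1 };
    case Output::Right:    // 90 degrees clockwise
        return { 0,  1, 0,
                -1,  0, 1,
                 0,  0, 1 };
    case Output::Inverted: // 180 degrees
        return {-1,  0, 1,
                 0, -1, 1,
                 0,  0, 1 };
    case Output::None:
    default:
        return { 1,  0, 0,
                 0,  1, 0,
                 0,  0, 1 };
    }
}

Note that on a multi-head setup the matrix additionally has to scale
and translate into the output's region of the total framebuffer; the
sketch above only covers the single-display rotation case.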

> That will store a value, but you haven't said who's going to do
> anything with it, which is the important part.
> 
> If we're going to set it to be the same as the screen, we may as well
> just have the code that handles input read the screen rotation.

On X, that's the X server; we're expected to set it through xinput.
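
For reference, the manual equivalent (and a quick way to test a
matrix) is something like the following, with the placeholder device
name taken from `xinput list`:

xinput set-prop "<touchscreen device name>" "Coordinate Transformation Matrix" 0 -1 1 1 0 0 0 0 1

That's the matrix for a "left" rotation; the backend would do the same
programmatically through the XInput2 property API.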

On Wayland, that's different; that's why I'm thinking it would have to
be backend-specific, which brings in kscreen, which already has the
backends.
-- 
sebas

http://www.kde.org | http://vizZzion.org

