Multiple touchscreen monitors with Ubuntu Core

The documentation for setting up multiple monitors with Ubuntu Core has worked well for me.

I have both monitors up and running.

My remaining issue is with the touchscreens. Both monitors are touchscreens, and touch input works on both. The problem is that the touch (X, Y) positions from the 2nd monitor are mapped onto the (X, Y) space of the 1st monitor, so touching the 2nd monitor triggers GUI events on the 1st monitor.

I can’t figure out how to translate the touchscreen inputs from the second monitor.

Below is my layout configuration:

layouts:
  default:                        # the default layout
    cards:
    - card-id: 0
      eDP-1:
        snap-name: service
        state: enabled
        # mode: 3840x2400@60.0    # Defaults to preferred mode
        position: [0, 0]
        scale: 1
        # group: 0                # Outputs with the same non-zero value are treated as a single display

      DisplayPort-4:
        snap-name: chromium
        state: enabled
        # mode: 1920x1080@60.0    # Defaults to preferred mode
        position: [3840, 0]
        scale: 1
        # group: 0                # Outputs with the same non-zero value are treated as a single display
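To make the symptom concrete, here is a rough sketch of the mapping I would expect versus what I actually see, assuming the preferred modes are the commented-out 3840x2400 and 1920x1080 and using a simplified model of the coordinate handling (the resolutions and the model are my assumptions, not anything taken from Mir):

# Rough illustration only: assumed resolutions and a simplified model of how
# a normalized touch position ends up in the shared coordinate space.

EDP1_SIZE = (3840, 2400)   # eDP-1 at position [0, 0] (assumed preferred mode)
DP4_SIZE = (1920, 1080)    # DisplayPort-4 (assumed preferred mode)
DP4_POSITION = (3840, 0)   # DisplayPort-4's position in the layout above

def expected_global(u, v):
    """A touch at normalized (u, v) on the DisplayPort-4 touchscreen
    should land inside DisplayPort-4's region of the layout."""
    return (DP4_POSITION[0] + u * DP4_SIZE[0], DP4_POSITION[1] + v * DP4_SIZE[1])

def observed_global(u, v):
    """What actually seems to happen: the same touch is scaled across
    eDP-1's region instead, so it triggers GUI events on the 1st monitor."""
    return (u * EDP1_SIZE[0], v * EDP1_SIZE[1])

print(expected_global(0.5, 0.5))   # (4800.0, 540.0)  -> centre of the 2nd monitor
print(observed_global(0.5, 0.5))   # (1920.0, 1200.0) -> centre of the 1st monitor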

Unfortunately, that is a limitation of Mir that has yet to be addressed. Here’s the issue on GitHub:

@wduncan hey, separately from what Alan pointed at, what is the experience you’re looking for? Is it just one user operating it, or potentially two? Is there an on-screen keyboard involved?

We are looking at using something like a NexDock 360 as a support/service tool.

A support tech would plug a 2nd monitor into the system to access service/support/calibration/logging capabilities through their own GUI. The 1st screen would still display our primary application, and we may want our users to do some limited interaction with that primary interface during the support session. The 2nd monitor is only for the time-boxed support visit: it would run applications that diagnose and configure the application on the 1st monitor, with real-time feedback, and it would be removed once the visit is finished. There would not be an on-screen keyboard involved; with something like a NexDock 360, the support person has their own keyboard and mouse for the service visit.

@wduncan, I’ve not tried it, so it may not work at all. But as Mir uses libinput, it may be possible to hack the /etc/udev/rules.d/ rules with a calibration matrix that shifts the touchscreen input from the NexDock:

https://wayland.freedesktop.org/libinput/doc/latest/device-configuration-via-udev.html
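For reference, a rule carrying that property looks roughly like the sketch below. The file name and the device-name match are hypothetical, and the identity matrix shown changes nothing, so treat it purely as a starting point for experimentation:

# /etc/udev/rules.d/99-nexdock-touch.rules  (hypothetical file name)
# LIBINPUT_CALIBRATION_MATRIX holds the top two rows of a 3x3 transform applied
# to the normalised [0, 1] touch coordinates: x' = a*x + b*y + c, y' = d*x + e*y + f.
# "1 0 0 0 1 0" is the identity (no change); the device name below is a placeholder
# and must be replaced with the name the NexDock touchscreen actually reports.
ACTION=="add|change", KERNEL=="event[0-9]*", ATTRS{name}=="NexDock Touchscreen", ENV{LIBINPUT_CALIBRATION_MATRIX}="1 0 0 0 1 0"

After editing the rule you would typically reload udev (sudo udevadm control --reload-rules) and re-plug the touchscreen so the new property is picked up.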

If you try this, I would be interested to hear the results.