Interface auto-connection request for openvino-ai-plugins-gimp

To make the review of your request easier, please use the following template to provide all the required details and also include any other information that may be relevant.


  • name: openvino-ai-plugins-gimp
  • description: OpenVINO™ AI plugins for GIMP add AI functionality to GIMP for stable diffusion, super resolution, and semantic segmentation. The plugins can run on Intel CPU, NPU, or GPU devices. This snap also provides an application for downloading models for use with the stable diffusion plugin.
  • snapcraft: openvino-ai-plugins-gimp-snap/snap/snapcraft.yaml at main · canonical/openvino-ai-plugins-gimp-snap · GitHub
  • upstream: GitHub - intel/openvino-ai-plugins-gimp: GIMP AI plugins with OpenVINO Backend
  • upstream-relation: Currently the snap is published under my personal account (wfrench) but we plan to soon create an Intel or OpenVINO publisher and migrate this snap (as well as one or two others) there.
  • interfaces:
    • <intel-npu>:
      • request-type: auto-connection
      • reasoning: Connects to the custom-device slot provided by the intel-npu-driver snap. This is required by the model-setup app in the snap to access the NPU device char node in /dev/accel/: the app tries to detect the presence of an NPU and, if one is found, compiles AI models optimized for it. This is for convenience and seems low risk, as access is limited by the custom-device slot definition.
    • <npu-libs>:
      • request-type: auto-connection
      • reasoning: Mounts the runtime libraries for the Intel NPU through a slot defined in the intel-npu-driver content producer snap. These libraries are required by the model-setup app to compile AI models to be run on the NPU.
    • <openvino-libs>:
      • request-type: auto-connection
      • reasoning: Mounts the runtime libraries for OpenVINO through a slot defined in the openvino-toolkit-2404 content producer snap. These libraries are required by the model-setup app to detect hardware and download/compile AI models based on the detected hardware.
    • <home>:
      • request-type: auto-connection
      • reasoning: The model-setup app downloads and installs AI models to a user’s home directory. The models are eventually read by the GIMP snap and the home directory is a convenient shared location that is accessible to both snaps.
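
For reference, the plugs above could be declared in snapcraft.yaml roughly as follows. This is a sketch only: the custom-device name, content targets, and default-provider values are assumptions based on the slot-providing snaps named above, not the actual published manifest.

```yaml
plugs:
  intel-npu:
    interface: custom-device
    custom-device: intel-npu        # assumed name of the slot-side device
  npu-libs:
    interface: content
    target: $SNAP/npu-libs          # assumed mount point inside the snap
    default-provider: intel-npu-driver
  openvino-libs:
    interface: content
    target: $SNAP/openvino-libs     # assumed mount point inside the snap
    default-provider: openvino-toolkit-2404
```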

home is auto-connected on classic (non-Ubuntu Core) systems - is this intended to be used on Ubuntu Core? If so, we can look at granting this but ideally if there needs to be some common, shared location, a content interface would seem to be more appropriate for this.

As for the various content interfaces provided by the intel-npu-driver and openvino-toolkit-2404 snaps, +1 from me for auto-connect of these.

As for the custom-device interface provided by the intel-npu-driver snap, +1 from me as well since again this snap is maintained by Canonical.

Hey @alexmurray !

> home is auto-connected on classic (non-Ubuntu Core) systems - is this intended to be used on Ubuntu Core? If so, we can look at granting this but ideally if there needs to be some common, shared location, a content interface would seem to be more appropriate for this.

No, this is targeting desktop usage only. I’ve seen other desktop-oriented snaps like gimp and data-science-stack use home, so I thought it was justified if it provides a better user experience.

I think there are a few issues with using the content interface to provide the models:

  • Each model is on the order of GBs in size, so bundling them all in a snap (or group of snaps) does not feel like a good use of Snap Store resources (and probably there are size restrictions in the Store that we’d exceed?)
  • The models come in a generic format, but then are compiled into device-specific blobs (e.g. optimized for NPU or GPU) by the GIMP plugins at runtime and written to the same directory where the generic models reside, which would not work if the models were to be located inside a snap
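
The write-back behavior in the second bullet can be sketched as follows. The helper name and blob layout here are hypothetical, purely to illustrate why the models directory must be writable; the real plugins use OpenVINO's own compilation and caching machinery.

```python
from pathlib import Path


def get_device_blob(model_dir: Path, model_name: str, device: str) -> Path:
    """Return a device-specific compiled blob, compiling it on first use.

    The blob is written next to the generic model, which is why the
    models directory cannot live read-only inside a snap.
    """
    blob = model_dir / f"{model_name}.{device.lower()}.blob"
    if not blob.exists():
        # Stand-in for the real compilation step (e.g. an OpenVINO
        # compile_model call followed by an export of the result).
        blob.write_bytes(b"compiled-for-" + device.encode())
    return blob
```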

Let me know what you think. I understand there are security factors at play, and I’m definitely open to suggestions!

So rather than shipping them inside the snap, I was thinking one snap would fetch and maintain them, and provide them to other snaps via its $SNAP_DATA or $SNAP_COMMON paths; this would then be exposed as a content interface which other snaps can plug to gain access to the models. In that case, though, the model-setup part would have to run as root so that it can write to these paths.

Otherwise, can you explain where in HOME model-setup is currently storing the models?

> So rather than shipping them inside the snap, I was thinking one snap would fetch and maintain them, and provide them to other snaps via its $SNAP_DATA or $SNAP_COMMON paths; this would then be exposed as a content interface which other snaps can plug to gain access to the models. In that case, though, the model-setup part would have to run as root so that it can write to these paths.

Gotcha. I’m not sure the OpenVINO libraries that get used by the model-setup part are intended or designed to be safely run as root, and I’m hesitant to depart from how the app is used upstream.

I did consider implementing the model-setup part and app in the gimp snap directly (it already has access to home and could write to $SNAP_DATA or $SNAP_COMMON), but I really wanted to minimize direct changes to the gimp snap itself, as I think that would introduce a fairly large maintenance burden for the maintainers.

> Otherwise, can you explain where in HOME model-setup is currently storing the models?

At the moment model-setup will by default install models in a user’s home directory at ~/openvino-ai-plugins-gimp. However, the user can adjust the path by setting the GIMP_OPENVINO_MODELS_PATH shell variable to a different non-hidden location in their home directory.
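
The path resolution described above can be sketched like this (a hypothetical helper; the actual plugin code may differ):

```python
import os
from pathlib import Path


def resolve_models_path() -> Path:
    """Return the directory where model-setup installs models.

    Defaults to ~/openvino-ai-plugins-gimp, overridable via the
    GIMP_OPENVINO_MODELS_PATH environment variable.
    """
    override = os.environ.get("GIMP_OPENVINO_MODELS_PATH")
    if override:
        return Path(override).expanduser()
    return Path.home() / "openvino-ai-plugins-gimp"
```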

Creating a new top-level directory in the user’s HOME feels a bit intrusive. Perhaps it would be better to use personal-files and place this in a hidden folder, either in the existing ~/.gimp-2.8 folder or perhaps ~/.local/share/openvino-ai-plugins-gimp? There is a history of users being frustrated by the ~/snap dir, so I think it might be best to avoid a similar situation here.


> Creating a new top-level directory in the user’s HOME feels a bit intrusive. Perhaps it would be better to use personal-files and place this in a hidden folder, either in the existing ~/.gimp-2.8 folder or perhaps ~/.local/share/openvino-ai-plugins-gimp? There is a history of users being frustrated by the ~/snap dir, so I think it might be best to avoid a similar situation here.

Ah, personal-files! Somehow I didn’t realize this interface existed, or maybe I just forgot. 🙂 100% agree this is a better approach. I’ll update, test, and post here when I’ve replaced home with personal-files. Thanks @alexmurray !

@alexmurray I’ve made and tested the changes (both in openvino-ai-plugins-gimp and in gimp itself) using personal-files instead of home. See this PR: Switch from home snap interface to personal-files interface by frenchwr · Pull Request #2 · canonical/openvino-ai-plugins-gimp-snap · GitHub

The openvino-ai-plugins-gimp snap is now pending manual review in the store because of the new interface. To document in this thread for posterity, I removed home from the apps section and substituted this plug:

plugs:
  ...
  dot-local-share-openvino-ai-plugins-gimp:
    interface: personal-files
    write:
      - $HOME/.local/share/openvino-ai-plugins-gimp

I added the same in the gimp snap. If this looks okay I’d like this interface to autoconnect please.

Thanks @wfrench - out of interest - can the gimp snap use just read permission for this same plug? Or does it also need write?

+1 from me for auto-connect of a personal-files named dot-local-share-openvino-ai-plugins-gimp for write access to $HOME/.local/share/openvino-ai-plugins-gimp for openvino-ai-plugins-gimp and gimp (although if we can use read for gimp and any other snaps that consume this contents that would be preferred).


Can other @reviewers take a look at this too? Thanks

> Thanks @wfrench - out of interest - can the gimp snap use just read permission for this same plug? Or does it also need write?

I attempted this initially, but the GIMP plugins (which run from within the GIMP snap) need write permissions to store versions of models optimized for accelerator devices and other metadata.

Given the discussion above, this request fits the functionality and purpose of the snap. +1 from me as well for granting auto-connect of:

  • personal-files (write access of dot-local-share-openvino-ai-plugins-gimp)
  • custom-device (intel-npu)
  • content (npu-libs & openvino-libs)

+2 votes for, 0 votes against: granting auto-connect of dot-local-share-openvino-ai-plugins-gimp (personal-files), intel-npu (custom-device), and the npu-libs and openvino-libs content interfaces to snap openvino-ai-plugins-gimp. Publisher is vetted. This is now live.

Please verify and let us know if this works as expected! Thanks!

Works a treat! Thanks for the review.