Python-gi and Python 2 in a snap of Pick

I’ve built a desktop colour picker called Pick (https://kryogenix.org/code/pick/). It’s Python 2 (not 3) and Gtk 3. A helpful person has attempted to package it as a snap in this github pull request. However, they’re having some difficulty working out how to handle the python-gi dependency, and I don’t know how to help. The snap build process seems to want to pull Python’s GI bindings from PyPI, which isn’t working; alternatively, maybe they should be pulled from an Ubuntu package? Or maybe this is part of the platform? I don’t know any of this stuff and it seems very confusing. Could someone here help, or comment on that pull request about how best to do this?

Hi! I’m the one that made the snap :wink:
I think I figured out a possible issue related to this one.
If I install Pick locally, the PyGObject dependency in setup.py doesn’t actually download anything from PyPI; it just checks whether there’s a version installed locally (which might be more distro-independent).
It says:

Using /usr/lib/python2.7/dist-packages
Searching for pygobject==3.22.0
Best match: pygobject 3.22.0

In the snap, it only looks in the folder “parts/pick-colour-picker/packages” instead of the whole system, so it doesn’t find the dependency.

Looking at the snapcraft.yaml file, you’ve got python-gi listed under stage-packages, which just causes the contents of that package to be copied into the staging area for inclusion in the snap. Your setup.py won’t know to look there, so it is trying to download and build the dependency itself.

Try adding python-gi to build-packages too and see if that helps. That will cause Snapcraft to ensure the package is installed on the build machine.
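For concreteness, the part definition might then look roughly like this. The part name is taken from the paths quoted earlier in the thread; the other fields (plugin options, source) are assumptions for illustration, not the project’s actual snapcraft.yaml:

```yaml
parts:
  pick-colour-picker:
    plugin: python
    python-version: python2   # Pick targets Python 2 (assumed plugin option)
    source: .
    build-packages:
      - python-gi             # installed on the build machine, visible at build time
    stage-packages:
      - python-gi             # copied into the snap for run time
```

build-packages and stage-packages serve different steps, which is why the suggestion is to list the package in both.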

I tried earlier today, but it still won’t find it.
The package is still not in the /parts/pick-colour-picker/packages folder.

Not sure I understand that; isn’t snapcraft meant to be usable on, y’know, Windows and suchlike? Does it know about the Ubuntu archives even if it’s not running on Ubuntu? (Or perhaps I’ve misunderstood and you need to be on Ubuntu to build packages?)

I’m not sure what the story is for building on other distributions/operating systems. But the primary way snapd lets you use binary packages across distribution releases and distributions is by using mount namespaces, so the snapped application sees /snap/core/current as the root file system.

That core file system is built from Ubuntu 16.04, so it isn’t too surprising that you might need Ubuntu packages to build applications to run on top of it. This will probably change a bit with the introduction of base snaps, but that would just involve swapping out one set of dependencies and build OS for another.

Getting back to the problem at hand: if snapcraft is using the system Python interpreter to build the package, then you either need (a) python-gi listed in build-packages, or (b) everything needed to build python-gi from PyPI listed in build-packages.

your snaps get executed inside the core snap, which is an ubuntu “rootfs”. to keep libc, the linker et al. compatible, snapcraft does indeed use the ubuntu archive for build and runtime packages when assembling a snap. even on other distros it would have to do that in the current setup. a build on windows or macos would have to use a minimal ubuntu container to build …

there is an approach to overcome this (at least for other linux distros) so you can also use fedora or suse rpm archives etc, discussed at Introducing base snaps

@ogra
going back to the original issue then.
I’m using the python plugin.
I added python-gi as a dependency in build-packages, so pip could find it at build time.
But when I run snapcraft, it can’t find it: instead of looking for the library in the system folders, as it would in a normal distro install, it only looks in parts/pick-colour-picker/packages.
To solve this, I think the python plugin should also look in, for example:
parts/pick-colour-picker/install/usr/lib/python2.7/dist-packages
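Separately from the build-time lookup, a possible run-time workaround (an assumption on my part, not something verified against this snap) is to point the interpreter at the staged dist-packages directory via the app stanza. The app name and command here are placeholders; if your snapcraft version doesn’t support per-app environment, a small wrapper script exporting PYTHONPATH achieves the same thing:

```yaml
apps:
  pick:
    command: usr/bin/pick   # placeholder command path
    environment:
      PYTHONPATH: $SNAP/usr/lib/python2.7/dist-packages
```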

@ogra
I also tried using the “prepare” scriptlet to edit the setup.py file and remove that dependency in the snap, since stage-packages already covers it.
But the prepare scriptlet apparently runs after python setup.py build. This contradicts what is written on the scriptlet page:

“The prepare scriptlet is triggered between the pull and the build step of a plugin. It allows you to run commands on the source of a part before building it.”

In this case “prepare” could solve the issue (for now) and allow the script to be modified.
I still think that pip should look through the whole snap for the dependencies, preserving the normal behaviour.
Both of these seem like issues that could be solved fairly easily.
If they need to be reported somewhere else, let me know!
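For reference, assuming the ordering gets fixed so that prepare really does run before the build step, a scriptlet along these lines could strip the requirement before pip ever sees it. The sed pattern is a guess and would need adjusting to the real setup.py:

```yaml
parts:
  pick-colour-picker:
    plugin: python
    source: .
    prepare: |
      # strip the PyGObject requirement; stage-packages supplies it instead
      sed -i '/pygobject/Id' setup.py
    stage-packages:
      - python-gi
```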