Build a snap with any version of Python I want

Is it possible to specify any version of Python that I want, when building with the core18 option in snapcraft.yaml?

It seems that I am forced to use Python 3.6, which is the default for Ubuntu 18.04, so I am assuming that the “18” in “core18” relates to Ubuntu 18.04. What if I want Python 3.5 or Python 3.7? Are there specific entries I can add to snapcraft.yaml to make this happen? Or are we stuck with Python 3.6? Until when?

I searched the forums and there are some people talking about a Python 3.7 snap, which hasn’t eventuated. What’s the difference between stage-packages, snaps, and plugins anyway?

And there is another thread talking about using Python 3.6 when the default is Python 3.5, though it has no clear answer, and it discusses building via build.snapcraft.io, which presumably is an online service that adds further constraints. I’m just trying to build on my Ubuntu 18.04 machine.

Yes, Ubuntu 18.04 to be exact.

You can always build one from source by adding a new part definition. There are also other ways to incorporate a Python distribution in your snap (PPA, stage-snaps), but this option is always available.

snapcraft help python tells you what you need to do to use a different Python for a part.
Briefly, if you build Python in a part and then list that part’s name in the after clause of the part that uses the python plugin, the plugin will prefer that Python installation.

Here’s a somewhat complex snap doing something like that: the part named git-ubuntu (https://git.launchpad.net/usd-importer/tree/snap/snapcraft.yaml#n358) has python3 in its after clause, and the python3 part itself is defined at https://git.launchpad.net/usd-importer/tree/snap/snapcraft.yaml#n205
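
Distilled down, the pattern looks roughly like this (a minimal, untested sketch; the part names and Python version are placeholders, not taken from that snap):

parts:
  python3:
    # build the interpreter you want from source
    source: https://www.python.org/ftp/python/3.7.4/Python-3.7.4.tar.xz
    plugin: autotools
  my-app:
    # listing python3 in after makes the python plugin prefer that interpreter
    after: [python3]
    plugin: python
    source: .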

The docs mention SNAPCRAFT_PYTHON_INTERPRETER and SNAPCRAFT_PYTHON_VENV_ARGS:

$ snapcraft version
snapcraft, version 4.1.1

...

This plugin also interprets these specific build-environment entries:

    - SNAPCRAFT_PYTHON_INTERPRETER
      (default: python3)
      The interpreter binary to search for in PATH.

    - SNAPCRAFT_PYTHON_VENV_ARGS
      Additional arguments for venv.

By default this plugin uses python from the base, if a part using
this plugin uses a build-base other than that of the base, or a
different interpreter is desired, it must be bundled in the snap
(including venv) and must be in PATH.
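
Setting these would presumably look something like this in a part (a hypothetical, untested sketch; the part name and Python version are placeholders):

parts:
  my-app:
    plugin: python
    source: .
    build-environment:
      - SNAPCRAFT_PYTHON_INTERPRETER: python3.8
      - SNAPCRAFT_PYTHON_VENV_ARGS: --system-site-packages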

On GitHub I could not find any usage of SNAPCRAFT_PYTHON_INTERPRETER other than in snapcraft itself and its tests. If all parts are to use the same non-default Python 2/Python 3 version, can I set that version through this build-environment entry?

According to the release notes of the Snapcraft v4 python plugin, "The plugin can use an interpreter if it is added through a comprehensive list of stage-packages (an extension shall be evaluated in the future to provide alternative complete python stacks)." That seems to be exactly what has been done in the code. Is that right?

Does anyone have a concise example of a Python 3.x (x != 6) build that they can point me to? I’m not a snap n00b per se, but I feel like trying to follow the above would set me on a course of trial and error. It would be great to just have a “recipe”.

I came across How do I correctly install a custom python version and a local wheel package?, which does provide a maybe-working Python 3.7 example, though one of the posts in that thread indicates that it wouldn’t work on arm (which I need it to).

Based on other threads around here (e.g. Proposal: Extensive Python version support for Python plugin), I’m not the only one looking for a way to do this. Until minor version pinning is implemented in snapcraft, it would be great to have a fool-proof example.

I want to share my experience with this exact problem that you’re facing: in short, there’s no ‘built-in’ easy way that I came across that allows you to use whatever Python version you want. What we settled on for our snaps was to actually build the Python interpreter that we use ourselves (as a part) during the snap build process, and this way we have full control over what version of Python we ship.

This is what we’ve settled on:

parts:
  python38:
    source: https://www.python.org/ftp/python/3.8.3/Python-3.8.3.tar.xz
    source-type: tar
    source-checksum: md5/3000cf50aaa413052aef82fd2122ca78
    plugin: autotools
    configflags:
      - --prefix=/usr
      # Enabling this will make the build times go up significantly because it
      # turns on link time optimizations and profile guided optimizations.
      #
      # For PGO, python is compiled twice. Once to collect profiling data (by
      # running all UTs) and once to create an optimized build based on
      # that data.
      #
      # Unfortunately, one of the unit tests, test_socket, hangs for some
      # reason and it seems to be a known issue. For now just disable
      # optimizations as a workaround.
      #
      # - --enable-optimizations
    build-packages:
      # not needed: tk-dev
      - libbz2-dev
      - libexpat1-dev
      - libffi-dev
      - libgdbm-dev
      - liblzma-dev
      - libncurses5-dev
      - libreadline-dev
      - libsqlite3-dev
      - libssl-dev
      - libzip-dev
      - uuid-dev
    stage-packages:
      # not needed: tk8.6
      - libbz2-1.0
      - libexpat1
      - libffi6
      - libgdbm3
      - liblzma5
      - libncurses5
      - libreadline6
      - libsqlite3-0
      - libssl1.0.0
      - libzip4
      - uuid-runtime
    override-stage: |
      # We want the latest pip to be able to install pyproject.toml based projects
      PYTHONUSERBASE="$SNAPCRAFT_PART_INSTALL/usr" python3.8 -m pip install --user --upgrade pip wheel

      # Apply the same shebang rewrite as done by snapcraft
      find $SNAPCRAFT_PART_INSTALL/usr/bin/ -maxdepth 1 -mindepth 1 -type f -executable -exec \
        sed -i                                                                                \
          "s|^#!${SNAPCRAFT_PART_INSTALL}/usr/bin/python3.8$|#!/usr/bin/env python3|" {} \;

      snapcraftctl stage
    filesets:
      exclusion:
        - -etc
        - -lib/systemd
        - -usr/bin/2to3
        - -usr/bin/2to3-3.8
        - -usr/bin/deb-systemd-helper
        - -usr/bin/deb-systemd-invoke
        - -usr/bin/easy_install-3.8
        - -usr/bin/idle3
        - -usr/bin/idle3.8
        - -usr/bin/pip3
        - -usr/bin/pip3.8
        - -usr/bin/pydoc3
        - -usr/bin/pydoc3.8
        - -usr/bin/python3.8-config
        - -usr/bin/python3-config
        - -usr/bin/uuidgen
        - -usr/include
        - -usr/lib/*.a
        - -usr/lib/pkgconfig
        - -usr/lib/python3.8/test
        - -usr/sbin
        - -usr/share
        - -var
    prime:
      - "$exclusion"

If you’re okay with building Python like this, then you can use the recipe above and just change the source link to a different Python version and update the shebang rewrite to point to the right Python executable. Any part that is going to use Python should have the following in its recipe:

after:
  - python38

This guarantees that it will be built after the Python interpreter has been built. The major drawback is that this will significantly increase your build times (albeit this can be amortized if you cache builds).
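
Put together, a consuming part would look roughly like this (my-app and its source/requirements are hypothetical placeholders):

  my-app:
    after:
      - python38
    plugin: python
    source: .
    requirements:
      - requirements.txt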

To be honest, this whole process is kind of a mess and required me to dig around in the snapcraft implementation to figure it out. And if you need to do builds of your Python application for other CPU architectures, then the best thing you can do to not waste your time is to get dedicated build hardware (I haven’t kept up with snap updates, so I’m not sure if the Python multi-arch build experience has improved since I last checked things out last year).

I tried the python38 part above. Because both libreadline6 and libgdbm3 could not be found under core18, these replacement entries seemed to work:

- libgdbm5
- libreadline7

In the end I got this error:

Failed to stage: Parts 'python38' and 'MYAPP' have the following files, but with different contents:
    usr/bin/pydoc3
    usr/bin/python3
    usr/lib/python3.8/distutils/_msvccompiler.py
    usr/lib/python3.8/distutils/command/bdist_wininst.py
    usr/lib/python3.8/distutils/command/build_ext.py
    usr/lib/python3.8/distutils/command/install.py
    usr/lib/python3.8/distutils/command/install_egg_info.py
    usr/lib/python3.8/distutils/command/install_lib.py
    usr/lib/python3.8/distutils/dir_util.py
    usr/lib/python3.8/distutils/sysconfig.py
    usr/lib/python3.8/distutils/unixccompiler.py
    usr/lib/python3.8/lib2to3/Grammar.txt
    usr/lib/python3.8/lib2to3/fixes/fix_apply.py
    usr/lib/python3.8/lib2to3/fixes/fix_filter.py
    usr/lib/python3.8/lib2to3/fixes/fix_intern.py
    usr/lib/python3.8/lib2to3/fixes/fix_reload.py
    usr/lib/python3.8/lib2to3/pgen2/driver.py
    usr/lib/python3.8/lib2to3/pgen2/grammar.py
    usr/lib/python3.8/lib2to3/pgen2/token.py
    usr/lib/python3.8/lib2to3/pgen2/tokenize.py

Snapcraft offers some capabilities to solve this by use of the following keywords:
    - `filesets`
    - `stage`
    - `snap`
    - `organize`
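
One way that might resolve this (an untested sketch based on the keywords above) would be to drop the conflicting files from the application part with stage exclusions, e.g.:

  MYAPP:
    stage:
      - -usr/bin/pydoc3
      - -usr/bin/python3
      - -usr/lib/python3.8/distutils
      - -usr/lib/python3.8/lib2to3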

Any thoughts?

I got python3.9 working on my core20-based snap. There is a bug somewhere which results in ensurepip failing because it is not able to find distutils. To “fix” that, I had to copy the distutils directory from the system:

  alarm:
    plugin: python
    requirements:
      - requirements.txt
    source: .
    stage-packages:
      - python3.9-venv
    build-packages:
      - python3.9-venv
    build-environment:
      - SNAPCRAFT_PYTHON_INTERPRETER: python3.9
    override-build: |
      rm -r $SNAPCRAFT_PART_INSTALL/usr/lib/python3.9/distutils
      cp -r /usr/lib/python3.9/distutils/ $SNAPCRAFT_PART_INSTALL/usr/lib/python3.9/
      snapcraftctl build

I wonder if that is a bug in Snapcraft?

cc @sergiusens ^

I updated the part because pyconfig.h was not found, so I created a symlink from the system include directory into the part:

  alarm:
    plugin: python
    requirements:
      - requirements.txt
    source: .
    stage-packages:
      - python3.9-venv
    build-packages:
      - python3.9-venv
      - python3.9-dev
    build-environment:
      - SNAPCRAFT_PYTHON_INTERPRETER: python3.9
    override-build: |
      # Workaround a bug in snapcraft python plugin
      # https://forum.snapcraft.io/t/build-a-snap-with-any-version-of-python-i-want/10420/8
      rm -rf $SNAPCRAFT_PART_INSTALL/usr/lib/python3.9/distutils
      ln -s /usr/lib/python3.9/distutils $SNAPCRAFT_PART_INSTALL/usr/lib/python3.9/distutils
      mkdir -p $SNAPCRAFT_PART_INSTALL/usr/include/
      ln -s /usr/include/python3.9 $SNAPCRAFT_PART_INSTALL/usr/include/python3.9
      snapcraftctl build

This might be related to the issue we have when using the newer gnome extensions which also bundle python and the python plugin itself.

We should solve this @kenvandine, @om26er has provided a nice hint :slight_smile:

I tried the above snapcraft fragment and it worked for installing Python 3.9 (and even a Python 3.8 venv) under core20. The snap’s launch command needs to be command: bin/python src/somefile.py, since bin in the root directory of the snap is where the resulting Python virtual environment’s python executables seem to be installed.
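
In apps terms that would be something like this (the app name and script path here are hypothetical):

apps:
  myapp:
    command: bin/python src/somefile.py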

I had to add the following lines to the override-build: step

apt-get install --yes curl
curl -sS https://bootstrap.pypa.io/get-pip.py | sudo python3

to get pip installed, since it didn’t seem to be there and I had a requirements: step. P.S. Adding curl to build-packages: or stage-packages: didn’t seem to work.
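
In context, those lines sit inside the part’s override-build, roughly like this (untested sketch; the rest of the override-build is the workaround from the earlier post):

    override-build: |
      # get pip into the build environment so requirements: can be processed
      apt-get install --yes curl
      curl -sS https://bootstrap.pypa.io/get-pip.py | sudo python3
      # ... distutils / include workaround from the earlier post goes here ...
      snapcraftctl build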

What about core18?

It would be great to get this solution to also work under core18.
Under core18 it’s a failure for python3.9-venv due to:

Could not find a required package in 'build-packages': python3.9-venv

even though python3.9-venv is listed in build-packages: as per the solution above.

Slightly better luck with core18 + python3.8-venv, as the snap built OK, but the root of the resulting snap doesn’t contain the bin/python venv directory, thus snap commands get resolved to the nearest thing, usr/bin/python3, which turns out to be the old Python 3.6 installation - thus a fail. My attempt log here.

Core18 is based on Ubuntu 18.04, which doesn’t have Python 3.9 in the archives. One option could be to use the deadsnakes PPA – at one stage I was using that PPA in my snaps as well.

The latest snapcraft has made adding a PPA very easy now: https://snapcraft.io/docs/package-repositories
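
Something along these lines (a sketch using the deadsnakes PPA mentioned above; the same stanza appears later in this thread):

package-repositories:
  - type: apt
    ppa: deadsnakes/ppa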

Thanks for the PPA tip, which worked for installing Python 3.9 on core18.

I also finally got Python 3.8 working on core18, once I figured out that the command to invoke is command: python3.8 src/main.py, which resolves to usr/bin/python3.8, and not command: python3 src/main.py, which resolves to usr/bin/python3, i.e. the old Python 3.6.

Note: It turns out that using the python3.*-venv approach under core18 and core20 results in different locations for the resulting Python executables, and thus in different command: invocations you need to use. I’ve gone into detail below, but from these results it seems that the command invocation you can safely use in all cases (core18, core20) is usr/bin/python3.9.

Details on the different locations of the resulting Python executables:

Assuming your snapcraft file contains the part from @om26er above, here are the locations of the installed Pythons:

Note all these paths are relative to ‘prime’ which ends up as the root of the deployed snap (see my article).

  • Under core18 the Python you want is in usr/bin/python3.9 not usr/bin/python3 (which is the old Python 3.6). Also, bin/python* files do not exist (like they do in core20). So you should e.g.
    • command: usr/bin/python3.9 -V
    • command: python3.9 -V
    • command: python3 -V No!
  • Under core20 the Python you want is in bin/python3 or bin/python or bin/python3.9. You can also reference usr/bin/python3.9. The old default Python doesn’t seem to be anywhere I can find in the resulting snap, which is good - less confusion. So you should e.g.
    • command: usr/bin/python3.9 -V
    • command: bin/python3.9 -V
    • command: bin/python3 -V
    • command: python3.9 -V No! Snapcraft can’t auto-resolve this Python to anything

The example command: invocations above pass the parameter -V just to print the Python version number; normally you would pass the path to your Python script.

From these results, it seems that the command invocation you can safely use in all cases (core18, core20) is usr/bin/python3.9, for example:

command: usr/bin/python3.9 src/main.py

P.S. I’m not sure how this Python location information relates to the SNAPCRAFT_PYTHON_INTERPRETER setting, nor how it relates to the Python plugin’s python-version: setting (now defunct as of core20).

What about pip-installed packages?

The python3.*-venv approach, whilst successfully installing a Python of our choice, unfortunately doesn’t seem to make my Python packages available under core20, given that I have the usual

requirements:
    - /root/project/requirements.txt

section in my snapcraft.yaml file. Interestingly, with the same snapcraft file (adjusted for core18/core20) Python does have access to packages in the resulting snap under core18, just not under core20.

I haven’t really worked on the Snapcraft source before, so I might need a few pointers. Can I help fix that issue?

We have a need to make the Python plugin rock solid and I would like to help where I can.

This works great, except that ln -s should probably be replaced with cp -r; otherwise the snap will fail the automatic review (snap-review) when uploading a release, due to having external symlinks. Cheers.
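
That is, roughly (an untested sketch based on the part above, with the symlinks replaced by copies):

    override-build: |
      rm -rf $SNAPCRAFT_PART_INSTALL/usr/lib/python3.9/distutils
      cp -r /usr/lib/python3.9/distutils $SNAPCRAFT_PART_INSTALL/usr/lib/python3.9/
      mkdir -p $SNAPCRAFT_PART_INSTALL/usr/include/
      cp -r /usr/include/python3.9 $SNAPCRAFT_PART_INSTALL/usr/include/python3.9
      snapcraftctl build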

I had success just building from source:

parts:
  python:
    plugin: autotools
    source: https://www.python.org/ftp/python/3.9.8/Python-3.9.8.tgz
    autotools-configure-parameters:
      - --enable-optimizations
    build-packages:
      - build-essential
      - gdb 
      - lcov
      - pkg-config
      - libbz2-dev 
      - libffi-dev 
      - libgdbm-dev 
      - libgdbm-compat-dev 
      - liblzma-dev
      - libncurses5-dev
      - libreadline6-dev
      - libsqlite3-dev
      - libssl-dev
      - lzma
      - lzma-dev 
      - tk-dev 
      - uuid-dev 
      - zlib1g-dev
    override-stage: |
      snapcraftctl stage
      [ ! -d "${SNAPCRAFT_STAGE}/bin" ] && mkdir ${SNAPCRAFT_STAGE}/bin
      ln -s ../usr/local/bin/python3.9 "${SNAPCRAFT_STAGE}/bin/python3.9"
      ln -s ../local/bin/python3.9 "${SNAPCRAFT_STAGE}/usr/bin/python3"
    stage-packages:
      - libfontconfig1
      - libfreetype6
      - libgdbm-compat4
      - libgdbm6
      - libpng16-16
      - libtcl8.6
      - libtk8.6
      - libx11-6
      - libxau6
      - libxcb1
      - libxdmcp6
      - libxext6
      - libxft2
      - libxrender1
      - libxss1

  homeassistant:
    after: [python]
    plugin: python
    source: https://github.com/home-assistant/core.git
    source-tag: ${SNAPCRAFT_PROJECT_VERSION}
    build-environment:
      - SNAPCRAFT_PYTHON_INTERPRETER: python3.9

Using PPAs or any other APT-provided versions of Python just gave me a headache.

How about pypip or pyenv?

Here’s the rest; previously I just included the part needed to get Python available.

[...]
    python-packages:
      - setuptools<58
      - wheel
      - Cython
      - pip
    build-packages:
      - autoconf
      - build-essential
      - cmake
      - cython3
      - ffmpeg
      - libavcodec-dev
      - libavdevice-dev
      - libavfilter-dev
      - libavformat-dev
      - libavresample-dev
      - libavutil-dev
      - libcrypt-dev
      - libffi-dev
      - libglib2.0-dev
      - libglu1-mesa-dev
      - libgpiod-dev
      - libjpeg-dev
      - libopenzwave1.5-dev
      - libpcap0.8-dev
      - libswresample-dev
      - libssl-dev
      - libswscale-dev
      - libudev-dev
      - libxml2-dev
      - libxslt1-dev
      - pkg-config
      - python3-pip
      - python3.9-dev
      - zlib1g-dev
      - on armhf:
        - cargo
        - rustc
      - on ppc64el:
        - cargo
        - rustc
    stage-packages:
      - freeglut3
      - ffmpeg
      - libglu1-mesa
      - libpcap0.8
      - libpcap0.8-dev
      - libpulse0
      - libturbojpeg
      - netbase
      - tcpdump
      - zlib1g
    stage:
      - -lib/python3.9/site-packages/homeassistant/components/updater
      - -lib/python3.9/site-packages/aiogithubapi*
    requirements:
      - requirements_all.txt
    constraints:
      - homeassistant/package_constraints.txt
    override-stage: |
      snapcraftctl stage
      sed -i 's/include-system-site-packages = false/include-system-site-packages = true/g' $SNAPCRAFT_STAGE/pyvenv.cfg

A lot of these packages are Home Assistant related, but I guess what you need is:

  python-packages:
    - pip
  build-packages:
    - python3.9-dev

You probably need to add a PPA if you need a newer version, e.g.:

package-repositories:
  - type: apt
    ppa: deadsnakes/ppa

The build override part can be avoided if python3-distutils is added to the stage-packages stanza. If the project being compiled requires python3-dev, then that also needs to be added to stage-packages. The problem with the latter, however, is that it balloons the size of the snap, because it also includes the -dev package, which in reality is only required at build time and NOT at runtime.
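
As a sketch (untested; reusing the part layout from earlier in the thread, with package names taken from the note above):

  alarm:
    plugin: python
    source: .
    requirements:
      - requirements.txt
    build-environment:
      - SNAPCRAFT_PYTHON_INTERPRETER: python3.9
    build-packages:
      - python3.9-venv
      - python3.9-dev
    stage-packages:
      - python3.9-venv
      - python3-distutils  # avoids the distutils override-build workaround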

Thanks for this, makes sense. I tried it and it works, but like you said, my snap size went up by 60% (I need python3-dev in stage-packages).