I’m working on snapping an internal Python tool that has a couple of requirements. Specifically, it needs Python 3.7 and, depending on the architecture I’m building on, I need to install a local wheel that I built ahead of time.
I seem to have figured out how to install Python 3.7 correctly as a part, but I’m having some difficulties installing the wheels. Specifically, they go through the install flow, but for reasons I don’t understand, their contents don’t end up in the resulting .snap file.
The reason I need to install the wheels like this is that the package on PyPI ships a precompiled binary that links against a different version of boost than what’s present in the base (on arm builds). To work around it, I rebuilt the wheels locally and am overriding the dependency that would otherwise be pulled from PyPI.
It’s not immediately clear from the available documentation that, for this to work, you can’t install packages the way you normally would outside of snapcraft (i.e. a simple `pip install <path to package>`).
By studying the source code of the python plugin I came across these lines, from which I learned that you have to do something along the lines of: `PYTHONUSERBASE=$SNAPCRAFT_STAGE python3.7 -m pip install --user -I $WHEEL_FILE`.
So basically, to solve my issue I updated the `my-app` part as follows:
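(For context, here is a minimal sketch of what such a part could look like. The part name, the `python37` dependency, and the `WHEEL_FILE` placeholder are assumptions from my description above, not something to copy verbatim:)

```yaml
# Sketch only: part names and the wheel path are illustrative placeholders.
my-app:
  after: [python37]
  plugin: nil
  source: .
  override-build: |
    snapcraftctl build
    # Install the locally built wheel into the stage area so that its
    # contents actually end up in the resulting .snap file.
    PYTHONUSERBASE=$SNAPCRAFT_STAGE python3.7 -m pip install --user -I "$WHEEL_FILE"
```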
Hi @zoopp
Since I see arm there, are you trying to cross-compile?
Looking at your snapcraft.yaml, let’s first focus on the `python37` part. When I tested this, it did not cross-compile; it seems this is broken at the moment: https://stackoverflow.com/questions/56178633/trouble-cross-compiling-python
Do you really need Python 3.7? Would 3.5 do, or, if you use `base: core18`, would 3.6 do?
If you can work with Python 3.5/3.6, then check my modifications to your snapcraft.yaml:
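(A minimal sketch of the kind of modification meant here, assuming `base: core18`, whose archive `python3` is 3.6; the part name and source are placeholders:)

```yaml
# Sketch only: relies on the base's python3 (3.6 on core18) instead of
# building Python 3.7 from source, so no separate python37 part is needed.
base: core18
parts:
  my-app:
    plugin: python
    python-version: python3
    source: .
```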