I have a Python project hosted on GitHub which builds a working pip package.
I have set up Snapcraft to autobuild from my GitHub repository.
Snapcraft succeeds in building a snap file which can be installed, but running it fails with:
Traceback (most recent call last):
  File "/snap/ocrmypdfgui/9/bin/ocrmypdfgui", line 5, in <module>
    from ocrmypdfgui.main import main
ModuleNotFoundError: No module named 'ocrmypdfgui'
My complete project including snapcraft.yaml can be found here:
I would very much appreciate help understanding why I am getting a ModuleNotFoundError when my pip package works. Google searching has unfortunately not helped me with this issue so far. What am I missing?
Python has a concept of a search path specifically for modules, and you have two choices.
One option: in the Python code itself, import sys and then call sys.path.append("/snap/ocrmypdfgui/current/lib/python3.8/site-packages"); you could probe for the existence of the $SNAP variable if you want to avoid doing this for any other packaging formats.
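A minimal sketch of that approach, assuming the site-packages path above (the Python version directory depends on your base snap, so treat "python3.8" as a placeholder):

```python
import os
import sys


def extend_sys_path_for_snap(version="python3.8"):
    """Append the snap's site-packages directory to sys.path, but only
    when running inside the snap (snapd sets $SNAP at runtime, so its
    presence distinguishes the snap from other packaging formats)."""
    snap_root = os.environ.get("SNAP")
    if not snap_root:
        return None  # not running as a snap: leave sys.path alone
    site_packages = os.path.join(snap_root, "lib", version, "site-packages")
    if site_packages not in sys.path:
        sys.path.append(site_packages)
    return site_packages
```

Calling this at the very top of the entry-point script, before the `from ocrmypdfgui.main import main` line, would let that import resolve.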
The problem comes down to Python's ctypes.util.find_library() having a dependency on /sbin/ldconfig: when the host has liblept installed, the snap works; when the host doesn't, the snap fails. Even if the host has a copy of the library, the snap would still be using its own version bundled internally.
The easiest fix would be to change the one line that even bothers looking for the library; because this is a snap, we know the library will always be where we put it. Keep the $LD_LIBRARY_PATH change, and under the ocrmypdfgui part, add the following:
sed -i 's/find_library(libname)/"liblept.so.5"/g' $SNAPCRAFT_PART_INSTALL/lib/python3.6/site-packages/ocrmypdf/leptonica.py
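For context, that sed would run in the part's override-build step in snapcraft.yaml. A hedged sketch (the part name, plugin, Python version directory, and soname are taken from this thread; your base and plugin may differ, and newer snapcraft uses `craftctl default` instead of `snapcraftctl build`):

```yaml
parts:
  ocrmypdfgui:
    plugin: python
    source: .
    override-build: |
      snapcraftctl build
      # Hardcode the soname so ctypes never shells out to ldconfig/gcc/ld.
      sed -i 's/find_library(libname)/"liblept.so.5"/g' \
        $SNAPCRAFT_PART_INSTALL/lib/python3.6/site-packages/ocrmypdf/leptonica.py
```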
It's also worth mentioning that you should probably change,
So that it builds correctly regardless of absolute paths, which aren't guaranteed to be consistent. You can then drop the following part entirely, since I presume it's only there to have gotten the .desktop file somewhere you expected it.
This can be fixed by adding binutils to the base snap or as a stage package in the app snap. I think we should petition to include this deb package in all the core* snaps as a common requirement for many snaps, especially Python snaps that need cpython library loading via pip packages.
I haven’t tested this, but I surmised the above due to the following code in the linked file:
# See issue #9998
return _findSoname_ldconfig(name) or \
_get_soname(_findLib_gcc(name)) or _get_soname(_findLib_ld(name))
Both _findSoname_ldconfig and _findLib_gcc fail and fall through to _findLib_ld. The configured LD_LIBRARY_PATH is only inspected in this last function (_findLib_ld), which requires ld to be accessible on the PATH. The ld executable is included in the binutils package. Therefore we can get Python to respect LD_LIBRARY_PATH when loading shared objects within cpython by making ld available, i.e. installing binutils either in the base snaps (core, core18, core20, core22, et al.) or into every snap that uses Python with cpython libraries.
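To confirm which of those helpers are actually visible from inside the snap's mount namespace, a quick probe along these lines might help (this is a diagnostic sketch, not part of the fix):

```python
import os
import shutil


def probe_find_library_helpers():
    """Report availability of the tools ctypes.util.find_library tries,
    in order: /sbin/ldconfig, then gcc, then ld. Only the ld fallback
    (_findLib_ld) consults LD_LIBRARY_PATH, and it needs the `ld`
    binary shipped by binutils to be on PATH."""
    return {
        "ldconfig": os.path.exists("/sbin/ldconfig"),
        "gcc": shutil.which("gcc") is not None,
        "ld": shutil.which("ld") is not None,  # provided by binutils
    }


for tool, present in probe_find_library_helpers().items():
    print(f"{tool}: {'found' if present else 'missing'}")
```

Running this inside `snap run --shell ocrmypdfgui` versus on the host should show exactly which fallback path find_library ends up taking in each environment.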
If it’s the same problem as OP, I doubt the answer has changed.
I know there were talks about pre-loading the library caches for snaps, but I don’t think they’ve gotten anywhere.
I'd still have to imagine the way the Python VM was doing it back then is the same way it's doing it now. I.e., in this thread it was essentially looking at the host's cache and identifying the libraries the host had, rather than what the snap's namespace has.
Aside from changing Python / the app itself, I'd have to guess the only other solution could be to try replacing /sbin/ldconfig via a layout with a wrapper that simply always gives a positive response when asked about a specific library, but this could break other things subtly if they were to call the wrapper and not get the expected response.
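As an entirely hypothetical sketch of that wrapper idea (the soname and path are just the ones from this thread; a real snap layout would bind this script over /sbin/ldconfig):

```shell
# Hypothetical stand-in for /sbin/ldconfig, mounted via a snap layout.
# For `ldconfig -p` it prints a fixed cache entry so that
# ctypes.util.find_library("lept") gets a positive answer; every other
# invocation is a silent no-op, which is exactly where the subtle
# breakage mentioned above could come from.
fake_ldconfig() {
    if [ "$1" = "-p" ]; then
        printf '\tliblept.so.5 (libc6,x86-64) => %s/usr/lib/liblept.so.5\n' \
            "${SNAP:-/snap/ocrmypdfgui/current}"
    fi
    return 0
}

fake_ldconfig -p
```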
Fundamentally, the problem is that Python ends up treating the cache as a source of truth for what's installed, not just literally a cache, meaning stuff that's still valid and would work gets pre-emptively prevented from even trying.