Using system certificates in a snap that relies on Python urllib

My VPN inspects SSL traffic, so its certificate needs to be added to the system certificate store. HTTPS requests fail until the certificate is added to /etc/ssl/certs and update-ca-certificates is run.
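
For reference, a minimal sketch of the usual Ubuntu flow, assuming the VPN certificate is in a file called vpn.crt (the filename is just an example):

$ # Custom CAs go under /usr/local/share/ca-certificates with a .crt extension
$ sudo cp vpn.crt /usr/local/share/ca-certificates/vpn.crt
$ # Regenerates /etc/ssl/certs/ca-certificates.crt and the hash symlinks
$ sudo update-ca-certificates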

The first question is: what is the best way to make that update apply to all snap apps? I can mount --bind /etc/ssl/certs onto /snap/core18/current/etc/ssl/certs, but I'm not sure this is the best option.

The second question is how to make Python scripts in a snap package use that system certificate. It seems that certifi by default uses lib/python3.10/site-packages/certifi/cacert.pem rather than the system certificate store. I've tried setting REQUESTS_CA_BUNDLE='/etc/ssl/certs/ca-certificates.crt' and also tried adding python3-certifi to stage-packages: in snapcraft.yaml, and neither seems to have any effect.
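
For context, as far as I can tell plain urllib does not read REQUESTS_CA_BUNDLE (that variable is specific to the requests library), so one way to force a script to trust the host bundle is to pass an explicit SSL context. A minimal sketch, assuming the standard Ubuntu bundle path and an arbitrary test URL:

import ssl
import urllib.request

# Assumption: the host bundle lives at the standard Ubuntu path.
system_bundle = "/etc/ssl/certs/ca-certificates.crt"

# Trust the system store (including the VPN cert added by
# update-ca-certificates) instead of certifi's bundled cacert.pem.
ctx = ssl.create_default_context(cafile=system_bundle)

with urllib.request.urlopen("https://example.com", context=ctx) as resp:
    print(resp.status)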

I’ve also tried snap set system store-certs.cert1="$(cat vpn.crt)".

In short, how is a normal user supposed to handle the situation where an additional certificate needs to be trusted for standard SSL operations? Is there a simple fix here?

When a user logs in via SSH to a remote server, they can set up and store their SSH keys locally to connect easily without extra hassle.

If there is a VPN, what happens to the SSH keys? And maybe there is a missing GNOME plugin indicating that the VPN is running…

And if a paid VPN service is available, perhaps via a browser plugin rather than a snap, what manages such keys or the subscription, and how does that relate to the SSH keys available for remote access to selected servers or services?

There is a lot to consider and I am not an expert on this; maybe the Ubuntu or Debian teams can say more… but it is also a matter of legal assessment by state or country, like Wi-Fi being restricted in some countries, where the kernel developers had to do a lot of hard work in this area and are not finished yet…

$ sudo touch /etc/ssl/certs/foo
[sudo] password for ogra: 
$ snap run --shell telegram-desktop
To run a command as administrator (user "root"), use "sudo <command>".
See "man sudo_root" for details.

$ ls /etc/ssl/certs/|grep foo
foo
$ exit
$

Your snap should definitely see the added certs … (it surely does here)

Snaps run with /etc bind mounted from the host system (with a few exceptions bind mounted over the top, like /etc/nsswitch.conf). So the /etc/ssl hierarchy the snap sees will be the same as what the host system sees.

You can test this by running stat /etc/ssl/certs/ca-certificates.crt both inside and outside of the snap sandbox: compare the file size, inode number and times. You can start a shell running inside a snap sandbox by running e.g. snap run --shell firefox.
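
For example (firefox here is just whatever snap you happen to have installed; the stat format string prints size, inode and modification time):

$ stat -c '%s %i %y' /etc/ssl/certs/ca-certificates.crt   # on the host
$ snap run --shell firefox
$ stat -c '%s %i %y' /etc/ssl/certs/ca-certificates.crt   # inside the sandbox
$ exit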

The one gotcha is if you only have a symlink to the cert in /etc/ssl/certs: if it points to a location that differs inside the sandbox, you might have a problem when the app looks for the /etc/ssl/certs/$hash.0 files rather than the ca-certificates.crt bundle.
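
As an illustration, assuming the added certificate is the vpn.crt from earlier, you can see which hashed name an application would look up, and regenerate the symlinks if needed:

$ # Print the hash used for the /etc/ssl/certs/<hash>.0 symlink name
$ openssl x509 -noout -hash -in /usr/local/share/ca-certificates/vpn.crt
$ # update-ca-certificates (or "sudo openssl rehash /etc/ssl/certs")
$ # rebuilds those symlinks after certs are added or removed
$ sudo update-ca-certificates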


On which distro do you see this issue?

@jamesh You’re right, I’m seeing the host /etc/ssl/certs within the snap. The problem seems to be that Python urllib doesn’t use the system certs but a CA bundle that ships inside the snap – in this case, it’s looking at /snap/yt-dlp/330/lib/python3.10/site-packages/pip/_vendor/certifi/cacert.pem rather than /etc/ssl/certs/ca-certificates.crt. I thought setting REQUESTS_CA_BUNDLE='/etc/ssl/certs/ca-certificates.crt' would avoid that behaviour, but it has no effect.
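
In case it helps others debug the same thing, here is a small diagnostic you could run from snap run --shell (assuming certifi is importable inside the snap). It shows which bundle certifi points at versus what OpenSSL would use by default; the latter honours SSL_CERT_FILE/SSL_CERT_DIR rather than REQUESTS_CA_BUNDLE:

import ssl
import certifi

# Which bundle is certifi pointing at? Inside the snap this is often the
# vendored cacert.pem rather than /etc/ssl/certs/ca-certificates.crt.
print("certifi bundle:", certifi.where())

# OpenSSL's compiled-in defaults, plus the env vars it actually reads.
print("OpenSSL defaults:", ssl.get_default_verify_paths())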

@mborzecki1 this is Ubuntu Jammy on WSL.

Update – I added python3-certifi to stage-packages in snapcraft.yaml, rebuilt, and installed. The app now pulls from the system cert store and works fine.
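
For anyone landing here later, the relevant part of the snapcraft.yaml looks roughly like this (part name and plugin are illustrative; my understanding is that Ubuntu's python3-certifi package points certifi.where() at the system ca-certificates.crt, which is why staging it changes the behaviour):

parts:
  my-app:                 # illustrative part name
    plugin: python
    stage-packages:
      - python3-certifi   # Ubuntu-packaged certifi defers to /etc/ssl/certs/ca-certificates.crt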

Perhaps this should be a default best practice for Ubuntu snaps that use Python urllib?
