Snapcraft --(http|https)-proxy parameter is not picked up across the board

I needed snapcraft proxy support, so I used the edge channel:

$ snap list snapcraft
Name       Version                 Rev   Tracking  Publisher   Notes
snapcraft  3.9.2+git121.g3de398e6  3786  edge      canonical*  classic

Having set the LXD proxy settings (so LXD can reach https://cloud-images.ubuntu.com):

$ lxc config set core.proxy_http=127.0.0.1:8118
$ lxc config set core.proxy_https=127.0.0.1:8118

I then passed the proxy parameters to snapcraft:

$ SNAPCRAFT_ENABLE_DEVELOPER_DEBUG=yes \
     snapcraft snap \
       --use-lxd \
       --http-proxy=127.0.0.1:8118 \
       --https-proxy=127.0.0.1:8118
Starting snapcraft 3.9.2+git121.g3de398e6 from /snap/snapcraft/3786/lib/python3.6/site-packages/snapcraft/cli.
snapcraft is running in a docker or podman (OCI) container
Parts dir /root/provision.Ek6UhEW2xZ/snap-bitcoind/parts
Stage dir /root/provision.Ek6UhEW2xZ/snap-bitcoind/stage
Prime dir /root/provision.Ek6UhEW2xZ/snap-bitcoind/prime
Parts dir /root/provision.Ek6UhEW2xZ/snap-bitcoind/parts
Stage dir /root/provision.Ek6UhEW2xZ/snap-bitcoind/stage
Prime dir /root/provision.Ek6UhEW2xZ/snap-bitcoind/prime
The LXD provider is offered as a technology preview for early adopters.
The command line interface, container names or lifecycle handling may change in upcoming releases.
Launching a container.
/snap/snapcraft/3786/lib/python3.6/site-packages/pylxd/models/_model.py:116: UserWarning: Attempted to set unknown attribute "type" on instance of "Container"
  key, self.__class__.__name__
Waiting for cloud-init
Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 cloud-init status --wait
...........................................................................................................................................................................................
status: done
snapcraft is running as a snap True, SNAP_NAME set to 'snapcraft'
Enable use of snapd snap.
Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 snap set system experimental.snapd-snap=true
Holding refreshes for snaps.
Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 snap set core refresh.hold=2019-12-01T15:11:20.363734Z
Waiting for pending snap auto refreshes.
Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 snap watch --last=auto-refresh
error: no changes of type "auto-refresh" found
Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 snap ack /var/tmp/snapd.assert
Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 snap install /var/tmp/snapd.snap
2019-12-01T14:56:35Z INFO Waiting for restart...
snapd 2.42.4 from Canonical✓ installed
Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 snap ack /var/tmp/core18.assert
Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 snap install /var/tmp/core18.snap
core18 20191030 from Canonical✓ installed
Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 snap ack /var/tmp/snapcraft.assert
Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 snap install --classic /var/tmp/snapcraft.snap
snapcraft 3.9.2+git121.g3de398e6 from Canonical✓ installed
Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 snapcraft refresh
Err:1 http://archive.ubuntu.com/ubuntu bionic InRelease                  
  Temporary failure resolving 'archive.ubuntu.com'
Err:2 http://security.ubuntu.com/ubuntu bionic-security InRelease        
  Temporary failure resolving 'security.ubuntu.com'
Err:3 http://archive.ubuntu.com/ubuntu bionic-updates InRelease
  Temporary failure resolving 'archive.ubuntu.com'
Err:4 http://archive.ubuntu.com/ubuntu bionic-backports InRelease
  Temporary failure resolving 'archive.ubuntu.com'
Reading package lists... Done        
Building dependency tree       
Reading state information... Done
All packages are up to date.
W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/bionic/InRelease  Temporary failure resolving 'archive.ubuntu.com'
W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/bionic-updates/InRelease  Temporary failure resolving 'archive.ubuntu.com'
W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/bionic-backports/InRelease  Temporary failure resolving 'archive.ubuntu.com'
W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/bionic-security/InRelease  Temporary failure resolving 'security.ubuntu.com'
W: Some index files failed to download. They have been ignored, or old ones used instead.
Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 printenv HOME
Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 snapcraft snap
There seems to be a network error: maximum retries exceeded trying to reach the store.
Check your network connection, and check the store status at https://status.snapcraft.io/
Run the same command again with --debug to shell into the environment if you wish to introspect this failure.

It looks like apt in the container is not inheriting the proxy settings passed to snapcraft, or at least apt is not configured to skip DNS resolution and leave that to the proxy.
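For reference, apt has its own proxy configuration, independent of the `http_proxy`/`https_proxy` environment variables seen in the `lxc exec` lines above. A drop-in like the following (proxy address taken from this thread; the filename is arbitrary) would make apt issue CONNECT requests through the proxy instead of resolving `archive.ubuntu.com` itself. This is only a sketch of a possible workaround, not what snapcraft currently does:

```shell
# Sketch: an apt proxy drop-in. With this in place, apt connects via the
# proxy and leaves hostname resolution to it.
cat > 95snapcraft-proxy <<'EOF'
Acquire::http::Proxy "http://127.0.0.1:8118";
Acquire::https::Proxy "http://127.0.0.1:8118";
EOF
# Inside the container this would live at /etc/apt/apt.conf.d/95snapcraft-proxy
grep -c '::Proxy' 95snapcraft-proxy   # → 2
```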

After allowing outgoing DNS traffic, I was able to get a bit further: apt-get update completed in the build container, suggesting it did pick up the http and https proxy settings.

Once past the apt-get update there is another error, this time accessing status.snapcraft.io

Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 printenv HOME
Running: /snap/bin/lxc exec snapcraft-bitcoind -- env SNAPCRAFT_HAS_TTY=True http_proxy=127.0.0.1:8118 https_proxy=127.0.0.1:8118 snapcraft snap
There seems to be a network error: maximum retries exceeded trying to reach the store.
Check your network connection, and check the store status at https://status.snapcraft.io/

My expectation, when --http-proxy and/or --https-proxy are defined, is that the proxy settings are applied across the board inside the build container (by apt and snapd as well, not just exported as environment variables).

Just to clarify this part: status.snapcraft.io is a status monitoring page, meant for human consumption and visually checking the status of store services; neither snapcraft nor any other client uses, or needs access to, that service. Snapcraft is merely telling you to have a look in case the store is having trouble (typically visible on that page), as a debugging aid.

  • Daniel

The feature is only on edge because it hasn’t been battle tested yet, much less documented. Setting the apt and snap proxies will eventually be taken into account on edge as well.
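Until that lands, snapd’s own system-wide proxy options can be set inside the container by hand. A sketch, using the container name from the log above and the proxy address from this thread (these are snapd’s documented `proxy.http`/`proxy.https` system options, not a snapcraft feature):

```shell
# Workaround sketch: point snapd inside the build container at the proxy
# so store requests from snapd go through it as well.
lxc exec snapcraft-bitcoind -- snap set system proxy.http="http://127.0.0.1:8118"
lxc exec snapcraft-bitcoind -- snap set system proxy.https="http://127.0.0.1:8118"
```

These are configuration commands that require a live LXD container, so they are shown here as a fragment rather than a runnable script.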

Yeah sorry, I typed that part too quickly. The error was

There seems to be a network error: maximum retries exceeded trying to reach the store.
Check your network connection

Maybe also consider adding a retry or timeout setting, since it looks like these connection timeouts can occur (see my previous comment). Unfortunately, I can’t tell from the error message whether the connect timed out, the connect was refused, or the connect succeeded and a subsequent read timed out, etc.
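The failure modes above can be told apart by probing the proxy directly with curl, whose exit codes distinguish a failed connect (7) from an operation timeout (28). A sketch, assuming the proxy address from this thread and the store API host `api.snapcraft.io` (this is a diagnostic aid, not snapcraft’s own retry logic):

```shell
# Sketch: classify how a connection through the proxy fails,
# using curl's documented exit codes.
probe() {
  rc=0
  curl --silent --output /dev/null --connect-timeout 5 \
       --proxy "$1" https://api.snapcraft.io/ 2>/dev/null || rc=$?
  case $rc in
    0)  echo "reachable" ;;
    7)  echo "connect to proxy failed (refused?)" ;;
    28) echo "connect timed out" ;;
    *)  echo "other curl error: $rc" ;;
  esac
}

# With nothing listening on the proxy port, the connect is refused:
probe 127.0.0.1:1
```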

At the time this happened, I went to status.snapcraft.io in a browser and that page timed out as well, suggesting there was a larger problem with the store infrastructure. This makes it difficult to sort out which problem is causing which error.

Hi @stephane, there were some infrastructure outages last night that may have been the cause of the network issues you were seeing. Hopefully they are resolved now :slight_smile:
