The hadoop snap has a content slot that shares its $SNAP_DATA/etc/hadoop directory:
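A sketch of what that slot declaration looks like in snapcraft.yaml (the slot name and content label here are illustrative; only the shared path comes from my actual snap):

```yaml
slots:
  hadoop-conf:
    interface: content
    content: hadoop-conf
    read:
      - $SNAP_DATA/etc/hadoop
```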
I can't figure out how to consume this with an auto-connecting plug that mounts it into a subdirectory. Auto-connect is the key here: this works with locally installed snaps, but as soon as I publish the snaps so that they auto-connect (I'm the publisher of both), the consuming snap can't create the mount point before the auto-connector runs:
2017-09-25T11:58:39-05:00 ERROR run hook “install”: cannot perform operation: mount --bind -o ro,nosuid,nodev /var/snap/hadoop/11/etc/hadoop /var/snap/pig/common/hadoop-conf: No such file or directory
This works just fine if I specify the plug with "target: $SNAP_COMMON", but I can't use "target: $SNAP_COMMON/foo".
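For clarity, here is the plug side I'm describing, as a hedged sketch (the plug name matches my hooks; the content label and producer snap name are assumptions):

```yaml
plugs:
  hadoop-conf:
    interface: content
    content: hadoop-conf
    # This works:
    target: $SNAP_COMMON
    # This fails on auto-connect because the subdirectory
    # doesn't exist yet when snapd performs the bind mount:
    # target: $SNAP_COMMON/hadoop-conf
```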
I've tried various hooks (install, prepare-plug-hadoop-conf) to create the plug's target directory before the auto-connect happens, but they don't seem to execute soon enough, so I'm always met with the "No such file …" error referencing my plug's mount point.
How can you share content from a slot’s $SNAP_DATA with a plug that auto-connects?
I've filed a bug report, since I believe this is a bug:
I’d be elated to find out I’m just doing it wrong!