Restricting Browsing to a Specific URL/Domain

Hello everyone, I’m currently using the wpe-webkit-mir-kiosk snap on Ubuntu Frame for a kiosk setup, and I would appreciate your suggestions for restricting user browsing. I know this snap is specifically designed for kiosk use, but some of the web apps I plan to run may contain clickable links that could take users away from the main URL.

Ideally, I want to restrict browsing to a single domain (e.g., https://app_url.com), ensuring that users cannot navigate to external sites like Facebook or GitHub, even if they manage to find clickable links. In the worst case, if a user does find and click an external link, I’d like them to be automatically redirected back to the main URL.

Some have suggested using a proxy like Squid to achieve this for kiosk browsers in general, since it can automatically redirect any external navigation back to the main URL. However, I’m curious whether there’s a built-in way to handle this restriction within the snap, or whether anyone has found another solution that’s easier to implement.

If not, this feature could be a useful addition to enhance control over browsing within kiosk environments.

Thanks in advance for any advice or suggestions!

As far as I know, snap confinement relies on AppArmor, which doesn’t offer the kind of URL filtering you’re looking for, so you won’t get this behaviour from the snap itself. The easiest solution is to use a Squid proxy, or to dig into the sources of wpe-webkit-mir-kiosk and create your own fork. I know that configuring tools like Squid isn’t easy for everyone – working through convoluted text files can be frustrating.

A quick and easy alternative is to set up a router with a built-in Squid and a configuration panel where everything can be clicked through. I recommend https://www.ipfire.org/ – if I remember correctly, you can configure everything there.

Configuring Squid for plain HTTP is trivial – you add a few options to the configuration file and create a bash script to rewrite URLs.

I ran it locally and it works. The real challenge starts with HTTPS. You need a Squid build with OpenSSL support, and the configuration is more complex: you have to generate a certificate and add several more directives to the config. After an hour, I gave up – the squid-openssl package on Ubuntu 24.04 probably doesn’t have all the necessary features compiled in. In the days before SSL, everything was simple. (I’ve left a rough, untested sketch of the HTTPS side at the end of this post.)

Example domain: google.com

sudo apt install squid-openssl

sudo nano /etc/squid/squid.conf

# http_access runs top-down and BEFORE the URL rewriter, so requests must
# be allowed here or the redirect never fires (use "allow localnet" if the
# browser runs on another machine)
acl allowed_domain dstdomain .google.com
http_access allow localhost
http_access deny all
url_rewrite_program /usr/local/bin/squid_redirector.sh
# re-check the rewritten URL: only the allowed domain may pass
adapted_http_access allow allowed_domain
adapted_http_access deny all

sudo nano /usr/local/bin/squid_redirector.sh

#!/bin/bash
# Squid URL-rewrite helper: pass on-domain URLs through unchanged,
# rewrite everything else back to the start page.
ALLOWED_DOMAIN="google.com"
ALLOWED_URL="http://www.google.com/"
# Squid sends one request per line ("URL client_ip/fqdn user method ..."),
# so split off everything after the URL itself
while read -r url rest; do
    if [[ $url == *"$ALLOWED_DOMAIN"* ]]; then
        echo "$url"          # on the allowed domain: leave unchanged
    else
        echo "$ALLOWED_URL"  # off-domain: send back to the start page
    fi
done

sudo chmod +x /usr/local/bin/squid_redirector.sh
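
You can sanity-check the helper from a shell before wiring it in. Feed it a line shaped like what Squid sends (the URL followed by extra fields; the script only looks at the first field) and check what comes back:

# off-domain URL: the helper should answer with the start page
echo "http://facebook.com/ 127.0.0.1/- - GET" | /usr/local/bin/squid_redirector.sh
# expected output: http://www.google.com/

# on-domain URL: the helper should echo it back unchanged
echo "http://www.google.com/search 127.0.0.1/- - GET" | /usr/local/bin/squid_redirector.sh
# expected output: http://www.google.com/search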

Before restarting, check that the configuration parses cleanly:

sudo squid -k parse

Then restart the service:

sudo systemctl restart squid
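
Once Squid is running, you can verify the behaviour end-to-end with curl (assuming the proxy listens on the same machine on the default port 3128):

# on-domain request goes through normally
curl -x http://127.0.0.1:3128 -s -D - -o /dev/null http://www.google.com/

# off-domain request: the proxy silently serves the start page instead,
# so the response comes from google.com even though we asked for facebook.com
curl -x http://127.0.0.1:3128 -s -D - -o /dev/null http://facebook.com/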
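
For completeness, here is roughly what the HTTPS (ssl-bump) side involves. This is an untested sketch – it is where I gave up – and it assumes your squid-openssl build has ssl-bump and the security_file_certgen helper compiled in (the paths below are the usual Debian/Ubuntu ones; check your system). First generate a local CA and initialise the certificate cache; the kiosk browser must be made to trust bump-ca.crt:

# local CA used to mint per-site certificates on the fly
sudo openssl req -new -newkey rsa:2048 -sha256 -days 365 -nodes -x509 \
    -subj "/CN=Kiosk Squid CA" \
    -keyout /etc/squid/bump-ca.key -out /etc/squid/bump-ca.crt

# cache for the generated certificates, owned by the squid user
sudo /usr/lib/squid/security_file_certgen -c -s /var/spool/squid/ssl_db -M 4MB
sudo chown -R proxy:proxy /var/spool/squid/ssl_db

Then, in /etc/squid/squid.conf, terminate TLS on the proxy so the full URLs become visible to the ACLs and the rewrite helper:

http_port 3128 ssl-bump tls-cert=/etc/squid/bump-ca.crt tls-key=/etc/squid/bump-ca.key \
    generate-host-certificates=on dynamic_cert_mem_cache_size=4MB
sslcrtd_program /usr/lib/squid/security_file_certgen -s /var/spool/squid/ssl_db -M 4MB
acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump bump all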