Stronger identity verification for ALL publishers?

Hi folks

We have seen a flurry of uploads of apps which trick users into revealing sensitive information. These apps are not attacking the system engineering, they are attacking the user through social engineering, so confinement rules cannot address the issue.

The team is working on a range of initiatives to mitigate and reduce the risk of apps like this. However, my concern is that apps can be updated, so even if an app is comprehensively reviewed at initial publication, the same app could turn rogue at a later date.

In the world of open source, we don’t generally have rigorous publisher information. Sites like GitHub don’t tell you anything rigorous about who’s writing the code they host. In Ubuntu and Debian we take careful steps to know who’s uploading packages, but PPAs, OBS, GitHub, and other build services will allow any anonymous person to build and publish packages.

Our goal should be for snaps to be the safest way to get software for your Linux system, and that means we should consider measures that are novel or different from other package hosting sites.

One thing we could do is require a more comprehensive proof of publisher identity for every publisher. We could require a credit card, and we could integrate the sort of ‘know your client’ technology that app-based banks are using to verify some sort of ID such as a passport. Typically these require something like a photograph of the passport together with a video of the person speaking. I think most banks use SAAS services for this KYC capability, and we could use the same services for Snapcraft publisher identity verification.

I suspect there is a bit of an arms race right now between bad actors with generative AI video-creation capability and these KYC services, and it would be hard to know who’s winning, in other words whether such a service provides real assurance of identity. A credit card test could also be passed by a fraudulent actor with stolen card information. If we charged a fee and waited two months to see if the charge was revoked we might have more confidence that it was really the cardholder’s intent to be charged, but then we’d have a two month delay for any publisher going through that process. I don’t really want to ask free software publishers to pay a fee to share their software as snaps, but perhaps there would be support for that if it led to people having more confidence in the safety of the system as a whole.

In the end, while it may not be perfect, we would have more data to share with law enforcement to try and bring the bad actors to justice.

We’re definitely going to attack this problem; even if the solutions are unpalatable, it feels worse to potentially host malware. I’m posting here to get feedback on the specific ideas of requiring some sort of hard identity check, and / or credit card information and a transaction fee, for publishers; and also to invite suggestions for better approaches to publisher identity.



I’ve been a FOSS developer since 2009, but I also have a PhD in anthropology, so I have some thoughts.

The problem you raise @sabdfl is a fantastic problem to think about because it is a very human problem, involving relationships, trust, verification, accountability, transparency, anxiety, improvisation, security, privacy, and ultimately, power. This problem is so rich and multifaceted that whichever software platform gets this right will reap the rewards. Likewise, do it wrong and true peril awaits.

I want to take a step back from the technical details of identity checks and document verification to look at the bigger picture.

From the perspective of the end user, how can and do they come to trust the software they use, and the data that software manages? Who do they place their trust in — which institutions or individuals? Who is accountable? Who has power? How do they learn what they need to know in order to make a judgement that they are comfortable with?

From the perspective of the software platform, how can the platform encourage end users to take their existing improvised digital data practices and improve them? When should it communicate them? Enforce them? How should it communicate any needed changes to FOSS developers? How, and from whom, should the platform collect and store identity verification? Going beyond mere identity verification, should some developers be asked to go through a virtual face-to-face live interview in which development histories and commitment to platform responsibilities are examined?

Proposal 1: Develop Relationships

When deciding whether to trust a program, a key question for end users is: who is responsible for it? If the end user is sophisticated (and let’s assume many are not), that means: who wrote it, and who is responsible for getting it onto my computer? Underlying both questions is: why should I trust them?

(At a more expert level, we of course distinguish between who wrote a FOSS program, who maintains it, who packages it, and who distributes it; moreover we distinguish between user-facing applications and the libraries they depend on. I’m blurring all of that here for the sake of discussion.)

These who and why questions of trust and responsibility seem so obvious. But is it easy to do this on modern systems? No! It’s possible to answer some of these questions if one improvises, but there is often plenty of friction, and it requires genuine motivation.

Imagine using a software platform that makes it natural for non-sophisticated end users to learn who is responsible for software, and makes it easy for them to answer their own questions about why those responsible should be trusted.

Imagine, for instance, you come across some application in a store. The platform provides an easy way to learn about its developer(s) — how long they have been developing applications for, the applications they have developed, and whether their identity has been verified. The platform makes it human: it allows the developer to include an optional photo of themselves, or an organisation logo. It allows them to include relevant personal or organisational credentials, if they choose. The platform makes it clear which information is verified by whoever is responsible for the platform, and which is self-reported by the developer.

The platform also provides an easy way to learn about the application’s history — how long it’s been around for, who has been responsible for it through time. Imagine showing that information in a visually attractive way, so that the big picture can be seen at a glance, with details available if wanted.

The heart of this vision for a better platform is not software, but relationships — a relationship between end users, the developers, and the platform. Platforms need better infrastructure to make this relationship more transparent, all while protecting privacy and security.

The goal here is to make visible original data points about who is responsible, and why they should be trusted, with the intent that end users themselves will exercise their own judgement and experience to generate their analysis. The platform’s responsibility is to make these original data points visible, and to choose wisely how the data is presented and what is included. Which brings us to the next proposal.

Proposal 2: Distribute Trust and Responsibility

Verifying identity in the form of know your client (KYC) requires an opaque, authoritarian, and hierarchic institutional apparatus — and I state that not as a moral observation, but to make clear what it necessarily entails, without judgement of the institutions, such as banks, that do this. In the context of software development, let’s call this authoritarian approach know your developer (KYD). I’m not an admirer of authoritarianism, but I acknowledge KYD may well be truly vital to some aspects of the overall platform landscape, especially for commercial operations like Canonical.

However, while KYD may be vital for some platform aspects, it is not vital for others, and will be counter-productive in some areas. It’s easy to imagine KYD alienating some developers, especially those with strongly held existing ideas about privacy and security, or those with good reasons to keep themselves invisible.

It’s also easy to imagine some unsophisticated end users assuming that if a developer’s identity has been verified by a platform, the platform is itself responsible for when things go wrong. Uh oh.

The opposite of pure KYD is trustless trust, where “trust is based in the network of participants/peers and in the underlying algorithms, without necessarily being placed in any of the individual participants or any outside authority” (source). Normally this concept is applied to things like blockchain, but we can abstract away from implementation details to focus only on the end-goal — building a platform in which trust does not depend on individuals, but on the sourcing, distribution, and run-time processes the platform implements.

A good platform ought to use a hybrid approach, one which rejects sole use of either model (authoritarian versus trustless trust). That is, it distributes trust and responsibility, which is to say, it distributes power. It does this by tightly focusing the domains and mechanisms in which it mandates KYD and those in which it relies on trustless trust, and then, very critically, makes it clear to the end user where and to whom each mandate applies — and where that mandate ends.

It does so with the expectation that some classes of programs really must be verified via KYD (e.g. financial, medical), whereas for others KYD verification is optional or impractical.

None of what I’m saying is new, of course. Platforms already do some of this, even if no platform does it with the focus, care, and detail I’d love to see.

A closing thought: one way to make a fee-based verification system attractive to developers would be to give them something extra that they don’t already have. For example, a verified profile on the Ubuntu website that showcases their programs and experience, and provides a way for end users to sponsor them.


That’s a lovely set of points to consider and respond to, Damon, thank you.

Yes, I agree we should explore surfacing a range of tested facts about an app, and also about the publisher. Have other credible members met them in person? Have they passed an automated KYC? @jnsgruk suggested we could do a Let’s Encrypt style TXT record test to establish publisher control of a claimed domain. It would be interesting to explore the elements that are meaningful, and also the tests that are meaningful, knowing that each and every individual test can be defeated by a well-resourced actor, but also that each test raises the average standard in the ecosystem. Equally, it would be interesting to look, from a UX and visual design perspective, at how best to signal what’s been tested and how users should weigh it. A blue tick feels a bit too binary in a world where trust is only truly earned gradually.
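The TXT record idea can be made concrete. Below is a minimal Python sketch of the core logic, under stated assumptions: the store issues a one-time challenge token, the publisher adds it as a TXT record on the claimed domain, and the store later checks the fetched records for the token. The record prefix `snapcraft-domain-verification=` and both function names are hypothetical, not an existing Snapcraft mechanism; a real implementation would obtain `txt_records` from an actual DNS lookup, which is stubbed here as a plain list.

```python
import hmac
import secrets

def issue_challenge() -> str:
    """Generate a one-time token the publisher must publish as a DNS TXT record.

    The "snapcraft-domain-verification=" prefix is a hypothetical convention.
    """
    return "snapcraft-domain-verification=" + secrets.token_urlsafe(32)

def domain_is_verified(challenge: str, txt_records: list[str]) -> bool:
    """Check whether any TXT record on the claimed domain matches the challenge.

    In a real implementation txt_records would come from a DNS lookup of the
    claimed domain; here it is stubbed as a plain list. Constant-time
    comparison avoids leaking how much of the token matched.
    """
    return any(hmac.compare_digest(record, challenge) for record in txt_records)

challenge = issue_challenge()
print(domain_is_verified(challenge, ["v=spf1 -all", challenge]))  # True
print(domain_is_verified(challenge, ["v=spf1 -all"]))             # False
```

As with ACME's dns-01 challenge, the token should be single-use and expire quickly, so a stale record cannot be replayed by a later owner of the domain.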

Credit card tests are somewhat exclusionary - lots of interesting people do not have them, a tiny charge on a stolen card might easily be missed by the real holder, and a large charge is just more exclusionary. Nevertheless, they would be a test, and they would build confidence if passed alongside other tests.

Passport / official ID tests are also vulnerable to fraud and somewhat exclusionary because not everyone can get them, but again also potentially valuable as part of a palette of confidence-building measures. I remember fondly Debian’s keysigning parties, which involved a speed-dating counter-rotating chain of outstanding FLOSS contributors and freedom lovers checking… passports! You couldn’t find a less authoritarian example of reliance on the same authoritarian trust-enhancing test as a mobile app bank in 2024 :wink:

Back in the 1990s I was focused on identity (specifically the identity of the company behind an SSL-protected website with a digital certificate I would issue). Lots of people used to assume that this was about trust, and I found it very hard to explain that it was only identity - trust is an entirely different thing. Thirty years later I would have hoped we had made more progress in this regard.


Flatpak packager here. I’ve been carefully watching the situation (as the same issues apply to Flathub), and there is one thing that may be helpful.

the same app could turn rogue at a later date.

For Flathub, there’s a bot that automatically fetches new upstream releases and makes a PR in your repo. This bot is run by Flathub.

One solution is to require that “verified” apps prove the repository and domain are under the publisher’s control, and then require only those dependency/application updates made by hand (i.e. not by a Canonical bot) to go through verification again.

This boosts security on two fronts: the first partly addresses the issue at hand, and the second keeps applications up to date.


This is an important topic, and I honestly understand why such measures are considered.

I maintain one project published on the Snap store, Graphs, which is currently part of GNOME Circle. Most development is done with GNOME Builder, which works with Flatpaks, and this is thus also the environment where we do most of our actual testing. I do test the Snap package from time to time as well, but as I run Fedora Silverblue this is somewhat tedious, as Snap doesn’t really work that well on Silverblue. Testing the Snap is thus something I do in GNOME Boxes, not on bare metal.

I don’t mean the above as any negative sentiment towards Snapcraft; I do not want to incite any package wars here. I just wanted to sketch the situation from my side, and why our primary distribution platform is Flathub, with the Snap package being more of an afterthought.

Either way: personally, I feel a bit uneasy sharing copies of my passport with private entities, especially if these need to be submitted online. Given the general privacy consciousness in the FOSS world, I can only assume there are a lot of people out there with that perception. If such measures were a prerequisite for publishing on the Snap store, that might be a reason for me to drop Snap support altogether and point our users to the Flatpak package, as it would raise the barrier from “might as well ship it on Snapcraft as well” to it just not being worth it.

Again, I don’t mean that as a condemnation of Snapcraft in any sense (this is not the place for package wars). But I can see the situation where projects that ship on both Snapcraft and Flathub drop one of them if the barrier for publishing on one of the two platforms gets raised. Sending private data to a private entity is such a barrier, and one I feel can be somewhat big in the world of FOSS. I also want to stress that this has nothing to do with Canonical specifically, but with the idea of sending such sensitive data into any database. If such a measure were enacted, I’d strongly request allowing the developer to put a fat “Copy” watermark over the passport scan at the very least.

Using credit card verification might work, but I am not 100% sure how watertight the verification of identity is in that case. It also raises a similar barrier for some people, especially those without access to a valid credit card.

Regarding applications going rogue after initial review: I know that on Flathub nowadays, if an application requests more permissions than it did during the initial review, it triggers an automatic new manual review. That doesn’t solve everything, but in case Snap isn’t doing that, I think it’s a good idea.
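The trigger described above amounts to a set difference over a snap’s declared plugs: anything added since the last reviewed revision flags the upload for a human. This is only an illustrative sketch, not how either store actually implements it, and the plug names are just examples.

```python
def needs_manual_review(old_plugs: set[str], new_plugs: set[str]) -> set[str]:
    """Return the plugs added since the last reviewed revision.

    A non-empty result would flag the upload for manual review;
    removed or unchanged plugs do not trigger one.
    """
    return new_plugs - old_plugs

reviewed = {"network", "home"}                            # last human-reviewed revision
update = {"network", "home", "password-manager-service"}  # newly uploaded revision
print(sorted(needs_manual_review(reviewed, update)))      # ['password-manager-service']
```

As the next reply notes, this kind of check only catches privilege escalation; an app whose declared permissions never change slips straight through it.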

Since this has been mentioned twice in two different posts above now, note that the snap store has done exactly this since day one… and it works very well (like it does on Flathub, I guess)…

The apps that triggered all this didn’t use any bigger privileges from the start; they are all just plain webapps, and this didn’t change throughout their released revisions. So neither Flathub nor the Snapcraft store would catch their malicious behaviour through any of these existing tests… they are, after all, simply phishing your passphrases and doing all subsequent actions online to empty your wallet…

What either store would need here, as additional automation, would be runtime testing of the apps… potentially through some cleverly implemented AI that can judge by screen scraping what is actually going on and recognise evil patterns, or some such.


If it would be too complex/unreliable for an AI method to screen-scrape whether an app is doing something nasty, one could at least try to find out from the name and/or description of an app whether it may have to do with finance, shopping, … or other fraud-attracting activities, and require human verification.


For the human verification one could also create a checklist so the verifier knows what to look at: for example, whether a third party packages the app or the upstream publisher themselves; whether the promised app is actually contained; whether the app is started through an (open-source or closed-source) wrapper, and whether perhaps another program is started first (a keyboard scanner?). Also start the program and see whether it does something obviously wrong (asks for a private key), whether the UI is the same as the upstream program installed through another channel, and whether upstream advertises this Snap on their site…

With a good checklist a verifier could probably identify most suspicious apps in minutes.

@Sjoerd1993 put more emphasis on keeping the Flatpak, and on the point that increasing the entry barrier would reduce the number of packages, and I agree with that completely. Some measures suggested above are credit cards, passports, etc. But many individual third-party volunteers, like myself, have neither a passport nor a credit card.

There have been many proposals given above. One thing I would like to add is that there is currently no barrier to registering a name for a snap, and thus publishing a snap that doesn’t use any plugs that would trigger a deny constraint can be done easily. Specifically, toolkits like Flutter that don’t need to interact with D-Bus can easily bypass this. Adding a barrier, even an easy one, before any snap from an unverified publisher can reach the store may reduce some issues.

But this will again increase the time needed to publish any snap to the store. Also, perhaps the popularity of the Snap store (and probably of Ubuntu) is another catalyst for these attacks?

In the snap world we have some snaps which are built like this, from source. For those, yes, we have a lot more provenance. But we also allow people to push blobs that they built. This is true for commercial vendors, but it is equally true for small projects that have their own build systems.

In this case, we don’t see source, so we can’t use that as a method to judge the app. Open source is better in this sense, by far, but it’s just not the only way people publish apps for Linux these days.

We could definitely also do the repo control check, that’s a nice one, thank you.


If git history is correct, then 11 years ago I was wondering about how to make Python source packages on PyPI safer, with the adoption of a distributed trust system. The main problem I wanted to solve at the time was the combination of the relative scarcity of signed source archives and the key-distribution problem that GPG was always plagued by. The longer read is elsewhere, but the bulk of the idea is that one can express trust in a specific blob (source, although for snaps a binary is more appropriate) and then share that trust with others. I think the technical part was a bit optimistic: friends and colleagues do not really understand GPG, and they could be as prone to social engineering attacks intent on imitating distrust records as they were to just trusting brands earlier. The “distrust” system never moved past the idea stage, but I think it was trying to solve the problem that is still with us today.

The relationship between a “brand” of a software project and another entity, typically a person, is what is key. I think snaps and the store have historically never given much prominence to account names, with no easy-to-discover way to see information about a given account. Is zygoon on the snap store really me? Or is it just someone who was fast enough to grab the handle?

Perhaps a way out of the problem is to look at fediverse, with their ingeniously simple account verification system. If I can post a very specific link on the domain I own, the link on my fedi account shows up as verified.

It would be nice if the snap store could start with something like that. Have every developer account show up as a distinct page, show verified links, show publishing history or other published snaps.

Over time that place might show specific information about why canonical is trusted, why snapcrafters is trusted, or that zygoon happens to have been around for a decade of snaps and can prove to own a given domain. Or that I agreed to swipe a card to verify my name.
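The fediverse-style verification mentioned above (a `rel="me"` link on a page the publisher controls, pointing back at their store profile) can be sketched with nothing but the Python standard library. The profile URL below is hypothetical; a real check would fetch the publisher’s homepage over HTTPS before parsing it.

```python
from html.parser import HTMLParser

class RelMeFinder(HTMLParser):
    """Collect href targets of <a rel="me"> and <link rel="me"> elements."""

    def __init__(self):
        super().__init__()
        self.links: list[str] = []

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        # rel is a space-separated token list, e.g. rel="me nofollow"
        if tag in ("a", "link") and "me" in (attr.get("rel") or "").split():
            if attr.get("href"):
                self.links.append(attr["href"])

def verifies_profile(page_html: str, profile_url: str) -> bool:
    """True if the page contains a rel="me" link back to the store profile."""
    finder = RelMeFinder()
    finder.feed(page_html)
    return profile_url in finder.links

# Hypothetical homepage and profile URL, for illustration only:
homepage = '<a rel="me" href="https://snapcraft.io/publisher/zygoon">my snaps</a>'
print(verifies_profile(homepage, "https://snapcraft.io/publisher/zygoon"))  # True
```

The nice property, as with the fediverse version, is that the proof lives on infrastructure the publisher already controls; the store only has to fetch and parse one page.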

I think developer identity verification is essential for private snaps at least, but I’d also suggest leveraging the community to verify packages if possible. GitHub works largely because of the ‘star’ system - it’s the first thing we all subconsciously check. If a package has 5k stars, that’s a vote of confidence from the community that it’s genuine.

Another approach to building confidence for open source snaps could be to have the developer publish a snap verification repo on GitHub containing a key (similar to how Gmail verifies a domain with DNS). Then on the snap page you can show that the developer has access to a particular org on GitHub, so a user can open the link to the GitHub repo and verify that the snap is coming from the developer they expect.


This is all part of the larger social problem of deciding who to trust and for what. Snaps do provide some safety guarantees against malicious or accidental harm and that complicates the message: I (and likely most readers here) know that a confined snap doesn’t have access to my ssh keys and trust Canonical to keep it that way. And that is a good thing. But for most computer users the distinction between that very specific kind of safety and some more generic safety is not clear.

Providing more information about snaps, such as verifying and starring publishers, indicating the origin and level of trust of the snap contents, reviews and history can help the diligent and knowledgeable user. But it needs to be presented very much up-front, in an easily digested manner and without creating a barrier to installing and publishing snaps.

We can clearly do better:

$ snap search foo
Name                      Version                       Publisher              Notes    Summary
foobar2000                2.1.2                         mmtrt                  -        foobar2000 is an advanced freeware audio player.
foobillard-plus           3.43.0                        stanmichals            -        Billard game simulator
foot-terminal             1.16.2                        iomezk                 -        A fast, lightweight and minimalistic Wayland terminal emulator
foobar-tbrandon           0.1                           tbrandon               -        Foobar foobared foobaring foobar
auto-cpufreq              2.1.0                         fooctrl                -        Automatic CPU speed & power optimizer for Linux
microk8s                  v1.28.7                       canonical✓             classic  Kubernetes for workstations and appliances
retroarch                 1.17.0                        libretro               -        RetroArch is the official reference frontend for the libretro API.
kiosc                     1.16.13                       visualproductions      -        Customise your touch screen user-interface
openstack                 2023.1                        canonical✓             -        Small footprint, K8S native OpenStack
multipass                 1.13.1                        canonical✓             -        Instant Ubuntu VMs
plex-htpc                 1.56.1                        plexinc✓               -        Plex HTPC client for Linux
ghostscript-printer-app   10.01.2-1                     openprinting✓          -        Ghostscript Printer Application
kgoldrunner               23.08.3                       kde✓                   -        A game of action and puzzle-solving
microcloud                1.1-04a1c49                   canonical✓             -        Automated small-scale cloud deployment
red-app                   9.0                           keshavnrj✪             -        Complete Youtube Desktop Applications
colorwall                 2.0                           keshavnrj✪             -        Free Ultra HD wallpaper for Desktop
tux-football              0.3.1                         joker2770              -        Tux Football, a fun arcade-style 2D football game.
swell-foop                41.1                          jbicha                 -        Clear the screen by removing groups of colored and shaped tiles
chronoburn                2.3.7                         cancian                -        A real-time calorie counter that simulates the human metabolism
kicad                     7.0.10                        nickoe                 -        Electronic schematic and PCB design software
opennds                   9.3.0                         ogra                   -        A high performance, small footprint Captive Portal
dotrun                    1.4.8                         canonicalwebteam       -        A command-line tool for running Node.js and Python projects
waiterio-restaurant-pos   1.4.0                         waiterio               -        Waiterio Restaurant POS is the fastest way to handle restaurants' orders
notepad3                  5.21.905.1                    mmtrt                  -        Notepad3 is a fast and light-weight text editor.
zile-tealeg               2.4.11                        tealeg                 -        GNU Zile
papyrus                   1.1.2                         ooguz                  -        Papyrus - A simple paper backup tool
knot-resolver-gael        5.7.1                         gael-legoff            -        Knot Resolver
codechenx-tv              0.5.3                         codechenx              -        tv(table viewer) for delimited text file(csv,tsv,etc) in terminal
fclones-gui               v0.2.0                        pkolaczk-u             -        Interactive duplicate file finder and remover
obsr                      v1.2.00                       obsr                   -        OBSR is OBServer's core wallet software.
converternow              4.2.1                         ferraridamiano         -        Unit and currencies converter
gambit                    0.1.0                         kz6fittycent           -        Chess board in your terminal.
cataclysmdda              2023-03-01-0054               marisag1967            -        Cataclysm: Dark Days Ahead is a turn-based survival game set in a post-apocalyptic world.
tactics                   0.6                           nokse22                -        Build your soccer lineup
edgex-objectbox           1.1.0-20191118+28af10ea       objectbox              -        ObjectBox Database with EdgeX Foundry™ IoT Edge Platform - Edge Computing in IoT
bendyroad                 1.1030                        bendyroad              -        A rapid development IDE for controller firmware.
nimblenote                3.2.2                         leavesunderfoot        -        nimblenote
greenfoot                 3.8.0                         davidotek              -        Educational software designed to make learning programming easy and fun
spiderfoot                v2.11.0-final-2998-g0f815a20  jitpatro               -        SpiderFoot is an open source intelligence (OSINT) automation tool.
end-dayz                  1.0                           marisag1967            -        DayZ like game where you have to scavenge for food and water
dispatch                  0.0.1                         redaced                -        dispatch
gladupe                   0.1                           cajomar                -        The world's best Nibbles clone.
paratextlite                         hindlemail             -        Paratext Lite running on the desktop
dnsmasq-wia               2.80snap1                     katiechapman           -        Dnsmasq provides network infrastructure for small networks
ubuntu-package-changelog  0.1.0+git29.5396f4e           toabctl                -        Get Ubuntu package changelogs for different series, pockets and packages
sample                    0.1.0-GIT                     hroptatyr              -        Produce a sample of lines from files.
yuck                      0.2.4-GIT                     hroptatyr              -        Command line option parser for C.
dnsmasqd-dhoward          2.76                          ssni-dh                -        Network infrastructure swiss-army knife
yaml-overlay-tool         0.6.4                         vmware-tanzu-labs      -        A YAML Overlay Tool with templating tendencies.
sos-notepad3              5.19.630.2381                 manhtd-test-snapstore  -        sos notepad3 is a fast and light-weight text editor.
snakeit                   1.1                           isakbryn               -        Snake just like on the old NOKIA phones!
dataclerk                 0.1.0                         tim-clicks             -        A simple server for storing your data

Some of the publishers have “✓” or “✪” - but what does that tell anyone who doesn’t already know? And there is no indication of which snaps have been widely used, regularly updated, or rated good or bad by users.

Coming up with a summary “trust rating” isn’t going to be easy, and it opens the opportunity for gaming. But it is probably needed for the health of the snap ecosystem.


Many of the snaps are built from open source. In this case we should use a signed tag to build the snap in Launchpad. This would ensure that the binary matches the known source and would allow review. The signed tag should be linked in the snap information.

Snaps only provided as binary should be marked as “dangerous”.

I strongly agree that something akin to the star system in combination with a simple download counter for snaps that appears as part of a prompt on any attempted snap download could go a long way in helping with the social engineering side of this problem. This is something I personally always check on other app stores, and I think most attempts by malicious actors to mimic a popular program would be foiled if the downloading user saw that it only had a handful of downloads/no stars.

I think all of the other technical suggestions in this thread are great ideas to implement as well, but when it comes to the social engineering side, I unfortunately think there are many users who will not go through the effort of checking a verification key against some upstream source, etc, so it would be prudent to implement a very short, sweet, and hard to miss download/star counter as a “sanity check” right on the command line at download time.

I personally don’t think this is the way forward.

It is not marked as dangerous today, and it’s fine to call it what it is - binary-only (assuming it is not proprietary; the licence is a separate concern with separate markings: you can have a binary-only recipe for free software, and source-available proprietary applications).

EDIT: To clarify my point, I think that you can build malicious software from source and include the possibly-obfuscated source code, so that is not what makes something safe or unsafe.

The inherent property we are after is not safety but trust. We attempt to enforce relative safety from certain malicious actors with the sandbox, but that has limits, as the recent wallet-stealing app showed. We cannot enforce trust; we can only provide information to the user and make it more difficult for spoofed software to stay undetected long enough to cause someone damage.


I believe this is more about trust than verification, although one must be verified to be who they truly are before the first level of trust can be given. The solution would be to set up a trust system, very similar to security clearances in the military.

Each publisher has an initial trust level once their identity is verified, and there is a process by which the publisher gains trust. Things that could elevate their trust level could be the following:

  • source code is hosted in GitHub
  • number of snap packages published
  • number of reviewed applications (manually or AI reviewed by Canonical)
  • domain ownership validation (this was mentioned in another comment)
  • number of downloads
  • number of positive feedbacks (stars)
  • tenure as a snap publisher
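As a rough illustration of how such signals might combine into a discrete trust level, here is a minimal Python sketch. The weights, thresholds, and signal names are entirely hypothetical; a real scheme would need careful tuning and resistance to gaming.

```python
# Hypothetical weights for the signals listed above.
WEIGHTS = {
    "source_on_github": 1.0,  # 1 if the source is public, else 0
    "snaps_published": 0.5,   # count of published snaps
    "reviewed_apps": 1.5,     # count of manually/AI-reviewed apps
    "domain_verified": 2.0,   # 1 if domain ownership was validated
    "downloads_log10": 0.5,   # log10 of total downloads
    "stars_log10": 1.0,       # log10 of positive feedback
    "tenure_years": 0.5,      # years as a snap publisher
}

def trust_level(signals: dict[str, float], thresholds=(2, 4, 6, 8)) -> int:
    """Map weighted signals onto a discrete trust level from 1 to 5."""
    score = sum(WEIGHTS[name] * value for name, value in signals.items())
    return 1 + sum(score >= t for t in thresholds)

# A hypothetical publisher: public source, 3 snaps, verified domain, 4 years.
publisher = {
    "source_on_github": 1,
    "snaps_published": 3,
    "domain_verified": 1,
    "tenure_years": 4,
}
print(trust_level(publisher))  # 4
```

Using log-scaled download and star counts keeps a single viral snap from dominating the score; the discrete thresholds give the kind of graded signal (rather than a binary blue tick) discussed earlier in the thread.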

For example, if the publisher is a known entity, let’s say Google, and the code for the snap is hosted on GitHub and can be verified, this would elevate not only the publisher’s trust level, but also the package’s.

Over time the publisher builds trust (the trust level increases), but it is also important to have mechanisms to validate trust at any time. The only true way to validate trust is to test it when the publisher doesn’t know they’re being tested.

Ultimately it is the responsibility of the end user to decide whether they should trust the publisher. Giving them a “Trust System” (security clearance) would allow them to make an informed decision.


Level 5 Trust Shield :slight_smile:



I like the idea of trust levels.

  • source code is hosted in GitHub
  • number of snap packages published
  • number of reviewed applications (manually or AI reviewed by Canonical)
  • domain ownership validation (this was mentioned in another comment)
  • number of downloads
  • number of positive feedbacks (stars)
  • tenure as a snap publisher


To that list I would add:

  • star developer status
  • code signing certificate
  • identity validated

If you worry about giving your personal data to someone, you simply do not raise your trust level (and hope that the users trust YOU).

Issues: code signing certificates cost a lot of money since the latest changes - at least one company provides a certificate for open source developers, with identity validation, for a reasonable €49 (net) - maybe there is a chance to partner with them (or other companies) to issue a “snap certificate” product at a discount?

There are identity validation services where you don’t give your credit card or passport data to some unknown company, and a trusted party validates your identity. In Germany we have PostIdent, where you can validate your identity at the post office (€5-10). But since developers come from all over the world, not everyone will have access to such services.

Source code hosted on GitHub might be hard to check as long as you use external modules/libraries. How do you define this? E.g. I package snaps for FOSS hosted on GitHub, but with the binaries they provide in their releases. And the modules I import in my Python scripts are hosted on GH as well, but the code is imported via PyPI.