Solved: Built, failed to release (error in snapcraft.yaml)

I configured a snap that builds successfully on my local machine. I then moved the build process to https://snapcraft.io to build directly from GitHub. The build is triggered correctly (it depends on a GitHub Action releasing new .deb packages), but then takes ages to complete.

Eventually the status shows Failed with the text Built, failed to release, but the logs end with:

[...]
Snapping...
Snapped settlers-remake_0.0.1_amd64.snap
Revoking proxy token...
RUN: /usr/share/launchpad-buildd/bin/in-target scan-for-processes --backend=lxd --series=focal --arch=amd64 SNAPBUILD-1790811
Scanning for processes to kill in build SNAPBUILD-1790811

What can I do, or what is missing, to get a successful build?

Going through similar questions, I have meanwhile checked:

  • There are no emails in my inbox.
  • Disconnecting and reconnecting the GitHub repo did not help.

A Built, failed to release status from the build service usually indicates a problem in the store. Trying again later often works, or there might be a blockage that can be cleared at dashboard.snapcraft.io.

Thank you for the hint. But how long can the store keep failing?

And on the dashboard everything looks normal. What should I be looking for? I still feel like I am in the dark.

I have now built the snap locally, which succeeded. Then I tried to upload it with

snapcraft upload <name of snap>

and got this:

Error while processing...
The store was unable to accept this snap.
  - __all__: summary: Multiple lines are not allowed.

Could this be the cause? If so, why would the build server not surface such a message?

So in my snapcraft.yaml file I changed

summary: |
  One line of text

into

summary: One line of text
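For reference, I believe the root cause is that the pipe is YAML's literal block scalar indicator: it preserves line breaks, so the summary value presumably ends with a trailing newline, which the store's validator counts as multiple lines. A minimal sketch of the relevant metadata (the name and version here are placeholders, not from my actual project):

```yaml
name: my-snap        # placeholder name
version: '0.0.1'

# Wrong: a literal block scalar keeps the trailing newline,
# so the value becomes "One line of text\n" and the store
# rejects it with "Multiple lines are not allowed".
# summary: |
#   One line of text

# Right: a plain scalar, no trailing newline.
summary: One line of text

# Block scalars are fine where multiple lines are allowed,
# e.g. in the description field.
description: |
  A longer description may span
  multiple lines.
```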

And indeed, this time the manual upload was successful:

Processing...
Ready to release!

Going for the remote build again…

Confirmed. The issue is resolved now.

So in the end it was a user error, but no message pointed to it. This could be improved, IMHO.