supported-category: tools for local, non-root user driven configuration of/switching to development workspaces/environments
reasoning: The application requires GPU acceleration for LLM (Large Language Model) inference. While I have tested the app with existing interfaces such as gpu-control and hardware-observe, these are not sufficient: the application fails to detect the GPU under strict confinement. When built with classic confinement, the GPU is properly detected and the application functions as intended, providing the acceleration needed to run large language models locally.
Yes, I’ve reviewed the information on that page and tried the suggested approaches, but the app still doesn’t detect my GPU under strict confinement. However, when I build the app with the classic confinement option, it detects the GPU and functions as expected.
Here are the plugs that I specified during the build process with electron-builder:
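For context, a plugs list like this is typically declared in the electron-builder `snap` build configuration inside `package.json`. The snippet below is an illustrative sketch, not the poster's actual config; the app name and the exact plug list are assumptions:

```json
{
  "build": {
    "appId": "com.example.llm-app",
    "linux": {
      "target": ["snap"]
    },
    "snap": {
      "confinement": "strict",
      "plugs": [
        "default",
        "opengl",
        "hardware-observe"
      ]
    }
  }
}
```

The `"default"` entry tells electron-builder to keep its standard set of desktop plugs alongside the extras listed explicitly.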
electron-builder has a long history of producing broken snaps whenever the packaging task is at all complex. Have you tried simply packaging it with a proper snapcraft.yaml instead?
GPU access is definitely fully supported through the existing interfaces, so this is not a valid reason to grant classic confinement…
Here is a snap example for an AI tool that also uses electron-builder but with a proper snapcraft.yaml:
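For readers who want the general shape of such a setup, here is a minimal hedged sketch of a snapcraft.yaml for a strictly confined Electron app with GPU access via the `opengl` interface. The name, command path, and part details are placeholders, not the actual example being referenced:

```yaml
# Illustrative sketch only; names and paths are hypothetical.
name: my-llm-app
base: core22
version: '1.0.0'
summary: Local LLM inference app
description: |
  Electron app running large language models locally with GPU acceleration.
grade: stable
confinement: strict

apps:
  my-llm-app:
    command: my-llm-app/my-llm-app --no-sandbox
    extensions: [gnome]
    plugs:
      - opengl
      - network
      - home

parts:
  my-llm-app:
    plugin: dump
    # Assumes electron-builder already produced an unpacked linux build.
    source: dist/linux-unpacked
    organize:
      '*': my-llm-app/
```

With a declaration like this, strict confinement plus the `opengl` plug is normally what exposes the GPU device nodes and driver libraries to the snap.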