The "Process for reviewing classic confinement snaps" document lists, among the criteria that may require classic confinement:

> running arbitrary command (esp if user-configurable such as a developer tool to organize dev environments)
We have several use cases that allow the user to load an arbitrary .so file:
- Load an external libtorch.so/libmxnet.so/libtensorflow.so, etc., to benchmark a user's custom build of a deep-learning framework.
- Load an external hardware accelerator driver, such as for the AWS Inferentia chip or AWS Elastic Inference Accelerator.
- Load extra shared libraries to support custom operators used by the model. Both PyTorch and MXNet support custom operators.
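For concreteness, all of these use cases reduce to dlopen-ing a user-supplied shared-library path at runtime, which strict confinement blocks when the path lies outside the snap. A minimal sketch in Python (using libm as a stand-in for the user-provided .so, since the real paths are user-configurable):

```python
import ctypes
import ctypes.util

def load_user_library(path: str) -> ctypes.CDLL:
    """Load an arbitrary .so file whose path is chosen by the user at runtime."""
    return ctypes.CDLL(path)

# Stand-in for a user-provided path such as /home/user/builds/libtorch.so.
libm_path = ctypes.util.find_library("m")
lib = load_user_library(libm_path)

# Call a symbol from the freshly loaded library to show it is usable.
lib.cos.restype = ctypes.c_double
lib.cos.argtypes = [ctypes.c_double]
print(lib.cos(0.0))
```

Under strict confinement the `CDLL` call fails for any library outside the snap's own filesystem, which is why these use cases need classic confinement.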
Would you please approve the classic confinement request based on the above use cases?