Spark client snap track name


For the snap spark-client that we support in Data Platform, I would like to change the track name (or create a new track and unlist the current one) from latest to 3.4.

We are planning to use only the risks edge and beta; not sure that is relevant at this point, though.

Let me know if I need to provide any other information.

Best, Enrico

Hi, setting the default track for snaps is done using snapcraft directly, with `snapcraft set-default-track`. This assumes the 3.4 track already exists (I didn’t check, sorry, it’s Friday afternoon :wink: ). Then you can `snapcraft close latest/stable` and so on for all risks on latest.
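A sketch of those steps, assuming the 3.4 track exists and you have publisher access to spark-client. The script only prints each snapcraft command rather than executing it, so you can review them first:

```shell
#!/bin/sh
# Illustrative sketch only: echoes the snapcraft commands described
# above instead of running them (they require publisher credentials).
SNAP="spark-client"
TRACK="3.4"

# Make 3.4 the default track that a bare `snap install` resolves to:
echo "snapcraft set-default-track $SNAP $TRACK"

# Then, optionally, close each risk on latest (not recommended; see
# the note below about keeping latest open):
for risk in stable candidate beta edge; do
  echo "snapcraft close $SNAP latest/$risk"
done
```

Dropping the `echo`s runs the real commands; `snapcraft close` takes the snap name followed by the channel to close.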

Note that we really don’t recommend closing latest for snaps or charms - it makes things confusing :slight_smile:

  • Daniel

Hi Daniel! Actually I would need the 3.4 track to be created, as right now only latest exists.

Thanks also for your input on this. I’ll sync with Alex to make sure that we adhere to what we do for all other snaps in Data Platform.

Best, Enrico


Per the Process for aliases, auto-connections and tracks, we need a 1-week voting/discussion period, so I’ll check back on the discussion and votes in a few days.

I have three questions before casting my vote.

  1. What’s spark-client’s release cadence, how often is a new major version (potentially requiring a new track) released? Is this documented somewhere by upstream?
  2. Is there some commitment from upstream on maintenance of old versions? e.g. is 3.3 still supported with security updates? Will it continue to be supported now that 3.4 is out, and for how long?
  3. Are new versions backwards-incompatible? meaning, if I was running 3.3 and try to install 3.4, will that just work, or do I need to migrate my data/configuration, or will things break horribly?



Hi Odysseus,

Thanks for your input. The track management structure will closely follow what has already been implemented for other Data Platform products (e.g. charmed-mysql, charmed-kafka, charmed-postgresql). Also adding @taurus in cc, who coordinated this for Data Platform a few months ago. I’ll also address your questions about this technology and the upstream project (also summarized here):

  1. The release cadence of the upstream Spark project for point releases is generally around 6-9 months. See here for the history of previous point releases. For major releases there is no fixed cadence.
  2. Point releases are generally supported upstream for 18 months.
  3. New versions are generally (more on this later) backwards compatible, and the Spark project strives not to break “stable” APIs, even across major versions. However, Spark releases may include “experimental” and “alpha” APIs/features that can change across upgrades.

I hope this addresses your questions.

Best, Enrico

Confirmed from my side. Voted +1 to add track 3.4 to snap spark-client.

Is an upgrade from 3.3 to 3.4 transparent or does it require some sort of manual action?

In other words, once a new version comes out, do users have a reason NOT to jump to the latest one automatically?

  • Daniel

Sorry, I might have missed the last message. To answer your question, let me give some context: Spark is a middleware used (as a library) in big data applications to enable parallel data processing. Some applications may be written against a certain version, say 3.3, taken as a dependency, and they may need changes/tests/refactoring before jumping to the latest version, 3.4. Most of the API should stay consistent between versions, but people may use experimental features (within their applications) that are subject to change, and we cannot swap the version under their feet automatically.

I hope this answers your question.


“we cannot swap/change the version under their feet automatically.” is key. Based on this, I’m +1 on granting this track.

  • Daniel

+1 from me on granting this track.

Track 3.4 for snap spark-client has now been created.
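For anyone following along, with the track created users can opt into it explicitly via a channel argument. A sketch, echoed for illustration rather than executed (per the thread, only the edge and beta risks are planned on this track for now):

```shell
#!/bin/sh
# Illustrative only: prints the snap commands a user would run to
# target the new 3.4 track of spark-client.

# Fresh install pinned to the 3.4 track (edge risk, as planned here):
echo "sudo snap install spark-client --channel=3.4/edge"

# An existing install can switch tracks with refresh:
echo "sudo snap refresh spark-client --channel=3.4/beta"
```

Because 3.4 is also being made the default track, a bare `snap install spark-client` will resolve to it once a stable risk is published there; the explicit `--channel` form is what keeps users on a track of their choosing.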



Thanks @roadmr and @odysseus-k for the review and for the discussion.

Just one last question, for my understanding and planning: when we need to open a new track (presumably when Spark 3.5 is released, sometime towards the end of this year or the beginning of next year), should I open a new thread or just add a comment on this one? Would there be a fast-track process?

Many thanks, Enrico


It’s perfectly fine if you add a new comment on this thread. And, indeed, for any new track request which is similar in nature and cadence to the first one, there is a simplified, fast-track process (where no voting is required) as described here.