feat: Add Jsonnet support for matrix based building #140
The main issue here is that building multiple versions of an image with the same name is not possible with the current tagging system, which always assigns the

I see two ways to add support for multi-version images. They are not mutually exclusive, and both can be used to achieve the same thing. I think we should either do both A & B, or just B, but not A alone. I'm not sure if option A would be required to support this fully, or even desirable given its drawbacks.

A. (originally from @gmpinder)

```yaml
# recipe.yml
image-version: [38, 39]
```

B. (originally from @gerblesh)

```yaml
# recipe-gts.yml
image-version: 38
image-tags: [gts] # replaces the 'latest' tag
```

```yaml
# recipe-current.yml
image-version: 39
# 'latest' tag is applied by default
```
|
> And latest should ALWAYS be the latest version of the image where the base image is the same (see: https://learn.microsoft.com/en-us/archive/blogs/stevelasker/docker-tagging-best-practices-for-tagging-and-versioning-docker-images#stable-tags).

This isn't a good point. The CLI fully manages which tags are set, so it is not too much to ensure that the highest version number for that recipe is marked as `latest`. There is also precedent with Ublue, where they don't mark their version 38 as `latest` when they are also building 39.
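To make the tagging side of option A concrete, here is a minimal sketch of the kind of logic a CLI could use — hypothetical code, not the actual BlueBuild implementation — where every build gets its version tag and only the highest version is also tagged `latest`:

```javascript
// Hypothetical sketch, NOT the actual BlueBuild CLI: derive per-build tags
// from an `image-version` array so only the highest version gets `latest`.
const assignTags = (versions) => {
  const latest = Math.max(...versions);
  return versions.map((version) => ({
    version,
    tags: version === latest ? [String(version), "latest"] : [String(version)],
  }));
};

console.log(assignTags([38, 39]));
// 38 → tagged ["38"]; 39 → tagged ["39", "latest"]
```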
That's unfair. Adding an optional property to a module isn't "clunky", we already do that for other aspects like

SecureBlue has 50 recipes. So with option B they would be required to create 50 more recipes to even think about having a second version of each recipe. Whereas with an array of versions, the CLI would manage the tagging for the final images and only mark the most recent version as `latest`, as is the convention with Docker tags.
As would my proposed solution:
Only in the sense that it supports multiple versions of the same base image. I have no intent to have the base image follow the same pattern.
This can be handled with an arg, and could also default to templating the first version in the array. This can be figured out more later. Now, I want to be clear here: I'm not bashing on others' ideas. I'm simply stating facts about the scalability of the two solutions, and solution B completely falls short on that scalability factor. |
Yes, setting `latest` as the tag for the latest version is best practice. Yes, option A would allow for that to be done automatically. No, it wouldn't be such a big deal to give the user the final say on this with option B.
Let me illustrate my point a bit further here: I never thought of the recipe syntax as a DSL. The vision (in my mind) is/was for a static configuration language that describes the steps to build and push an image in a sufficiently abstracted way. I think adding
I agree that needing multiple files feels unscalable. But @qoijjj seemingly disagreed: (link)
Also, @tulilirockz's Atomic Studio uses Jsonnet in a rather elegant way to solve the scalability issue. (link) I quite like her approach, and think supporting something like it officially would be a pretty good way to solve scalability for multiple purposes, not just the multi-version case.
It is my understanding that while option A would allow us to implement tagging for things like
Ok, yeah. Option A would directly only do this in the sense of multiple versions. One might argue, though, that versions with different base images are also just different versions of the same image. This would set a precedent for other properties following the same pattern. I don't find the precedent fully threatening, though; my earlier points have some more thoughts on this. I also just realized that GitHub Actions supports the matrix being generated by another action, so that could be leveraged in a case like this.
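As a sketch of that Actions feature: a first job runs the generator and publishes the recipe list as a job output, and the build job expands it into a matrix. The job and script names below are made up; `fromJSON`, `strategy.matrix`, and job `outputs` are real workflow syntax.

```yaml
# Hypothetical two-job workflow sketch; the generator script name is made up.
jobs:
  generate:
    runs-on: ubuntu-latest
    outputs:
      recipes: ${{ steps.gen.outputs.recipes }}
    steps:
      - uses: actions/checkout@v4
      # the generator prints a JSON array of recipe paths on stdout
      - id: gen
        run: echo "recipes=$(node generate-recipes.js)" >> "$GITHUB_OUTPUT"

  build:
    needs: generate
    runs-on: ubuntu-latest
    strategy:
      matrix:
        recipe: ${{ fromJSON(needs.generate.outputs.recipes) }}
    steps:
      - run: echo "would build ${{ matrix.recipe }}"
```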
Ok, yeah. Maybe it could generate files like |
Honestly, I personally think that using a yaml/json generator language is the best approach for this kind of situation. As @xynydev said, I'm using Jsonnet, but we could easily use something like Pkl to manage the high-level aspects of image generation, and it would make scaling a bunch easier. Like, the user could just add "make images with xfce4" in some fancy way and the recipes would just be generated without any issue. Something like:

```pkl
// Bluebuild Library For Ublue Images (could also make an implementation for VanillaOS)
class Image {
  suffix: String
  base_url: String = "ghcr.io/ublue-os/"
  type: "main" | "nvidia" | "asus" | "surface"
  desktop: "silverblue" | "kinoite"
  ignore_type: Boolean = false // If someone wanna do something fancy
  modules: Listing<String>
  tags: Listing<String>
}

class Meta {
  name: String
  description: String
  images: Mapping<String, Image>
}

// This is what the user is gonna be using
test = new Meta {
  name = "atomic-studio"
  description = "Operating system based on Fedora Atomic meant for content creators and artists"
  images = new Mapping<String, Image> {
    ["gnome"] = (Image) {
      suffix = "-gnome"
      type = "main"
      desktop = "silverblue"
      modules = new Listing<String> { "chungus" }
      tags = new Listing<String> { "latest" "gts" }
    }
  }
}
```

That way we can enforce rules for the recipes directly through the built-in type system, whitespace wouldn't matter, and stuff like that! We can make it both super flexible and strict when we want. It would be a bit more annoying to make everyone install Pkl to test their things, but it could be an interesting alternative to using raw yaml. |
And yet, they changed their mind, which is what spurred this issue in the first place. You know, I'm honestly getting tired of all this bikeshedding. I'm trying to implement features the users are asking for that are functional and can still maintain backwards compatibility. I'm then met with so much pushback and grandiose ideas that far exceed the scope of the problems at hand. |
@gmpinder I apologize if you think this was a waste of time to discuss 😔 I'm not sure where I changed my mind; I still think being able to specify a matrix of base versions would be nice, and I don't see where in that screenshot I contradicted myself. But regardless, if you don't want to implement this, that's okay. |
I wasn't commenting on the number of recipes. I was saying that the "structure is highly efficient because it permits us to factor out and reuse large chunks of yaml dozens of times." In other words, being able to factor out redundant config is a benefit. For the same reason I think being able to have multiple base image versions in the same config file would be a benefit: reduced redundant config. |
But I will find a workaround, please feel free in the future to decline my asks if they're not of interest or out of scope :) |
I'm sorry guys for the rude response. I'm dealing with things IRL and it spilled over here. @qoijjj I am interested in building out a feature like this. @xynydev I'm sorry for being overly defensive. I'm reopening the issue for further discussion. I'm taking a break though so I probably won't respond for a couple weeks. |
Aight, I've been consciously not thinking about this for the past few days, so here are some fresh thoughts outlined:
**Jsonnet example**

I took the Jsonnet configuration from Atomic Studio and did some pretty violent refactoring on it to showcase what a config file might look like for someone just looking to cleanly generate some recipes based on some rules.

```jsonnet
local project = {
  base_name: "atomic-studio",
  description: "Operating system based on Fedora Atomic meant for content creators and artists",
  base_images: "ghcr.io/ublue-os/",
};

local suffix(base_image, nvidia) = (
  (if (base_image == "silverblue") then "-gnome" else "")
  + (if (nvidia) then "-nvidia" else "")
);

local image(base_image, nvidia, image_version) = {
  "name": project.base_name + suffix(base_image, nvidia),
  "description": project.description,
  "base-image": project.base_images + base_image + (if (nvidia) then "-nvidia" else "-main"),
  "image-version": image_version,
  "modules": std.flattenArrays([
    [
      { "from-file": "common/shared/gui-apps.yml" },
      { "from-file": "common/shared/packages.yml" },
      { "from-file": "common/shared/files.yml" },
      { "from-file": "common/shared/scripts.yml" },
      { "from-file": "common/shared/bling.yml" },
      { "from-file": "common/shared/services.yml" },
    ],
    if (nvidia) then [
      { "from-file": "common/shared/nvidia/scripts.yml" },
    ] else [
      { "from-file": "common/shared/amd/packages.yml" },
      { "from-file": "common/shared/amd/scripts.yml" },
    ],
    if (base_image == "silverblue") then [
      { "from-file": "common/gnome/apps.yml" },
      { "from-file": "common/gnome/files.yml" },
      { "from-file": "common/gnome/scripts.yml" },
    ] else [
      { "from-file": "common/plasma/apps.yml" },
      { "from-file": "common/plasma/files.yml" },
      { "from-file": "common/plasma/scripts.yml" },
    ],
    [
      { "from-file": "common/audio/audinux.yml" },
      { "from-file": "common/audio/pipewire-packages.yml" },
      { "type": "yafti" },
      { "type": "signing" },
    ],
  ]),
};

local images() = {
  ["recipe" + suffix(base_image, nvidia) + "-" + std.toString(image_version) + ".yml"]: image(base_image, nvidia, image_version)
  for nvidia in [false, true]
  for base_image in ["kinoite", "silverblue"]
  for image_version in [38, 39]
};

images()
```

This file can then be turned into the separate recipe files:

```shell
❯ jsonnet -m ./ studio.jsonnet
./recipe-38.yml
./recipe-39.yml
./recipe-gnome-38.yml
./recipe-gnome-39.yml
./recipe-gnome-nvidia-38.yml
./recipe-gnome-nvidia-39.yml
./recipe-nvidia-38.yml
./recipe-nvidia-39.yml
```

The filenames are output by the program, so multi-stage GitHub Actions could easily be used to generate the build matrix for the BlueBuild Action. The files are

**Lua example**

I translated the Jsonnet example to Lua. This is my first time using Lua, so I might not be "doing it correctly", but I found that Lua is not that well suited for this purpose. The
```lua
-- studio.lua
json = require "json" -- https://github.com/rxi/json.lua

project = {
  base_name = "atomic-studio",
  description = "Operating system based on Fedora Atomic meant for content creators and artists",
  base_images = "ghcr.io/ublue-os/",
}

function suffix(base_image, nvidia)
  local suffix = ""
  if base_image == "silverblue" then suffix = suffix .. "-gnome" end
  if nvidia then suffix = suffix .. "-nvidia" end
  return suffix
end

-- yeah, lua makes this kinda clumsy... build the module list with a helper,
-- since the `cond and value` trick would insert boolean `false` entries
function extend(list, items)
  for _, item in ipairs(items) do table.insert(list, item) end
end

for _, nvidia in ipairs({true, false}) do
  for _, base_image in ipairs({"kinoite", "silverblue"}) do
    for _, image_version in ipairs({38, 39}) do
      local modules = {
        { from_file = "common/shared/gui-apps.yml" },
        { from_file = "common/shared/packages.yml" },
        { from_file = "common/shared/files.yml" },
        { from_file = "common/shared/scripts.yml" },
        { from_file = "common/shared/bling.yml" },
        { from_file = "common/shared/services.yml" },
      }
      if nvidia then
        extend(modules, {
          { from_file = "common/shared/nvidia/scripts.yml" },
        })
      else
        extend(modules, {
          { from_file = "common/shared/amd/packages.yml" },
          { from_file = "common/shared/amd/scripts.yml" },
        })
      end
      if base_image == "silverblue" then
        extend(modules, {
          { from_file = "common/gnome/apps.yml" },
          { from_file = "common/gnome/files.yml" },
          { from_file = "common/gnome/scripts.yml" },
        })
      else
        extend(modules, {
          { from_file = "common/plasma/apps.yml" },
          { from_file = "common/plasma/files.yml" },
          { from_file = "common/plasma/scripts.yml" },
        })
      end
      extend(modules, {
        { from_file = "common/audio/audinux.yml" },
        { from_file = "common/audio/pipewire-packages.yml" },
        { type = "yafti" },
        { type = "signing" },
      })

      local config = {
        name = project.base_name .. suffix(base_image, nvidia),
        description = project.description,
        base_image = project.base_images .. base_image .. (nvidia and "-nvidia" or "-main"),
        image_version = image_version,
        modules = modules,
      }

      local json_str = json.encode(config):gsub("_", "-")
      local file_path = "./recipe" .. suffix(base_image, nvidia) .. "-" .. image_version .. ".yml"
      print(file_path)
      local f = io.open(file_path, "w")
      f:write(json_str)
      f:close()
    end
  end
end
```

This file can then be turned into the separate recipe files:

```shell
❯ lua studio.lua
./recipe-38.yml
./recipe-39.yml
./recipe-gnome-38.yml
./recipe-gnome-39.yml
./recipe-gnome-nvidia-38.yml
./recipe-gnome-nvidia-39.yml
./recipe-nvidia-38.yml
./recipe-nvidia-39.yml
```

The filenames are output by the program, so multi-stage GitHub Actions could easily be used to generate the build matrix for the BlueBuild Action. The files are

**JS example**

I translated the Jsonnet example to JS. I'm such a webdev that this felt very natural and easy for me, though the line count is marginally bigger and the amount of boilerplate required marginally larger. This could become the most ergonomic way to write multi-recipe configs, if I just quickly made a TS library providing the types and some ergonomic functions for the whole script.
```javascript
// studio.js
import * as fs from "node:fs";
import { join } from "node:path";

const outputDir = "./recipes";
try {
  if (!fs.existsSync(outputDir)) {
    fs.mkdirSync(outputDir);
  }
} catch (err) {
  console.error(err);
  throw new Error();
}

const project = {
  baseName: "atomic-studio",
  description:
    "Operating system based on Fedora Atomic meant for content creators and artists",
  baseImages: "ghcr.io/ublue-os/",
};

const suffix = (baseImage, nvidia) =>
  (baseImage == "silverblue" ? "-gnome" : "") + (nvidia ? "-nvidia" : "");

const files = [];
for (let nvidia of [true, false]) {
  for (let baseImage of ["kinoite", "silverblue"]) {
    for (let imageVersion of [38, 39]) {
      const config = {
        name: project.baseName + suffix(baseImage, nvidia),
        description: project.description,
        "base-image":
          project.baseImages + baseImage + (nvidia ? "-nvidia" : "-main"),
        "image-version": imageVersion,
        modules: [
          { "from-file": "common/shared/gui-apps.yml" },
          { "from-file": "common/shared/packages.yml" },
          { "from-file": "common/shared/files.yml" },
          { "from-file": "common/shared/scripts.yml" },
          { "from-file": "common/shared/bling.yml" },
          { "from-file": "common/shared/services.yml" },
          ...(nvidia
            ? [{ "from-file": "common/shared/nvidia/scripts.yml" }]
            : [
                { "from-file": "common/shared/amd/packages.yml" },
                { "from-file": "common/shared/amd/scripts.yml" },
              ]),
          ...(baseImage == "silverblue"
            ? [
                { "from-file": "common/gnome/apps.yml" },
                { "from-file": "common/gnome/files.yml" },
                { "from-file": "common/gnome/scripts.yml" },
              ]
            : [
                { "from-file": "common/plasma/apps.yml" },
                { "from-file": "common/plasma/files.yml" },
                { "from-file": "common/plasma/scripts.yml" },
              ]),
          { "from-file": "common/audio/audinux.yml" },
          { "from-file": "common/audio/pipewire-packages.yml" },
        ],
      };
      const json = JSON.stringify(config, null, 2);
      const filePath = join(
        outputDir,
        "recipe" + suffix(baseImage, nvidia) + "-" + imageVersion + ".yml"
      );
      try {
        fs.writeFileSync(filePath, json);
        files.push("./" + filePath);
      } catch (err) {
        console.error(err);
        throw new Error();
      }
    }
  }
}

// GitHub Actions needs JSON to generate a build matrix.
console.log(JSON.stringify(files));
```

This file can then be turned into the separate recipe files:

```shell
❯ node studio.js
["./recipes/recipe-nvidia-38.yml","./recipes/recipe-nvidia-39.yml","./recipes/recipe-gnome-nvidia-38.yml","./recipes/recipe-gnome-nvidia-39.yml","./recipes/recipe-38.yml","./recipes/recipe-39.yml","./recipes/recipe-gnome-38.yml","./recipes/recipe-gnome-39.yml"]
```

Can also be run with

The filenames are output as a JSON array, so multi-stage GitHub Actions could easily be used to generate the build matrix for the BlueBuild Action.

A TS library for this could also include the following function I just AI-generated ( 😳 ):

```javascript
// some functional magic an AI wrote that i like 66.666...% understand
const generateMatrix = (matrix) =>
  Object.entries(matrix)
    .map(([key, values]) => values.map((value) => ({ [key]: value })))
    .reduce((a, b) => a.flatMap((d) => b.map((e) => ({ ...d, ...e }))));
```

As that would allow the for-loop mess to be transformed into this:

```javascript
const matrix = {
  baseImage: ["kinoite", "silverblue"],
  nvidia: [true, false],
  imageVersion: [38, 39],
};
for (let { baseImage, nvidia, imageVersion } of generateMatrix(matrix)) {
  // ...
}
```
|
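For what it's worth, that helper does compute the full cartesian product. A quick standalone sanity check (the function is restated from above so this snippet runs on its own with plain Node, no dependencies):

```javascript
// Same generateMatrix helper as above, restated so this snippet is standalone.
const generateMatrix = (matrix) =>
  Object.entries(matrix)
    .map(([key, values]) => values.map((value) => ({ [key]: value })))
    .reduce((a, b) => a.flatMap((d) => b.map((e) => ({ ...d, ...e }))));

const combos = generateMatrix({
  baseImage: ["kinoite", "silverblue"],
  nvidia: [true, false],
  imageVersion: [38, 39],
});

console.log(combos.length); // → 8 (2 × 2 × 2 combinations)
console.log(combos[0]); // first combo pairs the first value of every key
```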
Alright, I think moving in this direction would probably be better. I've taken some time to think this over and look at what we could do. I think that out of all the options specified here, An advantage to this would be to allow the use of Like @xynydev said, this would help to keep the individual recipe files simple for less technical users while also opening up an avenue to give power users more options to better automate image building. |
I dislike Jsonnet syntax, and I think supporting multiple options would be great. JS with Deno could also be integrated into Rust, and I think I could make a pretty nice library for it. You are free to work on CLI integration, but that is not a priority for me, as this would need changes in the

I think the course of action, in order of importance regarding this issue, would be to:
|
As much as I like and am used to JS, I think including an entire JS (or Lua) runtime is really overkill for the requirement. Jsonnet or Pkl would be perfect, as they are designed specifically for the purpose of dynamic configuration file generation. Unfortunately, Pkl is very new and does not have any Rust bindings, so that is out of the question, which only really leaves Jsonnet. |
Well, there is no reason to include an entire JS runtime, then. It doesn't have to be integrated into the Rust-based CLI. Integrating Jsonnet to give JSON command-line output would be great. I think documenting multiple options would be great, so that people can pick their favorites. |
So this would be something for creating matrices in GHA? Also sounds like something that I could make for GitLab CI. There's a way to generate another CI yaml file with more jobs that contain the artifacts of the previous job (in this case, the newly generated recipes). Would there be a way to pass the recipe files to the new jobs in GitHub? 'Cause we could move recipes from the Jsonnet generate job and pass the paths to the recipes to the new jobs. |
Yup!
Yeah, we can figure that out for as many CI systems as we please.
Like, keeping the recipe files between jobs? 'Cause I tried looking for options when making this Atomic Studio PR, but found that the least complicated option would just be to regenerate the recipes in the second job. |
I made a short TS library file for the JS/TS configuration, and I think that made it quite nice. The recipe config is also typed, so using a compatible editor makes the experience very nice. I'll hold off on making this anything official, though, until #138 is done and we establish an examples repo.

**example.ts**

```typescript
import { Recipe, generateMatrix, saveRecipes } from "./bluebuild";

const project = {
  baseName: "atomic-studio",
  description:
    "Operating system based on Fedora Atomic meant for content creators and artists",
  baseImages: "ghcr.io/ublue-os/",
};

const suffix = (baseImage, nvidia) =>
  (baseImage == "silverblue" ? "-gnome" : "") + (nvidia ? "-nvidia" : "");

const matrix = {
  baseImage: ["kinoite", "silverblue"],
  nvidia: [true, false],
  imageVersion: [38, 39],
};

const recipes = generateMatrix(matrix).map(
  ({ baseImage, nvidia, imageVersion }): Recipe => {
    return {
      name: project.baseName + suffix(baseImage, nvidia),
      description: project.description,
      "base-image":
        project.baseImages + baseImage + (nvidia ? "-nvidia" : "-main"),
      "image-version": imageVersion,
      modules: [
        { "from-file": "common/shared/gui-apps.yml" },
        { "from-file": "common/shared/packages.yml" },
        { "from-file": "common/shared/files.yml" },
        { "from-file": "common/shared/scripts.yml" },
        { "from-file": "common/shared/bling.yml" },
        { "from-file": "common/shared/services.yml" },
        ...(nvidia
          ? [{ "from-file": "common/shared/nvidia/scripts.yml" }]
          : [
              { "from-file": "common/shared/amd/packages.yml" },
              { "from-file": "common/shared/amd/scripts.yml" },
            ]),
        ...(baseImage == "silverblue"
          ? [
              { "from-file": "common/gnome/apps.yml" },
              { "from-file": "common/gnome/files.yml" },
              { "from-file": "common/gnome/scripts.yml" },
            ]
          : [
              { "from-file": "common/plasma/apps.yml" },
              { "from-file": "common/plasma/files.yml" },
              { "from-file": "common/plasma/scripts.yml" },
            ]),
        { "from-file": "common/audio/audinux.yml" },
        { "from-file": "common/audio/pipewire-packages.yml" },
        { type: "signing" },
      ],
    };
  }
);

saveRecipes(recipes, "./recipes");
```

```shell
❯ bun run example.ts
["./recipes/recipe-atomic-studio-nvidia-38.json","./recipes/recipe-atomic-studio-nvidia-39.json","./recipes/recipe-atomic-studio-38.json","./recipes/recipe-atomic-studio-39.json","./recipes/recipe-atomic-studio-gnome-nvidia-38.json","./recipes/recipe-atomic-studio-gnome-nvidia-39.json","./recipes/recipe-atomic-studio-gnome-38.json","./recipes/recipe-atomic-studio-gnome-39.json"]
```

The recipe filenames use the
|
Ok... I think I found the perfect solution. It's called RCL. Here's a blog post by the author. It's written in Rust, it's nice to use, and the dev @ruuda seems to have good ideas. The configuration reads well and does exactly what we want directly out of the box. Integrating it with the CLI would likely be possible, if not trivial (there's already at least a Python module being built from the Rust codebase). The biggest issues I see are: lack of VSCode syntax highlighting (it has grammars, so this can be solved, see e.g. anycode), and it being a hobby project without stability guarantees (but so are we, and I don't think that should prevent us from using a really good tool). Here's an example configuration I made with RCL:
Here's how it builds:
Here's an example of a generated recipe:

```json
{
  "alt-tags": ["gts"],
  "base-image": "ghcr.io/ublue-os/silverblue-nvidia",
  "description": "My test project.\n(silverblue-nvidia edition, GTS version)\n",
  "image-version": 40,
  "modules": [
    {
      "files": [
        {"destination": "/", "source": "system"},
        {"destination": "/", "source": "system_nvidia"}
      ],
      "type": "files"
    },
    {
      "type": "default-flatpaks",
      "user": {"install": ["org.kde.krita", "org.fedoraproject.MediaWriter"]}
    },
    {"script": "nvidia-setup.sh", "type": "script"},
    {"type": "signing"}
  ],
  "name": "test-silverblue-nvidia"
}
```

(It can't generate YAML, which is fine. It also puts the keys of an object in a wonky order, but that doesn't really matter either.)

I also came to think that with Nushell one could probably achieve the same thing, but nested each pipelines would be kind of odd, and that's not really something worth including in BlueBuild directly. I'm planning to transform the different things tested in this issue into examples for the examples repo, so that's maybe one contender for that, if I get around to it. |
So there was some conversation in Discord and some in a discussion about trying to have support for a recipe building multiple versions of itself. For example, @qoijjj has mentioned that it would be much easier to support the transition of SecureBlue to Fedora 40 if all that needed to be done was to mark each recipe to build both a `39` version and a `40` version.

My proposal is to allow the existing `image-version` property in the top level to accept both a single value and an array of values. This would allow existing recipes to continue working and allow scaling up builds. So an example recipe could look something like this:

The CLI would then go through and build 2 separate images, one based on 39 and one on 40. All tags that are currently created contain the version of Fedora that built them, except `latest`. In this instance, we would set `latest` to the highest version that is being built. This could also open the door to supporting a `gts` tag for the second highest version.

Now there is a possibility that there will be recipes that have module definitions that make them incompatible across versions. At this point, it would be up to the user to separate out the recipes and manage these changes. Or we can support having a `for-version` or some property for each module that makes it so that you can cordon certain modules to only be run on a specific version. Like:
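To make the proposal concrete, here is a hypothetical sketch of a recipe using both ideas — `image-version` as an array plus the proposed (unimplemented) `for-version` module property. All other values are made up for illustration:

```yaml
# recipe.yml — hypothetical sketch of the proposal, not working syntax
name: my-image
description: Example recipe building two Fedora versions
base-image: ghcr.io/ublue-os/silverblue-main
image-version: [39, 40] # builds two images; 40 would also be tagged `latest`
modules:
  - from-file: common/packages.yml
  - type: script
    scripts:
      - fedora-40-migration.sh
    for-version: 40 # proposed: run this module only on the 40 build
```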