The MicroProfile Compliance Pipeline
The purpose of the MicroProfile Compliance pipeline is to simplify pulling together the evidence needed to claim MicroProfile compliance. It takes advantage of new functionality added to our test framework to gather the `.adoc` evidence files into a single location, eliminating the need to manually extract each one from its corresponding FAT output zip.
The pipeline itself runs all of the TCKs* against both the OL and OL-Beta images, each on Ubuntu 24 with IBM Semeru 8, 11, 17, and 21, giving a total of 8 platforms.

\* NB: This means there are non-MicroProfile TCKs running in addition to all of the MicroProfile TCKs.
The creation of this pipeline included changes to the CI Orchestrator and the TCKResultsWriter.
FATs/TCKs can now write any additional files they wish to collate to an `extras` directory alongside the standard FAT output directories of `results` and `servers`. This `extras` directory will be included in the FAT output zip, but its contents can also be uploaded to a different location on LibertyFS if the pipeline is configured to do so.
The TCKResultsWriter has been updated to automatically write the `.adoc` files to an `extras` directory, and any FAT can be updated to create the `extras` directory and write files to it.
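As an illustration, a FAT could create the `extras` directory and write a file into it along these lines. This is a minimal sketch only: the class name, method name, and file contents are hypothetical, and for TCK buckets the TCKResultsWriter already does this for you.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ExtrasWriter {

    // Write an evidence file into the FAT's extras directory, creating it if needed.
    // outputRoot stands in for the FAT output root that also holds results/ and servers/.
    public static Path writeExtra(Path outputRoot, String fileName, String content) throws IOException {
        Path extras = outputRoot.resolve("extras");
        Files.createDirectories(extras);      // safe if the directory already exists
        Path file = extras.resolve(fileName);
        Files.writeString(file, content);     // the file ends up in the FAT output zip
        return file;
    }

    public static void main(String[] args) throws IOException {
        Path root = Files.createTempDirectory("fat-output");
        Path adoc = writeExtra(root, "fat_1.adoc", "= TCK Results\n");
        System.out.println("Wrote " + adoc);
    }
}
```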
Setting the `upload.all.fat.logs` property to `true` on any pipeline will result in additional uploads to LibertyFS:
- The contents of the `extras` directory.
- The FAT's JUnit XML.
- The FAT's output zip.
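For instance, on a pipeline configured via a Java-properties-style file this would look like the following (only the property name and value come from above; the file format is illustrative):

```properties
# Enable the additional LibertyFS uploads described above
upload.all.fat.logs=true
```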
The upload location of the extra output differs by trigger/pipeline:
- Non-MicroProfile Compliance pipeline: written to that pipeline's standard LibertyFS location.
- MicroProfile Compliance manual trigger: manually configured by the requestor.
- MicroProfile Compliance scheduled trigger: written to `/liberty/dev/Xo/platforms/ciorchestrator/`.
An example of how the extra uploaded files are laid out on LibertyFS for a MicroProfile Compliance scheduled trigger is shown below:
```
/liberty/dev/Xo/platforms/ciorchestrator/
    [PIPELINE_ID]/
        fat_1.xml
        fat_1.zip
        fat_2.xml
        fat_2.zip
        extras/
            fat_1.adoc
            fat_2.adoc
```
As the pipeline runs 4 platforms against both the OL and OL-Beta images, there would be 8 directories, one for each pipeline, and hence 8 `extras` directories from which to collate the full set of extra files (eventually work will be done to do this automatically).
NB: The CI Orchestrator performs this extra uploading; the functionality is not available in RTC builds.
There are two ways the MicroProfile Compliance Pipeline is requested.
The scheduled trigger is classed as Liberty Monitored and so can be found on the Liberty Monitored Pipelines page, meaning the Pipeline Monitors will triage it.
The pipeline is invoked weekly and runs against the latest good Release pipeline, meaning the OL/OL-Beta images and the FATs it compiled are the artifacts being tested. The extra output will be uploaded to the Release pipeline's directory on LibertyFS.
The manual trigger is classed as Liberty Dev and so can be found on the Liberty Developer Pipelines page.
Using this trigger allows the pipeline to be run against a specific existing pipeline as configured by the requestor.
The primary use case is for the MicroProfile team to run the pipeline against a specific Release pipeline, such as a GM driver, to collect the evidence for demonstrating OL's MicroProfile Compliance.
To launch the pipeline manually the following properties must be set:
- `product_image_artifact_execution_id` - the UUID of the build which compiled the OL/OL-Beta images that will be tested for compliance. For a Release pipeline it can be found on its Pipeline Analysis page, in the Steps tab, for the step "RTC-Parent" (there's an ID column which contains a button to copy the UUID to the clipboard).
- `bucket_image_artifact_execution_id` - the UUID of the build that compiled the `fat_tck`s you wish to use to verify Liberty. This can be any build where the `fat_tck` downloads the correct TCK from Maven.
- `changes_summary_artifact_execution_id` - the UUID of the build which determined what new changes were delivered into it. Generally, set it to the same value as `product_image_artifact_execution_id`.
- `outputPath` - the path on LibertyFS to which the extra output will be uploaded. When verifying the compliance of a GM or Release build this should be set to that build's directory on LibertyFS (plus a sensible suffix such as "mp_compliance"), as this ensures the extra output (and thus the evidence) is stored alongside it; because GM drivers are never deleted, the evidence is retained permanently.
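Put together, a manual request might set the four properties like this. The UUIDs and release-build directory below are placeholders only; the real values come from the Pipeline Analysis page as described above.

```properties
# All values are placeholders, not real IDs
product_image_artifact_execution_id=00000000-0000-0000-0000-000000000001
bucket_image_artifact_execution_id=00000000-0000-0000-0000-000000000002
changes_summary_artifact_execution_id=00000000-0000-0000-0000-000000000001
outputPath=/liberty/dev/Xo/release/[RELEASE_BUILD]/mp_compliance/
```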
Example outputPath
When running against a GM/Release build, it is suggested that `outputPath` is set to a value similar to:
```
/liberty/dev/Xo/release/cl241020240902-1102-_M_AqQGkSEe-cMOmDIehu1Q/mp_compliance/
```
This will result in the following file layout, which keeps the 8 directories named with pipeline IDs (only one is shown below) from being lost amongst the rest of the build's output:
```
/liberty/dev/Xo/release/cl241020240902-1102-_M_AqQGkSEe-cMOmDIehu1Q/
    [MANY BUILD FILES/DIRS]
    mp_compliance/
        [PIPELINE_ID]/
            fat_1.xml
            fat_1.zip
            fat_2.xml
            fat_2.zip
            extras/
                fat_1.adoc
                fat_2.adoc
```
Two examples of the file layout on LibertyFS are provided above. To collect all of the evidence needed for MicroProfile compliance, simply retrieve the `.adoc` files from within the `extras` directory of each of the 8 pipeline directories.
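Until the collation is automated, this gathering step can be scripted. A rough sketch is below; the `extras` directory layout is taken from the examples above, while the class and method names are hypothetical. Because the same FAT names recur across the 8 pipeline directories, each copy is prefixed with its `[PIPELINE_ID]` directory name to avoid collisions.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.List;
import java.util.stream.Stream;

public class CollectEvidence {

    // Gather every .adoc file found in an extras/ directory anywhere below root
    // into dest, prefixing each copy with its pipeline directory name.
    public static List<Path> collect(Path root, Path dest) throws IOException {
        Files.createDirectories(dest);
        List<Path> adocs;
        try (Stream<Path> paths = Files.walk(root)) {
            adocs = paths.filter(p -> p.getFileName().toString().endsWith(".adoc"))
                         .filter(p -> p.getParent() != null
                                   && "extras".equals(p.getParent().getFileName().toString()))
                         .toList();
        }
        for (Path p : adocs) {
            Path pipelineDir = p.getParent().getParent();   // [PIPELINE_ID]/extras/fat_x.adoc
            String name = pipelineDir.getFileName() + "_" + p.getFileName();
            Files.copy(p, dest.resolve(name), StandardCopyOption.REPLACE_EXISTING);
        }
        return adocs;
    }
}
```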
Parts of the FAT come from `product_image_artifact_execution_id` (see below) and parts come from `bucket_image_artifact_execution_id`. If these are too different you will get exceptions.
The other thing to note is that we only store product image URLs for 10 days after we are first told about them. So if you request the pipeline more than 10 days after the build in question (such as the GM release build), the pipeline will hang until it times out.
Once you have downloaded your results you must do the following things:
First you must replace every instance of the Open Liberty version string with the exact version of the image you are verifying. For example, I had to replace `24.0.0.9-beta` with `24.0.0.9-beta-cl240820240709-0302`.
- Always get the version number from the release team.
- Do not forget to update the file name as well as the file contents.
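This replacement can be scripted along the following lines. The version strings match the example above; the class and method names are hypothetical, and this is just one way to update both the contents and the file name in one pass.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class FixVersions {

    // Substitute the exact image version for the generic version string, in both
    // the file contents and the file name (per the note above, the name matters too).
    public static Path fixUp(Path adoc, String oldVersion, String newVersion) throws IOException {
        String fixed = Files.readString(adoc).replace(oldVersion, newVersion);
        Path renamed = adoc.resolveSibling(
                adoc.getFileName().toString().replace(oldVersion, newVersion));
        Files.writeString(renamed, fixed);
        if (!renamed.equals(adoc)) {
            Files.delete(adoc);    // drop the copy carrying the old name
        }
        return renamed;
    }

    public static void main(String[] args) throws IOException {
        // Usage sketch: fix every downloaded .adoc in turn
        for (String arg : args) {
            fixUp(Path.of(arg), "24.0.0.9-beta", "24.0.0.9-beta-cl240820240709-0302");
        }
    }
}
```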
Next, look for the heading "Product Name, Version and download URL (if applicable)"; beneath that, update the link to point to wherever the release team have put a long-lasting copy of the version you are certifying.
Upload Your Results
Your processed adoc files must be uploaded to https://github.com/OpenLiberty/certifications. See https://github.com/OpenLiberty/certifications/blob/prod/README.md for the multi-step process to do so.
If Open Liberty does not yet run the right versions of the TCK you will need to complete these steps:
1. Create an Open Liberty branch.
2. Open `io.openliberty.microprofile.something.x.x_fat_tck/public/tckRunner/tck/pom.xml`.
3. If the version of the TCK you wish to run is in Maven Central, set the correct version numbers.
4. If it is not in Maven Central, you will also have to add a repository statement pointing to where it is. For example:
```xml
<repositories>
    <repository>
        <id>sonatype</id>
        <name>sonatype repository with the staged version of the tcks</name>
        <url>https://oss.sonatype.org/content/repositories/orgeclipsemicroprofile-1705/</url>
    </repository>
</repositories>
```
If you have added an extra repository you will also have to edit `fattest.simplicity/componenttest/topology/utils/tck/settings.xml`. Look for the `<mirrorOf>` element and add an extra entry in the format `!<id from the repository you're adding to pom.xml>`.
For example, here is how mine looked:
```xml
<mirrorOf>!sonatype,!central,!central-mirror,!plugin-central-mirror,external:*</mirrorOf>
```