[TEP-0050] Add OnError field #7162

Changes from all commits
@@ -21,6 +21,8 @@ weight: 203
- [Tekton Bundles](#tekton-bundles)
- [Using the `runAfter` field](#using-the-runafter-field)
- [Using the `retries` field](#using-the-retries-field)
- [Using the `onError` field](#using-the-onerror-field)
- [Produce results with `OnError`](#produce-results-with-onerror)
- [Guard `Task` execution using `when` expressions](#guard-task-execution-using-when-expressions)
- [Guarding a `Task` and its dependent `Tasks`](#guarding-a-task-and-its-dependent-tasks)
- [Cascade `when` expressions to the specific dependent `Tasks`](#cascade-when-expressions-to-the-specific-dependent-tasks)

@@ -606,6 +608,106 @@ tasks:
      name: build-push
```
### Using the `onError` field

> :seedling: **Specifying `onError` in `PipelineTasks` is an [alpha](additional-configs.md#alpha-features) feature.** The `enable-api-fields` feature flag must be set to `"alpha"` to specify `onError` in a `PipelineTask`.

> :seedling: This feature is in **Preview Only** mode and is not yet supported/implemented.
When a `PipelineTask` fails, the remaining `PipelineTasks` are skipped and the `PipelineRun` is declared a failure. If you would like to
ignore such a `PipelineTask` failure and continue executing the rest of the `PipelineTasks`, you can specify `onError` for that `PipelineTask`.

`onError` can be set to `stopAndFail` (the default) or `continue`. A `PipelineTask` that fails with `stopAndFail` stops and fails the whole `PipelineRun`. A `PipelineTask` that fails with `continue` does not fail the whole `PipelineRun`, and the rest of the `PipelineTasks` continue to execute.
To ignore a `PipelineTask` failure, set `onError` to `continue`:
```yaml
apiVersion: tekton.dev/v1
kind: Pipeline
metadata:
  name: demo
spec:
  tasks:
    - name: task1
      onError: continue
      taskSpec:
        steps:
          - name: step1
            image: alpine
            script: |
              exit 1
```
At runtime, the failure is ignored when determining the `PipelineRun` status. The `PipelineRun` `message` contains the ignored failure info:
```yaml
status:
  conditions:
  - lastTransitionTime: "2023-09-28T19:08:30Z"
    message: 'Tasks Completed: 1 (Failed: 1 (Ignored: 1), Cancelled 0), Skipped: 0'
    reason: Succeeded
    status: "True"
    type: Succeeded
  ...
```
Note that the `TaskRun` status is left unchanged, as it is irrelevant to `onError`. A failed but ignored `TaskRun` results in a `failed` status with reason
`FailureIgnored`.
For example, the `TaskRun` created by the above `PipelineRun` has the following status:
```bash
$ kubectl get tr demo-run-task1
NAME             SUCCEEDED   REASON           STARTTIME   COMPLETIONTIME
demo-run-task1   False       FailureIgnored   12m         12m
```
To specify `onError` for a `step`, please see [specifying onError for a step](./tasks.md#specifying-onerror-for-a-step).
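For reference, a step-level `onError` sketch looks like the following; the `Task` name and step contents here are illustrative, not part of this PR:

```yaml
apiVersion: tekton.dev/v1
kind: Task
metadata:
  name: step-onerror-demo  # illustrative name
spec:
  steps:
    - name: may-fail
      image: alpine
      onError: continue  # this step's non-zero exit code is ignored
      script: |
        exit 1
    - name: runs-anyway
      image: alpine
      script: |
        echo "the previous step's failure was ignored"
```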
**Note:** Setting [`retries`](#using-the-retries-field) and `onError: continue` at the same time is **NOT** allowed.
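For example, a `PipelineTask` like the following sketch would be rejected at validation time (names are illustrative):

```yaml
# INVALID: retries greater than 0 cannot be combined with onError: continue
tasks:
  - name: task1
    retries: 1
    onError: continue
    taskSpec:
      steps:
        - name: step1
          image: alpine
          script: |
            exit 1
```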
### Produce results with `OnError`
When a `PipelineTask` is set to ignore errors and it is able to initialize a result before failing, the result is made available to consuming `PipelineTasks`.
```yaml
tasks:
  - name: task1
    onError: continue
    taskSpec:
      results:
        - name: result1
      steps:
        - name: step1
          image: alpine
          script: |
            echo -n 123 | tee $(results.result1.path)
            exit 1
```
The consuming `PipelineTasks` can access the result by referencing `$(tasks.task1.results.result1)`.
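As a sketch, a consuming `PipelineTask` could pass the result in as a parameter; `task2` and its parameter name here are illustrative:

```yaml
tasks:
  # ... task1 from the example above writes result1 before failing ...
  - name: task2
    params:
      - name: input
        value: $(tasks.task1.results.result1)
    taskSpec:
      params:
        - name: input
      steps:
        - name: print-result
          image: alpine
          script: |
            echo "$(params.input)"
```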
If the result is **NOT** initialized before failing, and there is a `PipelineTask` consuming it:
```yaml
tasks:
  - name: task1
    onError: continue
    taskSpec:
      results:
        - name: result1
      steps:
        - name: step1
          image: alpine
          script: |
            exit 1
            echo -n 123 | tee $(results.result1.path)
```
- If the consuming `PipelineTask` has `OnError:stopAndFail`, the `PipelineRun` will fail with `InvalidTaskResultReference`.
- If the consuming `PipelineTask` has `OnError:continue`, the consuming `PipelineTask` will be skipped with reason `Results were missing`, and the `PipelineRun` will continue to execute.

> **Reviewer:** Just confirming: this would ensure that all the subsequent `Tasks` that in turn depend on this skipped `Task` would also be skipped, right?
>
> **Author:** Yeah, and to be more specific: all subsequent resource-dependent tasks (with
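Putting the two cases together, the second bullet corresponds to a sketch like the following, where `task1` fails before writing `result1` and the consuming `task2` (illustrative) also sets `onError: continue`:

```yaml
tasks:
  - name: task1
    onError: continue
    taskSpec:
      results:
        - name: result1
      steps:
        - name: step1
          image: alpine
          script: |
            exit 1  # fails before result1 is written
  - name: task2
    onError: continue  # task2 is skipped with reason "Results were missing"
    params:
      - name: input
        value: $(tasks.task1.results.result1)
    taskSpec:
      params:
        - name: input
      steps:
        - name: step1
          image: alpine
          script: |
            echo "$(params.input)"
```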
### Guard `Task` execution using `when` expressions
To run a `Task` only when certain conditions are met, it is possible to _guard_ task execution using the `when` field. The `when` field allows you to list a series of references to `when` expressions.
@@ -73,6 +73,98 @@ func TestPipelineTask_ValidateName(t *testing.T) {

> **Reviewer:** Why is the test in this file and not in pipeline_validation_test.go?
>
> **Author:** This is a good question, and it seems like the boundary of the two files is very blurry... But we do have existing tests against
> Maybe we should think about refactoring all the tests against a field into one place in the future.
>
> **Reviewer:** Yeah! I think we need to refactor to move the tests there; not in this PR, it could be a follow-up cleanup PR.

```go
	}
}

func TestPipelineTask_OnError(t *testing.T) {
	tests := []struct {
		name          string
		p             PipelineTask
		expectedError *apis.FieldError
		wc            func(context.Context) context.Context
	}{{
		name: "valid PipelineTask with onError:continue",
		p: PipelineTask{
			Name:    "foo",
			OnError: PipelineTaskContinue,
			TaskRef: &TaskRef{Name: "foo"},
		},
		wc: cfgtesting.EnableAlphaAPIFields,
	}, {
		name: "valid PipelineTask with onError:stopAndFail",
		p: PipelineTask{
			Name:    "foo",
			OnError: PipelineTaskStopAndFail,
			TaskRef: &TaskRef{Name: "foo"},
		},
		wc: cfgtesting.EnableAlphaAPIFields,
	}, {
		name: "invalid OnError value",
		p: PipelineTask{
			Name:    "foo",
			OnError: "invalid-val",
			TaskRef: &TaskRef{Name: "foo"},
		},
		expectedError: apis.ErrInvalidValue("invalid-val", "OnError", "PipelineTask OnError must be either \"continue\" or \"stopAndFail\""),
		wc:            cfgtesting.EnableAlphaAPIFields,
	}, {
		name: "OnError:stopAndFail and retries coexist - success",
		p: PipelineTask{
			Name:    "foo",
			OnError: PipelineTaskStopAndFail,
			Retries: 1,
			TaskRef: &TaskRef{Name: "foo"},
		},
		wc: cfgtesting.EnableAlphaAPIFields,
	}, {
		name: "OnError:continue and retries coexist - failure",
		p: PipelineTask{
			Name:    "foo",
			OnError: PipelineTaskContinue,
			Retries: 1,
			TaskRef: &TaskRef{Name: "foo"},
		},
		expectedError: apis.ErrGeneric("PipelineTask OnError cannot be set to \"continue\" when Retries is greater than 0"),
		wc:            cfgtesting.EnableAlphaAPIFields,
	}, {
		name: "setting OnError in beta API version - failure",
		p: PipelineTask{
			Name:    "foo",
			OnError: PipelineTaskContinue,
			TaskRef: &TaskRef{Name: "foo"},
		},
		expectedError: apis.ErrGeneric("OnError requires \"enable-api-fields\" feature gate to be \"alpha\" but it is \"beta\""),
		wc:            cfgtesting.EnableBetaAPIFields,
	}, {
		name: "setting OnError in stable API version - failure",
		p: PipelineTask{
			Name:    "foo",
			OnError: PipelineTaskContinue,
			TaskRef: &TaskRef{Name: "foo"},
		},
		expectedError: apis.ErrGeneric("OnError requires \"enable-api-fields\" feature gate to be \"alpha\" but it is \"stable\""),
		wc:            cfgtesting.EnableStableAPIFields,
	}}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			ctx := context.Background()
			if tt.wc != nil {
				ctx = tt.wc(ctx)
			}
			err := tt.p.Validate(ctx)
			if tt.expectedError == nil {
				if err != nil {
					t.Error("PipelineTask.Validate() returned error for valid pipeline task")
				}
			} else {
				if err == nil {
					t.Error("PipelineTask.Validate() did not return error for invalid pipeline task with OnError")
				}
				if d := cmp.Diff(tt.expectedError.Error(), err.Error(), cmpopts.IgnoreUnexported(apis.FieldError{})); d != "" {
					t.Errorf("PipelineTask.Validate() errors diff %s", diff.PrintWantGot(d))
				}
			}
		})
	}
}

func TestPipelineTask_ValidateRefOrSpec(t *testing.T) {
	tests := []struct {
		name string
```
> **Reviewer:** What about the case when the `Task` produces a result that may be needed downstream or by the pipeline?
>
> **Reviewer:** Wouldn't setting it to just `continue` without providing a value for the outputs of the task make it fail?
>
> **Reviewer:** Sorry! I spoke too soon. You already go over that use case below.