
Entrypoint cancellation is a bit "noisy" and scary #7268

Closed
vdemeester opened this issue Oct 24, 2023 · 1 comment · Fixed by #7272
@vdemeester (Member)

Expected Behavior

When running a PipelineRun (such as the one below) or a TaskRun, I expect to see only the output of my Task when everything goes well.

Actual Behavior

The entrypoint cancellation feature (from #6511) logs an error even in the normal case, i.e. when no cancellation was requested and the step simply finished:

[add-uid : uid] {"level":"error","ts":1698142443.0641263,"caller":"entrypoint/entrypointer.go:178","msg":"Error while waiting for cancellation{error 26 0  context canceled}","stacktrace":"github.com/tektoncd/pipeline/pkg/entrypoint.Entrypointer.Go.func2\n\tgithub.com/tektoncd/pipeline/pkg/entrypoint/entrypointer.go:178"}
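
For reference, here is a minimal Go sketch (not the actual entrypointer.go code, and not necessarily what #7272 implements) of how the cancellation watcher could treat a plain context.Canceled as a normal shutdown instead of logging it at error level. The waitForCancellation function name and the logger wiring are illustrative assumptions.

package main

import (
	"context"
	"errors"
	"time"

	"go.uber.org/zap"
)

// waitForCancellation blocks until ctx is done and only logs at error level
// when the reason is something other than a plain context.Canceled.
func waitForCancellation(ctx context.Context, logger *zap.SugaredLogger) {
	<-ctx.Done()
	err := ctx.Err()
	if errors.Is(err, context.Canceled) {
		// Normal shutdown: the step finished and the parent context was
		// canceled, so there is nothing alarming to report.
		logger.Debugw("cancellation watcher stopped", "reason", err)
		return
	}
	// Anything else (e.g. a deadline being exceeded) is still worth an error log.
	logger.Errorw("Error while waiting for cancellation", "error", err)
}

func main() {
	logger := zap.NewExample().Sugar()
	ctx, cancel := context.WithCancel(context.Background())
	go waitForCancellation(ctx, logger)
	cancel()                           // simulate the normal "step finished" path
	time.Sleep(100 * time.Millisecond) // give the watcher time to observe the cancel
}

With a filter like this, the normal completion path would produce at most a debug message rather than the error-level stacktrace shown above.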

Steps to Reproduce the Problem

cat <<'EOF' | kubectl create -f -
---
apiVersion: tekton.dev/v1
kind: Task
metadata:
  name: uid-task
spec:
  results:
    - name: uid
  steps:
    - name: uid
      image: alpine
      command: ["/bin/sh", "-c"]
      args:
        - echo "1001" | tee $(results.uid.path)
---
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
  name: uid-pipeline-run
spec:
  pipelineSpec:
    tasks:
    - name: add-uid
      taskRef:
        name: uid-task
    - name: show-uid
      taskSpec:
        steps:
          - name: show-uid
            image: alpine
            command: ["/bin/sh", "-c"]
            args:
              - echo $(tasks.add-uid.results.uid)
EOF
1. Run the above Tekton PipelineRun.
2. Look at the logs.

Additional Info

  • Kubernetes version: 1.26

    Output of kubectl version:

    (paste your output here)

  • Tekton Pipeline version: main branch

    Output of tkn version or kubectl get pods -n tekton-pipelines -l app=tekton-pipelines-controller -o=jsonpath='{.items[0].metadata.labels.version}'

    (paste your output here)
@vdemeester added the kind/bug label on Oct 24, 2023
@chengjoey (Member)

/assign
