
ERROR! Invalid callback for stdout specified: awx_display #1448

Closed
davidgleiss opened this issue Mar 15, 2023 · 13 comments
Labels
bug Researched, reproducible, committed to fix

Comments

@davidgleiss

ISSUE TYPE
  • Bug Report
SUMMARY

I cannot start any playbook with ansible-navigator.

OS:

Operating System: Ubuntu 22.04.2 LTS              
Kernel: Linux 5.15.0-67-generic
ANSIBLE-NAVIGATOR VERSION
ansible-navigator 2.2.0
CONFIGURATION
---
ansible-navigator:
  ansible:
    config:
      path: .ansible.cfg
    inventory:
      help: false
      entries:
        - inventory.yaml
  execution-environment:
    pull:
      policy: missing
    container-engine: docker
    enabled: true
    image: quay.io/ansible/awx-ee:latest

  playbook-artifact:
    enable: false

  logging:
    append: true
    file: /tmp/ansible-navigator.log
    level: warning
LOG FILE

(https://gist.github.com/davidgleiss/de768448496cd63f6ef218dcd8c54406)

STEPS TO REPRODUCE
ansible-navigator run test.yaml -m stdout

or

ansible-navigator run test.yaml
EXPECTED RESULTS

Playbook should be executed

ACTUAL RESULTS
[WARNING]: error in 'jsonfile' cache plugin while trying to create cache dir /runner/artifacts/1a0ae560-374c-4084-a8bf-4e3a65e37f31/fact_cache : b"[Errno 13] Permission denied:
'/runner/artifacts/1a0ae560-374c-4084-a8bf-4e3a65e37f31'"
ERROR! Invalid callback for stdout specified: awx_display
ADDITIONAL INFORMATION
Image: awx-ee:latest (primary) (Information about ansible and ansible collections)                                                                                     
---
ansible:
  collections:
    details:
      '@NAMESPACE@.@NAME@': 3.0.1
      amazon.aws: 5.3.0
      ansible.posix: 1.5.1
      ansible.windows: 1.13.0
      awx.awx: 21.13.0
      azure.azcollection: 1.14.0
      google.cloud: 1.1.3
      kubernetes.core: 2.4.0
      openstack.cloud: 2.0.0
      redhatinsights.insights: 1.0.7
      theforeman.foreman: 3.9.0
  version:
    details: ansible [core 2.14.3]
@davidgleiss davidgleiss added bug Researched, reproducible, committed to fix new New issues and PRs to triaged labels Mar 15, 2023
@ssbarnea
Member

@cidrblock The awx_display callback is not part of default ansible callbacks, at least not in current stable (2.14).

ansible-doc -t callback -l |grep awx_display

Still, grepping for awx_display in the navigator source code did return a few results, which makes me wonder whether this is a real bug:

$ rg awx_display
tests/fixtures/integration/actions/config/test_welcome_interactive_specified_config.py/test/3.json:111:235:        " 96│Default stdout callback                                            False       env                                                                                                                                         awx_display",
tests/fixtures/integration/actions/exec/test_stdout_config_file.py/test/2.json:26:34:        "ANSIBLE_STDOUT_CALLBACK=awx_display",
tests/fixtures/integration/actions/exec/test_stdout_file.py/test/2.json:23:34:        "ANSIBLE_STDOUT_CALLBACK=awx_display",
tests/fixtures/integration/actions/config/test_welcome_interactive_specified_config.py/test/1.json:111:177:        " 96│Default stdout callback                                                                                                False                env                  awx_display",

@ssbarnea ssbarnea removed the new New issues and PRs to triaged label Mar 15, 2023
@eamigo
Contributor

eamigo commented Apr 4, 2023

awx_display is part of the ansible-runner source. runner always sets this as the display callback.

These error messages are caused because the /runner directory in the awx-ee image is not readable by the default uid 1000 that the awx-ee container runs under. The creator-ee image is running with uid 0 by default.

awx-ee

$ ansible-navigator --eei quay.io/ansible/awx-ee:latest exec 'id && ls -ld /runner'
----------------------------------------------------------------
Execution environment image and pull policy overview
----------------------------------------------------------------
Execution environment image name:     quay.io/ansible/awx-ee:latest
Execution environment image tag:      latest
Execution environment pull arguments: None
Execution environment pull policy:    tag
Execution environment pull needed:    True
----------------------------------------------------------------
Updating the execution environment
----------------------------------------------------------------
Running the command: podman pull quay.io/ansible/awx-ee:latest
Trying to pull quay.io/ansible/awx-ee:latest...
Getting image source signatures
Copying blob 8ade7cff7401 skipped: already exists  
Copying blob 395af34d3f44 skipped: already exists  
Copying blob 0e3cc96c2c12 skipped: already exists  
Copying blob 520d520a786c skipped: already exists  
Copying blob 29a3845ed67a skipped: already exists  
Copying blob 6dfe62e21d87 skipped: already exists  
Copying blob bab41278e7b5 skipped: already exists  
Copying blob 79fd85aad81b skipped: already exists  
Copying blob 869a9397e506 skipped: already exists  
Copying blob a49629dc91a9 skipped: already exists  
Copying blob 4617f4d10380 skipped: already exists  
Copying blob 4f4fb700ef54 skipped: already exists  
Copying blob 88f46c944f44 skipped: already exists  
Copying blob 44b9a45681ff skipped: already exists  
Copying blob c30ddb8a6ecb skipped: already exists  
Copying blob 17ef803709e7 skipped: already exists  
Copying blob 1a4b50973163 skipped: already exists  
Copying config ab238ad29d done  
Writing manifest to image destination
Storing signatures
ab238ad29d2f2a943db5e0baa89545055104c93a021c7dfa436ca9afdcd9c0fd
uid=1000(1000) gid=0(root) groups=0(root)
drwx------. 4 root root 38 Apr  4 15:15 /runner

creator-ee

$ ansible-navigator --eei ghcr.io/ansible/creator-ee:latest exec 'id && ls -ld /runner'
--------------------------------------------------------------------
Execution environment image and pull policy overview
--------------------------------------------------------------------
Execution environment image name:     ghcr.io/ansible/creator-ee:latest
Execution environment image tag:      latest
Execution environment pull arguments: None
Execution environment pull policy:    tag
Execution environment pull needed:    True
--------------------------------------------------------------------
Updating the execution environment
--------------------------------------------------------------------
Running the command: podman pull ghcr.io/ansible/creator-ee:latest
Trying to pull ghcr.io/ansible/creator-ee:latest...
Getting image source signatures
Copying blob 1aee283f9f89 skipped: already exists  
Copying blob 9d6fa3d0934a skipped: already exists  
Copying blob fb06dcc62ceb skipped: already exists  
Copying blob 57183c2f29e0 skipped: already exists  
Copying blob 56d9972f4094 skipped: already exists  
Copying blob 922af44795ef skipped: already exists  
Copying blob cd7239b7ef68 skipped: already exists  
Copying blob 7b90279e55d6 skipped: already exists  
Copying blob 332c06ce0b25 skipped: already exists  
Copying blob 6af21b828353 skipped: already exists  
Copying blob b0c634a5e21f skipped: already exists  
Copying blob 39ec24445cbc skipped: already exists  
Copying blob 1d48ea3216fe skipped: already exists  
Copying config 804b31126b done  
Writing manifest to image destination
Storing signatures
804b31126b71be9b2010b9e2ec59123603d28358c424c6e60b5f5376ccb7d6c7
uid=0(root) gid=0(root) groups=0(root)
drwx------. 4 root root 38 Apr  4 15:16 /runner

You could try running the container as uid 0 by passing the user id as an additional container option:

ansible-navigator run  --container-options='--user=0' test.yaml
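If the command-line override works, the same setting can be made persistent in the ansible-navigator settings file instead of being passed on every run. A sketch based on the configuration posted above, assuming the `container-options` key (a list of arguments forwarded verbatim to the container engine):

```yaml
---
ansible-navigator:
  execution-environment:
    container-engine: docker
    enabled: true
    image: quay.io/ansible/awx-ee:latest
    # Forwarded to the container engine; runs the EE as root
    # inside the container so /runner is readable.
    container-options:
      - --user=0
```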

@davidgleiss
Author

I still get the same error; changing the uid did not help.

We are trying this on an Ubuntu server (a sort of jump host), which we use by connecting with VS Code and the SSH Remote extension. Connecting with plain SSH (without VS Code) produces the same error.

Running our playbooks locally inside Ubuntu WSL works as expected (awx-ee image without any options).
Unfortunately that does not work with our VPN, so we rely on the jump host when working from home.

@felixfontein

I got the same error message when trying to use ansible-builder 3.0.0 with latest ansible-runner and ansible-navigator with podman (https://github.com/ansible-collections/community.crypto/actions/runs/5027516137). When switching to docker (for both image building and EE execution) the problem went away.

(PR: ansible-collections/community.crypto#606)

@cidrblock
Collaborator

cidrblock commented May 19, 2023

confirming this is an issue with builder 3

ansible/ansible-builder#541

@cidrblock
Collaborator

Version 3.3.1 of ansible-navigator was just released upstream. Mind giving it a run and letting us know if it's better (or worse)? :)

@felixfontein

@cidrblock the 3.3.1 release will only happen if someone approves the release workflow: https://github.com/ansible/ansible-navigator/actions/runs/5032141300

[screenshot: release workflow awaiting approval]

@cidrblock
Collaborator

TY @felixfontein, we disabled the need for approval elsewhere; not sure how this one got missed.

anyway, it's up https://pypi.org/project/ansible-navigator/3.3.1/

@felixfontein

I've tried to use podman instead of docker in ansible-collections/community.sops#144 with the new release, and it worked fine for me!

@shatakshiiii
Contributor

Hey @davidgleiss, ansible-navigator 3.3.1 was released, which sets the -u root option when podman is the container engine. Please give it a run and let us know how it's working now.

@davidgleiss
Author

Hi,

thanks to all for the effort! I just tested version 3.3.1 with podman and it works; I'm very happy about that :)
I can now run playbooks inside our own EE on our jump host.

One "issue" I face now is that I need to run ansible-navigator with sudo; otherwise podman cannot pull the image due to this error:

Error: writing blob: adding layer with blob "sha256:effc4ea612c8cb531b45192df865c42f91ac1fa5da56a4af992e019934c442b9": Error processing tar file(exit status 1): potentially insufficient UIDs or GIDs available in user namespace (requested 0:5 for /usr/bin/write): Check /etc/subuid and /etc/subgid: lchown /usr/bin/write: invalid argument

I found these instructions on how to run podman rootless: https://github.com/containers/podman/blob/main/docs/tutorials/rootless_tutorial.md
Would this be necessary?
I know this error has nothing to do with the original issue, but can somebody point me in the right direction?

Thanks again!

@davidgleiss
Author


I just found the solution to the podman error:

containers/podman#12715 (comment)

sudo usermod --add-subgids 10000-75535 <username>
sudo usermod --add-subuids 10000-75535 <username>
podman system migrate
podman pull ....

Now ansible-navigator runs fine without sudo.
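For anyone hitting the same pull error: the range in the commands above allocates 65536 subordinate IDs per user, which is the size rootless podman conventionally expects in order to map container users (the specific bounds 10000-75535 are just the ones from the linked comment). A quick check of the range size:

```shell
# Size of the subordinate ID range 10000-75535; rootless podman
# conventionally needs at least 65536 IDs per user.
echo $((75535 - 10000 + 1))
# prints 65536
```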

@shatakshiiii
Contributor

shatakshiiii commented May 24, 2023

Thanks @davidgleiss for further investigating this, I'm glad it's working now!
