Pulumi: Improve login flow #1617
Conversation
This looks very nice. Is it ready to test?
Some initial thoughts
This rule enforces that subprocess calls use absolute paths to the executable.
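For illustration, a small sketch of a subprocess call that satisfies that rule by resolving the executable to an absolute path first; using `shutil.which` here is just one possible approach, not necessarily what this codebase does.

```python
import shutil
import subprocess

# Resolve the pulumi executable to an absolute path so the call
# does not rely on a PATH lookup at execution time.
pulumi_path = shutil.which("pulumi")  # e.g. /home/deploydsh/.pulumi/bin/pulumi
if pulumi_path is None:
    raise FileNotFoundError("pulumi executable not found on PATH")

output = subprocess.check_output(
    [pulumi_path, "whoami", "--verbose"],
    stderr=subprocess.PIPE,
    encoding="utf8",
)
```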
When attempting to deploy a new SHM, I hit an (extra long) error, which indicates a failure when it runs `pulumi whoami`:
2023-10-20 13:51:31 [ INFO] Reading project settings from '/home/deploydsh/.config/data_safe_haven/config.yaml'. backend_settings.py:110
2023-10-20 13:51:33 [ INFO] AzureAD tenant ID will be cb94a6f6-ef7a-42ab-bcad-4f0b887cfd3e. config.py:176
2023-10-20 13:51:33 [ INFO] Admin email address will be [email protected]. config.py:183
2023-10-20 13:51:33 [ INFO] IP addresses used by administrators will be ['193.60.220.253/32']. config.py:190
2023-10-20 13:51:33 [ INFO] Fully-qualified domain name will be green.develop.turingsafehaven.ac.uk. config.py:197
2023-10-20 13:51:33 [ INFO] Timezone will be Europe/London. config.py:204
2023-10-20 13:51:34 [ INFO] name: [email protected] (id: 3f1a8e26-eae2-4539-952a-0a6184ec248a azure_cli.py:67
tenant: 4395f4a7-e455-4f95-8a9f-1fbaef6384f9
Is this the Azure account you expect?
[y/N]: y
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ /workspaces/data-safe-haven/data_safe_haven/infrastructure/stack_manager.py:62 in confirm │
│ │
│ 59 │ │ # do this with a minimal dummy stack which also works with the Azure backend. │
│ 60 │ │ # A subprocess call is used here. │
│ 61 │ │ try: │
│ ❱ 62 │ │ │ result = subprocess.check_output( │
│ 63 │ │ │ │ [self.path, "whoami", "--verbose"], │
│ 64 │ │ │ │ stderr=subprocess.PIPE, │
│ 65 │ │ │ │ encoding="utf8", │
│ │
│ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │
│ │ self = <data_safe_haven.infrastructure.stack_manager.PulumiAccount object at 0xffff8df8c8b0> │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │
│ │
│ /usr/local/lib/python3.10/subprocess.py:421 in check_output │
│ │
│ 418 │ │ │ empty = b'' │
│ 419 │ │ kwargs['input'] = empty │
│ 420 │ │
│ ❱ 421 │ return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, │
│ 422 │ │ │ **kwargs).stdout │
│ 423 │
│ 424 │
│ │
│ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │
│ │ kwargs = { │ │
│ │ │ 'stderr': -1, │ │
│ │ │ 'encoding': 'utf8', │ │
│ │ │ 'env': { │ │
│ │ │ │ 'AZURE_STORAGE_ACCOUNT': 'shmgreenbackend', │ │
│ │ │ │ 'AZURE_STORAGE_KEY': │ │
│ │ 'gkUkFf9cFNOJt4m2OkwmRZlsYYW3eNwvZW2CiCOPiJdU1gW4620a3tH7F1YPp+/LEzxEWrVnbikP+A… │ │
│ │ │ │ 'AZURE_KEYVAULT_AUTH_VIA_CLI': 'true' │ │
│ │ │ } │ │
│ │ } │ │
│ │ popenargs = (['/home/deploydsh/.pulumi/bin/pulumi', 'whoami', '--verbose'],) │ │
│ │ timeout = None │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │
│ │
│ /usr/local/lib/python3.10/subprocess.py:526 in run │
│ │
│ 523 │ │ │ raise │
│ 524 │ │ retcode = process.poll() │
│ 525 │ │ if check and retcode: │
│ ❱ 526 │ │ │ raise CalledProcessError(retcode, process.args, │
│ 527 │ │ │ │ │ │ │ │ │ output=stdout, stderr=stderr) │
│ 528 │ return CompletedProcess(process.args, retcode, stdout, stderr) │
│ 529 │
│ │
│ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │
│ │ capture_output = False │ │
│ │ check = True │ │
│ │ input = None │ │
│ │ kwargs = { │ │
│ │ │ 'stdout': -1, │ │
│ │ │ 'stderr': -1, │ │
│ │ │ 'encoding': 'utf8', │ │
│ │ │ 'env': { │ │
│ │ │ │ 'AZURE_STORAGE_ACCOUNT': 'shmgreenbackend', │ │
│ │ │ │ 'AZURE_STORAGE_KEY': │ │
│ │ 'gkUkFf9cFNOJt4m2OkwmRZlsYYW3eNwvZW2CiCOPiJdU1gW4620a3tH7F1YPp+/LEzxEWrVnb… │ │
│ │ │ │ 'AZURE_KEYVAULT_AUTH_VIA_CLI': 'true' │ │
│ │ │ } │ │
│ │ } │ │
│ │ popenargs = (['/home/deploydsh/.pulumi/bin/pulumi', 'whoami', '--verbose'],) │ │
│ │ process = <Popen: returncode: 255 args: ['/home/deploydsh/.pulumi/bin/pulumi', │ │
│ │ 'whoami...> │ │
│ │ retcode = 255 │ │
│ │ stderr = 'error: PULUMI_ACCESS_TOKEN must be set for login during non-interactive │ │
│ │ CLI sess'+5 │ │
│ │ stdout = '' │ │
│ │ timeout = None │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
CalledProcessError: Command '['/home/deploydsh/.pulumi/bin/pulumi', 'whoami', '--verbose']' returned non-zero exit status 255.
During handling of the above exception, another exception occurred:
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ /workspaces/data-safe-haven/data_safe_haven/commands/deploy.py:83 in shm │
│ │
│ 80 │ ] = None, │
│ 81 ) -> None: │
│ 82 │ """Deploy a Safe Haven Management component""" │
│ ❱ 83 │ deploy_shm( │
│ 84 │ │ aad_tenant_id=aad_tenant_id, │
│ 85 │ │ admin_email_address=admin_email_address, │
│ 86 │ │ admin_ip_addresses=admin_ip_addresses, │
│ │
│ ╭─────────────────────────── locals ───────────────────────────╮ │
│ │ aad_tenant_id = 'cb94a6f6-ef7a-42ab-bcad-4f0b887cfd3e' │ │
│ │ admin_email_address = '[email protected]' │ │
│ │ admin_ip_addresses = ['193.60.220.253/32'] │ │
│ │ domain = 'green.develop.turingsafehaven.ac.uk' │ │
│ │ force = None │ │
│ │ timezone = 'Europe/London' │ │
│ ╰──────────────────────────────────────────────────────────────╯ │
│ │
│ /workspaces/data-safe-haven/data_safe_haven/commands/deploy_shm.py:43 in deploy_shm │
│ │
│ 40 │ │ verification_record = graph_api.add_custom_domain(config.shm.fqdn) │
│ 41 │ │ │
│ 42 │ │ # Initialise Pulumi stack │
│ ❱ 43 │ │ PulumiAccount(config).handle_login() │
│ 44 │ │ stack = SHMStackManager(config) │
│ 45 │ │ # Set Azure options │
│ 46 │ │ stack.add_option("azure-native:location", config.azure.location, replace=False) │
│ │
│ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │
│ │ aad_tenant_id = 'cb94a6f6-ef7a-42ab-bcad-4f0b887cfd3e' │ │
│ │ admin_email_address = '[email protected]' │ │
│ │ admin_ip_addresses = ['193.60.220.253/32'] │ │
│ │ config = <data_safe_haven.config.config.Config object at 0xffff8df4b490> │ │
│ │ force = None │ │
│ │ fqdn = 'green.develop.turingsafehaven.ac.uk' │ │
│ │ graph_api = <data_safe_haven.external.api.graph_api.GraphApi object at │ │
│ │ 0xffff8df8c820> │ │
│ │ timezone = 'Europe/London' │ │
│ │ verification_record = 'MS=' │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │
│ │
│ /workspaces/data-safe-haven/data_safe_haven/infrastructure/stack_manager.py:102 in handle_login │
│ │
│ 99 │ │
│ 100 │ def handle_login(self) -> None: │
│ 101 │ │ """Ensure the user is using the DSH Pulumi backend""" │
│ ❱ 102 │ │ if not self.confirm(): │
│ 103 │ │ │ msg = ( │
│ 104 │ │ │ │ "Attempting to login to Pulumi account using" │
│ 105 │ │ │ │ f" container [green]{self.cfg.pulumi.storage_container_name}[/]" │
│ │
│ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │
│ │ self = <data_safe_haven.infrastructure.stack_manager.PulumiAccount object at 0xffff8df8c8b0> │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │
│ │
│ /workspaces/data-safe-haven/data_safe_haven/infrastructure/stack_manager.py:69 in confirm │
│ │
│ 66 │ │ │ │ env=self.env, │
│ 67 │ │ │ ) │
│ 68 │ │ except subprocess.CalledProcessError as exc: │
│ ❱ 69 │ │ │ msg = f"Logging into Pulumi failed.\n{exc}\n{result}" │
│ 70 │ │ │ raise DataSafeHavenPulumiError(msg) from exc │
│ 71 │ │ │
│ 72 │ │ self.logger.info("Confirming Pulumi account details") │
│ │
│ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │
│ │ self = <data_safe_haven.infrastructure.stack_manager.PulumiAccount object at 0xffff8df8c8b0> │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
UnboundLocalError: local variable 'result' referenced before assignment
I think what happens is that `handle_login()` tries to check whether the user is logged in to the right Pulumi account, using `confirm()`. `confirm()` then tries to run `pulumi whoami`. If you aren't already logged in to Pulumi, running `pulumi whoami` makes it try to log you in. By default it will try to log you in to the Pulumi Cloud, which then fails; the relevant error for that is
stderr = 'error: PULUMI_ACCESS_TOKEN must be set for login during non-interactive CLI sess'+5
`PULUMI_ACCESS_TOKEN` shouldn't be necessary when using a non-Pulumi backend, like the Azure one we're using, but at this point `whoami` doesn't know which backend we want to use because we haven't run `login()` yet. The follow-on `UnboundLocalError` is a side effect: `result` is only assigned if `subprocess.check_output()` succeeds, so the `except` block can't reference it (see the sketch after this comment).
I think the current logic will catch people who are logged in to the wrong account, but it doesn't correctly handle people who aren't logged in at all.
On this point: one way around this is to set the …
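As a minimal sketch of the error-handling point above, based only on the `confirm()` code visible in the traceback (so it relies on the class's `self.path`/`self.env` and the project's `DataSafeHavenPulumiError`): `result` is never bound when `check_output()` raises, but the `CalledProcessError` already carries the captured stderr, so the message can be built from the exception instead.

```python
import subprocess

# Sketch only; not necessarily the fix adopted in this PR.
try:
    result = subprocess.check_output(
        [self.path, "whoami", "--verbose"],
        stderr=subprocess.PIPE,
        encoding="utf8",
        env=self.env,
    )
except subprocess.CalledProcessError as exc:
    # Use the exception's own stderr rather than `result`, which is unbound here
    msg = f"Logging into Pulumi failed.\n{exc}\n{exc.stderr}"
    raise DataSafeHavenPulumiError(msg) from exc
```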
Interesting. Setting that environment variable makes … (Maybe Pulumi login is really just about getting a local token so you don't have to authenticate/specify with every command, and the env variables can override this.)
@craddm Seems to work. Setting the appropriate environment variables and passing them to the Pulumi Automation API workspace means you don't need to use …. You can run ….
This is nice actually as it gives us much less code to maintain/make ABI compatible.
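To illustrate the idea, here is a hypothetical sketch of handing the Azure backend settings to an Automation API `LocalWorkspace` via environment variables; the backend URL, container name and storage key are placeholders, and the exact options this PR uses may differ.

```python
import pulumi.automation as auto

# Placeholder values; the storage account name comes from the log above.
env_vars = {
    "AZURE_STORAGE_ACCOUNT": "shmgreenbackend",
    "AZURE_STORAGE_KEY": "<storage-account-key>",
    "AZURE_KEYVAULT_AUTH_VIA_CLI": "true",
}

workspace = auto.LocalWorkspace(
    project_settings=auto.ProjectSettings(
        name="data-safe-haven",
        runtime="python",
        # Point the workspace at the Azure blob backend directly,
        # so no separate `pulumi login` step is required.
        backend=auto.ProjectBackend(url="azblob://<container-name>"),
    ),
    env_vars=env_vars,
)

# whoami-style checks can then go through the workspace itself.
print(workspace.who_am_i())
```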
Co-authored-by: Matt Craddock <[email protected]>
@craddm Good catch, I thought I had fixed that. I have made the AzureCli class a singleton so it shouldn't happen now. Oops, I didn't notice that the change hadn't been pushed. It has been now.
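For context, a minimal sketch of one way to make a class behave as a singleton in Python; the mechanism actually used for AzureCli in this PR may well differ.

```python
class AzureCli:
    """Hypothetical sketch: repeated construction returns the same instance,
    so the 'Is this the Azure account you expect?' prompt only runs once."""

    _instance = None

    def __new__(cls, *args, **kwargs):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance
```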
LGTM 👍
✅ Checklist
- You have given this pull request a meaningful title (e.g. Enable foobar integration rather than 515 foobar).
- Your branch is up-to-date with develop.
- Add '[WIP]' to the title if needed (if you're not yet ready to merge).
- You have run the auto-formatting tools (./tests/AutoFormat_Powershell.ps1 -TargetPath <path to file or directory> for Powershell).

🌂 Related issues
Closes #1511
🔬 Tests
Tested locally by triggering the new prompts