Dev #35: made sure multiple environment configurations can co-exist
JimCircadian committed Jul 31, 2023
1 parent f522289 commit a97cab9
Showing 5 changed files with 159 additions and 157 deletions.
4 changes: 3 additions & 1 deletion .gitignore
@@ -20,8 +20,10 @@ override.tf.json

# Ignore configuration files that contain secrets
terraform/backend.secrets
terraform/backend.*.secrets
terraform/azure.secrets
azfunctions/local.settings.json
terraform/azure.*.secrets
local.settings.json

# Ignore lock files
.terraform.lock.hcl
40 changes: 21 additions & 19 deletions README.md
@@ -9,29 +9,30 @@ You will need to install the following in order to use this package:

- A [`Microsoft Azure`](https://portal.azure.com) account with at least `Contributor` permissions on the `IceNet` subscription

* `Python 3.9` (this is the latest version supported by `Azure Functions`)
* `Python 3.8` or above

## Setup the Azure infrastructure

### Python

Install `Python` requirements with the following:

* `pip install --upgrade pip poetry`
* `poetry install`
* `pip install --upgrade pip setuptools wheel`
* `pip install -r requirements.txt`

### Setup the Terraform backend

* Run the `Terraform` setup script `./setup_terraform.py` like so:

```
./setup_terraform.py -v \
-i [[redacted]] \
-s [[redacted]] \
-g [[redacted]] \
-rg [[redacted]] \
-sa [[accountname]] \
-sc [[containername]]```
-i [[admin_subnets]] \
-s [[subscription_name]] \
-rg [[state_resourcegroupname]] \
-sa [[state_accountname]] \
-sc [[state_containername]] \
[[docker_login]] \
[[notification_email]]
```

**You can specify the environment with `-e [[ENV]]` which defaults to `dev`**
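
For illustration, running the setup script once per environment gives you secrets files that co-exist side by side, because every filename carries the environment name. A minimal sketch of the naming scheme the script writes to (the `prod` environment here is hypothetical):

```python
# Hypothetical environment names; the script's default is "dev".
for env in ("dev", "prod"):
    print(f"terraform/backend.{env}.secrets")  # Terraform backend secrets
    print(f"terraform/azure.{env}.secrets")    # Azure tenancy details
    print(f"terraform.{env}.tfstate")          # remote state key used in blob storage
```
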
@@ -40,21 +41,22 @@ Install `Python` requirements with the following:
* Initialise `Terraform` by running `terraform init` like so:

```
terraform init -backend-config=backend.secrets \
-backend-config='storage_account_name=[[accountname]]' \
-backend-config='container_name=[[containername]]'
terraform init -backend-config=backend.[[ENV]].secrets \
-backend-config='storage_account_name=[[state_accountname]]' \
-backend-config='container_name=[[state_containername]]'
```

* Check the actions that `Terraform` will carry out by running `terraform plan -var-file=azure.secrets`
* Deploy using `Terraform` by running `terraform apply -var-file=azure.secrets`
* Check the actions that `Terraform` will carry out by running `terraform plan -var-file=azure.[[ENV]].secrets`
* Deploy using `Terraform` by running `terraform apply -var-file=azure.[[ENV]].secrets`
* Switch environments by calling `terraform init` again

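
Switching between co-existing environments simply means re-initialising against the other environment's backend secrets file. A minimal sketch of a wrapper that does this (the helper and the `prod` environment are hypothetical, not part of this repository):

```python
# Hypothetical helper: point Terraform at another environment's backend.
import subprocess

def init_environment(env: str, account: str, container: str) -> None:
    subprocess.run(
        [
            "terraform", "init",
            f"-backend-config=backend.{env}.secrets",
            f"-backend-config=storage_account_name={account}",
            f"-backend-config=container_name={container}",
        ],
        cwd="terraform",
        check=True,
    )

init_environment("prod", "[[state_accountname]]", "[[state_containername]]")
```

Depending on your Terraform version, `terraform init` may also need `-reconfigure` when the backend configuration changes.
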
### Interfacing with IceNet pipeline

In order to process `NetCDF` files created by the [IceNet pipeline](https://github.com/antarctica/IceNet-Pipeline), these need to be uploaded to the blob storage created by the `Terraform` commands above.
In order to process `NetCDF` files created by the [IceNet pipeline](https://github.com/icenet-ai/icenet-pipeline), these need to be uploaded to the blob storage created by the `Terraform` commands above.
Follow [the instructions here](https://docs.microsoft.com/en-us/azure/cognitive-services/translator/document-translation/create-sas-tokens) to generate tokens for the blob storage at:

* resource group: `rg-icenetetldev-data`
* storage account: `sticenetetldevdata`
* resource group: `rg-icenet[[ENV]]-data`
* storage account: `sticenet[[ENV]]data`
* storage container: `input`

The SAS token will need: `Create`, `Write`, `Add` and `List` permissions.
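
As a sketch of the upload step, assuming the `azure-storage-blob` Python package and a SAS URL for the `input` container generated as above (the URL and filename below are placeholders, not real values):

```python
from azure.storage.blob import ContainerClient

# Placeholder SAS URL for the `input` container of the data storage account.
container = ContainerClient.from_container_url(
    "https://sticenet[[ENV]]data.blob.core.windows.net/input?<sas-token>"
)

# Upload a NetCDF file produced by the IceNet pipeline.
with open("forecast.nc", "rb") as data:
    container.upload_blob(name="forecast.nc", data=data)
```

The same upload can also be done with tools such as `az storage blob upload` or Azure Storage Explorer.
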
@@ -75,8 +77,8 @@ Other methods are possible (for example interfacing with blob receipts) but thes
In order to provide access to the `NetCDF` files stored in blob storage another SAS token will be needed.
Follow [the instructions here](https://docs.microsoft.com/en-us/azure/cognitive-services/translator/document-translation/create-sas-tokens) to generate tokens for the blob storage at:

* resource group: `rg-icenetetldev-data`
* storage account: `sticenetetldevdata`
* resource group: `rg-icenet[[ENV]]-data`
* storage account: `sticenet[[ENV]]data`
* storage container: `input`

The SAS token will need: `Read` and `List` permissions.
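
For read access the same client class can be used, assuming a read-only SAS URL for the container (again, placeholders rather than real values):

```python
from azure.storage.blob import ContainerClient

container = ContainerClient.from_container_url(
    "https://sticenet[[ENV]]data.blob.core.windows.net/input?<read-sas-token>"
)

# `List` permission allows enumerating the stored NetCDF files...
for blob in container.list_blobs():
    print(blob.name)

# ...and `Read` permission allows downloading one (hypothetical filename).
with open("forecast.nc", "wb") as f_out:
    f_out.write(container.download_blob("forecast.nc").readall())
```
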
266 changes: 133 additions & 133 deletions setup_terraform.py
@@ -11,141 +11,10 @@
from azure.mgmt.storage import StorageManagementClient


def main():
"""Setup initial Azure infrastructure used by Terraform"""
# Disable unnecessarily verbose Azure logging
logging.getLogger("azure.identity._internal").setLevel(logging.ERROR)
logging.getLogger("azure.identity._credentials").setLevel(logging.ERROR)
logging.getLogger("azure.core.pipeline.policies").setLevel(logging.ERROR)

# Parse command line arguments
parser = argparse.ArgumentParser(
description="Initialise the Azure infrastructure needed by Terraform"
)
parser.add_argument(
"-e",
"--environment",
default="dev",
help="Environment name to create, will be used to identify ALL resources (make it short)"
)
parser.add_argument(
"-g",
"--azure-group-id",
type=str,
default="35cf3fea-9d3c-4a60-bd00-2c2cd78fbd4c",
help="ID of an Azure group containing all developers (default is Turing's 'All Users' group).",
)
parser.add_argument(
"-i",
"--user-ip-addresses",
nargs="+",
default=[
"TuringVPN=193.60.220.253/32", # Turing VPN
"BASVPN=194.66.0.0/24", # BAS VPN
],
help="List of CIDRs that users will connect from.",
)
parser.add_argument(
"-rg",
"--azure-resource-group-name",
type=str,
default="rg-icenetetl-terraform",
help="Name of the Azure resource group",
)
parser.add_argument(
"-s",
"--azure-subscription-name",
type=str,
default="IceNet",
help="Name of the Azure subscription being used.",
)
parser.add_argument(
"-sa",
"--azure-storage-account-name",
type=str,
default="sticenetetlterraform",
help="Name of the Azure storage account",
)
parser.add_argument(
"-sc",
"--azure-storage-container-name",
type=str,
default="blob-icenetetl-terraform",
help="Name of the Azure storage container",
)
parser.add_argument(
"-v",
"--verbose",
action="count",
default=0,
help="Verbosity level: each '-v' will increase logging level by one step (default is WARNING).",
)
parser.add_argument(
"docker_username",
help="Login for pulling docker images from Docker Hub"
)
parser.add_argument(
"notification_email",
help="Email for sending notifications"
)
args = parser.parse_args()
user_ip_address_dict = {
item.split("=")[0]: item.split("=")[1] for item in args.user_ip_addresses
}

# Configure logging, increasing verbosity by one level for each 'v'
verbosity = max(logging.WARNING - (10 * args.verbose), 0)
coloredlogs.install(fmt="%(asctime)s %(levelname)8s: %(message)s", level=verbosity)

logging.warning("We'll need a password for docker username {}".format(args.docker_username))
docker_password = getpass.getpass()

# Set Terraform variables
tags = {
"deployed_by": "Python",
"project": "IceNet",
"component": "ETL",
}
resource_group_name = args.azure_resource_group_name
storage_account_name = args.azure_storage_account_name
storage_container_name = args.azure_storage_container_name

# Get common Azure information
credential = InteractiveBrowserCredential()
subscription_id, tenant_id = get_azure_ids(credential, args.azure_subscription_name)

# Configure the Terraform backend
configure_terraform_backend(
credential,
subscription_id,
resource_group_name,
storage_account_name,
storage_container_name,
tags=tags,
)
storage_key = load_terraform_storage_key(
credential, subscription_id, resource_group_name, storage_account_name
)

# Write Terraform configs to file
write_terraform_configs(
subscription_id,
tenant_id,
args.azure_group_id,
user_ip_address_dict,
storage_key,
args.environment,
docker_username=args.docker_username,
docker_password=docker_password,
notification_email=args.notification_email
)


def get_azure_ids(credential, subscription_name):
"""Get subscription and tenant IDs"""
# Connect to Azure clients
subscription_client = SubscriptionClient(credential=credential)
logging.debug("DEBUG: {}".format(subscription_client))

# Check that the Azure credentials are valid
try:
@@ -172,17 +41,18 @@ def write_terraform_configs(
):
"""Write Terraform config files"""
# Backend secrets
backend_secrets_path = os.path.join("terraform", "backend.secrets")
backend_secrets_path = os.path.join("terraform", "backend.{}.secrets".format(environment))
logging.info(f"Writing Terraform backend secrets to {backend_secrets_path}")
backend_secrets = {
"access_key": storage_key,
"key": "terraform.{}.tfstate".format(environment)
}
with open(backend_secrets_path, "w") as f_out:
for key, value in backend_secrets.items():
f_out.write(f'{key} = "{value}"\n')

# Azure secrets
azure_secrets_path = os.path.join("terraform", "azure.secrets")
azure_secrets_path = os.path.join("terraform", "azure.{}.secrets".format(environment))
logging.info(f"Writing Azure tenancy details to {azure_secrets_path}")
azure_vars = {
"subscription_id": subscription_id,
@@ -280,5 +150,135 @@ def load_terraform_storage_key(
return storage_keys.keys[0].value


def main():
"""Setup initial Azure infrastructure used by Terraform"""
# Disable unnecessarily verbose Azure logging
logging.getLogger("azure.identity._internal").setLevel(logging.ERROR)
logging.getLogger("azure.identity._credentials").setLevel(logging.ERROR)
logging.getLogger("azure.core.pipeline.policies").setLevel(logging.ERROR)

# Parse command line arguments
parser = argparse.ArgumentParser(
description="Initialise the Azure infrastructure needed by Terraform"
)
parser.add_argument(
"-e",
"--environment",
default="dev",
help="Environment name to create, will be used to identify ALL resources (make it short)"
)
parser.add_argument(
"-g",
"--azure-group-id",
type=str,
default="35cf3fea-9d3c-4a60-bd00-2c2cd78fbd4c",
help="ID of an Azure group containing all developers (default is Turing's 'All Users' group).",
)
parser.add_argument(
"-i",
"--user-ip-addresses",
nargs="+",
default=[
"TuringVPN=193.60.220.253/32", # Turing VPN
"BASVPN=194.66.0.0/24", # BAS VPN
],
help="List of CIDRs that users will connect from.",
)
parser.add_argument(
"-rg",
"--azure-resource-group-name",
type=str,
default="rg-icenetetl-terraform",
help="Name of the Azure resource group",
)
parser.add_argument(
"-s",
"--azure-subscription-name",
type=str,
default="IceNet",
help="Name of the Azure subscription being used.",
)
parser.add_argument(
"-sa",
"--azure-storage-account-name",
type=str,
default="sticenetetlterraform",
help="Name of the Azure storage account",
)
parser.add_argument(
"-sc",
"--azure-storage-container-name",
type=str,
default="blob-icenetetl-terraform",
help="Name of the Azure storage container",
)
parser.add_argument(
"-v",
"--verbose",
action="count",
default=0,
help="Verbosity level: each '-v' will increase logging level by one step (default is WARNING).",
)
parser.add_argument(
"docker_username",
help="Login for pulling docker images from Docker Hub"
)
parser.add_argument(
"notification_email",
help="Email for sending notifications"
)
args = parser.parse_args()
user_ip_address_dict = {
item.split("=")[0]: item.split("=")[1] for item in args.user_ip_addresses
}

# Configure logging, increasing verbosity by one level for each 'v'
verbosity = max(logging.WARNING - (10 * args.verbose), 0)
coloredlogs.install(fmt="%(asctime)s %(levelname)8s: %(message)s", level=verbosity)

logging.warning("We'll need a password for docker username {}".format(args.docker_username))
docker_password = getpass.getpass()

# Set Terraform variables
tags = {
"deployed_by": "Python",
"project": "IceNet",
"component": "ETL",
}
resource_group_name = args.azure_resource_group_name
storage_account_name = args.azure_storage_account_name
storage_container_name = args.azure_storage_container_name

# Get common Azure information
credential = InteractiveBrowserCredential()
subscription_id, tenant_id = get_azure_ids(credential, args.azure_subscription_name)

# Configure the Terraform backend
configure_terraform_backend(
credential,
subscription_id,
resource_group_name,
storage_account_name,
storage_container_name,
tags=tags,
)
storage_key = load_terraform_storage_key(
credential, subscription_id, resource_group_name, storage_account_name
)

# Write Terraform configs to file
write_terraform_configs(
subscription_id,
tenant_id,
args.azure_group_id,
user_ip_address_dict,
storage_key,
args.environment,
docker_username=args.docker_username,
docker_password=docker_password,
notification_email=args.notification_email
)


if __name__ == "__main__":
main()