feat(infra): workshop environment helper stack
athewsey committed Nov 22, 2021
1 parent f3ae99e commit 34522f2
Showing 15 changed files with 1,687 additions and 0 deletions.
85 changes: 85 additions & 0 deletions .workshop-infra/Makefile
@@ -0,0 +1,85 @@
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: MIT-0
STACK_NAME?="textract-transformers-workshop"
DEPLOYMENT_BUCKET_NAME?="UNDEFINED"
DEPLOYMENT_BUCKET_PREFIX?=""
TARGET_REPO?="https://github.com/aws-samples/amazon-textract-transformer-pipeline"

target:
$(info ${HELP_MESSAGE})
@exit 0

package: ##=> Build SAM template & assets to CloudFormation on S3
$(info [*] Building AWS SAM stack...)
sam build \
--use-container \
--template template.sam.yaml && \
sam package \
--s3-bucket $(DEPLOYMENT_BUCKET_NAME) \
--s3-prefix $(DEPLOYMENT_BUCKET_PREFIX)sam \
--use-json \
--output-template-file template.tmp.json && \
python sam-postproc.py template.tmp.json template.tmp.json && \
aws s3 cp template.tmp.json \
s3://$(DEPLOYMENT_BUCKET_NAME)/$(DEPLOYMENT_BUCKET_PREFIX)template.cf.json

# CF with --disable-rollback is faster for debugging than sam deploy
create: ##=> Create services stack (only)
$(info [*] Deploying...)
aws cloudformation create-stack \
--template-body file://template.tmp.json \
--stack-name $(STACK_NAME) \
--capabilities CAPABILITY_IAM CAPABILITY_AUTO_EXPAND \
--disable-rollback
# --parameters \
# ParameterKey=ParamName,ParameterValue=$(PARAM_VAR)

deploy: ##=> Deploy services (flexible create or update)
$(info [*] Deploying...)
sam deploy \
--template-file template.tmp.json \
--stack-name $(STACK_NAME) \
--capabilities CAPABILITY_IAM CAPABILITY_AUTO_EXPAND \
--no-fail-on-empty-changeset
# --parameter-overrides \
# ParamName=$(PARAM_VAR)

all: ##=> Build and create stack
@$(MAKE) package
@$(MAKE) create

delete: ##=> Delete services
$(info [*] Deleting stack...)
aws cloudformation delete-stack --stack-name $(STACK_NAME)


#############
# Helpers #
#############

define HELP_MESSAGE

STACK_NAME: "textract-transformers-workshop"
Description: Stack Name to deploy/redeploy to
DEPLOYMENT_BUCKET_NAME:
Description: Amazon S3 bucket for staging built SAM Lambda bundles and assets
DEPLOYMENT_BUCKET_PREFIX: ""
Description: For publishing to a prefix in your deployment bucket, instead of root. Should
include trailing slash e.g. 'my-prefix/'
TARGET_REPO: "https://github.com/aws-samples/amazon-textract-transformer-pipeline"
Description: Target repository where your workshop code lives

Common usage:

...::: Build all SAM based services :::...
$ make package

...::: Deploy or re-deploy all SAM based services :::...
$ make deploy

...::: Create (cannot re-deploy) all SAM based services with rollback disabled :::...
$ make create

...::: Delete all SAM based services :::...
$ make delete
endef
74 changes: 74 additions & 0 deletions .workshop-infra/README.md
@@ -0,0 +1,74 @@
# Infrastructure for SageMaker Workshop with a CDK solution stack

This folder provides a helper stack which will:

- Create a SageMaker Notebook Instance with the repository cloned in
- Create an (IAM-authenticated) SageMaker Studio domain, with a user profile, with the repository cloned in (and some VPC infrastructure required to make that happen)
- Run a one-off AWS CodeBuild build to download the repository, `poetry install` the dependencies and `cdk deploy --all` stacks in the solution

It's intended to help automate setting up workshops on temporary AWS accounts, with CDK-based solutions (like this one) that assume a SageMaker notebook environment will be provisioned separately.

## Prerequisites and Caveats

This helper stack assumes that (in your target AWS Region):

- You have not yet onboarded to SageMaker Studio
- You have a default VPC with standard configuration that you're willing to use, or you're comfortable checking this stack's compatibility with your custom VPC configuration.

> ⚠️ This stack is oriented towards convenience of **getting started** and first exploring SageMaker Studio with the companion solution stack. It is **not recommended for long-lived environments**.
>
> In particular, **be aware that:**
>
> - The stack grants broad power user permissions to the CodeBuild job (for whatever resources the CDK deployment may need to create)
> - When you delete the stack:
>   - The SageMaker Studio setup for your target AWS Region will be deleted. Stack deletion should *fail* if any users are running 'apps' in Studio other than those set up by the stack; you can manage these through the [SageMaker console UI](https://console.aws.amazon.com/sagemaker/home?#/studio)
>   - The CDK solution deployed by the CodeBuild project will *not* automatically be cleaned up
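
As a convenience when cleaning up, you could check for lingering Studio apps programmatically before deleting the stack. This is a minimal sketch using boto3, not part of the stack itself; the function names are our own, and the API call assumes AWS credentials are already configured:

```python
def active_apps(apps: list) -> list:
    """Keep only apps that are not already in 'Deleted' status."""
    return [app for app in apps if app.get("Status") != "Deleted"]


def list_active_studio_apps(region_name: str) -> list:
    """List non-deleted SageMaker Studio apps in the given region."""
    import boto3  # lazy import: keeps active_apps() dependency-free

    sagemaker = boto3.client("sagemaker", region_name=region_name)
    apps = []
    for page in sagemaker.get_paginator("list_apps").paginate():
        apps.extend(page["Apps"])
    return active_apps(apps)
```

If this returns anything beyond the apps the stack created, delete those extra apps (through the API or the console) before running `make delete`.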

## Developing and Deploying Locally

In addition to having an AWS account, you'll need an environment with:

- The [AWS CLI](https://aws.amazon.com/cli/)
- The [AWS SAM CLI](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html)
- A Docker-compatible container runtime such as [Docker Desktop](https://www.docker.com/products/docker-desktop)
- A `make` utility such as [GNU Make](https://www.gnu.org/software/make/) - probably already installed if you have common build tools set up.
- *Probably* a UNIX-like (non-Windows) shell if you want things to run smoothly... But you can always give it a try and resort to translating commands from the [Makefile](Makefile) if things go wrong.

You'll also need:

- Sufficient AWS access (set up credentials with `aws configure`) to be able to deploy the stacks in your target region
- An *[Amazon S3](https://s3.console.aws.amazon.com/s3/home) Bucket* to use for staging deployment assets (Lambda bundles, etc)

**Step 1: Build the Lambda bundles and final CloudFormation template to S3 with AWS SAM**

(This command builds your assets and CloudFormation template, and stages them to your nominated Amazon S3 bucket)

```sh
make package DEPLOYMENT_BUCKET_NAME=DOC-EXAMPLE-BUCKET
```

**Step 2: Deploy (create or update) the stack**

```sh
make deploy STACK_NAME=workshopstack
```

**Alternative: Build and create the stack in one go**

(This option only *creates* stacks, and disables rollback, for easier debugging)

```sh
make all DEPLOYMENT_BUCKET_NAME=DOC-EXAMPLE-BUCKET STACK_NAME=workshopstack
```

There's also a `make delete` option to help with cleaning up.

## Preparing Templates for Multi-Region Deployment

If you'd like your template to be deployable in multiple AWS Regions:

- Set up an asset hosting bucket in each region of interest, and use the AWS Region ID (e.g. `us-east-1`) in the bucket names
- Set up cross-region replication to copy contents from your lead region to other regions
- Run the `make package` target against your lead region

The generated template will be automatically post-processed (by [sam-postproc.py](sam-postproc.py)) to tokenize S3 references to hosted assets to refer to the `${AWS::Region}` placeholder.
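
The exact logic lives in [sam-postproc.py](sam-postproc.py) and may differ, but as a rough illustrative sketch only, region-tokenizing could be as simple as rewriting the lead region's ID wherever it appears in hosted-asset S3 references (the bucket naming here is a hypothetical example):

```python
def tokenize_region(template_body: str, lead_region: str = "us-east-1") -> str:
    """Replace the lead region ID with the CloudFormation ${AWS::Region} token.

    Assumes asset bucket names embed the region ID (e.g. 'my-assets-us-east-1').
    """
    return template_body.replace(lead_region, "${AWS::Region}")


if __name__ == "__main__":
    url = "s3://my-assets-us-east-1/sam/function.zip"
    print(tokenize_region(url))  # s3://my-assets-${AWS::Region}/sam/function.zip
```

Note that a bare `${AWS::Region}` token only resolves when the containing string is wrapped in an `Fn::Sub` expression, so real post-processing also needs to handle that wrapping.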
85 changes: 85 additions & 0 deletions .workshop-infra/fn-codebuild-run/main.py
@@ -0,0 +1,85 @@
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: MIT-0
"""Custom CloudFormation Resource to kick off CodeBuild project builds

This custom resource expects a 'ProjectName' property, and will simply kick off a run of that AWS
CodeBuild Project on creation. It doesn't wait for the run to complete successfully, and it doesn't
do anything on resource UPDATE/DELETE.
"""
# Python Built-Ins:
import logging
import traceback

# External Dependencies:
import boto3
import cfnresponse

codebuild = boto3.client("codebuild")


def lambda_handler(event, context):
try:
request_type = event["RequestType"]
if request_type == "Create":
handle_create(event, context)
elif request_type == "Update":
handle_update(event, context)
elif request_type == "Delete":
handle_delete(event, context)
else:
cfnresponse.send(
event,
context,
cfnresponse.FAILED,
{},
error=f"Unsupported CFN RequestType '{request_type}'",
)
except Exception as e:
logging.error("Uncaught exception in CFN custom resource handler - reporting failure")
traceback.print_exc()
cfnresponse.send(
event,
context,
cfnresponse.FAILED,
{},
error=str(e),
)
raise e


def handle_create(event, context):
logging.info("**Received create request")
resource_config = event["ResourceProperties"]
logging.info("**Running CodeBuild Job")
result = codebuild.start_build(
projectName=resource_config["ProjectName"],
)
cfnresponse.send(
event,
context,
cfnresponse.SUCCESS,
{},
physicalResourceId=result["build"]["arn"],
)


def handle_delete(event, context):
logging.info("**Received delete event - no-op")
cfnresponse.send(
event,
context,
cfnresponse.SUCCESS,
{},
physicalResourceId=event["PhysicalResourceId"],
)


def handle_update(event, context):
logging.info("**Received update event - no-op")
cfnresponse.send(
event,
context,
cfnresponse.SUCCESS,
{},
physicalResourceId=event["PhysicalResourceId"],
)
1 change: 1 addition & 0 deletions .workshop-infra/fn-codebuild-run/requirements.txt
@@ -0,0 +1 @@
# Nothing else required beyond common Lambda layer
