Releases: aws-powertools/powertools-lambda-python

v1.10.2

04 Feb 08:40
2591050

Changes

This quick release fixes:

  • the batch processing utility when dealing with multiple exceptions raised while processing SQS records
  • package size for Python 3.8, where the typing_extensions module was being installed unnecessarily
  • typos in the documentation

🌟 Minor Changes

  • fix: remove unnecessary typing-extensions for py3.8 (#281) by @nadobando

📜 Documentation updates

🐛 Bug and hot fixes

  • fix: batch processing exceptions (#276) by @cakepietoast

Internal

This release was made possible by the following contributors:

@cakepietoast, @heitorlessa, @michaelbrewer and @nadobando

v1.10.1

19 Jan 14:26
78d3ab5

Changes

This release patches a model mismatch when using SNS -> SQS -> Lambda as opposed to SNS -> Lambda. The former changes three keys, making the payload incompatible with the model we derived from direct SNS -> Lambda invocations:

  • MessageAttributes key is not present
  • UnsubscribeUrl becomes UnsubscribeURL
  • SigningCertUrl becomes SigningCertURL
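To make the mismatch concrete, here is a minimal sketch of the decoded SQS record body carrying an SNS notification. All values are illustrative; only the key names matter:

```python
import json

# Illustrative SNS notification as delivered inside an SQS record body
# (values are made up; only the key names are the point here)
record_body = json.dumps({
    "Type": "Notification",
    "TopicArn": "arn:aws:sns:us-east-1:123456789012:my-topic",
    "Message": json.dumps({"message": "hello world", "username": "lessa"}),
    "UnsubscribeURL": "https://sns.us-east-1.amazonaws.com/?Action=Unsubscribe",
    "SigningCertURL": "https://sns.us-east-1.amazonaws.com/cert.pem",
})

notification = json.loads(record_body)

# The three differences vs a direct SNS -> Lambda payload:
assert "MessageAttributes" not in notification  # key missing entirely
assert "UnsubscribeURL" in notification         # direct SNS uses UnsubscribeUrl
assert "SigningCertURL" in notification         # direct SNS uses SigningCertUrl
```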

This release also introduces a new envelope, SnsSqsEnvelope, to make processing messages that reach Lambda via SNS -> SQS seamless. It extracts the original SNS published payload, unmarshals it, and parses it using your model.

from typing import List

from aws_lambda_powertools.utilities.parser import BaseModel, envelopes, event_parser
from aws_lambda_powertools.utilities.typing import LambdaContext

class MySnsBusiness(BaseModel):
    message: str
    username: str

@event_parser(model=MySnsBusiness, envelope=envelopes.SnsSqsEnvelope)
def handle_sns_sqs_json_body(event: List[MySnsBusiness], _: LambdaContext):
    assert len(event) == 1
    assert event[0].message == "hello world"
    assert event[0].username == "lessa"

🌟 Minor Changes

Maintenance

This release was made possible by the following contributors:

@heitorlessa

v1.10.0

18 Jan 15:19
722b4a3

Changes

This release adds a number of new features to Logger, Tracer, Validator, and Parameters, a new Lambda Layer with extra packages installed (e.g. parser), and documentation fixes.

Detailed information about these features is at the bottom.

For the next release (1.11.0), we'll be focusing on a new Idempotency utility, import time performance improvements for those not using Tracer utility, and possibly a new Circuit Breaker utility.


This release was made possible by the following contributors:

@am29d, @heitorlessa, @michaelbrewer, @n2N8Z, @risenberg-cyberark and @suud

🌟 New features and non-breaking changes

  • feat: Add AppConfig parameter provider (#236) by @risenberg-cyberark
  • feat: support extra parameter in Logger messages (#257) by @heitorlessa

🌟 Minor Changes

  • feat: toggle to disable log deduplication locally for pytest live log #262 (#268) by @heitorlessa
  • improv: add support for custom lambda handlers with kwargs #242 (#269) by @heitorlessa
  • improv: override Tracer auto-capture response/exception via env vars (#259) by @heitorlessa
  • feat: support custom formats in JSON Schema validation (#247) by @n2N8Z

📜 Documentation updates

  • docs: fix import (#267) by @suud
  • docs: add info about extras layer (#260) by @am29d
  • docs: add missing parser models (#254) by @risenberg-cyberark

Internal

Details

Logger

Extra parameter

Logger now supports the standard logging extra parameter when logging new messages. You can pass any dictionary to this new parameter, and its keys and values will be available at the root of the log structure. This is ephemeral: keys do not persist across log statements, unlike its counterpart, the structure_logs(append=True, ...) method.

Excerpt

from aws_lambda_powertools import Logger

logger = Logger(service="payment")

fields = { "request_id": "1123" }

logger.info("Collecting payment", extra=fields)

Log sample

{
   "timestamp": "2021-01-12 14:08:12,357",
   "level": "INFO",
   "location": "collect.handler:1",
   "service": "payment",
   "sampling_rate": 0.0,
   "request_id": "1123", // highlight-line
   "message": "Collecting payment"
}

Pytest Live Log support

Pytest Live Log support outputs logging records directly into the console, with colours, as they are emitted.

Since Logger drops duplicate log records as of 1.7.0, you can now explicitly override this protection when running tests locally to make use of Pytest Live Log:

POWERTOOLS_LOG_DEDUPLICATION_DISABLED="1" pytest -o log_cli=1

Tracer

Override auto-capture response and exception

When using the Tracer decorators capture_method or capture_lambda_handler, we auto-capture their responses and exceptions, serialize them, and inject them as tracing metadata to ease troubleshooting.

There are times when serializing objects can cause side effects, for example consuming an S3 streaming body that has not yet been read. You can now override this behaviour either by parameter or via the env vars POWERTOOLS_TRACER_CAPTURE_RESPONSE and POWERTOOLS_TRACER_CAPTURE_ERROR.
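As a rough sketch of how such a toggle can be resolved (an illustration of the precedence idea, not the library's actual implementation): an explicit parameter wins over the environment variable, and capturing stays on by default.

```python
import os
from typing import Optional

TRUTHY = {"1", "true", "yes", "y", "on"}

def capture_enabled(explicit: Optional[bool], env_var: str) -> bool:
    """Resolve a capture toggle: explicit parameter > env var > default (True)."""
    if explicit is not None:
        return explicit
    value = os.environ.get(env_var)
    if value is None:
        return True  # capture by default
    return value.strip().lower() in TRUTHY

os.environ["POWERTOOLS_TRACER_CAPTURE_RESPONSE"] = "false"
assert capture_enabled(None, "POWERTOOLS_TRACER_CAPTURE_RESPONSE") is False
assert capture_enabled(True, "POWERTOOLS_TRACER_CAPTURE_RESPONSE") is True  # parameter wins
```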

Parameters

AppConfig support

You can now retrieve and cache configuration stored in AppConfig natively - Thanks to Ran from CyberArk.

from aws_lambda_powertools.utilities import parameters

def handler(event, context):
    # Retrieve a single configuration, latest version
    value: bytes = parameters.get_app_config(name="my_configuration", environment="my_env", application="my_app")

Validator

Custom formats

You can now use the formats parameter to instruct the Validator utility how to handle any custom integer or string format - Thanks to @n2N8Z

Custom format snippet in a JSON Schema

{
    "lastModifiedTime": {
        "format": "int64",
        "type": "integer"
    }
}

Excerpt ignoring the int64 format and validating a custom positive format instead of failing

from aws_lambda_powertools.utilities.validation import validate

event = {} # some event
schema_with_custom_format = {} # some JSON schema that defines a custom format

custom_format = {
    "int64": True,  # simply ignore this format
    "positive": lambda x: x >= 0,
}

validate(event=event, schema=schema_with_custom_format, formats=custom_format)

v1.9.1

21 Dec 09:13
2d14c88

Changes

This patch release fixes a bug that occurred when multiple parent Loggers with the same name were configured multiple times, as per #249.

from aws_lambda_powertools import Logger


class Class1:
    logger = Logger("class_1")

    @staticmethod
    def f_1():
        Class1.logger.info("log from class_1")


class Class2:
    logger = Logger("class_1")

    @staticmethod
    def f_2():
        Class2.logger.info("log from class_2")
        Class1.f_1()


Class2.f_2()

This release also brings staged changes initially planned for 1.10, such as minor improvements to the Tracer documentation, equality checks to ease testing with the Event source data classes utility, and initial work towards MyPy support (PEP 561).
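The equality checks mentioned above can be pictured with a small sketch of a dict-backed wrapper that compares by value (a simplified illustration, not the utility's actual code):

```python
from typing import Any, Dict

class DictWrapper:
    """Wraps an event dict and compares by the underlying data,
    which makes assertions in unit tests straightforward."""

    def __init__(self, data: Dict[str, Any]):
        self._data = data

    def __eq__(self, other: object) -> bool:
        if not isinstance(other, DictWrapper):
            return NotImplemented
        return self._data == other._data

# Two wrappers around equal data now compare equal
assert DictWrapper({"Records": []}) == DictWrapper({"Records": []})
assert DictWrapper({"Records": []}) != DictWrapper({"Records": [1]})
```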

🌟 New features and non-breaking changes

🌟 Minor Changes

  • improv: Added eq function to DictWrapper for better equality checks (#233) by @GroovyDan
  • test(general): Add missing tests for parser (#232) by @michaelbrewer
  • docs: add clarification to Tracer docs for capture_method (#244) by @cakepietoast
  • test: DictWrapper equals method (#234) by @Nr18
  • chore: implement phony targets correctly (#235) by @Nr18

🐛 Bug and hot fixes

This release was made possible by the following contributors:

@GroovyDan, @Nr18, @cakepietoast, @dependabot, @dependabot[bot], @gmcrocetti, @heitorlessa and @michaelbrewer

v1.9.0

04 Dec 13:52
b9ec28d

Changes

This release adds support for Kinesis, S3, CloudWatch Logs, Application Load Balancer, and SES models in Parser - Exclusively added by Ran (once again) from CyberArk.

Docs now clarify which Logger keys cannot be suppressed and fix a broken link, and the sidebar menu is now always expanded by default for improved UX.

🌟 New features and non-breaking changes

  • feat: Add Kinesis lambda event support to Parser utility (#227) by @risenberg-cyberark
  • feat: Add S3 lambda event support to Parser utility #224 (#225) by @risenberg-cyberark
  • feat: Add CloudWatch lambda event support to Parser utility (#231) by @risenberg-cyberark
  • feat: Add ALB lambda event support to Parser utility (#229) by @risenberg-cyberark
  • feat: Add SES lambda event support to Parser utility #213 (#214) by @risenberg-cyberark

📜 Documentation updates

This release was made possible by the following contributors:

@heitorlessa, @igorlg, @pankajagrawal16, @risenberg-cyberark, Pankaj Agrawal and Ran Isenberg

v1.8.0

20 Nov 13:10
66edf65

Changes

This release adds support for SNS model in Parser, API Gateway HTTP API IAM and Lambda Authorization support in Event source data classes, and the new EventBridge Replay field in both Parser and Event source data classes.

Docs now have a new FAQ section within Logger answering a common question about boto3 debugging logs, document the least-privilege IAM permissions for deploying Lambda Layers, and fix a typo in the SES data class example.

🌟 Minor Changes

  • feat: include new EventBridge replay-name field in parser and data_classes utilities (#211) by @heitorlessa
  • feat: Add sns notification support to Parser utility #206 (#207) by @risenberg-cyberark
  • feat(data_classes): API Gateway V2 IAM and Lambda (#201) by @michaelbrewer

📜 Documentation updates

  • docs: correct example usage of SES data class (#209) by @cakepietoast
  • docs: add faq section (#202) by @Nr18
  • docs: add minimal permission set for using layer (#204) by @am29d

Maintenance

This release was made possible by the following contributors:

@Nr18, @am29d, @cakepietoast, @heitorlessa, @michaelbrewer, @mwarkentin, @risenberg-cyberark and Ran Isenberg

v1.7.0

26 Oct 09:50
8d5986a

Changes

Event source data classes

This release adds support for Cognito Authentication Challenge Lambda Triggers for the create, define, and verify trigger sources. It also enhances the get_header_value method by optionally supporting case-insensitive header lookups.

Credits: Thanks to @michaelbrewer from Gyft for both enhancements.
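The idea behind the case-insensitive lookup can be sketched in a few lines (an illustration of the behaviour, not the data classes' actual code):

```python
from typing import Dict, Optional

def get_header_value(headers: Dict[str, str], name: str,
                     default_value: Optional[str] = None,
                     case_sensitive: bool = True) -> Optional[str]:
    """Look up a header value, optionally ignoring the case of the header name."""
    if case_sensitive:
        return headers.get(name, default_value)
    lowered = name.lower()
    return next((v for k, v in headers.items() if k.lower() == lowered), default_value)

headers = {"Content-Type": "application/json"}
assert get_header_value(headers, "content-type", case_sensitive=False) == "application/json"
assert get_header_value(headers, "content-type") is None  # case-sensitive by default
```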

Parser utility

Parser is a new utility that provides parsing and deep data validation using Pydantic Models - It requires an optional dependency (Pydantic) to work.

It uses Pydantic model classes, similar to dataclasses, to model the shape of your data, enforce type hints at runtime, and serialize your models to JSON and JSON Schema; with third-party tools you can also auto-generate model classes from JSON, YAML, OpenAPI, etc.

from typing import List, Optional

from aws_lambda_powertools.utilities.parser import event_parser, BaseModel, ValidationError
from aws_lambda_powertools.utilities.typing import LambdaContext

import json

class OrderItem(BaseModel):
    id: int
    quantity: int
    description: str

class Order(BaseModel):
    id: int
    description: str
    items: List[OrderItem] # nesting models are supported
    optional_field: Optional[str] # this field may or may not be available when parsing


@event_parser(model=Order)
def handler(event: Order, context: LambdaContext):
    assert event.id == 10876546789
    assert event.description == "My order"
    assert len(event.items) == 1

    order_items = [item for item in event.items]
    ...

payload = {
    "id": 10876546789,
    "description": "My order",
    "items": [
        {
            "id": 1015938732,
            "quantity": 1,
            "description": "item xpto"
        }
    ]
}

handler(event=payload, context=LambdaContext())
# also works if event is a JSON string
handler(event=json.dumps(payload), context=LambdaContext()) 

With this release, we provide a few built-in models to start with, such as Amazon EventBridge, Amazon DynamoDB, and Amazon SQS - You can extend them by inheriting and overriding their properties to plug in your models.

from aws_lambda_powertools.utilities.parser import parse, BaseModel
from aws_lambda_powertools.utilities.parser.models import EventBridgeModel

from typing import List, Optional

class OrderItem(BaseModel):
    id: int
    quantity: int
    description: str

class Order(BaseModel):
    id: int
    description: str
    items: List[OrderItem]

# Override `detail` key of a custom event in EventBridge from str to Order
class OrderEventModel(EventBridgeModel):
    detail: Order

payload = {...} # EventBridge event dict with Order inside detail as JSON 
order = parse(model=OrderEventModel, event=payload) # parse input event into OrderEventModel

assert order.source == "OrderService"
assert order.detail.description == "My order"
assert order.detail_type == "OrderPurchased" # we rename it to snake_case since detail-type is an invalid name

# We can access our Order just as fine now
for order_item in order.detail.items:
    ...

# We can also serialize any property of our parsed model into JSON, JSON Schema, or as a Dict
order_dict = order.dict()
order_json = order.json()
order_json_schema_as_dict = order.schema()
order_json_schema_as_json = order.schema_json(indent=2)

Similar to the Validator utility, it provides an envelope feature to parse known structures that wrap your event. It's useful when you want to parse both the structure and your model but only return your actual data from the envelope.

Example using one of the built-in envelopes provided from day one:

from aws_lambda_powertools.utilities.parser import event_parser, parse, BaseModel, envelopes
from aws_lambda_powertools.utilities.typing import LambdaContext

class UserModel(BaseModel):
    username: str
    password1: str
    password2: str

payload = {
    "version": "0",
    "id": "6a7e8feb-b491-4cf7-a9f1-bf3703467718",
    "detail-type": "CustomerSignedUp",
    "source": "CustomerService",
    "account": "111122223333",
    "time": "2020-10-22T18:43:48Z",
    "region": "us-west-1",
    "resources": ["some_additional_"],
    "detail": {
        "username": "universe",
        "password1": "myp@ssword",
        "password2": "repeat password"
    }
}

ret = parse(model=UserModel, envelope=envelopes.EventBridgeEnvelope, event=payload)

# Parsed result contains only our actual model, not the entire parsed EventBridge event
assert ret.password1 == ret.password2

# Same behaviour but using our decorator
@event_parser(model=UserModel, envelope=envelopes.EventBridgeEnvelope)
def handler(event: UserModel, context: LambdaContext):
    assert event.password1 == event.password2

Credits: Thanks to @risenberg-cyberark from CyberArk for the idea, implementation, and guidance on how to best support Pydantic to provide both parsing and deep data validation. Also, special thanks to @koxudaxi for helping review with his extensive Pydantic experience.

🌟 New features and non-breaking changes

  • feat: Advanced parser utility (pydantic) (#118) by @risenberg-cyberark

🌟 Minor Changes

📜 Documentation updates

🐛 Bug and hot fixes

  • improv: keeps Lambda root logger handler intact, and add log filter instead to prevent child log records duplication (#198) by @heitorlessa

🔧 Internal

This release was made possible by the following contributors:

@bmicklea, @dependabot, @dependabot[bot], @heitorlessa, @michaelbrewer, @risenberg-cyberark, @Nr18, and @koxudaxi

v1.6.1

23 Sep 14:27
0b9294a

Changes

A bug was fixed in the event source data classes utility - accessing boolean attribute values in DynamoDB Streams events (bool_value) now works correctly. Thanks @whisller for the fix!

🐛 Bug and hot fixes

  • fix: accessing boolean value in dynamodb utility class (#176) by @whisller

This release was made possible by the following contributors:

@cakepietoast and @whisller

v1.6.0

22 Sep 16:10
aeb52c9

Changes

Data_classes utility

New utility to easily describe the event schema of popular event sources, including helper methods to access common objects (e.g. an S3 bucket key) and to deserialize data (records from Kinesis, CloudWatch Logs, etc.).

Huge props to @michaelbrewer for the contribution, and @cakepietoast for the comprehensive docs with examples.

JSON Schema validation utility

New utility to quickly validate inbound events and responses using JSON Schema. It also supports unwrapping events using JMESPath expressions, so you can validate only the payload or key that interests you.

Oh, before I forget! This also includes custom JMESPath functions for deserializing JSON strings, base64, and zip-compressed data before applying validation too 🥰
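For illustration, the decoding steps those functions perform can be sketched with the standard library alone (the helper names below mirror the spirit of the feature but are stand-ins, not the utility's API):

```python
import base64
import gzip
import json

def decode_json_string(value: str):
    # Deserialize a JSON string embedded in the event
    return json.loads(value)

def decode_base64(value: str) -> bytes:
    # Decode a base64-encoded payload
    return base64.b64decode(value)

# Round-trip: JSON -> gzip -> base64, then decode back before validating
payload = {"a": 1}
encoded = base64.b64encode(gzip.compress(json.dumps(payload).encode())).decode()

decoded = json.loads(gzip.decompress(decode_base64(encoded)))
assert decoded == {"a": 1}
```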

Metrics with multiple values

The Metrics utility now supports adding multiple values to the same metric - This was added to the CloudWatch EMF specification, and Powertools happily supports it too ;) - Thanks to @Dunedan for spotting that
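In CloudWatch EMF terms, this means a metric's value in the serialized log line can be an array instead of a scalar. A minimal sketch of such a blob (namespace, dimension, and timestamp are illustrative):

```python
import json

# Sketch of a CloudWatch EMF log line where one metric carries multiple values
emf_blob = {
    "_aws": {
        "Timestamp": 1609459200000,
        "CloudWatchMetrics": [{
            "Namespace": "MyApp",
            "Dimensions": [["service"]],
            "Metrics": [{"Name": "SuccessfulBooking", "Unit": "Count"}],
        }],
    },
    "service": "booking",
    "SuccessfulBooking": [1, 1, 1],  # multiple values for the same metric
}

line = json.dumps(emf_blob)
assert json.loads(line)["SuccessfulBooking"] == [1, 1, 1]
```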

Minor doc changes

We added a Testing your code section for Logger and Metrics for customers like @patrickwerz who had difficulty unit testing code that uses Powertools - a Pytest fixture and examples are now provided!

We also increased the content width to make more elaborate sections easier to read, and to give us room to start tinkering with a Tutorial/Guide section in coming releases \ o /

🌟 New features and non-breaking changes

  • improv: disable tracer when using the Chalice CLI (#172) by @jamesls
  • feat(trigger): data class and helper functions for lambda trigger events (#159) by @michaelbrewer
  • feat: emf multiple metric values (#167) by @cakepietoast
  • feat: simple JSON Schema validator utility (#153) by @heitorlessa

📜 Documentation updates

  • docs: Data Classes Utility (#171) by @cakepietoast
  • docs: add testing tips, increase content width, and improve log sampling wording (#174) by @heitorlessa

🐛 Bug and hot fixes

  • fix: remove DeleteMessageBatch call to SQS api if there are no messages to delete (#170) by @cakepietoast
  • fix: add missing tests, correct calls to DictWrapper constructor and improve metrics type hints (#168) by @michaelbrewer

This release was made possible by the following contributors:

@cakepietoast, @heitorlessa, @jamesls and @michaelbrewer

v1.5.0

04 Sep 10:51
063d8cd

Changes

SQS batch processing utility

A new utility to handle partial failures when processing batches of SQS messages in Lambda. By default with Lambda and SQS, all messages are returned to the queue when there is a failure during processing. This utility handles failures individually at the message level, avoiding the re-processing of messages that already succeeded. Thanks to @gmcrocetti, who contributed this utility.
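The core idea of partial failure handling can be sketched in plain Python (a concept illustration, not the utility's implementation, which additionally deletes successful messages from the queue so only failures are retried):

```python
from typing import Callable, Dict, List, Tuple

def process_partial(records: List[Dict],
                    handler: Callable[[Dict], None]) -> Tuple[List[Dict], List[Dict]]:
    """Process each record individually so one failure doesn't force
    the whole batch back onto the queue."""
    succeeded, failed = [], []
    for record in records:
        try:
            handler(record)
            succeeded.append(record)
        except Exception:
            failed.append(record)
    return succeeded, failed

records = [{"body": "ok"}, {"body": "boom"}, {"body": "ok"}]

def handler(record):
    if record["body"] == "boom":
        raise ValueError("failed record")

ok, bad = process_partial(records, handler)
assert len(ok) == 2 and len(bad) == 1  # only the failed record would be retried
```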

Integration with CloudWatch ServiceLens

The xray_trace_id key is now added to log output when tracing is active. This enables the log correlation functionality in CloudWatch ServiceLens to work with applications using Powertools.

Static types for Lambda context object

You can now import a static type for the Lambda context object from this library. Thanks to @Nr18 for the implementation.

Control order of logging output

You can now change the order of the fields output by the logger. Thanks to @michaelbrewer for the implementation.

Automatically deserialize parameters

Thanks to @michaelbrewer, the Parameters utility can now automatically decide how to deserialize parameter values (JSON/base64) based on the key name.
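The auto-deserialization idea keys off the parameter name's suffix. A stdlib-only sketch of that dispatch, assuming .json and .binary suffixes (a simplification for illustration, not the utility's code):

```python
import base64
import json
from typing import Any

def infer_transform(name: str) -> str:
    """Pick a transform from the parameter name suffix."""
    if name.endswith(".json"):
        return "json"
    if name.endswith(".binary"):
        return "binary"
    return "raw"

def deserialize(name: str, value: str) -> Any:
    transform = infer_transform(name)
    if transform == "json":
        return json.loads(value)       # JSON string -> dict/list
    if transform == "binary":
        return base64.b64decode(value) # base64 string -> bytes
    return value                       # leave untouched

assert deserialize("/app/config.json", '{"debug": true}') == {"debug": True}
assert deserialize("/app/blob.binary", base64.b64encode(b"hi").decode()) == b"hi"
```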

Documentation improvements

Lots of improvements were made to the documentation. Thanks to the community contributors: @Nr18 for adding a troubleshooting section, and @michaelbrewer and @bls20AWS for several housekeeping contributions.

🌟 Minor Changes

📜 Documentation updates

🐛 Bug and hot fixes

  • fix: batch processing util (#155) by @cakepietoast

This release was made possible by the following contributors:

@Nr18, @am29d, @bls20AWS, @cakepietoast, @gmcrocetti, @heitorlessa, @michaelbrewer and @pankajagrawal16