A Developer's Guide to AWS PowerTools for Python Lambda Functions
By Abhishek Nanda

INTRO:

Powertools for AWS Lambda (Python) is a developer toolkit that implements serverless best practices and increases developer velocity. It is a suite of utilities for AWS Lambda functions that simplifies operational best practices such as structured logging, distributed tracing, and metrics collection, helping developers reduce boilerplate code when building serverless applications on AWS. It supports AWS Lambda functions written in Python; the V2 release covered here requires Python 3.7 or newer.

KEY FEATURES:

  1. Logger: Simplifies structured logging in JSON format.
  2. Tracer: Enables distributed tracing with AWS X-Ray.
  3. Metrics: Simplifies capturing custom metrics.

ARCHITECTURE:

We will use the following architecture to understand the features of AWS Powertools: a client invokes a Lambda function that sends data to an SQS queue, which in turn triggers a second Lambda function that writes the data to an S3 bucket. Our goal is to move data from the client to the S3 bucket without losing any of it. We will use these AWS resources:

  1. Lambda
  2. SQS
  3. S3
  4. CloudWatch

HOW IT WORKS:

Logger:

  1. Import all the required libraries into the Lambda function.

  2. Initialize the Logger with the desired parameters.

  3. Most of these parameters are flags that can be set to TRUE or FALSE:

  • log_uncaught_exceptions: logs any uncaught exception raised while the Lambda function runs. The Logger records these exceptions before the program exits non-successfully.

  • serialize_stacktrace: when logging an error, the Logger also serializes the stack trace into the log record; set it to FALSE to omit the stack trace.

  • POWERTOOLS_LOG_DEDUPLICATION_DISABLED: an environment variable that disables the Logger's log-deduplication filter; it is intended mainly for testing scenarios.
  • POWERTOOLS_LOGGER_LOG_EVENT: an environment variable that, when true, logs the incoming event received by the Lambda function.

  4. After setting all the log parameters, decorate the handler with @logger.inject_lambda_context to enrich every log record from the lambda_handler with Lambda context details such as the function name, memory size, ARN, and request ID.

  5. The Logger module also has a way to handle exceptions: logging an exception adds two new keys to the JSON log record, “exception” and “exception_name”.

  6. After setting all the parameters and exception handling, we can check the logs in CloudWatch.

  7. A sketch of the Lambda function code is given below.

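Here is a minimal sketch of such a function; the service name demo-service and the event handling are illustrative assumptions, not values from this article:

    from aws_lambda_powertools import Logger

    # Illustrative configuration; the service name is an assumption.
    logger = Logger(
        service="demo-service",
        log_uncaught_exceptions=True,   # log uncaught exceptions before a non-successful exit
        serialize_stacktrace=True,      # include the stack trace in error records
    )

    @logger.inject_lambda_context(log_event=True)  # adds Lambda context; log_event logs the incoming event
    def lambda_handler(event, context):
        logger.info("Processing incoming event")
        try:
            body = event["body"]  # raises KeyError if the event has no body
            logger.info("Received payload", extra={"payload": body})
            return {"statusCode": 200}
        except KeyError:
            # logger.exception adds "exception" and "exception_name" keys to the JSON record.
            logger.exception("Missing body in event")
            raise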

  8. A sample of the Logger's final JSON output is shown below.
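The exact fields depend on your configuration, but a record typically looks something like the following (all values here are illustrative):

    {
        "level": "INFO",
        "location": "lambda_handler:15",
        "message": "Processing incoming event",
        "timestamp": "2024-01-01 12:00:00,000+0000",
        "service": "demo-service",
        "cold_start": true,
        "function_name": "logger-demo",
        "function_memory_size": "128",
        "function_request_id": "52fdfc07-2182-154f-163f-5f0f9a621d72"
    }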

  9. Per the Powertools documentation, the defaults for the above are: log_uncaught_exceptions is FALSE, serialize_stacktrace is TRUE, and both environment variables are unset (false).

Tracer:

  1. In the AWS console, go to Lambda and create a function. Use the Python 3.12 runtime.

  2. Go to the “Configuration” tab, select “Monitoring and operations tools”, and enable X-Ray active tracing.


  3. Select Layers and add the public AWSLambdaPowertoolsPythonV2 layer.


  4. Then start with the code section below.

  • Import all the libraries and initialize the Tracer.

  • Create a method that uploads the data you want to push to S3, and annotate it with @tracer.capture_method so the Tracer captures it (see the sketch after this list).

  • Deploy the code and test it. Check the S3 bucket you specified in the code to confirm the data was written.

  • For the Tracer, open the “Monitor” section and scroll down to “View X-Ray traces”.

  • Open the traces and you will see a new entry.

  • Open “Trace details” to check the list view.

  • You can also open “Trace Map” to check the map view.
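A minimal sketch of the Tracer steps above; the service name demo-service and the bucket demo-powertools-bucket are assumptions for illustration:

    import json

    import boto3
    from aws_lambda_powertools import Tracer

    tracer = Tracer(service="demo-service")  # illustrative service name
    s3 = boto3.client("s3")

    @tracer.capture_method  # records this method as a subsegment in X-Ray
    def upload_to_s3(data: dict) -> None:
        # Bucket name and object key are assumptions; replace with your own.
        s3.put_object(
            Bucket="demo-powertools-bucket",
            Key="payload.json",
            Body=json.dumps(data),
        )

    @tracer.capture_lambda_handler  # traces the whole handler invocation
    def lambda_handler(event, context):
        upload_to_s3(event)
        return {"statusCode": 200}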

USE-CASE:

  1. Create two Lambda functions: one to send data to SQS, and another to receive the data from SQS and push it to S3. We will also create a new SQS queue and a new CloudWatch log group.

  2. Import all the required libraries. Enable X-Ray active tracing and add the AWSLambdaPowertoolsPythonV2 layer to both Lambdas.

  3. Let's start with the first Lambda function.

  • Initialize both the Logger and the Tracer with appropriate parameters.

  • Provide the SQS queue URL so that we can send our data to that queue.

  • Inside the lambda_handler, generate a correlation_id for later use, and use boto3 to connect to SQS.

  • Pass the JSON event payload along with the correlation_id.

  • Send the data to SQS (see the sketch after this list).

  • Use logger.append_keys to add the correlation_id to the logs.
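A sketch of this first (producer) Lambda; the service name and the queue URL arriving via an environment variable are assumptions of this sketch:

    import json
    import os
    import uuid

    import boto3
    from aws_lambda_powertools import Logger, Tracer

    logger = Logger(service="producer-lambda")  # illustrative service name
    tracer = Tracer(service="producer-lambda")
    sqs = boto3.client("sqs")

    # Queue URL supplied via an environment variable (an assumption for this sketch).
    QUEUE_URL = os.environ["QUEUE_URL"]

    @logger.inject_lambda_context
    @tracer.capture_lambda_handler
    def lambda_handler(event, context):
        correlation_id = str(uuid.uuid4())
        logger.append_keys(correlation_id=correlation_id)  # added to every subsequent log record
        logger.info("Sending message to SQS")
        sqs.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps({"correlation_id": correlation_id, "data": event}),
        )
        return {"statusCode": 200, "correlation_id": correlation_id}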

  4. This sends the data to SQS. We can set this queue as the trigger for the second Lambda, so that whenever data arrives in the queue, it triggers the second Lambda.

  5. Now coming to the second Lambda.

  • Initialize the Logger and Tracer as in the first Lambda. Inside the lambda_handler, read the “correlation_id” from the incoming message and append it to the logger using logger.append_keys.

  • Connect to S3 using boto3 and set all the required fields (see the sketch after this list).

  • After pushing the data to S3, delete the message from the queue.

  • Use logger.exception(e) to log any exception in the code, then raise it as a RuntimeError.
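A sketch of this second (consumer) Lambda; the bucket and queue names come from environment variables and are assumptions. Note that with an SQS event-source mapping, Lambda also deletes messages automatically when the handler succeeds; the explicit delete below simply mirrors the step described in this article:

    import json
    import os

    import boto3
    from aws_lambda_powertools import Logger, Tracer

    logger = Logger(service="consumer-lambda")  # illustrative service name
    tracer = Tracer(service="consumer-lambda")
    s3 = boto3.client("s3")
    sqs = boto3.client("sqs")

    # Bucket and queue URL via environment variables (assumptions for this sketch).
    BUCKET = os.environ["BUCKET_NAME"]
    QUEUE_URL = os.environ["QUEUE_URL"]

    @logger.inject_lambda_context
    @tracer.capture_lambda_handler
    def lambda_handler(event, context):
        try:
            for record in event["Records"]:  # records delivered by the SQS trigger
                message = json.loads(record["body"])
                logger.append_keys(correlation_id=message["correlation_id"])
                logger.info("Writing message to S3")
                s3.put_object(
                    Bucket=BUCKET,
                    Key=f"{message['correlation_id']}.json",
                    Body=json.dumps(message),
                )
                # Explicit delete, as described in the steps above.
                sqs.delete_message(
                    QueueUrl=QUEUE_URL,
                    ReceiptHandle=record["receiptHandle"],
                )
            return {"statusCode": 200}
        except Exception as e:
            logger.exception(e)  # adds "exception" and "exception_name" keys to the log
            raise RuntimeError("Failed to process SQS message") from e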

  6. Set the new log group you created as the “CloudWatch log group” for both Lambdas, so that they send their logs to the same log group.

  7. Run the first Lambda, then check the CloudWatch logs from the Monitoring section. You can also check the X-Ray tracing there.

  8. In the CloudWatch logs, you will find that the records from both Lambdas contain a key named “correlation_id”, which we can use to correlate the two invocations.

  9. After all this, go to the given S3 bucket and you will see the (.json) file.

REFERENCES:

  1. https://aws.amazon.com/blogs/opensource/simplifying-serverless-best-practices-with-lambda-powertools/
  2. https://docs.powertools.aws.dev/lambda/python/latest/
  3. https://serverlessrepo.aws.amazon.com/applications/eu-west-1/057560766410/aws-lambda-powertools-python-layer
  4. https://docs.aws.amazon.com/search/doc-search.html?searchPath=documentation&searchQuery=powertools&this_doc_guide=Developer%2520Guide
