Tim Burns

Test First Design with AWS Serverless

Updated: Dec 7, 2019

Get Test Coverage with PyTest on Local Systems

Momentum is building behind pure serverless environments with API Gateway and Lambda, but the design patterns have not caught up. In particular, the serverless articles I've read address automation only by using CloudFormation to deploy the stack. That's not good enough.

Developers need the ability to write failing unit tests locally, fix those tests, and then deploy the code to integration. Running CloudFormation on every deploy is too much overhead. It leads to poor design, where the developer tweaks something until it works and then throws it over the fence into the AWS ecosystem.

I've come up with a better way. Let's start with a simple problem: parsing a string with a regular expression. Say I want to write tests to parse out various portions of a file. Given the file name:


I want to see the following parsed out, and so I write a test.

self.assertEqual(result["directory"], "api-gateway/tests/data")
self.assertEqual(result["company"], "apple")
self.assertEqual(result["application"], "health_tracking")
self.assertEqual(result["date"], "20191123")
self.assertEqual(result["time"], "1045")
self.assertEqual(result["activity"], "steps")
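One way to make a test like this pass is a small parsing function built on a named-group regular expression. This is a sketch under my own assumption about the filename layout (the sample file name itself isn't shown above): `<directory>/<company>-<application>-<date>-<time>-<activity>.<ext>`.

```python
import re

# Hypothetical filename layout (an assumption, since the sample file name
# is not shown): <directory>/<company>-<application>-<date>-<time>-<activity>.<ext>
FILE_PATTERN = re.compile(
    r"(?P<directory>.+)/"
    r"(?P<company>[a-z]+)-"
    r"(?P<application>\w+)-"
    r"(?P<date>\d{8})-"
    r"(?P<time>\d{4})-"
    r"(?P<activity>[a-z]+)\.\w+$"
)


def parse_filename(name):
    """Parse a data-file path into its named parts; return None if it doesn't match."""
    match = FILE_PATTERN.match(name)
    return match.groupdict() if match else None
```

The named groups mean the test assertions map directly onto `match.groupdict()`, so adding a new field to the filename convention is a one-line change to the pattern.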

If I implemented a Lambda function to resolve the test, I would deploy the code, test it, tweak it, and test it again, and that would take a while. I could exercise it with curl commands, but there is a better way: create a Python library as a Lambda Layer, and then build the gateway on top of that layer.

Here is how to go about it.

  1. Write Failing Unit Tests

  2. Implement the functionality in a Python library

  3. Build out a Lambda layer on an EC2 instance

  4. Deploy the zip file as a Lambda Layer

Compile your Python code to a Lambda Layer

Building for the Linux environment used by AWS serverless is a challenge for developers on Windows. To address this, create an EC2 build server. Use Ubuntu 18.04 and install the following on the server.

sudo apt update
sudo apt install python3-pip
sudo apt install awscli
sudo apt-get install zip unzip

Configure your credentials on the server with "aws configure".

The utility builds on the server by issuing remote SSH calls through the "paramiko" library. For example:

setup_commands = (
    f"rm -rf {layer_name}_deploy",
    f"mkdir -p {layer_name}_deploy/python/lib/python3.6/site-packages/{layer_name}"
)

Execute each command with

stdin, stdout, stderr = ssh_client.exec_command(command)
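The pieces above can be wrapped in a small helper. This is a sketch, not the post's actual utility: the function names are my own, and `ssh_client` is assumed to be an already-connected `paramiko.SSHClient` (connection setup is omitted).

```python
def build_setup_commands(layer_name):
    """Commands that recreate the layer staging directory on the build server."""
    return (
        f"rm -rf {layer_name}_deploy",
        f"mkdir -p {layer_name}_deploy/python/lib/python3.6/site-packages/{layer_name}",
    )


def run_remote(ssh_client, commands):
    """Run each command over SSH; return a list of (stdout, stderr) text pairs.

    ssh_client is assumed to be a connected paramiko.SSHClient.
    """
    results = []
    for command in commands:
        stdin, stdout, stderr = ssh_client.exec_command(command)
        # exec_command returns immediately; read() blocks until the remote
        # command exits, which keeps the build steps sequential.
        results.append((stdout.read().decode(), stderr.read().decode()))
    return results
```

Reading stdout before issuing the next command matters here: the `mkdir -p` staging path must exist before `pip install --target` drops the library into it.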

Running the script via the Makefile target deploy-lambda generates a zip file in an S3 bucket, in the format Lambda Layers expect.

Create a Lambda Layer using the Boto3 Client

The Boto3 client is the final step in creating the Lambda Layer in your serverless system.

Connect to the AWS account and deploy the Lambda Layer

session = boto3.Session(profile_name=args.profile)
lambda_client = session.client('lambda')
deploy_lambda_aws(args, lambda_client)

Publish the layer with the API and save the response.

response = lambda_client.publish_layer_version(
    LayerName=args.layer_name,
    Description=args.layer_description,
    Content={
        'S3Bucket': args.s3_bucket,
        'S3Key': f"{args.layer_name}.zip"
    },
    CompatibleRuntimes=[
        'python3.6',
        'python3.7',
        'python3.8',
    ]
)
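The response from publish_layer_version includes a LayerVersionArn, which is what Lambda functions reference when they attach the layer. A sketch of persisting it for a later deploy step (the function name and output file are my choices, not part of the original post):

```python
import json


def save_layer_arn(response, path="layer_arn.json"):
    """Persist the published layer's ARN so a later deploy step can attach it.

    response is the dict returned by lambda_client.publish_layer_version.
    """
    arn = response["LayerVersionArn"]
    with open(path, "w") as f:
        json.dump({"layer_version_arn": arn}, f)
    return arn
```

Saving the versioned ARN, rather than the bare layer name, pins the function to an exact build of the library, which keeps the unit-tested code and the deployed code in lockstep.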

Stay tuned for my next article where I wire all this up together with API Gateway for a functional web service.
