Complete Unit Testing of a Flask API
So you’ve created a Flask API with a few endpoints that interact with a Flask-managed database. Before you deploy your code, you want to ensure that each endpoint will work as expected.
Normally, when we test application code, we write unit tests for individual functions and helper functions, ensuring that each individual piece of code runs as expected. Then a few integration tests can be run to ensure that all of those correct pieces are working together.
Testing API endpoints, however, is a bit trickier. While each endpoint will have its own Python code that executes, this code is kicked off by an API request, often interacts with a database, and generally has associated authentication.
This creates coverage challenges that vanilla unit tests cannot address and that integration tests cannot scale to meet. Our challenge is to build a scalable setup that tests our API endpoints with the speed of unit tests while covering every relevant component of each endpoint’s life cycle (API call, authentication, and database interactions).
We will walk through how we have set up our unit tests to run both locally and within GitHub Actions given our tech stack.
- API: Flask
- DB: PostgreSQL
- ORM: flask_sqlalchemy
- Containerization: Docker
- Testing Framework: pytest
For the remainder of this guide, we will assume that you have already created your Flask API, as well as the corresponding DB model using flask_sqlalchemy.
Setting up Conftest
The first step is to create a conftest.py file that generates the items needed by your unit tests to execute successfully.
It is important to ensure that our unit tests run quickly, but still idempotently and independently, so we scope certain fixtures to the session level and others to the function level.
In our conftest.py file we will create needed authentication headers, as well as test versions of the DB and our Flask application.
First, we create the Flask app and the corresponding DB tables. Note that this is done once at the session level, so we create the data tables only once per run of the entire unit test suite.
Authentication header fixture:
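A sketch of the authentication header fixture, assuming a static bearer token is enough for your test setup. The `make_auth_header` helper name and token value are ours — swap in however your app mints credentials for a test user:

```python
import pytest


def make_auth_header(token: str = "test-token") -> dict:
    # Hypothetical helper: your app may instead mint a real JWT for a test user.
    return {"Authorization": f"Bearer {token}"}


@pytest.fixture(scope="session")
def auth_header():
    # Session-scoped: the same header is reused across the whole test run.
    return make_auth_header()
```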
Next, we create a test client from the Flask application. This fixture runs with every unit test function, and at the end of each test it clears any created data from the database.
Now we have all of the assets needed to test our endpoints (test authentication header, Flask client, and initialized DB that is reset after each unit test executes), so let’s write some tests!
Writing Unit Tests
We have the following endpoint for which we would like to write a test:
We can write a unit test for this endpoint as follows:
All that is left to do is to run our unit test suite locally and within our CI pipeline.
Running Locally w/Docker + Docker Compose
We set up two services in our docker-compose.yml file: one for the PostgreSQL test DB, and one to run our unit tests. The exact environment variables you need will vary by use case, but here is a minimal starting point.
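A sketch of that compose file. Service names, credentials, the `TEST_DATABASE_URI` variable, and the Postgres version are all illustrative; the healthcheck just keeps the test runner from starting before the DB accepts connections:

```yaml
# docker-compose.yml (sketch)
services:
  test-db:
    image: postgres:15
    environment:
      POSTGRES_USER: test_user
      POSTGRES_PASSWORD: test_pass
      POSTGRES_DB: test_db
    ports:
      - "5432:5432"
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U test_user -d test_db"]
      interval: 2s
      timeout: 5s
      retries: 10

  unit-tests:
    build:
      context: .
      dockerfile: pytest-dockerfile
    environment:
      # Note the hostname is the DB service name on the compose network.
      TEST_DATABASE_URI: postgresql://test_user:test_pass@test-db:5432/test_db
    depends_on:
      test-db:
        condition: service_healthy
```

With this in place, `docker compose up --build unit-tests` spins up the DB, waits for it to be healthy, and then runs the suite.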
Our Dockerfile, pytest-dockerfile, looks as follows and runs pytest with basic code-coverage output:
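A sketch of that Dockerfile; the base image, paths, and the `app` package name passed to `--cov` are illustrative:

```dockerfile
# pytest-dockerfile (sketch)
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so the layer is cached between code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Run the suite with basic line-coverage output (requires pytest-cov).
CMD ["pytest", "--cov=app", "--cov-report=term-missing"]
```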
Running in GitHub Actions
Running our test suite locally is great, but we also want to ensure that our tests are able to run within a CI pipeline, like GitHub Actions.
We do this as follows:
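A sketch of the workflow, using a GitHub Actions service container for Postgres so no third-party integration is needed. The workflow name, credentials, Python version, and the `TEST_DATABASE_URI` convention are ours; note the DB is reached via `localhost` because service containers publish their ports to the runner:

```yaml
# .github/workflows/unit-tests.yml (sketch)
name: unit-tests
on: [pull_request]

jobs:
  pytest:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:15
        env:
          POSTGRES_USER: test_user
          POSTGRES_PASSWORD: test_pass
          POSTGRES_DB: test_db
        ports:
          - 5432:5432
        options: >-
          --health-cmd "pg_isready -U test_user -d test_db"
          --health-interval 5s
          --health-timeout 5s
          --health-retries 10
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - name: Run unit tests
        env:
          TEST_DATABASE_URI: postgresql://test_user:test_pass@localhost:5432/test_db
        run: pytest --cov=app --cov-report=term-missing
```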
Now we have unit tests for our API that test all aspects of an API endpoint’s lifecycle: Authentication, API call, and database changes. We can deploy our API code with more confidence that endpoints, when hit correctly, will produce the anticipated results.
Better yet, we can run these tests locally and through our GitHub Actions CI pipeline, without the need to integrate with other third party providers.
We hope this helps you implement thorough unit tests for your Flask API endpoints.
dragondrop.cloud’s mission is to automate developer best practices while working with Infrastructure as Code. Our flagship OSS product, cloud-concierge, allows developers to codify their cloud, detect drift, estimate cloud costs and security risks, and more — while delivering the results via a Pull Request. For enterprises running cloud-concierge at scale, we provide a management platform. To learn more, schedule a demo or get started today!