Many JSON REST APIs today offer a Swagger page that documents the API and lets you explore it (see also this recent post by Jorrit). This Swagger page (aka Swagger UI) is often used by testers to interact with the API: they manually construct and verify the API calls that are then implemented as an automated test.

What you may not realize is that the Swagger UI is generated from an openapi.json or openapi.yaml file hosted by the API server. To ensure "Swagger compatibility", this file must follow the OpenAPI Specification.

But if a webpage that can interact with the API is generated from this document, shouldn’t it be possible to generate test cases for this API from it also?

Let’s find out!

Generating the test cases

It may come as no surprise that it is indeed possible to generate the test cases from the OpenAPI document. The paths section of the document describes all available endpoints, the operations supported for each endpoint and, for each operation, the possible responses.

"/wagegroups": {
    "post": {
        "summary": "Post Wagegroup",
        "operationId": "post_wagegroup_wagegroups_post",
        "requestBody": { *snip* },
        "responses": {
            "201": {
                "description": "Successful Response",
                "content": { *snip* }
            },
            "418": {
                "description": "I'm a Teapot",
                "content": { *snip* }
            },
            "422": {
                "description": "Validation Error",
                "content": { *snip* }
            }
        }
    }
},

In this example, we see that, for the /wagegroups endpoint, we have 3 test cases for the post operation (the 201, 418 and 422 responses).
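The walk over the paths section can be sketched in a few lines of Python. The document below is trimmed to the example above (in practice you would load the full openapi.json); the function name is illustrative:

```python
# A parsed OpenAPI document, reduced to what this sketch needs.
openapi_doc = {
    "paths": {
        "/wagegroups": {
            "post": {
                "responses": {"201": {}, "418": {}, "422": {}},
            },
        },
    },
}

def test_cases(doc):
    """Yield one (endpoint, method, status_code) tuple per documented response."""
    for endpoint, operations in doc["paths"].items():
        for method, details in operations.items():
            for status_code in details.get("responses", {}):
                yield endpoint, method, int(status_code)

cases = list(test_cases(openapi_doc))
# one test case per documented response: 201, 418 and 422 for post on /wagegroups
```

Each tuple yielded here is exactly the data needed for one generated test case.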

Writing those 3 test cases by hand is not much effort, but endpoints for a specific resource generally support more methods: get to retrieve the details of a resource, put to update it and delete to remove it. All these methods can result in different responses, so it's not uncommon to have 5-10 test cases for a single endpoint. Writing all of those by hand quickly becomes a lot of work, so it would be nice to generate these test cases instead.

Within the Robot Framework, the DataDriver library does exactly that: generate test cases on the fly based on a data source. A typical DataDriver test suite looks similar to this:

*** Settings ***
Library           DataDriver    file=openapi.json
Test Template     Test Endpoint

*** Test Cases ***
Test Endpoint for ${method} on ${endpoint} where ${status_code} is expected

*** Keywords ***
Test Endpoint
    [Arguments]    ${endpoint}    ${method}    ${status_code}
    Perform Magic    endpoint=${endpoint}    method=${method}    status_code=${status_code}

What DataDriver does is run the keyword referred to by the Test Template setting (in this case Test Endpoint) for each generated test case, passing along the specified arguments (${endpoint}, ${method} and ${status_code}). The Test Endpoint keyword will in turn Perform Magic for each test case.

Performing Magic

So generating the test cases is easy… but how do we Perform Magic?

The good news is that for a good deal of the test cases, little magic is needed: we construct an HTTP request and check that the response is in accordance with the OpenAPI document. The OpenAPI document also details all the information needed to construct the request.

Let’s take a look at our previous example. To validate the test case for the post request on /wagegroups that results in a 201 response, we can find the required data in our OpenAPI document:

"post": {
    "summary": "Post Wagegroup",
    "operationId": "post_wagegroup_wagegroups_post",
    "requestBody": {
        "content": {
            "application/json": {
                "schema": {
                    "$ref": "#/components/schemas/WageGroup"
                }
            }
        },
        "required": true
    },
    "responses": {
        "201": {
            "description": "Successful Response",
            "content": {
                "application/json": {
                    "schema": {
                        "$ref": "#/components/schemas/WageGroup"
                    }
                }
            }
        }
    }
}

In the above snippet, we see a reference to the WageGroup schema, which is also found in the OpenAPI document:

"WageGroup": {
    "title": "WageGroup",
    "required": [
        "id",
        "hourly_rate"
    ],
    "type": "object",
    "properties": {
        "id": {
            "title": "Id",
            "type": "string"
        },
        "hourly_rate": {
            "title": "Hourly Rate",
            "type": "number"
        }
    }
}

So to make a valid request, we need to provide a JSON body with an id of the type string and an hourly_rate that is a number. In this simple example, there are no further restrictions (like a minimum or maximum length for the id or a minimum or maximum value for the hourly_rate), but such restrictions can also be taken into account when generating random values for the desired properties.
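A minimal sketch of such data generation, covering only the two types that occur in the WageGroup schema (the function names are illustrative, not the library's API):

```python
import random
import uuid

# The WageGroup schema from the OpenAPI document, reduced to the essentials.
wage_group_schema = {
    "required": ["id", "hourly_rate"],
    "type": "object",
    "properties": {
        "id": {"type": "string"},
        "hourly_rate": {"type": "number"},
    },
}

def value_for(property_schema):
    """Generate a random value matching the property's type."""
    if property_schema["type"] == "string":
        return uuid.uuid4().hex          # a random 32-character string
    if property_schema["type"] == "number":
        return random.uniform(0, 100)
    raise NotImplementedError(property_schema["type"])

def body_for(schema):
    """Build a request body with a random value for every property."""
    return {name: value_for(prop) for name, prop in schema["properties"].items()}

body = body_for(wage_group_schema)
```

Restrictions such as minLength or maximum would be handled by narrowing the random generation per property.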

For the purpose of contract testing, using randomly generated data works well. While naming an employee 32d18a7e7e574f55bc2958fcea5b4a66 may not be a typical name found on a passport, it's as much a string as Joe, Jane or Andrianampoinimerinatompokoindrindra. The same holds true for other data types. Remember that the data needs to follow the contract; it does not have to make sense from a business point of view.

Since the number of data types supported by the OpenAPI Specification is limited, it is not very hard to generate the required data for most API endpoints. Validating the response against the OpenAPI document is also quite easy: there are open source packages available to verify a response against a schema, and additional validations (e.g. checking that no extra properties are present in the response) are not too complex either.
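As a sketch of what such a validation could look like (a real project would use an existing schema validator; this hand-rolled version covers only the checks mentioned above):

```python
# Map OpenAPI types to the Python types a parsed JSON body would contain.
PYTHON_TYPES = {"string": str, "number": (int, float), "object": dict}

wage_group_schema = {
    "required": ["id", "hourly_rate"],
    "type": "object",
    "properties": {
        "id": {"type": "string"},
        "hourly_rate": {"type": "number"},
    },
}

def validate_response(body, schema):
    """Return a list of problems; an empty list means the body matches."""
    problems = []
    for name in schema.get("required", []):
        if name not in body:
            problems.append(f"missing required property: {name}")
    for name, value in body.items():
        if name not in schema["properties"]:
            problems.append(f"extra property in response: {name}")
        elif not isinstance(value, PYTHON_TYPES[schema["properties"][name]["type"]]):
            problems.append(f"wrong type for property: {name}")
    return problems
```

A conforming body yields no problems, while a missing required property or an undocumented extra property is reported.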

So without actually performing magic, generating the test cases and executing them is very much possible.

Robot Framework OpenApiDriver

Based on the ideas discussed above, I created the OpenApiDriver. The OpenApiDriver is a specialized library based on DataDriver that handles the generation of the test cases and provides a number of keywords to validate that the API implementation matches the OpenAPI document provided by the API.

The main test suite for the project's acceptance tests looks like this and can serve as an example if you want to give the library a try in your own project.

*** Settings ***
Library             OpenApiDriver
...                 source=http://localhost:8000/openapi.json
...                 origin=http://localhost:8000
...                 base_path=${EMPTY}
...                 mappings_path=${root}/tests/user_implemented/custom_user_mappings.py
...                 response_validation=INFO
...                 require_body_for_invalid_url=${TRUE}

Test Template       Validate Test Endpoint Keyword

*** Test Cases ***
Test Endpoint for ${method} on ${endpoint} where ${status_code} is expected

*** Keywords ***
Validate Test Endpoint Keyword
    [Arguments]    ${endpoint}    ${method}    ${status_code}
    IF    ${status_code} == 404
        Test Invalid Url    endpoint=${endpoint}    method=${method}
    ELSE
        Test Endpoint
        ...    endpoint=${endpoint}    method=${method}    status_code=${status_code}
    END

Of course, there is more to it than what was discussed above. Did you notice the mappings_path parameter and wonder what that custom_user_mappings.py module is about? Does your API have properties that must be unique (e.g. an employee's email address or login name)? Are there resources with properties referring to the id of another resource? Maybe some properties have restrictions on them that cannot be expressed in the OpenAPI document?

These are all issues that I faced when using OpenApiDriver to validate a (C++) API as it was being implemented according to the OpenAPI Specification (OAS), and solutions to these issues are already in the library. Stay tuned for a future post that delves into these advanced use cases!

At the same time, there are many ways to implement an API that adheres to the OAS and not everything that is supported by the specification is, at present, supported by the OpenApiDriver. Development is ongoing and I’m very much interested in your feedback. What is working for you? What are you missing? Feel free to contact me at robin.mackaij@enqore.tech or raise an issue in the repo!
