gauge-openapi-example


Example in Python of how Gauge and OpenAPI play nicely together to produce living documentation for APIs.

NB: there is also a separate Java example repository, demonstrating the same workflow but using Java as the test implementation language instead of Python.



Example workflow

  1. Have a collaborative story refinement session to come up with specification examples, for instance using example mapping.

  2. Write up the specification examples in Gauge.

    Using the example from this repo, we'd have the following Gauge spec written at this point:

     # Pet store availability
    
     ## Customers can see which pets are available in the pet store
    
     * There is a pet named "doggie" available in the pet store
    

    We don't write the underlying implementation for this Gauge spec yet; that comes below.

  3. Now, let's say that implementing this feature requires a new REST API microservice.

    Create an OpenAPI specification to describe our new API, e.g. the openapi.yaml in this repo.

    (The OpenAPI specification file can be written in YAML or JSON.)
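
    To make the walkthrough concrete, here is an abridged, illustrative sketch of the shape such a spec takes (the field values here are examples, not a verbatim copy of the openapi.yaml in this repo):

      openapi: 3.0.3
      info:
        title: Swagger Petstore
        version: 1.0.0
      paths:
        /pet/findByStatus:
          get:
            # operationId becomes find_pets_by_status in the generated Python SDK
            operationId: findPetsByStatus
            parameters:
              - name: status
                in: query
                required: true
                schema:
                  type: array
                  items:
                    type: string
                    enum: [available, pending, sold]
            responses:
              "200":
                description: successful operation
                content:
                  application/json:
                    schema:
                      type: array
                      items:
                        $ref: "#/components/schemas/Pet"
      components:
        schemas:
          Pet:
            type: object
            properties:
              id:
                type: integer
                format: int64
              name:
                type: string
                example: doggie
              status:
                type: string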

  4. Even though we don't have an implementation for our OpenAPI spec yet, we already have all we need to go ahead and implement the Gauge spec.

    1. Generate a client SDK for our OpenAPI spec. One of the really nice things about OpenAPI is that client and server code can be generated straight from the spec. We will use OpenAPI Generator to generate our client SDK code:

      1. Install OpenAPI Generator

      2. Generate the client SDK code, e.g.:

        openapi-generator-cli generate -i openapi.yaml -g python -o ./python-client-generated

        (We use Python in this example, but you can generate code for many other languages too.)

      3. Install our new Python client SDK library:

        cd python-client-generated/ && sudo python setup.py install && cd ../
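
      As a quick sanity check, the SDK should now be importable. A hypothetical smoke test (not part of this repo):

        # Hypothetical smoke test: confirm the generated SDK installed correctly
        import openapi_client
        from openapi_client.api import pet_api

        # The default host is taken from the servers section of openapi.yaml
        print(openapi_client.Configuration().host)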

    2. Now that we have our Python client SDK, we can go ahead and implement the underlying code for our Gauge spec:

      from getgauge.python import step
      import openapi_client
      from openapi_client.api import pet_api
      import os


      # Binds this function to the matching step text in the Gauge spec above
      @step("There is a pet named <pet_name> available in the pet store")
      def there_is_an_available_pet_named(pet_name):
          with openapi_client.ApiClient(configuration()) as api_client:
              api_instance = pet_api.PetApi(api_client)
              available_pets = api_instance.find_pets_by_status(["available"])
              print(available_pets)
              assert any(pet.name == pet_name for pet in available_pets)


      def configuration():
          # OPENAPI_HOST lets the same step implementation target the Prism mock,
          # the validation proxy or the real server; when it is unset, the SDK's
          # default host from openapi.yaml is used
          openapi_host = os.environ.get("OPENAPI_HOST")
          if openapi_host is None:
              config = openapi_client.Configuration()
          else:
              config = openapi_client.Configuration(host=openapi_host)

          config.access_token = "YOUR_ACCESS_TOKEN"
          return config


      We did not have to write much code at all, as the Python client SDK provides all the boilerplate for us.

  5. If we ran the Gauge spec now, it would fail, because there is no implementation of the OpenAPI spec for the Python client SDK to communicate with. Enter Prism.

    Prism is a mock server that effortlessly serves example responses based on an OpenAPI spec.

    1. Install Prism

    2. Set up a Gauge environment variable to point our Gauge spec implementation at the Prism mock server (which we will start in the next step):

      Create an env/mock/openapi.properties file and also an env/validation-proxy/openapi.properties file, both with this content:

      OPENAPI_HOST = http://127.0.0.1:4010

  6. Now we can run our Gauge spec against our mock environment, and it will pass :-)

    • prism mock openapi.yaml --errors
    • gauge run --env mock specs
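
    To see what the mock is serving, we can also query it directly. A hypothetical check (the exact response body depends on the examples in openapi.yaml):

      curl "http://127.0.0.1:4010/pet/findByStatus?status=available"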
  7. We can now go ahead and implement the real API, based on our OpenAPI spec of course. (A minimal sketch of what such a server might look like follows at the end of this step.)

    When we have done so, we can run our Gauge spec against it too, without any modification:

    • gauge run specs

    Even better, we can use Prism as a validation proxy against the real server, which verifies that the implementation is fully compliant with the OpenAPI spec:

    • prism proxy openapi.yaml https://petstore.swagger.io/v2 --errors
    • gauge run --env validation-proxy specs
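
    The real API can be implemented in any language. Purely as a rough illustration, here is a minimal hand-rolled Python/Flask sketch of the one endpoint our Gauge spec exercises; Flask, the route handler and the in-memory data are all assumptions, not part of this repo:

      # Hypothetical minimal implementation of GET /pet/findByStatus (not part of this repo).
      # To run the Gauge spec against it, point OPENAPI_HOST at this server,
      # e.g. OPENAPI_HOST = http://127.0.0.1:8080
      from flask import Flask, jsonify, request

      app = Flask(__name__)

      # Illustrative in-memory data matching the Gauge example
      PETS = [
          {"id": 1, "name": "doggie", "photoUrls": [], "status": "available"},
      ]

      @app.route("/pet/findByStatus")
      def find_pets_by_status():
          # The spec models "status" as a repeatable query parameter
          statuses = request.args.getlist("status")
          return jsonify([pet for pet in PETS if pet["status"] in statuses])

      if __name__ == "__main__":
          app.run(port=8080)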

Benefits of this approach

  1. Collaborative - consumers, solution architects, developers, testers, analysts and the Product Owner all have a natural interest in being involved. This is a great silo breaker.
  2. Shift Left - enables testing of APIs before implementation has started
  3. Speeds up and improves service consumer and provider integration: provides API consumers with a working mocked example of the API that they can integrate straightaway.
  4. Design-first APIs
  5. Specification by Example
    • Shared understanding between all parties
    • Living documentation, providing a single source of truth. This API documentation stays up to date because it is executable, and it is written in only one place (rather than analysts, developers and testers each writing their own separate documentation).
  6. API black box testing
    • provides great test coverage
    • decoupled from the implementation, so it does not get in the way of implementation work
  7. Consumer-driven contract testing
  8. Enables different languages to be used easily - can choose Python for the client SDK and Java for the server implementation, for instance

Running the spec

Prerequisites

  • Install OpenAPI Generator

  • Generate the Python client SDK code:

    openapi-generator-cli generate -i openapi.yaml -g python -o ./python-client-generated

  • Install the generated Python client SDK code:

    cd python-client-generated/ && sudo python setup.py install && cd ../

  • Install Prism

Run the spec against the mock server

  • prism mock openapi.yaml --errors
  • gauge run --env mock specs

Run the spec against the real server

  • gauge run specs

Run the spec using Prism as a validation proxy against the real server

  • prism proxy openapi.yaml https://petstore.swagger.io/v2 --errors
  • gauge run --env validation-proxy specs

Notes

  • This example uses Gauge, but other natural-language specification tools (e.g. Cucumber or SpecFlow) would work fine too.
