bravado
Automated testing of Swagger files
Potentially a dumb question: Is there a way to auto-generate Python unit tests from a Swagger spec?
Could you describe a bit more what you're trying to achieve? I don't think this exists currently, but I'd be interested to hear what you have in mind.
I'm thinking of something like what the https://github.com/noahdietz/oatts project does (but that one is in JS).
The main idea is to generate test template code from the operations and status codes in the Swagger spec. For example:
$ bravado generate -s ./path/to/swagger.yaml -w ./output/dir
$ ls ./output/dir
pet-test.py pet-{petId}-uploadImage-test.py user-test.py
and the content of the generated file(s) would be something like:
from bravado.requests_client import RequestsClient
from bravado.client import SwaggerClient

http_client = RequestsClient()
http_client.set_api_key(
    'api.yourhost.com', 'token',
    param_name='api_key', param_in='header'
)
client = SwaggerClient.from_url(
    'http://petstore.swagger.io/v2/swagger.json',
    http_client=http_client,
)

petstore = SwaggerClient.from_url(..., config={'also_return_response': True, 'validate_requests': True})
class TestPet:
    def test_get_pet_petId_200(self):
        pet, http_response = petstore.pet.getPetById(petId=42).result()
        assert http_response.status_code == 200

    def test_get_pet_petId_404(self):
        pet, http_response = petstore.pet.getPetById(petId=21).result()
        assert http_response.status_code == 404
...
I assume I can then update these tests with correct values for the appropriate response codes (I will prepare the state of the "system under test" myself); all I need is generated templates for the combinations of tests.
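For illustration, here is a minimal sketch of what such a template generator could look like internally, assuming the spec is loaded with PyYAML. The function name, stub format, and everything else here are hypothetical illustrations, not an existing bravado feature:

import yaml

HTTP_METHODS = ('get', 'post', 'put', 'delete', 'patch', 'head', 'options')

def generate_test_stubs(spec_path):
    # Hypothetical sketch: walk the spec and emit one test stub per
    # (operation, status code) pair.
    with open(spec_path) as f:
        spec = yaml.safe_load(f)

    stubs = []
    for path, path_item in spec.get('paths', {}).items():
        for method, operation in path_item.items():
            if method not in HTTP_METHODS:
                continue  # skip non-operation keys such as 'parameters'
            op_id = operation.get('operationId', method + '_' + path)
            slug = path.strip('/').replace('/', '_').replace('{', '').replace('}', '')
            for status_code in operation.get('responses', {}):
                stubs.append(
                    'def test_{}_{}_{}(self):\n'
                    '    # TODO: call {} and assert status {}\n'
                    '    pass\n'.format(method, slug, status_code, op_id, status_code)
                )
    return stubs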
Actually, I'm thinking that future versions could also generate "boundary case" tests (see the sketch after this list). For example, input requests:
- without any mandatory field
- without all optional fields
- without auth headers
- with wrong types for input params ...
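As a rough sketch, boundary variants like these could be derived from an operation's parameter definitions in the spec. Every name below is hypothetical, and a real implementation would also need to mutate body schemas and strip auth headers at the http_client level, rather than only varying top-level parameters:

def boundary_variants(valid_params, operation):
    # Hypothetical sketch: derive named request variants from a Swagger
    # operation's parameter list, starting from a known-valid param set.
    parameters = operation.get('parameters', [])
    variants = []

    # Drop each mandatory field in turn.
    for param in parameters:
        if param.get('required'):
            variant = dict(valid_params)
            variant.pop(param['name'], None)
            variants.append(('missing_required_' + param['name'], variant))

    # Keep only the required fields (i.e. drop all optional ones at once).
    only_required = {
        p['name']: valid_params[p['name']]
        for p in parameters
        if p.get('required') and p['name'] in valid_params
    }
    variants.append(('only_required_fields', only_required))

    # Send the wrong type for each integer parameter.
    for param in parameters:
        if param.get('type') == 'integer' and param['name'] in valid_params:
            variant = dict(valid_params)
            variant[param['name']] = 'not-an-integer'
            variants.append(('wrong_type_' + param['name'], variant))

    return variants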
Along the same lines as @ujlbu4: essentially I want to be able to specify things like:
expected_response = {
    "body": {},
    "meta": {}
}
assertEqual(response.keys(), expected_response.keys())
assertExists(response.body)
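As a rough, runnable translation of the above (assertExists is not a standard unittest assertion, so this sketch substitutes assertIsNotNone; the response dict is a stand-in for a parsed response body):

import unittest

class TestResponseShape(unittest.TestCase):
    def test_response_shape(self):
        # Stand-in for the parsed JSON body of the call under test.
        response = {'body': {'id': 42}, 'meta': {}}
        expected_response = {
            'body': {},
            'meta': {}
        }
        self.assertEqual(sorted(response.keys()),
                         sorted(expected_response.keys()))
        # Substitute for the proposed assertExists(response.body):
        self.assertIsNotNone(response['body'])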
And then pass the spec on to a Jenkins/CI job which generates the correct unittest code, runs it against the designated server, and publishes the results. No human intervention once the specs are written.
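For the CI side, a minimal sketch of the run step, assuming the generated files are named with underscores (e.g. pet_test.py) so that unittest discovery can import them; the directory and filename pattern are assumptions carried over from the example above:

import sys
import unittest

# Hypothetical CI entry point: discover the generated tests, run them,
# and exit non-zero on failure so the Jenkins job is marked as failed.
suite = unittest.defaultTestLoader.discover('./output/dir', pattern='*_test.py')
result = unittest.TextTestRunner(verbosity=2).run(suite)
sys.exit(0 if result.wasSuccessful() else 1)

Publishing the results would typically go through a JUnit-XML reporter that Jenkins can render, which this sketch leaves out.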