Robot Framework library for RESTful JSON APIs
- RESTinstance relies on Robot Framework's language-agnostic, clean and minimal syntax for API tests. It is tied neither to any particular programming language nor to any development framework. Using RESTinstance requires little, if any, programming knowledge. It builds on long-term technologies with well-established communities, such as HTTP, JSON (Schema), Swagger/OpenAPI and Robot Framework.
- It validates JSON using JSON Schema, guiding you to base your API tests on properties rather than on specific values (e.g. "email must be valid" vs. "email is foo@bar.com"). This approach reduces test maintenance when the values returned by the API are prone to change. Although values are not required, you can still test them whenever they make sense (e.g. GET the response body from one endpoint, then POST some of its values to another endpoint and verify the results).
- It generates JSON Schema for requests and responses automatically, and the schemas get more accurate as your tests grow. Output the schema to a file and reuse it as expectations to test the other methods, as most of them respond similarly with only minor differences. Or extend the schema further to a full Swagger spec (version 2.0; OpenAPI 3.0 support is also planned), which RESTinstance can test requests and responses against. All this leads to reusability, great test coverage with a minimal number of keystrokes, and very clean tests.
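To get a feel for this property-based approach outside Robot Framework, here is a minimal sketch using GenSON and jsonschema, the same libraries RESTinstance uses under the hood; the payloads and property names are made up for illustration:

from genson import SchemaBuilder
from jsonschema import validate, ValidationError

# Build a schema from an example response body (made-up sample data).
builder = SchemaBuilder()
builder.add_object({"id": 1, "name": "Leanne Graham", "email": "leanne@example.com"})
schema = builder.to_schema()
# The schema now asserts property names and JSON types, not exact values.

# Another payload with different values still validates fine...
validate({"id": 2, "name": "Ervin Howell", "email": "ervin@example.com"}, schema)

# ...whereas a payload with a wrong type does not.
try:
    validate({"id": "2", "name": "Ervin Howell", "email": "ervin@example.com"}, schema)
except ValidationError as error:
    print(error.message)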
On Python 3.6 and 3.7 you can install and upgrade from PyPI:
python3 -m venv venv
source venv/bin/activate
pip install --upgrade RESTinstance
The package works on the Python 2.7 series as well, but using 2.7 is not recommended from 2020 onwards:
pip install --user --upgrade virtualenv
virtualenv venv
source venv/bin/activate
pip install --upgrade RESTinstance
These also install Robot Framework if you do not have it already.
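If you want to double-check what ended up in the virtualenv, an optional sanity check such as the following works on both Python lines; the two names are the distribution names used on PyPI:

# Print the installed versions of RESTinstance and Robot Framework (optional check).
import pkg_resources

for name in ("RESTinstance", "robotframework"):
    print(name, pkg_resources.get_distribution(name).version)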
There is a step-by-step tutorial in the making, best accompanied by the keyword documentation.
- Create two new empty directories, `atest` and `results`.
- Create a new file `atest/YOURNAME.robot` with the content:
*** Settings ***
Library         REST    https://jsonplaceholder.typicode.com
Documentation   Test data can be read from variables and files.
...             Both JSON and Python type systems are supported for inputs.
...             Every request creates a so-called instance. Can be `Output`.
...             Most keywords are effective only for the last instance.
...             Initial schemas are autogenerated for request and response.
...             You can make them more detailed by using assertion keywords.
...             The assertion keywords correspond to the JSON types.
...             They take in either path to the property or a JSONPath query.
...             Using (enum) values in tests is optional. Only type is required.
...             All the JSON Schema validation keywords are also supported.
...             Thus, there is no need to write any own validation logic.
...             Not a long path from schemas to full Swagger/OpenAPI specs.
...             The persistence of the created instances is the test suite.
...             Use keyword `Rest instances` to output the created instances.

*** Variables ***
${json}         { "id": 11, "name": "Gil Alexander" }
&{dict}         name=Julie Langford

*** Test Cases ***
GET an existing user, notice how the schema gets more accurate
    GET         /users/1                    # this creates a new instance
    Output schema   response body
    Object      response body               # values are fully optional
    Integer     response body id            1
    String      response body name          Leanne Graham
    [Teardown]  Output schema               # note the updated response schema

GET existing users, use JSONPath for very short but powerful queries
    GET         /users?_limit=5             # further assertions are to this
    Array       response body
    Integer     $[0].id                     1           # first id is 1
    String      $[0]..lat                   -37.3159    # any matching child
    Integer     $..id                       maximum=5   # multiple matches
    [Teardown]  Output  $[*].email          # outputs all emails as an array

POST with valid params to create a new user, can be output to a file
    POST        /users                      ${json}
    Integer     response status             201
    [Teardown]  Output  response body       ${OUTPUTDIR}/new_user.demo.json

PUT with valid params to update the existing user, values matter here
    PUT         /users/2                    { "isCoding": true }
    Boolean     response body isCoding      true
    PUT         /users/2                    { "sleep": null }
    Null        response body sleep
    PUT         /users/2                    { "pockets": "", "money": 0.02 }
    String      response body pockets       ${EMPTY}
    Number      response body money         0.02
    Missing     response body moving        # fails if property moving exists

PATCH with valid params, reusing response properties as a new payload
    &{res}=     GET   /users/3
    String      $.name                      Clementine Bauch
    PATCH       /users/4                    { "name": "${res.body['name']}" }
    String      $.name                      Clementine Bauch
    PATCH       /users/5                    ${dict}
    String      $.name                      ${dict.name}

DELETE the existing successfully, save the history of all requests
    DELETE      /users/6                    # status can be any of the below
    Integer     response status             200    202    204
    Rest instances  ${OUTPUTDIR}/all.demo.json    # all the instances so far
- Make JSON API testing great again:
robot --outputdir results atest/
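If the JSONPath queries used above (such as `$[0]..lat`, `$..id` and `$[*].email`) are new to you, you can experiment with them directly in Python using jsonpath-ng, the library RESTinstance uses for these queries; the data below is a made-up stand-in for the real response body:

from jsonpath_ng.ext import parse

# A made-up stand-in for the GET /users?_limit=5 response body.
users = [
    {"id": 1, "email": "one@example.com", "address": {"geo": {"lat": "-37.3159"}}},
    {"id": 2, "email": "two@example.com", "address": {"geo": {"lat": "-43.9509"}}},
]

# "$[0]..lat" matches any 'lat' property anywhere under the first element.
print([match.value for match in parse("$[0]..lat").find(users)])   # ['-37.3159']

# "$..id" matches every 'id' in the document, as asserted with maximum=5 above.
print([match.value for match in parse("$..id").find(users)])       # [1, 2]

# "$[*].email" collects all emails, like the `Output $[*].email` teardown.
print([match.value for match in parse("$[*].email").find(users)])  # both emails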
Bug reports and feature requests are tracked in GitHub. We do respect pull request(er)s.
The logic is to detach the virtualenvs from the system-level Python (and its dependencies) as follows:
- We use pyenv to manage multiple Pythons user-wide.
- With pyenv-installed Pythons, we never mess with the system's default Python.
- We ended up with Nox after evaluating Bash scripts, `make`, `invoke` and `tox` for automating the virtualenvs.
To understand the first two practices, these are worth reading:
- Real Python's intro to pyenv
- Real Python's virtualenvs primer
- virtualenv compatibility with the stdlib venv module
Third, unlike Tox, Nox uses a Python file (`noxfile.py`) for configuration, yet:
- Supports multiple Python versions, each session being run on some `pythonX.X`.
- A session is a single virtualenv, which is stored in `.venv/<session_name>`.
- Every `nox` run recreates the session, thus the virtualenv, unless `reuse_venv=True`.
The pyenv setup works on OS X and on the common Linux distros out of the box:
curl https://pyenv.run | bash
export PATH="$HOME/.pyenv/bin:$PATH"
eval "$(pyenv init -)"
The first script installs it user-wide, thus it never requires `sudo` rights.
If you are on Windows, using pyenv might or might not be an option; regardless, you may want to check out pyenv-win instead.
We test, develop, build and publish on Python 3.6.9, and use venvs as preferred:
git clone git@github.com:asyrjasalo/RESTinstance.git
cd RESTinstance
pyenv install --skip-existing 3.6.9 && pyenv rehash
python3 -m venv .venv/dev
source .venv/dev/bin/activate
pip install -e .
Nox automates handling the `.venv/<task>`s for the workflows, on Windows as well:
pip install --upgrade nox
The actual tasks are defined in `noxfile.py`, as well as our settings, like:
- The default Python interpreter to run all the development tasks is `python3.6`.
- We explicitly use the venv module for the virtualenvs now, as we develop on Python >= 3.3 anyway.
- Whether a new virtualenv is recreated every time the respective task is run (which is the default for most of our tasks).
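For a rough idea of what such a configuration looks like, here is a minimal `noxfile.py` sketch; it is not the project's actual file, and the session bodies and installed dependencies are assumptions for illustration only:

import nox

# Sessions run when plain `nox` is invoked (assumption for this sketch).
nox.options.sessions = ["test", "atest"]

@nox.session(python="3.6", venv_backend="venv")
def test(session):
    """Run development tests for the package."""
    session.install("-e", ".")
    session.install("pytest")
    session.run("pytest", *session.posargs)

@nox.session(python="3.6", venv_backend="venv", reuse_venv=True)
def atest(session):
    """Run acceptance tests for the project."""
    session.install("-e", ".")
    session.run("robot", "--outputdir", "results", *session.posargs)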
A session is a task, running in the respective `.venv/<task>`. To list all possible sessions:
nox -l
Sessions defined in RESTinstance/noxfile.py:
* test -> Run development tests for the package.
- testenv -> Run development server for acceptance tests.
* atest -> Run acceptance tests for the project.
- docs -> Regenerate documentation for the project.
- black -> Reformat/unify/"blacken" Python source code in-place.
- prospector -> Run various static analysis tools for the package.
- build -> Build sdist and wheel dists.
- release_testpypi -> Publish dist/* to TestPyPI.
- install_testpypi -> Install the latest (pre-)release from TestPyPI.
- release -> Tag, build and publish a new release to PyPI.
- install -> Install the latest release from PyPI.
- clean -> Remove all .venv's, build files and caches in the directory.
Sessions marked with `*` are selected, sessions marked with `-` are skipped.
That is, to run both `test`s and `atest`s:
nox
Session `nox -s atest` assumes you have started `testapi/` on mountebank:
nox -s testenv
Running the above assumes you have node (>= 6) installed on your system.
After it has started, you can debug the requests and responses made by the tests in a web browser at localhost:2525.
Both `nox -s test` and `nox -s atest` allow passing arguments to pytest and robot, respectively:
nox -s test -- test/<test_modulename>.py
nox -s atest -- atest/<atest_suitedir>/<atest_suitefile>.robot
You know, having a virtualenv even for generating `docs/` is not a bad idea:
nox -s docs
Remove all sessions (`.venv/`s) as well as temporary files in your working copy:
nox -s clean
Our PyPI distributions are known to work well on the Python 3.7 and 2.7 series too:
nox -s clean build
We use zest.releaser for versioning, tagging and building (universal) `bdist_wheel`s.
It uses twine underneath to upload to PyPIs securely over HTTPS, which can't be done with `python setup.py` commands.
This workflow is preferred for distributing a new (pre-)release to TestPyPI:
nox -s test atest docs clean build release_testpypi install_testpypi
If that installed well, all should be fine to push the final release to PyPI:
nox -s release
To install the latest release from PyPI, and in a dedicated venv of course:
nox -s install
For more advanced `nox` argument usage, it helps to enable shell completion:
eval "$(register-python-argcomplete nox)"
On `zsh`, ensure you have bash compatibility enabled in `.zshrc` or similar:
autoload -U bashcompinit
bashcompinit
These completions likely do not work on vanilla PowerShell, but can be used on Windows Subsystem for Linux.
Catching errors already at write time, regardless of the editor, is helped by Palantir's Python Language Server:
python3 -m venv .venv/dev
source .venv/dev/bin/activate
pip install --upgrade python-language-server[all]
Installing the `all` bundle, and the LSP plugin for your editor, enables running useful linters in real time, such as:
- Either `autopep8` or `black` (preferred) for automated code formatting
- `isort` for sorting code import statements
- McCabe for code complexity checking
- `mypy` for static type checking on Python 3
- `pycodestyle` for coding style checking
- `pyflakes` for detecting various coding errors
Remember to (auto-)start the language server in the background via your editor.
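As a concrete, hypothetical example of what these linters report as you type, consider a file like the following (the messages in the comments paraphrase typical pyflakes and mypy output):

# linting_demo.py - a hypothetical file for illustrating real-time linter feedback.
import os          # pyflakes: 'os' imported but unused


def add(a: int, b: int) -> int:
    return a + b


add("1", 2)        # mypy: argument 1 to "add" has incompatible type "str"; expected "int"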
We want our static analysis checks run before code even ends up in a commit.
Thus both the `nox` and `nox -s test` commands bootstrap pre-commit hooks in your git working copy.
The actual hooks are configured in `.pre-commit-config.yaml`.
- export `mb` recorded responses to CI (pre-commit hook: `nox -s save_testenv`)
- change `nox -s testenv` to load the saved testenv -> rid of `--allowInjection`
- add CI (GitHub Actions? GitLab?)
- add Python types to pass `prospector --with-tool mypy`
- enable pre-commit hook for prospector
RESTinstance is under Apache License 2.0 and was originally written by Anssi Syrjäsalo.
It was first presented at the first RoboCon, 2018.
Contributors:
- jjwong for helping with keyword documentation and examples (also check RESTinstance_starter_project)
- Przemysław "sqilz" Hendel for using and testing RESTinstance in early phase (also check RESTinstance-wrapper)
- Vinh "vinhntb" Nguyen, #52.
- Stavros "stdedos" Ntentos, #75.
- Nicholas "bollwyvl" Bollweg, #84.
- Trey Turner, #86
We use the following Python excellence under the hood:
- Flex, by Piper Merriam, for Swagger 2.0 validation
- GenSON, by Jon "wolverdude" Wolverton, for JSON Schema generation
- jsonpath-ng, by Tomas Aparicio and Kenneth Knowles, for handling JSONPath queries
- jsonschema, by Julian Berman, for JSON Schema validation
- pygments, by Georg Brandl et al., for JSON syntax coloring in the terminal `Output`
- requests, by Kenneth Reitz et al., for making HTTP requests
See requirements.txt for all the direct run time dependencies.
REST your mind, OSS got your back.