Tests are the cornerstone of Assertible and used to check and validate HTTP requests and responses. In a nutshell, a test is a single HTTP request which utilizes assertions and configuration data to prove an endpoint is working as expected.

Tests are designed to be small, reproducible, and deterministic so that there are no false positives and error messages can be reported accurately.


Fig 1.1 Tests for an API


Create a new test

There are several ways you can create a new test for your web service:

Create a basic test

You can create a basic test from the primary Create Test button in the top-right corner of the service overview:

Fig 1.2 Create test for a web service

Choosing this option gives you a blank slate for configuring a new request, assertions, and more. The authentication defaults to the service's default auth, and the default endpoint is /.

You can also use the "+" plus button on an endpoint card, which creates a blank test with the endpoint pre-populated:

Fig 1.3 Create test from API

Import new tests

You can also import new tests from a Swagger/OpenAPI specification, Postman collection, or curl command. With an OpenAPI specification or Postman collection, you can choose exactly which tests to import, which makes it easy to keep your Assertible tests up-to-date with your web service.

Fig 1.4 Import new API tests

Configuring a Test

Tests can be configured from the test settings view. From there, you can edit the name, request settings, variables, and more.



Configure an API test in Assertible

Basic test info

Basic info is all the settings for a test that don't affect the request. All of these settings are available at the top of any test view. The following fields can be updated:

Test Name - Name for your test! (Uniqueness is not enforced.) In our experience, the test name should be as specific as possible; this helps you identify error messages quickly.

Description / Notes - A short description of your test, so other people on your team will know what it's for. GitHub Flavored Markdown is supported. Sometimes it's useful to put a use-case or expected input/output in this field.

Tip! If you import an API from a Swagger / OpenAPI definition, many values, like the description, are extracted automatically.

Request configuration

At its core, a test is a single HTTP request. Assertible uses only a single request per test because:

  • Tests are hermetic
  • Tests are reproducible
  • Tests are deterministic
  • Failure messages are more precise

Everything about your test's HTTP request can be configured from the test view. By default, a new test will have enough information to be runnable. You can additionally configure the following:

NOTE: You can chain multiple HTTP requests by using setup steps

HTTP Method

The HTTP method for the request. Can be GET, PUT, POST, PATCH, DELETE, HEAD, OPTIONS.

HTTP Scheme

The HTTP scheme. Can be http or https.

HTTP version

You can use a specific HTTP version. The default is 1.1; other options are 1.0 and 0.9.

Host (base URL)

The base URL is a static field taken from the web service's default environment. It is visible in the test configuration but cannot be changed there.

Endpoint

The path part of the URL. For example, in assertible.com/signup, /signup would be the endpoint.

Authentication & Authorization

Authentication used for the request, if necessary. This field is designed to override any authentication data specified for the web service.

Supported auth types are Basic Authentication, Digest Authentication, OAuth v1.0a, OAuth v2.0 Bearer Token, and API Token Auth.

If you have an auth type that is not supported, you can usually set the necessary auth headers manually.

NOTE: Variable syntax is supported in all auth fields.
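
For reference, the requests below are a rough sketch (in Python, outside of Assertible) of what Basic auth and a token-based Authorization header amount to on the wire; the base URL, credentials, and token are hypothetical placeholders.

    import requests

    BASE = "https://staging.example.com"  # hypothetical base URL

    # Basic Authentication: username and password sent in the Authorization header.
    requests.get(f"{BASE}/account", auth=("alice", "s3cret"))

    # Bearer / API token auth is often just an Authorization header, which is why
    # an unsupported auth scheme can usually be configured manually as a custom
    # request header (variable syntax like {{apiToken}} would be resolved first).
    requests.get(
        f"{BASE}/account",
        headers={"Authorization": "Bearer {{apiToken}}"},
    )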

Use cookies

If checked, cookies are persisted across all HTTP requests invoked by a test, including setup steps.

For example, enable this if you have a setup step that calls a /login endpoint and the resulting Cookie headers need to be sent on all subsequent requests.

Max redirects

The maximum number of HTTP redirects allowed before aborting the request. The default is 2, maximum is 100.

The number of max redirects also applies to all setup and teardown steps. For example, if your test has a "max redirects" value of 0, no redirects will be performed in any setup, teardown, or main test requests.
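
As a point of comparison, here is a minimal sketch of the same idea using Python's requests library, where exceeding the redirect limit aborts the request; the URL is a hypothetical placeholder and the limit here is applied client-side rather than by Assertible.

    import requests

    session = requests.Session()
    session.max_redirects = 0  # analogous to a "max redirects" value of 0

    try:
        # Hypothetical endpoint that responds with a 3xx redirect.
        session.get("https://staging.example.com/old-path")
    except requests.TooManyRedirects:
        # The request is aborted instead of following the redirect chain.
        print("redirect limit exceeded")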

Variables

Variables provide a way for you to insert dynamic data into certain parts of a request, like the endpoint. For example, let's say you're testing this endpoint: /tests/{{testid}}. You can populate the {{testid}} part before the test is run. In the image below, {{testid}} will be substituted with a UUID:

Fig 1.5 Route Parameters

In the above example, every time the test is run the final URL will be: https://assertible.com/tests/12345.

Variables can be defined in the 'Variables' section of the test configuration, in the environment, or in a setup step. They can be used in many parts of a test's request, including the endpoint, the request body, and auth fields.

You can splice a variable into any of these fields by using mustache-like template syntax: {{...}}. If a variable is used but not defined, the test will fail, and the test result will show you the reason for the failure.
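
To make the substitution mechanics concrete, here is a minimal sketch, outside of Assertible, of how a {{...}} template could be rendered before the request is made; the render helper and its behaviour on missing variables are illustrative assumptions.

    import re

    def render(template: str, variables: dict) -> str:
        """Replace every {{name}} with its value; fail loudly if one is missing."""
        def lookup(match: re.Match) -> str:
            name = match.group(1).strip()
            if name not in variables:
                # Mirrors the documented behaviour: an undefined variable fails the test.
                raise KeyError(f"undefined variable: {name}")
            return str(variables[name])
        return re.sub(r"\{\{(.*?)\}\}", lookup, template)

    print(render("/tests/{{testid}}", {"testid": "12345"}))  # -> /tests/12345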

Query parameters

Add query parameters to the request.

Each query parameter has a checkbox next to it that can be used to disable the parameter for debugging purposes. If the checkbox is not checked, the query parameter is disabled and won't be used in the request.

Note: Assertible does not apply URL or percent encoding to query parameters.

Request headers

Add custom headers to the request. This is useful for manually setting auth headers or other service-specific information.

Each request header has a checkbox next to it that can be used to disable the header for debugging purposes. If the checkbox is not checked, the request header is disabled and won't be used in the request.

Request body

Set a request body. Three body types are supported:

  • x-www-form-urlencoded

    A set of key value pairs. When using this request body type, Assertible handles URL encoding special characters. URL encoding is done after variable replacement.

  • multipart/form-data

    A set of key value pairs representing a multipart/form-data upload.

    The multipart boundary is generated automatically. A Content-Type: multipart/form-data; boundary=... header is also created automatically and will override any existing Content-Type header configured for the test.

  • Raw

    A raw request body. Assertible does not URL/percent encode or otherwise modify raw request bodies in any way before submitting to the remote server.
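
To illustrate the difference, here is a rough sketch of the equivalent calls using Python's requests library, assuming variables have already been substituted; the base URL and payloads are hypothetical.

    import requests

    BASE = "https://staging.example.com"  # hypothetical base URL

    # x-www-form-urlencoded: key/value pairs; special characters such as "&"
    # are URL encoded for you (after variable replacement).
    requests.post(f"{BASE}/items", data={"name": "a&b", "owner": "alice"})

    # Raw: the body is sent byte-for-byte, with no encoding or other changes.
    requests.post(
        f"{BASE}/items",
        data='{"name": "a&b"}',
        headers={"Content-Type": "application/json"},
    )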

Setup and teardown steps

Setups and teardowns are steps that are run before and after the test itself is run. This makes it possible to create multi-step tests for more complex testing requirements. These steps are re-usable across all tests on the web service and can be enabled on any test.

Setup steps

With setup steps, you can make a separate HTTP request and capture the response, generate random variables, or pause for a period of time before running the test.

Teardown steps

Teardown steps allow you to run additional HTTP requests after the test is complete, for tasks like cleaning up data left over from a PUT request.

Setup steps

Setup steps allow you to create test variables before your test is run, by capturing them from an HTTP request or generating random data, or to pause briefly before the test. There are three types of setup steps: an HTTP request, a random data generator, and a pause.

There are many use-cases for setup steps, like fetching an auth token to use in your test, or logging into a website to create a session cookie. In all of these cases, you can use setup steps to describe how to fetch and populate variables before your test is run.

Setup steps can be configured on the Setup step tab of the test configuration page.

Fig 1.6 Configure Setup Step

Multiple setup steps

Multiple setup steps can be configured to test more complicated scenarios, set up prerequisite dynamic data, and chain HTTP requests. Simply enable two or more setup steps, then drag and drop the steps into the desired order.


Test scenarios using multiple setup steps

You might notice that you can only make assertions on the core test request. This is intentional, to ensure that the test focuses on testing a specific endpoint and that error messages are as specific as possible.

NOTE: Whenever possible, we recommend using static environment variables to model prerequisite data. The more steps you add to a test, the more likely it is to have flaky results.

Using variables in setups

When creating a setup step, you can use variables from the environment and previous setup steps in the configuration. They can be used in any field of a setup step configuration except the name and the name of any captured variables.

For example, if you have an environment variable named {{authenticationServer}}, you can use that variable within a setup step definition. In the image below, an environment variable is used as the request host in an HTTP setup step:

Fig 1.7 Variable in setup step configuration

HTTP request generator

The HTTP Request setup step allows you to make an additional HTTP request before the main test is run. You can capture data from the HTTP request, save it in a test variable, and use it throughout your tests.

HTTP request variable capture

You can capture variables from the setup's HTTP request, and then use those variables in your test and in teardowns.

Each variable capture requires three fields:

  • Name

    This is the name of the variable, which you can then use throughout the test. For example, you can name the variable userId and then use it as {{userId}}.

  • From

    This field specifies which part of the response to capture the variable from: either a response header or the response body.

  • Selector

    The selector describes how to capture the variable. If From is JSON body, this is a JSON path selector; if From is XML body, this is an XPath selector; and if From is Header, this is the name of the response header. If From is Text body, the selector is not configurable and the variable is populated with the entire contents of the response body.
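
As an illustration of the capture mechanics (not Assertible's implementation), the Python sketch below makes a setup request, pulls values out of the response, and stores them under variable names; the endpoint, the response shape, and the dotted-path helper standing in for a real JSON path selector are all assumptions.

    import requests

    def select(document: dict, selector: str):
        """Walk a dotted path like 'data.id' through a decoded JSON body."""
        value = document
        for key in selector.split("."):
            value = value[key]
        return value

    variables = {}
    response = requests.post("https://staging.example.com/items", json={"name": "demo"})

    # Name = itemId, From = JSON body, Selector = data.id
    variables["itemId"] = select(response.json(), "data.id")

    # Name = itemUrl, From = Header, Selector = Location
    variables["itemUrl"] = response.headers["Location"]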

HTTP request setup examples

For example, if you want to test a POST request to create an entity and a subsequent GET request to fetch the entity, you would have a test made up of two steps:

  • POST /item in your setup step, and save the id from the response.
  • GET /item/{{id}} in your test by using the id variable.

Setup steps can also be used without populating any variables. For example, if you want to perform a test as a logged in user, you would have a login request in your setup step:

  • POST login form to /login in your setup step
  • GET /account or POST a form to /form in your test step

NOTE: Ensure the use cookies setting is enabled if you want sessions to persist between your setup step and test.
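
The cookie behaviour can be pictured with a requests.Session in Python, which persists cookies across requests in a similar way; the endpoints and credentials below are hypothetical.

    import requests

    session = requests.Session()

    # Setup step: POST the login form; the session cookie from the response
    # is stored on the session object.
    session.post("https://staging.example.com/login",
                 data={"user": "alice", "password": "s3cret"})

    # Test step: the stored cookie is sent automatically, so this request
    # runs as the logged-in user.
    response = session.get("https://staging.example.com/account")
    assert response.status_code == 200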

Random generator

A Random setup step allows you to generate random data, save it to a test variable, and use it throughout your tests. This is useful in cases where you don't need to fetch any specific data, but you need to, for example, populate a userName query parameter.

Random setup variable capture

Capturing a variable in a Random setup generator only needs two fields:

  • Type

    The type of random data to generate. This can be Random Text, Random Number, or Random UUID.

  • Name

    This is the name of the variable, which you can then use throughout the test. For example, you can name the variable userId and then use it as {{userId}}.

Pause step

A Pause step allows you to briefly pause for a duration between test steps. The pause duration is specified in seconds, with a minimum of 0.1 seconds and a maximum of 30 seconds.
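
For a rough idea of what these steps produce, the Python sketch below generates the three kinds of random data and performs a clamped pause; the text length, number range, and clamping shown are illustrative assumptions rather than Assertible's exact rules.

    import random
    import string
    import time
    import uuid

    variables = {
        "randomText": "".join(random.choices(string.ascii_lowercase, k=12)),  # Random Text
        "randomNumber": random.randint(0, 10_000),                            # Random Number
        "randomId": str(uuid.uuid4()),                                        # Random UUID
    }

    def pause(seconds: float) -> None:
        """Pause step: wait between steps, clamped to the documented 0.1-30 s range."""
        time.sleep(min(max(seconds, 0.1), 30.0))

    pause(1.5)
    print(variables)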

Teardown steps

Teardown steps are very similar to setup steps, except they are run after the test instead of before. Using teardown steps is useful for things like cleaning up left over test data and logging out of a web service.

Multiple teardown steps can be enabled for a single test to allow for cleaning up data left over from setup steps and running other tasks after the test has completed.

Teardown steps currently support these generators:

HTTP request teardown

This type of teardown allows you to make an HTTP request after the test. The HTTP request can be used to clean up test data or perform any other post-test actions. Inside of a teardown step, variables defined in the test, environment, and setup steps are available.

Additionally, a teardown step can capture variables from the test and use them in its configuration. Check out the teardown examples for more information.

HTTP request teardown variable capture

In addition to using variables defined in the test, environment, and setup steps, a teardown can capture variables from the test's response. To capture a variable in a teardown step, the following configuration is available:

  • From

    Which part of the test response to capture the variable from: either the Response body or a Response header.

  • Name

    The name of the variable which can then be used within the teardown. For example, if the name is userId then the {{userId}} variable is available.

  • Selector

    The selector describes how to capture the variable. If From is JSON body, this is a JSON path selector; if From is XML body, this is an XPath selector; and if From is Header, this is the name of the response header. If From is Text body, the selector is not configurable and the variable is populated with the entire contents of the response body.

HTTP request teardown examples

Teardown steps are most useful for cleaning up data left over from the test. For example, imagine a scenario that tests PUTing an entity. This flow could be accomplished by:

  1. Call POST /item in a setup step to create a new item, and save the id returned in the response to a variable named itemId.
  2. In the test, make a request to PUT /items/{{itemId}} using the variables saved from step 1.
  3. Call DELETE /item/{{itemId}} during the teardown step, to delete the item created in the setup step.
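
Expressed as plain HTTP calls in Python (for illustration only), the flow above looks roughly like this; the /item endpoints and the id field in the response body are the hypothetical ones from the example.

    import requests

    BASE = "https://staging.example.com"  # hypothetical base URL

    # 1. Setup step: create the item and capture its id.
    item_id = requests.post(f"{BASE}/item", json={"name": "demo"}).json()["id"]

    # 2. Test: update the item; this is the only request assertions run against.
    response = requests.put(f"{BASE}/items/{item_id}", json={"name": "updated"})
    assert response.status_code == 200

    # 3. Teardown step: delete the item so the test leaves no data behind.
    requests.delete(f"{BASE}/item/{item_id}")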

This test structure is ideal because it allows tests to define and create their own dependencies, leaving little room for flaky failures because it doesn't rely on a complex sequence of requests before the test.

Tests and assertions

Fig 1.8 Tests and Assertions

You can add and manage your test's assertions on the Assertions tab of a test's configuration view. For more information on how to configure assertions, check our assertions documentation.

Test Badges

Each test has an embeddable badge for displaying the current status of only that test's assertions. You can use this to communicate current test states with team members or within documentation pages. The badge will display the current test result state at any given time.

Fig 1.9 Embeddable test badges

You can also create a test badge for a specific environment, like staging or production, by using the environment query parameter on the badge URL.

Badges can also be created for an entire web service.

Embedding badges

You can find and copy the badge image URL on any test's page, and use it outside of the dashboard. More specifically, there is a copy button next to the badges in the dashboard which will copy the markdown for the badge directly to your clipboard. Here are a few examples of using badges:

Badge monitoring test on production

https://assertible.com/tests/{{test-id}}/status?api_token={{token}}

Badge monitoring test on staging

https://assertible.com/tests/{{test-id}}/status?api_token={{token}}&environment=staging

Markdown:

![Assertions Status](https://assertible.com/tests/{{test-id}}/status?api_token={{token}})

HTML:

<img src="https://assertible.com/tests/{{test-id}}/status?api_token={{token}}" alt="Assertions Status" />
