How to use

Here are some examples and tips on how to use izitest.

Project structure

Suppose you have the following project tree:

my_c_project
├── build/
│   └── ...
├── docs/
│   └── ...
├── LICENSE
├── README.md
├── my_binary
├── tests/
│   └── ...
└── izitestsuite.py

Creating tests

Tests are YAML files, so they are easy to understand and quick to create or edit.

Each test file consists of a list of tests to execute. Here is an example of a single test:

---

- name: 'Command option' # Name of the test
  desc: 'Test bash ''-c'' option' # An optional description

  # Args to APPEND before execution (which means that args
  # passed through CLI WILL NOT be replaced)
  args:
    - '-c'
    - 'echo -n Hello World!'

  stdin: ''

  # If 'expect' is present, it will fake the execution of a 'ref' executable
  expect:
    stdout: 'Hello World!'
    stderr: ''
    retcode: 0

  # Maximum amount of time (in seconds) for the execution
  timeout: 3

  # If True, the test will be skipped
  skip: False

  # Checks to perform
  checks:
    # Exact same outputs
    - 'same_stdout'
    - 'same_stderr'
    - 'same_retcode'

    # Same number of lines in outputs
    - 'same_stdout_size'
    - 'same_stderr_size'

    # Test if anything was outputted
    # - 'has_stdout'
    # - 'has_stderr'

    # Test if nothing was outputted
    # - 'has_no_stdout'
    - 'has_no_stderr'

This test checks whether bash's `-c` option works as expected (using the `echo -n` command).

NOTE: This example shows all available fields. The only required field is `name` (plus `expect` if you don't provide a reference executable, as explained later).

Testing

Here is an example of izitestsuite.py you can use to test your binary my_binary against the set of tests contained in the tests/ directory:

#!/usr/bin/env python3
# coding: utf-8

from izitest import *

if __name__ == "__main__":
    ts = init_testsuite()
    ts.run()

    exit(0)

Then you can run all tests under tests/ by executing:

./izitestsuite.py my_binary

Testing against a binary

You can also test your binary against a reference binary.

If a test provides an expect value, izitest fakes the execution of a reference binary that would have returned exactly what we expected. If it does not, we must provide a reference binary to compare ours against (otherwise the test will be skipped).
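For instance, a test with no expect field only runs when a reference binary is supplied via --ref. A minimal sketch (the test name, description, and argument below are illustrative, not from izitest itself):

```yaml
---

# No 'expect' field here: without a reference binary, this test is skipped
- name: 'Version flag'
  desc: 'Compare our --version output against the reference binary'

  args:
    - '--version'

  checks:
    - 'same_stdout'
    - 'same_retcode'
```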

Let's assume your binary is supposed to mimic the behavior of your favorite shell command; then you could run:

./izitestsuite.py my_binary --ref /usr/bin/sl  # I know 'sl' is your favorite command

NOTE: You can combine the --ref CLI argument with expect; just keep in mind that expect takes priority.

This is useful when you need to check that your binary behaves the same way as another one in most cases but differs in specific ones.
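As a sketch, a single test file can mix both approaches: most tests are compared against the binary passed with --ref, while one test overrides it with an explicit expect (the names and values below are illustrative):

```yaml
---

# Compared against the reference binary passed with --ref
- name: 'Common case'
  args:
    - 'input.txt'

# 'expect' takes priority over --ref for this test only
- name: 'Special case'
  args:
    - '--weird-flag'

  expect:
    stdout: 'special output'
    stderr: ''
    retcode: 1
```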

Convenient features

For convenience, some test fields have default values so you can omit them.

Implicit checks

By default, if no check is provided, the test will run same_stdout, same_stderr and same_retcode. Therefore, these two tests are equivalent:

---

- name: 'A basic test'
  checks:
    - 'same_stdout'
    - 'same_stderr'
    - 'same_retcode'

- name: 'Same test but with implicit checks'

Default expectations

Sometimes you expect your binary to simply succeed without printing anything.

To do that, you can simply write expect: without specifying what you expect, like this:

---

- name: 'Working test'
  desc: 'Make sure the program is successful and does not output anything'

  expect:
    stdout: ''
    stderr: ''
    retcode: 0

  checks:
    - 'same_stdout' # equivalent to has_no_stdout in this case
    - 'same_stderr'
    - 'same_retcode'

- name: 'Same working test'
  desc: 'Same test as above but using default expect values'

  expect:

  checks:
    - 'same_stdout' # equivalent to has_no_stdout in this case
    - 'same_stderr'
    - 'same_retcode'

You’re really that lazy?

Here are the minimal test definitions:

---

# Minimal test definition
- name: "Just a named test"

# Minimal test definition if you don't provide a ref executable
- name: "What a lazy test"
  expect:

As you can see, these definitions make use of implicit checks and default expectations features.