Basic Structure¶
The interface design and naming are inspired by the Boost Test Library. Following this naming scheme, the unit testing package consists of three basic structural elements: assertions, test cases and test suites.
The basic building blocks of this unit testing framework are assertions, which check whether a condition is true. See Assertion Types for a clarification of the difference between the three assertion types. Assertions are grouped into single test cases, and test cases are organized in test suites.
A test suite is a group of test cases that live in a single procedure file. You can group multiple test suites in a named test environment by using the optional parameter name of RunTest().
For a list of all objects see Index or use the Search Page.
Test Run¶
A Test Run is executed using RunTest() with only a single mandatory parameter, the test suite(s) to run.
Function definition of RunTest¶
-
variable RunTest(string procWinList, string name = defaultValue, string testCase = defaultValue, variable enableJU = defaultValue, variable enableTAP = defaultValue, variable enableRegExp = defaultValue, variable allowDebug = defaultValue, variable debugMode = defaultValue, variable keepDataFolder = defaultValue, string traceWinList = defaultValue, string traceOptions = defaultValue)¶
Main function to execute test suites with the unit testing framework.
usage example¶
RunTest("proc0;proc1", name="myTest")
This command will run the test suites proc0 and proc1 in a test named myTest.
- Parameters
procWinList –
A list of procedure files that should be treated as test suites.
The list entries should be separated by semicolons (";").
The procedure names must not include Independent Module specifications.
This parameter can be given as a regular expression when enableRegExp is set to 1.
name –
(optional) default “Unnamed”
descriptive name for the executed test suites. This can be used to group multiple test suites into a single test run.
testCase –
(optional) default “.*” (all test cases in the list of test suites)
Function names of the test cases that should be executed in the given list of test suites (procWinList).
The list entries should be separated by semicolons (";").
This parameter is treated as a regular expression when enableRegExp is set to 1.
enableJU –
(optional) default disabled, enabled when set to 1:
A JUnit-compatible XML file is written at the end of the test run. This allows combining this framework with continuous integration servers such as Atlassian Bamboo, GitLab, etc. Cannot be combined with enableTAP.
enableTAP –
(optional) default disabled, enabled when set to 1:
A TAP-compatible file is written at the end of the test run.
Cannot be combined with enableJU.
enableRegExp –
(optional) default disabled, enabled when set to 1:
The input for test suites (procWinList) and test cases (testCase) is treated as a regular expression.
Example¶
RunTest("example[1-3]-plain\\.ipf", enableRegExp=1)
This command will run all test cases in the test suites example1-plain.ipf, example2-plain.ipf and example3-plain.ipf.
allowDebug –
(optional) default disabled, enabled when set to 1:
The Igor debugger will be left in its current state when running the tests. This option is ignored when debugMode is also enabled.
debugMode –
(optional) default disabled, enabled when set to 1-15:
The Igor debugger will be turned on in the state:
1st bit = 1 (IUTF_DEBUG_ENABLE): Enable Debugger (only breakpoints)
2nd bit = 1 (IUTF_DEBUG_ON_ERROR): Debug on Error
3rd bit = 1 (IUTF_DEBUG_NVAR_SVAR_WAVE): Check NVAR SVAR WAVE
4th bit = 1 (IUTF_DEBUG_FAILED_ASSERTION): Debug on failed assertion
Example¶
RunTest(..., debugMode = IUTF_DEBUG_ON_ERROR | IUTF_DEBUG_FAILED_ASSERTION)
This will enable the debugger with Debug On Error and debugging on failed assertions.
keepDataFolder –
(optional) default disabled, enabled when set to 1:
The temporary data folder in which each test case is executed is not removed at the end of the test case. This allows reviewing the produced data.
traceWinList –
(optional) default ""
A list of procedure windows whose execution gets traced. The unit testing framework saves an RTF document for each traced procedure file. When REGEXP is set in traceOptions, traceWinList is also interpreted as a regular expression.
traceOptions –
(optional) default ""
A key:value pair list of additional tracing options. Currently supported keys are:
INSTRUMENTONLY:boolean – when set, run instrumentation only and return; no tests are executed.
HTMLCREATION:boolean – when set to zero, no HTML result files are created at the end of the run.
REGEXP:boolean – when set, traceWinList is interpreted as a regular expression.
- Returns
total number of errors
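The options above can be combined in a single call. A hypothetical invocation (the procedure names proc0 and proc1 are placeholders) might look like this:

```igor
// Run two test suites under one test-run name, write a JUnit XML
// report for a CI server and keep the per-test-case data folders
// for later inspection.
RunTest("proc0;proc1", name="nightly", enableJU=1, keepDataFolder=1)
```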
Test Suite¶
A Test Suite is a group of Test Cases which belong together. All test functions are defined in a single procedure file, and RunTest() calls them from top to bottom. Generally speaking, a Test Suite is equal to a procedure file. Therefore test suites cannot be nested, although multiple test suites can be run with one command by supplying a list to the parameter procWinList of RunTest().
Note
Although possible, a test suite should not live inside the main program. It should be separated from the rest of the project into its own procedure file. This also allows loading only the necessary parts of your program into the unit test.
Test Case¶
A Test Case is one of the basic building blocks grouping assertions together. A function is considered a test case if it fulfills all of the following properties:
It takes no parameters.
It returns a numeric value (Igor Pro default).
Its name does not end with _IGNORE or _REENTRY.
It is either non-static, or static and part of a regular module.
The first rule makes the test case callable in automated test environments.
The third rule reserves the _IGNORE namespace so that advanced users can add their own helper functions. It is advised to define all test cases as static functions and to create one regular, distinctively named module per procedure file. This keeps the Test Cases in their own namespace so they do not interfere with user-defined functions in ProcGlobal.
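A minimal sketch of this layout; the module name TST and the function bodies are illustrative assumptions, not part of the framework:

```igor
#pragma rtGlobals = 3
#pragma ModuleName = TST // hypothetical regular module, one per procedure file

// A valid test case: no parameters, numeric return (Igor Pro default),
// name without the _IGNORE/_REENTRY suffixes, static inside a regular module.
static Function CheckSquare()
	CHECK_EQUAL_VAR(Square_IGNORE(2), 4)
End

// Helper functions use the reserved _IGNORE suffix so they are
// not picked up as test cases.
static Function Square_IGNORE(variable v)
	return v * v
End
```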
A defined list of test cases in a test suite can be run using the optional
parameter testCase
of RunTest()
. When executing multiple test
suites and a test case is found in more than one test suite, it is executed in
every matching test suite.
Test cases can be marked as expected failures, e.g. if the test case is written
before the functions are fully implemented. The assertions are executed, but
neither does the error counter increase nor is the test run aborted. To mark
a test case as an expected failure, write the keyword in a comment above the function (at most 4 lines above Function are considered as tags, one tag per line):
// UTF_EXPECTED_FAILURE
Function TestCase_NotWorkingYet()
	// assertions that fail here are counted as expected failures
End
Only assertions in a test case are marked as expected failures. If the test case ends due to an Abort or AbortOnRTE, the error counter is increased and the test case fails regularly. If such an abort condition should also be covered as an expected failure, use the following code construct:
// UTF_EXPECTED_FAILURE
Function TestCase_NotWorkingYet()
	PASS()
	try
		CodeThatAborts()
	catch
		FAIL()
	endtry
End
If the test case actually passes, the PASS() prevents a failure due to zero assertions in the test case. If the test case should instead be evaluated as failed when it passes, remove the PASS(). Note that a failure due to zero assertions is a regular failure and is not evaluated as an expected failure.
Example:¶
In Test Suite TestSuite_1.ipf the Test Cases static Duplicate() and static Unique_1() are defined. In Test Suite TestSuite_2.ipf the Test Cases static Duplicate(), static Unique_2() are defined.
RunTest("TestSuite_1.ipf;TestSuite_2.ipf", testCase="Unique_1;Unique_2;Duplicate")
The command will run the two test suites TestSuite_1.ipf and TestSuite_2.ipf separately. Within each test suite two test cases are executed: the Unique* test case and the Duplicate test case. The Duplicate test cases do not interfere with each other since they are static to their corresponding procedure files. Since the Duplicate test case is found in both test suites, it is executed in both.
Note
The Test Run will not execute if one of the specified test cases cannot be found in the given list of test suites. This also applies if no test case could be found using a regular expression pattern.
Assertion Types¶
An assertion checks that a given condition is true or in more general terms
that an entity fulfills specific properties. Test assertions are defined for
strings, variables and waves and have ALL_CAPS
names. The assertion
group is specified with a prefix to the assertion name using one of WARN,
CHECK or REQUIRE. Assertions usually come in these triplets which differ
only in how they react on a failed assertion. The following table clarifies the
difference between the three assertion prefix groups:
Type    | Create Log Message | Increment Error Count | Abort execution immediately
--------|--------------------|-----------------------|----------------------------
WARN    | YES                | NO                    | NO
CHECK   | YES                | YES                   | NO
REQUIRE | YES                | YES                   | YES
The simplest assertion is CHECK(), which tests if its argument is true. If you do not want to increase the error count, use the corresponding WARN() variant; if you want to abort the execution of the current test case when the supplied argument is false, use the REQUIRE() variant.
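As a sketch of how the three variants behave on the same condition (the computed value is a placeholder):

```igor
Function TestPrefixGroups()
	variable result = 1 // placeholder for a real computation

	WARN(result == 1)    // a failure here would only be logged
	CHECK(result == 1)   // a failure here would be logged and counted
	REQUIRE(result == 1) // a failure here would log, count and abort the test case
End
```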
Similar to these simple assertions there are many different checks for typical
use cases. Comparing two variables, for example, can be done with
WARN_EQUAL_VAR(), CHECK_EQUAL_VAR() or REQUIRE_EQUAL_VAR(). Take a look at
Example10 for a test case with various assertions.
Note
See Assertions for a complete list of all available checks. If in doubt use the CHECK variant.
Assertions with only one variant are PASS()
and FAIL()
.
If you want to know more about how to use these two special assertions, take a
look at Example7.
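A sketch of one typical use of these two assertions, marking which branch of a try/catch is reached; the provoked abort stands in for real code under test:

```igor
Function TestExpectedAbort()
	try
		Abort "provoked abort" // stands in for code that is expected to abort
		FAIL() // not reached: the abort did not occur, so the test fails
	catch
		PASS() // the expected abort was caught
	endtry
End
```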