
1. Scicos Tests

2. Non-regression tests

2.1. Purpose

Perform tests on full diagrams (rather than on the elementary blocks covered by unitary tests) and check that the current behavior is consistent with reference results obtained with a prior version of the software.

A non-regression test is considered "passed" if the results are 100% identical, or if any differences can be fully explained by deliberate changes in the code.

2.2. Location

SCI/modules/scicos/tests/nonreg_tests/

2.3. Naming conventions

2.4. File structure

2.5. Needed files

A comprehensive set of files for one test must include the following items:

  * the diagram itself: <DIAGRAM_NAME>.cos
  * the reference files: <DIAGRAM_NAME>.<OS>.out.ref, <DIAGRAM_NAME>.<OS>.log.ref and <DIAGRAM_NAME>.<OS>.err.ref

The following files are automatically generated before or during non-regression tests:

  * <DIAGRAM_NAME>.<OS>.out
  * <DIAGRAM_NAME>.<OS>.log
  * <DIAGRAM_NAME>.<OS>.err

2.5.1. Diagram

<DIAGRAM_NAME>.cos

The diagram itself contains all the parameters needed to run the simulation (final time, solver settings, etc.), as well as a context if needed.

Remark: As opposed to unitary tests, the context can be defined directly INSIDE the diagram. It is externalized in unitary tests to allow automatic processing of similar (yet different) contexts applied to the same base diagram. Here there is no need for different contexts, because each non-regression test is supposed to be unique.

2.5.2. Simulation output

<DIAGRAM_NAME>.<OS>.out
<DIAGRAM_NAME>.<OS>.out.ref

Each non-regression diagram contains a Write to output file block (found in the Sinks palette). It generates text files that can easily be compared to check the consistency of the results between two different Scilab/Scicos versions.

2.5.3. Console output

<DIAGRAM_NAME>.<OS>.log
<DIAGRAM_NAME>.<OS>.log.ref

These files log the console output throughout the simulation process. The console can be used to display debug data, which facilitates later comparisons and debugging sessions.
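
For instance (a hypothetical snippet, not something the harness requires), a statement placed in the diagram's context prints a value that will end up in the .log file:

   mprintf("gain K = %f\n", K);  // K is a hypothetical context variable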

2.5.4. Error log

<DIAGRAM_NAME>.<OS>.err
<DIAGRAM_NAME>.<OS>.err.ref

These files result from the redirection of the standard error output to a file.
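
Such a redirection typically happens at the shell level when the harness launches Scilab in No Window mode; a sketch (run_test.sce is a hypothetical driver script, not the harness's actual file):

   scilab -nw -f run_test.sce 2> <DIAGRAM_NAME>.<OS>.err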

2.6. How to launch the non-regression tests?

Use the following instructions to launch every test in the Scicos non-regression suite:

-->cd SCI/modules/scicos/tests/nonreg_tests;

-->exec('scicos_nonreg.sci');

-->scicos_nonreg();

   001/029 - [scicos] constant.....................................passed  : Output and reference are equal 
   002/029 - [scicos] delay_anal...................................failed  : Output and reference are NOT equal 
   003/029 - [scicos] disease......................................failed  : Output and reference are NOT equal 
[...]
   028/029 - [scicos] threshold....................................failed  : Output and reference are NOT equal 
   029/029 - [scicos] transferfcn..................................failed  : Output and reference are NOT equal 

   --------------------------------------------------------------------------
   Summary

   tests                       29 - 100.0 % 
   passed                       2 -   6.9 % 
   failed                      27 -  93.1 % 
   skipped                      0 -   0.0 % 
   --------------------------------------------------------------------------
   Details


   TEST : [scicos] delay_anal
     failed  : Output and reference are NOT equal
     Compare the following files for more details:
     - /home/vaylet/dev/scilab-5.0/modules/scicos/tests/nonreg_tests/delay_anal.unix.out
     - /home/vaylet/dev/scilab-5.0/modules/scicos/tests/nonreg_tests/delay_anal.unix.out.ref

   TEST : [scicos] disease
     failed  : Output and reference are NOT equal
     Compare the following files for more details:
     - /home/vaylet/dev/scilab-5.0/modules/scicos/tests/nonreg_tests/disease.unix.out
     - /home/vaylet/dev/scilab-5.0/modules/scicos/tests/nonreg_tests/disease.unix.out.ref

[...]

   TEST : [scicos] threshold
     failed  : Output and reference are NOT equal
     Compare the following files for more details:
     - /home/vaylet/dev/scilab-5.0/modules/scicos/tests/nonreg_tests/threshold.unix.out
     - /home/vaylet/dev/scilab-5.0/modules/scicos/tests/nonreg_tests/threshold.unix.out.ref

   TEST : [scicos] transferfcn
     failed  : Output and reference are NOT equal
     Compare the following files for more details:
     - /home/vaylet/dev/scilab-5.0/modules/scicos/tests/nonreg_tests/transferfcn.unix.out
     - /home/vaylet/dev/scilab-5.0/modules/scicos/tests/nonreg_tests/transferfcn.unix.out.ref


   --------------------------------------------------------------------------

-->

You can also choose to run only a subset of all available tests using instructions like:

--> scicos_nonreg('constant'); // run only the 'constant' test

or

--> scicos_nonreg(['constant', 'disease', 'threshold']); // run only the 'constant', 'disease' and 'threshold' tests

2.7. How to use generated files to debug errors or inconsistencies?

If one or several tests fail to pass, the files generated during the simulation can be used to obtain useful debug information. The following pairs of files can easily be compared in a friendly visual environment like meld or kdiff3 under Linux, or WinMerge under Windows:

  * <DIAGRAM_NAME>.<OS>.out vs <DIAGRAM_NAME>.<OS>.out.ref
  * <DIAGRAM_NAME>.<OS>.log vs <DIAGRAM_NAME>.<OS>.log.ref
  * <DIAGRAM_NAME>.<OS>.err vs <DIAGRAM_NAME>.<OS>.err.ref
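
For example, the first failure in the run shown above could be inspected with (assuming meld is installed):

   meld delay_anal.unix.out delay_anal.unix.out.ref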

2.8. How to add a new test to the existing suite?

Adding a new test to the suite is relatively easy:

  1. Build your diagram as you would build a general-purpose one. You can even include scopes for debugging purposes. Simulations are launched in No Window mode (-nw), so the scopes will stay hidden during the non-regression process.

  2. Then add a Write to output file block (from the Sinks palette) to the diagram, feeding it with all the outputs you want to log for comparison between the two versions under test. Use a Mux block (from the Branching palette) to concatenate signals when needed, and don't forget to set the right Input size in the Write to output file parameters.

  3. Just in case, launch the simulation manually using the "Simulate/Run" menu item to check that everything runs correctly (then do not forget to delete the file generated by the Write to output file block, to keep a clean environment).
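
Once the new diagram is saved in SCI/modules/scicos/tests/nonreg_tests/, it can be run in isolation like any other test (here 'my_new_test' is a placeholder for your diagram name):

--> scicos_nonreg('my_new_test');

The first run cannot pass, since no reference exists yet; one possible workflow (not mandated by this page) is to run the test once with the trusted version and copy the generated <DIAGRAM_NAME>.<OS>.out to <DIAGRAM_NAME>.<OS>.out.ref.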

3. Unitary tests

3.1. Purpose

Perform tests on elementary blocks and check that the current behavior is consistent with reference results obtained on a prior version of the software.

A unitary test is considered "passed" if the results are 100% identical, or if any differences can be fully explained by deliberate changes in the code.

3.2. Location

SCI/modules/scicos_blocks/tests/unit_tests/

3.3. Naming conventions

<BLOCK_NAME> = name of the computing function of the tested block (as found in SCI/modules/scicos_blocks/src/)

Example: GAINBLK for the GAIN block (in the Linear palette), corresponding to GAINBLK.c in SCI/modules/scicos_blocks/src/c/

<DIAGRAM_NAME> = <BLOCK_NAME>_<IDX>

A 3-digit index allows up to 999 different diagrams for a single block.

Example:

GAINBLK_001
GAINBLK_023

<TEST_NAME> = <DIAGRAM_NAME>_<IDX>

A 3-digit index allows up to 999 different contexts to be defined for a single diagram.

Example:

GAINBLK_001_001
GAINBLK_023_009

3.4. File Structure

Each palette has a corresponding subfolder in SCI/modules/scicos_blocks/tests/unit_tests/.

3.5. Needed files

A comprehensive set of files for one test must include the following items:

  * the diagram itself: <TEST_NAME>.cos
  * the reference files: <TEST_NAME>.out.ref, <TEST_NAME>.log.ref and <TEST_NAME>.<OS>.err.ref

The following items are optional:

  * the simulation context: <TEST_NAME>.cxt
  * the pre- and post-simulation scripts: <TEST_NAME>_pre.sce and <TEST_NAME>_post.sce
  * the simulation input: <TEST_NAME>.in

The following files are automatically generated before or during unitary tests:

  * <TEST_NAME>.out
  * <TEST_NAME>.log
  * <TEST_NAME>.<OS>.err

3.5.1. Diagram

<TEST_NAME>.cos

The diagram itself contains all the parameters needed to run the simulation (final time, solver settings, etc.). However, a context can optionally be defined in an external file to supply custom block settings (see below).

3.5.2. Simulation context

<TEST_NAME>.cxt

The context should be defined in a text file containing name/value pairs that define variables. This file should be human-readable, so that the user can modify it at any time.
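
For illustration (the variable names are hypothetical, not taken from an actual test), such a file could contain one name/value pair per line:

   K = 3.5
   T = 0.01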

@TODO: use a better and easier way to store this kind of data (XML?)

@REMARK: Beware of matrices (text files may not be a convenient way to input and store them)

3.5.3. Pre- and post-simulation scripts

<TEST_NAME>_pre.sce

<TEST_NAME>_post.sce

If present, these scripts are launched (respectively) right before and right after the simulation.

For instance, they can be used to set workspace variables needed by "From Workspace" blocks, or to display the contents of variables generated by "To Workspace" blocks for visual comparison. However, no graphical function like plot should be used (except maybe for debugging purposes), because test results are compared as text only!
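
A minimal sketch of such scripts, assuming a hypothetical GAINBLK_001_001 test whose diagram reads a variable x from the workspace and writes a variable y back:

   // GAINBLK_001_001_pre.sce: prepare data for a "From Workspace" block
   t = (0:0.01:10)';
   x = [t, sin(t)];

   // GAINBLK_001_001_post.sce: display the result of a "To Workspace" block
   disp(y);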

@TODO: not yet implemented

3.5.4. Simulation input

<TEST_NAME>.in

Some simulations may need a custom input imported with a Read from File block. The filename should be directly derived from the diagram name, and thus set to <TEST_NAME>.in. This convention allows automatic processing by the various Scilab scripts dealing with unit tests.

Before launching any simulation, a function checks whether the input filename matches the right pattern. It loops through the diagram, looking for the single Read from File block, and fixes its settings automatically. This way, the test designer can safely forget to set the right filename.
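
A minimal sketch of what such a fix-up pass could look like, assuming the Scicos diagram structure of Scilab 5 (a scs_m.objs list, gui name RFILE_f for Read from File) and a test_name variable; the exact dialog field holding the filename is an assumption:

   // Sketch: point the single "Read from File" block at <TEST_NAME>.in
   for i = 1:lstsize(scs_m.objs)
     o = scs_m.objs(i);
     if typeof(o) == "Block" & o.gui == "RFILE_f" then
       o.graphics.exprs(1) = test_name + ".in";  // assumed: first dialog entry is the filename
       scs_m.objs(i) = o;
     end
   end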

3.5.5. Simulation output

<TEST_NAME>.out
<TEST_NAME>.out.ref

Each unitary test diagram contains a Write to output file block (found in the Sinks palette). It generates text files that can easily be compared to check the consistency of the results between two different Scilab/Scicos versions.

3.5.6. Console output

<TEST_NAME>.log
<TEST_NAME>.log.ref

These files log the console output throughout the simulation process. The console can be used to display debug data, which facilitates later comparisons and debugging sessions.

3.5.7. Error log

<TEST_NAME>.<OS>.err
<TEST_NAME>.<OS>.err.ref

These files result from the redirection of the standard error output to a file.

3.6. How to launch the unitary tests?

Use the following instructions to launch every test in the Scicos unitary suite:

-->cd SCI/modules/scicos_blocks/tests/unit_tests;

-->exec('scicos_unitary.sci');

-->scicos_unitary();

 [...]

-->

You can also choose to run only a subset of all available tests using instructions like:

--> scicos_unitary('Linear'); // run only the 'Linear' palette

3.7. How to use generated files to debug errors or inconsistencies?

If one or several tests fail to pass, the files generated during the simulation can be used to obtain useful debug information. The following pairs of files can easily be compared in a friendly visual environment like meld or kdiff3 under Linux, or WinMerge under Windows:

  * <TEST_NAME>.out vs <TEST_NAME>.out.ref
  * <TEST_NAME>.log vs <TEST_NAME>.log.ref
  * <TEST_NAME>.<OS>.err vs <TEST_NAME>.<OS>.err.ref

3.8. How to add a new test to the existing suite?

Adding a new test to the suite is relatively easy:

  1. Build your diagram as you would build a general-purpose one. You can even include scopes for debugging purposes. Simulations are launched in No Window mode (-nw), so the scopes will stay hidden during the unitary test process.

  2. Then add a Write to output file block (from the Sinks palette) to the diagram, feeding it with all the outputs you want to log for comparison between the two versions under test. Use a Mux block (from the Branching palette) to concatenate signals when needed, and don't forget to set the right Input size in the Write to output file parameters.

  3. Just in case, launch the simulation manually using the "Simulate/Run" menu item to check that everything runs correctly (then do not forget to delete the file generated by the Write to output file block, to keep a clean environment).
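
For example (the 002 index is hypothetical), a second test diagram for the GAIN block would be saved under the Linear subfolder as SCI/modules/scicos_blocks/tests/unit_tests/Linear/GAINBLK_002.cos, and could then be exercised with:

--> scicos_unitary('Linear'); // runs the whole 'Linear' palette, including the new diagram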

