T-VEC Simulink Tester vs. Reactis

T-VEC Simulink Tester vs. Reactis

Postby Killian » Fri Jul 31, 2009 10:16 am

Hi,

I don't know Reactis well; I'm currently reading up on the tool...
Anyway, to satisfy my curiosity about T-VEC vs. Reactis, I'd like to gather some points of view, impressions, or experiences with Reactis and put together a quick comparative analysis of these two tools:
- common features, additional features...
- key differentiators...
- strongest T-VEC arguments...

Thanks

Killian ;)
Killian
 
Posts: 1
Joined: Fri Jul 31, 2009 9:58 am

Re: T-VEC Simulink Tester vs. Reactis

Postby busser » Fri Jul 31, 2009 11:05 am

The basic approach of Reactis is "trial and error" - it tries input values to see what happens in the model for those values. Reactis is designed around a model execution mechanism and an input value suggestion mechanism. It tries different input value sets, executes the model, and provides the user with feedback about which paths through the model are "executed" by the chosen inputs. It offers a number of different ways to create input value sets to try. However, there is no sure way to identify the inputs needed to satisfy a particular switch guard, relational operator block, if-then-else condition, etc. For example, suppose a condition is based on an arithmetic expression, perhaps like this:

if (floor(x + y*z - 5) == 999) {
    output = 1;
}
else {
    output = 0;
}

Reactis may not "guess" the right values of x, y, and z to satisfy this constraint, so the execution of the value sets it proposes may never cover one of the branches of the if-condition. T-VEC VGS works very differently: it solves the constraint equation (floor(x + y*z - 5) == 999) to arrive at values for these variables; it does not guess at them.
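As a rough illustration of why guessing is unlikely to work here (this is not Reactis's actual input-selection algorithm, just plain uniform random sampling over the declared domain [-1.0e4..1.0e4]), note that the satisfying inputs are exactly those with x + y*z in [1004, 1005), a thin band in a huge input space:

/* Rough illustration only (not Reactis's algorithm): count how often
   uniformly random inputs over [-1.0e4 .. 1.0e4] happen to satisfy
   floor(x + y*z - 5) == 999, i.e. land in the band 1004 <= x + y*z < 1005. */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

static double uniform(double lo, double hi) {
    return lo + (hi - lo) * ((double) rand() / (double) RAND_MAX);
}

int main(void) {
    const long trials = 1000000L;    /* 1e6 randomly chosen input sets */
    long hits = 0;
    srand(1);
    for (long i = 0; i < trials; i++) {
        double x = uniform(-1.0e4, 1.0e4);
        double y = uniform(-1.0e4, 1.0e4);
        double z = uniform(-1.0e4, 1.0e4);
        if (floor(x + y*z - 5) == 999)   /* the branch condition above */
            hits++;
    }
    /* hits is almost always 0: the satisfying band is a vanishingly
       small fraction of the three-dimensional input space. */
    printf("hits: %ld of %ld trials\n", hits, trials);
    return 0;
}

Coverage-guided heuristics do better than pure random sampling, but without actually solving the constraint there is still no guarantee that the true branch is ever reached.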

A minimal set of test vectors produced by T-VEC VGS for this example, with x, y, and z having the declared domain [-1.0e4 .. 1.0e4], is the following:

[Attachment: forum_example_vectors.jpg - table of the generated test vectors]


Reactis has some success at low-level unit testing, where not too many conditions are being combined, but it does not scale well and has considerable difficulty with integration tests for hierarchies of subsystems.

key differentiators...


The T-VEC VGS approach of

* identifying all of the logic paths through the model
* isolating the constraints that govern each path
* solving the constraint equations of each path in terms of equivalence class domains
* selecting boundary point values from the constraint solution set at the constraint boundaries that define the equivalence classes
* using the selected values, which were chosen because they solve their associated constraint set, as needed in the computation of the expected output values

is superior to the Reactis approach of guessing at input values, executing the model with those guessed-at values to see what happens, and recording them if the user believes they are good.
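Applied to the floor example above, the contrast looks like this (a hand-worked sketch, not T-VEC VGS itself and not the vectors in the attached image): floor(x + y*z - 5) == 999 is equivalent to 1004 <= x + y*z < 1005, so once y and z are fixed, boundary values of x can be read straight off the solution band rather than guessed.

/* Hand-worked sketch (not T-VEC VGS itself): derive inputs for both branches
   of the example by solving 1004 <= x + y*z < 1005 instead of guessing. */
#include <math.h>
#include <stdio.h>

static void report(double x, double y, double z) {
    int output = (floor(x + y*z - 5) == 999) ? 1 : 0;   /* expected output */
    printf("x=%g y=%g z=%g -> expected output=%d\n", x, y, z, output);
}

int main(void) {
    /* True branch: pick (y, z) in the domain, solve x = 1004 - y*z
       (the lower boundary of the band); each x stays within [-1.0e4 .. 1.0e4]. */
    report(1004.0 - 0.0 * 0.0,      0.0,   0.0);   /* x = 1004 */
    report(1004.0 - 1.0 * 10.0,     1.0,  10.0);   /* x = 994  */
    report(1004.0 - 2.0 * (-50.0),  2.0, -50.0);   /* x = 1104 */

    /* False branch: the upper boundary x + y*z = 1005 lies just outside
       the band, in the adjacent equivalence class. */
    report(1005.0, 0.0, 0.0);
    return 0;
}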

common features, additional features...


Reactis does produce test vectors when it manages to guess the necessary input values. We believe Reactis also has some test driver generation capabilities, but we do not believe its test driver generation technology is as powerful or flexible as that in T-VEC VGS.

Reactis is more graphical: it animates signal paths through the Simulink model, highlighting the paths that its execution engine is covering. This looks impressive, but in our view it is "pretty pictures" rather than real test generation capability. Our tools do provide VGS-to-Simulink/Stateflow navigation and highlighting of the key subsystems and operational blocks involved in specific test vectors, and we plan to continue adding useful navigation and signal-path-highlighting features between T-VEC VGS and Simulink/Stateflow in future versions.

strongest T-VEC arguments...


The T-VEC VGS approach and core technology are both conceptually superior and more mature. The first working version was available in 1989 and was first used for DO-178B verification credit in 1992, while the founders of T-VEC were at Allied Signal Aerospace. It was designed from the start to be an industrial-class tool based on a well-founded, formal model of requirements. T-VEC VGS was created by avionics software engineers at a major avionics supplier, rather than at a university as part of a research grant - which I believe is where Reactis originated. My colleague and I developed T-VEC while working for Bendix/King - which became Allied Signal Aerospace - as cockpit display designers (CRT-based EFIS systems), in order to do a better job of verifying software than the traditional DO-178A structure-based testing approach. In fact, it was T-VEC's requirements-based testing/verification approach, and its formal concept of a requirements model, that led to the changes in DO-178B calling for requirements-based testing, at least for Level A. We served on the DO-178B verification subcommittee and were instrumental in the DO-178B improvements over DO-178A in this area.

Another sales argument is that Reactis is simply a design-model-based test generator, for Simulink/Stateflow only. Reactis has no requirements modeling, requirements-based model analysis, or test vector generation technology comparable to our T-VEC Tabular Modeler (TTM), which is based on the SCR (Software Cost Reduction) table-based requirements modeling language, with many significant improvements and enhancements.

I hope these are useful points and help answer your questions. Please email me any time with any other questions you may have.
busser
Site Admin
 
Posts: 52
Joined: Thu Mar 13, 2008 7:42 pm

