Revision as of 14:27, 19 May 2007
Simulink Tester for T-VEC
The Simulink Tester for T-VEC integrates Simulink/Stateflow with the T-VEC Vector Generation System (VGS) to automate much of the testing process: it analyzes the Simulink model to determine the test cases best suited to validating the model and to testing implementations of it. When used with the Real-Time Workshop™ Generic and Embedded Coders, the T-VEC Tester generates test drivers (harnesses) for executing the test vectors against the auto-generated source code.
Comprehensive Test Suites
Test generation for Simulink models produces unit, integration and system level test vectors and test drivers necessary to fully verify implementations of models. The test selection process produces the set of test vectors most effective in revealing both decision and computational errors in logical, integer and floating-point domains.
VGS generates test vectors for every path through every atomic subsystem. Each test vector is determined from the constraints in the subsystem under test and the constraints of any lower-level subsystems it references. These tests produce:
- Structured Path Coverage
- Decision (Branch) Coverage
- Modified Condition/Decision Coverage (MC/DC)
- Statement Coverage
- Interface Coverage
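Modified Condition/Decision Coverage, the least obvious criterion in this list, can be illustrated with a small sketch. The decision `a and (b or c)` and the vector values below are invented for illustration, not taken from the tool: each condition is toggled while the others are held fixed, and the toggle alone flips the decision outcome.

```python
# Hypothetical decision from a model block: out = a and (b or c).
def decision(a, b, c):
    return a and (b or c)

# A minimal MC/DC set: four vectors demonstrate that each of the three
# conditions independently affects the decision outcome.
mcdc_vectors = [
    (True,  True,  False),   # baseline: decision is True
    (False, True,  False),   # toggling a alone flips the outcome to False
    (True,  False, False),   # toggling b alone flips the outcome to False
    (True,  False, True),    # toggling c alone flips the outcome to True
]

for a, b, c in mcdc_vectors:
    print(a, b, c, "->", decision(a, b, c))
```

Note that MC/DC needs only n+1 vectors for n conditions, whereas exhaustive truth-table coverage would need 2^n.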
See Simulink/T-VEC Examples for details.
There are potential limitations, usage issues, and guidelines associated with the integration of Simulink/Stateflow with T-VEC.
The MathWorks' Simulink and Stateflow allow users to develop behavioral specifications used as the basis for simulation, code generation, and more. When used for code generation, the model represents "what's in the box."
Simulink was originally used for control system modeling, but with the addition of Stateflow and other features, it now provides hybrid modeling support for integrating control system and state machine models.
The analysis performed prior to test vector generation identifies model errors, such as contradictions or feature interaction problems. These model errors can result in dead code or other undesirable effects.
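The kind of model error meant here can be made concrete with a tiny sketch (the `saturate` function is invented for illustration): a contradictory guard produces a branch that no input can reach, i.e. dead code, which the pre-generation analysis flags.

```python
# Hypothetical model logic containing a contradiction.
def saturate(x):
    if x > 10:
        return 10
    if x > 20:        # contradiction: control reaches here only when
        return 20     # x <= 10, so this branch is dead code -- no test
                      # vector can ever exercise it
    return x
```

Because no test vector can satisfy the second guard, a coverage-driven test generator would report the branch as infeasible rather than silently leaving it untested.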
Complete Verification Artifacts
The T-VEC Tester produces a complete set of artifacts for verifying Simulink models:
- Model Analysis Report identifies model errors
- Test Vectors
- Input values
- Expected output values
- Traceability from each test to the Simulink model
- Test Coverage Report
- Test Harnesses for GRT and ERT code generators
- Test results report that details test successes and failures
- Makefiles to fully automate the process
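A generated test harness essentially loops over the vector table, drives the code under test, and compares actual to expected outputs. The sketch below shows that loop in miniature; the vector values and the `model_step` stand-in are hypothetical, not the tool's actual generated harness.

```python
# Hypothetical test-vector table of the kind VGS emits:
# (input values, expected output value).
vectors = [
    ({"u": 0.0}, 0.0),
    ({"u": 5.0}, 5.0),
    ({"u": 50.0}, 10.0),
]

def model_step(u):
    # Stand-in for the auto-generated step function (a saturating gain).
    return min(u, 10.0)

failures = 0
for inputs, expected in vectors:
    actual = model_step(**inputs)
    status = "PASS" if actual == expected else "FAIL"
    if status == "FAIL":
        failures += 1
    print(f"{inputs} -> {actual} (expected {expected}): {status}")

print(f"{len(vectors) - failures} of {len(vectors)} vectors passed")
```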
Scalable, Hierarchical Test Generation
Comprehensive testing is only feasible with a bottom-up approach that fundamentally requires both unit- and integration-level testing support. When automatically generating test vectors for every condition and decision in a system, only the lowest-level subsystems can be tested in isolation. All higher-level subsystems must be tested in the context of the lower-level subsystems on which they depend; otherwise, there may be paths or threads in the higher-level subsystems that are not supported by the lower levels. This is especially true when multiple subsystems are related and changes could result in feature interaction problems. Some test tools generate tests on the premise that unit testing a subsystem can ignore or stub out all subsystems on which it depends. This is a fundamentally incorrect and dangerous assumption.
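The danger of stubbing out lower-level subsystems can be seen in a small sketch (the `lower` and `higher` functions are invented for illustration): a branch in the higher-level subsystem looks reachable when the lower level is stubbed, but is infeasible in the integrated system.

```python
# Lower-level subsystem: its output is constrained to be non-negative.
def lower(x):
    return abs(x)

# Higher-level subsystem that branches on the lower-level result.
def higher(x):
    y = lower(x)
    if y < 0:
        return -1.0   # reachable only if lower() is stubbed out with an
                      # unconstrained value; dead in the integrated system
    return y
```

A stub that returns an arbitrary value makes the `y < 0` branch appear testable, yet no input to the integrated system can reach it; tests generated against the stub therefore exercise behavior the real system cannot exhibit.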
Test Sequence Vectors
Test Driver Generation
The Simulink Tester provides test driver schemas for the ERT and GRT code generators supported by the MathWorks RTW, as well as for the MATLAB simulator. However, target execution environments differ, so the tools provide a general-purpose test driver template/schema language for describing a generic test driver that can then be instantiated with test vector information by running the test driver generator. An organization can create any number of test driver scripts/programs targeted to specific execution environments. Schemas can be written from scratch or tailored from the schemas provided with the tool installation.
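The schema mechanism can be pictured as template instantiation: a generic driver text with placeholders that the generator fills in from each test vector. The minimal sketch below uses invented placeholder names and C-like driver calls, not the tool's actual schema language.

```python
# Hypothetical driver schema: a generic test step with placeholders that
# the generator fills in from each test vector.
template = """\
/* test {test_id} */
set_input(u, {input_value});
step();
check_output(y, {expected_value});
"""

# Hypothetical test vectors: (id, input value, expected output value).
vectors = [(1, 0.0, 0.0), (2, 5.0, 5.0)]

# Instantiate the schema once per vector and concatenate into a driver.
driver = "".join(
    template.format(test_id=i, input_value=u, expected_value=y)
    for i, u, y in vectors
)
print(driver)
```

Because the vector data and the driver text are kept separate, the same vectors can be re-emitted for a different target by swapping in a different schema.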
In the case of LDRA's TBRun environment, the installation provides a few default test driver schemas that are designed to produce .tcf files. These are located in the install area, normally in the directory
Schemas are also provided with the TTM training examples, which are normally located in the directory
Specific differences in build environments require customization of one of these general-purpose schemas. To illustrate, one of the example schemas (testbed_tcf_ghs.sch) was created specifically for a Green Hills compiler environment. The only other task is configuring the object mapping files, which describe how model names relate to target code names. Some of the object mapping information is set up automatically by the translator, but the rest must be identified and added by hand. In a Simulink environment, only some of the information needed to create the object mappings for the ERT or GRT code generators is available to the translator. Simulink/Stateflow does not provide an API for querying this kind of information, although some of it can be extracted by executing functions against the RTW. If a different coder is used, or a customization of the default GRT or ERT coder, the mapping information must be completed by the user.
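An object mapping can be pictured as a lookup table from model names to generated-code identifiers. The entries below are invented examples (the `rtU`/`rtY`/`rtP` struct names follow the usual ERT naming style, but everything here is an assumption, not output of the tool); entries the translator cannot derive are the ones a user must add by hand.

```python
# Hypothetical object-mapping table relating Simulink model object names
# to identifiers in the auto-generated code.
object_map = {
    "Controller/Inport/u":  "rtU.u",   # external input (ERT-style struct)
    "Controller/Outport/y": "rtY.y",   # external output (ERT-style struct)
    "Controller/Gain/K":    "rtP.K",   # tuned parameter, added manually
}

def code_name(model_name):
    """Look up the target-code identifier for a model object."""
    return object_map[model_name]

print(code_name("Controller/Inport/u"))
```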
There is information in the manuals and user's guide about the test driver generator mechanism, but it is also covered in training classes or workshops.