This section provides links to a number of examples to illustrate modeling approaches, test driver approaches, and modular organization of models. The following links focus on modeling examples:
  
===Example Links===
*[[T-VEC Tablular Modeler Examples|T-VEC Tabular Modeler (TTM) Examples]]
*[[Discrete_Filter]]
*[[Simulink/T-VEC Examples|Simulink Tester for T-VEC Examples]]
  
Test driver generation examples are covered separately:

*[[Test Driver Generation Examples|Test Driver Generation Examples]]

==SCR Methods Supported by TTM==

This section discusses a few details of the SCR modeling concepts and rules as supported by the TTM tool.

===SCR Basics===
Simplistically stated, the SCR method is based on the use of decision tables and state machines to describe the required behavior of a component. The structure for organizing and relating SCR model elements is based on tables. Tables are used to define the data types and variables of the problem. Variables can be defined in terms of primitive types (e.g., Integer, Float, Boolean, Enumeration) or user-defined types. Behavior is modeled using combinations of tables that define functional aspects of the problem: Mode Tables (a form of state machine), Condition Tables, and Event Tables.
 
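To make the table concepts concrete, the following is a minimal sketch in Python (not TTM or SCR notation) of how a condition table behaves: each row pairs a condition on the input variables with the value assigned to an output variable. The variable and value names are invented for illustration.

<pre>
# Hypothetical illustration of a condition table: each row pairs a condition on
# the input variables with the value assigned to the output variable.
condition_table = [
    (lambda speed, limit: speed <= limit,               "NO_ALERT"),
    (lambda speed, limit: limit < speed <= limit + 10,  "WARNING"),
    (lambda speed, limit: speed > limit + 10,           "ALARM"),
]

def evaluate(speed, limit):
    """Return the output value of the first row whose condition holds."""
    for condition, output in condition_table:
        if condition(speed, limit):
            return output
    # SCR-style tables are expected to cover the input domain (completeness).
    raise ValueError("no condition covered the inputs")

print(evaluate(55, 50))  # -> WARNING
</pre>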
The development of a model relates what system components have to do and how they have to do it. Then, through the generated tests, the model provides a measure of how well a target implementation satisfies the modeled requirements. The elements of “how” a system has to perform its function are defined in terms of a set of interfaces specified with model variables and their associated data types. From a high-level perspective, models specify behavior relating input variables to output variables. Models can also represent behavior in terms of historical variables that are referred to as modes or terms.
 
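As a rough illustration of a mode (a historical variable), the following Python sketch keeps a state value whose next value depends on its current value and an event on the monitored inputs. The mode and event names are invented and do not come from any T-VEC example.

<pre>
# Hypothetical sketch of a mode (historical variable): the next mode depends on
# the current mode and an event on the monitored inputs.
MODE_TRANSITIONS = {
    ("OFF",     "power_on"):  "STANDBY",
    ("STANDBY", "engage"):    "ACTIVE",
    ("ACTIVE",  "disengage"): "STANDBY",
    ("ACTIVE",  "power_off"): "OFF",
    ("STANDBY", "power_off"): "OFF",
}

def next_mode(current_mode, event):
    # If no transition is defined, the mode is unchanged.
    return MODE_TRANSITIONS.get((current_mode, event), current_mode)

mode = "OFF"
for event in ["power_on", "engage", "power_off"]:
    mode = next_mode(mode, event)
print(mode)  # -> OFF
</pre>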
==TTM Extension Overview==
===Requirements Management===
TTM manages requirements through a hierarchical decomposition (i.e., an outline format) where each requirement is composed of the following:

* Tag. A unique identifier for the requirement composed of letters, numbers, underscores, and periods.
* Description. A single line of text further describing the requirement.
* Comment. Any additional text.

The hierarchy of requirements is managed through the model view, and requirements are decomposed by creating child requirements that display below their parents within the model view.
 
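As an informal sketch of this structure (not TTM's actual data model), a requirement can be viewed as a node carrying a tag, description, and comment, with child requirements nested beneath it; the class and field names below are illustrative only.

<pre>
# Informal sketch of the requirement hierarchy described above; the class and
# field names are illustrative and do not reflect TTM's internal data model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Requirement:
    tag: str                # unique identifier: letters, numbers, underscores, periods
    description: str = ""   # single line of text describing the requirement
    comment: str = ""       # any additional text
    children: List["Requirement"] = field(default_factory=list)

# A parent requirement decomposed into two child requirements.
root = Requirement(
    tag="REQ.1",
    description="The system shall filter incoming messages.",
    children=[
        Requirement(tag="REQ.1.1", description="Reject malformed messages."),
        Requirement(tag="REQ.1.2", description="Log rejected messages."),
    ],
)
</pre>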
===Requirement-to-Test Traceability===
This section provides an example to explain the process for linking DOORS requirements to the TTM requirements model. The tool support for requirement-to-test traceability involves linking various sources of requirements through the model. The model transformation, test vector generation, and test driver generation provide the tool support to link the requirements to the test vectors, test drivers, and test reports. The process has three basic steps:

* A DOORS module is imported into TTM. There are options to add or delete a DOORS module in TTM, or to synchronize DOORS modules when they are updated. There is a one-to-one correspondence between a DOORS ID and a TTM requirement ID.
* Imported requirements maintain the outline structure that they have within the DOORS environment. One or more DOORS requirements can be linked to an element of a TTM model (e.g., a condition/assignment), or linked to a higher level in the TTM model, such as a condition, event, or mode table.
* The model translation maintains the link to the requirement ID, and during test generation, the requirement link is an attribute of the test vector. During test driver generation, requirement IDs can be output to the test driver to provide detailed traceability to the executable test cases.
 
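A minimal sketch of the resulting traceability chain is shown below; the IDs, names, and record layout are invented for illustration and are not the TTM or DOORS data formats.

<pre>
# Hypothetical sketch of requirement-to-test traceability; IDs and layout are invented.

# Imported DOORS requirements (read-only in TTM), keyed by DOORS ID.
doors_requirements = {
    "DOORS_101": "Filter shall discard messages older than 5 seconds.",
    "DOORS_102": "Match shall compare message IDs against the active list.",
}

# Model elements (e.g., a condition/assignment) linked to one or more requirement IDs.
model_links = {
    "Filter.discard_stale": ["DOORS_101"],
    "Match.compare_ids":    ["DOORS_102"],
}

# Generated test vectors inherit the requirement links as attributes, so test
# drivers and test reports can trace back to the requirement IDs.
test_vectors = [
    {"id": "TV_001", "element": "Filter.discard_stale"},
    {"id": "TV_002", "element": "Match.compare_ids"},
]

for tv in test_vectors:
    print(tv["id"], "traces to", ", ".join(model_links[tv["element"]]))
</pre>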
TTM provides requirement management functionality that is similar to that of a DOORS module. Imported DOORS modules are linked into TTM as read-only modules. Changes to the requirements must be made within DOORS and then synchronized within TTM. Additional requirements can be created directly in TTM if they are not contained within DOORS or if the source requirements are not in a requirement management system such as DOORS. The process to link a requirement to the model is the same.
 
===Model Includes===
TTM models support the inclusion of existing models of other requirements, interfaces, or functional behavior. This feature helps consolidate behavior common to multiple models into a single model that can then be included in other models where needed. It also supports partitioning a model so that multiple engineers can work on it in parallel, as described below.
 
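By rough analogy only (TTM model includes are a modeling feature, not Python code), the sketch below shows shared type and variable definitions consolidated in one place and merged into a model that includes them; the structure and names are invented.

<pre>
# Rough analogy of model includes; the dictionary structure and names are
# invented and are not TTM's model format.
shared_interfaces = {
    "types":     {"Speed": "Float", "AlertLevel": ["NO_ALERT", "WARNING", "ALARM"]},
    "variables": {"speed": "Speed", "speed_limit": "Speed"},
}

def include(base_model, included_model):
    """Return base_model extended with the types/variables of included_model."""
    keys = set(base_model) | set(included_model)
    return {key: {**included_model.get(key, {}), **base_model.get(key, {})}
            for key in keys}

filter_model = include({"variables": {"alert": "AlertLevel"}}, shared_interfaces)
print(sorted(filter_model["variables"]))  # -> ['alert', 'speed', 'speed_limit']
</pre>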
==Modular Organization==
Most organizations start their model-based development efforts with a small thread of functionality and then transition to efforts performed by a team of developers. Some have evolved to support product lines. This requires coordination with the design team, the systems engineering team that writes the product technical specification, the test team, and the quality assurance organization.
 
This example uses a function called Check that might have many different types and combinations of filtering and matching. It discusses the organizational and process impacts of developing a feature for the Check component that impacts Filter, Match, and Select.
 
[[Image:Component_example.jpg|center|Conceptual Components Example]]
 
This example takes a chronological perspective because the integration of the entire model-based method impacted many different organizations within a company. Before the adoption of model-based development, the components of the Check function were not partitioned with well-defined interfaces; rather, the functionality was coupled, which made testing the functionality in each subcomponent (i.e., Filter, Match, and Select) more difficult. However, there is a verification requirement to demonstrate that every thread through a component or subcomponent is completely tested. Tight coupling makes this requirement difficult to achieve and demonstrate.
 
The team used interface-driven requirement modeling, which starts early in the requirement and design phase. This early modeling helps create a more testable design and improves the requirement and interface specifications.
 
As reflected in the previous figure, the functionality in the existing system was tightly coupled for numerous historical reasons, such as memory space limitations. The interfaces between Filter, Match, and Select were not well-defined. This complicated the testing process, requiring many tests to be initiated from higher levels in the system, such as Check, because some of the inputs could only be set upstream from the Check component. In addition, the outputs from functions such as Match were not visible. This made systematic and comprehensive testing of these lower-level components difficult. Normally, ensuring coverage of the threads through the implementation of these lower-level components means increased testing from the high-level components. Sometimes, the number of tests can increase by an order of magnitude.
 
This effort started early enough that the designers were able to expose the interfaces for both the inputs and outputs, including internal state information, which increased testability significantly. Approximately 80% of the functionality was tested with the improved interface support provided by the design team. This approach significantly reduced the complexity of the model and the tests, and provided greater test coverage with fewer tests, reducing time and cost. The remaining 20% represented elements of the components that could not be changed due to performance issues and the impacts on cost, resources, and schedule associated with retesting.

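The testability point can be sketched with an invented example (the component behavior and names below are hypothetical, not from the source system): when a subcomponent's inputs and outputs are exposed through its own interface, it can be exercised directly instead of only through the enclosing Check component.

<pre>
# Hypothetical sketch: a Match subcomponent with an explicit interface (inputs
# passed in, output returned) can be tested directly, without driving it
# through the enclosing Check component.
def match(message_id, active_ids):
    """Exposed subcomponent interface: return True when the ID is on the active list."""
    return message_id in active_ids

def check(message_id, active_ids, stale):
    """Enclosing component: only the combined result is visible from outside."""
    if stale:
        return "REJECT"
    return "ACCEPT" if match(message_id, active_ids) else "REJECT"

# Direct tests of the exposed subcomponent interface; no upstream setup through Check is needed.
assert match("A1", {"A1", "B2"}) is True
assert match("Z9", {"A1", "B2"}) is False
</pre>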