

The collection of FAQs listed below covers the most commonly asked questions about our products and technology. Additional information is available through the Public Knowledge Base or from a T-VEC representative. T-VEC clients are welcome to visit the Customer Interface or contact Customer Support for more information.


What is the T-VEC Test Vector Generation System?
The T-VEC Test Vector Generation System is a fully integrated system for automating model analysis, test generation, test coverage analysis, and test execution. It includes the only known commercial tool capable of producing complete test vectors: test input values, expected test outputs, and requirements-to-test traceability. The T-VEC system performs model analysis on, and test generation from, requirements models, and incorporates extended support for domain testing theory for test case selection. The system also includes a test driver generator that uses a template approach to convert generic test vectors into a form suitable for model simulation or automated testing in a host or target environment.
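The template approach to driver generation can be sketched as follows. The vector fields, the `max` function under test, and the C template are hypothetical illustrations of the technique, not T-VEC's actual formats:

```python
# Sketch of template-based test driver generation: each generic test
# vector is expanded into a target-specific C test case via string
# substitution. (Hypothetical template and field names.)

DRIVER_TEMPLATE = """\
/* test {name}: derived from requirement {req} */
void test_{name}(void) {{
    int result = max({x}, {y});
    assert(result == {expected});
}}
"""

def generate_driver(vector):
    """Expand one generic test vector into target-specific test code."""
    return DRIVER_TEMPLATE.format(**vector)

# A vector loosely based on the max example used elsewhere in this FAQ:
vector = {"name": "max_positive_1", "req": "max_positive",
          "x": -100, "y": -100, "expected": -100}
print(generate_driver(vector))
```

Swapping the template (the "schema") retargets the same vectors to a different language or harness without touching the generation step.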
What is a test vector?
Test vectors include test input values, expected output values, and justification information that links each test to the underlying requirement from which it was derived. T-VEC's test vector generation automatically produces test vectors from a specification using a specialized set of test selection and convergence mechanisms. The test selection settings in effect when a test vector is produced are also recorded in the generic test vector format. Below is an example test vector in the generic format, meaning it is abstract and not specialized for testing in any specific environment; typically, test driver generation is used to create a form of the test vector better suited for test execution. Note that the inputs and outputs include name, type, size in bits, value, and range information.

    max_positive<<1>>, RP__1<<1>>
    OUTPUT
      max INTEGER 32 0 {-100 .. 100}
    INPUTS
      x INTEGER 32 -100 {-100 .. 100}
      y INTEGER 32 -100 {-100 .. 100}
    JUSTIFICATION {
      SOLUTION : 1
      STATE_SPACE_SCAN : OFF
      SWITCHES : LEAST_RECENT, LOW_BOUND, SINGLE, OPPOSITE
      DCP : 1 max_positive, max_positive_FR__1, cv_max_positive_RP__1,
            max_positive_RP__1, max_positive_RP__0, both_negative
    }
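The OUTPUT and INPUTS lines of a generic vector can be modeled with a small data structure; the class and field names below are illustrative, not the actual T-VEC format:

```python
# Minimal data model for the signals in a generic test vector
# (hypothetical field names; the real format differs in detail).
from dataclasses import dataclass

@dataclass
class Signal:
    name: str
    type: str       # e.g. "INTEGER"
    size_bits: int  # e.g. 32
    value: int      # the concrete test value
    low: int        # lower bound of the declared range
    high: int       # upper bound of the declared range

    def in_range(self) -> bool:
        """Check the test value against the signal's declared range."""
        return self.low <= self.value <= self.high

# The signals from the example vector above:
output = Signal("max", "INTEGER", 32, 0, -100, 100)
inputs = [Signal("x", "INTEGER", 32, -100, -100, 100),
          Signal("y", "INTEGER", 32, -100, -100, 100)]

assert output.in_range() and all(s.in_range() for s in inputs)
```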
What is Domain Testing Theory?
White and Cohen (1980) proposed domain testing theory as a strategy to select test points to reveal domain errors. Their theory is based on the premise that if there is no coincidental correctness, then test cases that localize the boundaries of domains with arbitrarily high precision are sufficient to test all the points in the domain. When there is a strong correlation between the specification constraints and implementation paths, the selected test data should uncover computation and domain errors. A computation error occurs when the correct path through the program is taken, but the output is incorrect due to faults in the computation along the path. A domain error occurs when an incorrect output is generated due to executing the wrong path through a program (Howden [1976]; definitions modified by Zeil [1989]). In practice domain testing theory is based on the intuitive idea that faults in the implementation are more likely to be found by test points chosen near appropriately defined program input and output domain boundaries (Tsai et al. 1990). T-VEC uses an extended form of domain testing theory to choose test points along input domain boundaries such that they are effective in detecting both decision and computation errors.
How are tool licenses managed?
Execution of T-VEC tools is controlled through a license management scheme based on Globetrotter Software's FLEXlm license management software. Once a license schedule is acquired from T-VEC, license keys are provided for specific machines based on their Ethernet address. These license keys enable tool operation and are provided as node-locked or floating. Node-locked licenses allow execution on a single machine. Floating licenses can be shared by some number of machines on a network and are managed by a license server that limits the number of concurrent users to the number of licenses available.
What are the system requirements for T-VEC products?
Currently, T-VEC's software products operate on Microsoft Windows 2000 and XP, running on Intel Pentium-class machines with a minimum of 512 MB of RAM (1 GB recommended).
What languages do the T-VEC tools support?
T-VEC's toolset operates on requirement specifications to automatically generate test vectors (test input and expected output values) useful for verifying that some implementation of the system satisfies the underlying specification. Thus, the toolset and its associated artifacts are designed to be used in conjunction with any programming language, development environment or testing environment. The T-VEC linear form is the specification or modeling language that the toolset supports natively. Specifications can be developed in the linear form directly, but more typically they are created in a modeling language better suited to support development efforts and organizational preferences. These models are then translated into a form compatible with the toolset to support model analysis, test generation, coverage analysis, and test driver generation.
Can I obtain an evaluation of T-VEC's products?
T-VEC's products are available for evaluation. Evaluation licenses are usually granted for a 30-day trial. Obtaining an evaluation copy of the toolset requires execution of a limited software licensing agreement to cover the evaluation period and a non-disclosure agreement to protect the proprietary information transferred during the evaluation period. Typically, the non-disclosures are mutual, because it is often necessary for the customer to give T-VEC access to their proprietary development information in order to facilitate the evaluation process. Because T-VEC is an advanced technology, it is common to have T-VEC consultants on site for training and mentoring purposes to jumpstart the evaluation process.
What are the key capabilities of T-VEC Test Generation Technology?
  • Scalable Constraints and Subsystems - Supports specification decomposition and control of the combinatorial explosion of test cases that can result from attempting to test every execution path.
  • Test Vectors - Typically, test case generators only produce test inputs. T-VEC produces test vectors including test inputs, expected outputs (post-conditions) and justification (direct traceability to governing requirements).
  • Specification Execution Engine - Specifications may be manually debugged to locate sources of contradiction.
  • Rigorous Test Selection Heuristics - T-VEC's criteria for selecting tests are based on an extended form of Domain Testing Theory that is effective in detecting both domain (i.e., decision) and computation errors by producing test points along system boundaries.
  • Non-linear Convergence Capability - The ability to generate test vectors for complex expressions.
Where have the tools and technology been successfully applied?
T-VEC has been used over the last 15 years primarily on aerospace applications. The primary application domain is high assurance systems, deployed on embedded targets. These high assurance systems typically undergo the most rigorous specification and verification processes for which T-VEC was developed to support. Following are some example applications from the aerospace domain in which the T-VEC technology was applied:
  • Traffic Alert and Collision Avoidance System (TCAS), FAA certified 1990 and re-certified in 1991
  • MD90 Electrical Power System (EPS), FAA certified 1995
  • Electronic Flight Instrumentation System, Flight Guidance System
  • Joint Strike Fighter
  • Medical devices
  • Smart cards
  • Security testing for databases
  • Space systems
  • Command and control
  • and more
What benefits can be expected by employing T-VEC's methods and tools?
  • Reduces overall lifecycle time and cost
  • Facilitates verification through automated test vector generation
  • Support for both structured and object-oriented specifications
  • Scalable hierarchical specifications
  • Concurrent development and verification
  • Reduces labor-intensive & error-prone tasks
  • Supports iterative verification
  • Improves management and accessibility
  • Direct test traceability (every test is mapped to a specific requirement)
  • Formal specification permits capture of statistical information
  • Supports multiple specification methods and tools leveraging existing assets
Does T-VEC support Regression testing? How?
Yes - inherently. T-VEC's approach is to automatically generate tests from requirements models. During system evolution (i.e., change) that necessitates regression testing, the underlying requirements specifications are updated to characterize the system's new functionality. These updated specifications serve as the basis for automated test generation, and the newly created tests are used in conjunction with the testing infrastructure created during the initial software development effort to retest the updated system.

The savings are dramatic. In one example of applying these techniques, a prime contractor estimated it would take 6 engineers 6 months to update an avionics component. Using automated testing techniques, the job was accomplished in 4.5 weeks by 2 engineers.
How is T-VEC typically used to replace manual processes?
The primary manual processes eliminated or reduced include:
  • test design (i.e., determining test inputs, expected outputs, and traceability to requirements)
  • test driver development
  • specification analysis (i.e., determining specification correctness by identifying contradictions)
The key difference from standard approaches to software development is that model analysis and test vector generation are performed both during and immediately following requirements specification, by the specifier rather than the tester. This effectively zero-effort task helps ensure model correctness and produces the test cases necessary for unit and integration testing. Once the software interfaces stabilize during design and implementation, test driver design and generation proceed such that test execution is automatic when the implementation becomes available. The ultimate effect is that cycle time can be significantly reduced.
How would automated testing improve quality?
The list is long; the following hits a few key points.
The SEI test maturity model best describes it. Organizations introducing automated testing into their development processes inherently migrate to higher levels of maturity, in turn improving quality and decreasing cost. The focus shifts from error-prone, non-rigorous, manually intensive, and costly testing to measurable, optimizing test activities. Compliance measures are tooled, not subjective.
Automating the test activities permits effort to be spent on "reasoning" about the system and developing well-formed requirements, letting the tools demonstrate compliance and identify contradictory requirements. Tests cannot be produced without consistent requirements, so the first step in automated testing is actually "proving" consistency within the source model, which improves product quality early in the development phase by insisting on well-formed requirements.
With automated testing, the test artifacts have a consistent form, so they can be measured in a meaningful way. The consistent form of the artifacts also permits automated generation of test drivers (tests ready for injection into a particular target environment) through simple schema definition, automating another manual, error-prone, and costly task.
With manual techniques, even the simplest requirements are generally not tested well or with any consistent test point selection heuristics, and more complex problems cannot possibly be properly tested in any reasonable amount of time without tool support. T-VEC has an advanced set of test heuristics exceeding FAA DO-178B guidelines for software aspects of certification. Our heuristics target the corner points of test solution areas which, under domain testing theory (one of the most accepted theories of testing), yield the highest defect detection. These heuristics have evolved for over a decade and could never be practically reproduced by a human. There are more avenues of discussion here, but in general, automating a lifecycle task inherently improves quality (assuming the automation is effective in eliminating a human task), and automating the most costly and error-prone task has dramatic quality benefits.
Approximately how much time can we expect to save with test automation?
Programs have measured on the order of 6-8 times improvement in cycle time, with cost reductions of similar magnitude. Regression is even more impressive: once the infrastructure is in place, incremental testing is generally the push of a button after the requirements model has been updated.

In T-VEC, the tests are generated automatically from a source requirements model (true requirements-based testing, not a separate set of test specifications). This means that the requirements modeler makes their changes and the compliance testing is automatic: tests are generated, converted to target-specific drivers, and, assuming an appropriate test harness, automatically injected, results extracted, and reports generated for review. We have achieved complete automation many times.
How long will it take to integrate T-VEC into our organization?
The effort to integrate T-VEC into an organization depends heavily on the particular organization's development practices. The key is to have sufficiently rigorous requirements models, which serve as the basis for any downstream automation, testing or otherwise. One new customer formalized several requirements models for a legacy program and completely automated their test activities for it in about three weeks, with four days of our on-site assistance. Although this scenario is rare, it is possible. Typically, technology transfer takes 1.5 to 2 months.
Are any part(s) of testing difficult to automate?
Temporally-related requirements are generally more difficult to automate - not from a T-VEC vector generation perspective, but from a test injection perspective. The challenge is having a test harness that can automatically accept a vector and, in a general manner, determine that a particular response, or series of responses, has occurred within a specified time frame. Some of our customers incorporate sophisticated test harnesses and have successfully accomplished automated temporal-based testing, while other customers rely on manual testing techniques for temporal requirements and use T-VEC for the transformational testing (still typically 90+% of the total required testing). The key point is that T-VEC will produce the desired vectors even for temporal-based requirements, but a suitable test harness must be available to inject and extract the vectors.
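The deadline-checking problem described above can be sketched as a simple polling loop; the harness interface and the simulated device are hypothetical, since real harnesses are target-specific:

```python
# Sketch of a temporal check in a test harness: inject a stimulus, then
# poll for the expected response until a deadline passes. (Hypothetical
# harness interface; real target harnesses are hardware-specific.)
import time

def check_response_within(poll, expected, deadline_s, interval_s=0.01):
    """Return True if poll() yields `expected` before the deadline."""
    start = time.monotonic()
    while time.monotonic() - start < deadline_s:
        if poll() == expected:
            return True
        time.sleep(interval_s)
    return False

# Simulated device that responds roughly 30 ms after "injection":
t0 = time.monotonic()
device = lambda: "ACK" if time.monotonic() - t0 > 0.03 else None
assert check_response_within(device, "ACK", deadline_s=0.5)
```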
Can T-VEC be used with my requirement specifications? How?
Although the T-VEC tools only operate natively on the T-VEC linear form, a common approach is the use of translation techniques to convert specifications developed in other specification languages into a form suitable for processing by the toolset. This approach makes the full capabilities of the toolset applicable to the target specifications. While some translators are available from T-VEC, others can be developed independently or with T-VEC's support to satisfy an organization's specific needs. The success of the translation effort is determined by the rigor or formalism of the source specification language. In the absence of necessary formalism, specification or translation heuristics can be applied to support a meaningful translation. See the products section of the T-VEC website or contact a T-VEC representative for more information.
Where can I learn about SCR?
The Naval Research Lab has developed the SCR tool to support specification and analysis of SCR models. The tool and information on specifying in SCR are available. Contact Connie Heitmeyer (heitmeye@itd.nrl.navy.mil) for additional information.