Discrepancy due to floating point computations

A place for discussing the use and customization of the test results comparison function of the VGS.

Discrepancy due to floating point computations

Postby EatonMN » Fri Dec 12, 2008 12:59 pm

I am experiencing a problem where floating-point math results are slightly off between the expected and actual outputs from T-VEC, which causes tests to fail. Here is a snapshot of the problem.

Test  Object                                                           Expected                 Actual             Result
1     __DU_Controls_IntTest_rootData._output.yoke_setpoint_deg         0.0e+000                 0                  OK
2     __DU_Controls_IntTest_rootData._output.Mode_Valve_Pilot2         0                        0                  OK
2     __DU_Controls_IntTest_rootData._output.torque_avail_Nm           5.09295817894064e-001    0.509171478094775  *** Miscompare
2     __DU_Controls_IntTest_rootData._output.calculated_torque_out_Nm  -1.13176848420903e+010   -11314921735.4394  *** Miscompare
EatonMN
 
Posts: 1
Joined: Fri Dec 12, 2008 12:39 pm

Re: Discrepancy due to floating point computations

Postby busser » Fri Dec 12, 2008 5:41 pm

Yes, floating point comparisons are always problematic. This is why we built into T-VEC VGS a means of customizing how the results comparison tool decides whether an actual result is close enough to an expected result to be classified as PASS rather than FAIL.

The results comparison mechanism defaults to exact-match comparison for non-floating-point variables. For IEEE FLOAT32 and FLOAT64 variables (single/double in Simulink Tester, float/double in TTM), the default tolerance is 3 significant digits for single/float/FLOAT32 types and 9 significant digits for double/FLOAT64 types. The user can customize the comparison on a variable-by-variable basis, expressing the tolerance either as N significant digits or as a percentage of the expected value. The procedure to use this mechanism is as follows.
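As a rough illustration of what the two tolerance modes mean (a sketch under my own interpretation of "N significant digits", not T-VEC's actual code):

```python
def agrees_to_sig_digits(expected, actual, n):
    """Roughly: relative error smaller than half a unit in the n-th
    significant digit of the expected value."""
    if expected == actual:
        return True
    if expected == 0.0:
        return abs(actual) < 10.0 ** -n
    rel = abs(actual - expected) / abs(expected)
    return rel <= 0.5 * 10.0 ** (1 - n)

def agrees_within_percent(expected, actual, pct):
    """Tolerance expressed as a percentage of the expected value."""
    if expected == 0.0:
        return actual == 0.0
    return abs(actual - expected) <= abs(expected) * pct / 100.0

# The torque_avail_Nm row from the earlier post passes at 3 significant
# digits (the single-precision default) but fails at 9 (the double default):
print(agrees_to_sig_digits(5.09295817894064e-1, 0.509171478094775, 3))  # True
print(agrees_to_sig_digits(5.09295817894064e-1, 0.509171478094775, 9))  # False
```

The exact rounding rule VGS applies may differ in detail, but this is the general idea behind both modes.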

Each subsystem for which comparison customization is desired must be compiled with a special option that creates a tolerance file for that subsystem. This is done through one of the following VGS menu items. To compile all subsystems in the project and create a tolerance file for each of them, select the Project icon and right-click on

project_compile_with_tol.png


To compile an individual subsystem and create the accompanying tolerance file, select the desired subsystem and right click on

subsystem_compile_with_tol.png


The file will be named <subsystemName>.tol, and it contains a tolerance specification for each output variable of the associated subsystem. It is the T-VEC compiler that creates these tolerance files, which is why the menu item covers both compiling and creating tolerance files.

Once you have a tolerance file for a subsystem, the tolerances for the specific output variables you want to customize can be edited with the Tolerance File Editor.

tolerance_editor.png


Finally, to tell the results comparison process to use the information in a tolerance file (if one exists), an option must be selected in the Project Properties dialog box (or in the individual Subsystem Properties). The option that sets all subsystems in the project to be compared against their corresponding tolerance files (with default names) is located here,

project_compare_with_tol.png


To compare only specific subsystems against their tolerance files, set the option in each subsystem's properties instead. This option also lets you customize the name of the tolerance file to be used.

subsystem_compare_with_tol.png


When these options are set, run the expected vs. actual results comparison tool via

invoke_comparison.png


By using this tolerance specification mechanism to control whether two values are considered the same, you should be able to clean up the miscompare results you are seeing in your model. Please let me know if you have any further questions.
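To make the idea concrete, here is a sketch of how a per-variable tolerance table could re-classify the rows from the original post. The variable names come from that post, but the tolerance choices and the table format are hypothetical illustrations, not an actual .tol file:

```python
# Hypothetical per-variable tolerances (mode, value); this mimics the
# idea of a .tol file, not its actual format.
tolerances = {
    "torque_avail_Nm":          ("sig_digits", 3),
    "calculated_torque_out_Nm": ("percent", 0.1),
}

rows = [
    ("torque_avail_Nm", 5.09295817894064e-1, 0.509171478094775),
    ("calculated_torque_out_Nm", -1.13176848420903e10, -11314921735.4394),
]

for name, expected, actual in rows:
    mode, value = tolerances[name]
    rel = abs(actual - expected) / abs(expected)
    limit = 0.5 * 10.0 ** (1 - value) if mode == "sig_digits" else value / 100.0
    print(name, "OK" if rel <= limit else "Miscompare")  # both print OK
```

With tolerances like these in place, both rows that previously miscompared would be classified OK.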
busser
Site Admin
 
Posts: 52
Joined: Thu Mar 13, 2008 7:42 pm

