Precision

Precision is the maximum number of significant digits that can result from an operation. It is controlled by the NUMERIC DIGITS instruction:
>>-NUMERIC DIGITS--+------------+--;---------------------------><
                   '-expression-'      

The expression is evaluated and must result in a positive whole number. This defines the precision (number of significant digits) to which calculations are carried out. Results are rounded to that precision, if necessary.
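
For example, here is a minimal sketch of how precision and rounding behave; it assumes the DIGITS() built-in function, which returns the current setting:

NUMERIC DIGITS 5
SAY 2 / 3            /* 0.66667   (rounded to five significant digits) */
SAY 123456 + 0       /* 1.2346E+5 (six digits, rounded to five)        */
NUMERIC DIGITS 5 + 5 /* the operand is an expression; this sets 10     */
SAY DIGITS()         /* 10                                             */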

If you do not specify expression in this instruction, or if no NUMERIC DIGITS instruction has been processed since the start of a program, the default precision is used. The REXX standard for the default precision is 9.
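
For example, in a program that has issued no NUMERIC DIGITS instruction, a sketch assuming the standard default of 9:

SAY DIGITS()   /* 9                                      */
SAY 2 / 3      /* 0.666666667 (nine significant digits)  */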

Note that NUMERIC DIGITS can set values below the default of nine. However, use small values with care; the loss of precision and rounding thus requested affects all REXX computations, including, for example, the computation of new values for the control variable in DO loops, as the sketch below shows.
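
The following sketch illustrates the DO-loop hazard. With NUMERIC DIGITS 2, the sum 99 + 1 rounds to 1.0E+2 (that is, 100), and 100 + 1 also rounds back to 100, so the control variable can never exceed 100 and the loop would never reach its TO limit; the FOR clause is added here only as a guard so the example terminates:

NUMERIC DIGITS 2
DO I = 90 TO 120 FOR 15   /* FOR caps the iterations; without it, this loops forever */
  SAY I                   /* 90, 91, ..., 99, then 1.0E+2 repeatedly                 */
END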

