Scripting Guide
Version: 14.1
Published Wednesday, June 19, 2019
Table of contents
Table of contents 3
Getting started 13
Analytics scripting basics 14
Comments 19
Data types 21
Expressions 22
Defining computed fields with expressions 24
Functions 26
Variables 28
Control structures 30
Grouping and looping 33
Top 30 Analytics functions 41
Commands 53
ACCEPT command 54
ACCESSDATA command 59
ACTIVATE command 66
AGE command 68
APPEND command 72
ASSIGN command 80
BENFORD command 83
CALCULATE command 87
CLASSIFY command 89
CLOSE command 94
CLUSTER command 96
COMMENT command 99
COUNT command 101
CREATE LAYOUT command 103
CROSSTAB command 105
CVSEVALUATE command 109
CVSPREPARE command 113
PARAM 854
PASSWORD 866
DATA 869
RESULT 873
PUBLISH 877
Developing analytics 878
Adding analytic headers 882
Analytic development best practices 885
Packaging analysis apps 891
Sample analytic scripts (analysis app) 894
Appendix 899
System requirements 900
Installing ACL for Windows 903
Configuring Python for use with Analytics 905
Unicode versus non-Unicode editions 908
Converting analytics to Unicode 909
Checking for Unicode compatibility 912
Running R scripts on AX Server 914
Running Python scripts on AX Server 918
Analytic engine error codes 922
Variables created by Analytics commands 929
Reserved keywords 933
Getting started
Analytics scripting basics
Commands
Every line in a script executes an ACLScript command and starts with the command name. A command is
an instruction to execute an operation in Analytics.
The command name is followed by one or more parameters that are specified as parameter_name parameter_value.
Tip
Depending on the command, some parameters are required and some are optional. You do not need to specify optional parameters. If an optional parameter is omitted, the command executes without it, and Analytics uses the default value for that parameter, if one exists.
Comments
Like any scripting language, ACLScript lets you add comments with the COMMENT keyword. Use comments to make your code easier to understand and to communicate with anyone who may try to read, use, or understand your script. ACLScript supports two types of comments:
l single line comments – all text following COMMENT is ignored until the end of the line is reached
l multiple line comment blocks – begin with COMMENT and each subsequent line is ignored until the
END keyword, or a blank line, is reached
For more information and examples, see "Comments" on page 19.
Data types
ACLScript supports four basic data types:
l logical – the simplest data type. Logical data expresses a truth value of either true or false
l numeric – contain digits from 0 to 9 and, optionally, a negative sign and a decimal point
l character – a series of one or more characters
l datetime – a date, datetime, or time value expressed in a supported format
Each data type is treated differently by Analytics and can be used in different commands and functions. For
more information about data types, see "Data types" on page 21.
Expressions
An expression is any statement that has a value. The simplest form of expression is a literal such as 2 or "test". However, expressions usually appear as calculations and can be as complex as any valid combination of operators, conditions, functions, and values that you can imagine.
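For example, the following expression is a sketch only (the field names are illustrative); it combines a function, a comparison operator, and a logical operator:

ALLTRIM(Vendor_State) = 'NY' AND Invoice_Amount > 1000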
Expressions are typically used in Analytics to populate computed fields or as input for conditional logic. For
more information about expressions, see "Expressions" on page 22.
Functions
Functions are built-in routines that accept a given number of parameters and return a single value. Use
functions to manipulate field contents and variables that are used in commands.
Note
Functions do not modify field data. Instead, functions generate and return a new value based on a calculation or algorithm that uses field data or variables as input. Use the value the function returns as input for a command.
Functions start with the function name followed directly by an opening parenthesis, a comma-separated
list of 0 or more values that are passed into the function as arguments, and a closing parenthesis.
Example
The BETWEEN(value, min, max) function takes three arguments and returns true if the value falls within
the range or false if it falls outside the range:
l value – the expression or field to test
l min – the minimum of the range
l max – the maximum of the range
Variables
A variable is a temporary storage location used to hold a value. Variables have an associated identifier that lets you reference and work with the value stored in your computer's memory.
ACLScript uses the ASSIGN command to create a variable and assign it a value at the same time:
ASSIGN v_age_in_years = 3
For simplicity you can omit the ASSIGN keyword; ASSIGN is implicitly used and the same command runs:
v_age_in_years = 3
Note
ACLScript does not support null values. All variables must have an associated value of one
of the supported data types. The script interpreter evaluates the data type using the data
format and qualifier you use to assign the value. For more information, see "Data types" on
page 21.
Using variables
Once a variable is created, you can reference it anywhere you reference field names or variables. You can
also reassign it a new value using the ASSIGN command.
You can also use string interpolation, or variable substitution, to include a variable in a string literal by wrapping the variable name in % characters. When Analytics encounters the substituted variable, it replaces the placeholder with its corresponding value.
Control structures
A control structure is a component of a script that decides which direction to take based on given parameters. ACLScript provides both conditional logic and looping structures.
Conditional logic
ACLScript implements conditional logic as an IF command and as an optional parameter on many commands in the language.
Tip
You use the IF command to control whether a command runs, while you use the IF parameter to decide which records in a table a command runs against.
IF command
IF parameter
Looping
The LOOP command provides the looping control structure in ACLScript. This command processes the
statements inside the loop for as long as the control test expression evaluates to true.
For more information about control structures, see "Control structures" on page 30.
Comments
Like any scripting language, ACLScript lets you add comments with the COMMENT keyword. Use comments to make your code easier to understand and to communicate with anyone who may try to read, use, or understand your script.
Comment types
ACLScript supports two types of comments:
l single line comments – all text following COMMENT is ignored until the end of the line is reached
l multiple line comment blocks – begin with COMMENT and each subsequent line is ignored until the
END keyword, or a blank line, is reached
COMMENT
**********************
** This section of the script prepares data for import
**********************
END
COMMENT
************************************************************
*** Script Name: {App_ID}{Script name}
*** Parameters: {Detailed description}
Data types
ACLScript supports four basic data types: logical, numeric, character, and datetime.
Character – A series of one or more characters. Maximum length: 32,767 bytes. Qualifier: single quotation marks or double quotation marks. Examples: 'John Doe' or "John Doe"
Logical – The simplest data type. Logical data expresses a truth value of either true (T) or false (F). Qualifier: none. Example: ASSIGN v_truth = 5 > 4 evaluates to T. Comparison operators such as '=', '>', and '<' return logical values.
Expressions
An expression is any statement that has a value. The simplest form of expression is a literal; however, expressions can be as complex as any legal combination of operators, conditions, functions, and values you can imagine.
Expression components
Literal values
A literal value is a value written exactly as it is meant to be interpreted, such as the character literal 'my
value'. For information about literals, see "Data types" on the previous page.
Operators
Operators are symbols that tell the script interpreter to perform arithmetic, string, comparison, or logical
evaluation of the specified values:
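For example, the following expression (a sketch; the field names and values are illustrative) uses arithmetic, comparison, and logical operators together:

(Quantity * Unit_Price) > 1000.00 AND Product_Class = '05'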
Functions
Expressions are evaluated using the values returned by functions. Functions execute with the highest precedence of any expression component. For more information about functions, see "Functions" on page 26.
Example expressions
Evaluates to 6
(2 + (3 - 2)) * 2
Evaluates to true
((2 + (3 - 2)) * 2) > ROOT(9,0)
Defining computed fields with expressions
Tip
Prefix computed field names with c_ to identify them as computed data rather than original
source data.
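A minimal sketch of a conditional computed field of the kind discussed below (the field, rate, and state names are illustrative; the empty line after the command line is required here, as the note that follows explains):

DEFINE FIELD c_sales_tax COMPUTED

amount * ny_rate IF state = 'NY'
amount * ca_rate IF state = 'CA'
amount * general_rate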
When the first conditional expression evaluates to true, the value specified for that case is used. In this example, amount * general_rate is the default value used when neither of the conditional expressions evaluates to true.
Note
You must add an empty line between the command line and the conditions unless you include the IF, WIDTH, PIC, or AS parameters on the DEFINE FIELD command. For more information, see "DEFINE FIELD . . . COMPUTED command" on page 129.
Functions
Functions are built-in routines that accept a given number of parameters and return a single value. Use
functions to manipulate field contents and variables that are used in commands.
Note
Functions do not modify field data. Instead, functions generate and return a new value based on a calculation or algorithm that uses field data or variables as input. Use the value the function returns as input for a command.
Function syntax
Functions start with the function name followed directly by an opening parenthesis, a comma-separated
list of 0 or more values that are passed into the function as arguments, and a closing parenthesis.
Example
The BETWEEN(value, min, max) function takes three arguments and returns true if the value falls within
the range or false if it falls outside the range:
l value – the expression or field to test
l min – the minimum of the range
l max – the maximum of the range
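A sketch of a call to BETWEEN( ) (the field name and date values are illustrative):

COMMENT returns T if Invoice_Date falls within the range
BETWEEN(Invoice_Date, `20160101`, `20161231`)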
Function arguments
An argument of a function is a specific input value passed into the function.
Function arguments are passed to functions via an argument list. This is a comma-delimited list of literal
values, variables, or expressions that evaluate to values of the parameter data type. For more information
about working with data types, see "Data types" on page 21.
Note
If your project works with European number formats, or if you are writing scripts that are
portable across regions, separate function arguments with a space character instead of a
comma unless you are passing in a signed numeric value. Functions accepting signed
numeric values require an explicit delimiter.
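For example, a sketch of a BETWEEN( ) call that uses space-separated arguments (the field name and values are illustrative):

BETWEEN(Invoice_Amount 500 5000)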
Functions vs commands
The distinction between commands and functions is subtle but critical to using ACLScript:
Functions:
o Use fields, values, or records as input and generate a new value that is returned.
o Used in expressions, computed fields, command parameter values, variables, and filters to assist and modify command execution.
Commands:
o Use tables as input and generate new records and tables.
o Used to analyze data, import data, and produce results.
Variables
A variable is a temporary storage location used to hold a value. Variables have an associated identifier that lets you reference and work with the value stored in your computer's memory.
ASSIGN v_age_in_years = 3
For simplicity you can omit the ASSIGN keyword; ASSIGN is implicitly used and the same command runs:
v_age_in_years = 3
Note
ACLScript does not support null values. All variables must have an associated value of
one of the supported data types. The script interpreter evaluates the data type using the
data format and qualifier you use to assign the value. For more information, see "Data
types" on page 21.
Using variables
Once a variable is created, you can reference it anywhere you reference field names or variables. You can
also reassign it a new value using the ASSIGN command.
You can also use string interpolation, or variable substitution, to include a variable in a string literal by wrapping the variable name in % characters. When Analytics encounters the substituted variable, it replaces the placeholder with its corresponding value:
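A minimal sketch (the variable value and table name are illustrative):

ASSIGN v_table_name = "Invoices_2016"
OPEN %v_table_name%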
Types of variables
Analytics uses the following types of variables:
l system-generated variables – automatically created after executing a command
l permanent variables – remain in your computer's memory until you delete them and persist after clos-
ing the Analytics project
Note
To define a permanent variable, prefix the identifier with an '_': _v_company_name =
'Acme'.
l session variables – remain in your computer's memory until you delete them or until the
Analytics project is closed
Variable identifiers
Variable identifiers are case-insensitive and follow certain conventions related to the type of variable:
l system-generated variable identifiers use all caps: OUTPUTFOLDER
l permanent variable identifiers must have a '_' prefix: _v_permanent
l session variable identifiers use the format v_varname by convention but you are not restricted to this
naming convention
DISPLAY v_age_in_years
When the script encounters this command, it writes the command to the log file. To view the variable value
at that stage of script execution, click the entry in the log.
Tip
You can also use variables to help debug by inserting breakpoints in your script and inspecting the variable values on the Variables tab of the Navigator.
Control structures
A control structure is a component of a script that decides which direction to take based on given parameters. ACLScript provides both conditional IF logic and looping structures.
The IF command
When using the IF command, you specify a conditional expression followed by the command to execute if
the expression evaluates to true:
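For example, a sketch in which the variable and field names are illustrative:

IF v_classify_table = T CLASSIFY ON Vendor_State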
This conditional structure controls which code executes, so you can use the IF command when you want to
process an entire table based on the test expression. If the expression evaluates as true, the command is
run against all records in the table. For more information about the IF command, see "IF command" on
page 232.
IF parameter
Many commands accept an optional IF parameter that you can use to filter which records the command is
executed against:
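A sketch of such a statement, reconstructed from the description that follows (the field names are illustrative):

CLASSIFY ON customer_number SUBTOTAL amount IF state = 'NY'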
When this statement executes, the script classifies all records in the table where the value of the state field
is 'NY'.
Looping
The LOOP command
The LOOP command provides the looping control structure in ACLScript.
Note
The LOOP command must execute within the GROUP command; it cannot stand alone.
This command processes the statements inside the loop for as long as the specified WHILE expression is
true:
ASSIGN v_counter = 10
GROUP
LOOP WHILE v_counter > 0
v_total = v_total + amount
v_counter = v_counter - 1
END
END
This structure iterates 10 times and adds the value of the amount field to the variable v_total. At the end of
each iteration, the v_counter variable is decremented by 1 and then tested in the WHILE expression. Once
the expression evaluates to false, the loop completes and the script progresses.
When the loop completes, v_total holds the sum of the 10 records' amount fields.
For more information about looping, see "LOOP command" on page 328.
Looping with a subscript
Sometimes the LOOP command does not provide the exact looping functionality you may require. In this case, you can also call a separate Analytics script to execute a loop using the DO SCRIPT command: DO SCRIPT scriptName WHILE conditionalTest.
You can use one of the following common methods to control when your loop ends:
l flag – the loop continues until the logical flag variable is set to FALSE
l counter – the loop continues until an incrementing or decrementing variable crosses a conditional
threshold
For more information about calling subscripts, see "DO SCRIPT command" on page 166.
Example
You need to import all the CSV files in the C:\data folder into your project. You can use the DIRECTORY command to get a list of files from the folder; however, you cannot use the IMPORT command inside the GROUP structure. You need an alternative way of looping through the table that DIRECTORY creates.
To achieve this, you create a main script that:
1. Executes the DIRECTORY command and saves the results to a table.
2. Gets the number of records in the table to use as a counter.
3. Calls a subscript once per record in the table to execute the IMPORT command against the current
record.
Main script
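A minimal sketch of what the main script could look like, based on the steps above (the file mask, table name, and script name are illustrative; COUNT stores its record count in the COUNT1 system variable):

COMMENT Main_Script
DIRECTORY "C:\data\*.csv" TO T_Table_To_Loop
OPEN T_Table_To_Loop
COUNT
ASSIGN v_Num_Records = COUNT1
ASSIGN v_Counter = 1
DO SCRIPT Import_Subscript WHILE v_Counter <= v_Num_Records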
Import subscript
COMMENT Import_Subscript
OPEN T_Table_To_Loop
LOCATE RECORD v_Counter
COMMENT the IMPORT command for the current CSV file goes here
ASSIGN v_Counter = v_Counter + 1
Variables are shared among all scripts that run in the project, so the main script calls the subscript until the value of v_Counter exceeds the value of v_Num_Records. Each time the subscript executes, it increments v_Counter.
This structure allows you to call the IMPORT command against each record while looping through the
table. When the main script completes, you have imported all CSV files from the C:\data folder.
Grouping and looping
To calculate a running total of the invoice amounts, you use the GROUP command. Inside each iteration of GROUP, you:
1. Calculate the running total as of the current record.
2. Extract the invoice number, amount, date, and running total to a results table.
OPEN Ap_Trans
COMMENT iterate over each record in the table and then calculate and extract the running total
GROUP
ASSIGN v_running_total = v_running_total + Invoice_Amount
EXTRACT Invoice_Number, Invoice_Amount, Invoice_Date, v_running_total AS "Running total" TO results1
END
When the script runs, the commands inside the GROUP block are processed against each record in the
table, from top to bottom, and the running total is calculated and extracted. If we could walk through
GROUP as it runs, this is how it would look:
Note
If a record evaluates to true for more than one case, the record is only processed by the
first IF/ELSE block that tests it. Records are never processed by more than one IF/ELSE
block in a GROUP command.
OPEN Ap_Trans
COMMENT use GROUP IF to run different ASSIGN and EXTRACT commands depending on invoice amount
GROUP IF Invoice_Amount >= 1000
ASSIGN v_running_total_hi = v_running_total_hi + Invoice_Amount
EXTRACT Invoice_Number, Invoice_Amount, Invoice_Date, v_running_total_hi AS "Running total" TO results_hi
ELSE IF Invoice_Amount >= 100
ASSIGN v_running_total_med = v_running_total_med + Invoice_Amount
EXTRACT Invoice_Number, Invoice_Amount, Invoice_Date, v_running_total_med AS "Running total" TO results_med
ELSE
ASSIGN v_running_total_low = v_running_total_low + Invoice_Amount
EXTRACT Invoice_Number, Invoice_Amount, Invoice_Date, v_running_total_low AS "Running total" TO results_low
END
When the script runs, the GROUP command tests the invoice amount for each record. Depending on the
amount, the record is used to update one of three running totals (low, medium, high) and three result
tables are produced.
COMMENT
use GROUP to count commas in each department code field as a way of identifying how many departments are associated with the record
"LOOP" over each record for each code in the field, extracting each code into its own record
END
GROUP
v_department_count = OCCURS(Department_Code,',')
v_counter = 0
LOOP WHILE v_counter <= v_department_count
v_dept = SPLIT(Department_Code, ',', (v_counter + 1))
EXTRACT FIELDS Invoice_Number, Invoice_Amount, v_dept AS "Department" TO result1
v_counter = v_counter + 1
END
END
When the script runs, the commands inside the GROUP block are processed against each record in the table, from top to bottom. For each record, the LOOP command iterates over the record once per department code in the comma-delimited list and then extracts a record. If we could walk through GROUP and LOOP as they run, this is how it would look:
For the first record in the table, the value of v_department_count is 1, so LOOP iterates twice:
1. For the first iteration of the LOOP:
l v_counter = 0
l v_dept = CCD
The following record is extracted and the value of v_counter is incremented to 1, therefore LOOP
iterates again:
l v_dept = RDR
The following record is extracted and the value of v_counter is incremented to 2, therefore LOOP
does not iterate again and GROUP proceeds to the next record:
For the second record in the table, the value of v_department_count is 0, so LOOP iterates once:
l v_counter = 0
l v_dept = CCD
The following record is extracted and the value of v_counter is incremented to 1, therefore LOOP does not
iterate again and GROUP proceeds to the next record:
For the third record in the table, the value of v_department_count is 2, so LOOP iterates three times:
1. For the first iteration of LOOP:
l v_counter = 0
l v_dept = CCD
The following record is extracted and the value of v_counter is incremented to 1, therefore LOOP iterates again:
l v_dept = LMO
The following record is extracted and the value of v_counter is incremented to 2, therefore LOOP iterates again:
l v_dept = RDR
The following record is extracted and the value of v_counter is incremented to 3, therefore LOOP does not iterate again and GROUP reaches the end of the table:
Top 30 Analytics functions
The top 30 functions in ACLScript are useful across a number of different tasks. Use these functions regularly to help you prepare, parse, convert, and harmonize data in your scripts.
ALLTRIM( )
Returns a string with leading and trailing spaces removed from the input string.
Note
It is good practice to use ALLTRIM() on any character field that you are using as input for
another function so that no leading or trailing spaces affect the returned value.
Example
The Vendor_Number field contains the value " 1254". You need to remove this extra space from Vendor_Number so that you can harmonize the field with data in another table.
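A sketch of the call (the returned value assumes the field value above):

COMMENT returns "1254"
ALLTRIM(Vendor_Number)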
UPPER( )
Returns a string with alphabetic characters converted to uppercase.
Example
The Last_Name field contains the value "Smith". You need to make this value uppercase to compare with
an uppercase value from another table.
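A sketch of the call (output assumes the value above):

COMMENT returns "SMITH"
UPPER(Last_Name)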
LOWER( )
Returns a string with alphabetic characters converted to lowercase.
Example
The Last_Name field contains the value "Smith". You need to make this value lowercase to compare with a lowercase value from another table.
PROPER( )
Returns a string with the first character of each word set to uppercase and the remaining characters set to
lowercase.
Example
The Last_Name field contains the value "smith". You need to display it as a proper noun in your output.
SUBSTR( )
Returns a specified substring from a string.
Example
The GL_Account_Code field contains the value "001-458-873-99". You need to extract the first three
bytes, or characters, from the string.
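A sketch of the call (output assumes the value above):

COMMENT returns "001"
SUBSTR(GL_Account_Code, 1, 3)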
LAST( )
Returns a specified number of characters from the end of a string.
Example
The GL_Account_Code field contains the value "001-458-873-99". You need to extract the last two bytes,
or characters, from the string.
SPLIT( )
Returns a specified segment from a string.
Example
The GL_Account_Code field contains the value "001-458-873-99". You need to extract the second segment of the code from the string.
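A sketch of the call (output assumes the value above):

COMMENT returns "458"
SPLIT(GL_Account_Code, '-', 2)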
AT( )
Returns a number specifying the starting location of a particular occurrence of a substring within a character
value.
Example
The GL_Account_Code field contains the value "001-458-873-99". You need to determine the starting byte
position of the value "458" to test whether the GL code's second segment is "458" (start position "5").
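A sketch of the call (output assumes the value above):

COMMENT returns 5
AT(1, '458', GL_Account_Code)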
OCCURS( )
Returns a count of the number of times a substring occurs in a specified character value.
Example
The GL_Account_Code field contains the value "001-458-873-99". You need to determine that the
GL code is correctly formatted by ensuring the data contains three hyphen characters.
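A sketch of the call (output assumes the value above):

COMMENT returns 3
OCCURS(GL_Account_Code, '-')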
LENGTH( )
Returns the number of characters in a string.
Example
The GL_Account_Code field contains the value "001-458-873-99". You need to determine that the
GL code is correctly formatted by ensuring the data contains 14 characters.
STRING( )
Converts a numeric value to a character string.
Example
The Invoice_Amount field contains the value 12345.67. You need to convert this to character data.
VALUE( )
Converts a character string to a numeric value.
Tip
VALUE( ) is often used with ZONED( ) to add leading zeros.
Example
The Invoice_Amount field contains the value "12345.67". You need to convert this to numeric data.
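A sketch of the call (output assumes the value above):

COMMENT returns 12345.67
VALUE(Invoice_Amount, 2)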
CTOD( )
Converts a character or numeric date value to a date. Can also extract the date from a character or numeric
datetime value and return it as a date. Abbreviation for "Character to Date".
Example
The Submission_Date field contains the value "April 25, 2016". You need to convert this to datetime data.
DATE( )
Extracts the date from a specified date or datetime and returns it as a character string. Can also return the
current operating system date.
Example
The Submission_Date field contains the value `20160425`. You need to convert this to character data.
ZONED( )
Converts numeric data to character data and adds leading zeros to the output.
Example
The Employee_Number field contains the value "254879". You need to convert the value to a 10-digit
string with leading zeros.
Tip
You must use the VALUE() function to convert the character to numeric data before using
the numeric as input for ZONED().
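A sketch of the nested call described in the tip (output assumes the value above):

COMMENT returns "0000254879"
ZONED(VALUE(Employee_Number, 0), 10)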
BINTOSTR( )
Returns Unicode character data converted from ZONED or EBCDIC character data. Abbreviation for "Binary to String".
Note
Unicode edition only. For non-Unicode editions, see ZONED() above.
Example
The Employee_Number field contains the value "254879". You need to convert the value to a 10-digit
string with leading zeros.
Tip
You must use the VALUE() function to convert the character to numeric data before using
the numeric as input for ZONED(). You then use BINTOSTR() to convert the ASCII data
returned from ZONED() to Unicode.
MONTH( )
Extracts the month from a specified date or datetime and returns it as a numeric value (1 to 12).
Example
The Transaction_Date field contains the value `20160815 100252`. You need to extract the month as character data with a leading zero.
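A sketch of the call, wrapped in ZONED( ) to add the leading zero (output assumes the value above):

COMMENT returns "08"
ZONED(MONTH(Transaction_Date), 2)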
DAY( )
Extracts the day of the month from a specified date or datetime and returns it as a numeric value (1 to 31).
Example
The Transaction_Date field contains the value `20160815 100252`. You need to extract the day as character data.
YEAR( )
Extracts the year from a specified date or datetime and returns it as a numeric value using the YYYY format.
Example
The Transaction_Date field contains the value `20160815 100252`. You need to extract the year as a
numeric value.
HOUR( )
Extracts the hour from a specified time or datetime and returns it as a numeric value using the 24-hour
clock.
Example
The Transaction_Date field contains the value `20160815 100252`. You need to extract the hours as a
numeric value.
COMMENT returns 10
HOUR(Transaction_Date)
MINUTE( )
Extracts the minutes from a specified time or datetime and returns it as a numeric value.
Example
The Transaction_Date field contains the value `20160815 100252`. You need to extract the minutes as a
numeric value.
COMMENT returns 2
MINUTE(Transaction_Date)
SECOND( )
Extracts the seconds from a specified time or datetime and returns it as a numeric value.
Example
The Transaction_Date field contains the value `20160815 100252`. You need to extract the seconds as a
numeric value.
COMMENT returns 52
SECOND(Transaction_Date)
CDOW( )
Returns the name of the day of the week for a specified date or datetime. Abbreviation for "Character Day of
Week".
Example
The Transaction_Date field contains the value `20160815 100252`. You need to extract the name of the
day as character data.
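A sketch of the call (output assumes the value above; the second argument sets the length of the returned string):

COMMENT returns "Monday"
CDOW(Transaction_Date, 6)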
CMOY( )
Returns the name of the month of the year for a specified date or datetime. Abbreviation for "Character
Month of Year".
Example
The Transaction_Date field contains the value `20160815 100252`. You need to extract the name of the
month as character data.
Manipulating strings
Remove or replace segments of character fields using these functions.
INCLUDE( )
Returns a string that includes only the specified characters.
Example
The Address field contains the value "12345 ABC Corporation". You need to extract the address number
and exclude the name of the company.
EXCLUDE( )
Returns a string that excludes the specified characters.
Example
The Address field contains the value "12345 ABC Corporation". You need to extract the name of the company and exclude the address number.
REPLACE( )
Replaces all instances of a specified character string with a new character string.
Example
The Address field contains the value "12345 Acme&Sons". You need to replace the "&" character with the
word " and ".
OMIT( )
Returns a string with one or more specified substrings removed.
Example
The Address field contains the value "12345 Fake St". You need to extract the address without the street
suffix.
REVERSE( )
Returns a string with the characters in reverse order.
Example
The Report_Line field contains the value "001 Correction 5874.39 CR ". You need to reverse the value and
omit any leading or trailing spaces.
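A sketch of the call, combined with ALLTRIM( ) to drop the leading and trailing spaces (output assumes the value above):

COMMENT returns "RC 93.4785 noitcerroC 100"
REVERSE(ALLTRIM(Report_Line))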
BLANKS( )
Returns a string containing a specified number of blank spaces.
Example
You need to create a computed field for a region name based on a value in the region_code field. You must
ensure that the default value you specify at the end of the command is at least as long as the longest input
value.
"Southern" IF region_code = 1
"Northern" IF region_code = 2
"Eastern" IF region_code = 3
"Western" IF region_code = 4
BLANKS(v_length)
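A sketch of the full computed field definition these conditions could belong to (the field name c_region_name and the variable v_length are illustrative; the empty line after the command line is required when none of the IF, WIDTH, PIC, or AS parameters are included):

DEFINE FIELD c_region_name COMPUTED

"Southern" IF region_code = 1
"Northern" IF region_code = 2
"Eastern" IF region_code = 3
"Western" IF region_code = 4
BLANKS(v_length)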
Commands
ACCEPT command
Creates a dialog box that interactively prompts users for one or more script input values. Each input value
is stored in a named character variable.
Note
Using the ACCEPT command to enter passwords is not secure. You should use the
"PASSWORD command" on page 350 instead.
The ACCEPT command is not supported in AX Server analytics.
You can create a more advanced interactive dialog box with the "DIALOG command" on
page 148.
Syntax
ACCEPT {message_text <FIELDS project_item_category> TO variable_name} <...n>
Parameters
Name Description
message_text The label displayed in the dialog box used to prompt for input. Must be a quoted string
or a character variable.
When entering multiple prompts, you can separate them with commas. Using commas
improves script readability, but it is not required:
FIELDS project_item_category Creates a drop-down list of project items for user input instead of a text box. The user can select a single project item, field, or variable from the list.
optional
project_item_category specifies which item types to display in the list. For example, specifying xf displays all the project tables in the list. Enclose project_item_category in quotation marks:
FIELDS "xf"
For the codes used to specify categories, see "Codes for project item categories" on page 57.
You can specify more than one code in the same prompt, but you cannot mix project items, fields, or variables.
TO variable_name The name of the character variable to use to store the user input. If the variable does not
exist, it is created.
If the variable already exists, its current value is displayed in the dialog box as the
default value.
Note
You cannot use non-English characters, such as é, in the names of variables that will be used in variable substitution. Variable names that contain non-English characters will cause the script to fail.
The ACCEPT command creates character variables only. If you need
input of another data type, you must convert the character variable to the
required type in subsequent processing in a script. For more information,
see "Input data type" on page 57.
Examples
Prompting the user to select the Analytics table to open
You require a dialog box that prompts the user to select the name of the table to open. The script then opens
the table the user selects:
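A sketch of such a script (the prompt text is illustrative; the variable name v_table_name matches the explanation that follows):

ACCEPT "Select the table to open" FIELDS "xf" TO v_table_name
OPEN %v_table_name%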
The percent signs are required because they indicate that the table name to open is stored in the v_table_name variable. If the percent signs are omitted, the script attempts to open a table called "v_table_name".
Using a single dialog box with multiple prompts to gather required input
You want to create a single dialog box for all values that the script user must enter.
You use multiple prompts separated by commas in the ACCEPT command to ask the user for multiple
input values. The same dialog box contains prompts for the start date and the end date of a date range:
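A sketch of such a command (the prompt text and variable names are illustrative):

ACCEPT "Enter the start date (YYYYMMDD)" TO v_start_date, "Enter the end date (YYYYMMDD)" TO v_end_date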
Remarks
Interactivity
Use ACCEPT to create an interactive script. When the ACCEPT command is processed, the script pauses and a dialog box is displayed that prompts the user for input that Analytics uses in subsequent processing.
You can create separate dialog boxes that prompt for one item at a time, or you can create one dialog box
that prompts for multiple items.
Project categories
Code Category
xf Tables
xb Scripts
xi Indexes
xw Workspaces
Field categories
Code Category
C Character fields
N Numeric fields
D Datetime fields
L Logical fields
Variable categories
Code Category
c Character variables
n Numeric variables
d Datetime variables
l Logical variables
or datetime value:
In the example, the start and end dates for this filter are stored as character values. They must be converted to date values in order to be used with a date field that uses a Datetime data type.
Enclosing the variable name in percent signs (%) substitutes the character value contained by the variable
for the variable name. The CTOD( ) function then converts the character value to a date value.
ACCESSDATA command
Imports data from a variety of ODBC-compliant data sources.
The command takes the form ACCESSDATA64 or ACCESSDATA32 depending on whether you are using
a 64-bit or 32-bit ODBC driver.
Syntax
{ACCESSDATA64 | ACCESSDATA32} {CONNECTOR | ODBC {"Driver"|"Dsn"|"File"}} NAME value <USER user_id> <PASSWORD num | PROMPT_PASSWORD> TO table_name CHARMAX max_field_length MEMOMAX max_field_length <ALLCHARACTER> SOURCE (connection_settings) <HASH(salt_value, fields)>
SQL_QUERY
(SQL_syntax)
END_QUERY
Parameters
Name Description
NAME value The name of the Analytics data connector, the ODBC driver, or the DSN.
For example:
o NAME "Amazon Redshift"
o NAME "Microsoft Access Driver (*.mdb, *.accdb)"
o NAME "My Excel DSN"
o NAME "excel.dsn"
USER user_id The user ID for data sources that require a user ID.
optional
For more information, see "PASSWORD command" on page 350 and "SET command" on
page 408.
Tip
Using the PASSWORD command with PASSWORD num is similar to
using PROMPT_PASSWORD. Both approaches prompt the user for a
password. PROMPT_PASSWORD has the benefit of allowing updating of
the user_id.
CHARMAX max_field_length The maximum length in characters for any field in the Analytics table that originates as character data in the source from which you are importing.
The default value is 50. Data that exceeds the maximum field length is truncated when imported to Analytics.
MEMOMAX max_field_length The maximum length in characters for text, note, or memo fields you are importing.
The default value is 100. Data that exceeds the maximum field length is truncated when imported to Analytics.
ALLCHARACTER Automatically assign the Character data type to all imported fields.
optional Once the data is in Analytics, you can assign different data types, such as Numeric or Datetime, to the fields, and specify format details.
Tip
ALLCHARACTER is useful if you are importing a table that contains
numeric ID values. You can use ALLCHARACTER to prevent Analytics
automatically assigning the Numeric data type to values that should use
the Character data type.
SOURCE connection_settings The connection settings (connection string) required to connect to the data source.
HASH(salt_value, fields) Imports the specified fields as cryptographic hash values. Hash values are one-way transformations and cannot be decoded after you import the fields:
optional
o salt_value – an alphanumeric string that is concatenated with the source data values to strengthen the hashing of the values in the fields. Enter the salt value as a quoted string.
The salt value is limited to 128 characters. Do not use any of the following characters:
( ) "
o fields – a list of one or more fields to hash. Enter the fields as a quoted string and separate each field with a comma.
You must specify the field name you see in the Data Access window preview and staging area, not the physical field name in the data source.
Note
The field name that is shown in the Data Access window preview is the
field alias value in the SQL query ("field_name" AS "alias"). You must
use the alias value to reference fields.
For information about comparing values hashed during import to values hashed in
ACLScript, see "Comparing data hashed with ACCESSDATA to data hashed with the
ACLScript HASH( ) function" on page 65.
Examples
Importing data using a native Analytics data connector
You need to import data from the Amazon Redshift cloud data service. To do so, you use the
Analytics Amazon Redshift data connector:
;declarefetchmode=0;maxbytea=255;maxlongvarchar=8190;maxvarchar=255;port=5439;servername=acl_test.highbond.com;singlerowmode=1;sslmode=require;textaslongvarchar=0;usemultiplestatments=0;useunicode=1)
SQL_QUERY(
SELECT
"entitlement_history"."organization" AS "organization",
"entitlement_history"."user_email" AS "user_email",
"entitlement_history"."plan_id" AS "plan_id",
"entitlement_history"."date_from" AS "date_from",
"entitlement_history"."date_to" AS "date_to"
FROM
"prm"."entitlement_history" "entitlement_history"
) END_QUERY
`Orders`.`Quantity` AS `Quantity`,
`Product`.`ProdID` AS `Product_ProdID`,
`Product`.`ProdName` AS `ProdName`,
`Product`.`UnitPrice` AS `UnitPrice`,
`Product`.`Descript` AS `Descript`,
`Product`.`ShipWt` AS `ShipWt`
FROM
(`Customer` `Customer`
INNER JOIN
`Orders` `Orders`
ON `Customer`.`CustID` = `Orders`.`CustID`
)
INNER JOIN
`Product` `Product`
ON `Orders`.`ProdID` = `Product`.`ProdID`
WHERE
(
`Customer`.`Region` = 'BC'
OR `Customer`.`Region` = 'WA'
)
) END_QUERY
Remarks
Note
For more information about how this command works, see the Analytics Help.
Once you hash the Analytics values, you can compare them with values hashed as part of the
ACCESSDATA command import.
ACTIVATE command
Adds field definitions stored in an Analytics workspace to the existing set of field definitions in an Analytics
table layout.
Syntax
ACTIVATE <WORKSPACE> workspace_name <OK>
Parameters
Name Description
Examples
Activating a workspace in your Analytics project
You activate the ComplexFormulas workspace:
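A sketch of the command (based on the syntax above):

ACTIVATE WORKSPACE ComplexFormulas OK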
Activating a workspace saved as a file (.wsp) in the same folder as your Analytics project
You activate the ComplexFormulas workspace that was saved to a .wsp file:
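A sketch of the command (based on the syntax above; the .wsp extension identifies the workspace file):

ACTIVATE WORKSPACE ComplexFormulas.WSP OK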
Remarks
How it works
ACTIVATE makes workspace field definitions available to the active table. Once you activate a workspace,
its fields remain available for use with the active table until you close the table.
AGE command
Groups records into aging periods based on values in a date or datetime field. Counts the number of
records in each period, and also subtotals specified numeric fields for each period.
Syntax
AGE <ON> date_field <CUTOFF cutoff_date> <INTERVAL days <,...n>> <SUPPRESS>
<SUBTOTAL numeric_field <...n> |SUBTOTAL ALL> <IF test> <WHILE test> <FIRST
range|NEXT range> <TO {SCREEN|filename|GRAPH|PRINT}> <KEY break_field>
<HEADER header_text> <FOOTER footer_text> <APPEND> <LOCAL> <STATISTICS>
Parameters
Name Description
ON date_field The name of the date or datetime field or the expression to be aged.
Although you can age on a datetime field, only the date portion of datetime values is
considered. The time portion is ignored. You cannot age on time data alone.
CUTOFF cutoff_date The date that values in date_field are compared against.
optional You must specify cutoff_date as an unquoted string in YYMMDD or YYYYMMDD format,
regardless of the format of the date field. For example: CUTOFF 20141231
If you omit CUTOFF, the current system date is used as the cutoff date.
INTERVAL days <,...n> The date intervals (that is, number of days) to use in calculating the aging periods.
optional days represents the beginning of each aging period measured backward in time from
cutoff_date:
o the first days value identifies the beginning of the first aging period
o a first days value of '0' specifies that the first aging period begins on the specified
cutoff_date
o the last days value identifies the end of the final aging period
You must specify the intervals as an unquoted string with comma separated values:
INTERVAL
0,90,180,270,365
The default aging periods are 0, 30, 60, 90, 120, and 10,000 days. An interval of 10,000
days is used to isolate records with dates that are probably invalid.
If required, date intervals can be customized to mirror other internal aging reports.
SUPPRESS Suppresses dates that fall outside the aging period from the command output.
optional
SUBTOTAL numeric_field <...n> | SUBTOTAL ALL One or more numeric fields or expressions to subtotal for each group.
optional
Multiple fields must be separated by spaces. Specify ALL to subtotal all the numeric fields in the table.
IF test A conditional expression that must be true in order to process each record. The command is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
TO SCREEN | filename | The location to send the results of the command to:
GRAPH | PRINT o SCREEN – displays the results in the Analytics display area
o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing
folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
o GRAPH – displays the results in a graph in the Analytics display area
o PRINT – sends the results to the default printer
KEY break_field The field or expression that groups subtotal calculations. A subtotal is calculated each
time the value of break_field changes.
optional
break_field must be a character field or expression. You can specify only one field, but
you can use an expression that contains more than one field.
FOOTER footer_text The text to insert at the bottom of each page of a report.
optional footer_text must be specified as a quoted string. The value overrides the Analytics
FOOTER system variable.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the existing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled,
missing, or inaccurate data can result.
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
STATISTICS Note
optional Cannot be used unless SUBTOTAL is also specified.
Calculates average, minimum, and maximum values for all SUBTOTAL fields.
Examples
Age invoices with subtotaled amounts
You want to age an accounts receivable table on the Invoice_Date field and subtotal the Invoice_Amount
field.
Invoices are grouped into 30-day periods:
l from the cutoff date to 29 days previous
l from 30 days previous to 59 days previous
l and so on
The results include the total outstanding invoice amount for each period:
OPEN Ar
AGE ON Invoice_Date CUTOFF 20141231 INTERVAL 0,30,60,90,120,10000 SUBTOTAL Invoice_Amount TO SCREEN
Remarks
Note
For more information about how this command works, see the Analytics Help.
Aging periods
The AGE command groups records into aging periods based on values in a date or datetime field. The output results contain a single record for each period, with a count of the number of records in the source table that fall into each period.
Interval measurement
Aging periods are based on date intervals (that is, number of days) measured backward in time from the current system date, or from a cutoff date you specify such as a fiscal period end date.
Future periods
You can create aging periods more recent than the cutoff date by entering negative values for date intervals. For example, the following creates aging periods running forward and backward from the cutoff date:
INTERVAL -60,-30,0,30,60,90
This approach creates a date profile of all the records in a table using different points in time.
APPEND command
Combines records from two or more Analytics tables by appending them in a new Analytics table.
Syntax
APPEND table_1, table_2, <...n> TO table_name <COMMONFIELDS> <OPEN> <ASCHAR>
<ALLCHAR>
Parameters
Name Description
COMMONFIELDS Only those fields that are common to all tables being appended are included in the output table.
optional
If you omit COMMONFIELDS, all fields from all tables are included in the output table.
Blank values appear in the output table where no fields exist in the source tables.
Tip
For diagrams and screen captures illustrating the two options, see
Appending tables.
Note
The APPEND command does not support appending computed fields.
For more information, see "Computed fields not supported" on page 76.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
ASCHAR Harmonizes fields with identical names but different data categories by converting non-
character fields to the character data category.
optional
For example, you append two tables in which the Employee_ID field is character data in
one table, and numeric data in the other. The numeric Employee_ID field is converted to
character data and the two fields are appended without an error.
ASCHAR is ignored if ALLCHAR is also specified.
ALLCHAR Converts all non-character fields in all tables being appended to the character data category.
optional
This global conversion to character data ensures that all identically named fields are
appended without error.
Note
After appending, you can change the data category of an entire appended field if appropriate for the data contained by the field.
Examples
Append three monthly transaction tables
The example below appends three monthly transaction tables and outputs a quarterly transaction table
that includes all fields from the three source tables:
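A sketch of such an append operation (the table names are illustrative):

APPEND Trans_Jan, Trans_Feb, Trans_Mar TO Trans_Q1 OPEN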
Append three employee tables and harmonize fields with different data categories
The examples below append three divisional employee tables in which some identically named fields use
different data categories.
The first example converts non-character fields to the character data category only when required for harmonization:
The second example converts all non-character fields to the character data category whether required for
harmonization or not:
Remarks
Note
For more information about how this command works, see the Analytics Help.
How it works
The APPEND command combines records from two or more tables by appending them and creating a
new table. Appending means to add one group of records to the bottom of another group of records.
Source table fields with identical physical names and identical data categories are directly appended to one
another.
Fields with physical names that are unique across all the source tables are added to the output table but not
directly appended to any other field.
Tip
If you want to directly append inconsistently named fields, standardize the physical names
of the fields in the table layouts before appending. (Assumes that the fields belong to the
same data category, or that you use ASCHAR or ALLCHAR to harmonize the data category of the fields.)
Note
You can harmonize dissimilar datetime fields by converting them to the character data category, and then append them. This approach allows you to combine the data in a single table. However, depending on the nature of the source data, you may not be able to convert the combined data back to datetime data.
Automatic harmonization
In some situations the APPEND command automatically harmonizes fields in order to append them:
Data category of fields Harmonization performed
Numeric o Different field lengths are harmonized. The fields are converted to the ACL data type.
o A different number of defined decimal places are harmonized. Decimal places are standardized on the greatest number of places, with trailing zeros added to numeric values where necessary. The fields are converted to the ACL data type.
o Different numeric data types such as Print, Float, EBCDIC, and Micro are harmonized by converting the fields to the ACL data type.
Datetime o Different date, datetime, or time formats in the source data are harmonized by converting
the fields to the Analytics default formats:
l YYYYMMDD
l YYYYMMDD hh:mm:ss
l hh:mm:ss
If a computed field in a source table has the same name as a physical field in another source table, an error
message appears and the APPEND command is not executed.
Tip
You can append a computed field by first extracting it to convert the field to a physical field. (For more information, see "EXTRACT command" on page 197.) You then use the extracted table in the append operation.
Another approach is to recreate the computed field in the appended output table.
Record length
If you include all fields from all source tables when appending, the record length in the output table can be
longer than the longest record in the source tables.
An error message appears if the output record length exceeds the Analytics maximum of 32 KB.
Sorting
Any existing sort orders in the source tables are separately maintained in the respective record sets in the
output table.
Even if the records in all source tables are sorted, the output table is considered unsorted because the
source records are appended as groups, without consideration of any existing sort order in other source
tables.
For example, if you append monthly or quarterly tables to create an annual table, any internal sorting of
the monthly or quarterly data is retained. If required, you can sort the output table after performing the
append operation.
The first table specified in the APPEND command dictates the order of the fields in the output table. So in
the example above, the order in the output table is:
l Last_name | First_name | Middle_name
Non-common fields
Non-common fields in source tables appear in the output table in the order that they appear in the selected
group of source tables.
For example, when appending these two tables:
ASSIGN command
Creates a variable and assigns a value to the variable.
Syntax
ASSIGN variable_name = value <IF test>
Tip
You can omit the ASSIGN keyword because Analytics automatically interprets the following syntax as an assignment operation:
variable_name = value
Parameters
Name Description
variable_name The name of the variable to assign the value to. If the variable does not exist, it is created. If the variable already exists, it is updated with the new value.
Do not use non-English characters, such as é, in the names of variables. Variable
names that contain non-English characters will cause scripts to fail.
Note
Variable names are limited to 31 alphanumeric characters. The name can include the underscore character ( _ ), but no other special characters, or any spaces. The name cannot start with a number.
value The value to assign to the variable. If a new variable is created, the variable type is
based on the data type in value.
IF test A conditional expression that must be true to create the variable or assign the value to
the variable.
Optional
Examples
Assigning a value to a variable
You assign the value of the Amount field in the current record to a variable named v_current_amount.
Because v_current_amount is a variable, its value does not change unless explicitly changed by another
ASSIGN command:
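A statement of this form accomplishes that (reconstructed from the description above):

ASSIGN v_current_amount = Amount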
Remarks
Duration of variables
Variables with names that are not prefaced with an underscore are retained for the duration of the current
Analytics session only.
If you want a variable to be permanently saved with an Analytics project, preface the variable name with an
underscore:
ASSIGN _variable_name = value
BENFORD command
Counts the number of times each leading digit (1–9) or leading digit combination occurs in a field, and compares the actual count to the expected count. The expected count is calculated using the Benford formula.
Syntax
BENFORD <ON> numeric_field <LEADING n> <IF test> <BOUNDS> <TO {SCREEN|table_
name|GRAPH|PRINT}> <HEADER header_text> <FOOTER footer_text> <WHILE test> <FIRST
range|NEXT range> <APPEND> <OPEN> <LOCAL>
Parameters
Name Description
LEADING n The number of leading digits to analyze. The value of n must be from 1 to 6.
optional If LEADING is omitted, the default value of 1 is used.
IF test A conditional expression that must be true in order to process each record. The command
is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
BOUNDS Includes computed upper and lower bound values in the output results.
optional If the actual count of more than one digit or digit combination in the output results exceeds either of the bounds, the data may have been manipulated and should be investigated.
TO SCREEN | table_name The location to send the results of the command to:
| GRAPH | PRINT o SCREEN – displays the results in the Analytics display area
optional o table_name – saves the results to an Analytics table
Specify table_name as a quoted string with a .FIL file extension. For example: TO "Output.FIL"
By default, the table data file (.FIL) is saved to the folder containing the
Analytics project.
Use either an absolute or relative file path to save the data file to a different, existing
folder:
l TO "C:\Output.FIL"
l TO "Results\Output.FIL"
Note
Table names are limited to 64 alphanumeric characters, not including
the .FIL extension. The name can include the underscore character ( _
), but no other special characters, or any spaces. The name cannot
start with a number.
o GRAPH – displays the results in a graph in the Analytics display area
o PRINT – sends the results to the default printer
HEADER header_text The text to insert at the top of each page of a report.
optional header_text must be specified as a quoted string. The value overrides the Analytics
HEADER system variable.
FOOTER footer_text The text to insert at the bottom of each page of a report.
optional footer_text must be specified as a quoted string. The value overrides the Analytics
FOOTER system variable.
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional
Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled, miss-
ing, or inaccurate data can result.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
Examples
Outputting results to graph
You run the BENFORD command against the Amount field and output the results to a graph:
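A sketch of the command; the Amount field and the choice of two leading digits are illustrative assumptions:
BENFORD ON Amount LEADING 2 TO GRAPH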
Remarks
What data can I test using Benford analysis?
You should only use Benford analysis for testing numeric data composed of "naturally occurring numbers",
such as accounting amounts, transaction amounts, expenses, or address numbers. Benford analysis is not
suitable for numeric data that is constrained in any way.
Follow these guidelines for identifying numeric data that is suitable for Benford analysis:
l Size of the data set – The data set must be large enough to support a valid distribution. Benford ana-
lysis may not give reliable results for fewer than 500 records.
l Leading digit requirement – All numbers from 1 to 9 must have the possibility of occurring as the lead-
ing digit.
l Leading digit combination requirement – All numbers from 0 to 9 must have the possibility of occur-
ring as the second leading digit, and as any additional digits being analyzed.
l Constrained data – Numeric data that is assigned or generated according to a pre-ordained pattern
is not suitable for Benford analysis. For example, do not use Benford to analyze:
l sequential check or invoice numbers
l any numbering scheme with a range that prevents certain numbers from appearing
l Random numbers – Numbers generated by a random number generator are not suitable for Ben-
ford analysis.
CALCULATE command
Calculates the value of one or more expressions.
Syntax
CALCULATE expression <AS result_label> <,...n>
Parameters
Name Description
AS result_label The name of the result when displayed on screen and in the Analytics command log.
optional result_label must be a quoted string or a valid character expression.
If omitted, the expression being calculated is used as the result name.
Examples
Performing a simple calculation
You use CALCULATE to multiply 4.70 by 18.50, returning the result 86.95:
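A sketch of the calculation; the result label shown is illustrative:
CALCULATE 4.70 * 18.50 AS "Total cost"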
Remarks
How it works
CALCULATE provides the functionality of a calculator combined with access to Analytics functions, vari-
ables, and the data in the currently selected record.
Command output
Depending on where you run CALCULATE, the results are output to different locations:
l From the command line – the result is displayed on screen
l From within a script – the result is recorded in the log
The result_label value is not a variable that you can use in a script. It is only used to identify the calculation
on screen or in the log.
Because the decimal precision of a numeric result matches the operand with the greatest number of decimal places, the following returns 1:
CALCULATE 365/52/7
Adding decimal places to an operand returns 1.0027:
CALCULATE 365.0000/52/7
CLASSIFY command
Groups records based on identical values in a character or numeric field. Counts the number of records in
each group, and also subtotals specified numeric fields for each group.
Syntax
CLASSIFY <ON> key_field <SUBTOTAL numeric_field <...n>|SUBTOTAL ALL> <INTERVALS number> <SUPPRESS> <TO {SCREEN|table_name|GRAPH|PRINT}> <IF test> <WHILE test> <FIRST range|NEXT range> <HEADER header_text> <FOOTER footer_text> <KEY break_field> <OPEN> <APPEND> <LOCAL> <STATISTICS>
Parameters
Name Description
SUBTOTAL numeric_field One or more numeric fields or expressions to subtotal for each group.
<...n> | SUBTOTAL ALL
Multiple fields must be separated by spaces. Specify ALL to subtotal all the numeric fields
optional in the table.
SUPPRESS Excludes sets of identical values exceeding the maximum specified by INTERVALS from the output results.
optional
Note
Cannot be used unless INTERVALS is also specified. SUPPRESS is not available in the Analytics user interface and can only be used as part of ACLScript syntax in a script or the command line.
TO SCREEN | table_name The location to send the results of the command to:
| GRAPH | PRINT o SCREEN – displays the results in the Analytics display area
o table_name – saves the results to an Analytics table
Specify table_name as a quoted string with a .FIL file extension. For example: TO "Out-
put.FIL"
By default, the table data file (.FIL) is saved to the folder containing the
Analytics project.
Use either an absolute or relative file path to save the data file to a different, existing
folder:
l TO "C:\Output.FIL"
l TO "Results\Output.FIL"
Note
Table names are limited to 64 alphanumeric characters, not including
the .FIL extension. The name can include the underscore character ( _
), but no other special characters, or any spaces. The name cannot
start with a number.
o GRAPH – displays the results in a graph in the Analytics display area
o PRINT – sends the results to the default printer
IF test A conditional expression that must be true in order to process each record. The command
is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
FOOTER footer_text The text to insert at the bottom of each page of a report.
optional footer_text must be specified as a quoted string. The value overrides the Analytics
FOOTER system variable.
KEY break_field The field or expression that groups subtotal calculations. A subtotal is calculated each
time the value of break_field changes.
optional
break_field must be a character field or expression. You can specify only one field, but
you can use an expression that contains more than one field.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled, miss-
ing, or inaccurate data can result.
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
STATISTICS Note
optional Cannot be used unless SUBTOTAL is also specified.
Calculates average, minimum, and maximum values for all SUBTOTAL fields.
Examples
Total transaction amount per customer
You want to classify an accounts receivable table on the Customer_Number field and subtotal the Trans_
Amount field. The output results are grouped by customer, and include the total transaction amount for
each customer:
OPEN Ar
CLASSIFY ON Customer_Number SUBTOTAL Trans_Amount TO "Customer_total.FIL"
You can also calculate average, minimum, and maximum values for the subtotaled field by adding STATISTICS:

OPEN Ar
CLASSIFY ON Customer_Number SUBTOTAL Trans_Amount TO "Customer_stats.FIL" STATISTICS

In another example, you classify the Ap_Trans table on the Invoice_Amount field, and then filter the output table to display only those invoice amounts that occur more than once:

OPEN Ap_Trans
CLASSIFY ON Invoice_Amount TO "Grouped_invoice_amounts.FIL" OPEN
SET FILTER TO COUNT > 1
Remarks
Note
For more information about how this command works, see the Analytics Help.
How it works
CLASSIFY groups records that have the same value in a character or numeric field.
Output contains a single record for each group, with a count of the number of records in the source table
that belong to the group.
Default field names and column titles in the output table:
l Subtotal field – field name: the subtotaled field name in the source table; column title: Total + the subtotaled field's alternate column title in the source table
l Average field – field name: a_ + the subtotaled field name in the source table; column title: Average + the subtotaled field's alternate column title in the source table
l Minimum field – field name: m_ + the subtotaled field name in the source table; column title: Minimum + the subtotaled field's alternate column title in the source table
l Maximum field – field name: x_ + the subtotaled field name in the source table; column title: Maximum + the subtotaled field's alternate column title in the source table
CLOSE command
Closes an Analytics table, index file, or log file, or ends a Script Recorder session.
Syntax
CLOSE <table_name|PRIMARY|SECONDARY|INDEX|LOG|LEARN>
Parameters
Name Description
Examples
Closing a table by name
You want to close a table called Inventory:
CLOSE Inventory
Closing a secondary table
You want to close the current secondary table:
CLOSE SECONDARY
Remarks
When to use CLOSE
You typically do not need to close Analytics tables. The active Analytics table automatically closes when you
open another table. The primary table also closes automatically before the OPEN or QUIT commands
execute.
You cannot use CLOSE to close an Analytics project. Use QUIT instead.
CLUSTER command
Groups records into clusters based on similar values in one or more numeric fields. Clusters can be uni-
dimensional or multidimensional.
Syntax
CLUSTER ON key_field <...n> KVALUE number_of_clusters ITERATIONS number_of_iterations INITIALIZATIONS number_of_initializations <SEED seed_value> <OTHER field <...n>> TO table_name <IF test> <WHILE test> <FIRST range|NEXT range> <OPEN> {no_keyword|NOCENTER|NOSCALE}
Parameters
Name Description
ON key_field <...n> One or more numeric fields to cluster. Multiple fields must be separated by spaces.
ITERATIONS number_of_ The maximum number of times the cluster calculation is re-performed.
iterations
INITIALIZATIONS num- The number of times to generate an initial set of random centroids.
ber_of_initializations
SEED seed_value The seed value to use to initialize the random number generator in Analytics.
optional If you omit SEED, Analytics randomly selects the seed value.
OTHER field <...n> One or more additional fields to include in the output.
optional Note
Key fields are automatically included in the output table, and do not
need to be specified using OTHER.
Use either an absolute or relative file path to save the data file to a different, existing
folder:
l TO "C:\Output.FIL"
l TO "Results\Output.FIL"
Note
Table names are limited to 64 alphanumeric characters, not including
the .FIL extension. The name can include the underscore character (
_ ), but no other special characters, or any spaces. The name cannot
start with a number.
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The com-
mand is executed until the condition evaluates as false, or the end of the table is
optional
reached.
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
no_keyword | NOCENTER The method for standardizing key field numeric values.
| NOSCALE o no_keyword – center key field values around zero (0), and scale the values to unit
variance when calculating the clusters
o NOCENTER – scale key field values to unit variance when calculating the clusters,
but do not center the values around zero (0)
o NOSCALE – use the raw key field values, unscaled, when calculating the clusters
Examples
Clustering on invoice amount
In addition to stratifying an accounts receivable table on the Invoice_Amount field, you also decide to
cluster on the same field.
l Stratifying groups the amounts into strata with predefined numeric boundaries – for example, $1000
intervals.
l Clustering discovers any organic groupings of amounts that exist in the data without requiring that
you decide on numeric boundaries in advance.
OPEN Ar
CLUSTER ON Invoice_Amount KVALUE 8 ITERATIONS 30 INITIALIZATIONS 10 OTHER No Due Date Ref Type TO "Clustered_invoices" NOSCALE
As a quick way of discovering how many records are contained in each output cluster, you classify the
Clustered_invoices output table on the Cluster field.
OPEN Clustered_invoices
CLASSIFY ON Cluster TO SCREEN
Remarks
Note
For more information about how this command works, see the Analytics Help.
COMMENT command
Adds an explanatory note to a script without affecting processing.
Syntax
Single-line comments
COMMENT comment_text
Multiline comments
COMMENT
comment_text
<...n>
<END>
Note
Do not use a caret character ^ to preface lines of comment text. The caret has a special use
in the .acl project file, and comment text is not saved if you preface it with a caret.
Parameters
Name Description
Examples
Single-line comments
You use single-line comments before commands to add documentation for future users who will maintain
the script:
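A sketch of this usage; the commands and the filter condition shown are illustrative:
COMMENT Count all records before any filtering is applied
COUNT
COMMENT Restrict subsequent processing to high-value transactions
SET FILTER TO Amount > 50000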
Multiline comment
You begin each script you write with a multiline comment that explains the purpose of the script:
COMMENT
This analytic identifies multiple records having common
transaction originator IDs (like vendor ID or merchant ID)
where the transaction date values are either equal or one day apart.
This analytic can be used for split invoices, split purchase orders,
split requisitions, and split corporate card transactions.
END
Remarks
When to use COMMENT
Use COMMENT to include information about the purpose of a script, the logic used, and other information
such as the required inputs for the script and the purpose of each variable you define.
The comments in a script are written to the Analytics command log each time the script runs.
COUNT command
Counts the total number of records in the current view, or only those records that meet the specified con-
dition.
Syntax
COUNT <IF test> <WHILE test> <FIRST range|NEXT range>
Parameters
Name Description
IF test A conditional expression that must be true in order to process each record. The command
is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
Name Contains
o If the variable name is COUNT1, it is storing the record count for the most recent com-
mand executed.
o If the variable name is COUNTn, where n is greater than 1, the variable is storing the
record count for a command executed within a GROUP command.
The value of n is assigned based on the line number of the command in the GROUP.
For example, if the command is one line below the GROUP command it is assigned
the value COUNT2. If the command is four lines below the GROUP command, it is
assigned the value COUNT5.
Examples
Storing COUNT1
The result of the COUNT command is stored in the COUNT1 output variable. You can retrieve and store
this value in a user-defined variable.
The COUNT command overwrites the COUNT1 variable each time it is executed, so the value needs to be
stored in a user-defined variable before the command is executed for the second time after the filter is
applied to the table:
OPEN CustomerAddress
COUNT
TotalRec = COUNT1
SET FILTER TO ModifiedDate > '20100101'
COUNT
TotalFilteredRec = COUNT1
Remarks
When to use COUNT
Use the COUNT command to count the number of records in an Analytics table, or to count the number of
records that meet a particular test condition. If no test is specified, the total number of records in the Ana-
lytics table is displayed.
CREATE LAYOUT command
Creates an empty Analytics table layout.
Syntax
CREATE LAYOUT layout_name WIDTH characters <RECORD 0|RECORD 1>
Parameters
Name Description
RECORD 0 | RECORD 1 o If you specify RECORD 0, or omit this parameter, the table layout is created without
any records or a source data file.
optional o If you specify RECORD 1 the table layout is created with a single empty record and a
source data file named layout_name.fil.
Examples
Creating an empty table layout without any records
You create an empty table layout with a record length of 100 characters:
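A sketch of the command; the layout name is illustrative, and RECORD 0 can be omitted because it is the default:
CREATE LAYOUT Empty_Table WIDTH 100 RECORD 0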
Remarks
The empty table layout is created with a single character field called Field_1. The field length is the same
as the record length you specify with WIDTH.
Note
This command is not supported for use in Analytics analytics run on AX Server.
CROSSTAB command
Groups records based on identical combinations of values in two or more character or numeric fields, and
displays the resulting groups in a grid of rows and columns. Counts the number of records in each group,
and also subtotals specified numeric fields for each group.
Syntax
CROSSTAB <ON> row_field <...n> COLUMNS column_field <SUBTOTAL numeric_field <...n>|SUBTOTAL ALL> TO {SCREEN|table_name|filename|GRAPH|PRINT} <IF test> <WHILE test> <FIRST range|NEXT range> <APPEND> <COUNT> <OPEN> <LOCAL> <HEADER header_text> <FOOTER footer_text>
Parameters
Name Description
ON row_field <...n> The field or expression to use for rows in the resulting grid of rows and columns. You can
specify one or more fields or expressions as the basis for the rows.
COLUMNS column_field The field or expression to use for columns in the resulting grid of rows and columns. You
can specify only one field or expression for the columns.
SUBTOTAL numeric_field One or more numeric fields or expressions to subtotal for each group.
<...n> | SUBTOTAL ALL
Multiple fields must be separated by spaces. Specify ALL to subtotal all the numeric fields
optional in the table.
TO SCREEN | table_name The location to send the results of the command to:
| filename | GRAPH o SCREEN – displays the results in the Analytics display area
| PRINT o table_name – saves the results to an Analytics table
Specify table_name as a quoted string with a .FIL file extension. For example: TO "Out-
put.FIL"
By default, the table data file (.FIL) is saved to the folder containing the
Analytics project.
Use either an absolute or relative file path to save the data file to a different, existing
folder:
l TO "C:\Output.FIL"
l TO "Results\Output.FIL"
Note
Table names are limited to 64 alphanumeric characters, not including
the .FIL extension. The name can include the underscore character ( _
), but no other special characters, or any spaces. The name cannot
start with a number.
o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
o GRAPH – displays the results in a graph in the Analytics display area
o PRINT – sends the results to the default printer
IF test A conditional expression that must be true in order to process each record. The command
is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional
Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled, miss-
ing, or inaccurate data can result.
COUNT Includes record counts as columns. Counts are useful when you use SUBTOTAL.
optional Counts are automatically included if you do not select any subtotal fields.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
FOOTER footer_text The text to insert at the bottom of each page of a report.
optional footer_text must be specified as a quoted string. The value overrides the Analytics
FOOTER system variable.
Examples
Cross-tabulating an accounts receivable table with SUBTOTAL
You want to cross-tabulate an accounts receivable table on the Customer Number and Transaction Type
fields. You also want to subtotal the Transaction Amount field.
The output is grouped by customer, and within each customer by transaction type. It includes the total trans-
action amount for each customer for each transaction type:
OPEN Ar
CROSSTAB ON Customer_Number COLUMNS Trans_Type SUBTOTAL Trans_Amount COUNT
TO SCREEN
You can also cross-tabulate without subtotaling. Because no subtotal fields are selected, record counts are automatically included in the output:

OPEN Ar
CROSSTAB ON Trans_Amount COLUMNS Trans_Type TO SCREEN
Remarks
Note
For more information about how this command works, see the Analytics Help.
How it works
CROSSTAB groups records that have the same combination of values in two or more character or
numeric fields.
The output contains a grid of rows and columns similar to a pivot table. It includes a single row-column inter-
section for each group, with a count of the number of records in the source table that belong to the group.
CVSEVALUATE command
For classical variables sampling, provides four different methods for projecting the results of sample ana-
lysis to the entire population.
Syntax
CVSEVALUATE BOOKED book_value_field AUDITED audit_value_field ETYPE
{MPU|DIFFERENCE|RATIO SEPARATE|RATIO COMBINED} STRATA boundary_value <,...n>
POPULATION stratum_count,stratum_book_value <,...n> CONFIDENCE confidence_level CUTOFF
value,certainty_stratum_count,certainty_stratum_book_value ERRORLIMIT number PLIMIT
{BOTH|UPPER|LOWER} <TO {SCREEN|filename}>
Parameters
Note
If you are using the output results of the CVSPREPARE and CVSSAMPLE commands as
input for the CVSEVALUATE command, a number of the parameter values are already
specified and stored in variables. For more information, see "CVSPREPARE command"
on page 113 and "CVSSAMPLE command" on page 117.
Do not include thousands separators, or percentage signs, when you specify values.
Name Description
BOOKED book_value_ The numeric book value field to use in the evaluation.
field
AUDITED audit_value_ The numeric audit value field to use in the evaluation.
field
STRATA boundary_value The upper boundary values to use for stratifying the book_value_field.
<,...n>
POPULATION stratum_ The number of records and the total value for each stratum in the book_value_field.
count, stratum_value
<,...n>
CONFIDENCE con- The confidence level used during the preparation stage of the classical variables sample.
fidence_level
CUTOFF value, certainty_ o value – the certainty stratum cutoff value used during the preparation and sampling
stratum_count, certainty_ stage of the classical variables sample
stratum_book_value o certainty_stratum_count – the number of records in the certainty stratum
o certainty_stratum_book_value – the total book value of the records in the certainty
stratum
ERRORLIMIT number The minimum number of errors you expect in the sample.
Note
If the actual number of errors you found when you analyzed the sample is
less than the ERRORLIMIT number, the only evaluation method available
is mean-per-unit.
TO SCREEN | filename The location to send the results of the command to:
o SCREEN – displays the results in the Analytics display area
o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
Examples
Project errors found in the sampled data to the entire population
You have completed your testing of the sampled data and recorded the misstatements you found. You
can now project the errors you found to the entire population.
The example below uses the Difference estimation type to project the results of sample analysis to the
entire population:
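A sketch of what the command might look like. Every value below (field names, strata boundaries, population counts and values, cutoff, and error limit) is illustrative; in practice most of these inputs are prefilled from the variables created by CVSPREPARE and CVSSAMPLE:
CVSEVALUATE BOOKED invoice_amount AUDITED audit_value ETYPE DIFFERENCE STRATA 5000.00,10000.00,20000.00,35000.00 POPULATION 1200,2400000.00,800,6100000.00,500,7300000.00,300,8200000.00 CONFIDENCE 95.00 CUTOFF 35000.00,40,1400000.00 ERRORLIMIT 6 PLIMIT BOTH TO SCREEN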
Remarks
Note
For more information about how this command works, see the Analytics Help.
Guidelines
The guidelines below help you select an estimation type. You can repeat the evaluation stage with different
estimation types, and compare the results from each.
CVSPREPARE command
Stratifies a population, and calculates a statistically valid sample size for each stratum, for classical variables
sampling.
Syntax
CVSPREPARE ON book_value_field NUMSTRATA number MINIMUM minimum_strata_sample_size
PRECISION value CONFIDENCE confidence_level <CUTOFF value> NCELLS number PLIMIT
{BOTH|UPPER|LOWER} ERRORLIMIT number <MINSAMPSIZE minimum_sample_size> TO
{SCREEN|filename}
Parameters
Note
Do not include thousands separators, or percentage signs, when you specify values.
Name Description
ON book_value_field The numeric book value field to use as the basis for preparing the classical variables
sample.
NUMSTRATA number The number of strata to use for numerically stratifying the book_value_field.
The minimum number of strata is 1, and the maximum is 256.
If you specify NUMSTRATA 1, and do not specify a CUTOFF, the population is unstratified
prior to drawing a sample.
Note
The number of strata cannot exceed 50% of the number of cells specified
for NCELLS.
MINIMUM minimum_ The minimum number of records to sample from each stratum.
strata_sample_size
Leave the default of zero (0) if you do not have a specific reason for specifying a min-
imum number.
PRECISION value The monetary amount that is the difference between the tolerable misstatement and the
expected misstatement in the account.
o Tolerable misstatement – the maximum total amount of misstatement that can occur in
the sample field without being considered a material misstatement
o Expected misstatement – the total amount of misstatement that you expect the sample
field to contain
The precision establishes the range of acceptability for an account to be considered fairly
stated.
Reducing the precision decreases the range of acceptability (the margin of error) which
requires an increased sample size.
CONFIDENCE con- The desired confidence level that the resulting sample is representative of the entire pop-
fidence_level ulation.
For example, specifying 95 means that you want to be confident that 95% of the time the
sample will in fact be representative. Confidence is the complement of "sampling risk". A
95% confidence level is the same as a 5% sampling risk.
o If PLIMIT is BOTH, the minimum confidence level is 10%, and the maximum is 99.5%.
o If PLIMIT is UPPER or LOWER, the minimum confidence level is 55%, and the max-
imum is 99.5%.
NCELLS number The number of cells to use for pre-stratifying the book_value_field.
Cells are narrower numeric divisions than strata. Pre-stratification is part of an internal
process that optimizes the position of strata boundaries. Cells are not retained in the final
stratified output.
The minimum number of cells is 2, and the maximum is 999.
Note
The number of cells must be at least twice (2 x) the number of strata spe-
cified for NUMSTRATA.
ERRORLIMIT number The minimum number of errors you expect in the sample.
Note
If the actual number of errors you find when you analyze the sample is
less than the ERRORLIMIT number, the only evaluation method available
is mean-per-unit.
MINSAMPSIZE minimum_ The minimum number of records to sample from the entire population.
sample_size
Leave the default of zero (0) if you do not have a specific reason for specifying a min-
optional imum number.
TO SCREEN | filename The location to send the results of the command to:
o SCREEN – displays the results in the Analytics display area
o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
Name Contains
S_TOP The certainty stratum cutoff value specified by the user, or if none was specified, the
upper boundary of the top stratum calculated by the command.
SBOTTOM The lower boundary of the bottom stratum calculated by the command.
SBOUNDARY All strata upper boundaries calculated by the command, and S_TOP. Does not store
SBOTTOM.
SPOPULATION The count of the number of records in each stratum, and the total monetary value for each
stratum. Excludes items above the certainty stratum cutoff.
SSAMPLE The sample size for each stratum calculated by the command.
Examples
Prepare a classical variables sample
You have decided to use classical variables sampling to estimate the total amount of monetary mis-
statement in an account containing invoices.
Before drawing the sample, you must first stratify the population, and calculate a statistically valid sample
size for each stratum.
You want to be confident that 95% of the time the sample drawn by Analytics will be representative of the
population as a whole.
Using your specified confidence level, the example below stratifies a table based on the invoice_amount
field, and calculates the sample size for each stratum and the certainty stratum:
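A sketch, assuming an Invoices table containing an invoice_amount field; the precision, cutoff, and other numeric inputs are illustrative:
OPEN Invoices
CVSPREPARE ON invoice_amount NUMSTRATA 4 MINIMUM 0 PRECISION 900000.00 CONFIDENCE 95.00 CUTOFF 35000.00 NCELLS 50 PLIMIT BOTH ERRORLIMIT 6 MINSAMPSIZE 0 TO SCREEN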
Remarks
Note
For more information about how this command works, see the Analytics Help.
CVSSAMPLE command
Draws a sample of records using the classical variables sampling method.
Syntax
CVSSAMPLE ON book_value_field NUMSTRATA number <SEED seed_value> CUTOFF value STRATA boundary_value <,...n> SAMPLESIZE number <,...n> POPULATION stratum_count,stratum_value <,...n> TO table_name
Parameters
Note
If you are using the output results of the CVSPREPARE command as input for the
CVSSAMPLE command, a number of the parameter values are already specified and
stored in variables. For more information, see "CVSPREPARE command" on page 113.
Do not include thousands separators, or percentage signs, when you specify values.
Name Description
ON book_value_field The numeric book value field to use as the basis for the sample.
NUMSTRATA number The number of strata to use for stratifying the book_value_field.
SEED seed_value The seed value to use to initialize the random number generator in Analytics.
optional If you omit SEED, Analytics randomly selects the seed value.
STRATA boundary_value The upper boundary values to use for stratifying the book_value_field.
<,...n>
POPULATION stratum_ The number of records in each stratum, and the total value for each stratum.
count, stratum_value
<,...n>
Name Contains
S_TOPEV The certainty stratum cutoff value specified by the user, or if none was specified, the
upper boundary of the top stratum previously calculated by the CVSPREPARE com-
mand.
Also stores the count of the number of records in the certainty stratum, and their total
monetary value.
SBOTTOMEV The lower boundary of the bottom stratum calculated by the command.
BOUNDARYEV All strata upper boundaries prefilled by the command, or specified by the user. Does not
store S_TOPEV or SBOTTOMEV.
SPOPULATION The count of the number of records in each stratum, and the total monetary value for
each stratum. Excludes items above the certainty stratum cutoff.
Examples
Draw a classical variables sample
You are going to use classical variables sampling to estimate the total amount of monetary misstatement
in an account containing invoices.
After stratifying the population, and calculating a statistically valid sample size for each stratum, you are
ready to draw the sample.
The example below draws a stratified sample of records based on the invoice_amount field, and outputs
the sampled records to the Invoices_sample table:
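A sketch with illustrative values; in practice the strata boundaries, sample sizes, and population figures are prefilled from the CVSPREPARE output variables:
OPEN Invoices
CVSSAMPLE ON invoice_amount NUMSTRATA 4 SEED 12345 CUTOFF 35000.00 STRATA 5000.00,10000.00,20000.00,35000.00 SAMPLESIZE 25,25,25,25 POPULATION 1200,2400000.00,800,6100000.00,500,7300000.00,300,8200000.00 TO "Invoices_sample"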
Remarks
Note
For more information about how this command works, see the Analytics Help.
System-generated fields
Analytics automatically generates four fields and adds them to the sample output table. For each record
included in the sample, the fields contain the following descriptive information:
l STRATUM – the number of the stratum to which the record is allocated
l ORIGIN_RECORD_NUMBER – the original record number in the source data table
l SELECTION_ORDER – on a per-stratum basis, the order in which the record was randomly selected
l SAMPLE_RECORD_NUMBER – the record number in the sample output table
DEFINE COLUMN command
Creates a column in a view.
Syntax
DEFINE COLUMN view_name field_name <AS display_name> <POSITION n> <WIDTH
characters> <PIC format> <SORT|SORT D> <KEY> <PAGE> <NODUPS> <NOZEROS> <LINE
n>
Parameters
Name Description
AS display_name The display name (alternate column title) for the field in the view. If you want the display
name to be the same as the field name do not use AS.
optional
Specify display_name as a quoted string. Use a semi-colon (;) between words if you
want a line break in the column title.
POSITION n The position of the column in the view numerically from left to right:
optional o if omitted, the column is placed as the rightmost column at the time that the column is
added
o if a position number is missing, column positions are adjusted so that the columns
are positioned sequentially
o if a position number is already in use, the new column is placed to the left of the
column already using the position number
Note
The characters specified by WIDTH are fixed-width characters. Every
character is allotted the same amount of space, regardless of the width
of the actual character.
By default, views in Analytics use a proportional width font that does not
correspond with fixed-width character spacing.
If you want a one-to-one correspondence between the WIDTH value and
the characters in the view, you can change the Proportional Font setting
in the Options dialog box to a fixed-width font such as Courier New.
KEY The column is designated as a break field in reports. Reports are subtotaled and sub-
divided when the value of the column changes. The following restrictions apply to break
optional
fields:
o must be a character field or expression
o if a break field is set in the view, it must be the leftmost column
o the last column in the view cannot be a break field
o if you have more than one break field, all of the columns to the left of any additional
break field must also be break fields
PAGE Inserts a page break each time the value in the break field changes.
optional
LINE n The number of lines in the column. If no value is specified, the column defaults to a
single line. The value of n must be between 2 and 60.
optional
Examples
Defining a view with six columns
With the AR table open, you define a view called AR_Report, and define six columns. The columns are dis-
played in the listed order:
OPEN Ar
DEFINE VIEW AR_Report OK
DEFINE COLUMN AR_Report No AS "Customer Number" WIDTH 7 KEY
DEFINE COLUMN AR_Report Date AS "Invoice Date" WIDTH 10
DEFINE COLUMN AR_Report Due AS "Due Date" WIDTH 10
DEFINE COLUMN AR_Report Reference AS "Reference Number" WIDTH 6
DEFINE COLUMN AR_Report Type AS "Transaction Type" WIDTH 5
DEFINE COLUMN AR_Report Amount AS "Transaction Amount" WIDTH 12 PIC "-9999999999.99"
DEFINE FIELD command
Defines a physical data field in an Analytics table layout.
Syntax
DEFINE FIELD field_name data_type start_position length <decimals|date_format> <NDATETIME> <PIC format> <AS display_name> <WIDTH characters> <SUPPRESS> <field_note>
Parameters
Name Description
data_type The data type to use when interpreting the data. For a list of supported data types, see
"Supported data types" on page 128.
For example, invoice numbers may be stored as numeric values in the data source. To
treat these values as strings rather than numbers, you can define the field as character
data instead.
start_position The starting byte position of the field in the Analytics data file.
NDATETIME Date, datetime, or time values stored in a numeric field are treated as datetime data.
optional NDATETIME requires that you also specify the source datetime format using PIC format.
AS display_name The display name (alternate column title) for the field in the view. If you want the display
name to be the same as the field name do not use AS.
optional
Specify display_name as a quoted string. Use a semi-colon (;) between words if you want
a line break in the column title.
WIDTH characters The display width of the field in characters.
optional If you omit WIDTH, the display width is set to the field length in characters.
Note
The characters specified by WIDTH are fixed-width characters. Every char-
acter is allotted the same amount of space, regardless of the width of the
actual character.
By default, views in Analytics use a proportional width font that does not
correspond with fixed-width character spacing.
If you want a one-to-one correspondence between the WIDTH value and
the characters in the view, you can change the Proportional Font setting
in the Options dialog box to a fixed-width font such as Courier New.
field_note Field note text that is added to the field definition in the table layout.
optional field_note must be last, after all other required and optional parameters. The text cannot
be multiline. Quotation marks are not required.
Examples
Defining a character field
Defines a character field called ProdDesc. The column title in the view is Product Description.
In non-Unicode Analytics:
l Starts at: byte 12 (character position 12)
l Length: 24 bytes (24 characters)
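A sketch of the definition implied above; ASCII is assumed as the character data type:
DEFINE FIELD ProdDesc ASCII 12 24 AS "Product Description"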
When defining datetime fields that include time data, you must use PIC format. The example below defines a datetime field called email_timestamp. In the source data, the datetime format is YYYY/MM/DD hh:mm:ss-hh:mm.
l Starts at: byte 1
l Length: 25 bytes
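A sketch of the definition; the DATETIME data type is assumed, with the source format supplied through PIC:
DEFINE FIELD email_timestamp DATETIME 1 25 PIC "YYYY/MM/DD hh:mm:ss-hh:mm"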
Remarks
Note
For more information about how this command works, see the Analytics Help.
Supported data types
l Character – ASCII, CUSTOM, EBCDIC, NOTE, PCASCII, UNICODE
l Numeric – ACCPAC, ACL, BASIC, BINARY, FLOAT, HALFBYTE, IBMFLOAT, MICRO, NUMERIC, PACKED, UNISYS, UNSIGNED, VAXFLOAT, ZONED
l Datetime – DATETIME
l Logical – LOGICAL
DEFINE FIELD . . . COMPUTED command
Defines a computed field in an Analytics table layout.
Syntax
To define a computed field:
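A sketch of the basic single-value form, based on the parameters documented below; the conditional, multiline form adds condition and value lines and is covered in the Analytics Help:
DEFINE FIELD field_name COMPUTED expression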
Note
Multiline syntax must be structured exactly as shown in the generic syntax above and the
examples below.
Parameters
Name Description
expression A valid Analytics expression that defines the value of the computed field.
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
STATIC The field displays the same value on every line of the table until a new value is
encountered.
optional
For example, if there is a Last Name field in the source data where:
o the first record displays the value "Smith"
o the next five records display blank lines
o the seventh record displays the value "Wong"
In this case, "Smith" displays on six consecutive lines, then "Wong" displays on the sev-
enth line.
AS display_name The display name (alternate column title) for the field in the view. If you want the display
name to be the same as the field name do not use AS.
optional
Specify display_name as a quoted string. Use a semi-colon (;) between words if you
want a line break in the column title.
field_note Field note text that is added to the field definition in the table layout.
optional field_note must be last, after all other required and optional parameters. The text cannot
be multiline. Quotation marks are not required.
Examples
Defining a computed field
You define a computed field named Value that is the product of the Cost and Quantity fields:
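A minimal sketch, assuming numeric Cost and Quantity fields exist in the open table:
DEFINE FIELD Value COMPUTED Cost * Quantity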
Remarks
Note
For more information about how this command works, see the Analytics Help.
DEFINE RELATION command
Defines a relation between two Analytics tables.
Syntax
DEFINE RELATION key_field WITH related_table_name INDEX index_name <AS relation_name>
Parameters
Name Description
INDEX index_name The name of the index for the key field in the related table.
You must index the related table on the key field before you can relate the table.
Examples
Relate two tables
The example below relates the open table to the Customer table by using the customer number field (CustNum).
Customer_on_CustNum is the name of the child table index on the key field. A child table index is
required when you relate tables.
If the child table index does not already exist when you run the DEFINE RELATION command, an error
message appears and the relation is not performed.
Tip
If you define a relation in the Analytics user interface, the child table index is automatically
created for you.
OPEN Customer
INDEX ON CustNum TO Customer_on_CustNum
OPEN Ar
DEFINE RELATION CustNum WITH Customer INDEX Customer_on_CustNum
You can relate additional tables in the same way. The example below indexes the Vouchers table on the voucher number field and relates the Vouchers_items table to it:

OPEN Vouchers
INDEX ON voucher_number TO "Vouchers_on_voucher_number"
OPEN Vouchers_items
DEFINE RELATION voucher_number WITH Vouchers INDEX Vouchers_on_voucher_number

A key field from an already related table can also be used to relate a further table. The example below relates the Vouchers_items table to the Employees table by way of the created_by field in the related Vouchers table:

OPEN Employees
INDEX ON employee_number TO "Employees_on_employee_number"
OPEN Vouchers_items
DEFINE RELATION Vouchers.created_by WITH Employees INDEX Employees_on_employee_number
Remarks
Note
For more information about how this command works, see the Analytics Help.
DEFINE REPORT command
Creates a new view.
Syntax
DEFINE REPORT view_name
Parameters
Name Description
Examples
Creating a new view
You create a new view called Q4_AR_review:
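Based on the syntax above, a sketch of the command:
DEFINE REPORT Q4_AR_review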
DEFINE TABLE DB command
Defines an Analytics server table by connecting to a database table using a database profile.
Syntax
DEFINE TABLE DB {SOURCE database_profile <PASSWORD num> <PASSWORD num>|SERVER server_profile <PASSWORD num>} <FORMAT format_name> SCHEMA schema <TITLED acl_table_name> <PRIMARY|SECONDARY> DBTABLE db_tablename FIELDS {field_names|ALL} <...n> <WHERE condition> <ORDER field_names>
Parameters
SOURCE database_profile The Analytics database profile to use to access the database engine.
Database profiles include information required to connect to the database engine, includ-
ing:
o a reference to the associated server profile
o the database type
o the database name
o user account information
Note
DEFINE TABLE DB supports connecting to only the following
databases: Microsoft SQL Server, Oracle, or DB2.
The password is only required if the database profile does not contain saved passwords.
Use PASSWORD twice after the SOURCE keyword. The first password logs you on to the
server, and the second one logs you on to the database.
FORMAT format_name The name of an Analytics table, or table layout file (.layout), with a table layout that you
want to use.
optional
SCHEMA schema The schema to connect to. You must enclose the schema name in quotation marks.
PRIMARY | SECONDARY Use the table as either a primary or secondary table in multi-file commands. If neither
option is specified, the default value of PRIMARY is used.
optional
DBTABLE database_table The database table that you want to access. database_table must be a quoted string.
Note
Using AX Connector, you can access an unlimited number of related
tables, but no more than five is recommended. Processing time increases
when you access multiple tables.
WHERE condition An SQL WHERE clause that limits the data to those records that meet the specified con-
dition.
optional
You must use valid SQL syntax entered as a quoted string.
When you join tables, Analytics displays the condition of the join in the WHERE clause:
"Table_1.First_name = Table_2.First_name"
ORDER field_names The fields the database engine uses to sort records. field_names must be a quoted string.
optional The command takes longer to run when sorting records. Only use ORDER when sorting
is important.
Examples
Example
You want to access data from a Microsoft SQL Server database via AX Connector. To do this, you use the
DEFINE TABLE DB command. You include the SOURCE parameter to connect to AX Connector through a
database profile:
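A sketch under assumed names; the database profile, schema, and table identifiers are illustrative, and the PASSWORD parameters are omitted on the assumption that the profile stores saved passwords:
DEFINE TABLE DB SOURCE "SQLServer_Profile" SCHEMA "dbo" TITLED "AR_Trans" DBTABLE "dbo.AR_Trans" FIELDS ALL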
Remarks
How it works
The Analytics server table is defined as a query that uses a database profile to connect to a database table.
DEFINE VIEW command
Defines a view for the active Analytics table layout.
Syntax
DEFINE VIEW view_name <RLINES n> <ALL> <SUPPRESS> <SUMMARIZED> <IF test>
<WHILE test> <HEADER header_text> <FOOTER footer_text> <TO report_file_name <HTML>>
<OK>
Parameters
Name Description
RLINES n The line spacing for detail records in views and reports. By default, detail lines are
single spaced.
optional
ALL All fields in the active Analytics table layout are added to the view.
optional
SUPPRESS Suppresses blank detail lines in reports generated from the view. When the report is
generated the blank detail lines will be omitted from the output. This option applies to
optional
reports based on multiline views.
SUMMARIZED Specifies that reports generated from the view should include subtotals and totals, but
not include the detail lines.
optional
The subtotals are generated based on the break fields defined in the view. If this option
is not selected, the report includes detail lines, as well as subtotals for each of the spe-
cified break fields.
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The com-
mand is executed until the condition evaluates as false, or the end of the table is
optional
reached.
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
TO report_filename HTML The filename and type for reports created from this view.
optional Use the HTML keyword to save reports generated from this view as HTML files (.htm).
By default, generated reports are output as ASCII text files.
Examples
Creating a view
You open the Ar table and create a view called AR_Report, which includes all of the fields in the table layout:
OPEN Ar
DEFINE VIEW AR_Report HEADER "AR Report" ALL OK
DELETE command
Deletes an Analytics project item, a field from a table layout, a variable, one or more table history entries, a
relation between tables, or a file in a Windows folder. Also removes a column from a view.
Syntax
Purpose Syntax
To delete a file
DELETE file_name <OK>
Parameters
Name Description
Remove a column
The name of the column to remove from the specified view.
Note
Use the physical field name, not the column display name.
o ALL included – removes all occurrences of the column in the view
o ALL omitted – removes the first (leftmost) occurrence of the column in the view
variable_name | ALL The name of the variable to delete. Use ALL to delete all variables.
If you specify ALL, all occurrences of all variable types in the project are deleted.
HISTORY retain_history_ Deletes all table history entries except for the number of most recent entries specified by
entries retain_history_entries.
Omit retain_history_entries to delete all entries.
RELATION child_table_ Deletes any relation that has no dependent relations and no related fields referenced in
name | relation_name either the active view or in an active computed field.
Use the options to specify which relation to delete:
o child_table_name – use when the relation was not specifically named (default name
when a relation is created)
o relation_name – use when the relation was specifically named when it was created.
Otherwise, use child_table_name
If you do not use either option, the last relation that was defined gets deleted.
Examples
Deleting a date field
You delete the Date field from the table layout associated with the Ar table:
OPEN Ar
DELETE Date
You can also remove a column from a view rather than deleting the field from the table layout:

OPEN Ar
DELETE COLUMN AR_Report Date OK
DELETE COLUMN AR_Report Invoice_Date OK
DIALOG command
Creates a custom dialog box that interactively prompts users for one or more script input values. Each
input value is stored in a named variable.
Note
Using the DIALOG command to enter passwords is not secure. You should use the
"PASSWORD command" on page 350 instead.
The DIALOG command is not supported in AX Server analytics.
You can create a basic interactive dialog box with the "ACCEPT command" on page 54.
Tip
The easiest way to create custom dialog boxes is with the Dialog Builder. For more
information, see Creating custom dialog boxes.
Syntax
DIALOG (DIALOG TITLE title_text WIDTH pixels HEIGHT pixels) (BUTTONSET TITLE "&OK;&Cancel" AT x_pos y_pos <WIDTH pixels> <HEIGHT pixels> DEFAULT item_num <HORZ>) <[label_syntax]|[text_box_syntax]|[check_box_syntax]|[radio_button_syntax]|[drop_down_list_syntax]|[project_item_list_syntax]> <...n>
label_syntax ::=
(TEXT TITLE title_text AT x_pos y_pos <WIDTH pixels> <HEIGHT pixels> <CENTER|RIGHT>)
text_box_syntax ::=
(EDIT TO var_name AT x_pos y_pos <WIDTH pixels> <HEIGHT pixels> <DEFAULT string>)
check_box_syntax ::=
(CHECKBOX TITLE title_text TO var_name AT x_pos y_pos <WIDTH pixels> <HEIGHT pixels>
<CHECKED>)
radio_button_syntax ::=
(RADIOBUTTON TITLE value_list TO var_name AT x_pos y_pos <WIDTH pixels> <HEIGHT pixels>
<DEFAULT item_num> <HORZ>)
drop_down_list_syntax ::=
(DROPDOWN TITLE value_list TO var_name AT x_pos y_pos <WIDTH pixels> <HEIGHT pixels>
<DEFAULT item_num>)
project_item_list_syntax ::=
(ITEM TITLE project_item_category TO var_name AT x_pos y_pos <WIDTH pixels> <HEIGHT pixels>
<DEFAULT string>)
Parameters
General parameters
Name Description
DIALOG TITLE title_text Creates the main dialog box and the dialog box title.
title_text must be specified as a quoted string.
BUTTONSET TITLE The labels for the OK and Cancel buttons in the dialog box.
"&OK;&Cancel"
The values should normally not be edited, but if you do edit the values make sure that the
positive value comes before the negative value. For example: "&Yes;&No"
WIDTH pixels The width of the individual control, or the width of the dialog box if specified for the
DIALOG control.
The value is specified in pixels. If no value is specified for a control the width is calculated
based on the longest value contained by the control.
HEIGHT pixels The height of the individual control, or the height of the dialog box if specified for the
DIALOG control.
The value is specified in pixels.
AT x_pos y_pos The location of the top left corner of the control in the custom dialog box:
o x_pos is the horizontal distance in pixels from the left-hand side of the dialog box
o y_pos is the vertical distance in pixels from the top of the dialog box
DEFAULT item_num The numeric value that corresponds to the BUTTONSET value that you want to select as
the default.
For example, if the BUTTONSET values are "&OK;&Cancel", specify DEFAULT 1 to select
OK by default.
HORZ Displays the values for the BUTTONSET control horizontally. Values are displayed ver-
tically by default.
optional
Note
For most of the control types, the DIALOG command creates a variable to store user
input. You cannot use non-English characters, such as é, in the names of variables that
will be used in variable substitution. Variable names that contain non-English characters
will cause the script to fail.
By default, some of the DIALOG variables are created as character variables. If you use a
character variable to store numeric or datetime values, you must convert the variable to
the required data type in subsequent processing in a script. For more information, see
"Input data type" on page 154.
Label parameters
Name Description
TO var_name The name of the character variable that stores the input specified by the user.
If the variable already exists, the specified value is assigned. If the variable does not
exist, it is created, and the specified value is assigned.
Name Description
TO var_name The name of the logical variable that stores the True or False value specified by the
user.
If the variable already exists, the specified value is assigned. If the variable does not
exist, it is created, and the specified value is assigned.
RADIOBUTTON Creates radio buttons to present mutually exclusive options to the user.
TO var_name The name of the numeric variable that stores the numeric position of the radio button
value selected by the user.
If the variable already exists, the specified value is assigned. If the variable does not
exist, it is created, and the specified value is assigned.
DEFAULT item_num The numeric value that corresponds to the list item that you want to select as the default.
optional For example, if value_list is "Red;Green;Blue", specify DEFAULT 2 to select Green by
default.
HORZ Displays the values for the control horizontally. Values are displayed vertically by default.
optional
TO var_name The name of the character variable that stores the drop-down list value selected by the user.
If the variable already exists, the specified value is assigned. If the variable does not exist, it is created, and the specified value is assigned.
DEFAULT item_num The numeric value that corresponds to the list item that you want to select as the default.
optional For example, if value_list is "Red;Green;Blue", specify DEFAULT 2 to select Green by
default when the drop-down list is displayed.
ITEM Creates a project item list to present a list of Analytics project items, such as fields, to the
user.
TO var_name The name of the character variable that stores the name of the project item selected by
the user.
If the variable already exists, the specified value is assigned. If the variable does not
exist, it is created, and the specified value is assigned.
DEFAULT string The exact name of the project item that you want to select as the default.
optional string must be specified as a quoted string.
Examples
Prompting the user for a table and script
In your script, you need to prompt the user to select the Analytics table and script to use to run an analysis.
You specify that the Metaphor_Inventory_2012 table from the ACL_Demo.acl project is selected by
default as the Analytics table, but the user can select any table in the project.
The script to run must also be selected from the list of scripts in the Analytics project:
DIALOG (DIALOG TITLE "Inventory analysis" WIDTH 500 HEIGHT 200 ) (BUTTONSET TITLE
"&OK;&Cancel" AT 370 12 DEFAULT 1 ) (TEXT TITLE "Choose the Analytics project items to analyze."
AT 50 16 ) (TEXT TITLE "Table:" AT 50 50 ) (ITEM TITLE "f" TO "v_table" AT 50 70 DEFAULT "Meta-
phor_Inventory_2012" ) (TEXT TITLE "Script:" AT 230 50 ) (ITEM TITLE "b" TO "v_script" AT 230 70 )
Remarks
Note
For more information about how this command works, see the Analytics Help.
Interactivity
Use DIALOG to create an interactive script. When the DIALOG command is processed, the script pauses
and a dialog box is displayed that prompts the user for input that Analytics uses in subsequent processing.
You can create separate dialog boxes that prompt for one item at a time, or you can create one dialog box
that prompts for multiple items.
ACCEPT versus DIALOG
The ACCEPT command allows you to create a basic interactive dialog box that can have one or more of the
following types of controls:
l text box
l project item list
For basic interactivity, ACCEPT may be all you need. For more information, see "ACCEPT command" on
page 54.
Project categories
Code Category
f Tables
b Scripts
i Indexes
w Workspaces
Field categories
Code Category
C Character fields
N Numeric fields
D Datetime fields
L Logical fields
Variable categories
Code Category
c Character variables
n Numeric variables
d Datetime variables
l Logical variables
In the example, the start and end dates for this filter are stored as character values. They must be con-
verted to date values in order to be used with a date field that uses a Datetime data type.
Enclosing the variable name in percent signs (%) substitutes the character value contained by the variable
for the variable name. The CTOD( ) function then converts the character value to a date value.
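A minimal sketch of this technique follows; the variable names, the Invoice_Date field, and the date format are assumptions used for illustration only:
ASSIGN v_start_date = "20180101"
ASSIGN v_end_date = "20181231"
SET FILTER TO BETWEEN(Invoice_Date, CTOD("%v_start_date%", "YYYYMMDD"), CTOD("%v_end_date%", "YYYYMMDD"))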
DIRECTORY command
Generates a list of files and folders in the specified directory.
Syntax
DIRECTORY <file_spec> <SUPPRESS> <SUBDIRECTORY> <APPEND> <TO table_name|filename>
Parameters
Name Description
file_spec The Windows folder or files to list and display information for.
optional You can use the asterisk wildcard (*) to list all files with a particular extension, all files
that start with a particular string, or all files in a folder. For example:
o *.fil – lists all files with the .fil extension (Analytics data files)
o Inv*.* – lists all files that begin with "Inv" regardless of what file extension they have
o Results\* or Results\*.* – lists all files in the Results folder
To limit the files listed to a particular folder, you can specify a path relative to the Analytics project folder, or specify a full path. For example:
o Results\*.* – displays the contents of the Results subfolder in the Analytics project
folder
o C:\ACL Data\Results\*.* – displays the contents of the specified folder
Note
The wildcard character cannot be used in intermediary levels of a specified file path. It can only be used in the final level of the path, as shown above.
Paths or file names that contain spaces must be enclosed in double quotation marks.
If you use file_spec, it must be placed before any of the other parameters. If file_spec
appears in any other position, the DIRECTORY command is not processed and an error
is generated.
If you omit file_spec, all files in the folder containing the Analytics project are listed. You
cannot use any of the other parameters if you omit file_spec.
SUPPRESS Suppresses path information in the output, leaving only the file names and properties.
optional
SUBDIRECTORY Includes subfolders of the specified folder in the listing. For example, if you specify Results\*.fil with SUBDIRECTORY, the Results folder, and any subfolders contained in the Results folder, are searched for .fil files.
optional
Depending on the number of subfolders and files that need to be listed, using SUBDIRECTORY may result in a delay while the subfolders are searched. Analytics displays a dialog box showing progress of the command.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the existing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled,
missing, or inaccurate data can result.
TO table_name | filename The location to send the results of the command to:
optional o table_name – saves the results to an Analytics table
Specify table_name as a quoted string with a .FIL file extension. For example: TO
"Output.FIL"
By default, the table data file (.FIL) is saved to the folder containing the
Analytics project.
Use either an absolute or relative file path to save the data file to a different, existing
folder:
l TO "C:\Output.FIL"
l TO "Results\Output.FIL"
Note
Table names are limited to 64 alphanumeric characters, not including
the .FIL extension. The name can include the underscore character (
_ ), but no other special characters, or any spaces. The name cannot
start with a number.
o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing
folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
If you omit TO, the directory listing appears in the Analytics display area.
Examples
Different options for listing files
The ability to list files is useful for ad hoc investigation, and for incorporating in scripting.
A number of different options for listing files with the DIRECTORY command appear below.
List all files in the folder containing the Analytics project:
DIRECTORY
List all files with the .fil extension (Analytics data files):
DIRECTORY *.fil
List all files that begin with "Inv", regardless of their file extension:
DIRECTORY Inv*.*
List all the files in a subfolder relative to the Analytics project folder
Lists all the files in the Results subfolder in the folder containing the Analytics project:
DIRECTORY "Results\*"
List all the files in a specified folder and output the list to an Analytics table
Lists all the files in the Results folder and outputs the list to an Analytics table in the folder containing the
Analytics project:
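The original command is not reproduced here; a minimal sketch matching the description, with an assumed output table name, is:
DIRECTORY "Results\*" TO "Results_Folder_Contents.fil"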
List all the files in one folder and output the list to an Analytics table in
another folder
Lists all the files in the ACL Data\Results folder and outputs the list to an Analytics table in the GL
Audit 2014\Results folder:
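A sketch of a command matching this description, assuming the folder paths named in the text:
DIRECTORY "C:\ACL Data\Results\*.*" TO "C:\GL Audit 2014\Results\Results_Folder_Contents.fil"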
The new table Results_Folder_Contents is added to the open project. The associated data file (Results_Folder_Contents.fil) is created in the specified output folder, which may or may not be the folder containing the Analytics project.
Remarks
Properties displayed by DIRECTORY
The DIRECTORY command is similar to the DIR command in Windows. In addition to listing files and subfolders in a folder, the DIRECTORY command also displays a number of additional file and folder properties.
DISPLAY command
Displays information about the specified Analytics item type. Can also display the result of an expression, or
the output of a function.
Syntax Purpose
DISPLAY
Displays the field definitions, and any related child tables, for the currently active Analytics table.
DISPLAY {<PRIMARY>|SECONDARY}
Displays the name and table layout information for the primary or secondary table.
o PRIMARY (or no keyword specified) – display information for the currently active table.
o SECONDARY – display information for the secondary table.
In multiple-table mode, SECONDARY refers to the secondary table associated with the currently active table.
The information displayed includes:
o the table layout name
o the source data file name
o any relations between the table and other tables
o the field definition information from the table layout
DISPLAY HISTORY
Displays the table history for the currently active Analytics table.
Note
A table may or may not have associated table history.
DISPLAY {FREE|SPACE}
Displays the amount of physical memory (RAM) available for use by Analytics.
The amount displayed does not include memory reserved for variables. By default, Analytics reserves 60 KB of physical memory to store variables, but the amount is automatically increased as necessary.
FREE | SPACE – specify either keyword. The two keywords do the same thing.
Examples
Display the layout of an Analytics table
Displaying the layout of a table can be useful in a number of circumstances. For example, you may want to
combine two or more tables, and you need to examine field lengths and data types.
The example below displays the layout of the Ap_Trans table:
OPEN Ap_Trans
DISPLAY
Note
If you enter DISPLAY directly in the Analytics command line, the output appears immediately.
If you run DISPLAY in a script, double-click the corresponding DISPLAY entry in the command log to display the output.
Output to screen
Relationship
'Vendor' related by 'Vendor_No' using index 'Vendor_on_Vendor_No'
File
'Ap_Trans.fil' (format 'Ap_Trans') is your PRIMARY file.
The record length is 59
Fields
Output to screen
TOTAL1 N 278,641.33
OUTPUTFOLDER C "/Tables/Accounts_Payable"
v_field_name C "Invoice_Amount"
v_table_name C "Ap_Trans"
DISPLAY AGE(Invoice_Date)
Remarks
Location of command results
DISPLAY run from the Analytics command line – the results are displayed on screen.
DISPLAY executed in a script – the results are written to the Analytics command log. You can double-click the command log entry to display the results on screen.
DO REPORT command
Generates the specified Analytics report.
Syntax
DO REPORT report_name
Parameters
Name Description
report_name The name of the Analytics report to generate.
Example
Printing the default view
You open the AP_Trans table and print the default view:
OPEN AP_Trans
DO REPORT Default_View
Remarks
Running DO REPORT on the command line vs in a script
The settings used to print the report depend on where you run the command:
l from the command line – the Print dialog box opens for you to select the pages to print and configure
other options for the report
l in a script – the report is printed immediately using the default settings for the report
DO SCRIPT command
Executes a secondary script, or an external script, from within an Analytics script.
Syntax
DO <SCRIPT> script_name {<IF test>|<WHILE test>}
Parameters
Name Description
SCRIPT script_name The name of the script to run. You can run secondary scripts in the Analytics project, or
external scripts stored in text files with extensions such as .aclscript, .txt, or .bat.
You can specify a file path to an external script. You must enclose the path in quotation
marks if it contains any spaces.
Note
You cannot call a script that is already running. For example, if ScriptA
calls ScriptB, ScriptB cannot call ScriptA. ScriptA is still running while it
waits for ScriptB to complete.
IF test A conditional expression that is evaluated one time to determine if the script should be
executed. If the condition evaluates to true the script runs, otherwise it does not.
optional
Cannot be used with WHILE in the same command. If both are used, WHILE is ignored
when the script is processed. A comment is entered in the log, but the script does not
stop executing.
WHILE test A conditional expression that is evaluated after the script runs to determine if the script should be executed again. If the test evaluates to true the script runs again, otherwise it does not.
optional
Note
If you use WHILE, ensure that your test eventually evaluates to false. If
you do not, the script enters an infinite loop. In case of an infinite loop,
press the Esc key to cancel script processing.
Cannot be used with IF in the same command. If both are used, WHILE is ignored when
the script is processed. A comment is entered in the log, but the script does not stop
executing.
Examples
Executing a subscript repeatedly until the input is validated
You have a subscript that gathers user input using a dialog box. It does the following:
1. Prompts the user for the required values.
2. Checks the user input.
3. Sets the v_validated variable to true when the input values are validated.
To ensure that the user enters valid input, you use DO SCRIPT and include a WHILE condition so that the
script repeats this command until input is validated. Once the value of the variable changes, the main script
moves on to the next command:
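A minimal sketch of such a command, assuming the subscript is named GetUserInput and that it sets the logical variable v_validated:
DO SCRIPT GetUserInput WHILE NOT v_validated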
Remarks
Related commands
DO SCRIPT is the equivalent of the DO BATCH command found in scripts created with earlier releases of
Analytics.
You cannot include the DO SCRIPT command inside a GROUP command.
DUMP command
Displays the contents of a file, or the current record, in hexadecimal, ASCII, and EBCDIC character encodings.
Note
This command can only be entered in the command line. It cannot be used in scripts.
Syntax
DUMP {RECORD|file_name} <SKIP bytes> <COLUMN bytes> <HORIZONTAL>
Parameters
Name Description
SKIP bytes The number of bytes to bypass before the dump begins. The default is 0.
optional
HORIZONTAL Displays the character encodings in horizontal rows rather than in side-by-side vertical blocks (the default).
optional
Examples
Display the character encoding of the Inventory table
The example below displays the hexadecimal, ASCII, and EBCDIC character encoding of the data in the
Inventory table. The output is arranged in horizontal rows (hexadecimal encoding uses a double row). Each
row represents 97 bytes of data in the Analytics table:
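The original command is not shown; one possible form, assuming the data file name and a 97-byte column width, is:
DUMP "Inventory.fil" COLUMN 97 HORIZONTAL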
DUPLICATES command
Detects whether duplicate values or entire duplicate records exist in an Analytics table.
Syntax
DUPLICATES <ON> {key_field <D> <...n>|ALL} <OTHER field <...n>|OTHER ALL>
<UNFORMATTED> <TO {SCREEN|table_name|filename|PRINT}> <APPEND> <IF test> <WHILE
test> <FIRST range|NEXT range> <HEADER header_text> <FOOTER footer_text> <PRESORT>
<OPEN> <LOCAL> <ISOLOCALE locale_code>
Parameters
Name Description
ON key_field D <...n> The key field or fields, or the expression, to test for duplicates.
| ALL o key_field – use the specified field or fields
If you test by more than one field, records identified as duplicates require identical
values in every specified field.
Include D to sort the key field in descending order. The default sort order is ascending.
o ALL – use all fields in the table
If you test by all the fields in a table, records identified as duplicates must be entirely
identical.
An ascending sort order is the only option for ALL.
Note
Undefined portions of records are not tested.
OTHER field <...n> | One or more additional fields to include in the output.
OTHER ALL o field <...n> – include the specified field or fields
optional o ALL – include all fields in the table that are not specified as key fields
UNFORMATTED Suppresses page headings and page breaks when the results are output to a file.
optional
TO SCREEN | table_name | filename | PRINT The location to send the results of the command to:
optional
o SCREEN – displays the results in the Analytics display area
o table_name – saves the results to an Analytics table
Specify table_name as a quoted string with a .FIL file extension. For example: TO
"Output.FIL"
Name Description
By default, the table data file (.FIL) is saved to the folder containing the
Analytics project.
Use either an absolute or relative file path to save the data file to a different, existing
folder:
l TO "C:\Output.FIL"
l TO "Results\Output.FIL"
Note
Table names are limited to 64 alphanumeric characters, not including
the .FIL extension. The name can include the underscore character (
_ ), but no other special characters, or any spaces. The name cannot
start with a number.
o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing
folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
o PRINT – sends the results to the default printer
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the existing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled,
missing, or inaccurate data can result.
IF test A conditional expression that must be true in order to process each record. The command is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command is executed until the condition evaluates as false, or the end of the table is reached.
optional
Name Description
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
HEADER header_text The text to insert at the top of each page of a report.
optional header_text must be specified as a quoted string. The value overrides the Analytics
HEADER system variable.
FOOTER footer_text The text to insert at the bottom of each page of a report.
optional footer_text must be specified as a quoted string. The value overrides the Analytics
FOOTER system variable.
PRESORT Sorts the table on the key field before executing the command.
optional Note
You cannot use PRESORT inside the GROUP command.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
GAPDUPn The total number of gaps, duplicates, or fuzzy duplicate groups identified by the command.
Examples
Test for duplicate values in one field
The following example:
l tests for duplicate values in the Invoice_Number field
l outputs any records that contain duplicate invoice numbers to a new Analytics table
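A minimal sketch of such a test; the table name and output table name are assumptions:
OPEN Invoices
DUPLICATES ON Invoice_Number PRESORT TO "Duplicate_Invoices.fil" OPEN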
ESCAPE command
Terminates the script being processed, or all scripts, without exiting Analytics.
Syntax
ESCAPE <ALL> <IF test>
Parameters
Name Description
ALL Terminates all active scripts. If omitted, the current script is terminated.
optional
IF test A test that must evaluate to true before the command is executed. If the test evaluates to
false the command does not run.
optional
Examples
Terminating a script conditionally
You count the number of records in a table, and use the ESCAPE command to terminate the script if the
number of records is less than 100:
COUNT
ESCAPE IF COUNT1 < 100
Remarks
When to use ESCAPE
Use ESCAPE to halt the execution of a script or subscript based on a condition, or to stop the execution of
all running scripts.
ESCAPE ALL
EVALUATE command
For record sampling or monetary unit sampling, projects errors found in sampled data to the entire population, and calculates upper limits on deviation rate, or misstatement amount.
Record sampling
Syntax
EVALUATE RECORD CONFIDENCE confidence_level SIZE sample_size ERRORLIMIT number_of_errors <TO {SCREEN|filename}>
Parameters
Note
Do not include thousands separators, or percentage signs, when you specify values.
Name Description
CONFIDENCE confidence_level The same confidence level that you specified when you calculated the sample size.
ERRORLIMIT number_of_errors The total number of errors, or deviations, that you found in the sample.
TO SCREEN | filename The location to send the results of the command to:
o SCREEN – displays the results in the Analytics display area
o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
MLEn The Upper error limit frequency rate (computed upper deviation rate) calculated by the command.
Examples
Project errors found in the sampled data to the entire population
You have completed your testing of the sampled data and recorded the control deviations you found. You
can now project the errors you found to the entire population.
The example below projects two errors found in the sampled data to the entire population, and calculates
an upper error limit frequency rate (computed upper deviation rate) of 6.63%.
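The command that produced this result is not reproduced here. Its general form is sketched below; the confidence level and sample size shown are placeholders, so the resulting rate depends on the actual values used:
EVALUATE RECORD CONFIDENCE 95 SIZE 100 ERRORLIMIT 2 TO SCREEN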
For a detailed explanation of how Analytics calculates values when evaluating errors, see Evaluating
errors in a record sample.
Remarks
Note
For more information about how this command works, see the Analytics Help.
Monetary unit sampling
Syntax
EVALUATE MONETARY CONFIDENCE confidence_level <ERRORLIMIT book_value, misstatement_amount <,...n>> INTERVAL interval_value <TO {SCREEN|filename}>
Parameters
Note
Do not include thousands separators, or percentage signs, when you specify values.
Name Description
CONFIDENCE confidence_level The same confidence level that you specified when you calculated the sample size.
ERRORLIMIT book_value, misstatement_amount All misstatement errors that you found in the sample.
Specify the book value of the amount and the misstatement amount, separated by a comma. For example, if an amount has a book value of $1,000 and an audit value of $930, specify 1000,70.
Specify overstatements as positive amounts, and understatements as negative amounts. For example, if an amount has a book value of $1,250 and an audit value of $1,450, specify 1250,-200.
Separate multiple book_value, misstatement_amount pairs with a comma:
1000,70,1250,-200
INTERVAL interval_value The interval value that you used when you drew the sample.
Note
The interval value that you used might differ from the interval value initially
calculated by Analytics.
TO SCREEN | filename The location to send the results of the command to:
o SCREEN – displays the results in the Analytics display area
o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
MLEn The Most Likely Error amount (projected misstatement) calculated by the command.
UELn The Upper Error Limit amount (upper misstatement limit) calculated by the command.
Examples
Project errors found in the sampled data to the entire population
You have completed your testing of the sampled data and recorded the misstatements you found. You
can now project the errors you found to the entire population.
The example below projects three errors found in the sampled data to the entire population, and calculates
several values, including:
l Basic Precision – the basic allowance for sampling risk (18,850.00)
l Most Likely Error – the projected misstatement amount for the population (1,201.69)
l Upper Error Limit – the upper misstatement limit for the population (22,624.32)
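The command that produced these results is not reproduced here. Its general form is sketched below; the confidence level, the third book value/misstatement pair, and the interval are placeholders (the first two pairs reuse the values from the parameter description above):
EVALUATE MONETARY CONFIDENCE 95 ERRORLIMIT 1000,70,1250,-200,3200,65 INTERVAL 6250 TO SCREEN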
For a detailed explanation of how Analytics calculates values when evaluating errors, see Evaluating
errors in a monetary unit sample.
Remarks
Note
For more information about how this command works, see the Analytics Help.
EXECUTE command
Executes an application or process external to Analytics. Emulates the Windows Run command. Can be
used to interact with the Windows command prompt.
Note
Because the EXECUTE command gives you the ability to interact with the operating system and applications external to Analytics, technical issues may arise that are beyond the scope of Analytics's native functionality.
Support can assist with operation of the EXECUTE command inside Analytics, but issues that arise with processes and applications external to Analytics are not covered under Support.
Syntax
EXECUTE Windows_Run_command_syntax <ASYNC>
Parameters
Name Description
Windows_Run_com- The name of the application to execute, the folder or file to open, or the command to run,
mand_syntax followed by any required arguments or command switches.
Requires valid Windows Run command syntax enclosed by quotation marks.
RETURN_CODE The code returned by an external application or process run using the EXECUTE com-
mand.
Examples
Open an application
Opens Microsoft Excel:
EXECUTE "Excel"
EXECUTE "AcroRd32.exe"
Close an application
Closes Microsoft Excel:
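One way to do this, sketched here with the Windows TASKKILL command (the switches used in the original example may differ):
EXECUTE 'TASKKILL /f /im excel.exe'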
Note
Use the /f switch with caution. It forces an application to close without presenting any dialog
boxes, such as those for saving changes.
Open a file
Opens the Excel workbook AP_Trans.xlsx:
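A sketch of the command, assuming the workbook is stored at the path shown; the inner double quotation marks are needed because the path contains a space:
EXECUTE '"C:\ACL Data\AP_Trans.xlsx"'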
Note
Running an Analytics script in another project launches a second instance of Analytics. The
script in the second project should end with the QUIT command so that the second instance
of Analytics closes and control is returned to the initial instance of Analytics.
Remarks
Use EXECUTE to perform useful tasks
The EXECUTE command allows you to run Windows and DOS commands from the Analytics command
line or from an Analytics script.
You can use this ability to increase the automation of Analytics scripts by performing a variety of useful
tasks that are not possible using ACLScript syntax alone.
l Open other programs and applications and perform tasks required by the Analytics script
l Pass parameters to a batch file
l Access data from network locations
l Incorporate Active Directory account lists
l Open any file in its default application
l Run Analytics scripts in other Analytics projects
l Use FTP to access data from remote locations
l Integrate with VBScript
l Perform file and folder administrative tasks such as copying, moving, creating, deleting, or comparing files or folders that exist outside of Analytics
l Incorporate waiting periods in Analytics scripts
l Zip or unzip data
l Integrate with SQL databases
l Run external scripts or non-Analytics batch files (.bat)
l Incorporate Windows task scheduling in Analytics scripts
l Encrypt or decrypt data
l Open web pages
Note
Specific details of how to perform any of these tasks are beyond the scope of Galvanize Help documentation. For assistance, consult appropriate Windows operating system documentation, or other third-party documentation.
Run EXECUTE in the default mode (the script waits for the external process to complete) for:
o file and folder administrative tasks
o specifying waiting periods
o any task that subsequent tasks depend on
o any task where subsequent script execution depends on the result in the RETURN_CODE variable
Use ASYNC mode (the script continues without waiting for the external process) if:
o external tasks cause an application interface or pop-up dialog box to open
Quotation marks
The Windows Run command syntax that you use with the EXECUTE command must be enclosed by either
single or double quotation marks.
The following example uses the Windows MD command to create a new folder:
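A sketch using single quotation marks to enclose the entire Run command string, with an assumed folder path:
EXECUTE 'cmd /c MD "C:\New Data Folder"'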
You may find this method easier to read than the second method.
Note
Reversing the order of the nesting – using double quotation marks to enclose the
entire string, and single quotation marks to enclose paths – does not work.
l Two double quotation marks – Use double quotation marks to enclose the entire Run command
string, and use two double quotation marks internally to enclose paths:
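A sketch of the same operation using this second method, with the same assumed folder path:
EXECUTE "cmd /c MD ""C:\New Data Folder"""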
If you use this second method, the two double quotation marks used internally must be immediately
adjacent and cannot have a space between them.
You can avoid this complication by creating external scripts or batch files to contain Windows commands,
and by using the EXECUTE command only to start the batch file. For example:
EXECUTE 'C:\My_Batch.bat'
EXPORT command
Exports data from Analytics to the specified file format, or to Results in HighBond.
Syntax
EXPORT {<FIELDS> field_name <AS export_name> <...n>|<FIELDS> ALL} <UNICODE> export_type <SCHEMA> PASSWORD num TO {filename|aclgrc_id} <OVERWRITE> <IF test> <WHILE test> <{FIRST range|NEXT range}> <APPEND> <KEEPTITLE> <SEPARATOR character> <QUALIFIER character> <WORKSHEET worksheet_name> <DISPLAYNAME>
Parameters
Name Description
UNICODE Available in the Unicode edition of Analytics only. Applies to text (ASCII), delimited text
(DELIMITED), and XML files only, and to Windows Clipboard (CLIPBOARD) output.
optional
Exports Analytics data with Unicode UTF-16 LE character encoding applied.
o Specify UNICODE – if the data you are exporting contains characters that are not
supported by extended ASCII (ANSI)
o Do not specify UNICODE – if all the characters in the data you are exporting are supported by extended ASCII (ANSI)
The exported data is encoded as extended ASCII (ANSI).
Note
Any unsupported characters are omitted from the exported file.
For more information, see ACL Unicode products.
export_type The output file format or destination using one of the following options:
o ACCESS – Microsoft Access database file (.mdb)
Name Description
Note
PASSWORD may or may not be required, depending on the environment in which the script runs:
Robots
Analytics Exchange
Name Description
If you omit OVERWRITE, and if data already exists in the target control test (table), the
exported data is appended to the existing data. For more information about appending
in Results, see the "Remarks" below.
Any interpretations related to the target control test (table) dynamically update to reflect
the imported data, whether you overwrite or append.
IF test A conditional expression that must be true in order to process each record. The command is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
Name Description
Examples
Export data to an Excel .xlsx file
You export specific fields from the Vendor table to an Excel .xlsx file:
OPEN Vendor
EXPORT FIELDS Vendor_No Vendor_Name Vendor_City XLSX TO "VendorExport"
OPEN Vendor
EXPORT FIELDS Vendor_No Vendor_Name Vendor_City XLSX TO "VendorExport" WORKSHEET
Vendors_US
OPEN Vendor
EXPORT FIELDS ALL DELIMITED TO "VendorExport"
GROUP
EXPORT FIELDS Vendor_No Vendor_Name DELIMITED TO "AtoM" IF BETWEEN(UPPER
(VENDOR_NAME), "A", "M")
EXPORT FIELDS Vendor_No Vendor_Name DELIMITED TO "NtoZ" IF BETWEEN(UPPER
(VENDOR_NAME), "N", "Z")
END
OPEN AR_Exceptions
EXPORT FIELDS No Due Date Ref Amount Type ACLGRC PASSWORD 1 TO "10926@us"
OVERWRITE
Remarks
Note
For more information about how this command works, see the Analytics Help.
Only one file can be created at a time when you are exporting data to Microsoft Excel and Microsoft
Access.
Exporting to Excel
The following limits apply when exporting data to an Excel file:
Length of field names
o a maximum of 64 characters
o for Excel 2.1, a maximum of 248 characters
No matching Excel file name
o TO filename value does not match any existing Excel file name
o If you specify WORKSHEET worksheet_name – a new Excel file is created, with a worksheet with the specified name
o If you omit WORKSHEET – a new Excel file is created, with a worksheet that uses the name of the exported Analytics table
Matching Excel file name, no matching worksheet name
o TO filename value, and an existing Excel file name, are identical
o If you specify WORKSHEET worksheet_name, and it does not match a worksheet name in the Excel file – a worksheet with the specified name is added to the existing Excel file
o If you omit WORKSHEET – the existing Excel file is overwritten by a new Excel file, with a worksheet that uses the name of the exported Analytics table
Matching Excel file name and worksheet name
o TO filename value, and an existing Excel file name, are identical
o If you specify WORKSHEET worksheet_name, and it matches a worksheet name in the Excel file – a worksheet with the specified name overwrites the existing worksheet if it was originally created from Analytics. An error message appears and the export operation is canceled if the existing worksheet was not originally created from Analytics.
o If you omit WORKSHEET – the existing Excel file is overwritten by a new Excel file, with a worksheet that uses the name of the exported Analytics table
Item Details
Required permissions The ability to export results to a control test in Results requires a specific HighBond role assignment, or administrative privileges:
o Users with a Professional User or Professional Manager role for a Results collection can
export results to any control test in the collection.
Note
Only users with the Professional Manager role can export and overwrite
existing data in a control test.
o HighBond System Admins and Results admins automatically get a Professional Manager
role in all collections in the HighBond organization or organizations they administer.
Export limits The following limits apply when exporting to a control test:
o A maximum of 100,000 records per export
o A maximum of 100,000 records per control test
o A maximum of 500 fields per record
o A maximum of 256 characters per field
You can export multiple times to the same control test, but you cannot exceed the overall limits.
Appending fields Regardless of their order in an Analytics table, exported fields are appended to existing fields in
a control test if they have matching physical field names.
In Analytics, the physical field name is the name in the table layout. Exported fields that do not
match the name of any existing field are added as additional columns to the table in Results.
Display names of fields in Analytics, and in Results, are not considered. However, if you use the
optional AS export_name parameter, the export_name value is used as the physical field name
if you do not use DISPLAYNAME.
When appending data to questionnaire fields, the display name of the column in Results
remains the name that is specified in the questionnaire configuration.
Note
If you are round-tripping data between Results and Analytics, and data ends up
misaligned in Results, you probably have mismatched field names.
For more information, see "Field name considerations when importing and
exporting Results data" on page 259.
Specifying a password value
PASSWORD command
If you use the PASSWORD command to create the numbered password definition for connecting to HighBond, no password value is specified, so a password prompt is displayed when the script attempts to connect.
For more information, see "PASSWORD command" on page 350.
SET PASSWORD command
If you use the SET PASSWORD command to create the numbered password definition for con-
necting to HighBond, a password value is specified, so no password prompt is displayed, which
is appropriate for scripts designed to run unattended.
For more information, see SET PASSWORD command.
HighBond access token
Regardless of which method you use to create the password definition, the required password
value is a HighBond access token:
o PASSWORD method – Users can acquire an access token by selecting Tools > HighBond
Access Token, and then signing in to HighBond. An access token is returned, which users
can copy and paste into the password prompt.
o SET PASSWORD method – To insert an access token into the SET PASSWORD command
syntax in an Analytics script, right-click in the Script Editor, select Insert > HighBond Token,
and sign in to HighBond. An access token is inserted into the script at the cursor position.
Caution
The returned access token matches the account used to sign in to HighBond.
As a scriptwriter, using your own access token may not be appropriate if you
are writing a script to be used by other people.
Without DISPLAYNAME
o Without AS – Field name and display name in Results are the field name from Analytics.
o With AS – Field name and display name in Results are the display name in the AS parameter.
With DISPLAYNAME
o Without AS – Field name in Results is the field name from Analytics. Display name in Results is the display name from Analytics.
o With AS – Field name in Results is the field name from Analytics. Display name in Results is the display name in the AS parameter.
EXTRACT command
Extracts data from an Analytics table and outputs it to a new Analytics table, or appends it to an existing Analytics table. You can extract entire records or selected fields.
Syntax
EXTRACT {RECORD|FIELDS field_name <AS display_name> <...n>|FIELDS ALL} TO table_name
<IF test> <WHILE test> <FIRST range|NEXT range> <EOF> <APPEND> <OPEN> <LOCAL>
Parameters
Name Description
Name Description
Note
AS works only when extracting to a new table. If you are appending to an existing table, the alternate column titles in the existing table take precedence.
IF test A conditional expression that must be true in order to process each record. The command
is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
EOF Execute the command one more time after the end of the file has been reached.
optional This ensures that the final record in the table is processed when inside a GROUP com-
mand. Only use EOF if all fields are computed fields referring to earlier records.
Name Description
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the existing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If the structure of the output and the existing file do not match, jumbled, missing, or inaccurate data can result.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
Examples
Extracting all records in a table to a new table
You create an exact duplicate of the AR_Customer table by extracting all the records to a new Analytics
table. Any computed fields are preserved as computed fields:
OPEN AR_Customer
EXTRACT RECORD TO "AR_Customer_2"
OPEN AR_Customer
EXTRACT FIELDS ALL TO "AR_Customer_2"
Extracting all records and appending them to the existing AR_Customer_Master table:
OPEN AR_Customer
EXTRACT RECORD TO "AR_Customer_Master" APPEND
OPEN AR_Customer
EXTRACT RECORD TO "C:\Users\Customer Data\AR_Customer_Master" APPEND
OPEN AR_Customer
EXTRACT FIELDS Name Due Date TO "AR_Customer_Dates.fil"
OPEN AR_Customer
EXTRACT FIELDS Name AS "Customer;Name" Due AS "Due;Date" Date AS "Invoice;Date" TO
"AR_Customer_Dates.fil"
OPEN AR_Customer
EXTRACT FIELDS Name Due Date IF Due < `20140701` TO "Overdue.fil"
Remarks
Note
For more information about how this command works, see the Analytics Help.
FIELDSHIFT command
Shifts the start position of a field definition in a table layout.
Syntax
FIELDSHIFT START starting_position COLUMNS bytes_to_shift <FILTER data_filter_name> <OK>
Parameters
Name Description
START starting_position The starting byte position of the first field definition you want to shift.
All field definitions to the right of the specified field definition are also shifted.
If you specify a non-starting byte position, the next starting byte position is used.
Note
Name Description
FILTER data_filter_name The name of the filter that identifies field definitions associated with a particular record
definition.
optional
Examples
Shifting field definitions
You shift the field definition starting at byte 11, and any subsequent field definitions, 4 bytes to the right:
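A sketch of a command matching this description:
FIELDSHIFT START 11 COLUMNS 4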
Remarks
Note
For more information about how this command works, see the Analytics Help.
FIND command
Searches an indexed character field for the first value that matches the specified character string.
Note
The FIND command and the FIND( ) function are two separate Analytics features with significant differences. For information about the function, see "FIND( ) function" on page 562.
Syntax
FIND search_value
Parameters
Name Description
search_value The character value to search for in the indexed key field.
Examples
Searching for a specific value
You want to locate the first value in the Card_Number character field that exactly matches or starts with
"8590124".
First you index the Card_Number field in ascending order. Then you run FIND:
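A sketch of these two steps, with an assumed index file name:
INDEX ON Card_Number TO "Card_Number_Index" OPEN
FIND 8590124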
Remarks
Note
For more information about how this command works, see the Analytics Help.
INDEX requirement
To use the command, the table you are searching must be indexed on a character field in ascending order.
If multiple character fields are indexed in ascending order, only the first field specified in the index is
searched. The command cannot be used to search non-character index fields, or character fields indexed in
descending order.
Partial matching
Partial matching is supported. The search value can be contained by a longer value in the indexed field.
However, the search value must appear at the start of the field to constitute a match.
FUZZYDUP command
Detects nearly identical values (fuzzy duplicates) in a character field.
Note
To use fuzzy matching to combine fields from two Analytics tables into a new, single Analytics table, see "FUZZYJOIN command" on page 211.
Syntax
FUZZYDUP ON key_field <OTHER fields> LEVDISTANCE value <DIFFPCT percentage>
<RESULTSIZE percentage> <EXACT> <IF test> TO table_name <LOCAL> <OPEN>
Parameters
Name Description
LEVDISTANCE value The maximum allowable Levenshtein distance between two strings for them to be identified as fuzzy duplicates and included in the results.
The LEVDISTANCE value cannot be less than 1 or greater than 10. Increasing the
LEVDISTANCE value increases the number of results by including values with a greater
degree of fuzziness – that is, values that are more different from one another.
For more information, see "FUZZYDUP behavior" on page 209.
DIFFPCT percentage A threshold that limits the 'difference percentage' or the proportion of a string that can be
different.
optional
The percentage that results from an internal Analytics calculation performed on potential fuzzy duplicate pairs must be less than or equal to the DIFFPCT value for the pair to be included in the results. The DIFFPCT value cannot be less than 1 or greater than 99.
If DIFFPCT is omitted the threshold is turned off and difference percentage is not considered during processing of the FUZZYDUP command.
For more information, see "FUZZYDUP behavior" on page 209.
RESULTSIZE percentage The maximum size of the set of output results as a percentage of the number of records
in the key field.
optional
For example, for a key field with 50,000 records, a RESULTSIZE of 3 would terminate processing if the results exceeded 1500 fuzzy duplicates (50,000 x 0.03). No output table is produced if processing is terminated.
Name Description
IF test A conditional expression that must be true in order to process each record. The command is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
OPEN Opens the table created by the command after the command executes. Only valid if the command creates an output table.
optional
GAPDUPn The total number of gaps, duplicates, or fuzzy duplicate groups identified by the command.
Examples
Test a surname field for fuzzy duplicates
You test a surname field for fuzzy duplicates (the Last_Name field in the Employee_List table in
ACL DATA\Sample Data Files\Metaphor_Employee_Data.ACL). The results are output to a
new Analytics table.
l In addition to the test field, other fields are included in the results.
l The maximum allowable Levenshtein distance is 1.
l The proportion of a string that can be different is limited to 50%.
l The size of the results is limited to 20% of the test field size.
l In addition to fuzzy duplicates, exact duplicates are included.
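A sketch of such a command, matching the settings listed above; the OTHER fields and the output table name are assumptions:
OPEN Employee_List
FUZZYDUP ON Last_Name OTHER First_Name EmpNo LEVDISTANCE 1 DIFFPCT 50 RESULTSIZE 20 EXACT TO "Fuzzy_Last_Name" OPEN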
Remarks
Note
For more information about how this command works, see the Analytics Help.
How it works
The FUZZYDUP command finds nearly identical values (fuzzy duplicates), or locates inconsistent spelling
in manually entered data.
Unlike the ISFUZZYDUP( ) function, which identifies an exhaustive list of fuzzy duplicates for a single character value, the FUZZYDUP command identifies all fuzzy duplicates in a field, organizes them in groups, and outputs non-exhaustive results.
FUZZYDUP behavior
The FUZZYDUP command has two parameters that allow you to control the degree of difference between
fuzzy duplicates, and the size of the results:
l LEVDISTANCE
l DIFFPCT
You may need to try different combinations of settings for these two parameters to find out what works best
for a particular data set.
More information
For detailed information about the fuzzy duplicate difference settings, controlling result size, and fuzzy
duplicate groups, see Fuzzy duplicates overview.
Case-sensitivity
The FUZZYDUP command is not case-sensitive, so "SMITH" is equivalent to "smith."
FUZZYJOIN command
Uses fuzzy matching to combine fields from two Analytics tables into a new, single Analytics table.
Note
To detect nearly identical values (fuzzy duplicates) in a single character field, see
"FUZZYDUP command" on page 206.
For various options when joining tables using exactly matching key field values, see "JOIN
command" on page 316.
Syntax
FUZZYJOIN {DICE PERCENT percentage NGRAM n-gram_length|LEVDISTANCE DISTANCE
value} PKEY primary_key_field SKEY secondary_key_field {FIELDS primary_fields|FIELDS ALL}
<WITH secondary_fields|WITH ALL> <IF test> <OPEN> <FIRSTMATCH> TO table_name <WHILE
test> <FIRST range|NEXT range> <APPEND>
Note
You cannot run the FUZZYJOIN command locally against a server table.
You must specify the FUZZYJOIN command name in full. You cannot abbreviate it.
Parameters
Name Description
Name Description
Note
When you specify DICE, the FUZZYJOIN command uses the
DICECOEFFICIENT( ) function in an IF statement to conditionally join
key field values. For detailed information about the function, see
"DICECOEFFICIENT( ) function" on page 535.
LEVDISTANCE – use the Levenshtein distance algorithm
o DISTANCE value – the maximum allowable Levenshtein distance between two strings
for them to qualify as a fuzzy match
Specify a whole number, 1 or greater.
Increasing the value increases the number of matches by including matches with a greater degree of fuzziness – that is, strings that are more different from one another.
Note
When you specify LEVDISTANCE, the FUZZYJOIN command uses the LEVDIST( ) function in an IF statement to conditionally join key field values. For detailed information about the function, see "LEVDIST( ) function" on page 620.
Unlike the function, the Levenshtein distance algorithm in the
FUZZYJOIN command automatically trims leading and trailing blanks,
and is not case-sensitive.
PKEY primary_key_field The character key field, or expression, in the primary table.
You can specify only one primary key field.
SKEY secondary_key_ The character key field, or expression, in the secondary table.
field
You can specify only one secondary key field.
FIELDS primary_fields | The fields or expressions from the primary table to include in the joined output table.
FIELDS ALL o primary_fields – include the specified field or fields
o ALL – include all fields from the table
Note
You must explicitly specify the primary key field if you want to include it in
the joined table. Specifying ALL also includes it.
WITH secondary_fields | The fields or expressions from the secondary table to include in the joined output table.
WITH ALL o secondary_fields – include the specified field or fields
optional o ALL – include all fields from the table
Note
You must explicitly specify the secondary key field if you want to include it
in the joined table. Specifying ALL also includes it.
IF test A conditional expression that must be true in order to process each record. The command
is executed on only those records that satisfy the condition.
optional
Name Description
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
Note
The IF condition can reference the primary table, the secondary table, or
both.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
FIRSTMATCH Specifies that each primary key value is joined to only the first occurrence of any secondary key matches.
optional
If the first occurrence happens to be an exact match, any subsequent fuzzy matches for
the primary key value are not included in the joined output table.
If you omit FIRSTMATCH, the default behavior of FUZZYJOIN is to join each primary key
value to all occurrences of any secondary key matches.
FIRSTMATCH is useful if you only want to know if any matches, exact or fuzzy, exist
between two tables, and you want to reduce the processing time required to identify all
matches.
You can also use FIRSTMATCH if you are certain that at most only one match exists in
the secondary table for each primary key value.
Note
FIRSTMATCH is available only as an ACLScript parameter. The option is
not available in the Analytics user interface.
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Name Description
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the existing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If the structure of the output and the existing file do not match, jumbled, missing, or inaccurate data can result.
Examples
Use fuzzy matching to join two tables as a way of discovering employees
who may also be vendors
The example below joins the Empmast and Vendor tables using address as the common key field (the
Address and Vendor_Street fields).
The FUZZYJOIN command creates a new table with either exactly matched or fuzzy matched primary and
secondary records. The result is a list of any employees and vendors with either an identical address, or a
similar address.
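A sketch of such a join; the LEVDISTANCE distance, the non-key fields, and the output table name are assumptions:
OPEN Empmast
OPEN Vendor SECONDARY
FUZZYJOIN LEVDISTANCE DISTANCE 5 PKEY Address SKEY Vendor_Street FIELDS Empno Address WITH Vendor_No Vendor_Street OPEN TO "Employee_Vendor_Match"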
Remarks
Note
For more information about how this command works, see the Analytics Help.
Case sensitivity
The FUZZYJOIN command is not case-sensitive, regardless of which fuzzy matching algorithm you use. So
"SMITH" is equivalent to "smith."
GAPS command
Detects whether a numeric or datetime field in an Analytics table contains one or more gaps in sequential
data.
Syntax
GAPS <ON> key_field <D> <UNFORMATTED> <PRESORT> <MISSING limit> <HEADER header_text> <FOOTER footer_text> <IF test> <WHILE test> <FIRST range|NEXT range> <TO {SCREEN|table_name|filename|PRINT}> <APPEND> <LOCAL> <OPEN>
Parameters
Name Description
UNFORMATTED Suppresses page headings and page breaks when the results are output to a file.
optional
PRESORT Sorts the table on the key field before executing the command.
optional Note
You cannot use PRESORT inside the GROUP command.
MISSING limit The output results contain individual missing items rather than gap ranges.
optional The limit value specifies the maximum number of missing items to report for each iden-
tified gap. The default value is 5. If the limit is exceeded for a particular gap, the missing
items are reported as a range for that particular gap.
The limit value does not restrict the total number of missing items reported, it only restricts
the number of missing items reported within a specific gap.
HEADER header_text The text to insert at the top of each page of a report.
optional header_text must be specified as a quoted string. The value overrides the Analytics
HEADER system variable.
FOOTER footer_text The text to insert at the bottom of each page of a report.
optional footer_text must be specified as a quoted string. The value overrides the Analytics
FOOTER system variable.
IF test A conditional expression that must be true in order to process each record. The command is executed on only those records that satisfy the condition.
optional
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
TO SCREEN | table_name | filename | PRINT The location to send the results of the command to:
optional
o SCREEN – displays the results in the Analytics display area
o table_name – saves the results to an Analytics table
Specify table_name as a quoted string with a .FIL file extension. For example: TO "Output.FIL"
By default, the table data file (.FIL) is saved to the folder containing the
Analytics project.
Use either an absolute or relative file path to save the data file to a different, existing
folder:
l TO "C:\Output.FIL"
l TO "Results\Output.FIL"
Note
Table names are limited to 64 alphanumeric characters, not including
the .FIL extension. The name can include the underscore character ( _
), but no other special characters, or any spaces. The name cannot
start with a number.
o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
o PRINT – sends the results to the default printer
Name Description
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the existing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If the structure of the output and the existing file do not match, jumbled, missing, or inaccurate data can result.
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
GAPDUPn The total number of gaps, duplicates, or fuzzy duplicate groups identified by the command.
Examples
Testing for missing invoice numbers
You use GAPS to ensure that there are no invoice numbers missing from an Invoices table:
OPEN Invoices
GAPS ON Inv_Num PRESORT TO "Invoices_Gaps.fil"
Remarks
Using GAPS on character fields
In addition to testing numeric or datetime fields, you can also test for gaps in numeric data that appears in
a character field. For example, you can test check numbers, which are typically formatted as character
data.
If letters and numbers appear together in a character field, only the numbers are tested and the letters are
ignored.
GROUP command
Executes one or more ACLScript commands on a record before moving to the next record in the table, with
only one pass through the table. Command execution can be controlled by conditions.
Syntax
GROUP <IF test> <WHILE test> <FIRST range|NEXT range>
command
<...n>
<ELSE IF test>
command
<...n>
<ELSE>
command
<...n>
END
Note
Some Analytics commands cannot be used with the GROUP command. For more information, see "Commands that can be used inside the GROUP command" on page 225.
Parameters
Name Description
IF test A conditional expression that must be true in order to process each record. The command
is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
FIRST range | NEXT range The number of records to process:
optional
o FIRST – start processing from the first record until the specified number of records is reached
o NEXT – start processing from the currently selected record until the specified number of records is reached
Use range to specify the number of records to process.
If you omit FIRST and NEXT, all records are processed by default.
command <...n> One or more ACLScript commands to execute inside the GROUP. For a complete list of
commands supported inside GROUP, see "Commands that can be used inside the
GROUP command" on page 225.
If the command is listed under IF or ELSE IF, the command is executed only when that test evaluates to true.
If the command is listed under ELSE, the command is executed if there are records that
have not been processed by any of the preceding commands. You can include multiple
commands, with each command starting on a separate line.
ELSE IF test Opens an ELSE IF block for the GROUP command. The condition tests records that did
not match the GROUP command test, or any previous ELSE IF tests.
optional
You can include multiple ELSE IF tests. They are evaluated from top to bottom until a test evaluates to true for the record, and then the commands that follow that ELSE IF statement are executed.
ELSE Opens an ELSE block for the GROUP command. The commands that follow are executed
for records that evaluated to false for all of the previous tests.
optional
Examples
Simple GROUP
Simple groups start with a GROUP command, are followed by a series of commands, and terminate with
an END command:
GROUP
COUNT
HISTOGRAM ON Quantity MINIMUM 0 MAXIMUM 100 INTERVALS 10
CLASSIFY ON Location SUBTOTAL Quantity
END
GROUP IF
Conditional groups execute commands based on whether a condition is true or false. The following
GROUP command is executed only on records with a Product_class value less than 5:
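A minimal sketch of such a group (the commands inside the block are illustrative):

GROUP IF Product_class < "05"
  COUNT
  STATISTICS ON Quantity
  HISTOGRAM ON Quantity MINIMUM 0 MAXIMUM 100 INTERVALS 10
END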
GROUP IF ...ELSE
Records that do not meet the test condition are ignored unless you include an ELSE block.
Any number of commands can follow an ELSE statement. In the following example, all records that do not
meet the condition are processed by having their Quantity field totaled:
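A sketch of a conditional group with an ELSE block, using the same illustrative fields:

GROUP IF Product_class < "05"
  COUNT
  STATISTICS ON Quantity
ELSE
  TOTAL Quantity
END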
Nested GROUP
GROUP commands can be nested inside other GROUP commands. As with other groups, use the END command to terminate a nested group. Analytics starts processing the data only after all group commands have been terminated:
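A sketch of a nested group, using illustrative commands and settings that match the description in the next paragraph:

GROUP IF Product_class < "05"
  COUNT
  GROUP IF Quantity > 0
    STATISTICS ON Quantity
    HISTOGRAM ON Quantity MINIMUM 0 MAXIMUM 100 INTERVALS 10
  END
END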
In this example, all of the commands from COUNT up to and including the next GROUP are executed only
if Product_class is less than 05.
The STATISTICS and HISTOGRAM commands are executed if Quantity is greater than zero. However,
because the second GROUP command is nested, the STATISTICS and HISTOGRAM commands are
executed only for records that meet the conditions Product_class < "05" and Quantity > 0.
OPEN Metaphor_Trans_2002
GROUP
TOTAL AMOUNT IF PRODCLS = "03"
TOTAL AMOUNT IF PRODCLS = "05"
TOTAL AMOUNT IF PRODCLS = "08"
TOTAL AMOUNT IF PRODCLS = "09"
END
CLOSE Metaphor_Trans_2002
Remarks
Tip
For a detailed tutorial covering the GROUP and LOOP commands, see "Grouping and loop-
ing" on page 33.
Note
While you can initialize and define a variable inside a GROUP block, it is not recom-
mended. Variables initialized inside a GROUP may cause unexpected results when used.
Inside a GROUP, you can evaluate variables using variable substitution. The value of the variable remains
the same as when the GROUP is entered.
You cannot define a variable inside a GROUP and then reference it using variable substitution:
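A sketch of the pattern that does not work as intended (the variable and field names are illustrative):

GROUP
  ASSIGN v_field_name = "Quantity"
  COMMENT %v_field_name% is substituted when the GROUP is entered, before the ASSIGN runs, so the TOTAL below does not use the value assigned above
  TOTAL %v_field_name%
END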
System-defined variables
Certain commands such as TOTAL and STATISTICS generate system variables based on calculations
that the commands perform. If you use a GROUP to execute these commands, any system variables that
result are numbered consecutively, starting at the line number of the command inside the GROUP (exclud-
ing empty lines) and running to n. The value of n increases by 1 for each line number in the GROUP.
Note
You must wait until the GROUP completes before using any system generated variables
created inside the GROUP. The command must run against each record in the table
before the variable is available. Use these variables after the closing END keyword of the
GROUP.
In the following example, the first TOTAL command generates the variable TOTAL2 and the second gen-
erates TOTAL4. Both of these variables are available to use in subsequent commands once the
GROUP completes:
GROUP
TOTAL Discount IF Order_Priority = "Low"
ASSIGN v_var = "test"
TOTAL Discount IF Order_Priority = "High"
END
Syntax notes
l The multiline syntax listed for the GROUP command is required, and therefore the GROUP com-
mand cannot be entered in the command line.
l Each GROUP command must be terminated with an END command.
l When you use the GROUP command in your scripts, you can improve the readability of the com-
mand block by indenting the commands listed inside the group.
HELP command
Launches the Analytics Help Docs in a browser.
Syntax
HELP
HISTOGRAM command
Groups records based on values in a character or numeric field, counts the number of records in each
group, and displays the groups and counts in a bar chart.
Syntax
HISTOGRAM {<ON> character_field|<ON> numeric_field MINIMUM value MAXIMUM value
{<INTERVALS number>|FREE interval_value <...n> last_interval}} <TO
{SCREEN|filename|GRAPH|PRINT}> <IF test> <WHILE test> <FIRST range|NEXT range>
<HEADER header_text> <FOOTER footer_text> <KEY break_field> <SUPPRESS> <COLUMNS
number> <APPEND> <LOCAL> <OPEN>
Parameters
Name Description
MINIMUM value Applies to numeric fields only. The minimum value of the first numeric interval.
MINIMUM is optional if you are using FREE, otherwise it is required.
MAXIMUM value Applies to numeric fields only. The maximum value of the last numeric interval.
MAXIMUM is optional if you are using FREE, otherwise it is required.
FREE interval_value <...n> last_interval Applies to numeric fields only. Creates custom-sized intervals by specifying the start value of each interval and the end value of the last interval.
Interval values must be in numeric sequence and cannot contain duplicate values.
TO SCREEN | filename | The location to send the results of the command to:
GRAPH | PRINT o SCREEN – displays the results in the Analytics display area
o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing
folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
o GRAPH – displays the results in a graph in the Analytics display area
o PRINT – sends the results to the default printer
Note
Histogram results output to a file appear as a textual representation of a
bar chart.
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
HEADER header_text The text to insert at the top of each page of a report.
optional header_text must be specified as a quoted string. The value overrides the Analytics HEADER system variable.
FOOTER footer_text The text to insert at the bottom of each page of a report.
optional footer_text must be specified as a quoted string. The value overrides the Analytics
FOOTER system variable.
KEY break_field The field or expression that groups subtotal calculations. A subtotal is calculated each
time the value of break_field changes.
optional
break_field must be a character field or expression. You can specify only one field, but
you can use an expression that contains more than one field.
SUPPRESS Values above the MAXIMUM value and below the MINIMUM value are excluded from
the command output.
optional
COLUMNS number The length of the x-axis in the textual representation of the bar chart if you output his-
togram results to a text file.
optional
The number value is the number of character spaces (text columns) to use for the x-axis
(and the y-axis labels). If you omit COLUMNS, the default of 78 character spaces is
used.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled,
missing, or inaccurate data can result.
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
Examples
Basic histogram for hourly salary
You use HISTOGRAM to create a graph showing the distribution of wages between 0 and 100 dollars per
hour:
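A minimal sketch, assuming an illustrative payroll table and Hourly_Rate field, and ten equal intervals:

OPEN Payroll
HISTOGRAM ON Hourly_Rate MINIMUM 0 MAXIMUM 100 INTERVALS 10 TO GRAPH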
Remarks
Note
For more information about how this command works, see the Analytics Help.
Related commands
Creating a histogram using a character field is similar to classifying. Creating a histogram using a numeric
field is similar to stratifying.
Unlike the other grouping operations in Analytics, histograms do not support subtotaling numeric fields.
IF command
Specifies a condition that must evaluate to true in order to execute a command.
Syntax
IF test command
Parameters
Name Description
test The conditional expression to evaluate.
command The command to execute if the test evaluates to true.
Examples
Running a command conditionally
You want to use CLASSIFY on a table, but only if the v_counter variable is greater than ten:
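A sketch of the conditional execution (the CLASSIFY parameters are illustrative):

IF v_counter > 10 CLASSIFY ON Location TO "Classified_by_location.fil" OPEN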
Remarks
IF command versus IF parameter
The logic of the IF command differs from the IF parameter that is supported by most commands:
l IF command – determines whether the associated command runs or not, based on the value of the
test expression
l IF parameter – determines whether the command runs against each record in an Analytics table
based on the value of the test expression
IMPORT ACCESS command
Syntax
IMPORT ACCESS TO table <PASSWORD num> import_filename FROM source_filename TABLE
input_tablename CHARMAX max_field_length MEMOMAX max_field_length
Parameters
Name Description
TO table The name of the Analytics table to import the data into.
Note
Table names are limited to 64 alphanumeric characters. The name can
include the underscore character ( _ ), but no other special characters, or
any spaces. The name cannot start with a number.
import_filename The name of the Analytics data file to create.
Specify import_filename as a quoted string with a .FIL file extension. For example: "Invoices.FIL".
By default, the data file (.FIL) is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the data file to a different, existing folder:
o "C:\data\Invoices.FIL"
o "data\Invoices.FIL"
FROM source_filename The name of the source data file. source_filename must be a quoted string.
If the source data file is not located in the same directory as the Analytics project, you
must use an absolute path or a relative path to specify the file location:
o "C:\data\source_filename"
o "data\source_filename"
TABLE input_tablename The name of the table in the Microsoft Access database file to import.
CHARMAX max_field_ The maximum length in characters for any field in the Analytics table that originates as
length character data in the source from which you are importing.
You can specify from 1 to 255 characters.
MEMOMAX max_field_ The maximum length in characters for text, note, or memo fields you are importing.
length
You can specify from 1 to 32767 characters (non-Unicode Analytics), or 16383 char-
acters (Unicode Analytics).
Examples
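A minimal sketch of an import from an Access database, assuming illustrative file, table, and maximum length values:

IMPORT ACCESS TO Employees "Employees.fil" FROM "Employees.mdb" TABLE "Employees" CHARMAX 60 MEMOMAX 70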
Note
For more information about how this command works, see the Analytics Help.
IMPORT DELIMITED command
Syntax
IMPORT DELIMITED TO table import_filename FROM source_filename <SERVER profile_name>
source_char_encoding SEPARATOR {char|TAB|SPACE} QUALIFIER {char|NONE}
<CONSECUTIVE> STARTLINE line_number <KEEPTITLE> <CRCLEAR> <LFCLEAR>
<REPLACENULL> <ALLCHAR> {ALLFIELDS|[field_syntax] <...n> <IGNORE field_num> <...n>}
field_syntax ::=
FIELD name type AT start_position DEC value WID bytes PIC format AS display_name
Parameters
Name Description
TO table The name of the Analytics table to import the data into.
Note
Table names are limited to 64 alphanumeric characters. The name can
include the underscore character ( _ ), but no other special characters, or
any spaces. The name cannot start with a number.
FROM source_filename The name of the source data file. source_filename must be a quoted string.
If the source data file is not located in the same directory as the Analytics project, you
must use an absolute path or a relative path to specify the file location:
o "C:\data\source_filename"
o "data\source_filename"
SERVER profile_name The server profile name for the AX Server where the data you want to import is located.
optional
source_char_encoding The character set and encoding of the source data, specified as a numeric code:
o 3 numeric_code – (Unicode edition) Unicode data that does not use UTF-16 LE encoding. To specify the code, specify 3, followed by a space, and then the numeric code.
To determine the numeric code that matches the source data encoding, perform an import using the Data Definition Wizard, select the Encoded Text option, and find the matching encoding in the accompanying drop-down list.
SEPARATOR char The separator character (delimiter) used between fields in the source data. You must
| TAB | SPACE specify the character as a quoted string.
You can specify a tab or a space separator by typing the character between double quo-
tation marks, or by using a keyword:
o SEPARATOR " " or SEPARATOR TAB
o SEPARATOR " " or SEPARATOR SPACE
QUALIFIER char | NONE The text qualifier character used in the source data to wrap and identify field values.
You must specify the character as a quoted string.
To specify the double quotation mark character as the text qualifier, enclose the char-
acter in single quotation marks: QUALIFIER '"'.
You can specify that there are no text qualifiers using either of these methods:
o QUALIFIER ""
o QUALIFIER NONE
KEEPTITLE Treat the line number specified by STARTLINE as field names instead of data. If you
omit KEEPTITLE, generic field names are used.
optional
If you specify FIELD syntax individually, FIELD name takes precedence over the values
in the first row in the delimited file. In this situation, KEEPTITLE prevents the first row val-
ues from being imported.
CRCLEAR Replaces any CR characters (carriage return) that occur between text qualifiers with
space characters. You must specify QUALIFIER with a char value to use CRCLEAR.
optional
If you use both CRCLEAR and LFCLEAR, CRCLEAR must come first.
LFCLEAR Replaces any LF characters (line feed) that occur between text qualifiers with space
characters. You must specify QUALIFIER with a char value to use LFCLEAR.
optional
If you use both CRCLEAR and LFCLEAR, CRCLEAR must come first.
REPLACENULL Replaces any NUL characters that occur in the delimited file with space characters. The
number of any replaced NUL characters is recorded in the log.
optional
ALLCHAR The Character data type is automatically assigned to all the imported fields.
optional Tip
Assigning the Character data type to all the imported fields simplifies the
process of importing delimited text files. Once the data is in Analytics,
you can assign different data types, such as Numeric or Datetime, to the
fields, and specify format details.
ALLCHAR is useful if you are importing a table with identifier fields auto-
matically assigned the Numeric data type by Analytics when in fact they
should use the Character data type.
FIELD name type The individual fields to import from the source data file, including the name and data
type of the field. To exclude a field from being imported, do not specify it.
For information about type, see "Identifiers for field data types" on page 242.
Note
type is ignored if you specify ALLCHAR.
AT start_position The starting byte position of the field in the Analytics data file.
WID bytes The length in bytes of the field in the Analytics table layout.
AS display_name The display name (alternate column title) for the field in the view in the new Analytics
table.
Specify display_name as a quoted string. Use a semi-colon (;) between words if you
want a line break in the column title.
AS is required when you are defining FIELD. To make the display name the same as
the field name, enter a blank display_name value using the following syntax: AS "".
Make sure there is no space between the two double quotation marks.
IGNORE field_num <...n> Excludes a field from the table layout.
optional field_num specifies the position of the field in the source data. For example, IGNORE 5
excludes the fifth field in the source data from the Analytics table layout.
Note
The data in the field is still imported, but it is undefined, and does not
appear in the new Analytics table. The data can be defined later, if
necessary, and added to the table.
To completely exclude a field from being imported, do not specify it when
you specify fields individually.
Examples
Importing all fields
You import all fields from a comma delimited file to an Analytics table named Employees . The file uses
double quotation marks as the text qualifiers. Data types are automatically assigned based on the set of
rules outlined in "Remarks" below:
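A minimal sketch of an import of all fields, assuming illustrative file and table names; 0 is the character-set code for non-Unicode (ASCII) source data:

IMPORT DELIMITED TO Employees "Employees.fil" FROM "Employees.csv" 0 SEPARATOR "," QUALIFIER '"' STARTLINE 1 KEEPTITLE ALLFIELDS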
FIELD "Salary" C AT 83 DEC 0 WID 5 PIC "" AS "" FIELD "Bonus_2016" C AT 88 DEC 0 WID 10 PIC
"" AS "Bonus 2016"
Remarks
Note
For more information about how this command works, see the Analytics Help.
Description of field values in the delimited file – Examples – Data type assigned
o Values include a non-numeric character anywhere in the field, with the exception of commas and periods used as numeric separators, and the negative sign (-) – for example, $995 or (995) – Character
o Values include only numbers, numeric separators, or the negative sign – for example, 6,990.75, -6,990.75, or 995 – Numeric
Identifiers for field data types
A ACL
B BINARY
C CHARACTER
D DATETIME
E EBCDIC
F FLOAT
G ACCPAC
I IBMFLOAT
K UNSIGNED
L LOGICAL
N PRINT
P PACKED
Q BASIC
R MICRO
S CUSTOM
T PCASCII
U UNICODE
V VAXFLOAT
X NUMERIC
Y UNISYS
Z ZONED
IMPORT EXCEL command
Syntax
IMPORT EXCEL TO table import_filename FROM source_filename TABLE input_worksheet_or_
named_range <KEEPTITLE> {ALLFIELDS|CHARMAX max_field_length|[field_syntax] <...n>
<IGNORE field_num> <...n>} <OPEN>
field_syntax ::=
FIELD name type {PIC format|WID characters DEC value} AS display_name
Parameters
Name Description
TO table The name of the Analytics table to import the data into.
Note
Table names are limited to 64 alphanumeric characters. The name can
include the underscore character ( _ ), but no other special characters, or
any spaces. The name cannot start with a number.
FROM source_filename The name of the source data file. source_filename must be a quoted string.
If the source data file is not located in the same directory as the Analytics project, you
must use an absolute path or a relative path to specify the file location:
o "C:\data\source_filename"
o "data\source_filename"
TABLE worksheet_or_ The Microsoft Excel worksheet or the named range in the source data file to import:
named_range o you must enter a "$" sign at the end of a worksheet name
KEEPTITLE Treat the first row of data as field names instead of data. If omitted, generic field names
are used.
optional
If you are defining fields individually, KEEPTITLE must appear before the first FIELD.
CHARMAX max_field_ Only applies when you are not defining fields individually.
length
The maximum length in characters for any field in the Analytics table that originates as
character data in the source data file.
FIELD name type The individual fields to import from the source data file, including the name and data
type of the field. To exclude a field from being imported, do not specify it.
For information about type, see "Identifiers for field data types" on page 247.
WID characters The length in characters of the field in the Analytics table layout.
AS display_name The display name (alternate column title) for the field in the view in the new Analytics
table.
Specify display_name as a quoted string. Use a semi-colon (;) between words if you
want a line break in the column title.
AS is required when you are defining FIELD. To make the display name the same as
the field name, enter a blank display_name value using the following syntax: AS "".
Make sure there is no space between the two double quotation marks.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
Examples
Importing specified fields
You perform an import that defines a new Analytics table called Credit_Cards . It uses the first row of Excel
data as the field names.
The Analytics table defines and includes three fields from the source table, but excludes the remaining
fields:
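A sketch of the import, assuming illustrative file, worksheet, and field definitions:

IMPORT EXCEL TO Credit_Cards "Credit_Cards.fil" FROM "Credit_Cards.xlsx" TABLE "Sheet1$" KEEPTITLE FIELD "Card_Number" C WID 16 DEC 0 AS "Card Number" FIELD "Credit_Limit" N WID 8 DEC 2 AS "Credit Limit" FIELD "Expiry_Date" D PIC "YYYY-MM-DD" AS "Expiry Date"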
Remarks
Note
For more information about how this command works, see the Analytics Help.
Identifiers for field data types
Note
When you use the Data Definition Wizard to define a table that includes EBCDIC,
Unicode, or ASCII fields, the fields are automatically assigned the letter "C" (for the
CHARACTER type).
When you enter an IMPORT statement manually, or edit an existing IMPORT statement,
you can substitute the more specific letters "E" or "U" for EBCDIC or Unicode fields.
A ACL
B BINARY
C CHARACTER
D DATETIME
E EBCDIC
F FLOAT
G ACCPAC
I IBMFLOAT
K UNSIGNED
L LOGICAL
N PRINT
P PACKED
Q BASIC
R MICRO
S CUSTOM
T PCASCII
U UNICODE
V VAXFLOAT
X NUMERIC
Y UNISYS
Z ZONED
IMPORT GRCPROJECT command
Syntax
IMPORT GRCPROJECT TO table import_filename PASSWORD num FROM org_id/type_id
<FIELD name AS display_name <...n>>
Parameters
Name Description
TO table The name of the Analytics table to import the data into.
Note
Table names are limited to 64 alphanumeric characters. The name can
include the underscore character ( _ ), but no other special characters, or
any spaces. The name cannot start with a number.
PASSWORD num The password definition to use.
For more information about supplying or setting passwords, see:
l PASSWORD command
l SET command
l PASSWORD analytic tag
Note
PASSWORD may or may not be required, depending on the envir-
onment in which the script runs:
Robots
Analytics Exchange
FROM org_id/type_id The organization and type of information that defines the data being imported:
o org_id – the Projects organization you are importing data from
o type_id – the type of information you are importing
The org_id value and the type_id value must be separated by a slash, with no inter-
vening spaces: FROM "125@eu/audits".
The entire string must be enclosed in quotation marks.
Organization ID
org_id must include the organization ID number, and if you are importing from a data
center other than North America, the data center code. The organization ID number and
the data center code must be separated by the at sign (@): FROM "125@eu".
The data center code specifies which regional HighBond server you are importing the
data from:
o ap – Asia Pacific
o au – Australia
o ca – Canada
o eu – Europe
o us – North America
You can use only the data center code or codes authorized for your organization's
installation of HighBond. The North America data center is the default, so specifying
"@us" is optional.
If you do not know the organization ID number, use the Analytics user interface to import
a table from Projects. The organization ID number is contained in the command in the
log. For more information, see Defining ACL GRC data.
Type ID
type_id specifies the type of information you are importing. Information in Projects is con-
tained in a series of related tables.
For type_id, use one of the values listed below. Enter the value exactly as it appears
and include underscores, if applicable:
o audits - Projects
o control_test_plans - Control Test Plans
o control_tests - Control Test
o controls - Controls
o finding_actions - Actions
o findings - Issues
o mitigations - Risk Control Associations
o narratives - Narratives
o objectives- Objectives
o risks - Risks
o walkthroughs - Walkthroughs
Tip
For information about how the tables in Projects are related, and the key
fields that you can use to join the tables once you have imported them to
Analytics, see Defining ACL GRC data.
FIELD name AS display_ Individual fields in the source data to import. Specify the name.
name <...n>
If you omit FIELD, all fields are imported.
optional o name must exactly match the physical field name in the Projects table, including
matching the case
o display_name (alternate column title) is the display name for the field in the view in
the new Analytics table. You must specify a display name for each FIELD name. Spe-
cify display_name as a quoted string.
Use a semi-colon (;) between words if you want a line break in the column title.
Unlike some other IMPORT commands in Analytics, you cannot specify a blank dis-
play_name as a way of using the FIELD name as the display name.
Tip
To get the physical field names, use the Analytics user interface to import
the appropriate table from Projects. The physical field names are con-
tained in the command in the log.
Subsequent imports can be scripted.
Examples
Importing all fields from the Projects table
You import all fields from the Projects table for all active projects belonging to organization 286 to an Ana-
lytics table named All_Projects . You include a numbered password definition to authenticate the
connection:
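A minimal sketch, assuming an illustrative output file name; the organization ID and table type come from the description above:

IMPORT GRCPROJECT TO All_Projects "All_Projects.fil" PASSWORD 1 FROM "286/audits"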
Remarks
Note
For more information about how this command works, see the Analytics Help.
Regardless of which method you use to create the password definition, the required password value is a
HighBond access token:
l PASSWORD method – Users can acquire an access token by selecting Tools > HighBond
Access Token, and then signing in to HighBond. An access token is returned, which users can
copy and paste into the password prompt.
l SET PASSWORD method – To insert an access token into the SET PASSWORD command syn-
tax in an Analytics script, right-click in the Script Editor, select Insert > HighBond Token, and sign
in to HighBond. An access token is inserted into the script at the cursor position.
Caution
The returned access token matches the account used to sign in to HighBond. As a
scriptwriter, using your own access token may not be appropriate if you are writing a script
to be used by other people.
IMPORT GRCRESULTS command
Syntax
IMPORT GRCRESULTS TO table import_filename PASSWORD num FROM Results_resource_path
<FIELD name AS display_name <...n>>
Parameters
Name Description
TO table The name of the Analytics table to import the data into.
Note
Table names are limited to 64 alphanumeric characters. The name can
include the underscore character ( _ ), but no other special characters, or
any spaces. The name cannot start with a number.
PASSWORD num The password definition to use.
For more information about supplying or setting passwords, see:
l PASSWORD command
l SET command
l PASSWORD analytic tag
Note
PASSWORD may or may not be required, depending on the environment
in which the script runs:
Robots
Analytics Exchange
FIELD name AS display_ Individual fields in the source data to import. Specify the name.
name <...n>
If you omit FIELD, all fields are imported.
optional
Name
name must exactly match the physical field name in the Results table, including matching
the case. To view the physical field name, do one of the following:
o In Results, click a column header in the Table View. The physical field name appears
after Field Name.
o In Analytics, when you import a Results table, the physical field name appears in par-
entheses after the display name in the dialog box that allows you to select fields.
Note
The Results physical field name is not the display name used for column
headers in the Table View.
Also see "Field name considerations when importing and exporting Results data" on
page 259.
Display name
display_name (alternate column title) is the display name for the field in the view in the
new Analytics table. You must specify a display name for each FIELD name. Specify dis-
play_name as a quoted string.
Use a semi-colon (;) between words if you want a line break in the column title.
Unlike some other IMPORT commands in Analytics, you cannot specify a blank display_
name as a way of using the FIELD name as the display name.
Examples
Importing specified fields from a table in Results
You import specified fields from a table in Results to an Analytics table named T and E exceptions :
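A sketch using the example Results path shown under "Results path" below, with two of the system-generated fields; the output file name and field choices are illustrative:

IMPORT GRCRESULTS TO T_and_E_exceptions "T_and_E_exceptions.fil" PASSWORD 1 FROM "results/api/orgs/11594/control_tests/4356/exceptions" FIELD "metadata.status" AS "Status" FIELD "extras.record_id" AS "Record ID"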
Remarks
Note
For more information about how this command works, see the Analytics Help.
Results path
Note
The form of the Results path is supplied by an API, and is subject to change. The easiest
and most reliable way to acquire the correct and current syntax for the path is to perform a
manual import of the target data, and copy the path from the command log.
The Results path in the FROM parameter takes the following general form:
FROM "results <-region code>/api/orgs/<org ID>/control_tests/<control test ID>/exceptions
For example: FROM "results/api/orgs/11594/control_tests/4356/exceptions"
The org ID is displayed in the browser address bar when you log in to Launchpad. The control test ID, and
the interpretation ID, are displayed in the address bar when you are viewing those tables in Results.
The table below provides all the variations of the Results path.
In addition to fields containing user data, tables in Results contain system-generated fields:
l metadata fields – part of Results tables, and contain processing information related to individual records
l extras fields – additional information – collection name, table name, or record ID number
You must specify the field names of the system-generated columns exactly as they appear below. The
default display names apply when you import from Results through the Analytics user interface. You are
free to change the display names if you are scripting the import process.
metadata.priority Priority
metadata.status Status
metadata.publish_date Published
metadata.assignee Assignee
metadata.group Group
metadata.updated_at Updated
metadata.closed_at Closed
extras.collection Collection
extras.record_id Record ID
IMPORT LAYOUT command
Syntax
IMPORT LAYOUT external_layout_file TO table_layout_name
Parameters
Name Description
external_layout_file The name of the external table layout file. If the file name or path includes any spaces it
must be enclosed in quotation marks – for example, "Ap Trans.layout".
The .layout file extension is used by default and does not need to be specified. If
required, you can use another file extension, such as .fmt.
If the layout file is not located in the same folder as the Analytics project, you must use
an absolute path or a relative path to specify the file location – for example,
"C:\Saved layouts\Ap_Trans.layout" or "Saved layouts\Ap_Trans.layout".
TO table_layout_name The name of the imported table layout in the Analytics project – for example,
"Ap Trans May". You must specify the table_layout_name as a quoted string if it con-
tains any spaces. You can specify a table_layout_name that is different from the name
of the external_layout_file.
Example
Importing an external table layout file
You import an external table layout file called Ap_Trans.layout and create a new table layout called Ap_
Trans _May in the Analytics project:
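A sketch based on the description above:

IMPORT LAYOUT "Ap_Trans.layout" TO "Ap_Trans_May"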
Remarks
When to use IMPORT LAYOUT
Importing an external table layout file and associating it with a data file can save you the labor of creating a
new table layout from scratch:
l If the imported table layout specifies an association with a particular Analytics data file (.fil), and a data
file of the same name exists in the folder containing the project, the imported table layout is auto-
matically associated with the data file in the folder.
l If there is no data file with the same name in the project folder, you need to link the imported table lay-
out to a new data source.
IMPORT MULTIDELIMITED command
Syntax
IMPORT MULTIDELIMITED <TO import_folder> FROM {source_filename|source_folder} source_
char_encoding SEPARATOR {char|TAB|SPACE} QUALIFIER {char|NONE} <CONSECUTIVE>
STARTLINE line_number <KEEPTITLE> <CRCLEAR> <LFCLEAR> <REPLACENULL>
<ALLCHAR>
Note
You must specify the IMPORT MULTIDELIMITED parameters in exactly the same order
as above, and in the table below.
To import multiple delimited files cleanly, the structure of all the files must be consistent
before importing.
For more information, see "Consistent file structure required" on page 269.
Parameters
Name Description
TO import_folder The folder to save the new Analytics tables in, specified as a quoted string.
optional
If you omit TO, the data is imported to the folder containing the Analytics project.
FROM source_filename | The name of the source data files, or the folder containing the source data files.
source_folder
Specify source_filename or source_folder as a quoted string.
The command supports importing four types of delimited file:
o *.csv
o *.dat
o *.del
o *.txt
Source data files in the root Analytics project folder
To specify multiple files, use a wildcard character (*) in place of unique characters in file
names. The wildcard character stands for zero (0) or more occurrences of any letter,
number, or special character.
Example
FROM "Transactions_FY*.csv"
selects:
Transactions_FY18.csv
Transactions_FY17.csv
You can use a wildcard in more than one location in a file name, and in a file extension.
Example
FROM "Transactions_FY*.*"
selects:
Transactions_FY18.txt
Transactions_FY17.csv
Source data files not in the root Analytics project folder
If the source data files are not located in the same folder as the Analytics project, you
must use an absolute file path, or a file path relative to the folder containing the project,
to specify the location of the files.
source_char_encoding The character set and encoding of the source data, specified as a numeric code:
o 3 numeric_code – (Unicode edition) Unicode data that does not use UTF-16 LE encoding. To specify the code, specify 3, followed by a space, and then the numeric code.
To determine the numeric code that matches the source data encoding, perform an import using the Data Definition Wizard, select the Encoded Text option, and find the matching encoding in the accompanying drop-down list.
Note
If you do not specify a code, non-Unicode Analytics automatically uses 0, and Unicode Analytics automatically uses 2.
SEPARATOR char The separator character (delimiter) used between fields in the source data. You must
| TAB | SPACE specify the character as a quoted string.
You can specify a tab or a space separator by typing the character between double quo-
tation marks, or by using a keyword:
o SEPARATOR " " or SEPARATOR TAB
o SEPARATOR " " or SEPARATOR SPACE
QUALIFIER char | NONE The text qualifier character used in the source data to wrap and identify field values.
You must specify the character as a quoted string.
To specify the double quotation mark character as the text qualifier, enclose the char-
acter in single quotation marks: QUALIFIER '"'.
You can specify that there are no text qualifiers using either of these methods:
o QUALIFIER ""
o QUALIFIER NONE
KEEPTITLE Treat the line number specified by STARTLINE as field names instead of data. If you
omit KEEPTITLE, generic field names are used.
optional
Note
The field names must be on the same line number in all the delimited
files that you import with a single execution of
IMPORT MULTIDELIMITED.
If field names are on different line numbers, see "Consistent file structure
required" on page 269.
CRCLEAR Replaces any CR characters (carriage return) that occur between text qualifiers with
space characters. You must specify QUALIFIER with a char value to use CRCLEAR.
optional
If you use both CRCLEAR and LFCLEAR, CRCLEAR must come first.
LFCLEAR Replaces any LF characters (line feed) that occur between text qualifiers with space
characters. You must specify QUALIFIER with a char value to use LFCLEAR.
optional
If you use both CRCLEAR and LFCLEAR, CRCLEAR must come first.
REPLACENULL Replaces any NUL characters that occur in the delimited file with space characters. The
number of any replaced NUL characters is recorded in the log.
optional
ALLCHAR The Character data type is automatically assigned to all the imported fields.
optional Tip
Assigning the Character data type to all the imported fields simplifies the
process of importing delimited text files. Once the data is in Analytics,
you can assign different data types, such as Numeric or Datetime, to the
fields, and specify format details.
ALLCHAR is useful if you are importing a table with identifier fields auto-
matically assigned the Numeric data type by Analytics when in fact they
should use the Character data type.
Examples
The examples below assume that you have monthly transaction data stored in 12 delimited files:
l Transactions_Jan.csv to Transactions_Dec.csv
Note
A separate Analytics table is created for each delimited file that you import.
Import all the delimited files from the specified folder, and save the Analytics
tables to another folder
This example is the same as the one above, but instead of saving the Analytics tables in the root project
folder, you want to save them in the C:\Point of sale audit\Data\Transaction working
data folder.
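A sketch of the import, assuming an illustrative source folder and comma-delimited files with field names on the first line; 0 is the character-set code for non-Unicode (ASCII) data:

IMPORT MULTIDELIMITED TO "C:\Point of sale audit\Data\Transaction working data" FROM "C:\Point of sale audit\Data\Transaction source files" 0 SEPARATOR "," QUALIFIER '"' STARTLINE 1 KEEPTITLE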
Remarks
Consistent file structure required
To import a group of delimited files cleanly using IMPORT MULTIDELIMITED, the structure of all the files in
the group must be consistent.
You can import inconsistently structured delimited files, and subsequently perform data cleansing and stand-
ardizing in Analytics. However, this approach can be labor intensive. In many cases, it is easier to make the
delimited files consistent before importing.
To import multiple delimited files cleanly, the following items need to be consistent across all files:
Item: The character set and encoding of the source data
ACLScript keyword: numeric code (Unicode edition of Analytics only)
Problem: Source delimited files use different character encodings. For example, some files have ASCII encoding and some files have Unicode encoding.
Solution: Group source files by encoding type, and do a separate import for each group.

Item: Delimiter character
ACLScript keyword: SEPARATOR
Problem: Source delimited files use a different separator character (delimiter) between fields.
Solution: Do one of the following:
o Standardize the separator character in the source files before importing them.
o Group source files by separator character, and do a separate import for each group.

Item: Text qualifier character
ACLScript keyword: QUALIFIER
Problem: Source delimited files use a different text qualifier character to wrap and identify field values.
Solution: Do one of the following:
o Standardize the qualifier character in the source files before importing them.
o Group source files by qualifier character, and do a separate import for each group.

Item: Start line of the data
ACLScript keyword: STARTLINE
Problem: Source delimited files have different start lines for the data.
Solution: Do one of the following:
o Standardize the start line in the source files before importing them.
o Group source files that have the same start line, and do a separate import for each group.

Item: Field names
ACLScript keyword: KEEPTITLE
Problem: Source delimited files have field names on different line numbers.
Solution: Do one of the following:
o Standardize the line number with the field names in the source files before importing them.
o Group source files that have field names on the same line number, and do a separate import for each group.

Item: Field names
ACLScript keyword: KEEPTITLE
Problem: Some source delimited files have field names and some do not.
Solution: Do one of the following:
o Add field names to the source files that require them before importing all files.
o Group source files that have field names, and files that do not have field names, and do a separate import for each group.
o Omit KEEPTITLE to import all files using generic field names. Once the files have been imported to Analytics tables, you can use the "EXTRACT command" on page 197 to extract only the data you want from any table.
IMPORT MULTIEXCEL command
Syntax
IMPORT MULTIEXCEL <TO import_folder> FROM {source_filename|source_folder} TABLE input_
worksheets_or_named_ranges <PREFIX> <KEEPTITLE> <CHARMAX max_field_length>
Note
You must specify the IMPORT MULTIEXCEL parameters in exactly the same order as
above, and in the table below.
Parameters
Name Description
TO import_folder The folder to save the new Analytics tables in, specified as a quoted string.
optional
If you omit TO, the data is imported to the folder containing the Analytics project.
FROM source_filename | The name of the source data file or files, or the folder containing the source data file or
source_folder files.
Specify source_filename or source_folder as a quoted string.
Example
FROM "Transactions_FY18.xlsx"
Example
FROM "Transactions_FY*.xlsx"
selects:
Transactions_FY18.xlsx
Transactions_FY17.xlsx
You can use a wildcard in more than one location in a file name, and in a file exten-
sion.
Example
FROM "Transactions_FY*.*"
selects:
Transactions_FY18.xlsx
Transactions_FY17.xls
Source data file or files not in the root Analytics project folder
If the source data file or files are not located in the same folder as the Analytics project,
you must use an absolute file path, or a file path relative to the folder containing the pro-
ject, to specify the file location.
Note
When you specify a folder, any worksheet in any Excel file in the folder is
imported if the worksheet name matches the TABLE value.
TABLE input_worksheets_ The name of the worksheets or named ranges to import. A separate Analytics table is cre-
or_named_ranges ated for each imported worksheet or named range.
Specify input_worksheets_or_named_ranges as a quoted string.
Use a wildcard (*) in place of unique characters in the names of worksheets or ranges.
For example, "Trans_*$" selects the following worksheets:
o Trans_Jan
o Trans_Feb
o Trans_Mar
o and so on
Note
The wildcard character (*) stands for zero (0) or more occurrences of any
letter, number, or special character.
You can use a wildcard in more than one location. For example, *Trans*$
selects:
l Trans_Jan
l Jan_Trans
PREFIX Prepend the Excel file name to the name of the Analytics tables.
optional Tip
If worksheets in different files have the same name, prepending the Excel
file name allows you to avoid table name conflicts.
KEEPTITLE Treat the first row of data as field names instead of data. If omitted, generic field names
are used.
optional
Note
All first rows in the worksheets and named ranges that you import should
use a consistent approach. First rows should be either field names, or
data, across all data sets. Avoid mixing the two approaches in a single
import operation.
If the data sets have an inconsistent approach to first rows, use two sep-
arate import operations.
CHARMAX max_field_ The maximum length in characters for any field in an Analytics table that originates as
length character data in a source data file.
optional
Examples
The examples below assume that you have monthly transaction data for three years stored in three Excel
files:
l Transactions_FY18.xlsx
l Transactions_FY17.xlsx
l Transactions_FY16.xlsx
Each Excel file has 12 worksheets – one for each month of the year. The worksheets also include some
named ranges identifying various subsets of transactions.
Note
A separate Analytics table is created for each worksheet or named range that you import.
Import worksheets
Import all FY18 worksheets
You want to import all 12 monthly worksheets from the FY18 Excel file, and ignore any named ranges.
l you use the wildcard symbol (*) where the month occurs in each worksheet name
l you include the dollar sign ($) at the end of the worksheet name so that only worksheets are selec-
ted, and no named ranges
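A sketch of the import, assuming the monthly worksheets follow an illustrative Trans_<month> naming pattern:

IMPORT MULTIEXCEL FROM "Transactions_FY18.xlsx" TABLE "Trans_*$"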
Import all FY18 worksheets, keep field names, and specify maximum character field length
This example is the same as the one above, but you want to keep the field names from the Excel files, and specify a maximum length for character fields:
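A sketch of the same import with field names kept and character fields capped at an illustrative 50 characters:

IMPORT MULTIEXCEL FROM "Transactions_FY18.xlsx" TABLE "Trans_*$" KEEPTITLE CHARMAX 50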
Import all worksheets from all Excel files in the specified folder
You want to import all worksheets from all Excel files in the C:\Point of sale
audit\Data\Transaction master files folder.
l with TABLE, you use only the wildcard symbol (*) so that all worksheets in each file are selected,
and the dollar sign ($) so that only worksheets are selected, and no named ranges
l as a way of reducing the chance of naming conflicts, you use PREFIX to prepend the name of the
source Excel file to each Analytics table name
IMPORT MULTIEXCEL FROM "C:\Point of sale audit\Data\Transaction master files" TABLE "*$"
PREFIX
Import all worksheets from all Excel files in the specified folder, and save
the Analytics tables to another folder
This example is the same as the one above, but instead of saving the Analytics tables in the root project
folder, you want to save them in the C:\Point of sale audit\Data\Transaction working
data folder.
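A sketch that adds the output folder from the description above to the previous command:

IMPORT MULTIEXCEL TO "C:\Point of sale audit\Data\Transaction working data" FROM "C:\Point of sale audit\Data\Transaction master files" TABLE "*$" PREFIX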
Remarks
Multiple IMPORT EXCEL commands
The IMPORT MULTIEXCEL command actually performs multiple individual IMPORT EXCEL commands –
one for each worksheet imported. If you double-click the IMPORT MULTIEXCEL entry in the log, the indi-
vidual IMPORT EXCEL commands are displayed in the display area.
For information about combining multiple Analytics tables, see "APPEND command" on page 72.
IMPORT ODBC command
Syntax
IMPORT ODBC SOURCE source_name TABLE table_name <QUALIFIER data_qualifier>
<OWNER user_name> <USERID user_id> <PASSWORD num> <WHERE where_clause>
<TO table_name> <WIDTH max_field_length> <MAXIMUM max_field_length> <FIELDS field
<,...n>>
Parameters
Name Description
SOURCE source_name The data source name (DSN) of the ODBC data source to connect to. The DSN must
already exist and be correctly configured.
Note
You are limited to data sources that use the Windows ODBC drivers that
are installed on your computer. The Analytics native data connectors that
can be used with the ACCESSDATA command may not be available
with IMPORT ODBC.
TABLE table_name The table name in the ODBC data source to import data from.
table_name usually refers to a database table in the source data, but it can refer to any-
thing Analytics imports as a table. For example, if you use the Microsoft Text Driver,
table_name refers to the text file you want to import data from.
QUALIFIER data_qualifier The character to use as the text qualifier to wrap and identify field values. You must spe-
cify the character as a quoted string.
optional
Use single quotation marks to specify the double quotation character: '"'.
OWNER user_name The name of the database user account that owns the table you are connecting to.
optional
WHERE where_clause An SQL WHERE clause that limits the records returned based on a criteria you specify.
Must be a valid SQL statement and must be entered as a quoted string:
optional
WIDTH max_field_length The maximum length in characters for any field in the Analytics table that originates as
character data in the source from which you are importing.
optional
You can enter any value between 1 and 254. The default value is 50. Data that exceeds
the maximum field length is truncated when imported to Analytics.
MAXIMUM max_field_ The maximum length in characters for text, note, or memo fields you are importing.
length
You can enter any value between 1 and 1100. The default value is 100. Data that
optional exceeds the maximum field length is truncated when imported to Analytics.
FIELDS field <,...n> Individual fields in the source data to import. Specify the name.
optional If you specify multiple fields, each field must be separated by a comma. If you omit
FIELDS, all fields are imported.
Enclosing the field names in quotation marks makes them case-sensitive. If you use quo-
tation marks, the case of field names must exactly match between FIELDS and the
ODBC data source. If you use quotation marks and the case of the field names does not
match, the fields are not imported.
Note
FIELDS must be positioned last among the IMPORT ODBC parameters.
If FIELDS is not positioned last, the command fails.
Examples
Importing data from SQL Server
You import data from a SQL Server database to an Analytics table named Trans_Dec11:
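A sketch of the import, assuming an illustrative DSN, source table, WHERE clause, and field list:

IMPORT ODBC SOURCE "SQLServer_Sales" TABLE "Transactions" QUALIFIER '"' PASSWORD 1 WHERE "TRANS_DATE BETWEEN '2011-12-01' AND '2011-12-31'" TO "Trans_Dec11" WIDTH 100 MAXIMUM 200 FIELDS "CARDNUM","AMOUNT","TRANS_DATE"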
Remarks
Older method of connecting to ODBC data sources
The IMPORT ODBC command is the older method of connecting to ODBC-compliant data sources from
Analytics. The new method of connecting to ODBC data sources uses the Data Access window and the
ACCESSDATA command.
You can continue to use IMPORT ODBC in Analytics. However, this method of connecting is now available
only in scripts and from the Analytics command line. You can no longer access this connection method in
the Data Definition Wizard.
IMPORT PDF command
Syntax
IMPORT PDF TO table <PASSWORD num> import_filename FROM source_filename <SERVER profile_name> skip_length <PARSER "VPDF"> <PAGES page_range> {[record_syntax] [field_syntax] <...n>} <...n>
record_syntax ::=
RECORD record_name record_type lines_in_record transparent [test_syntax] <...n>
test_syntax ::=
TEST include_exclude match_type AT start_line,start_position,range logic text
field_syntax ::=
FIELD name type AT start_line,start_position SIZE length,lines_in_field DEC value WID bytes PIC
format AS display_name
Parameters
General parameters
Name Description
TO table The name of the Analytics table to import the data into.
Note
Table names are limited to 64 alphanumeric characters. The name can
include the underscore character ( _ ), but no other special characters, or
any spaces. The name cannot start with a number.
PASSWORD num The password definition to use.
num is the number of the password definition. For example, if two passwords have been
previously supplied or set in a script, or when scheduling an analytic, PASSWORD 2 spe-
cifies that password #2 is used.
For more information about supplying or setting passwords, see:
l "PASSWORD command" on page 350
l "SET command" on page 408
l PASSWORD analytic tag
FROM source_filename The name of the source data file. source_filename must be a quoted string.
If the source data file is not located in the same directory as the Analytics project, you
must use an absolute path or a relative path to specify the file location:
o "C:\data\source_filename"
o "data\source_filename"
SERVER profile_name The profile name for the server that contains the data that you want to import.
optional
PARSER "VPDF" Use the VeryPDF parser to parse the PDF file during the file definition process.
optional If you omit PARSER, the default Xpdf parser is used.
If you are importing the PDF file for the first time, and you have no reason to do otherwise,
use the default Xpdf parser. If you have already encountered data alignment issues when
using Xpdf, use the VeryPDF parser to see if the parsing results are better.
PAGES page_range The pages to include if you do not want to import all of the pages in the PDF file. page_
range must be specified as a quoted string.
optional
You can specify:
o individual pages separated by commas (1,3,5)
o page ranges (2-7)
o a combination of pages and ranges (1, 3, 5-7, 11)
If you omit PAGES, all pages in the PDF file are imported.
RECORD parameter
General record definition information.
Note
Some of the record definition information is specified using numeric codes that map to
options in the Data Definition Wizard.
In scripts, specify the numeric code, not the option name.
Name Description
RECORD record_name The name of the record in the Data Definition Wizard.
Specifying record_name is required in the IMPORT PDF command, but the record_name
value does not appear in the resulting Analytics table.
In the Data Definition Wizard, Analytics provides default names based on the type of
record:
o Detail
o Headern
o Footern
You can use the default names, or specify different names.
transparent Whether a header record is transparent:
o 0 – not transparent
o 1 – transparent
Transparent header records do not split multiline detail records.
If a header record splits a multiline detail record in the source PDF file, which can happen
at a page break, specifying 1 (transparent) unifies the detail record in the resulting Ana-
lytics table.
TEST parameter
The criteria for defining a set of records in the PDF file. You can have one or more occurrences of TEST
(up to 8) for each occurrence of RECORD.
Note
Some of the criteria are specified using numeric codes that map to options in the Data
Definition Wizard (option names are shown in parentheses below).
In scripts, specify the numeric code, not the option name.
Name Description
AT start_line, start_pos- o start_line – the line of a record that the criteria apply to
ition, range
For example, if you create a custom map to match zip codes, and the zip codes
appear on the third line of a three-line address record, you must specify 3 in start_
line.
Note
For single-line records, the start_line value is always 1.
o start_position – the starting byte position in the PDF file for the comparison against
the criteria
o range – the number of bytes from the starting byte position in the PDF file to use in
the comparison against the criteria
If you are using starting byte position only, without a range, specify 0 for range.
FIELD parameters
Field definition information.
Name Description
FIELD name type The individual fields to import from the source data file, including the name and data type
of the field. To exclude a field from being imported, do not specify it.
For information about type, see "Identifiers for field data types" on page 287.
AT start_line, start_position o start_line – the start line of the field in the record in the PDF file
For multiline records in a PDF file, start_line allows you to start a field at any line of the
record. start_line is always 1 if lines_in_record is 1.
o start_position – the starting byte position of the field in the PDF file
SIZE length, lines_in_field o length – the length in bytes of the field in the Analytics table layout
AS display_name The display name (alternate column title) for the field in the view in the new Analytics
table.
Specify display_name as a quoted string. Use a semi-colon (;) between words if you want a line break in the column title.
Examples
Importing data from a specific page of a PDF file
You import data from page 1 of the password-protected PDF file, Vendors.pdf.
One set of detail records, with three fields, is created in the resulting Analytics table, Vendor_List:
Remarks
Note
For more information about how this command works, see the Analytics Help.
For example, if you are defining a Last Name field, which uses a character data type, you would specify
"C": FIELD "Last_Name" C.
Note
When you use the Data Definition Wizard to define a table that includes EBCDIC,
Unicode, or ASCII fields, the fields are automatically assigned the letter "C" (for the
CHARACTER type).
When you enter an IMPORT statement manually, or edit an existing IMPORT statement,
you can substitute the more specific letters "E" or "U" for EBCDIC or Unicode fields.
A ACL
B BINARY
C CHARACTER
D DATETIME
E EBCDIC
F FLOAT
G ACCPAC
I IBMFLOAT
K UNSIGNED
L LOGICAL
N PRINT
P PACKED
Q BASIC
R MICRO
S CUSTOM
T PCASCII
U UNICODE
V VAXFLOAT
X NUMERIC
Y UNISYS
Z ZONED
IMPORT PRINT command
Syntax
IMPORT PRINT TO table import_filename FROM source_filename <SERVER profile_name> character_set_value <code_page_number> {[record_syntax] [field_syntax] <...n>} <...n>
record_syntax ::=
RECORD record_name record_type lines_in_record transparent [test_syntax] <...n>
test_syntax ::=
TEST include_exclude match_type AT start_line,start_position,range logic text
field_syntax ::=
FIELD name type AT start_line,start_position SIZE length,lines_in_field DEC value WID bytes PIC
format AS display_name
Parameters
General parameters
Name Description
TO table The name of the Analytics table to import the data into.
Note
Table names are limited to 64 alphanumeric characters. The name can
include the underscore character ( _ ), but no other special characters, or
any spaces. The name cannot start with a number.
import_filename The name of the Analytics data file to create.
Specify import_filename as a quoted string with a .FIL file extension. For example: "Invoices.FIL".
By default, the data file (.FIL) is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the data file to a different, existing folder:
o "C:\data\Invoices.FIL"
o "data\Invoices.FIL"
FROM source_filename The name of the source data file. source_filename must be a quoted string.
If the source data file is not located in the same directory as the Analytics project, you
must use an absolute path or a relative path to specify the file location:
o "C:\data\source_filename"
o "data\source_filename"
SERVER profile_name The profile name for the server that contains the data that you want to import.
optional
character_set_value The character set used to encode the Print Image (Report) file. The following values are
supported:
o 0 – ASCII
o 1 – EBCDIC
o 2 – Unicode
o 3 – Encoded text
code_page_number If you specified 3 (Encoded text) for character_set_value, you must also enter a code
page number.
optional
RECORD parameter
General record definition information.
Note
Some of the record definition information is specified using numeric codes that map to
options in the Data Definition Wizard.
In scripts, specify the numeric code, not the option name.
Name Description
RECORD record_name The name of the record in the Data Definition Wizard.
Specifying record_name is required in the IMPORT PRINT command, but the record_
name value does not appear in the resulting Analytics table.
In the Data Definition Wizard, Analytics provides default names based on the type of
record:
o Detail
o Headern
o Footern
You can use the default names, or specify different names.
record_type The three possible record types when defining a Print Image file:
o 0 – detail
o 1 – header
o 2 – footer
Note
You can define multiple sets of header and footer records in a single exe-
cution of IMPORT PRINT, but only one set of detail records.
lines_in_record The number of lines occupied by a record in the Print Image file.
You can define single-line or multiline records to match the data in the file.
TEST parameter
The criteria for defining a set of records in the Print Image file. You can have one or more occurrences of
TEST (up to 8) for each occurrence of RECORD.
Note
Some of the criteria are specified using numeric codes that map to options in the Data
Definition Wizard (option names are shown in parentheses below).
In scripts, specify the numeric code, not the option name.
Name Description
o 4 – (Blank) matching records must contain one or more blank spaces, in the specified
start line, at the specified start position, or in all positions of the specified range
o 5 – (Non-Blank) matching records must contain one or more non-blank characters
(includes special characters), in the specified start line, at the specified start position,
or in all positions of the specified range
o 7 – (Find in Line) matching records must contain the specified character, or string of
characters, anywhere in the specified start line
o 8 – (Find in Range) matching records must contain the specified character, or string
of characters, in the specified start line, anywhere in the specified range
o 10 – (Custom Map) matching records must contain characters that match the spe-
cified character pattern, in the specified start line, starting at the specified position
AT start_line, start_position, range o start_line – the line of a record that the criteria apply to
For example, if you create a custom map to match zip codes, and the zip codes
appear on the third line of a three-line address record, you must specify 3 in start_
line.
Note
For single-line records, the start_line value is always 1.
o start_position – the starting byte position in the Print Image file for the comparison
against the criteria
o range – the number of bytes from the starting byte position in the Print Image file to
use in the comparison against the criteria
If you are using starting byte position only, without a range, specify 0 for range.
FIELD parameters
Field definition information.
Name Description
FIELD name type The individual fields to import from the source data file, including the name and data
type of the field. To exclude a field from being imported, do not specify it.
For information about type, see "Identifiers for field data types" on page 296.
AT start_line, start_position o start_line – the start line of the field in the record in the Print Image file
For multiline records in a Print Image file, start_line allows you to start a field at any
line of the record. start_line is always 1 if lines_in_record is 1.
o start_position – the starting byte position of the field in the Print Image file
SIZE length, lines_in_field o length – the length in bytes of the field in the Analytics table layout
o lines_in_field – the number of lines occupied by the field in the record
Note
The number of lines specified for a field cannot exceed the number of
lines specified for the record containing the field.
AS display_name The display name (alternate column title) for the field in the view in the new Analytics
table.
Specify display_name as a quoted string. Use a semi-colon (;) between words if you
want a line break in the column title.
AS is required when you are defining FIELD. To make the display name the same as
the field name, enter a blank display_name value using the following syntax: AS "".
Make sure there is no space between the two double quotation marks.
Examples
Importing data from a Print Image (Report) file
You import data from the Print Image (Report) file, Report.txt.
One header record, and one set of detail records, with five fields, is created in the resulting Analytics table,
Inventory_report:
RECORD "Detail" 0 1 0 TEST 0 0 AT 1,59,59 7 "." FIELD "Field_3" X AT 1,6 SIZE 9,1 DEC 0 WID 9
PIC "" AS "Item ID" FIELD "Field_4" C AT 1,16 SIZE 24,1 DEC 0 WID 24 PIC "" AS "Item Desc."
FIELD "Field_5" N AT 1,40 SIZE 10,1 DEC 0 WID 10 PIC "" AS "On Hand" FIELD "Field_6" N AT
1,50 SIZE 12,1 DEC 2 WID 12 PIC "" AS "Cost" FIELD "Field_7" N AT 1,62 SIZE 12,1 DEC 2 WID 12
PIC "" AS "Total"
Remarks
Note
For more information about how this command works, see the Analytics Help.
Identifiers for field data types
Letter Data type
A ACL
B BINARY
C CHARACTER
D DATETIME
E EBCDIC
F FLOAT
G ACCPAC
I IBMFLOAT
K UNSIGNED
L LOGICAL
N PRINT
P PACKED
Q BASIC
R MICRO
S CUSTOM
T PCASCII
U UNICODE
V VAXFLOAT
X NUMERIC
Y UNISYS
Z ZONED
Syntax
IMPORT SAP PASSWORD num TO table_name SAP SOURCE "SAP AGENT" import_details
Parameters
Name Description
TO table_name The name of the Analytics table to import the data into.
Note
Table names are limited to 64 alphanumeric characters. The name can
include the underscore character ( _ ), but no other special characters, or
any spaces. The name cannot start with a number.
SAP SOURCE "SAP AGENT" Required for importing SAP data. "SAP AGENT" is the only available option.
import_details The details of the query. Must be enclosed by the <q></q> tags, and uses the tags listed
in "Direct Link query tags" on page 301 to define the query.
Examples
Performing a multi-table query
This example performs a multi-table query using the IMPORT SAP command.
Correct order and nesting of the tags is necessary to create a valid query string. The tags in the example are
ordered and nested correctly. Use this example to determine the required order and nesting of
IMPORT SAP query tags.
Note
To assist readability, this example is formatted using multiple lines. In your script, the com-
mand and the query string must be entered without any line breaks.
Tip
The syntax of an IMPORT SAP query string is typically complex. The best way to add
IMPORT SAP commands with query strings to your scripts is to copy an existing IMPORT
SAP command from the Log tab in Analytics, then edit the query tags as necessary.
<a>T00001</a>
<td>Purchasing Document Header</td>
<fs>
<f>EBELN</f>
<f>BUKRS</f>
<f>BSTYP</f>
<f>BSART</f>
<f>STATU</f>
<f>WKURS</f>
</fs>
<wc>
<w>
<f>BUKRS</f>
<o>0</o>
<l>1000</l>
<h></h>
</w>
</wc>
</t>
<t>
<n>EKPO</n>
<a>T00002</a>
<td>Purchasing Document Item</td>
<fs>
<f>EBELP</f>
<f>WERKS</f>
<f>MENGE</f>
<f>BRTWR</f>
</fs>
<wc></wc>
</t>
</ts>
<js>
<jc>
<pt>
<pa>T00001</pa>
<pf>EBELN</pf>
</pt>
<ct>
<ca>T00002</ca>
<cf>EBELN</cf>
</ct>
</jc>
</js>
</q>
Remarks
The IMPORT SAP command is only supported if Direct Link is installed and configured.
The table in "Direct Link query tags" below lists the tags that can be included in the import_details para-
meter. The Required column uses the following values to indicate when tags must be present:
l Y – Required
l N – Optional
l M – Required for multi-table queries only
l B – Required, but no value should be passed
l W – Optional when filters are used
l S – Required when scheduled mode is specified
Direct Link query tags
Name Tag Required Description
Table Alias <a> M The alias that uniquely identifies the table within the query. This allows
the same table to be used more than once.
The maximum length is 6 characters.
All Rows <ar> Y Indicates that all matching rows should be returned as part of the
query's result set.
Valid values are:
1 – Overrides the number of records specified in the <r> tag (Maximum
Rows)
0 – Returns the number of records specified in the <r> tag (Maximum
Rows)
This tag always appears after the <r></r> tag.
Child Table Field <cf> M The field in the child table that the join condition is based on.
Client Filename <cf> Y Identifies the target file on the client system where the results of the
query will be stored.
Destination <d> N Identifies a destination in the SAP RFC library file (sapnwrfc.ini) that
is used to locate an SAP system.
Data Length <dl> B The number of characters in each row, including carriage return and
line feed characters indicating the end of the record (CR+LF, or the
hexadecimal characters 0D+0A).
Date <dt> S Required when using scheduled mode. Specifies the time to run the
SAP job.
Must be formatted as YYYYMMDD. For example, December 31, 2014
must be specified as 20141231.
Expected Rows <e> B The expected number of rows the query will return.
Filter Field <f> W The native field name that the filter applies to.
Fields <fs> Y The list of fields in the table that will be returned as part of the query res-
ults.
High Value <h> W Contains the high value when using the Between operator. Ignored
when using any other operator.
Job Count <jcount> B Used internally by SAP to identify a Background mode query.
Job Name <jname> B Used internally by SAP to identify a Background mode query.
Join Relationships <js> Y The list of join conditions that link tables within the query.
Join Switch <jw> N Numeric equivalent of the join switch enumerated type.
Valid values are:
0 – Inner Join
1 – Left Outer Join
Low Value <l> W Contains either the lowest value when using the Between operator or
the value when using any other operator.
Language <lg> Y Language identifier used to determine the locale of fields in the SAP
database.
Parent Table Field <pf> M The field in the parent table the join condition is based on.
Maximum Rows <r> Y The maximum number of rows the query should return.
Selected <s> Y If the <s> tag appears below the <f> tag, it indicates whether the field
will be returned as part of the query's result set.
System <s> Y If the <s> tag appears below the <q> tag, it identifies the type of system
this query is used against (currently only SAP is supported).
Server Filename <sf> B Identifies the file on the server that holds the results of a Background
mode query.
Server Group <sg> N The name of the server group. Maximum 20 characters.
Table Description <td> Y The table description from the SAP data dictionary. It should always
appear below the <a> tag.
Time <tm> S Required when using scheduled mode. Specifies the time to run the
SAP job.
Must be formatted as hhmmss. For example, 2:30 pm must be specified
as 143000.
Tables <ts> Y The list of tables from which the query will extract data.
Filters <wc> W The list of filters that will be applied to the data contained within the
table.
Filter Switch <ws> N Numeric equivalent of the filter switch enumerated type.
Valid values are:
0 – (Or) And (Or)
1 – (And) Or (And)
Syntax
IMPORT XBRL TO table import_filename FROM source_filename CONTEXT context_name <...n>
[field_syntax] <...n> <IGNORE field_num> <...n>
field_syntax ::=
FIELD name type AT start_position DEC value WID bytes PIC format AS display_name
Parameters
Name Description
TO table The name of the Analytics table to import the data into.
Note
Table names are limited to 64 alphanumeric characters. The name can
include the underscore character ( _ ), but no other special characters, or
any spaces. The name cannot start with a number.
FROM source_filename The name of the source data file. source_filename must be a quoted string.
If the source data file is not located in the same directory as the Analytics project, you
must use an absolute path or a relative path to specify the file location:
o "C:\data\source_filename"
o "data\source_filename"
CONTEXT context_name The XBRL context to define the table from. If you specify more than one context, all con-
texts must be of the same type (instant, period, or forever).
FIELD name type The individual fields to import from the source data file, including the name and data type
of the field. To exclude a field from being imported, do not specify it.
For information about type, see "Identifiers for field data types" on the facing page.
AT start_position The starting byte position of the field in the Analytics data file.
WID bytes The length in bytes of the field in the Analytics table layout.
AS display_name The display name (alternate column title) for the field in the view in the new Analytics
table.
Specify display_name as a quoted string. Use a semi-colon (;) between words if you want
a line break in the column title.
AS is required when you are defining FIELD. To make the display name the same as the
field name, enter a blank display_name value using the following syntax: AS "". Make
sure there is no space between the two double quotation marks.
Examples
Importing an XBRL file to an Analytics table
You import data from the Current_AsOf context in an XBRL file to an Analytics table called Financials:
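A minimal sketch of such a command, based on the syntax above. The source file name, field names, starting positions, and widths are illustrative assumptions:
IMPORT XBRL TO Financials "Financials.fil" FROM "FinancialStatements.xml" CONTEXT "Current_AsOf" FIELD "Item" C AT 1 DEC 0 WID 60 PIC "" AS "" FIELD "Amount" N AT 61 DEC 2 WID 14 PIC "" AS ""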
Remarks
Note
For more information about how this command works, see the Analytics Help.
Identifiers for field data types
Letter Data type
A ACL
B BINARY
C CHARACTER
D DATETIME
E EBCDIC
F FLOAT
G ACCPAC
I IBMFLOAT
K UNSIGNED
L LOGICAL
N PRINT
P PACKED
Q BASIC
R MICRO
S CUSTOM
T PCASCII
U UNICODE
V VAXFLOAT
X NUMERIC
Y UNISYS
Z ZONED
Syntax
IMPORT XML TO table import_filename FROM source_filename [field_syntax] <...n>
field_syntax ::=
FIELD name type AT start_position DEC value WID bytes PIC format AS display_name RULE xpath_
expression
Parameters
Name Description
TO table The name of the Analytics table to import the data into.
Note
Table names are limited to 64 alphanumeric characters. The name can
include the underscore character ( _ ), but no other special characters, or
any spaces. The name cannot start with a number.
FROM source_filename The name of the source data file. source_filename must be a quoted string.
If the source data file is not located in the same directory as the Analytics project, you
must use an absolute path or a relative path to specify the file location:
o "C:\data\source_filename"
o "data\source_filename"
FIELD name type The individual fields to import from the source data file, including the name and data type
of the field. To exclude a field from being imported, do not specify it.
For information about type, see "Identifiers for field data types" on page 311.
AT start_position The starting byte position of the field in the Analytics data file.
WID bytes The length in bytes of the field in the Analytics table layout.
AS display_name The display name (alternate column title) for the field in the view in the new Analytics
table.
Specify display_name as a quoted string. Use a semi-colon (;) between words if you want
a line break in the column title.
AS is required when you are defining FIELD. To make the display name the same as the
field name, enter a blank display_name value using the following syntax: AS "". Make
sure there is no space between the two double quotation marks.
RULE xpath_expression The XPath expression used to select the field contents from the XML file.
XPath is a standard way of accessing data from XML files. For example, acct/title/text()
retrieves the text within the <title> tag in the XML file.
Examples
Importing data from an XML file to an Analytics table
You import data from an XML file to an Analytics table named Employees:
IMPORT XML TO Employees "Employees.fil" FROM "emp.XML" FIELD "Empno" C AT 1 DEC 0 WID 6
PIC "" AS "" RULE "/RECORDS/RECORD/Empno/text()" FIELD "First" C AT 7 DEC 0 WID 13 PIC ""
AS "" RULE "/RECORDS/RECORD/First/text()" FIELD "Last" C AT 20 DEC 0 WID 20 PIC "" AS ""
RULE "/RECORDS/RECORD/Last/text()" FIELD "HireDate" D AT 40 DEC 0 WID 10 PIC "YYYY-MM-
DD" AS "" RULE "/RECORDS/RECORD/HireDate/text()" FIELD "Salary" N AT 50 DEC 2 WID 8 PIC
"" AS "" RULE "/RECORDS/RECORD/Salary/text()"
Remarks
Note
For more information about how this command works, see the Analytics Help.
Identifiers for field data types
Letter Data type
A ACL
B BINARY
C CHARACTER
D DATETIME
E EBCDIC
F FLOAT
G ACCPAC
I IBMFLOAT
K UNSIGNED
L LOGICAL
N PRINT
P PACKED
Q BASIC
R MICRO
S CUSTOM
T PCASCII
U UNICODE
V VAXFLOAT
X NUMERIC
Y UNISYS
Z ZONED
INDEX command
Creates an index for an Analytics table that allows access to the records in a sequential order rather than a
physical order.
Syntax
INDEX <ON> {key_field <D> <...n>|ALL} TO file_name <IF test> <WHILE test> <FIRST range|NEXT
range> <OPEN> <ISOLOCALE locale_code>
Parameters
Name Description
ON key_field D <...n> | ALL The key field or fields, or the expression, to use for indexing.
You can index by any type of field, including computed fields and ad hoc expressions,
regardless of data type.
o key_field – use the specified field or fields
If you index by more than one field, you create nested indexing in the table. The order
of nesting follows the order in which you specify the fields.
Include D to index the key field in descending order. The default index order is ascend-
ing.
o ALL – use all fields in the table
If you index by all the fields in a table you create nested indexing. The order of nesting
follows the order in which the fields appear in the table layout.
An ascending index order is the only option for ALL.
TO file_name The name of the index and the associated index file. The index file is created with an .INX
extension.
Note
In the Analytics user interface, index names are limited to 64 alpha-
numeric characters. The name can include the underscore character ( _ ),
but no other special characters, or any spaces. The name cannot start
with a number.
IF test A conditional expression that must be true in order to process each record. The command
is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
OPEN Open the table and apply the index to the table.
optional
Examples
Note
For more information about how this command works, see the Analytics Help.
In the Vendor table, you create an index on the Vendor_City field and apply it to the table immediately:
OPEN Vendor
INDEX ON Vendor_City TO "CityIndex" OPEN
In the Vendor table, you create an index on the Vendor_City field. Later, you apply the index to the table:
OPEN Vendor
INDEX ON Vendor_City TO "CityIndex"
.
.
.
SET INDEX TO "CityIndex"
JOIN command
Combines fields from two Analytics tables into a new, single Analytics table.
Note
To use fuzzy matching to join tables, see "FUZZYJOIN command" on page 211.
Syntax
JOIN {PKEY primary_key_fields|PKEY ALL} {FIELDS primary_fields|FIELDS ALL} {SKEY sec-
ondary_key_fields|SKEY ALL} <WITH secondary_fields|WITH ALL> {no_
keyword|MANY|UNMATCHED|PRIMARY|SECONDARY|PRIMARY SECONDARY} <IF test> TO
table_name <LOCAL> <OPEN> <WHILE test> <FIRST range|NEXT range> <APPEND>
<PRESORT> <SECSORT> <ISOLOCALE locale_code>
Parameters
Name Description
PKEY primary_key_fields | The key field or fields, or expression, in the primary table.
PKEY ALL o primary_key_fields – use the specified field or fields
o ALL – use all fields in the table
FIELDS primary_fields | The fields or expressions from the primary table to include in the joined output table.
FIELDS ALL o primary_fields – include the specified field or fields
o ALL – include all fields from the table
Note
You must explicitly specify the primary key field or fields if you want to
include them in the joined table. Specifying ALL also includes them.
SKEY secondary_key_ The key field or fields, or expression, in the secondary table.
fields | SKEY ALL o secondary_key_fields – use the specified field or fields
o ALL – use all fields in the table
WITH secondary_fields | The fields or expressions from the secondary table to include in the joined output table.
WITH ALL o secondary_fields – include the specified field or fields
optional o ALL – include all fields from the table
Note
You must explicitly specify the secondary key field or fields if you want to
include them in the joined table. Specifying ALL also includes them.
You cannot specify WITH if you are using the UNMATCHED join type.
The join type determines which records from the primary and secondary tables are included in the joined output table:
no_keyword (default join)
o all matched primary records and the first matched secondary record
Output: Matched primary and secondary (1st secondary match)
MANY
o all matched primary records and all matched secondary records
o one record for each match between the primary and secondary tables
Output: Matched primary and secondary (all secondary matches)
UNMATCHED
o unmatched primary records only
Output: Unmatched primary
PRIMARY
o all primary records (matched and unmatched) and the first matched secondary record
Output: All primary and matched secondary
Note
The keyword BOTH is the same as specifying PRIMARY.
SECONDARY
o all secondary records (matched and unmatched) and all matched primary records
Only the first instance of any duplicate secondary matches is joined to a primary record.
Output: All secondary and matched primary
PRIMARY SECONDARY
o all primary and all secondary records, matched and unmatched
Only the first instance of any duplicate secondary matches is joined to a primary record.
Output: All primary and secondary
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
Note
For most join types, an IF condition applies only to the primary table.
The one exception is a many-to-many join, in which the IF condition can
also reference the secondary table.
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
WHILE test A conditional expression that must be true in order to process each record. The com-
mand is executed until the condition evaluates as false, or the end of the table is
optional
reached.
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled,
missing, or inaccurate data can result.
PRESORT Sorts the primary table on the primary key field before executing the command.
optional Note
You cannot use PRESORT inside the GROUP command.
SECSORT Sorts the secondary table on the secondary key field before executing the command.
optional Note
You cannot use SECSORT inside the GROUP command.
Examples
Join two tables as a way of discovering employees who may also be vendors
The example below joins the Empmast and Vendor tables using address as the common key field (the
Address and Vendor_Street fields).
The JOIN command creates a new table with matched primary and secondary records, which results in a list
of any employees and vendors with the same address.
This version of the JOIN command includes all fields from the primary and secondary tables in the joined out-
put table.
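A minimal sketch of this join. The output table name is an assumption, and PRESORT and SECSORT are included on the assumption that neither table is already sorted on its key field:
OPEN Empmast PRIMARY
OPEN Vendor SECONDARY
JOIN PKEY Address FIELDS ALL SKEY Vendor_Street WITH ALL TO "Employee_Vendor_Match.fil" OPEN PRESORT SECSORT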
A second example identifies accounts receivable records that have no matching record in the customer master table. The UNMATCHED join type outputs only the unmatched primary records:
OPEN Ar PRIMARY
OPEN Customer SECONDARY
JOIN PKEY CustNo FIELDS CustNo Due Amount SKEY CustNo UNMATCHED TO "CustomerNotFound.fil" OPEN PRESORT SECSORT
Remarks
Note
For more information about how this command works, see the Analytics Help.
LIST command
Outputs the data in one or more fields in an Analytics table to a display formatted in columns.
Syntax
LIST {FIELDS field_name <AS display_name> <...n>|FIELDS ALL} <LINE number field_list> <TO
{SCREEN|filename|PRINT}> <UNFORMATTED> <IF test> <WHILE test> <FIRST range|NEXT
range> <HEADER header_text> <FOOTER footer_text> <SKIP lines> <EOF> <APPEND>
Parameters
Name Description
LINE number field_list More than one line is used in the output for each record:
optional o number – the line number, must be between 2 and 60 inclusive
o field_list – the fields to include on that line
UNFORMATTED The output is displayed as unformatted text. Output is identical to that created by the
EXPORT ASCII command. Unformatted data can be output to a file for further pro-
optional
cessing by other software programs.
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The com-
mand is executed until the condition evaluates as false, or the end of the table is
optional
reached.
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
SKIP lines Inserts the specified number of blank lines between each record in the list. For example,
LIST ALL SKIP 1 produces a double spaced list (one blank line between each record).
optional
EOF Execute the command one more time after the end of the file has been reached.
optional This ensures that the final record in the table is processed when inside a GROUP com-
mand. Only use EOF if all fields are computed fields referring to earlier records.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional
Examples
Listing exceptions and saving to a text file
You use LIST to create a report listing exceptions identified in an inventory table. The report is saved as a
text file:
LIST Product_number Description Quantity Unit_cost Value IF Quantity < 0 OR Unit_cost < 0
HEADER "Negative Values" TO "Exceptions.txt"
Remarks
When to use LIST
Use LIST to print data, display data on screen, or export it to a text file.
LOCATE command
Searches for the first record that matches the specified value or condition, or moves to the specified record
number.
Syntax
LOCATE {IF test <WHILE test> <FIRST range|NEXT range>|RECORD num}
Parameters
Name Description
IF test The value or condition to search for. You must enclose character literal values in quo-
tation marks, and datetime values in backquotes.
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
Examples
Locate the first record that matches a specified value
The following examples illustrate using LOCATE to find the first occurrence of a specific value in a table:
LOCATE IF Vendor_Name = "United Equipment" AND Invoice_Amount > 1000 AND Invoice_Date >
`20140930`
The following example moves directly to record number 50 in the table:
LOCATE RECORD 50
Remarks
Note
For more information about how this command works, see the Analytics Help.
How it works
Use the LOCATE command to move directly to the first record in a table matching the specified value or
condition.
If the specified value or condition is found, the first matching record in the table is selected. If the specified
value or condition is not found, the table is positioned at the first record.
You can also use LOCATE to move directly to a specific record number.
For more information about SET EXACT, see "SET command" on page 408.
For more information about the Exact Character Comparisons option, see Table tab (Options dialog box).
LOOP command
Executes a series of ACLScript commands repeatedly on a record while a specified condition evaluates to
true.
Note
The LOOP command must be enclosed inside the GROUP command.
Syntax
LOOP WHILE test
command
<...n>
END
Parameters
Name Description
WHILE test The test that must evaluate to true for the commands inside the LOOP command to be
executed. If the test evaluates to true the commands are executed repeatedly until the
test evaluates to false.
Examples
Splitting a comma-delimited field
You have a table containing invoice data and you need to isolate specific information for invoice amounts
per department. One invoice may be related to more than one department, and department codes are
stored in comma-delimited format in the table.
To extract the invoice amounts per department, you:
1. Use a GROUP command to process the table record by record.
2. Calculate the number of departments (n) associated with each record.
3. Use the LOOP command to iterate n times over the record to extract data for each department asso-
ciated with the record.
COMMENT
use GROUP to count the commas in each department code field as a way of identifying how many
departments are associated with each record, then LOOP over the record once per department code,
extracting each code into its own record in the output table
END
GROUP
v_department_count = OCCURS(Department_Code,',')
v_counter = 0
LOOP WHILE v_counter <= v_department_count
v_dept = SPLIT(Department_Code, ',', (v_counter + 1))
EXTRACT FIELDS Invoice_Number, Invoice_Amount, v_dept AS "Department" TO result1
v_counter = v_counter + 1
END
END
Remarks
Tip
For a detailed tutorial covering the LOOP and GROUP commands, see "Grouping and loop-
ing" on page 33.
How it works
Each LOOP command must specify a WHILE condition to test, and be closed with an END statement. The
commands between LOOP and END are executed repeatedly for the current record as long as the specified
test is true.
If the test is initially false, the commands are not executed.
MERGE command
Combines records from two sorted Analytics tables with an identical structure into a new Analytics table
that uses the same sort order as the original tables.
Syntax
MERGE {ON key_fields|PKEY primary_key_fields SKEY secondary_key_fields} <IF test> TO table_
name <LOCAL> <OPEN> <WHILE test> <FIRST range|NEXT range> <APPEND> <PRESORT>
<ISOLOCALE locale_code>
Parameters
Name Description
ON key_fields | PKEY primary_key_fields SKEY secondary_key_fields The key field or fields, or the expression, to merge on.
o ON key_fields – use when the key field names are identical in both tables
o PKEY primary_key_fields SKEY secondary_key_fields – use when the key field names differ between the tables
Sorting requirement
The key fields in the primary and secondary tables must both be sorted in ascending
order. If one or both key fields are unsorted, or sorted in descending order, the MERGE
command fails.
You can use PRESORT to sort the primary key field. If the secondary key field is unsor-
ted, you must first sort it in a separate sort operation before performing the merge.
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
WHILE test A conditional expression that must be true in order to process each record. The com-
mand is executed until the condition evaluates as false, or the end of the table is
optional
reached.
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled,
missing, or inaccurate data can result.
PRESORT Sorts the primary table on the primary key field before executing the command.
optional Note
You cannot use PRESORT inside the GROUP command.
Omit PRESORT:
o If the primary key field is already sorted
o If you are merging two tables using an indexed common key field
Examples
Merge tables with identical key field names
The following example merges two tables with identical key field names:
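A minimal sketch, assuming two accounts receivable tables that share a key field named Customer_Number, with the secondary table already sorted on that field (the table and field names are illustrative):
OPEN AR_Prior PRIMARY
OPEN AR_Current SECONDARY
MERGE ON Customer_Number TO "AR_Combined" PRESORT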
The following example merges two tables with different key field names:
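A minimal sketch using PKEY and SKEY, assuming the primary key field is named Customer_Number and the secondary key field is named Cust_No (the table and field names are illustrative):
OPEN AR_Prior PRIMARY
OPEN AR_Current SECONDARY
MERGE PKEY Customer_Number SKEY Cust_No TO "AR_Combined" PRESORT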
Remarks
Note
For more information about how this command works, see the Analytics Help.
Alternatives to merging
Merging can be tricky to perform correctly. You can get the same result by appending, or by extracting and
appending, and then sorting.
For more information, see "APPEND command" on page 72, and "EXTRACT command" on page 197.
If the two source tables are already sorted, merging is more efficient and can execute more quickly.
NOTES command
Creates, modifies, or removes a note associated with an individual record in an Analytics table.
Syntax
NOTES <IF test> <TEXT note_text> <APPEND> <CLEAR>
Parameters
Name Description
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
o If you do not specify an IF test, the note text is added to each record in the table
o If you specify an IF test and CLEAR, the notes for those records that meet the con-
dition are deleted
TEXT note_text The text to add as a note. note_text must be either a string enclosed in quotation marks,
or a character expression.
optional
APPEND The note text is added to the end of any existing notes. If omitted, any existing notes are
overwritten.
optional
CLEAR Notes are deleted. Even if all record notes in a table are deleted, the auto-generated
RecordNote field is not deleted from the table layout.
optional
Examples
Adding the same note to multiple records
Any existing notes for the specified records are overwritten:
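A minimal sketch, assuming an Amount field and illustrative note text. The IF test selects the records that receive the note:
NOTES IF Amount < 0 TEXT "Negative amount: requires review"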
The following example deletes the notes for all records in a table:
NOTES CLEAR
Remarks
Deleting the RecordNote field
To delete the RecordNote field from the table layout, and all notes in the table, use the DELETE NOTES
command without any of the options specified.
NOTIFY command
Sends an email notification message.
Syntax
NOTIFY USER username <PASSWORD pwd> MAILBOX pathname ADDRESS recipient <CC cc_
recipient> <BCC bcc_recipient> <SUBJECT subject> MESSAGE message <ATTACHMENT path-
name>
Parameters
Name Description
MAILBOX pathname The SMTP server name to use to send the email message. For example:
MAILBOX "mailserver.example.com"
ADDRESS recipient The email address of one or more recipients. Separate multiple email addresses with a
comma.
Enter a maximum of 1020 characters.
CC cc_recipient The email address of one or more carbon copy recipients. Separate multiple email
addresses with a comma.
optional
Enter a maximum of 1000 characters.
BCC bcc_recipient The email address of one or more blind carbon copy recipients. Separate multiple email
addresses with a comma.
optional
MESSAGE message The body text of the email message. The message is plain text and does not support
HTML.
If you want to insert a line break in your message, use two caret characters: ^^.
ATTACHMENT pathname The path and filename of one or more attachments. Must be a quoted string.
optional Specify multiple attachments by entering a comma separated list of files for pathname:
ATTACHMENT "result1,result2"
Examples
Sending an error report email
You are running a script, and you want to send a notification email if the script fails. Using NOTIFY, you
define the email message and include two attachments, as sketched below:
l the log file
l a .fil file containing recorded errors
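A minimal sketch of such a command. The user name, mail server, recipient address, and attachment file names are illustrative assumptions:
NOTIFY USER "script_admin" MAILBOX "mailserver.example.com" ADDRESS "audit_team@example.com" SUBJECT "Script failure" MESSAGE "The script stopped with errors.^^See the attached log and error records." ATTACHMENT "ACL_Demo.log,Errors.fil"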
Remarks
Recipients and attachments
You can use the NOTIFY command to send email notification messages to one or more recipients. Mes-
sages can include attached data files and Analytics projects.
The NOTIFY command can be used to notify the appropriate personnel when a script fails unexpectedly.
Error handling
If Analytics is unable to connect with the mail server, it makes five additional attempts to connect, with a 10-
second pause between each attempt. If all connection attempts are unsuccessful, the NOTIFY command is
canceled, with a message written to the log, but the script continues processing.
You can use the SET command to change this default behavior. You can specify a different number of con-
nection attempts and a different amount of time between attempts, or you can turn off additional con-
nection attempts. You can also specify that Analytics stops processing a script if the NOTIFY command is
canceled. For more information, see "SET command" on page 408.
An invalid email recipient is not considered a failure of the NOTIFY command and does not cause a script
to stop regardless of the associated setting.
OPEN command
Opens an Analytics table and the associated data file.
Syntax
OPEN {table_name|data_file <FORMAT layout_name>} <BUFFERLENGTH length> <CRLF>
<DBASE> <INDEX index_file> <PRIMARY|SECONDARY> <SKIP bytes> <RELATION key_field>
Parameters
Name Description
data_file The data file to associate with the table specified by FORMAT layout_name.
Analytics assumes a file extension of .fil if no extension is specified. To open a file with no
extension, insert a period (.) at the end of the file name.
FORMAT layout_name The Analytics table layout to apply to the data file that you open as a table.
optional
BUFFERLENGTH n The length in bytes of the input buffer area to be allocated to the table. The default value
is 33,000 bytes.
optional
Larger buffer areas may improve processing speed at the expense of RAM available for
storing Analytics commands.
If any IBM variable length blocks are read which exceed the buffer length, Analytics dis-
plays an error message and stops processing. The default value is set in the Buffer Size
field in the Table tab in the Options dialog box.
You will seldom need to change BUFFERLENGTH n, because the default is sufficient to
handle almost all situations.
CRLF Specifies that a variable length ASCII file is to be read. Analytics automatically adjusts for
the varying record lengths.
optional
By default, files are assumed to be fixed-length files.
DBASE Specifies that the data source is a dBASE file. Analytics recognizes the type of dBASE file
and automatically creates a table from the file description. Can be omitted for dBASE files
optional
with a .dbf extension.
INDEX index_file The index file to apply to the table when it is opened.
optional The file extension for the index file name is assumed to be .inx when none is specified.
PRIMARY | SECONDARY Specifies that a table is opened as either a primary table or a secondary table. If omitted,
the table is opened as a primary table.
optional
SKIP bytes The number of bytes to bypass at the physical start of the table.
optional SKIP can be used to ignore table header records or leading portions of the table that do
not follow the layout of the remainder of the table. If omitted, the table is read starting at
the first byte.
RELATION key_field Specifies that the table is to be opened as an ad hoc related table. Analytics does not
retain this relation when the table is closed.
optional
You must also specify the INDEX parameter when you use RELATION. key_field is the
key field or expression used to create the relation between two tables.
Examples
Opening a table while specifying a table layout
You open the April_2012 table using the March_2012 table layout:
OPEN April_2012 FORMAT March_2012
Opening a table
You open the Inventory table:
OPEN Inventory
OUTLIERS command
Identifies statistical outliers in a numeric field. Outliers can be identified for the field as a whole, or for sep-
arate groups based on identical values in one or more character, numeric, or datetime key fields.
Syntax
OUTLIERS {AVERAGE|MEDIAN} {PKEY key_field <...n>|NOKEY} ON numeric_field <OTHER field
<...n>> NUMSTDEV number_of_std_devs <IF test> <TO {SCREEN|table_name}> <PRESORT>
<WHILE test> <FIRST range|NEXT range> <OPEN>
Note
You cannot run the OUTLIERS command locally against a server table.
You must specify the OUTLIERS command name in full. You cannot abbreviate it.
Parameters
Name Description
AVERAGE | MEDIAN The method for calculating the center point of the values in numeric_field (the outlier
field).
o AVERAGE – calculate the average (mean) of the values
o MEDIAN – calculate the median of the values
The center point is calculated for either:
o the numeric field as a whole
o the numeric values for each key field group
The center point is subsequently used in calculating the standard deviation of the
numeric field, or of each group.
Note
If you specify MEDIAN, numeric_field must be sorted. Use PRESORT if
numeric_field is not already sorted.
Tip
If the data you are examining for outliers is significantly skewed,
MEDIAN might produce results that are more representative of the bulk
of the data.
PKEY key_field <...n> If you specify PKEY, outliers are identified at the group level. If you specify NOKEY, out-
| NOKEY liers are identified at the field level.
o PKEY key_field – the field or fields to use for grouping the data in the table
Key fields can be character, numeric, or datetime. Multiple fields must be separated by spaces.
ON numeric_field The numeric field to examine for outliers. You can examine only one field at a time.
Outliers are values that fall outside the upper and lower boundaries established by the
field or group standard deviation, or by a specified multiple of the standard deviation.
OTHER field <...n> One or more additional fields to include in the output.
optional Note
Key fields and the outlier field are automatically included in the output
table, and do not need to be specified using OTHER.
NUMSTDEV number_of_ In numeric_field, the number of standard deviations from the mean or the median to the
std_devs upper and lower outlier boundaries. You can specify any positive integer or decimal
numeral (0.5, 1, 1.5, 2 . . . )
The formula for creating outlier boundaries is:
mean/median ± (number_of_std_devs * standard deviation)
Note
Standard deviation is a measure of the dispersion of a data set – that is,
how spread out the values are. The outliers calculation uses population
standard deviation.
For example, NUMSTDEV 2 sets the outlier boundaries at two standard deviations above and below the mean or the median.
Any value greater than the upper boundary, or less than the lower boundary, is included
as an outlier in the output results.
Note
For the same set of data, as you increase the value in number_of_std_
devs you potentially decrease the number of outliers returned.
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
TO SCREEN | table_name The location to send the results of the command to:
optional o SCREEN – displays the results in the Analytics display area
o table_name – saves the results to an Analytics table
Specify table_name as a quoted string with a .FIL file extension. For example: TO
"Output.FIL"
By default, the table data file (.FIL) is saved to the folder containing the
Analytics project.
Use either an absolute or relative file path to save the data file to a different, existing
folder:
l TO "C:\Output.FIL"
l TO "Results\Output.FIL"
Note
Table names are limited to 64 alphanumeric characters, not including
the .FIL extension. The name can include the underscore character (
_ ), but no other special characters, or any spaces. The name cannot
start with a number.
PRESORT Sorts the input table before executing the command.
optional
Tip
If the appropriate field or fields in the input table are already sorted, you
can save processing time by not specifying PRESORT.
Note
You cannot use PRESORT inside the GROUP command.
WHILE test A conditional expression that must be true in order to process each record. The com-
mand is executed until the condition evaluates as false, or the end of the table is
optional
reached.
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
Examples
Identifying transaction amounts that are out of the ordinary
You want to identify transaction amounts that are out of the ordinary across the entire Ar table in Sample
Project.acl.
You decide to set the outlier boundaries at 3 times the standard deviation of the Amount field. The test
returns 16 outliers in the table of 772 records.
OPEN Ar
OUTLIERS AVERAGE NOKEY ON Amount NUMSTDEV 3 PRESORT TO "Outliers_AR.fil" OPEN
You repeat the test, but increase the standard deviation multiple to 3.5. The test now returns only 6 outliers
because the outlier boundaries are farther away from the center point of the values in the Amount field.
OPEN Ar
OUTLIERS AVERAGE NOKEY ON Amount NUMSTDEV 3.5 PRESORT TO "Outliers_AR.fil"
OPEN
Identifying transaction amounts that are out of the ordinary for each cus-
tomer
For each customer in the Ar table in Sample Project.acl, you want to identify any transaction
amounts that are out of the ordinary.
You decide to set the outlier boundaries at 3 times the standard deviation of each customer's group of
transactions.
OPEN Ar
OUTLIERS AVERAGE PKEY No ON Amount NUMSTDEV 3 PRESORT TO "Outliers_Customer_
AR.fil" OPEN
The test returns 7 outliers. The standard deviation and the average are reported for each customer's
group of transactions:
For example, for a customer group with an average of 438.81 and a standard deviation of 772.44, the outlier boundaries are:
438.81 ± (3 * 772.44)
= 438.81 ± 2,317.32
= (1,878.51) (lower boundary)
= 2,756.13 (upper boundary)
Using MEDIAN to identify transaction amounts that are out of the ordinary for
each customer
You use MEDIAN, instead of AVERAGE, to perform the same outlier test that you performed in the
example above.
OPEN Ar
OUTLIERS MEDIAN PKEY No ON Amount NUMSTDEV 3 PRESORT TO "Outliers_Customer_AR_
Median.fil" OPEN
The test returns 10 outliers instead of the 7 that are returned in the previous test. Depending on the nature
of the data, MEDIAN and AVERAGE can return somewhat different results:
Remarks
Note
For more information about how this command works, see the Analytics Help.
l For number_of_std_devs, substitute the actual standard deviation multiple you used.
l If you used median as a center point rather than average, substitute MEDIAN for AVERAGE.
3. Paste this expression into the Analytics command line, edit it as required, and press Enter:
l For number_of_std_devs, substitute the actual standard deviation multiple you used.
l If you used median as a center point rather than average, substitute MEDIAN for AVERAGE.
PASSWORD command
Creates a password definition, without a password value, that prompts users for a password while a script
is running.
Syntax
PASSWORD num <prompt>
Parameters
Name Description
prompt A valid character expression to display in the dialog box used to prompt for the pass-
word. Enclose literal strings in quotation marks.
optional
If prompt is omitted, a default dialog box with no message is displayed.
Examples
Prompting for password information
You use the PASSWORD command to prompt the user for the three passwords required in a script. Once
the user enters the required passwords, the script can complete the remaining processing without inter-
ruption:
PASSWORD 1 "Password:"
REFRESH Abc PASSWORD 1
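Because the script described above requires three passwords, a fuller sketch might define all three up front. The prompt wording is illustrative, and the commands that later consume PASSWORD 2 and PASSWORD 3 are not shown:
PASSWORD 1 "Enter the database password:"
PASSWORD 2 "Enter the source file password:"
PASSWORD 3 "Enter the email account password:"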
Remarks
When to use PASSWORD
Use the PASSWORD command to prompt a user to enter password information before a script accesses,
imports, or refreshes password-protected data.
You can create up to ten different password definitions in a script.
PASSWORD is useful if:
l you want to avoid typing an actual password in a script, which the SET PASSWORD command
requires
l individual users need to enter distinct passwords
PAUSE command
Pauses a script, and displays information in a dialog box for users.
Syntax
PAUSE message <IF test>
Parameters
Name Description
message A message to display in the dialog box. The maximum length is 199 characters.
message must be enclosed in quotation marks. If the message contains double quo-
tation marks, enclose it in single quotation marks.
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
Examples
Displaying an error message
You require user input to meet specific requirements. When you detect that the input does not meet those
requirements, you use the PAUSE command and display an error message in a dialog box:
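A minimal sketch with an illustrative message:
PAUSE "The value you entered does not meet the requirements. Correct the input and run the script again."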
Remarks
When to use PAUSE
Use PAUSE to display read-only messages on screen in the course of running a script. You can display
error messages or information such as the result of an analytic operation.
How it works
While the message dialog box is displayed, execution of the script is halted and only resumes once the user
clicks OK to close the message dialog box. For this reason, you cannot use PAUSE in scripts or analytics
that must run unattended.
Limitations
PAUSE has the following limitations:
l cannot be included inside the GROUP command
l cannot be used in analytics run in Robots, or on AX Server
PREDICT command
Applies a predictive model to an unlabeled data set to predict classes or numeric values associated with
individual records.
Syntax
PREDICT MODEL model_name TO table_name <IF test> <WHILE test> <FIRST range|NEXT
range>
Parameters
Name Description
MODEL model_name The name of the model file to use for predicting classes or values. You use a model file
previously generated by the TRAIN command.
You must specify the *.model file extension. For example:
MODEL "Loan_default_prediction.model"
Note
The model file must have been trained on a data set with the same fields
as the unlabeled data set – or substantially the same fields.
TO table_name The name of the Analytics table that contains the output of the prediction process.
Note
Table names are limited to 64 alphanumeric characters, not including
the .FIL extension. The name can include the underscore character ( _ ),
but no other special characters, or any spaces. The name cannot start
with a number.
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The com-
mand is executed until the condition evaluates as false, or the end of the table is
optional
reached.
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
Examples
Use a classification model to make predictions
You input a classification model to the PREDICT command to make predictions about which current loan
applicants will default if given a loan.
You previously produced the classification model using the TRAIN command with a set of historical loan
data, including loan default information.
OPEN "Loan_applicants_current"
PREDICT MODEL "Loan_default_prediction.model" TO "Loan_applicants_default_predicted.FIL"
Use a regression model to make predictions
You input a regression model to the PREDICT command to make predictions about the sale price of
houses.
You previously produced the regression model using the TRAIN command with a set of recent house
sales data, including the sale price.
OPEN "House_price_evaluation"
PREDICT MODEL "House_price_prediction.model" TO "House_prices_predicted.FIL"
Remarks
Note
For more information about how this command works, see the Analytics Help.
PRINT command
Prints a text file, an Analytics log file, or an Analytics project item that has been exported as an external file –
a script (.aclscript), a table layout (.layout), or a workspace (.wsp). You can also print a graph that has been
generated by a command.
Syntax
PRINT {file_name|GRAPH}
Parameters
Name Description
file_name The name of the file to print.
GRAPH Prints the graph generated by the most recent command.
Examples
Printing a log file
To print the log file for the ACL_Demo.acl project, specify the following command:
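A minimal sketch; the log file name is an assumption based on the project name:
PRINT "ACL_Demo.log"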
Printing a graph
To print the graph produced from the BENFORD command, specify the following commands:
OPEN Metaphor_APTrans_2002
BENFORD ON Invoice_Amount LEADING 1 TO GRAPH
PRINT GRAPH
Remarks
Selecting a printer
The printer used is the default printer configured in Microsoft Windows. To change the printer you need to
change the default printer in Windows.
Related commands
To print the contents of an Analytics table in a project, use the DO REPORT command.
PROFILE command
Generates summary statistics for one or more numeric fields, or numeric expressions, in an Analytics table.
Syntax
PROFILE {<FIELDS> numeric_field <...n>|<FIELDS> ALL} <IF test> <WHILE test> <FIRST
range|NEXT range>
Parameters
Name Description
FIELDS numeric_field Specify individual fields to profile, or specify ALL to profile all numeric fields in the
<...n> | FIELDS ALL Analytics table.
IF test A conditional expression that must be true in order to process each record. The command
is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
Examples
Profiling a single field
OPEN Employee_Payroll
PROFILE FIELDS Salary
Remarks
Statistics displayed in output
The following statistics are displayed for each numeric field or numeric expression specified for the com-
mand:
l total value
l absolute value
l minimum value
l maximum value
QUIT command
Ends the current session and closes Analytics.
Syntax
QUIT
Examples
Check if a file exists and close Analytics if it does not
You have created a script for others to run, but if a required file does not exist, you want to close Analytics.
The example below checks if the required Inventory.csv file exists, and closes Analytics if it does not:
IF FILESIZE("Inventory.csv") = -1 QUIT
OPEN Inventory
SUMMARIZE ON Location ProdCls SUBTOTAL Value TO "Inventory_value_by_location_class.FIL"
PRESORT CPERCENT
QUIT
Remarks
Changes are saved
When QUIT executes, any Analytics tables that are open are saved and closed before quitting.
If you modified the active view or a script and did not save the changes, Analytics prompts you to save the
changes before quitting.
RANDOM command
Generates a set of random numbers.
Syntax
RANDOM NUMBER n <SEED seed_value> MINIMUM min_value MAXIMUM max_value
<COLUMNS n> <UNIQUE> <SORTED> <TO {SCREEN|filename}> <APPEND>
Parameters
Name Description
SEED seed_value The value used to initialize the random number generator.
optional If you specify a seed value, it can be any number. Each unique seed value results in a
different set of random numbers. If you respecify the same seed value, the same set of
random numbers is generated. Regenerating the same set of random numbers can be
required if you need to replicate analysis.
o Seed value – explicitly specify a seed value, and save the value, if you want the
option of replicating a particular set of random numbers.
o No seed value – enter a seed value of ‘0’, or leave the seed value blank, if you want
Analytics to randomly select a seed value.
MINIMUM min_value The smallest possible number in the set of random numbers. Any valid numeric value or
expression is allowed.
MAXIMUM max_value The greatest possible number in the set of random numbers. Any valid numeric value or
expression is allowed.
COLUMNS n The number of columns used to display the set of random numbers.
optional If you omit COLUMNS, the default is 6 columns.
UNIQUE Include only unique numbers in the set of random numbers.
optional
SORTED Output the set of random numbers in ascending order.
optional
TO SCREEN | filename The location to send the results of the command to:
optional o SCREEN – displays the results in the Analytics display area
o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing
folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
If you omit TO, the set of random numbers is output to screen.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled,
missing, or inaccurate data can result.
Examples
Generate a text file with 100 random numbers
You want to pull 100 hard copy files at random from a set of files with numbering that ranges from 10,000 to
20,000.
You can use the RANDOM command to generate a text file with 100 random numbers between 10,000 and
20,000. You then pull the hard copy files that match the random numbers. The numbers are arranged in 10
columns, are unique, and are sorted in ascending order:
RANDOM NUMBER 100 SEED 45387 MINIMUM 10000 MAXIMUM 20000 COLUMNS 10 UNIQUE
SORTED TO "Random_Numbers.txt"
Remarks
Note
For more information about how this command works, see the Analytics Help.
RCOMMAND command
Passes an Analytics table to an external R script as a data frame and creates a new table in the
Analytics project using output from the external R script.
Syntax
RCOMMAND FIELDS field <...n> RSCRIPT path_to_script TO table_name <IF test> <WHILE test>
<FIRST range|NEXT range> <KEEPTITLE> <SEPARATOR character> <QUALIFIER character>
<OPEN>
Parameters
Name Description
FIELDS field_name <...n> The fields from the source Analytics table, or the expressions, to include in the data frame
that is sent to the R script.
Depending on the edition of Analytics that you are using, you may encounter errors when
sending data containing some special characters to R:
o non-Unicode – "\"
o Unicode – "ÿ" or "Ŝ"
o Both – box drawing characters such as blocks, black squares, and vertical broken bars
Note
Mixed language data is also not supported, for example a table con-
taining both Japanese and Chinese characters.
RSCRIPT path_to_script The full or relative path to the R script on the file system. Enclose path_to_script in quo-
tation marks.
TO table_name The name of the output Analytics table.
Note
Table names are limited to 64 alphanumeric characters, not including the .FIL extension. The name can include the underscore character ( _ ), but no other special characters, or any spaces. The name cannot start with a number.
The output table is created from the data frame or matrix that the R script returns.
IF test A condition that must be met to process the current record. The data frame passed to the
R script contains only those records that meet the condition.
optional
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
KEEPTITLE Treat the first row of data as field names instead of data. If omitted, generic field names
are used.
optional
This option is required if you want to retrieve data using column names in the R script.
SEPARATOR character The character to use as the separator between fields. You must specify the character as a
quoted string.
optional
The default character is a comma.
Note
Avoid using any characters that appear in the input fields. If the
SEPARATOR character appears in the input data, the results may be
affected.
QUALIFIER character The character to use as the text qualifier to wrap and identify field values. You must spe-
cify the character as a quoted string.
optional
The default character is a double quotation mark.
Note
Avoid using any characters that appear in the input fields. If the
QUALIFIER character appears in the input data, the results may be
affected.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
Examples
Getting R up and running (Hello world)
You create a hello world script to test your connection between Analytics and R:
R script (analysis.r):
srcTable<-acl.readData()
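The corresponding Analytics command might look like the following sketch; the source table, field names, script location, and output table name are illustrative assumptions:
COMMENT Table, field, and file names below are assumptions for illustration
OPEN Ap_Trans
RCOMMAND FIELDS Vendor_No Invoice_Amount RSCRIPT "analysis.r" TO "R_result" KEEPTITLE OPEN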
Remarks
Note
For more information about how this command works, see the Analytics Help.
# stores the Analytics table in a data frame called myTable that can be referenced throughout the script
myTable<-acl.readData()
To retrieve data from a cell in the data frame, you can use one of the following approaches:
l Using row and column coordinates:
# Retrieves the value in the first row and second column of the data frame
myTable[1,2]
Note
Coordinates are based on the order of fields specified in the command, not the table
layout or view that is currently open.
l Using row and column names:
# Retrieves the value in the first row and "myColumnTitle" column of the data frame
myTable["1","myColumnTitle"]
You must specify the KEEPTITLE option of the command to use column names.
Rows are named "1", "2", "3", and increment accordingly. You may also use a combination of names
and coordinates.
Note
You must return a data frame or a matrix to Analytics when the R script terminates.
Ensure the columns in the data frame or matrix contain only atomic values and not lists,
matrices, arrays, or non-atomic objects. If the values cannot be translated into
Analytics data types, the command fails.
Data type mapping between Analytics and R:
Analytics data type R data type
Logical Logical
Numeric Numeric
Character Character
R log file
Analytics logs R language messages to an aclrlang.log file in the project folder. Use this log file for
debugging R errors.
Tip
The log file is available in the Results folder of Analytics Exchange analytic jobs.
REFRESH command
Updates the data in an Analytics table from its associated data source.
Syntax
REFRESH <table_name> <PASSWORD num>
Parameters
Name Description
table_name The name of the Analytics table to refresh. If you do not specify a table_name, the open
table is refreshed.
optional
Examples
Refreshing a table with no password required
If a password is not required for the data source, just specify the REFRESH command and the name of the
Analytics table to refresh.
REFRESH Invoices
If you are refreshing a table originally imported from a password-protected data source using the
ACCESSDATA command, the password prompt is automatic and does not need to be separately specified:
REFRESH Invoices
If you instead supply the password value directly in the script, the disadvantage is that the password appears as clear text. To avoid this, you can define a password prompt in an analytic header and reference the numbered password definition in the REFRESH command:
COMMENT
//ANALYTIC Refresh Table
//PASSWORD 1 "Enter your password:"
END
REFRESH Invoices PASSWORD 1
Remarks
Note
For more information about how this command works, see the Analytics Help.
How it works
The REFRESH command updates the contents of a table by re-running the IMPORT command, or the
ACCESSDATA command, initially used to define and import the table.
REFRESH and ACCESSDATA
The following guidelines apply when refreshing a table imported from an ODBC data source using the
ACCESSDATA command.
l Open table – If the table is open when you refresh it, you temporarily need disk space equal to twice
the size of the table. If you have limited disk space, close the table first before refreshing it.
l Analytics 12 – Tables that were imported using the ACCESSDATA command in version 12 of
Analytics are not refreshable, even if you are using a more recent version of Analytics.
If you want to be able to refresh these tables, re-import them using Analytics 12.5 or later.
REFRESH and passwords
You can use the REFRESH command with password-protected data sources that exist in a database, or
in a cloud data service.
You cannot use the REFRESH command with password-protected file-based data sources, such as Excel
files. The one exception is password-protected PDFs.
RENAME command
Renames an Analytics project item or a file.
Syntax
RENAME item_type name <AS|TO> new_name <OK>
Parameters
Name Description
item_type name The type and name of the project item or file that you want to rename.
Note
In most cases, you cannot rename an item or file if it is active, open, or in
use.
Specify one of the following valid types:
o FIELD – physical data field, computed field, or variable
l The table containing the field must be open. However, the active view cannot
include the field.
l You cannot rename a field that is referenced by a computed field.
o FORMAT – Analytics table
o INDEX – index
o REPORT – report or view
o WORKSPACE – workspace
o SCRIPT (or BATCH) – script
o DATA – Analytics data file (.fil)
o FILE – data file in the file system
o LOG – Analytics log file (.log)
o TEXT – text file
Examples
Renaming a field
You need to rename the ProdNo field to ProdNum. You use OK to perform the action without additional confirmation:
OPEN Inventory
RENAME FIELD ProdNo AS ProdNum OK
REPORT command
Formats and generates a report based on the open Analytics table.
Syntax
REPORT <ON break_field <PAGE> <NODUPS> <WIDTH characters> <AS display_name>>
<...n> FIELD other_fields <WIDTH characters> <AS display_name> <...n> <SUPPRESS>
<NOZEROS> <LINE n other_fields> <PRESORT <sort_field>> <...n> <SUMMARIZED> <SKIP n>
<EOF> <TO {SCREEN|PRINT|filename <HTML>}> <IF test> <WHILE test> <FIRST range|NEXT
range> <HEADER header_text> <FOOTER footer_text> <APPEND>
Parameters
Name Description
ON break_field PAGE NODUPS WIDTH characters AS display_name <...n>
optional
The character field or fields used to break the report into sections. A new report section and subtotal is created each time the value in break_field changes. Breaking reports into sections can make them easier to scan.
o break_field – the field to use as a break field
To run a report based on a view (DO REPORT), the break field must be the leftmost character field in the view.
o PAGE – inserts a page break when the break field value changes
o NODUPS – suppresses duplicate display values in the break field
For example, if the customer name is listed for each invoice record, you can make the report more readable by listing only the first instance of each customer name.
o WIDTH characters – the output length of the field in characters
o AS display_name – the display name (alternate column title) for the field in the report
Specify display_name as a quoted string. Use a semi-colon (;) between words if you want a line break in the column title. If you want the display name to be the same as the field name, or an existing display name in the source table, do not use AS.
Note
You must specify ON to use break_field, PAGE, NODUPS, or PRESORT.
FIELD other_fields WIDTH characters AS display_name <...n>
The fields or expressions to include in the report, in addition to the break fields.
The SUBTOTAL and ACCUMULATE keywords are synonyms for FIELD, and have been deprecated. All numeric fields are automatically subtotaled.
Note
Break fields are automatically included in the report and do not need to
be specified as other_fields.
LINE n other_fields Specifies the number of output lines in the column and the fields to appear on the line
number n.
optional
If no value is specified, the column defaults to a single line. The value of n must be
between 2 and 60 inclusive.
Column headings on the report are determined solely by the fields on the first line.
other_fields specifies appropriate fields or expressions for the report.
PRESORT sort_field <...n> o Sorts break_field , if one or more break fields are specified.
o Sorts sort_field, if one or more sort fields are specified.
optional
PRESORT does not sort the fields listed as other_fields unless they are also listed as
sort_field.
SUMMARIZED Produces a report with subtotals and totals only, and no detail lines.
optional Subtotals are generated for the unique break field values. If SUMMARIZED is not spe-
cified, Analytics produces a report that includes detail lines, as well as subtotals for
each of the specified key break fields.
EOF Execute the command one more time after the end of the file has been reached.
optional This ensures that the final record in the table is processed when inside a GROUP com-
mand. Only use EOF if all fields are computed fields referring to earlier records.
TO SCREEN | PRINT| file- The location to send the results of the command to:
name <HTML> o SCREEN – displays the results in the Analytics display area
optional o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing
folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
o PRINT – sends the results to the default printer
By default, reports output to a file are saved as ASCII text files. Specify HTML if you want
to output the report as an HTML file (.htm).
If you omit TO, the report is output to screen.
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
HEADER header_text The text to insert at the top of each page of a report.
optional header_text must be specified as a quoted string. The value overrides the Analytics
HEADER system variable.
FOOTER footer_text The text to insert at the bottom of each page of a report.
optional footer_text must be specified as a quoted string. The value overrides the Analytics
FOOTER system variable.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional
Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled,
missing, or inaccurate data can result.
Examples
Generating an HTML report
You generate a report from the Ar table and output the report to a formatted HTML file:
OPEN Ar
REPORT ON No FIELD Due Type Amount TO "C:\Reports\AR.htm" HTML
RETRIEVE command
Retrieves the result of a Direct Link query submitted for background processing.
Syntax
RETRIEVE table_name PASSWORD num
Parameters
Name Description
table_name The name of the table originally created in Analytics by the Direct Link query.
The table must already exist before you use RETRIEVE.
Examples
Retrieving the Background mode query result
You set the password and then retrieve the Background mode query result for an Analytics table named
DD02T_Data:
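A sketch of what this might look like; it assumes password definition 1 is created first (for example with the PASSWORD command), and the prompt text is illustrative:
COMMENT Password definition 1 and the prompt text are assumptions for illustration
PASSWORD 1 "Enter the Direct Link password:"
RETRIEVE DD02T_Data PASSWORD 1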
Remarks
Before you begin
This command is only supported if Direct Link is installed and configured.
SAMPLE command
Draws a sample of records using either the record sampling or monetary unit sampling method.
Record sampling
Syntax
Note
The syntax does not include filtering (IF statements) or scope parameters because apply-
ing these options compromises the validity of a sample.
Parameters
Note
Do not include thousands separators when you specify values.
Name Description
NUMBER sample_size
Use the random selection method. All records are randomly selected from the entire data set.
Specify the sample size that was calculated with the SIZE command.
ORDER Note
optional Random selection method only.
You can only use ORDER when you specify FIELDS.
Adds the ORDER field to the output results.
This field displays the order in which each record is randomly selected.
RECORD | FIELDS field_ o RECORD – the entire record is included in the output table
name <...n> o FIELDS – individual fields, rather than the entire record, are included in the output
table
Specify the field(s) or expressions to include. If you specify multiple fields, they must
be separated by spaces.
TO table_name The name of the Analytics table to output the results to.
Specify table_name as a quoted string with a .FIL file extension. For example: TO
"Output.FIL"
By default, the table data file (.FIL) is saved to the folder containing the
Analytics project.
Use either an absolute or relative file path to save the data file to a different, existing
folder:
l TO "C:\Output.FIL"
l TO "Results\Output.FIL"
Note
Table names are limited to 64 alphanumeric characters, not including
the .FIL extension. The name can include the underscore character (
_ ), but no other special characters, or any spaces. The name cannot
start with a number.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled,
missing, or inaccurate data can result.
MERSENNE_TWISTER Note
optional Cell and random selection methods only.
The random number generator in Analytics uses the Mersenne-Twister algorithm.
If you omit MERSENNE_TWISTER, the default Analytics algorithm is used.
Note
You should only use the default Analytics algorithm if you require back-
ward compatibility with Analytics scripts or sampling results created prior
to Analytics version 12.
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
Examples
Draw a record sample
You are going to use record sampling to estimate the rate of deviation from a prescribed control in an
account containing invoices.
After calculating a statistically valid sample size, you are ready to draw the sample. You are going to use the
random selection method.
The example below:
l Samples the open Analytics table
l Uses the random selection method with a seed value of 123456
l Specifies a sample size of 95 records
l Includes only specified fields in the output table
l Specifies that the random number generator in Analytics uses the Mersenne-Twister algorithm
SAMPLE ON RECORD RANDOM 123456 NUMBER 95 FIELDS RefNum CustNum Amount Date
Type TO "Ar_record_sample" OPEN MERSENNE_TWISTER
Remarks
Note
For more information about how this command works, see the Analytics Help.
Monetary unit sampling
Syntax
Note
The syntax does not include filtering (IF statements) or scope parameters because apply-
ing these options compromises the validity of a sample.
Parameters
Note
Do not include thousands separators when you specify values.
Name Description
SUBSAMPLE Note
optional You can only use SUBSAMPLE when you specify FIELDS.
Adds the SUBSAMPLE field to the output results.
If each amount in a sample field represents a total of several separate transactions, and
you want to perform audit procedures on only one transaction from each sampled total
amount, you can use the values in the SUBSAMPLE field to randomly select the indi-
vidual transactions.
For more information, see Performing monetary unit sampling.
NOREPLACEMENT The same record is not selected more than once. As a result, the sample may contain
fewer records than calculated by the SIZE command.
optional
If NOREPLACEMENT is omitted, or if you specify REPLACEMENT, records can be
selected more than once.
ORDER Note
optional Random selection method only.
You can only use ORDER when specify FIELDS.
Adds the ORDER field to the output results.
This field displays the order in which each record is randomly selected.
RECORD | FIELDS field_ o RECORD – the entire record is included in the output table
name <...n> o FIELDS – individual fields, rather than the entire record, are included in the output
table
Specify the field(s) or expressions to include. If you specify multiple fields, they must
be separated by spaces.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled,
missing, or inaccurate data can result.
MERSENNE_TWISTER Note
optional Cell and random selection methods only.
The random number generator in Analytics uses the Mersenne-Twister algorithm.
If you omit MERSENNE_TWISTER, the default Analytics algorithm is used.
Note
You should only use the default Analytics algorithm if you require back-
ward compatibility with Analytics scripts or sampling results created prior
to Analytics version 12.
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
Examples
Draw a monetary unit sample
You are going to use monetary unit sampling to estimate the total amount of monetary misstatement in an
account containing invoices.
After calculating a statistically valid sample size, you are ready to draw the sample. You are going to use the
fixed interval selection method.
The example below:
l Samples the open Analytics table based on a transaction amount field
l Uses the fixed interval selection method with an interval value of $6,283.33
l Specifies that the first record selected contains the 100,000th monetary unit (the number of cents in
$1,000)
l Uses a top stratum cutoff of $5,000
l Includes the entire record in the output table
SAMPLE ON Amount INTERVAL 6283.33 FIXED 1000.00 CUTOFF 5000.00 RECORD TO "Ar_monetary_unit_sample" OPEN
Remarks
Note
For more information about how this command works, see the Analytics Help.
SAVE command
Copies an Analytics table and saves it with a different name, or saves an Analytics project.
Syntax
To create a copy of an Analytics table and save it with a different name:
SAVE new_table FORMAT ACL_table
To save changes to the current Analytics project:
SAVE
Parameters
Name Description
new_table The name of the new Analytics table to create and save.
Note
Table names are limited to 64 alphanumeric characters. The name can
include the underscore character ( _ ), but no other special characters, or
any spaces. The name cannot start with a number.
FORMAT ACL_table The name of the existing Analytics table. Use the name of the table layout, not the name
of an associated data file.
Examples
Creating a new table based on an existing one
You create a new table called Payables_March based on the existing table Payables_master. Pay-
ables_March can then be linked to the March payables data file:
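Based on the syntax above, the command would be the following sketch:
SAVE Payables_March FORMAT Payables_master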
Remarks
How it works
SAVE FORMAT produces a result similar to copying and pasting an Analytics table in the Overview tab in
the Navigator. A new Analytics table is created and associated to the same data file or data source as the
original table.
If required, you can link the newly created table to a different data source.
Syntax
SAVE LAYOUT {FILE|TABLE} TO {file_name|table_name}
Parameters
Name Description
FILE | TABLE o FILE – save an Analytics table layout to an external table layout file (.layout)
o TABLE – save table layout metadata to an Analytics table (.fil)
TO file_name | table_name The name of the output file, and the output location:
o file_name – the name of the .layout file
Specify the filename as a quoted string. For example: TO "Ap_Trans.layout".
The .layout file extension is used by default, so specifying it is optional.
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing
folder:
l TO "C:\Ap_Trans.layout"
l TO "Table Layouts\Ap_Trans.layout"
Note
Limit the table layout name to 64 alphanumeric characters, not includ-
ing the .layout extension, to ensure that the name is not truncated
when the table layout is imported back into Analytics.
The name can include the underscore character ( _ ), but no other
special characters, or any spaces. The name cannot start with a num-
ber.
o table_name – the name of the Analytics table and .fil file
Specify the table_name as a quoted string. For example: TO "Ap_Trans_layout_
metadata.fil".
Examples
Saving a table layout to an external table layout file (.layout)
The following example saves the table layout used by the open table to an external table layout file called Ap_Trans.layout. The table layout file is saved in the Analytics project folder:
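A sketch based on the syntax above:
SAVE LAYOUT FILE TO "Ap_Trans.layout"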
Remarks
SAVE LAYOUT file vs table
The SAVE LAYOUT command is used for two different purposes:
l FILE – saves the table layout of the open Analytics table to an external table layout file with a .lay-
out extension
l TABLE – extracts the metadata from the table layout of the open Analytics table and saves it to a
new Analytics table
Note
The field names in the new table are always generated in English regardless of which loc-
alized version of Analytics you are using.
Among the table layout metadata recorded in the new table:
o decimals – the number of decimal places in the field (numeric fields only)
o format – the format of the field (datetime and numeric fields only)
Additional details
o Computed fields – Computed fields are included in the extracted metadata, but the expression used by the computed field, and any conditions, are not recorded. Start position, field length, and decimal places are also not recorded for computed fields.
o Related fields – Related fields are not included because they are not part of the table layout.
o Field-level filters and field notes – Field-level filters and field notes are not included.
o Alternate column title and column width – The values recorded for alternate column title and column width are the ones specified in the table layout, not the view-level values that can be specified for columns.
Syntax
SAVE LOG <SESSION> AS filename {<ASCII>|HTML} <OK>
Parameters
Name Description
SESSION Only log entries for the current Analytics session are saved.
optional
OK If a file with the same name as filename already exists, it is overwritten without con-
firmation.
optional
Examples
Save the command log from payables analysis
You have performed data analysis on the March payables file and you want to save the associated com-
mand log as part of your working papers.
The following example saves the entries from the current Analytics session to an HTML file. If a file with the
same name already exists it is overwritten without confirmation:
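A sketch based on the syntax above; the output file name is an illustrative assumption:
SAVE LOG SESSION AS "March_payables_log" HTML OK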
Syntax
SAVE TABLELIST {FILE|TABLE} TO {table_name|file_name}
Parameters
Name Description
Examples
Creating a new table
You create a new table in the Analytics project called Table_list_complete:
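A sketch based on the syntax above:
SAVE TABLELIST TABLE TO Table_list_complete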
Remarks
Output columns
The output Analytics table or CSV file contains three columns:
l table_name – the name of the Analytics table layout
l type – an indication whether the Analytics table is a local table or a server table
l Data_file_Path – the full path to the source data file
Syntax
SAVE WORKSPACE workspace_name {field_name <...n>}
Parameters
Name Description
workspace_name The name of the workspace to create and add to the current Analytics project.
field_name <...n> The name of the field to add to the workspace. You can include multiple field names sep-
arated by spaces.
Example
Activating a workspace
You create a workspace called Inventory_margin with two computed fields from the Metaphor_Invent-
ory_2002 table. Then you activate the workspace so that the fields are available in the Inventory table:
OPEN Metaphor_Inventory_2002
SAVE WORKSPACE Inventory_margin Gross_unit_margin Percent_unit_margin
OPEN Inventory
ACTIVATE WORKSPACE Inventory_margin OK
Remarks
Field names used to create computed fields must match
The names of any fields used in expressions that create a computed field that is saved in a workspace
must match the names of the fields in the table that uses the workspace.
For example, if a workspace contains the computed field Value=Sale_price*Quantity, the active table
must also contain fields called Sale_price and Quantity.
SEEK command
Searches an indexed character field for the first value that matches the specified character expression or
character string.
Syntax
SEEK search_expression
Parameters
Name Description
Examples
Locate the first value in a field that matches a character variable
The Card_Number field has been defined as a character field and is indexed in ascending order.
The example below locates the first value in the field that exactly matches, or starts with, the value contained
in the v_card_num variable.
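A sketch based on the syntax above:
SEEK v_card_num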
Remarks
Note
For more information about how this command works, see the Analytics Help.
How it works
Use the SEEK command to move directly to the first record in a table containing the specified search_
expression in the indexed character field.
l If the search_expression is found – the first matching record in the table is selected.
l If the search expression is not found – the message "No index matched key" is displayed, and the
table is positioned at the first record with a greater value than the search expression.
If there are no values in the indexed field greater than the search expression, the table is positioned
at the first record.
Index required
To use SEEK to search a character field, you must first index the field in ascending order. If multiple char-
acter fields are indexed in ascending order, only the first field specified in the index is searched.
SEEK cannot be used to search non-character index fields, or character fields indexed in descending
order.
SEQUENCE command
Determines if one or more fields in an Analytics table are in sequential order, and identifies out-of-sequence
items.
Syntax
SEQUENCE <ON> {<FIELDS> field <D> <...n>|<FIELDS> ALL} <UNFORMATTED>
<ERRORLIMIT n> <IF test> <WHILE test> <FIRST range|NEXT range> <TO
{SCREEN|filename|PRINT}> <APPEND> <HEADER header_text> <FOOTER footer_text>
<PRESORT> <LOCAL> <ISOLOCALE locale_code>
Parameters
Name Description
ON FIELDS field D <...n> The fields or expressions to check for sequential order. Specify ALL to check all fields in
| FIELDS ALL the Analytics table.
Include D to sort the key field in descending order. The default sort order is ascending.
UNFORMATTED Suppresses page headings and page breaks when the results are output to a file.
optional
ERRORLIMIT n The number of errors allowed before the command is terminated. The default value is 10.
optional
IF test A conditional expression that must be true in order to process each record. The command
is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
FIRST range | NEXT range
optional
o FIRST – start processing from the first record until the specified number of records is reached
o NEXT – start processing from the currently selected record until the specified number of records is reached
Use range to specify the number of records to process.
If you omit FIRST and NEXT, all records are processed by default.
TO SCREEN | filename | The location to send the results of the command to:
PRINT o SCREEN – displays the results in the Analytics display area
optional o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
o PRINT – sends the results to the default printer
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled, miss-
ing, or inaccurate data can result.
HEADER header_text The text to insert at the top of each page of a report.
optional header_text must be specified as a quoted string. The value overrides the Analytics
HEADER system variable.
FOOTER footer_text The text to insert at the bottom of each page of a report.
optional footer_text must be specified as a quoted string. The value overrides the Analytics
FOOTER system variable.
PRESORT Sorts the table on the key field before executing the command.
optional Note
You cannot use PRESORT inside the GROUP command.
LOCAL Saves the output file in the same location as the Analytics project.
optional
Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
Examples
Testing for out of sequence employee IDs and hire dates
You write any sequence errors identified in the EmployeeID and HireDate fields to a text file:
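A sketch of the command; the output file name is an illustrative assumption:
SEQUENCE ON EmployeeID HireDate TO "Sequence_Errors.txt"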
Remarks
Using SEQUENCE inside a GROUP
If you use SEQUENCE inside a GROUP command, the command executes to avoid interfering with the pro-
cessing of the group, but no further data sequence errors are reported.
SET command
Sets a configurable Analytics option.
Note
The SET command sets an Analytics option for the duration of the Analytics session only.
This behavior applies whether you use the SET command in the Analytics command line
or in an Analytics script.
To set Analytics options so that they persist between Analytics sessions, you must use the
Options dialog box. For more information, see Configuring ACL options.
Syntax
Syntax Examples and remarks
When this option is turned on, Analytics replaces invalid character data with blanks and
invalid numeric data with zeros.
Specifies how Analytics displays dates, and the date portion of datetimes, in views,
reports, and exported files.
o SET DATE 0 sets the date to MM/DD/YYYY format
o SET DATE 1 sets the date to MM/DD/YY format
o SET DATE 2 sets the date to DD/MM/YY format
o SET DATE "<string>" sets the date to the custom format you specify
When using the SET DATE command to specify custom date formats, you must use
'D' for Day, 'M' for Month, and 'Y' for Year, even if you have specified different date
format characters in the Options dialog box. For example:
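o SET DATE "DD MMM YYYY" would set a day-month-year format that displays as, for example, 31 Dec 2016 (this particular format string is an illustrative assumption)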
Default setting: OFF
Specify ON to automatically delete the associated data file when you delete a table lay-
out.
Specify OFF to prevent the associated data file from being deleted when you delete a
table layout.
You must include the underscore ( _ ) in DELETE_FILE.
Specifying SET DELETE_FILE, without any parameter, in the command line displays
whether DELETE_FILE is currently on or off.
Caution
Use caution when turning this option on. It may be an original data file
that is deleted along with the table.
Data files are deleted outright. They are not sent to the Windows
Recycle Bin.
The value parameter is a quoted string that specifies the label to display at the top of
each printed page.
Specify NONE to stop writing commands and results in scripts to the Analytics com-
mand log. Specify ON to resume logging.
The SET ECHO command applies only to the logging of commands and results in
scripts. Commands performed through the user interface or issued from the command
line, and any results they produce, are always logged, regardless of how ECHO is set.
You can issue the SET ECHO NONE/ON command in a script or from the command
line, but regardless of where you issue the command, it affects only the logging of com-
mands and results in scripts.
Specifying SET ECHO, without any parameter, in the command line displays whether
the logging of commands and results in scripts is currently on or off.
Default setting: OFF
Controls how Analytics compares character fields, expressions, or literal values.
Note
Blank spaces are treated like characters.
o SET EXACT is OFF – Analytics uses the shorter string when comparing two strings
of unequal length. The comparison starts with the leftmost characters and moves to
the right.
For example, "AB" is equal to "AB", and it is also considered equal to "ABC".
o SET EXACT is ON – comparison strings must be exactly identical to be a match.
When comparing two strings of unequal length, Analytics pads the shorter string with
trailing blank spaces to match the length of the longer string.
For example, "AB" is equal to "AB", but it is not considered equal to "ABC".
For more examples illustrating SET EXACT, see "Exact Character Comparisons" in
Table tab (Options dialog box).
You can use the ALLTRIM( ) function to remove leading and trailing blank spaces and
ensure that only text characters and internal spaces are compared.
For example: ALLTRIM(" AB") = ALLTRIM("AB") is True when the values are wrapped
with ALLTRIM( ), but False otherwise.
Some Analytics commands and functions are affected by SET EXACT and some are
not:
Creates a global filter (view filter) on the open table, and specifies either a logical test,
or the name of an existing saved filter.
Specifying SET FILTER, without any parameter, removes any filter from the open table.
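For example, SET FILTER TO Amount > 5000 would create a view filter on an Amount field in the open table (the field name, and the optional TO keyword, are illustrative assumptions).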
SET FOLDER folder path Specifies the Analytics project folder in the Overview tab for command output. The
default output folder is the folder containing the active table.
This is a DOS-style path using the format /foldername/subfoldername, in which the initial
slash (/) indicates the root level in the Overview tab. You must specify a full file path.
o SET FOLDER /Tables/Results sets the output folder to the Results subfolder. If the
Results subfolder does not exist, it is created.
o SET FOLDER / sets the output folder to the root level in the Overview tab
o SET FOLDER sets the output folder to the default (the folder containing the active
table)
The output folder remains whatever you set it to until you reset it, or close the project.
Upon opening the project, the output folder reverts to the default of the active table
folder.
Default setting: OFF
If you use the ON parameter, Analytics automatically displays the current table layout
and computed field definitions when you open a new table. The results appear in the
command log.
SET FUZZYGROUPSIZE <TO> num
SET FUZZYGROUPSIZE TO 10
Specifies the maximum number of items that can appear in a fuzzy duplicate group in
the output results. The num parameter cannot be less than 2 or greater than 100. The
default size is 20. The specified size remains in effect for the duration of the Analytics
session.
Specifies the graph type to use for all subsequently generated graphs. The commands
run must be compatible with the specified graph type. For example, the BENFORD com-
mand cannot produce a PIE2D or PIE3D chart. If an incompatible graph type is spe-
cified the default graph type is used (BAR3D).
The type parameter must be one of the following:
o PIE2D
o PIE3D
o BAR2D
o BAR3D – This is the default graph type.
o STACKED2D
o STACKED3D
o LAYERED
o LINE
o BENFORD – Combines 2D bar graph and 2D line graph.
Specifies the maximum number of table history entries to retain. The value parameter
must be between 1 and 100.
Specifies the name of the script file that the Script Recorder uses to record commands.
The first command switches logging to the specified log. If the specified log does not
exist, it is created.
The second command restores logging to the original Analytics command log.
Note
The maximum length of an Analytics project path and log name is 259
characters, which includes the file path, the log name, and the file exten-
sion (.log).
Specifies the maximum number of loops that can be executed by the LOOP command
before the command is terminated.
The num range is 0 to 32767, where 0 turns off loop testing.
Specify LEFT, RIGHT, TOP, or BOTTOM for the side parameter. If you want to change
the margin on all sides, you need to specify each margin with a separate SET MARGIN
command. Specifying a value of 100 creates a margin of 1 inch.
Default setting: MAX
Specifies how decimal precision works when two operands are evaluated in a numeric
expression.
o FIRST – use the number of decimal places of the first operand in a pair of operands
o LAST – use the number of decimal places of the last operand in a pair of operands
o MIN – use the minimum number of decimal places in a pair of operands
o MAX – use the maximum number of decimal places in a pair of operands
In multi-operand expressions, the SET MATH setting works on a pairwise basis, apply-
ing the specified setting to each pair of operands, rounding as necessary, as they are
evaluated in the standard mathematical order (BOMDAS).
If the SET MATH setting reduces the number of decimal places in a result, the result is
rounded, not truncated.
For more information, see Controlling rounding and decimal precision in numeric
expressions.
Note
You cannot use SET MATH while an Analytics table is open.
SET MONTHS <TO> string Specifies the default three-character abbreviations for month names. The string para-
meter is the list of month abbreviations separated by commas.
SET NOTIFYFAILSTOP {ON | OFF}
SET NOTIFYFAILSTOP ON
Default setting: OFF
o NOTIFYFAILSTOP is OFF – Analytics allows a script to continue even if a NOTIFY
command in the script fails.
o NOTIFYFAILSTOP is ON – Analytics stops processing a script, and writes a mes-
sage to the log, if a NOTIFY command in the script fails. The script stops after the ini-
tial failure, or after the specified number of NOTIFYRETRYATTEMPTS, if none of the
attempts are successful.
SET NOTIFYRETRYATTEMPTS <TO> num
SET NOTIFYRETRYATTEMPTS TO 10
Specifies the number of times the NOTIFY command will attempt to send an email if the
initial attempt is unsuccessful. Enter a number from 0 to 255. If you enter 0, no addi-
tional attempts are made after an initial failure. The default is 5.
One possible reason for the NOTIFY command failing to send an email is that the email
server is unavailable.
SET NOTIFYRETRYINTERVAL <TO> seconds
SET NOTIFYRETRYINTERVAL TO 30
Specifies the amount of time in seconds between NOTIFYRETRYATTEMPTS. Enter a
number from 1 to 255. The default is 10 seconds.
SET ORDER <TO> values Specifies the sort sequence for character fields. The values parameter lists all of the
characters for the selected sort order.
Default setting: ON
If OFF is specified Analytics does not stop processing when an overflow error occurs.
Used to create a password definition, and specify a password value, for unattended
script execution.
The num parameter uniquely identifies the password definition and must be a value
from 1 to 10. Specify the password value as a quoted string.
SET READAHEAD <TO> Specifies the size of the data block read. You should only change this setting if you are
size advised to do so by Support.
Note
SET RETRYIMPORT is retained for backward compatibility.
SET RETRYIMPORT and SET RETRY perform identical actions.
Specify ON to display a confirmation dialog box when overwriting any of the following:
o fields in table layouts
o Analytics tables
o files, including Analytics data files (.fil)
Specify OFF to prevent the dialog box from being displayed.
Specifying SET SAFETY, without any parameter, in the command line displays whether
SAFETY is currently on or off.
Specifies the default decimal, thousands, and list separators used by Analytics. The
SET SEPARATORS values must be three valid separator characters in the following
order:
o decimal (period, comma, or space)
o thousands (period, comma, or space)
o list (semi-colon, comma, or space)
Among the three separators, the decimal separator must be unique. You must specify
all three separators when you use the command. The list separator is used primarily to
separate function parameters.
Creates a new session in the Analytics command log. The session is identified by the
current timestamp.
The optional session_name allows you to add up to 30 characters of additional identi-
fying information. Quotation marks are permitted but not required.
Specifies the maximum amount of memory allocated for sorting and indexing pro-
cesses. The num parameter must be a value from 0 to 2000 megabytes (MB), to be
entered in 20MB increments. If the sort memory is set to 0, Analytics uses the memory
currently available.
Default setting: OFF
Only for use when defining an Analytics table that uses an ODBC data source (IMPORT
ODBC command), or direct database access (DEFINE TABLE DB command).
If you use the ON parameter, when defining the table Analytics suppresses the time por-
tion of datetime values. For example, 20141231 235959 is read, displayed in views,
and subsequently processed as 20141231.
Including this command in a pre-datetime Analytics script (pre v.10.0) that assumes the
time portion of datetime data will be truncated allows the script to run in the datetime-
enabled version of Analytics.
Analytics suppresses the time portion by using only the date portion of the datetime
format. The time data is still present in the .fil file or the database table. If required, you
can redefine the field or define a new field to include the time portion of the data.
If SET SUPPRESSTIME = OFF, Analytics tables defined using ODBC or direct database
access include full datetime values.
You can issue the SET SUPPRESSTIME ON/OFF command in a script or from the com-
mand line.
Specifying SET SUPPRESSTIME, without any parameter, in the command line displays
whether the suppression of the time portion of datetime data is currently on or off.
Default setting: OFF
Specifies that command output is in plain text rather than formatted text.
Specifies whether the results of IF, WHILE, FOR, and NEXT tests associated with
GROUP commands should be recorded in the log.
Specifies how Analytics displays the time portion of datetimes, and standalone time val-
ues, in views, reports, and exported files.
When using the SET TIME command to specify time formats, you must use 'h' for Hour,
'm' for Minute, and 's' for Second, even if you have specified different time format char-
acters in the Options dialog box. For example:
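o SET TIME "hh:mm:ss" would display times as, for example, 23:59:59 (this particular format string is an illustrative assumption)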
Default setting: ON
o UTCZONE is ON – Analytics changes the display of local times with a UTC offset to
the UTC equivalent of the local time. (UTC is Coordinated Universal Time, the time
at zero degrees longitude.)
o UTCZONE is OFF – Analytics displays local times with a UTC offset without con-
verting them to UTC.
For example:
o 01 Jan 2015 04:59:59 (SET UTCZONE ON)
o 31 Dec 2014 23:59:59-05:00 (SET UTCZONE OFF)
Conversion of local time to UTC is for display purposes only, and does not affect the
source data. You can change back and forth between the two different display modes
whenever you want to.
Specifies the default display width in characters for numeric computed fields or ad hoc
numeric expressions when Analytics cannot determine the maximum width.
SIZE command
Calculates a statistically valid sample size, and sample interval, for record sampling or monetary unit
sampling.
Record sampling
Syntax
SIZE RECORD CONFIDENCE confidence_level POPULATION population_size PRECISION tolerable_rate <ERRORLIMIT expected_rate> <TO {SCREEN|filename}>
Parameters
Note
Do not include thousands separators, or percentage signs, when you specify values.
Name Description
CONFIDENCE con- The desired confidence level that the resulting sample is representative of the entire
fidence_level population.
For example, specifying 95 means that you want to be confident that 95% of the time the
sample will in fact be representative. Confidence is the complement of "sampling risk". A
95% confidence level is the same as a 5% sampling risk.
POPULATION population_ The number of records in the table you are sampling.
size
PRECISION tolerable_rate The tolerable deviation rate, which is the maximum rate of deviation from a prescribed
control that can occur and you still consider the control effective.
For example, specifying 5 means that the deviation rate must be greater than 5% for you
to consider the control not effective.
ERRORLIMIT expected_ The expected population deviation rate. This is the rate of deviation from a prescribed
rate control that you expect to find.
optional For example, specifying 1 means that you expect the deviation rate to be 1%.
If you omit this parameter, an expected population deviation rate of 0% is used.
TO SCREEN | filename The location to send the results of the command to:
o SCREEN – displays the results in the Analytics display area
o filename – saves the results to a file
Examples
Calculate the required size and interval for a record sample
You have decided to use record sampling to estimate the rate of deviation from a prescribed control in an
account containing invoices.
Before drawing the sample, you must first calculate the statistically valid sample size and sample interval.
You want to be confident that 95% of the time the sample drawn by Analytics will be representative of the
population as a whole.
Using your specified confidence level, the example below calculates a sample size of 95, and a sample inter-
val value of 8.12, to use when drawing a record sample:
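A sketch of the command; the confidence level comes from the example, and the population, precision, and error limit values are illustrative assumptions:
SIZE RECORD CONFIDENCE 95 POPULATION 772 PRECISION 5 ERRORLIMIT 1 TO SCREEN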
Remarks
Note
For more information about how this command works, see the Analytics Help.
Monetary unit sampling
Syntax
SIZE MONETARY CONFIDENCE confidence_level POPULATION population_size MATERIALITY
tolerable_misstatement <ERRORLIMIT expected_misstatement> <TO {SCREEN|filename}>
Parameters
Note
Do not include thousands separators, or percentage signs, when you specify values.
Name Description
CONFIDENCE con- The desired confidence level that the resulting sample is representative of the entire
fidence_level population.
For example, specifying 95 means that you want to be confident that 95% of the time the
sample will in fact be representative. Confidence is the complement of "sampling risk". A
95% confidence level is the same as a 5% sampling risk.
POPULATION population_ The total absolute value of the numeric sample field.
size
MATERIALITY tolerable_ The tolerable misstatement, which is the maximum total amount of misstatement that
misstatement can occur in the sample field without being considered a material misstatement.
For example, specifying 29000 means that the total amount of misstatement must be
greater than $29,000 to be considered a material misstatement.
ERRORLIMIT expected_ The expected misstatement. This is the total amount of misstatement that you expect the
misstatement sample field to contain.
optional For example, specifying 5800 means that you expect the total amount of misstatement
to be $5,800.
If you omit this parameter, an expected misstatement of $0.00 is used.
TO SCREEN | filename The location to send the results of the command to:
o SCREEN – displays the results in the Analytics display area
o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing
folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
Examples
Calculate the required size and interval for a monetary unit sample
You have decided to use monetary unit sampling to estimate the total amount of monetary misstatement in
an account containing invoices.
Before drawing the sample, you must first calculate the statistically valid sample size and sample interval.
You want to be confident that 95% of the time the sample drawn by Analytics will be representative of the
population as a whole.
Using your specified confidence level, the example below calculates a sample size of 93, and a sample inter-
val value of 6,283.33, to use when drawing a monetary unit sample:
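A sketch of the command; the confidence level, tolerable misstatement, and expected misstatement come from the descriptions above, and the population value is an illustrative assumption:
SIZE MONETARY CONFIDENCE 95 POPULATION 585000 MATERIALITY 29000 ERRORLIMIT 5800 TO SCREEN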
Remarks
Note
For more information about how this command works, see the Analytics Help.
SORT command
Sorts records in an Analytics table into an ascending or descending sequential order, based on a specified
key field or fields. The results are output to a new, physically reordered Analytics table.
Syntax
SORT ON {key_field <D> <...n>|ALL} <FIELDS field_name <AS display_name> <...n>|FIELDS
ALL> TO tablename <IF test> <WHILE test> <FIRST range|NEXT range> <APPEND> <OPEN>
<ISOLOCALE locale_code>
Parameters
Name Description
ON key_field D <...n> The key field or fields, or the expression, to use for sorting.
| ALL
You can sort by any type of field, including computed fields and ad hoc expressions,
regardless of data type.
o key_field – use the specified field or fields
If you sort by more than one field, you create a nested sort in the output table. The
order of nesting follows the order in which you specify the fields.
Include D to sort the key field in descending order. The default sort order is ascend-
ing.
o ALL – use all fields in the table
If you sort by all the fields in a table you create a nested sort in the output table. The
order of nesting follows the order in which the fields appear in the table layout.
An ascending sort order is the only option for ALL.
FIELDS field_name <AS display_name> <...n> | FIELDS ALL
optional
The fields to include in the sorted output table:
o FIELDS field_name or FIELDS ALL – include the specified fields, or all fields, in the output table
Fields are used in the order that they appear in the table layout.
Converts computed fields to physical fields of the appropriate data type in the des-
tination table – ASCII or Unicode (depending on the edition of Analytics), ACL (the
native numeric data type), Datetime, or Logical. Populates the physical fields with the
actual computed values.
o omit FIELDS – the entire record is included in the sorted output table: all fields, and
any undefined portions of the record
Fields are used in the order that they appear in the table layout.
Computed fields are preserved.
Tip
If you need only a portion of the data contained in a record, do not
include all fields, or the entire record, in the sorted output table. Select
only the fields you need, which in most cases speeds up the sorting pro-
cess.
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled,
missing, or inaccurate data can result.
OPEN Opens the table created by the command after the command executes. Only valid if the command creates an output table.
optional
Examples
Sort on a single field, output entire records
You want to sort the records in the sample Inventory table by product number. The sorted records are
extracted to a new Analytics table called Inventory_Product_Number.
Entire records are included in the output table:
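A sketch of the command; the product number field name is an illustrative assumption:
OPEN Inventory
SORT ON ProdNo TO "Inventory_Product_Number" OPEN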
To switch from the default ascending sort order to a descending sort order, you add D after the key field
name:
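Continuing the sketch above, with an assumed output table name:
SORT ON ProdNo D TO "Inventory_Product_Number_D" OPEN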
Sort on multiple fields, output selected fields
You want to sort the records in the sample Ap_Trans table by the following fields:
l vendor state (related Vendor table)
l vendor city (related Vendor table)
l vendor number (Ap_Trans table)
All three key fields and the specified non-key fields, including the related field Vendor.Vendor_Name, are
extracted to a new Analytics table called Ap_Trans_State_City:
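A sketch of the command; the exact field names in the Ap_Trans table and the related Vendor table are illustrative assumptions:
OPEN Ap_Trans
SORT ON Vendor.Vendor_State Vendor.Vendor_City Vendor_No FIELDS Invoice_No Invoice_Amount Vendor.Vendor_Name TO "Ap_Trans_State_City" OPEN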
Remarks
Note
For more information about how this command works, see the Analytics Help.
STATISTICS command
Calculates statistics for one or more numeric or datetime fields in an Analytics table.
Syntax
STATISTICS <ON> {field <...n>|ALL} <STD> <MODMEDQ> <NUMBER n> <TO
{SCREEN|filename|PRINT}> <IF test> <WHILE test> <FIRST range|NEXT range> <APPEND>
Parameters
Name Description
ON field <...n> | ALL Specify one or more numeric or datetime fields to generate statistics for, or specify ALL to
generate statistics for all numeric and datetime fields in the Analytics table.
STD Calculates the standard deviation of the fields specified, in addition to the other statistics.
optional
MODMEDQ Calculates the mode, median, first quartile, and third quartile values of the fields spe-
cified, in addition to the other statistics.
optional
NUMBER n The number of high and low values to retain during processing. The default value is 5.
optional
TO SCREEN | filename | The location to send the results of the command to:
PRINT o SCREEN – displays the results in the Analytics display area
optional o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
o PRINT – sends the results to the default printer
IF test A conditional expression that must be true in order to process each record. The command
is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled, miss-
ing, or inaccurate data can result.
Name Contains
Q25n The first quartile value (lower quartile value) calculated by the command.
Q75n The third quartile value (upper quartile value) calculated by the command.
RANGEn The difference between the maximum and minimum values calculated by the command.
If the command is used inside a GROUP command, the group number is appended to the variable names.
For more information, see "GROUP command" on page 221.
Examples
Generating conditional statistics
You generate statistics for the Quantity field in records where the product class ID is 01:
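A sketch, assuming the sample Inventory table and a character product class field named ProdCls:
OPEN Inventory
STATISTICS ON Quantity IF ProdCls = "01"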
STRATIFY command
Groups records into numeric intervals based on values in a numeric field. Counts the number of records in
each interval, and also subtotals specified numeric fields for each interval.
Syntax
STRATIFY <ON> numeric_field MINIMUM value MAXIMUM value {<INTERVALS number>|FREE
interval_value <...n> last_interval} <SUPPRESS> <SUBTOTAL numeric_field <...n>|SUBTOTAL
ALL> <KEY break_field> <TO {SCREEN|table_name|filename|GRAPH|PRINT}> <IF test> <FIRST
range|NEXT range> <WHILE test> <APPEND> <OPEN> <HEADER header_text> <FOOTER
footer_text> <LOCAL> <STATISTICS>
Parameters
Name Description
MINIMUM value Applies to numeric fields only. The minimum value of the first numeric interval.
MINIMUM is optional if you are using FREE, otherwise it is required.
MAXIMUM value Applies to numeric fields only. The maximum value of the last numeric interval.
MAXIMUM is optional if you are using FREE, otherwise it is required.
SUPPRESS Values above the MAXIMUM value and below the MINIMUM value are excluded from the
command output.
optional
SUBTOTAL numeric_field One or more numeric fields or expressions to subtotal for each group.
<...n> | SUBTOTAL ALL
Multiple fields must be separated by spaces. Specify ALL to subtotal all the numeric fields
optional in the table.
If you do not select a subtotal field, the field you are stratifying on is automatically sub-
totaled.
You must explicitly specify the stratify field if you want to subtotal it along with one or more
other fields, or if you want to include statistics for the subtotaled stratify field.
KEY break_field The field or expression that groups subtotal calculations. A subtotal is calculated each
time the value of break_field changes.
optional
break_field must be a character field or expression. You can specify only one field, but
you can use an expression that contains more than one field.
TO SCREEN | table_name | The location to send the results of the command to:
filename | GRAPH | PRINT o SCREEN – displays the results in the Analytics display area
o table_name – saves the results to an Analytics table
Specify table_name as a quoted string with a .FIL file extension. For example: TO "Out-
put.FIL"
By default, the table data file (.FIL) is saved to the folder containing the
Analytics project.
Use either an absolute or relative file path to save the data file to a different, existing
folder:
l TO "C:\Output.FIL"
l TO "Results\Output.FIL"
Note
Table names are limited to 64 alphanumeric characters, not including
the .FIL extension. The name can include the underscore character ( _
), but no other special characters, or any spaces. The name cannot
start with a number.
o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
IF test A conditional expression that must be true in order to process each record. The command
is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled, miss-
ing, or inaccurate data can result.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
FOOTER footer_text The text to insert at the bottom of each page of a report.
optional footer_text must be specified as a quoted string. The value overrides the Analytics
FOOTER system variable.
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
STATISTICS Note
optional Cannot be used unless SUBTOTAL is also specified.
Calculates average, minimum, and maximum values for all SUBTOTAL fields.
Examples
Stratifying on invoice amount
You need to stratify an accounts receivable table on the Invoice_Amount field. The invoice amount is also
automatically subtotaled.
The output is grouped into $1000 intervals:
l from $0 to $999.99
l from $1,000 to $1,999.99
l so on
The total invoice amount is included for each interval.
OPEN Ar
STRATIFY ON Invoice_Amount MINIMUM 0 MAXIMUM 10000 INTERVALS 10 TO "Stratified_
invoices.FIL"
Remarks
Note
For more information about how this command works, see the Analytics Help.
How it works
STRATIFY groups records into equal-sized, or custom-sized, numeric intervals based on values in a
numeric field.
The output contains a single record for each interval, with a count of the number of records in the source
table that fall into each interval.
The subtotal and statistics fields in the output table use the following field names and column titles:
o Subtotal field – field name: subtotaled field name in source table – column title: Total + subtotaled alternate column title in source table
o Average field – field name: a_ subtotaled field name in source table – column title: Average + subtotaled alternate column title in source table
o Minimum field – field name: m_ subtotaled field name in source table – column title: Minimum + subtotaled alternate column title in source table
o Maximum field – field name: x_ subtotaled field name in source table – column title: Maximum + subtotaled alternate column title in source table
SUMMARIZE command
Groups records based on identical values in one or more character, numeric, or datetime fields. Counts
the number of records in each group, and also subtotals specified numeric fields for each group.
Syntax
SUMMARIZE ON key_field <...n> <SUBTOTAL numeric_field <...n> |SUBTOTAL ALL> <OTHER
field <...n>|OTHER ALL> <TO {SCREEN|table_name|PRINT}> <IF test> <WHILE test> <FIRST
range|NEXT range> <PRESORT> <APPEND> <OPEN> <LOCAL> <HEADER header_text>
<FOOTER footer_text> <STATISTICS> <MODMEDQ> <STDEV> <CPERCENT> <ISOLOCALE
locale_code>
Parameters
Name Description
ON key_field <...n> One or more character, numeric, or datetime fields to summarize. Multiple fields must be
separated by spaces, and can be different data types.
SUBTOTAL numeric_field One or more numeric fields or expressions to subtotal for each group.
<...n> | SUBTOTAL ALL
Multiple fields must be separated by spaces. Specify ALL to subtotal all the numeric
optional fields in the table.
OTHER field <...n> | One or more additional fields to include in the output.
OTHER ALL o field <...n> – include the specified field or fields
optional o ALL – include all fields in the table that are not specified as key fields or subtotal
fields
Use OTHER only with fields that contain the same value for all records in each sum-
marized group. If you specify a field that contains values that are different for a sum-
marized group, only the value for the first record in the group is displayed, which is not
meaningful.
For example:
o summarize a table on customer number – an appropriate "other field" is Customer
Name. Typically, the customer name is identical for all records with the same cus-
tomer number.
o summarize a vendor table by state – an inappropriate "other field" is City. Only the
first city listed for each state appears in the output. In this instance, the better
approach is to summarize using both state and city as key fields, in that order.
TO SCREEN | table_name | The location to send the results of the command to:
PRINT o SCREEN – displays the results in the Analytics display area
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The com-
mand is executed until the condition evaluates as false, or the end of the table is
optional
reached.
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
PRESORT Sorts the table on the key field before executing the command.
optional Note
You cannot use PRESORT inside the GROUP command.
If the key fields are not sorted, the output can contain more than one group for the same
set of identical values, or identical combination of values, in the key field or fields.
Tip
If the input table is already sorted, you can save processing time by not
specifying PRESORT.
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled,
missing, or inaccurate data can result.
OPEN Opens the table created by the command after the command executes. Only valid if the
command creates an output table.
optional
LOCAL Saves the output file in the same location as the Analytics project.
optional Note
Applicable only when running the command against a server table with
an output file that is an Analytics table.
FOOTER footer_text The text to insert at the bottom of each page of a report.
optional footer_text must be specified as a quoted string. The value overrides the Analytics
FOOTER system variable.
STATISTICS Note
optional Cannot be used unless SUBTOTAL is also specified.
Calculates average, minimum, and maximum values for all SUBTOTAL fields.
MODMEDQ Note
optional Cannot be used unless SUBTOTAL is also specified.
Calculates mode, median, first quartile, and third quartile values for all SUBTOTAL
fields.
STDEV Note
optional Cannot be used unless SUBTOTAL is also specified.
Calculates standard deviation and percentage of total for all SUBTOTAL fields.
ISOLOCALE Note
optional Applicable in the Unicode edition of Analytics only.
The system locale in the format language_country. For example, to use Canadian
French, enter fr_ca.
Use the following codes:
o language – ISO 639 standard language code
o country – ISO 3166 standard country code
If you do not specify a country code, the default country for the language is used.
If you do not use ISOLOCALE, the default system locale is used.
Examples
Total transaction amount per customer
You summarize an accounts receivable table on the Customer_Number field, and subtotal the Trans_
Amount field. The output is grouped by customer and includes the total transaction amount for each cus-
tomer:
OPEN Ar
SUMMARIZE ON Customer_Number SUBTOTAL Trans_Amount TO "Customer_total.FIL"
PRESORT
You summarize an accounts receivable table on the Customer_Number and Trans_Date fields. You sub-
total the Trans_Amount field.
The output is grouped by customer, and within customer by date, and includes the total transaction
amount for each customer for each date the customer had transactions.
OPEN Ar
SUMMARIZE ON Customer_Number Trans_Date SUBTOTAL Trans_Amount TO "Customer_total_
by_date.FIL" PRESORT
You repeat the summarization by customer and date, and add STATISTICS to also calculate the average, minimum, and maximum transaction amounts for each customer for each date:
OPEN Ar
SUMMARIZE ON Customer_Number Trans_Date SUBTOTAL Trans_Amount TO "Customer_
stats_by_date.FIL" PRESORT STATISTICS
You summarize a credit card transactions table on transaction date and amount, and then filter the output to show only those date-amount combinations that occur more than once, which may indicate duplicate transactions:
OPEN CC_Trans
SUMMARIZE ON Trans_Date Trans_Amount TO "Transactions_by_date_amount.FIL" OPEN
PRESORT
SET FILTER TO COUNT > 1
Remarks
Note
For more information about how this command works, see the Analytics Help.
How it works
SUMMARIZE groups records that have the same value, or the same combination of values, in one or
more character, numeric, or datetime fields. The output contains a single record for each group, with a
count of the number of records in the source table that belong to the group.
Depending on the optional parameters you specify, the output table contains the following subtotal and statistics fields:
SUBTOTAL
o Total + subtotaled alternate column title (field name: subtotaled field name) – subtotaled values for each group
MODMEDQ
o Median + subtotaled alternate column title (field name: c_ subtotaled field name) – the median value for each group
l Odd-numbered sets of values: the middle value
l Even-numbered sets of values: the average of the two values at the middle
o Mode + subtotaled alternate column title (field name: o_ subtotaled field name) – the most frequently occurring value for each group
l Displays "N/A" if no value occurs more than once
l In the event of a tie, displays the lowest value
o Q25 + subtotaled alternate column title (field name: q_ subtotaled field name) – the first quartile value for each group (lower quartile value)
l The result is an interpolated value based on an Analytics algorithm
l Produces the same result as the QUARTILE and QUARTILE.INC functions in Microsoft Excel
o Q75 + subtotaled alternate column title (field name: p_ subtotaled field name) – the third quartile value for each group (upper quartile value)
l The result is an interpolated value based on an Analytics algorithm
l Produces the same result as the QUARTILE and QUARTILE.INC functions in Microsoft Excel
TOP command
Moves to the first record in an Analytics table.
Syntax
TOP
Parameters
This command does not have any parameters.
Remarks
When to use TOP
Use TOP to move to the first record in a table if a previous command, such as FIND, selected another record
in the table.
TOTAL command
Calculates the total value of one or more fields in an Analytics table.
Syntax
TOTAL {<FIELDS> numeric_field <...n>|<FIELDS> ALL} <IF test> <WHILE test> <FIRST
range|NEXT range>
Parameters
Name Description
FIELDS numeric_ The numeric fields to total. Specify ALL to total each of the numeric fields in the table.
field <...n> | FIELDS ALL
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The com-
mand is executed until the condition evaluates as false, or the end of the table is
optional
reached.
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
Examples
Totaling the first 25 records
You calculate the total amount of the MKTVAL field for the first 25 records in the table:
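For example, assuming the table is already open:
TOTAL FIELDS MKTVAL FIRST 25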
Remarks
When to use TOTAL
Use TOTAL to verify the completeness and accuracy of the source data and to produce control totals. The
command calculates the arithmetic sum of the specified fields or expressions.
TRAIN command
Uses automated machine learning to create an optimum predictive model using a training data set.
Syntax
TRAIN {CLASSIFIER|REGRESSOR} <ON> key_field <...n> TARGET labeled_field SCORER
{ACCURACY|AUC|F1|LOGLOSS|PRECISION|RECALL|MAE|MSE|R2} SEARCHTIME minutes
MAXEVALTIME minutes MODEL model_name TO table_name <IF test> <WHILE test> <FIRST
range|NEXT range> FOLDS number_of_folds <SEED seed_value> <LINEAR> <NOFP>
Note
The maximum supported size of the data set used with the TRAIN command is 1 GB.
Parameters
Name Description
TARGET labeled_field The field that the model is being trained to predict based on the training input fields.
The different prediction types (classification or regression) work with different field data
types:
SCORER ACCURACY | The metric to use when scoring (tuning and ranking) the generated models.
AUC | F1 | LOGLOSS |
The generated model with the best value for this metric is kept, and the rest of the mod-
PRECISION | RECALL |
els are discarded.
MAE | MSE | R2
A different subset of metrics is valid depending on the prediction type you are using:
o classification – ACCURACY, AUC, F1, LOGLOSS, PRECISION, RECALL
o regression – MAE, MSE, R2
SEARCHTIME minutes The total time in minutes to spend training and optimizing a predictive model.
Training and optimizing involves searching across different pipeline configurations (dif-
ferent model, preprocessor, and hyperparameter combinations).
Note
Total runtime of the TRAIN command is SEARCHTIME plus up to twice
MAXEVALTIME.
Tip
Specify a SEARCHTIME that is 10x the MAXEVALTIME.
This time allotment strikes a reasonable balance between processing
time and allowing a variety of model types to be evaluated.
MODEL model_name The name of the model file output by the training process.
The model file contains the model best fitted to the training data set. You will input the
model to the PREDICT command to generate predictions about a new, unseen data set.
Specify model_name as a quoted string. For example: MODEL "Loan_default_prediction"
You can specify the *.model file extension, or let Analytics automatically specify it.
By default, the model file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the model file to a different, existing
folder:
o TO "C:\Loan_default_prediction"
o TO "ML Train output\Loan_default_prediction.model"
TO table_name The name of the model evaluation table output by the training process.
The model evaluation table contains two distinct types of information:
o Scorer/Metric – for the classification or regression metrics, quantitative estimates of
the predictive performance of the model file output by the training process
Different metrics provide different types of estimates. Scorer identifies the metric you
specified with SCORER. Metric identifies the metrics you did not specify.
o Importance/Coefficient – in descending order, values indicating how much each fea-
ture (predictor) contributes to the predictions made by the model
Specify table_name as a quoted string with a .FIL file extension. For example: TO
"Model_evaluation.FIL"
By default, the table data file (.FIL) is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the data file to a different, existing
folder:
o TO "C:\Model_evaluation.FIL"
o TO "ML Train output\Model_evaluation.FIL"
Note
Table names are limited to 64 alphanumeric characters, not including
the .FIL extension. The name can include the underscore character ( _ ),
but no other special characters, or any spaces. The name cannot start
with a number.
IF test A conditional expression that must be true in order to process each record. The com-
mand is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The com-
mand is executed until the condition evaluates as false, or the end of the table is
optional
reached.
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
FOLDS number_of_folds The number of cross-validation folds to use when evaluating and optimizing the model.
Folds are subdivisions of the training data set, and are used in a cross-validation pro-
cess.
Typically, using from 5 to 10 folds yields good results when training a model. The min-
imum number of folds allowed is 2, and the maximum number is 10.
Tip
With smaller training data sets, increasing the number of folds can pro-
duce a better estimate of the predictive performance of a model, but it
also increases overall runtime.
SEED seed_value The seed value to use to initialize the random number generator in Analytics.
optional If you omit SEED, Analytics randomly selects the seed value.
Explicitly specify a seed value, and record it, if you want to replicate the training process
with the same data set in the future.
NOFP Exclude feature selection and data preprocessing from the training process.
optional Feature selection is the automated selection of the fields in the training data set that are
the most useful in optimizing the predictive model. Automated selection can improve pre-
dictive performance, and reduce the amount of data involved in model optimization.
Data preprocessing performs transformations such as scaling and standardizing on the
training data set to make it better suited for the training algorithms.
Caution
You should only exclude feature selection and data preprocessing if you
have a reason for doing so.
Examples
Train a classification model
You want to train a classification model that you can use in a subsequent process to predict which loan
applicants will default.
You train the model using a set of historical loan data with a known outcome for each loan, including whether
the client defaulted.
In the subsequent prediction process, you will use the model produced by the TRAIN command to process
current loan applicant data.
OPEN "Loan_applicants_historical"
TRAIN CLASSIFIER ON Age Job_Category Salary Account_Balance Loan_Amount Loan_Period
OPEN "House_sales"
TRAIN REGRESSOR ON Lot_Size Bedrooms Bathrooms Stories Driveway Recroom Full_Base-
ment Gas_HW Air_conditioning Garage_Places Preferred_Area TARGET Price SCORER MSE
SEARCHTIME 960 MAXEVALTIME 90 MODEL "House_price_prediction.model" TO "Model_eval-
uation.FIL" FOLDS 5
Remarks
Note
For more information about how this command works, see the Analytics Help.
VERIFY command
Checks for data validity errors in one or more fields in an Analytics table by verifying that the data is con-
sistent with the field definitions in the table layout.
Syntax
VERIFY {<FIELDS> field <...n>|<FIELDS> ALL} <IF test> <WHILE test> <FIRST
range|NEXT range> <ERRORLIMIT n> <TO {SCREEN|filename|PRINT}> <APPEND>
Parameters
Name Description
FIELDS field <...n> | The fields or expressions to verify. Specify ALL to verify all fields in the table.
FIELDS ALL
Note
By definition, computed fields, ad hoc expressions, and binary fields are
always valid.
IF test A conditional expression that must be true in order to process each record. The command
is executed on only those records that satisfy the condition.
optional
Note
The IF parameter is evaluated against only the records remaining in a
table after any scope parameters have been applied (WHILE, FIRST,
NEXT).
WHILE test A conditional expression that must be true in order to process each record. The command
is executed until the condition evaluates as false, or the end of the table is reached.
optional
Note
If you use WHILE in conjunction with FIRST or NEXT, record processing
stops as soon as one limit is reached.
ERRORLIMIT n The number of errors allowed before the command is terminated. The default value is 10.
optional
TO SCREEN | filename | The location to send the results of the command to:
PRINT o SCREEN – displays the results in the Analytics display area
optional o filename – saves the results to a file
Specify filename as a quoted string with the appropriate file extension. For example:
TO "Output.TXT"
By default, the file is saved to the folder containing the Analytics project.
Use either an absolute or relative file path to save the file to a different, existing folder:
l TO "C:\Output.TXT"
l TO "Results\Output.TXT"
o PRINT – sends the results to the default printer
APPEND Appends the command output to the end of an existing file instead of overwriting it.
optional Note
You must ensure that the structure of the command output and the exist-
ing file are identical:
l the same fields
l the same field order
l matching fields are the same length
l matching fields are the same data type
Analytics appends output to an existing file regardless of its structure. If
the structure of the output and the existing file do not match, jumbled, miss-
ing, or inaccurate data can result.
WRITEn The total number of data validity errors identified by the command.
Examples
Verifying data and specifying an error limit
You verify all of the columns in a table and set the error limit to 10. The command stops processing if 10
data validity errors are detected:
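For example, assuming the table is already open:
VERIFY FIELDS ALL ERRORLIMIT 10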
Remarks
How it works
VERIFY compares the values in one or more fields to the data type specified for each of the fields in the
table layout and reports any errors. The command ensures the following:
l character fields – contain only valid characters, and that no unprintable characters are present
l numeric fields – contain only valid numeric data. In addition to numbers, numeric fields can contain
one preceding plus sign or minus sign and one decimal point
l datetime fields – contain valid dates, datetimes, or times
For each error that is identified, the record number and field name are output, along with the invalid value in
hexadecimal format.
Functions
ABS( ) function
Returns the absolute value of a numeric expression. The absolute value of a number is the number without
its sign.
Syntax
ABS(number)
Parameters
Name Type Description
Output
Numeric.
Examples
Basic examples
Returns 7.2:
ABS(7.2)
Returns 7.2:
ABS(-7.2)
AGE( ) function
Returns the age, in days, of a specified date compared to a specified cutoff date, or the current operating
system date.
Syntax
AGE(date/datetime/string <,cutoff_date>)
Parameters
Name Type Description
Note
date/datetime/string and cutoff_date can both accept a datetime value, but the time por-
tion of the value is ignored. You cannot use AGE( ) with time values alone.
Output
Numeric.
Examples
Basic examples
No cutoff date
Returns the number of days between 31 Dec 2014 and the current date:
l If a positive value is returned, it equals the number of days in the past that December 31, 2014 occurred
l If a negative value is returned, it equals the number of days in the future that December 31, 2014 occurs
l If 0 is returned, December 31, 2014 is the current date
AGE(`20141231`)
Returns the number of days between each date in the Due_date field and the current date:
AGE(Due_date)
Cutoff date specified
Returns 518, the number of days between the two specified dates. The following variations all return the same value because the time portion of a datetime is ignored:
AGE(`20130731`,`20141231`)
AGE("20130731","20141231")
AGE(`20130731`,"20141231")
AGE(`20130731 235959`,`20141231`)
Returns the number of days between each date in the Due_date field and December 31, 2014:
AGE(Due_date, `20141231`)
Returns the number of days between December 31, 2014 and each date in the Due_date field. Results are
the same as the example immediately above, but the sign of the returned values (positive or negative) is
reversed:
AGE(`20141231`, Due_date)
Returns the number of days between each date in the Payment_date field and a corresponding date in the Due_date field:
AGE(Payment_date, Due_date)
Returns the number of days between each date in the Payment_date field and a corresponding date in the
Due_date field plus a grace period of 15 days.
l Payment dates prior to due dates, or up to 15 days after due dates, return a positive value
l Payment dates more than 15 days after due dates return a negative value, indicating late payment
outside the grace period
AGE(Payment_date, Due_date+15)
Advanced examples
Extracting overdue payments
Extract the name, amount, and invoice date for each record where the age of the invoice is greater than
180 days, based on a cutoff date of December 31, 2014:
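A sketch of the extraction; the Vendor_Name, Invoice_Amount, and Invoice_Date field names and the output table name are assumptions:
EXTRACT FIELDS Vendor_Name Invoice_Amount Invoice_Date IF AGE(Invoice_Date, `20141231`) > 180 TO "Overdue_invoices"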
Remarks
How it works
The AGE( ) function calculates the number of days between two dates.
AGE(Payment_date, Due_date)
Using the AGE( ) function in this manner is equivalent to calculating the difference between two date fields
by subtracting them in an expression.
For example:
Due_date – Payment_date
Parameter details
A datetime field specified for date/datetime/string or cutoff_date can use any date or datetime format, as
long as the field definition correctly defines the format.
YYYYMMDD `20141231`
"20141231"
YYMMDD `141231`
"141231"
YYYYMMDD hhmmss `20141231 235959`
"20141231 235959"
YYMMDDthhmm `141231t2359`
"141231t2359"
YYYYMMDDThh `20141231T23`
"20141231T23"
YYYYMMDD hhmmss+/-hhmm `20141231 235959-0500`
(UTC offset) "20141231 235959-0500"
YYMMDD hhmm+/-hh `141231 2359+01`
(UTC offset) "141231 2359+01"
Note
Do not use hh alone in the main time
format with data that has a UTC offset. For
example, avoid: hh+hhmm. Results can be
unreliable.
ALLTRIM( ) function
Returns a string with leading and trailing spaces removed from the input string.
Syntax
ALLTRIM(string)
Parameters
Name Type Description
string character The value to remove leading and trailing spaces from.
Output
Character.
Examples
Basic examples
Returns "Vancouver":
ALLTRIM(" Vancouver ")
ALLTRIM(" New York ")
Advanced examples
Concatenating character fields
Use ALLTRIM( ) to eliminate spaces when you concatenate character fields, such as first name and last
name fields, so that the resulting field does not contain multiple blank spaces between the concatenated
values.
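A sketch of such a computed field, assuming First_Name and Last_Name source fields and that non-breaking spaces are CHR(160) in the non-Unicode edition:
DEFINE FIELD Full_Name COMPUTED ALLTRIM(REPLACE(First_Name, CHR(160), " ")) + " " + ALLTRIM(REPLACE(Last_Name, CHR(160), " "))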
The REPLACE( ) function replaces any non-breaking spaces with regular spaces, and then ALLTRIM( )
removes any leading or trailing regular spaces.
Remarks
How it works
The ALLTRIM( ) function removes the leading and trailing spaces from a string. Spaces inside the string
are not removed.
Related functions
Use the LTRIM( ) function if you want to remove only leading spaces from a string, or the TRIM( ) function if
you want to remove only trailing spaces.
ASCII( ) function
Returns the ASCII code for a specified character.
Syntax
ASCII(character)
Parameters
Name Type Description
Output
Numeric.
Examples
Basic examples
Returns 65:
ASCII("A")
Returns 49:
ASCII("1")
Advanced examples
Extracting a record that starts with a tab character
Extract records that have a tab character at the beginning of a field called "Description". The ASCII code
for a tab character is "9".
Remarks
Testing for non-printable characters
You can use ASCII( ) to test for non-printable characters such as:
l Nul – ASCII "0"
l Tab – ASCII "9"
l Line Feed (LF) – ASCII "10"
l Carriage Return (CR) – ASCII "13"
Related functions
ASCII( ) is the inverse of the CHR( ) function.
AT( ) function
Returns a number specifying the starting location of a particular occurrence of a substring within a character
value.
Syntax
AT(occurrence_num, search_for_string, within_text)
Parameters
Name Type Description
search_for_string character The substring to search for in within_text. This value is case-sensitive.
If search_for_string includes double quotation marks, you need to
enclose the value in single quotation marks:
AT(1,'"test"', Description)
AT(1,'"test"', Description+Summary)
Output
Numeric. Returns the starting byte position of the specified occurrence of the search_for_string value, or 0 if
no matches are found.
Examples
Basic examples
Occurrences found
Returns 4:
Returns 8:
Groups of characters
Returns 5:
Searching a field
Returns the byte position of the first hyphen in each value in the Invoice_Number field:
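For example, the following sketches (using an assumed literal value of "800-234-5678" and the Invoice_Number field) produce the results above, in the same order:
AT(1, "-", "800-234-5678")
AT(2, "-", "800-234-5678")
AT(1, "234", "800-234-5678")
AT(1, "-", Invoice_Number)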
Advanced examples
Finding invoice numbers in which the second hyphen occurs after the tenth
byte position
You can analyze the consistency of invoice numbers in a table by using the AT( ) function to create a filter
like the one below. This filter isolates all records in which the invoice number contains two or more hyphens,
and the second hyphen occurs after the tenth byte position:
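A sketch of such a filter, using OCCURS( ) to confirm at least two hyphens:
SET FILTER TO OCCURS(Invoice_Number, "-") > 1 AND AT(2, "-", Invoice_Number) > 10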
Remarks
When to use AT( )
Use this function to retrieve the following starting positions within a character value:
l the starting position of a substring
l the starting position of a subsequent occurrence of the substring
If you only want to confirm multiple occurrences of the same substring in a field, the OCCURS( ) function is a
better alternative. For more information, see "OCCURS( ) function" on page 666.
BETWEEN( ) function
Returns a logical value indicating whether the specified value falls within a range.
Syntax
BETWEEN(value, min, max)
Parameters
Name Type Description
Note
The range evaluating to T (true) includes the min and max values.
For information regarding character ranges, see "SET EXACT behavior" in the Remarks for this
function.
Output
Logical. Returns T (true) if value is greater than or equal to the min value, and less than or equal to the max
value. Returns F (false) otherwise.
Examples
Basic examples
Numeric input
Returns T:
BETWEEN(500,400,700)
Returns F:
BETWEEN(100,400,700)
Character input
Returns T:
BETWEEN("B","A","C")
Returns F, because the character comparison is case-sensitive, and lowercase "b" does not fall between
uppercase "A" and "C":
BETWEEN("b","A","C")
Datetime input
Returns T:
BETWEEN(`141230`,`141229`,`141231`)
Returns T for all values in the Login_time field from 07:00:00 AM to 09:00:00 AM, inclusive, and F oth-
erwise:
BETWEEN(Login_time,`t070000`,`t090000`)
Returns T for all values in the Last_Name field that begin with the letters from "C" to "J", inclusive, and F
otherwise (SET EXACT must be ON). Also returns T for the single letter "K":
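A sketch of the expression; with SET EXACT ON, the max value "K" matches values beginning with "C" through "J" plus the single letter "K":
BETWEEN(Last_Name, "C", "K")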
Field input
Returns T for all values in the Invoice_Date field from 30 Sep 2014 to 30 Oct 2014, inclusive, and F oth-
erwise:
Returns T for all records in which the invoice date does not fall between the PO date and the paid date,
inclusive, and F otherwise:
Returns T for all values in the Invoice_Amount field from $1000 to $5000, inclusive, and F otherwise:
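Sketches of the three expressions, in the same order as the statements above; the PO_Date and Paid_Date field names are assumptions:
BETWEEN(Invoice_Date, `20140930`, `20141030`)
NOT BETWEEN(Invoice_Date, PO_Date, Paid_Date)
BETWEEN(Invoice_Amount, 1000, 5000)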
Advanced examples
Creating a filter to view a salary range
The following example opens the Employee_List sample table and applies a filter that limits the records
displayed to include only employees that earn a salary greater than or equal to $40,000.00, and less than
or equal to $50,000.00.
OPEN Employee_List
SET FILTER TO BETWEEN(Salary, 40000.00, 50000.00)
Remarks
Supported data types
Inputs to the BETWEEN( ) function can be numeric, character, or datetime data. You cannot mix data
types. All three inputs must belong to the same data category.
Numeric data
When the numeric inputs have a different number of decimal places, the comparison uses the greater precision.
Returns F, because 1.23 is less than 1.234 once the third decimal place is considered:
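A sketch of an expression that behaves this way; the max value is an assumption:
BETWEEN(1.23, 1.234, 2.00)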
Character data
Case sensitivity
The BETWEEN( ) function is case-sensitive when used with character data. When it compares characters,
"a" is not equivalent to "A".
Returns F:
BETWEEN("B","a","C")
If you are working with data that includes inconsistent case, you can use the UPPER( ) function to convert
the values to consistent case before using BETWEEN( ).
Returns T:
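A sketch using UPPER( ) on all three inputs; the literal values are assumptions:
BETWEEN(UPPER("b"), UPPER("a"), UPPER("C"))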
Partial matching
Partial matching is supported for character comparisons.
value can be contained by min.
Returns T, even though value "AB" appears to be less than min "ABC":
Note
The shorter value in the character comparison must appear at the start of the longer value
to constitute a match.
Datetime parameters
A date, datetime, or time field specified as a function input can use any date, datetime, or time format, as
long as the field definition correctly defines the format.
Returns T:
BETWEEN(`20141231`,`20141229`,`20141231`)
Returns F, even though 12:00 PM on 31 December 2014 appears to fall within the range specified by min
and max:
BETWEEN(`20141231 120000`,`20141229`,`20141231`)
If we look at the serial number equivalent of these two expressions, we can see why the second one eval-
uates to false.
Returns T, because the serial number value is equal to the serial number max:
Returns F, because the serial number value is greater than the serial number max:
The serial number 42003.500000 is greater than 42003.000000 and therefore is out of range, even though
the two dates are identical. 0.500000 is the serial number equivalent of 12:00 PM.
Returns T, because using DATE( ) and CTOD( ) removes the time portion from the datetime value before the comparison:
BETWEEN(CTOD(DATE(`20141231 120000`,"YYYYMMDD"),"YYYYMMDD"),`20141229`,`20141231`)
YYYYMMDD `20141231`
YYMMDD `141231`
YYYYMMDD hhmmss `20141231 235959`
YYMMDDthhmm `141231t2359`
YYYYMMDDThh `20141231T23`
YYYYMMDD hhmmss+/-hhmm `20141231 235959-0500`
(UTC offset)
YYMMDD hhmm+/-hh `141231 2359+01`
(UTC offset)
thhmmss `t235959`
Thhmm `T2359`
Note
Do not use hh alone in the main time
format with data that has a UTC offset.
For example, avoid: hh+hhmm. Results
can be unreliable.
BINTOSTR( ) function
Returns Unicode character data converted from ZONED or EBCDIC character data. Abbreviation for "Bin-
ary to String".
Note:
This function is specific to the Unicode edition of Analytics. It is not a supported function in
the non-Unicode edition.
Syntax
BINTOSTR(string, string_type)
Parameters
Name Type Description
string character The ZONED or EBCDIC value that you want to convert to Unicode char-
acter encoding.
string_type character The format to convert from. You must specify one of the following val-
ues:
o "A" – convert from ZONED (ASCII) data
o "E" – convert from EBCDIC data
Output
Character.
Examples
Basic examples
The expression ZONED(-6448,4) converts the value -6448 to the character format "644Q", however the
Unicode edition of Analytics requires that you convert the output of ZONED( ) to Unicode characters using
BINTOSTR( ).
Returns "644Q" in Unicode format:
BINTOSTR(ZONED(-6448,4), "A")
Remarks
When to use BINTOSTR( )
Use this function to convert return values from the ZONED( ) and EBCDIC( ) functions to a Unicode value.
Note
If this function is not applied to the return values of ZONED( ) and EBCDIC( ) in Unicode
editions of Analytics, then they are displayed incorrectly because the encoding is not inter-
preted correctly.
BIT( ) function
Returns the binary representation for the specified byte position in the current record as an eight character
string.
Syntax
BIT(byte_location)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "00110001" if the eighth byte contains "1":
BIT(8)
BIT(9)
BIT(17)
Advanced examples
Using BIT ( ) and SUBSTRING ( ) to extract a value
Assume that byte position 17 contains a set of 8 credit flags.
To extract all customer records that have the third bit set to one (meaning "do not ship"), specify:
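A sketch of the extraction, using the SUBSTR( ) function; the output table name is an assumption:
EXTRACT RECORD IF SUBSTR(BIT(17), 3, 1) = "1" TO "Do_not_ship"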
In this example, the SUBSTRING( ) function is used to extract the value of the third bit.
Remarks
How it works
BIT( ) converts the byte at the specified byte position into an eight character string of ones and zeros.
Related functions
If you want to retrieve the character at the specified byte location, use the BYTE( ) function.
BLANKS( ) function
Returns a string containing a specified number of blank spaces.
Syntax
BLANKS(count)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns " ":
BLANKS(5)
Returns "ABC Corporation":
Remarks
When to use BLANKS( )
Use the BLANKS( ) function to harmonize fields, to initialize variables in scripts, or to insert blank spaces
when formatting fields or concatenating strings.
BYTE( ) function
Returns the character stored in the specified byte position in the current record.
Syntax
BYTE(byte_location)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "1" from a record that begins with an ID field containing "1":
BYTE(112)
Advanced examples
Identify records in print files or PDFs based on consistent formatting
Use BYTE( ) to identify records in a data file where a particular character is present in a particular byte pos-
ition. This is typically the case in Print Image (Report) files or Adobe Acrobat (PDF) files where data is
formatted in a consistent way throughout the document.
For example, to locate and extract records that include a period at byte position 113:
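A sketch of the extraction; the output table name is an assumption:
EXTRACT RECORD IF BYTE(113) = "." TO "Formatted_records"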
Remarks
When to use BYTE( )
Use BYTE( ) to examine the contents of a position in a record, without having to define a field for the pur-
pose.
Related functions
If you want to retrieve the binary representation for specified byte location, use the BIT( ) function.
CDOW( ) function
Returns the name of the day of the week for a specified date or datetime. Abbreviation for "Character Day of
Week".
Syntax
CDOW(date/datetime, length)
Parameters
Name Type Description
date/datetime datetime The field, expression, or literal value to return the day name for.
length numeric A value between 1 and 9 that specifies the length of the output string.
To display abbreviated day names, specify a smaller value.
Output
Character.
Examples
Basic examples
Returns "Wednesday" because December 31, 2014 falls on a Wednesday, and length is 9:
CDOW(`20141231`, 9)
Returns "Wed" because December 31, 2014 falls on a Wednesday, and length is 3:
CDOW(`20141231 235959`, 3)
Returns the full day name for each value in the Invoice_date field:
CDOW(Invoice_date, 9)
Returns the abbreviated day name for each value in the Receipt_timestamp field:
CDOW(Receipt_timestamp, 3)
Advanced examples
Adding a field that identifies the days of the week for dates
Use the CDOW( ) function to create a computed field that identifies the days of the week for all the dates in
a date field. Once you have created the computed field, you can add it to the view beside the date column:
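A sketch of the computed field definition, using the Invoice_date field from the examples above; the computed field name is an assumption:
DEFINE FIELD Day_of_Week COMPUTED CDOW(Invoice_date, 9)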
Remarks
Parameter details
A field specified for date/datetime can use any date or datetime format, as long as the field definition cor-
rectly defines the format.
If the length parameter is shorter than the day name, the day name is truncated to the specified length. If
the length parameter is longer than the day name, the day name is padded with blank spaces.
YYYYMMDD `20141231`
YYMMDD `141231`
YYYYMMDD hhmmss `20141231 235959`
YYMMDDthhmm `141231t2359`
YYYYMMDDThh `20141231T23`
YYYYMMDD hhmmss+/-hhmm `20141231 235959-0500`
(UTC offset)
YYMMDD hhmm+/-hh `141231 2359+01`
(UTC offset)
Note
Do not use hh alone in the main time
format with data that has a UTC offset. For
example, avoid: hh+hhmm. Results can be
unreliable.
Related functions
If you need to return the day of the week as a number (1 to 7), use DOW( ) instead of CDOW( ).
CHR( ) function
Returns the character associated with the specified ASCII code.
Syntax
CHR(number)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "A":
CHR(65)
Returns "1":
CHR(49)
Advanced examples
Adding the UK pound symbol (£) to each of the values in a currency field
Create a computed field that adds the pound symbol (ASCII code 163) before amounts in the Invoice_
Amount field. The numeric Invoice_Amount field is first converted to a character field, and leading and trail-
ing blank spaces are trimmed.
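A sketch of the computed field; the field name and the 12-character conversion width are assumptions:
DEFINE FIELD Invoice_Amount_UK COMPUTED CHR(163) + ALLTRIM(STRING(Invoice_Amount, 12))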
Remarks
When to use CHR( )
Use the CHR( ) function to return the character associated with any ASCII code, including those characters
that cannot be entered directly from a keyboard or displayed on screen. With CHR( ), you can search fields
or records for the existence of these specific characters.
Referencing NUL
Referencing the ASCII NUL (null) character, CHR(0), may produce unpredictable results because it is used
by Analytics as a text qualifier, and should be avoided if possible.
Related functions
CHR( ) is the inverse of the ASCII( ) function.
CLEAN( ) function
Replaces the first invalid character in a string, and all subsequent characters, with blanks.
Syntax
CLEAN(string <,extra_invalid_characters>)
Parameters
Name Type Description
string character The value from which to remove default and any extra invalid char-
acters.
extra_invalid_char- character Invalid characters you want to remove from string in addition to the
acters default invalid characters. You may specify more than one extra
invalid character:
optional
" ,;\"
Tab characters, null characters, and carriage return and line feed
characters are default invalid characters that are automatically
removed and do not need to be specified.
To specify the double quotation mark as an extra invalid character,
enclose extra_invalid_characters in single quotation marks:
'"'
Output
Character.
Examples
Basic examples
Returns "ABC " ("ABC" followed by four blank spaces):
CLEAN("ABC%DEF","%")
CLEAN("1234.56,111,2", ",")
Remarks
When to use CLEAN( )
Use this function to ensure that fields containing invalid data are printed correctly. You can also use this func-
tion to isolate parts of a field, such as the last name in a customer field that includes both the first and last
name of the customer.
Automatic CLEAN( )
In an Analytics script, you can apply the CLEAN( ) function automatically to all character fields by adding
SET CLEAN ON to the script. You cannot specify extra individual characters using this option.
CMOY( ) function
Returns the name of the month of the year for a specified date or datetime. Abbreviation for "Character
Month of Year".
Syntax
CMOY(date/datetime, length)
Parameters
Name Type Description
date/datetime datetime The field, expression, or literal value to return the month name for.
length numeric A value between 1 and 9 that specifies the length of the output string.
To display abbreviated month names, specify a smaller value.
Output
Character.
Examples
Basic examples
Returns "December":
CMOY(`20141231`, 9)
Returns "Dec":
CMOY(`20141231 235959`, 3)
Returns the abbreviated month name for each value in the Receipt_timestamp field:
CMOY(Receipt_timestamp, 3)
Returns the full month name for each value in the Invoice_date field:
CMOY(Invoice_date, 9)
Returns the full name of the month 15 days after each value in the Invoice_date field:
CMOY(Invoice_date + 15, 9)
Remarks
Parameter details
A field specified for date/datetime can use any date or datetime format, as long as the field definition cor-
rectly defines the format.
If the length parameter is shorter than the month name, the month name is truncated to the specified length.
If the length parameter is longer than the month name, the month name is padded with blank spaces.
YYYYMMDD `20141231`
YYMMDD `141231`
YYYYMMDD hhmmss `20141231 235959`
YYMMDDthhmm `141231t2359`
YYYYMMDDThh `20141231T23`
YYYYMMDD hhmmss+/-hhmm `20141231 235959-0500`
(UTC offset)
YYMMDD hhmm+/-hh `141231 2359+01`
(UTC offset)
Note
Do not use hh alone in the main time
format with data that has a UTC offset. For
example, avoid: hh+hhmm. Results can be
unreliable.
Related functions
If you need to return the month of the year as a number (1 to 12), use MONTH( ) instead of CMOY( ).
COS( ) function
Returns the cosine of an angle expressed in radians, with a precision of 15 decimal places.
Syntax
COS(radians)
Parameters
Name Type Description
Output
Numeric.
Examples
Basic examples
Returns 0.500000000000000 (the cosine of the specified number of radians):
COS(1.047197551196598)
Advanced examples
Using degrees as input
Returns 0.500000000000000 (the cosine of 60 degrees):
COS(60 * PI( )/180)
Returns 0.500 (the same result rounded to three decimal places using DEC( )):
DEC(COS(60 * PI( )/180),3)
Remarks
Performing the Mantissa Arc Test
The three trigonometric functions in Analytics – SIN( ), COS( ), and TAN( ) – support performing the Man-
tissa Arc Test associated with Benford's Law.
CTOD( ) function
Converts a character or numeric date value to a date. Can also extract the date from a character or numeric
datetime value and return it as a date. Abbreviation for "Character to Date".
Syntax
CTOD(string/number <,format>)
Parameters
Name Type Description
string/number character The field, expression, or literal value to convert to a date, or from which
to extract the date.
numeric
format character The date format of string/number. The format is required for values that
use any date format other than YYYYMMDD or YYMMDD, for example
optional
"DD/MM/YYYY".
Note
If you use the CTOD function with a datetime value that
requires the format parameter, specify only the date portion
of the format, and not the time portion. For example,
specify "DD/MM/YYYY" rather than "DD/MM/YYYY hh:mm:ss".
Output
Datetime. The date value is output using the current Analytics date display format.
Examples
Basic examples
Character literal input
Returns `20141231` displayed as 31 Dec 2014 assuming a current Analytics date display format of DD
MMM YYYY:
CTOD("20141231")
CTOD("31/12/2014", "DD/MM/YYYY")
CTOD("20141231 235959")
CTOD(20141231)
CTOD(31122014, "DDMMYYYY")
CTOD(20141231.235959)
CTOD(Invoice_date, "DD/MM/YYYY")
CTOD(Receipt_timestamp)
CTOD(Due_date, "DDMMYYYY")
CTOD(Payment_timestamp)
Advanced examples
Compare a character or numeric field to a date
Use the CTOD( ) function to compare a date against character or numeric fields that contain values rep-
resenting dates.
The filter below compares two values:
l the numeric Due_date field that stores dates as numbers in the format DDMMYYYY
l the literal date value July 1, 2014
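A sketch of such a filter, isolating records with due dates before July 1, 2014 (the comparison operator is an assumption):
SET FILTER TO CTOD(Due_date, "DDMMYYYY") < `20140701`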
Remarks
Required date formats
Character and numeric fields containing date or datetime values must match the formats in the table below.
Datetime values can use any combination of the date, separator, and time formats valid for their data type.
The date must precede the time, and there must be a separator between the two.
Dates, or the date portion of datetime values, can use any date format supported by Analytics, and valid for
the data type, as long as formats other than YYYYMMDD or YYMMDD are correctly defined by format.
Character fields
o date – any Analytics-supported date format, valid for the data type, if defined by format
o separator between the date and time portions – for example, a single blank space, the letter 't', or the letter 'T'
o time – a time format such as hhmmss or hh, with an optional UTC offset of +/-hhmm, +/-hh:mm, or +/-hh
Note
Do not use hh alone in the main time format with data that has a UTC offset. For example, avoid: hh+hhmm. Results can be unreliable.
Numeric fields
o date – any Analytics-supported date format, valid for the data type, if defined by format
o time – a time format such as hhmm or hh (for example, YYMMDD hhmm)
Related functions
CTODT( ) Converts a character or numeric datetime value to a datetime. Abbreviation for "Character to
Datetime".
CTOT( ) Converts a character or numeric time value to a time. Can also extract the time from a char-
acter or numeric datetime value and return it as a time. Abbreviation for "Character to Time".
DATE( ) Extracts the date from a specified date or datetime and returns it as a character string. Can
also return the current operating system date.
DATETIME( ) Converts a datetime to a character string. Can also return the current operating system dat-
etime.
TIME( ) Extracts the time from a specified time or datetime and returns it as a character string. Can
also return the current operating system time.
STOD( ) Converts a serial date – that is, a date expressed as an integer – to a date value. Abbreviation
for "Serial to Date".
STODT( ) Converts a serial datetime – that is, a datetime expressed as an integer, and a fractional por-
tion of 24 hours – to a datetime value. Abbreviation for "Serial to Datetime".
STOT( ) Converts a serial time – that is, a time expressed as a fractional portion of 24 hours, with 24
hours equaling 1 – to a time value. Abbreviation for "Serial to Time".
CTODT( ) function
Converts a character or numeric datetime value to a datetime. Abbreviation for "Character to Datetime".
Syntax
CTODT(string/number <,format>)
Parameters
Name Type Description
format character The date format of string/number. The format is required for values
that use any date format other than YYYYMMDD or YYMMDD for the
optional
date portion of the value, for example "DD/MM/YYYY".
Output
Datetime. The datetime value is output using the current Analytics date and time display formats.
Examples
Basic examples
Character literal input
Returns `20141231t235959` displayed as 31 Dec 2014 23:59:59 assuming a current Analytics date and
time display formats of DD MMM YYYY and hh:mm:ss:
CTODT("20141231 235959")
CTODT(20141231.235959)
CTODT(31122014.235959, "DDMMYYYY.hhmmss")
CTODT(Receipt_timestamp, "DD/MM/YYYY hh:mm:ss")
CTODT(Payment_timestamp, "DD/MM/YYYY hh:mm:ss")
Advanced examples
Compare a character or numeric field to a datetime
Use the CTODT( ) function to compare a datetime against character or numeric fields that contain values
representing datetimes.
The filter below compares two values:
l the character Receipt_timestamp field that stores datetimes as character data in the format
DD/MM/YYYY hh:mm:ss
l the literal datetime value July 1, 2014 13:30:00
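A sketch of such a filter, isolating records with receipt timestamps before July 1, 2014 13:30:00 (the comparison operator is an assumption):
SET FILTER TO CTODT(Receipt_timestamp, "DD/MM/YYYY hh:mm:ss") < `20140701t133000`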
Remarks
Required datetime formats
Character and numeric fields containing datetime values must match the formats in the table below. The
datetime values can use any combination of the date, separator, and time formats valid for their data type.
The date must precede the time, and there must be a separator between the two.
The date portion of values can use any date format supported by Analytics, and valid for the data type, as
long as formats other than YYYYMMDD or YYMMDD are correctly defined by format. If you use format,
you must also specify the time format, which must be one of the time formats that appear in the table
below.
Analytics automatically recognizes the separator between the date and time portions of datetime values,
so there is no need to specify the separator in format. You can specify the separator if you want to.
Character fields
o date – any Analytics-supported date format, valid for the data type, if defined by format
o separator between the date and time portions – for example, a single blank space, the letter 't', or the letter 'T'
o time – a time format such as hhmmss or hh, with an optional UTC offset of +/-hhmm, +/-hh:mm, or +/-hh
Note
Do not use hh alone in the main time format with data that has a UTC offset. For example, avoid: hh+hhmm. Results can be unreliable.
Numeric fields
o date – any Analytics-supported date format, valid for the data type, if defined by format
o time – a time format such as hhmm or hh (for example, YYMMDD hhmm)
Related functions
CTOD( ) Converts a character or numeric date value to a date. Can also extract the date from a char-
acter or numeric datetime value and return it as a date. Abbreviation for "Character to Date".
CTOT( ) Converts a character or numeric time value to a time. Can also extract the time from a char-
acter or numeric datetime value and return it as a time. Abbreviation for "Character to Time".
DATE( ) Extracts the date from a specified date or datetime and returns it as a character string. Can
also return the current operating system date.
DATETIME( ) Converts a datetime to a character string. Can also return the current operating system dat-
etime.
TIME( ) Extracts the time from a specified time or datetime and returns it as a character string. Can
also return the current operating system time.
STOD( ) Converts a serial date – that is, a date expressed as an integer – to a date value. Abbreviation
for "Serial to Date".
STODT( ) Converts a serial datetime – that is, a datetime expressed as an integer, and a fractional por-
tion of 24 hours – to a datetime value. Abbreviation for "Serial to Datetime".
STOT( ) Converts a serial time – that is, a time expressed as a fractional portion of 24 hours, with 24
hours equaling 1 – to a time value. Abbreviation for "Serial to Time".
CTOT( ) function
Converts a character or numeric time value to a time. Can also extract the time from a character or
numeric datetime value and return it as a time. Abbreviation for "Character to Time".
Syntax
CTOT(string/number)
Parameters
Name Type Description
string/number character The field, expression, or literal value to convert to a time, or from
which to extract the time.
numeric
Output
Datetime. The time value is output using the current Analytics time display format.
Examples
Basic examples
Character literal input
Returns `t235959` displayed as 23:59:59 assuming a current Analytics time display format of hh:mm:ss:
CTOT("t235959")
CTOT("23:59:59")
CTOT("20141231 235959")
Numeric literal input
Returns the same time value of `t235959` for equivalent numeric input:
CTOT(.235959)
CTOT(0.235959)
CTOT(20141231.235959)
Field input
Returns a time value for each value in the Login_time field or the Payment_datetime field:
CTOT(Login_time)
CTOT(Payment_datetime)
Advanced examples
Compare a character or numeric field to a time
Use the CTOT( ) function to compare a time against character or numeric fields that contain values rep-
resenting times.
The filter below compares two values:
l the numeric Login_time field that stores times as numeric data
l the literal time value 09:30:00
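The filter expression is not reproduced above. A sketch along these lines, using the field name already given and a literal time value, would be:
CTOT(Login_time) > `t093000`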
Remarks
Required datetime formats
Character and numeric fields containing time or datetime values must match the formats in the table
below.
Time values can use any combination of separator and time format. There must be a separator before the
time value, or colons between the time components, for the function to operate correctly.
Datetime values can use any combination of the date, separator, and time formats valid for their data type.
The date must precede the time, and there must be a separator between the two.
Use the CTOD( ) function if you want to convert a character or numeric date value to a date, or extract the
date from a character or numeric datetime value and return it as a date.
Use the CTODT( ) function if you want to convert a character or numeric datetime value to a datetime.
Character fields
+/-hhmm
+/-hh:mm
(UTC offset)
+/-hh
(UTC offset)
Note
Do not use hh alone in the main time format with data that has a UTC offset. For example, avoid: hh+hhmm. Results can be unreliable.
Numeric fields
YYMMDD hhmm
hh
CTOD( ) Converts a character or numeric date value to a date. Can also extract the date from a character
or numeric datetime value and return it as a date. Abbreviation for "Character to Date".
CTODT( ) Converts a character or numeric datetime value to a datetime. Abbreviation for "Character to
Datetime".
DATE( ) Extracts the date from a specified date or datetime and returns it as a character string. Can also
return the current operating system date.
DATETIME( ) Converts a datetime to a character string. Can also return the current operating system dat-
etime.
TIME( ) Extracts the time from a specified time or datetime and returns it as a character string. Can also
return the current operating system time.
STOD( ) Converts a serial date – that is, a date expressed as an integer – to a date value. Abbreviation
for "Serial to Date".
STODT( ) Converts a serial datetime – that is, a datetime expressed as an integer, and a fractional portion
of 24 hours – to a datetime value. Abbreviation for "Serial to Datetime".
STOT( ) Converts a serial time – that is, a time expressed as a fractional portion of 24 hours, with 24 hours equaling 1 – to a time value. Abbreviation for "Serial to Time".
CUMIPMT( ) function
Returns the cumulative interest paid on a loan during a range of periods.
Syntax
CUMIPMT(rate, periods, amount, start_period, end_period <,type>)
Parameters
Name Type Description
Note
You must use consistent time periods when specifying rate and periods to ensure that you
are specifying interest rate per period.
For example:
l for monthly payments on a two-year loan or investment with interest of 5% per
annum, specify 0.05/12 for rate and 2 * 12 for periods
l for annual payments on the same loan or investment, specify 0.05 for rate and 2 for
periods
Output
Numeric.
Examples
Basic examples
Returns 17437.23, the total amount of interest paid in the second year of a twenty-five year, $275,000 loan
at 6.5% per annum, with payments due at the end of the month:
Returns 17741.31, the total amount of interest paid on the same loan in the first year of the loan:
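The function calls are not reproduced above. Sketches consistent with these two results, assuming a monthly rate of 0.065/12, 300 monthly periods, and payments at the end of each period, would be:
CUMIPMT(0.065/12, 12*25, 275000, 13, 24, 0)
CUMIPMT(0.065/12, 12*25, 275000, 1, 12, 0)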
Remarks
Related functions
The CUMPRINC( ) function is the complement of the CUMIPMT( ) function.
The IPMT( ) function calculates interest paid for a single period.
CUMPRINC( ) function
Returns the cumulative principal paid on a loan during a range of periods.
Syntax
CUMPRINC(rate, periods, amount, start_period, end_period <,type>)
Parameters
Name Type Description
Note
You must use consistent time periods when specifying rate and periods to ensure that you
are specifying interest rate per period.
For example:
l for monthly payments on a two-year loan or investment with interest of 5% per
annum, specify 0.05/12 for rate and 2 * 12 for periods
l for annual payments on the same loan or investment, specify 0.05 for rate and 2 for
periods
Output
Numeric.
Examples
Basic examples
Returns 4844.61, the total amount of principal paid in the second year of a twenty-five year, $275,000 loan
at 6.5% per annum, with payments due at the end of the month:
Returns 367.24, the amount of principal paid on the same loan in the first month of the loan:
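The function calls are not reproduced above. Sketches consistent with these two results, using the same monthly rate and period assumptions as for CUMIPMT( ), would be:
CUMPRINC(0.065/12, 12*25, 275000, 13, 24, 0)
CUMPRINC(0.065/12, 12*25, 275000, 1, 1, 0)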
Remarks
Related functions
The CUMIPMT( ) function is the complement of the CUMPRINC( ) function.
The PPMT( ) function calculates principal paid for a single period.
DATE( ) function
Extracts the date from a specified date or datetime and returns it as a character string. Can also return the
current operating system date.
Syntax
DATE(< date/datetime> <,format>)
Parameters
Name Type Description
date/datetime (optional) datetime The field, expression, or literal value to extract the date from. If omitted, the current operating system date is returned.
format (optional) character The format to apply to the output string, for example "DD/MM/YYYY". If omitted, the current Analytics date display format is used. You cannot specify a format if you have omitted date/datetime.
Output
Character.
Examples
Basic examples
Returns "20141231" in the current Analytics date display format:
DATE(`20141231 235959`)
Returns "31-Dec-2014":
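A call consistent with this result, using format to override the current display format, would be:
DATE(`20141231 235959`, "DD-MMM-YYYY")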
Returns the current operating system date as a character string, using the current Analytics date display
format:
DATE()
Returns each value in the Receipt_timestamp field as a character string using the current Analytics date
display format:
DATE(Receipt_timestamp)
Returns each value in the Receipt_timestamp field as a character string using the specified date display
format:
DATE(Receipt_timestamp, "DD/MM/YYYY")
Remarks
Output string length
The length of the output string is always 12 characters. If the specified output format, or the Analytics date
display format, is less than 12 characters, the output string is padded with trailing blank spaces.
Parameter details
A field specified for date/datetime can use any date or datetime format, as long as the field definition cor-
rectly defines the format.
If you use format to control how the output string is displayed, you can use any supported Analytics date
display format. For example:
l DD/MM/YYYY
l MM-DD-YY
l DD MMM YYYY
format must be specified using single or double quotation marks – for example, "DD MMM YYYY".
l Datetime values – you can use any combination of the date, separator, and time formats listed in the
table below. The date must precede the time, and you must use a separator between the two. Valid
separators are a single blank space, the letter 't', or the letter 'T'.
l Time values – you must specify times using the 24-hour clock. Offsets from Coordinated Universal
Time (UTC) must be prefaced by a plus sign (+) or a minus sign (-).
YYYYMMDD `20141231`
YYMMDD `141231`
YYYYMMDD hhmmss `20141231 235959`
YYMMDDthhmm `141231t2359`
YYYYMMDDThh `20141231T23`
YYYYMMDD hhmmss+/-hhmm `20141231 235959-0500`
(UTC offset)
YYMMDD hhmm+/-hh `141231 2359+01`
(UTC offset)
Note
Do not use hh alone in the main time
format with data that has a UTC offset. For
example, avoid: hh+hhmm. Results can be
unreliable.
Related functions
If you need to return the current operating system date as a datetime value, use TODAY( ) instead of DATE( ).
DATETIME( ) Converts a datetime to a character string. Can also return the current operating system dat-
etime.
TIME( ) Extracts the time from a specified time or datetime and returns it as a character string. Can also return the current operating system time.
CTOD( ) Converts a character or numeric date value to a date. Can also extract the date from a char-
acter or numeric datetime value and return it as a date. Abbreviation for "Character to Date".
CTODT( ) Converts a character or numeric datetime value to a datetime. Abbreviation for "Character to
Datetime".
CTOT( ) Converts a character or numeric time value to a time. Can also extract the time from a char-
acter or numeric datetime value and return it as a time. Abbreviation for "Character to Time".
STOD( ) Converts a serial date – that is, a date expressed as an integer – to a date value. Abbreviation
for "Serial to Date".
STODT( ) Converts a serial datetime – that is, a datetime expressed as an integer, and a fractional por-
tion of 24 hours – to a datetime value. Abbreviation for "Serial to Datetime".
STOT( ) Converts a serial time – that is, a time expressed as a fractional portion of 24 hours, with 24
hours equaling 1 – to a time value. Abbreviation for "Serial to Time".
DATETIME( ) function
Converts a datetime to a character string. Can also return the current operating system datetime.
Syntax
DATETIME(< datetime> <,format>)
Parameters
Name Type Description
datetime (optional) datetime The field, expression, or literal value to convert. If omitted, the current operating system datetime is returned.
format (optional) character The format to apply to the output string, for example "DD/MM/YYYY". If omitted, the current Analytics date display format is used. You cannot specify a format if you have omitted datetime.
Output
Character.
Examples
Basic examples
Literal datetime input
Returns "20141231 235959" in the current Analytics date and time display formats:
DATETIME(`20141231 235959`)
Returns the current operating system date and time as a character string, using the current Analytics date
and time display formats:
DATETIME()
Field input
Returns each value in the Receipt_timestamp field as a character string using the current Analytics date
and time display formats:
DATETIME(Receipt_timestamp)
Returns each value in the Receipt_timestamp field as a character string using the specified date and time
display formats:
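A call along these lines, assuming the same field and a combined date and time output format, would be:
DATETIME(Receipt_timestamp, "DD/MM/YYYY hh:mm:ss")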
Remarks
Output string length
The length of the output string is always 27 characters. If the specified output format, or the Analytics date
and time display formats, are less than 27 characters, the output string is padded with trailing blank
spaces.
Parameter details
A field specified for datetime can use any datetime format, as long as the field definition correctly defines
the format.
If you use format to control how the output string is displayed, you are restricted to the formats in the table
below.
l You can use any combination of date, time, and AM/PM formats.
l The date must precede the time. Placing a separator between the two is not required as Analytics
automatically uses a single space as a separator in the output string.
l The AM/PM format is optional, and is placed last.
l format must be specified using single or double quotation marks.
For example: "DD-MMM-YYYY hh:mm:ss AM"
Date formats – all supported Analytics date display formats
Time formats (24-hour clock) – hh:mm:ss, hhmm, hh
AM/PM format – none
Example – "DD/MM/YYYY hh:mm:ss"
YYYYMMDD hhmmss `20141231 235959`
YYMMDDthhmm `141231t2359`
YYYYMMDDThh `20141231T23`
YYYYMMDD hhmmss+/-hhmm `20141231 235959-0500`
(UTC offset)
YYMMDD hhmm+/-hh `141231 2359+01`
(UTC offset)
Note
Do not use hh alone in the main time
format with data that has a UTC offset. For
example, avoid: hh+hhmm. Results can be
unreliable.
DATE( ) Extracts the date from a specified date or datetime and returns it as a character string. Can
also return the current operating system date.
TIME( ) Extracts the time from a specified time or datetime and returns it as a character string. Can
also return the current operating system time.
CTOD( ) Converts a character or numeric date value to a date. Can also extract the date from a char-
acter or numeric datetime value and return it as a date. Abbreviation for "Character to Date".
CTODT( ) Converts a character or numeric datetime value to a datetime. Abbreviation for "Character to
Datetime".
CTOT( ) Converts a character or numeric time value to a time. Can also extract the time from a char-
acter or numeric datetime value and return it as a time. Abbreviation for "Character to Time".
STOD( ) Converts a serial date – that is, a date expressed as an integer – to a date value. Abbreviation
for "Serial to Date".
STODT( ) Converts a serial datetime – that is, a datetime expressed as an integer, and a fractional por-
tion of 24 hours – to a datetime value. Abbreviation for "Serial to Datetime".
STOT( ) Converts a serial time – that is, a time expressed as a fractional portion of 24 hours, with 24
hours equaling 1 – to a time value. Abbreviation for "Serial to Time".
DAY( ) function
Extracts the day of the month from a specified date or datetime and returns it as a numeric value (1 to 31).
Syntax
DAY(date/datetime)
Parameters
Name Type Description
date/datetime datetime The field, expression, or literal value to extract the day from.
Output
Numeric.
Examples
Basic examples
Returns 31:
DAY(`20141231`)
DAY(`20141231 235959`)
Returns the day of the month for each value in the Invoice_date field:
DAY(Invoice_date)
Remarks
Parameter details
A field specified for date/datetime can use any date or datetime format, as long as the field definition cor-
rectly defines the format.
YYYYMMDD `20141231`
YYMMDD `141231`
YYYYMMDD hhmmss `20141231 235959`
YYMMDDthhmm `141231t2359`
YYYYMMDDThh `20141231T23`
YYYYMMDD hhmmss+/-hhmm `20141231 235959-0500`
(UTC offset)
YYMMDD hhmm+/-hh `141231 2359+01`
(UTC offset)
Note
Do not use hh alone in the main time
format with data that has a UTC offset.
For example, avoid: hh+hhmm. Results
can be unreliable.
Related functions
If you need to return:
l the day of the week as a number (1 to 7), use DOW( ) instead of DAY( )
l the name of the day of the week, use CDOW( ) instead of DAY( )
DBYTE( ) function
Returns the Unicode character located at the specified byte position in a record.
Note
This function is specific to the Unicode edition of Analytics. It is not a supported function in
the non-Unicode edition.
Syntax
DBYTE(byte_location)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
The examples illustrate the behavior of the function when applied to the following Unicode value, which
contains 11 characters (22 bytes) 美丽 10072DOE:
Returns "丽 ":
DBYTE(3)
Returns "D":
DBYTE(17)
Returns "E":
DBYTE(21)
Remarks
When to use DBYTE( )
Use DBYTE( ) to examine the contents of a position in a record, without having to define a field for this pur-
pose.
DEC( ) function
Returns a value, or the result of a numeric expression, with the specified number of decimal places.
Syntax
DEC(number, decimals)
Parameters
Name Type Description
number numeric The value or result to adjust the number of decimal places for.
o integers – decimal places are added to the end of number as trail-
ing zeros.
o fractional numbers – If the number of decimal places is reduced,
number is rounded, not truncated. If the number of decimal places
is increased, trailing zeros are added to the end of number.
decimals numeric The number of decimal places to use in the return value.
Note
You cannot use DEC( ) to increase the decimal precision of results.
For information about how to increase decimal precision, see Controlling rounding and decimal precision in numeric expressions.
Output
Numeric.
Examples
Basic examples
Returns 7.00:
DEC(7, 2)
Returns 7.565:
DEC(7.5647, 3)
Returns 7.56470:
DEC(7.5647, 5)
Advanced examples
Calculating daily interest
Calculates the daily interest to six decimal places for a field called Annual_rate:
DEC(Annual_rate, 6) / 365
Remarks
When to use DEC( )
Use this function to adjust the number of decimal places in a field, or when you want to round a value or a res-
ult to a specified number of decimal places.
Example
Consider the following series of expressions in Analytics:
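The expressions are not reproduced above. A minimal sketch of the idea (the variable names are illustrative only) is:
ASSIGN v_unrounded = 1.1 * 1.1
COMMENT v_unrounded contains 1.2 because of fixed-point rounding
ASSIGN v_two_decimals = DEC(1.1 * 1.1, 2)
COMMENT v_two_decimals contains 1.20 – a trailing zero is added but precision is not increased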
Fixed-point rounding means that the result of 1.1 * 1.1 is 1.2, not 1.21, which is the unrounded result. Using
DEC( ) to specify a two-decimal-place result does not create two-decimal-place precision. Instead, it adds a
trailing zero to create the specified number of decimal places, without increasing precision.
For information about how to increase decimal precision, see Controlling rounding and decimal precision in
numeric expressions.
Related functions
If you want to round a value to the nearest whole number, use the "ROUND( ) function" on page 756.
DHEX( ) function
Converts a Unicode string to a hexadecimal string.
Note:
This function is specific to the Unicode edition of Analytics. It is not a supported function in
the non-Unicode edition.
Syntax
DHEX(field)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "004100420043003100320033":
DHEX("ABC123")
Remarks
How it works
DHEX( ) displays each double-byte character using big-endian format, with the most significant double-byte
stored first.
Each character is represented by a four-character code. The output string is four times as long as the field
value, and includes the digits between 0 and 9 and letters between A and F that make up the hexadecimal
values.
Related functions
DHEX( ) is the inverse of the HTOU( ) function, which converts a hexadecimal string to a Unicode string.
DICECOEFFICIENT( ) function
Returns the Dice's coefficient of two specified strings, which is a measurement of how similar the two strings
are.
Syntax
DICECOEFFICIENT(string1, string2 <,ngram>)
Parameters
Name Type Description
Output
Numeric. The value is the Dice's coefficient of the two strings, which represents the percentage of the total
number of n-grams in the two strings that are identical. The range is 0.0000 to 1.0000, inclusive.
Examples
Basic examples
How the n-gram length affects the result
The three examples below compare the same two strings. The degree of similarity returned varies depend-
ing on the specified n-gram length.
Returns 0.9167 (using the default n-gram length (2), the n-grams in the two strings are 92% identical):
Returns 1.0000 (using an n-gram length of 1, the n-grams in the two strings are 100% identical):
Returns 0.8261 (using an n-gram length of 3, the n-grams in the two strings are 83% identical):
Field input
Returns the Dice's coefficient of each value in the Address field when compared to the string "125 SW
39TH ST, Suite 100" (based on the default n-gram length of 2):
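A call matching that description would be:
DICECOEFFICIENT(Address, "125 SW 39TH ST, Suite 100")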
Advanced examples
Working with transposed elements
By reducing the n-gram length, and removing non-essential characters, you can optimize
DICECOEFFICIENT( ) when searching for transposed elements.
Returns 0.7368 (using the default n-gram length (2), the n-grams in the two strings are 74% identical):
Returns 1.0000 (by excluding the comma between last name and first name, and by using an n-gram
length of 1, the n-grams in the two strings are 100% identical):
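The compared strings are not shown above. Sketches consistent with these two results, assuming the compared values are "John Smith" and "Smith, John", would be:
DICECOEFFICIENT("John Smith", "Smith, John")
DICECOEFFICIENT("John Smith", EXCLUDE("Smith, John", ","), 1)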
Add the computed field Dice_Co to the view, and then quick sort it in descending order, to rank all values in
the Address field based on their similarity to "125 SW 39TH ST, Suite 100".
Changing the number in the expression allows you to adjust the degree of similarity in the filtered values.
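A sketch of the computed field and a filter of this kind follows; the threshold of 0.5 is only an example of "the number in the expression":
DEFINE FIELD Dice_Co COMPUTED DICECOEFFICIENT(Address, "125 SW 39TH ST, Suite 100")
SET FILTER TO Dice_Co >= 0.5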
Remarks
When to use DICECOEFFICIENT( )
Use the DICECOEFFICIENT( ) function to find nearly identical values (fuzzy duplicates). You can also use
DICECOEFFICIENT( ) to find values with identical or near-identical content, but transposed elements. For
example:
l telephone numbers, or social security numbers, with transposed digits
l versions of the same address, formatted differently
How it works
DICECOEFFICIENT( ) returns the Dice's coefficient of the two evaluated strings, which is a measurement of
the degree of similarity between the strings, on a scale from 0.0000 to 1.0000. The greater the returned
value the more similar the two strings:
l 1.0000 – means that each string is composed of an identical set of characters, although the characters
may be in a different order, and may use different case.
l 0.7500 – means the n-grams in the two strings are 75% identical.
l 0.0000 – means the two strings have no shared n-grams (explained below), or the specified length of
the n-gram used in the calculation is longer than the shorter of the two strings being compared.
Usage tips
l Filtering or sorting – Filtering or sorting the values in a field based on their Dice's coefficient iden-
tifies those values that are most similar to the comparison string.
l Case-sensitivity – The function is not case-sensitive, so "SMITH" is equivalent to "smith."
l Leading and trailing blanks – The function automatically trims leading and trailing blanks in fields,
so there is no need to use the TRIM( ) or ALLTRIM( ) functions when specifying a field as a para-
meter.
l Removing generic elements – The OMIT( ) and EXCLUDE( ) functions can improve the effect-
iveness of the DICECOEFFICIENT( ) function by removing generic elements such as "Corporation"
or "Inc.", or characters such as commas, periods, and ampersands (&), from field values.
Removal of generic elements and punctuation focuses the DICECOEFFICIENT( ) string com-
parison on just the portion of the strings where a meaningful difference may occur.
n-gram length 2 (default)
"John Smith" n-grams: Jo | oh | hn | n_ | _S | Sm | mi | it | th (9 n-grams)
"Smith, John D." n-grams: Sm | mi | it | th | h, | ,_ | _J | Jo | oh | hn | n_ | _D | D. (13 n-grams)
Identical n-grams: 8
Dice's coefficient: 2x8 / (9+13) = 0.7273
n-gram length 3
"John Smith" n-grams: Joh | ohn | hn_ | n_S | _Sm | Smi | mit | ith (8 n-grams)
"Smith, John D." n-grams: Smi | mit | ith | th, | h,_ | ,_J | _Jo | Joh | ohn | hn_ | n_D | _D. (12 n-grams)
Identical n-grams: 6
Dice's coefficient: 2x6 / (8+12) = 0.6000
n-gram length 4
"John Smith" n-grams: John | ohn_ | hn_S | n_Sm | _Smi | Smit | mith (7 n-grams)
"Smith, John D." n-grams: Smit | mith | ith, | th,_ | h,_J | ,_Jo | _Joh | John | ohn_ | hn_D | n_D. (11 n-grams)
Identical n-grams: 4
Dice's coefficient: 2x4 / (7+11) = 0.4444
[Comparison tables of Dice's coefficient (default n-gram of 2) versus Levenshtein distance for sample address pairs and corporation name pairs are not reproduced here.]
DIGIT( ) function
Returns the upper or lower digit of a specified Packed data type byte.
Syntax
DIGIT(byte_location, position)
Parameters
Name Type Description
Output
Numeric.
Examples
Basic examples
A packed field with the value 123.45 (00 12 34 5C), containing two decimals, and starting in byte position 10,
appears in the data record in the following format:
Byte position: 10 | 11 | 12 | 13
Upper half of byte (position 1): 0 | 1 | 3 | 5
Lower half of byte (position 2): 0 | 2 | 4 | C
Returns 3 (finds the digit that appears in the 12th position in the upper half of the byte):
DIGIT(12, 1)
Returns 4 (finds digit that appears in the 12th position in the lower half of the byte):
DIGIT(12, 2)
Remarks
How it works
DIGIT( ) separates individual halves of a byte, and returns the value of the byte specified in the position
parameter as a digit between 0 and 15.
DOW( ) function
Returns a numeric value (1 to 7) representing the day of the week for a specified date or datetime. Abbre-
viation for "Day of Week".
Syntax
DOW(date/datetime)
Parameters
Name Type Description
date/datetime datetime The field, expression, or literal value to extract the numeric day of the
week from.
Output
Numeric.
Examples
Basic examples
Returns 4, because December 31, 2014 falls on a Wednesday, the 4th day of the week:
DOW(`20141231`)
DOW(`20141231 235959`)
Returns the numeric day of the week for each value in the Invoice_date field:
DOW(Invoice_date)
Advanced examples
Identifying transactions occurring on a weekend
Use the DOW( ) function to identify transactions that occur on a weekend. The filter below isolates dates in
the Trans_Date field that occur on a Saturday or a Sunday:
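Given that DOW( ) returns 1 for Sunday and 7 for Saturday, a filter consistent with that description would be:
DOW(Trans_Date) = 7 OR DOW(Trans_Date) = 1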
Remarks
Parameter details
A field specified for date/datetime can use any date or datetime format, as long as the field definition cor-
rectly defines the format.
YYYYMMDD `20141231`
YYMMDD `141231`
YYYYMMDD hhmmss `20141231 235959`
YYMMDDthhmm `141231t2359`
YYYYMMDDThh `20141231T23`
YYYYMMDD hhmmss+/-hhmm `20141231 235959-0500`
(UTC offset)
YYMMDD hhmm+/-hh `141231 2359+01`
(UTC offset)
Note
Do not use hh alone in the main time
format with data that has a UTC offset.
For example, avoid: hh+hhmm. Results
can be unreliable.
Related functions
If you need to return:
l the name of the day of the week, use CDOW( ) instead of DOW( )
l the day of the month as a number (1 to 31), use DAY( ) instead of DOW( )
DTOU( ) function
Converts an Analytics date value to a Unicode string in the specified language and locale format. Abbre-
viation for "Date to Unicode".
Note
This function is specific to the Unicode edition of Analytics. It is not a supported function in
the non-Unicode edition.
Syntax
DTOU(< date> <,locale> <,style>)
Parameters
Name Type Description
date (optional) datetime The field, expression, or literal value to convert to a Unicode string. If omitted, the current operating system date is used.
The date can contain a datetime value, but the time portion of the
value is ignored. Standalone time values are not supported.
You can specify a field or a literal date value:
o Field – can use any date format, as long as the field definition cor-
rectly defines the format
o Literal – must use one of the YYYYMMDD or YYMMDD formats, for
example `20141231`
The minimum supported date value is 31 December 1969.
locale (optional) character The locale code that specifies the language of the output string, and optionally the version of the language associated with a particular country or region.
For example, "zh" specifies Chinese, and "pt_BR" specifies Brazilian
Portuguese.
If omitted, the default locale for your computer is used. If a language
is specified, but no country is specified, the default country for the lan-
guage is used.
You cannot specify locale if you have not specified date.
For information about locale codes, see www.unicode.org.
style (optional) numeric The date format style to use for the Unicode string. The format style matches the standard for the locale you specify.
Output
Character.
Examples
Basic examples
Literal input values
Returns "31 de dezembro de 2014":
DTOU(`20141231`, "pt_BR", 1)
DTOU(`20141231`, "pl", 1)
DTOU(Invoice_date, "zh", 1)
DTOU(`20141231`, "zh", 0)
DTOU(`20141231`, "zh_CN", 0)
DTOU(`20141231`, "zh", 1)
DTOU(`20141231`, "zh_CN", 1)
Remarks
Related functions
DTOU( ) is the inverse of the UTOD( ) function, which converts a Unicode string to a date.
EBCDIC( ) function
Returns a string that has been converted to EBCDIC character encoding.
Syntax
EBCDIC(string)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "ñòó@Æ '…@â£K":
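The input value is not shown above. A string whose EBCDIC encoding matches this output is "123 Fake St.", so the call would presumably be:
EBCDIC("123 Fake St.")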
Advanced examples
Creating an EBCDIC-encoded field to export
To create a field containing the EBCDIC encoded value of a Name field for export to an application that
requires EBCDIC encoding, specify the following:
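A minimal sketch of such a computed field, assuming the source field is named Name and the new field Name_EBCDIC, would be:
DEFINE FIELD Name_EBCDIC COMPUTED EBCDIC(Name)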
Remarks
When to use EBCDIC( )
Use this function to convert data to the Extended Binary Coded Decimal Interchange Code (EBCDIC) char-
acter encoding. EBCDIC character encoding is used primarily on IBM mainframe operating systems, such
as z/OS.
EFFECTIVE( ) function
Returns the effective annual interest rate on a loan.
Syntax
EFFECTIVE(nominal_rate, periods)
Parameters
Name Type Description
Output
Numeric. The rate is calculated to eight decimal places.
Examples
Basic examples
Returns 0.19561817 (19.56%), the effective annual rate of interest on the unpaid balance of a credit card
that charges 18% per annum, compounded monthly:
EFFECTIVE(0.18, 12)
Remarks
What is the effective annual interest rate?
The effective annual interest rate on a loan is the actual annual rate of interest paid, taking into account
interest on the remaining balance, compounded monthly or daily.
Related functions
The NOMINAL( ) function is the inverse of the EFFECTIVE( ) function.
EOMONTH( ) function
Returns the date of the last day of the month that is the specified number of months before or after a spe-
cified date.
Syntax
EOMONTH(< date/datetime> <,months>)
Parameters
Name Type Description
date/datetime (optional) datetime The field, expression, or literal value from which to calculate the end-of-month date. If omitted, the end-of-month date is calculated from the current operating system date.
Note
You can specify a datetime value for date/datetime but the time portion of the value is ignored.
months (optional) numeric The number of months before or after date/datetime. If omitted, the default of 0 (zero) is used. You cannot specify months if you have omitted date/datetime.
Output
Datetime. The date value is output using the current Analytics date display format.
Examples
Basic examples
No input
Returns the last day of the month for the current operating system date:
EOMONTH()
Literal input values
Returns `20140131` displayed as 31 Jan 2014 assuming a current Analytics date display format of DD MMM YYYY:
EOMONTH(`20140115`)
Returns `20140430` displayed as 30 Apr 2014 assuming a current Analytics date display format of
DD MMM YYYY:
EOMONTH(`20140115`, 3)
Returns `20131031` displayed as 31 Oct 2013 assuming a current Analytics date display format of
DD MMM YYYY:
EOMONTH(`20140115`, -3)
Field input
Returns the last day of the month that falls three months after each date in the Invoice_date field:
EOMONTH(Invoice_date, 3)
Returns the last day of the month that falls three months after each date in the Invoice_date field plus a
grace period of 15 days:
EOMONTH(Invoice_date + 15, 3)
Returns the first day of the month in which the invoice date falls:
EOMONTH(Invoice_date, -1) + 1
Remarks
Datetime formats
A field specified for date/datetime can use any date or datetime format, as long as the field definition cor-
rectly defines the format.
A literal date value must use one of the following formats:
l YYYYMMDD
l YYMMDD
You must enclose literal date values in backquotes. For example: `20141231`
EOMONTH(`20140115`) + 1
Related functions
Use the GOMONTH( ) function if you want to return the exact date, rather than the date of the last day of the
month, that is the specified number of months before or after a specified date.
EXCLUDE( ) function
Returns a string that excludes the specified characters.
Syntax
EXCLUDE(string, characters_to_exclude)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns " Alberni Street", which is the input string with all numbers excluded:
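A call consistent with this result, assuming the original string was "783 Alberni Street", would be:
EXCLUDE("783 Alberni Street", "0123456789")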
Returns all the values in the Product_Number field with the forward slash and number sign excluded:
EXCLUDE(Product_Number, "/#")
Remarks
How it works
The EXCLUDE( ) function compares each character in string with the characters listed in characters_to_
exclude. If a match occurs, the character is excluded from the output string.
For example, the output for EXCLUDE("123-45-4536", "-") is "123454536".
No matching characters
If there are no matches between string and characters_to_exclude, then string and the output of the func-
tion are the same.
For example, the output for EXCLUDE("ABC", "D") is "ABC".
Case sensitivity
The EXCLUDE( ) function is case-sensitive. If you specify "ID" in characters_to_exclude, these characters
are not excluded from "id#94022". If there is a chance the case may be mixed, use the UPPER( ) function to
convert string to uppercase.
For example:
EXCLUDE(UPPER("id#94022"), "ID#")
Usage tip
Use EXCLUDE( ) if the set of characters you want to exclude is small, and the set you want to include is
large.
Related functions
The EXCLUDE( ) function is the opposite of the INCLUDE( ) function.
EXP( ) function
Returns the exponential value (base 10) of a numeric expression with a specified number of decimal
places.
Syntax
EXP(number, decimals)
Parameters
Name Type Description
number numeric The numeric field, expression, or value to return the exponential
value of.
Output
Numeric.
Examples
Basic examples
Returns 1000.00:
EXP(3, 2)
Returns 72443.596007:
EXP(4.86, 6)
Advanced examples
Finding the cube root
Creates a field that is the cube root of the field X to two decimal places:
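A sketch of such a computed field, using the LOG( ) function as suggested in the tip below (the field name Cube_Root is illustrative only):
DEFINE FIELD Cube_Root COMPUTED EXP(LOG(X, 6) / 3, 2)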
Tip
You can determine the nth root by dividing the log of the value by n and taking the expo-
nential of the result.
Remarks
How it works
This function returns the exponential value (base 10) of a numeric expression, which is defined as 10 raised
to the nth power. For example, the exponential value of 3 is 10³, or 1000.
Related functions
The inverse of an exponent is its logarithm, so EXP( ) is the opposite of the LOG( ) function.
FILESIZE( ) function
Returns the size of the specified file in bytes or -1 if the file does not exist.
Syntax
FILESIZE(filename)
Parameters
Name Type Description
Output
Numeric.
Examples
Basic examples
Returns 14744:
FILESIZE("Inventory.fil")
If the file you are checking is not in the same folder as the Analytics project, you must specify either the rel-
ative path or absolute path to the file.
Returns 6018:
Advanced examples
Executing a script if a file does not exist
Only executes the script import_data if the file Metaphor_Inventory_2002.fil does not exist:
CALCULATE FILESIZE("Metaphor_Inventory_2002.fil")
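A conditional call consistent with that description, relying on FILESIZE( ) returning -1 when the file does not exist, might look like:
DO import_data IF FILESIZE("Metaphor_Inventory_2002.fil") = -1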
FIND( ) function
Returns a logical value indicating whether the specified string is present in a particular field, or anywhere in
an entire record.
Note
The FIND( ) function and the "FIND command" on page 204 are two separate Analytics
features with significant differences.
Syntax
FIND(string <,field_to_search_in>)
Parameters
Name Type Description
string character The character string to search for. This search is not case-sensitive.
field_to_search_in (optional) character The field, or variable, to search in. If omitted, the entire record is searched, including any undefined portion of the record.
Output
Logical. Returns T (true) if the specified string value is found, and F (false) otherwise.
Examples
Basic examples
Searching an entire record
Returns T for all records that contain the string "New York" in any field, across any field boundaries, and in
any undefined portion of the record. Returns F otherwise:
FIND("New York")
Returns T for all records that contain the string "Ne" in the City field. Returns F otherwise:
FIND("Ne", City)
Returns T for all records that contain the string "New York" preceded by one or more spaces in the City field.
Returns F otherwise:
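A filter matching that description includes the leading space in the search string:
FIND(" New York", City)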
Returns T for all records that have a value in the Description field that matches, or contains, the value in the
v_search_term variable. Returns F otherwise:
FIND(v_search_term, Description)
Returns T for all records that contain the string "New York" in either the City or the City_2 fields. Returns F
otherwise:
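A filter of this kind can concatenate the two fields, as described in the Remarks below:
FIND("New York", City+City_2)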
Returns T for all records that have a value in the Last_Name_2 field that matches, or contains, the trimmed value in the Last_Name field. Returns F otherwise:
FIND(ALLTRIM(Last_Name), Last_Name_2)
Remarks
When to use FIND( )
Use the FIND( ) function to test for the presence of the specified string in a field, two or more fields, or an
entire record.
The concatenated fields are treated like a single field that includes leading and trailing spaces from the indi-
vidual fields, unless you use the ALLTRIM( ) function to remove spaces.
You can also build an expression that searches each field individually:
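For example, assuming the same field names as above:
FIND("New York", City) OR FIND("New York", City_2)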
If string includes a leading space, search results from the two approaches can differ.
FINDMULTI( ) function
Returns a logical value indicating whether any string in a set of one or more specified strings is present in a
particular field, or anywhere in an entire record.
Syntax
FINDMULTI({search_in|RECORD}, string_1 <,...n>)
Parameters
Name Type Description
Field_1+Field_2+Field_3
string_1 <,...n> character One or more character strings to search for. Separate multiple search
strings with commas:
Output
Logical. Returns T (true) if any of the specified string values are found, and F (false) otherwise.
Examples
Basic examples
Searching an entire record
Returns T for all records that contain "New York" or "Chicago" in any field, across any field boundaries, and
in any undefined portion of the record. Returns F otherwise:
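A call matching that description, using the RECORD keyword, would be:
FINDMULTI(RECORD, "New York", "Chicago")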
Returns T for all records that contain the string "Ne" or "Chi" in the City field. Returns F otherwise:
Returns T for all records that contain "New York" or "Chicago" preceded by one or more spaces in the City
field. Returns F otherwise:
Returns T for all records that have a value in the Description field that matches, or contains, any of the values in the v_search_term variables. Returns F otherwise:
Returns T for all records that contain the string "New York" or "Chicago" in either the City or the City_2
fields. Returns F otherwise:
Remarks
When to use FINDMULTI( )
Use the FINDMULTI( ) function to test for the presence of any of the specified strings in a field, two or more
fields, or an entire record.
The concatenated fields are treated like a single field that includes leading and trailing spaces from the indi-
vidual fields, unless you use the ALLTRIM( ) function to remove spaces.
You can also build an expression that searches each field individually:
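For example, assuming the same field names as above:
FINDMULTI(City, "New York", "Chicago") OR FINDMULTI(City_2, "New York", "Chicago")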
If a string value includes a leading space, search results from the two approaches can differ.
FREQUENCY( ) function
Returns the expected Benford frequency for sequential leading positive numeric digits to a precision of
eight decimal places.
Syntax
FREQUENCY(digit_string)
Parameters
Name Type Description
digit_string character A character string containing digits (0-9) to identify the frequency for.
digit_string must be a positive number, and leading zeros are
ignored.
Output
Numeric.
Examples
Basic examples
Returns 0.00998422:
FREQUENCY("43")
Returns 0.00000000:
FREQUENCY("87654321")
Note
The result is 0.00000000495, but because Analytics computes to a precision of eight
decimal places, a zero value is returned.
Remarks
How it works
FREQUENCY( ) returns the expected Benford frequency for sequential leading positive numeric digits to a
precision of eight digits. It lets you perform limited Benford tests for specific situations.
FTYPE( ) function
Returns a character identifying the data category of a field or variable, or the type of an Analytics project
item.
Syntax
FTYPE(field_name_string)
Parameters
Name Type Description
field_name_string character A field name, variable name, or Analytics project item name.
Enclose field_name_string in quotation marks:
FTYPE("Amount")
Output
Character. This function returns one of the following characters, which indicates the field, variable, or Ana-
lytics project item type:
l "C" – Character field
l "N" – Numeric field
l "D" – Datetime field
l "L" – Logical field
l "c" – Character variable
l "n" – Numeric variable
l "d" – Datetime variable
l "l" – Logical variable
l "b" – Analytics script
l "y" – Analytics table layout
l "w" – Analytics workspace
l "i" – Analytics index
l "r" – Analytics report
l "a" – Analytics log file
l "U" – Undefined
Examples
Basic examples
The following example assigns a value of 4 to the num variable and then checks the type.
Returns "n":
ASSIGN num = 4
FTYPE("num")
Advanced examples
Testing for the data type of a field
You have a script or analytic that requires a numeric Amount field, and you need to test that the field is the
correct type before running the script.
The following command only runs Script_1 if Amount is a numeric field:
OPEN Invoices
DO Script_1 IF FTYPE("Amount") = "N"
The ability to detect the runtime environment allows you to design a single script that executes different blocks of code depending on which application it is running in.
FVANNUITY( ) function
Returns the future value of a series of payments calculated using a constant interest rate. Future value is the
sum of the payments plus the accumulated compound interest.
Syntax
FVANNUITY(rate, periods, payment <,type>)
Parameters
Name Type Description
Note
You must use consistent time periods when specifying rate, periods, and payment to
ensure that you are specifying interest rate per period.
For example:
l for a monthly payment on a two-year loan or investment with interest of 5% per
annum, specify 0.05/12 for rate and 2 * 12 for periods
l for an annual payment on the same loan or investment, specify 0.05 for rate and 2 for
periods
Output
Numeric. The result is calculated to two decimal places.
Examples
Basic examples
Monthly payments
Returns 27243.20, the future value of $1,000 paid at the beginning of each month for 2 years at 1% per
month, compounded monthly:
Returns 12809.33, the future value of the same annuity after the first year:
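The function calls are not reproduced above. Sketches consistent with these two results, assuming 1% per month and payments at the beginning of each period, would be:
FVANNUITY(0.01, 24, 1000, 1)
FVANNUITY(0.01, 12, 1000, 1)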
Annual payments
Returns 25440.00, the future value of $12,000 paid at the end of each year for 2 years at 12% per annum,
compounded annually:
FVANNUITY(0.12, 2, 12000, 0)
Advanced examples
Annuity calculations
Annuity calculations involve four variables:
l present value, or future value – $21,243.39 and $26,973.46 in the examples below
l payment amount per period – $1,000.00 in the examples below
l interest rate per period – 1% per month in the examples below
l number of periods – 24 months in the examples below
If you know the value of three of the variables, you can use an Analytics function to calculate the fourth.
PVANNUITY( )
Returns 21243.39:
Returns 26973.46:
PMT( )
Returns 1000:
RATE( )
Returns 0.00999999 (1%):
NPER( )
Returns 24.00:
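The function calls themselves are not reproduced above. Sketches consistent with these results follow, in order; the parameter order shown is an assumption, so see each function's reference topic for the exact syntax:
PVANNUITY(0.01, 24, 1000)
FVANNUITY(0.01, 24, 1000)
PMT(0.01, 24, 21243.39)
RATE(24, 1000, 21243.39)
NPER(0.01, 1000, 21243.39)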
Annuity formulas
The formula for calculating the present value of an ordinary annuity (payment at the end of a period):
The formula for calculating the future value of an ordinary annuity (payment at the end of a period):
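The formulas are not reproduced above. In standard notation, with r the interest rate per period, n the number of periods, and PMT the payment per period:
PV = PMT * (1 - (1 + r)^-n) / r
FV = PMT * ((1 + r)^n - 1) / r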
Remarks
Related functions
The PVANNUITY( ) function is the inverse of the FVANNUITY( ) function.
FVLUMPSUM( ) function
Returns the future value of a current lump sum calculated using a constant interest rate.
Syntax
FVLUMPSUM(rate, periods, amount)
Parameters
Name Type Description
amount numeric The investment made at the start of the first period.
Note
You must use consistent time periods when specifying rate and periods to ensure that you
are specifying interest rate per period.
For example:
l for monthly payments on a two-year loan or investment with interest of 5% per
annum, specify 0.05/12 for rate and 2 * 12 for periods
l for annual payments on the same loan or investment, specify 0.05 for rate and 2 for
periods
Output
Numeric. The result is calculated to two decimal places.
Examples
Basic examples
Interest compounded monthly
Returns 1269.73, the future value of a lump sum of $1,000 invested for 2 years at 1% per month, com-
pounded monthly:
Returns 1126.83, the future value of the same investment after the first year:
Returns 27243.20, the future value of $21,455.82 invested for 2 years at 1% per month, compounded
monthly:
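The function calls are not reproduced above. Sketches consistent with the three monthly-compounding results would be:
FVLUMPSUM(0.01, 24, 1000)
FVLUMPSUM(0.01, 12, 1000)
FVLUMPSUM(0.01, 24, 21455.82)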
Interest compounded annually
Returns 1254.40, the future value of a lump sum of $1,000 invested for 2 years at 12% per annum, compounded annually:
FVLUMPSUM(0.12, 2, 1000)
Remarks
What is future value?
The future value of an invested lump sum is the initial investment principal plus the accumulated compound
interest.
Related functions
The PVLUMPSUM( ) function is the inverse of the FVLUMPSUM( ) function.
FVSCHEDULE( ) function
Returns the future value of a current lump sum calculated using a series of interest rates.
Syntax
FVSCHEDULE(principal, rate1 <,rate2...>)
Parameters
Name Type Description
Output
Numeric. The result is calculated to two decimal places.
Examples
Basic examples
Returns 1282.93, the future value of a lump sum of $1000 invested for 3 years at 10% for the first year, 9%
for the second year, and 7% for the third year, compounded annually:
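A call consistent with this result would be:
FVSCHEDULE(1000, 0.10, 0.09, 0.07)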
Remarks
The future value of an invested lump sum is the initial investment principal plus the accumulated com-
pound interest.
GETOPTIONS( ) function
Returns the current setting for the specified Analytics option (Options dialog box setting).
Syntax
GETOPTIONS(option)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns the current settings for the three Analytics separator characters. For example, ".,,":
GETOPTIONS("separators")
Advanced examples
Using GETOPTIONS( ) in a script
If a script needs to change one or more of the Analytics separator characters, the GETOPTIONS( ) func-
tion provides a method for discovering the current settings. The current settings can be stored in a variable
and then reinstated at the end of the script.
ASSIGN v_SeparatorsSetting = GETOPTIONS("separators")
SET SEPARATORS ",.;"
< script content>
SET SEPARATORS "%v_SeparatorsSetting%"
Remarks
The three Analytics separator characters are specified in the following options in the Options dialog box:
l Decimal Place Symbol
l Thousands Separator
l List Separator
GOMONTH( ) function
Returns the date that is the specified number of months before or after a specified date.
Syntax
GOMONTH(date/datetime, months)
Parameters
Name Type Description
date/datetime datetime The field, expression, or literal value from which to calculate the output
date.
Output
Datetime. The date value is output using the current Analytics date display format.
Examples
Basic examples
Literal input values
Returns `20140415` displayed as 15 Apr 2014 assuming a current Analytics date display format of
DD MMM YYYY:
GOMONTH(`20140115`, 3)
Returns `20131015` displayed as 15 Oct 2013 assuming a current Analytics date display format of
DD MMM YYYY:
GOMONTH(`20140115`, -3)
Returns `20140430` displayed as 30 Apr 2014 assuming a current Analytics date display format of
DD MMM YYYY (date rounding prevents returning 31 Apr 2014, which is an invalid date):
GOMONTH(`20140330`, 1)
GOMONTH(`20140331`, 1)
Returns `20140501` displayed as 01 May 2014 assuming a current Analytics date display format of
DD MMM YYYY:
GOMONTH(`20140401`, 1)
Field input
Returns the date three months after each date in the Invoice_date field:
GOMONTH(Invoice_date, 3)
Returns the date three months after each date in the Invoice_date field plus a grace period of 15 days:
GOMONTH(Invoice_date + 15, 3)
Remarks
Datetime formats
A field specified for date/datetime can use any date or datetime format, as long as the field definition cor-
rectly defines the format.
A literal date value must use one of the following formats:
l YYYYMMDD
l YYMMDD
You must enclose literal date values in backquotes. For example: `20141231`
GOMONTH(`20140331`,1)
Related functions
Use the EOMONTH( ) function if you want to return the date of the last day of the month, rather than the
exact date, that is the specified number of months before or after a specified date.
HASH( ) function
Returns a salted cryptographic hash value based on the input value.
Syntax
HASH(field <,salt_value>)
Parameters
Name Type Description
salt_value (optional) character or numeric The salt value to use. You can specify a PASSWORD identifier number from 1 to 10, or a character string. If omitted, the Analytics default salt value is used. The salt value is limited to 128 characters, and is automatically truncated to 128 characters if you specify a longer salt value. For more information, see "The salt value" on page 591.
Output
Character.
Examples
Basic examples
With the Analytics default salt value
Returns "819A974BB91215D58E7753FD5A42226150100A0763087CA7DECD93F3C3090405":
HASH("555-44-3322")
Returns the hash value for each number in the Credit_card_num field:
HASH(Credit_card_num)
Advanced examples
Ensuring hash values are identical
Use other functions in conjunction with HASH( ) to standardize clear text values that should produce
identical hash values.
Consider the following set of examples. Note how the case of the clear text values completely alters the out-
put hash value in the first two examples.
Returns "DF6789E1EC65055CD9CA17DD5B0BEA5892504DFE7661D258737AF7CB9DC46462":
HASH("John Smith")
Returns "3E12EABB5940B7A2AD90A6B0710237B935FAB68E629907927A65B3AA7BE6781D":
HASH("JOHN SMITH")
By using the UPPER( ) function to standardize case, an identical hash value results.
Returns "3E12EABB5940B7A2AD90A6B0710237B935FAB68E629907927A65B3AA7BE6781D":
HASH(UPPER("John Smith"))
If the comment fields are in separate tables, create a computed HASH( ) field in each table and then use
the computed fields as a common key field to do an unmatched join of the two tables. The records in the
joined output table represent text blocks that are not identical.
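A minimal sketch of such a computed field, assuming the comment field in each table is named Comments, would be:
DEFINE FIELD Comment_Hash COMPUTED HASH(Comments)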
Remarks
When to use HASH( )
Use the HASH( ) function to protect sensitive data, such as credit card numbers, salary information, or
social security numbers.
How it works
HASH( ) provides one-way encoding. Data in clear text can be used to produce a hash value, however the
hash value cannot subsequently be unencoded or decrypted.
A specific clear text value always produces the same hash value, so you can search a field of hashed
credit card numbers for duplicates, or join two fields of hashed credit card numbers, and the results are the
same as if you had performed the operation on the equivalent clear text fields.
Note
The PASSWORD salt value must be entered before the field in the HASH( ) function
can be extracted.
The benefit of using a PASSWORD identifier number with HASH( ) is that you do not have to
expose a clear text salt value.
For more information, see "PASSWORD command" on page 350.
HEX( ) function
Converts an ASCII string to a hexadecimal string.
Syntax
HEX(field)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "3132333435":
HEX("12345")
Returns the hexadecimal equivalent of each value in the Count field:
HEX(Count)
Remarks
How it works
This function returns the hexadecimal string that is equivalent to the field value or expression you specify.
You can use the function when you need to identify the exact contents of a field, including characters that
cannot be displayed on screen, such as CR (carriage return), LF (line feed), and NUL (null).
HOUR( ) function
Extracts the hour from a specified time or datetime and returns it as a numeric value using the 24-hour
clock.
Syntax
HOUR(time/datetime)
Parameters
Name Type Description
time/datetime datetime The field, expression, or literal value to extract the hour from.
Output
Numeric.
Examples
Basic examples
Returns 23:
HOUR(`t235959`)
HOUR(`20141231 235959`)
HOUR(Call_start_time)
Remarks
Parameter details
A field specified for time/datetime can use any time or datetime format, as long as the field definition cor-
rectly defines the format.
Specifying a literal time or datetime value
When specifying a literal time or datetime value for time/datetime, you are restricted to the formats in the
table below, and you must enclose the value in backquotes – for example, `20141231 235959`.
Do not use any separators such as slashes (/) or colons (:) between the individual components of dates or
times.
l Time values – you can use any of the time formats listed in the table below. You must use a sep-
arator before a standalone time value for the function to operate correctly. Valid separators are the
letter 't', or the letter 'T'. You must specify times using the 24-hour clock. Offsets from Coordinated
Universal Time (UTC) must be prefaced by a plus sign (+) or a minus sign (-).
l Datetime values – you can use any combination of the date, separator, and time formats listed in
the table below. The date must precede the time, and you must use a separator between the two.
Valid separators are a single blank space, the letter 't', or the letter 'T'.
thhmmss `t235959`
Thhmm `T2359`
YYYYMMDD hhmmss `20141231 235959`
YYMMDDthhmm `141231t2359`
YYYYMMDDThh `20141231T23`
YYYYMMDD hhmmss+/-hhmm `20141231 235959-0500`
(UTC offset)
YYMMDD hhmm+/-hh `141231 2359+01`
(UTC offset)
Note
Do not use hh alone in the main time
format with data that has a UTC offset.
For example, avoid: hh+hhmm. Results
can be unreliable.
HTOU( ) function
Converts a hexadecimal string to a Unicode string. Abbreviation for "Hexadecimal to Unicode".
Note
This function is specific to the Unicode edition of Analytics. It is not a supported function in
the non-Unicode edition.
Syntax
HTOU(hex_string)
Parameters
Name Type Description
hex_string character The hexadecimal string to convert to a Unicode string. The string can
only contain hexadecimal values, for example "20AC".
Output
Character.
Examples
Basic examples
Returns "ABC123":
HTOU("004100420043003100320033")
Advanced examples
Adding a currency symbol to a value
You need to extract a monetary field to a new table. The field should display the numeric Amount field's values preceded by a Euro currency symbol.
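A sketch of the extraction, assuming the Euro symbol's Unicode code point 20AC and an illustrative output table name, might be:
EXTRACT FIELDS HTOU("20AC") + STRING(Amount, 10) AS "Currency_Amount" TO New_Table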
When the EXTRACT command runs, HTOU( ) returns the Euro symbol "€" and concatenates it with the
Amount value that STRING( ) converts to a character. If the original value of Amount was 2000, then the
value in Currency_Amount is "€2000".
Remarks
Related functions
HTOU( ) is the inverse of the DHEX( ) function, which converts a Unicode string to a hexadecimal string.
INCLUDE( ) function
Returns a string that includes only the specified characters.
Syntax
INCLUDE(string, characters_to_include)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "123", which is the input string with only numbers included:
Returns "1231234", which is the input string with only numbers included:
INCLUDE("123-123-4", "1243")
Returns "" (nothing), because the input string does not contain "D":
INCLUDE("ABC", "D")
Remarks
How it works
The INCLUDE( ) function compares each character in string with the characters listed in characters_to_
include. If a match occurs, the character is included in the output string.
No matching characters
If there are no matches between string and characters_to_include the output of the function is blank.
Case sensitivity
The INCLUDE( ) function is case-sensitive. If you specify "ID" in characters_to_include, these characters
are not included in "id#94022". If there is a chance the case may be mixed, use the UPPER( ) function to
convert string to uppercase.
For example:
INCLUDE(UPPER("id#94022"), "ID0123456789")
Usage tip
Use INCLUDE( ) if the set of characters you want to include is small, and the set you want to exclude is
large.
Related functions
The INCLUDE( ) function is the opposite of the EXCLUDE( ) function.
INSERT( ) function
Returns the original string with specified text inserted at a specific byte location.
Syntax
INSERT(string, insert_text, location)
Parameters
Name Type Description
location numeric The character position at which to insert insert_text into the string.
Output
Character.
Examples
Basic examples
Returns "aXXXbcde":
INSERT("abcde", "XXX", 2)
Returns "XXXabcde":
INSERT("abcde", "XXX", 0)
Returns "abcdeXXX", with "XXX" inserted at byte position 6 instead of 8, because "abcde" is only 5 bytes long:
INSERT("abcde", "XXX", 8)
Remarks
How it works
The INSERT( ) function inserts specified characters or spaces into a character string, beginning at a spe-
cified position in the string.
Location guidelines
l If the location value is greater than the length of string, the insert_text value is inserted at the end of
the string.
l If location is 0 or 1, insert_text is inserted at the beginning of the string.
INT( ) function
Returns the integer value of a numeric expression or field value.
Syntax
INT(number)
Parameters
Name Type Description
number numeric The field or numeric expression to convert to an integer. If the value
specified includes decimals, the decimals are truncated without round-
ing.
Output
Numeric.
Examples
Basic examples
Returns 7:
INT(7.9)
Returns -7:
INT(-7.9)
IPMT( ) function
Returns the interest paid on a loan for a single period.
Syntax
IPMT(rate, specified_period, periods, amount <,type>)
Parameters
Name Type Description
specified_period numeric The period for which you want to find the interest payment.
Note
You must use consistent time periods when specifying rate and periods to ensure that you
are specifying interest rate per period.
For example:
l for monthly payments on a two-year loan or investment with interest of 5% per
annum, specify 0.05/12 for rate and 2 * 12 for periods
l for annual payments on the same loan or investment, specify 0.05 for rate and 2 for
periods
Output
Numeric.
Examples
Basic examples
Returns 1489.58, the interest paid in the first month of a twenty-five year, $275,000 loan at 6.5% per annum,
with payments due at the end of the month:
Returns 10.00, the interest paid on the same loan in the last month of the loan:
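The function calls are not reproduced above. Sketches consistent with these two results, assuming a monthly rate of 0.065/12 and 300 total periods, would be:
IPMT(0.065/12, 1, 12*25, 275000)
IPMT(0.065/12, 300, 12*25, 275000)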
Remarks
Related functions
The PPMT( ) function is the complement of the IPMT( ) function.
The CUMIPMT( ) function calculates interest paid during a range of periods.
ISBLANK( ) function
Returns a logical value indicating whether the input value is blank.
Syntax
ISBLANK(string)
Parameters
Name Type Description
Output
Logical. Returns T (true) if the string parameter value is blank, and F (false) otherwise.
Examples
Basic examples
Returns F:
ISBLANK(" A")
Returns T:
ISBLANK(" ")
ISBLANK("")
Returns T for all values in the Address field that are blank, and F otherwise:
ISBLANK(Address)
Remarks
When to use ISBLANK( )
Use ISBLANK( ) during the data integrity phase of an analysis project to identify fields with missing data,
which may indicate issues with the source data.
Null characters
ISBLANK( ) may not return useful results when used with character fields that contain null characters. Ana-
lytics uses the null character to indicate the end of a string, and for this reason the ISBLANK( ) function will
not read any character data that follows a null character, including blanks.
ISDEFINED( ) function
Returns T (true) if the specified field or variable is defined, and F (false) otherwise.
Syntax
ISDEFINED(string)
Parameters
Name Type Description
string character The name of the field or variable to check for the existence of. The
value must be entered as a quoted string:
ISDEFINED("v_numeric_limit")
Output
Logical.
Examples
Basic examples
Returns T if v_numeric_limit is defined as a variable or field, otherwise returns F:
ISDEFINED("v_numeric_limit")
Advanced examples
Using ISDEFINED( ) to test a field
The following example uses the ISDEFINED( ) function to test if the Limit field is defined in the table before
extracting records based on the value in the field:
OPEN Metaphor_Employees_US
IF ISDEFINED("Limit") EXTRACT RECORD IF Limit > 50000 TO "HighLimit.fil"
ISFUZZYDUP( ) function
Returns a logical value indicating whether a string is a fuzzy duplicate of a comparison string.
Syntax
ISFUZZYDUP(string1, string2, levdist <,diffpct>)
Parameters
Name Type Description
levdist numeric The maximum allowable Levenshtein distance between the two
strings for them to be identified as fuzzy duplicates.
The levdist value cannot be less than 1 or greater than 10.
Increasing the levdist value increases the number of results by including values with a greater degree of fuzziness – that is, values that are
more different from each other.
Output
Logical. Returns T (true) if string values are fuzzy duplicates, and F (false) otherwise.
Examples
Basic examples
Returns F, because two edits are required to transform "Smith" into "Smythe", but the levdist value is only 1:
ISFUZZYDUP("Smith","Smythe", 1, 99)
Returns T, because two edits are required to transform "Smith" into "Smythe", and the levdist value is 2:
ISFUZZYDUP("Smith","Smythe", 2, 99)
Returns T, because zero edits are required to transform "SMITH" into "smith", and the levdist value is 1 (the
ISFUZZYDUP( ) function is not case-sensitive):
ISFUZZYDUP("SMITH","smith", 1, 99)
Returns a logical value (T or F) indicating whether individual values in the Last_Name field are fuzzy duplic-
ates for the string "Smith":
ISFUZZYDUP(Last_Name,"Smith", 3, 99)
Advanced examples
Working with difference percentage
The difference percentage gives you a tool for reducing the number of false positives returned by
ISFUZZYDUP( ).
No diffpct specified
Returns T, because five edits are required to transform "abc" into "Smith", and the levdist value is 5:
ISFUZZYDUP("abc", "Smith", 5)
diffpct specified
Returns F, even though "abc" is within the specified Levenshtein distance of "Smith", because 5 edits divided by a
string length of 3 results in a difference percentage of 167%, which exceeds the specified diffpct of 99%:
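A sketch of a call consistent with that description:
ISFUZZYDUP("abc", "Smith", 5, 99)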
Changing the levdist or diffpct values allows you to adjust the amount of difference in the filtered values.
Remarks
When to use ISFUZZYDUP( )
Use the ISFUZZYDUP( ) function to find nearly identical values (fuzzy duplicates) or locate inconsistent
spelling in manually entered data.
How it works
The ISFUZZYDUP( ) function calculates the Levenshtein distance between two strings, and calculates the
difference percentage.
ISFUZZYDUP( ) evaluates to T (true) if:
l The Levenshtein distance is less than or equal to the levdist value.
l The difference percentage is less than or equal to the diffpct value (if specified).
Levenshtein distance
The Levenshtein distance is a value representing the minimum number of single character edits required
to make one string identical to the other string.
For more information, see "LEVDIST( ) function" on page 620.
Difference percentage
The difference percentage is the percentage of the shorter of the two evaluated strings that is different.
The difference percentage is the result of the following internal Analytics calculation, which uses the Leven-
shtein distance between the two strings:
Levenshtein distance / number of characters in the shorter string × 100 = difference percentage
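For example, the Levenshtein distance between "Smith" and "Smythe" is 2, and the shorter string is 5 characters long, so the difference percentage is 2 / 5 × 100 = 40%.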
Using the optional difference percentage helps reduce the number of false positives returned by
ISFUZZYDUP( ):
l The upper threshold for diffpct is 99%, which prevents the entire replacement of a string in order to
make it identical.
l Strings that require a large number of edits in relation to their length are excluded.
Usage tips
l Case-sensitivity – The function is not case-sensitive, so "SMITH" is equivalent to "smith."
l Trailing blanks – The function automatically trims trailing blanks in fields, so there is no need to use
the TRIM( ) function when specifying a field as a parameter.
l Removing generic elements – The OMIT( ) function can improve the effectiveness of the
ISFUZZYDUP( ) function by removing generic elements such as "Corporation" or "Inc." from field val-
ues.
Removal of generic elements focuses the ISFUZZYDUP( ) string comparison on just the portion of
the strings where a meaningful difference may occur.
Related functions
l LEVDIST( ) – provides an alternate method for comparing strings based on Levenshtein distance.
Unlike ISFUZZYDUP( ), LEVDIST( ) is case-sensitive by default.
l DICECOEFFICIENT( ) – de-emphasizes or completely ignores the relative position of characters or
character blocks when comparing strings.
l SOUNDSLIKE( ) and SOUNDEX( ) – compare strings based on a phonetic comparison (sound)
rather than on an orthographic comparison (spelling).
LAST( ) function
Returns a specified number of characters from the end of a string.
Syntax
LAST(string, length)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "Savings":
Returns "efghi":
LAST("abcdefghi", 5)
Returns "fghi ":
LAST("abcdefghi ", 5)
Returns " abc", because the string value is shorter than the specified length of 6, so leading spaces are
added to the output:
LAST("abc", 6)
Remarks
Blank results caused by trailing spaces
Trailing spaces in string can cause the results produced by the LAST( ) function to be blank.
For example, the output for LAST("6483-30384 ", 3) is " ".
You can use the ALLTRIM( ) function in conjunction with LAST( ) to remove any trailing spaces in string.
For example, LAST(ALLTRIM("6483-30384 "), 3) returns "384".
LEADING( ) function
Returns a string containing a specified number of leading digits.
Syntax
LEADING(number, leading_digits)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Literal numeric input
Returns 623:
LEADING(6234.56, 3)
Returns 62345:
LEADING(-6234.56, 5)
Returns 000:
LEADING(0.00, 3)
Returns 00000:
LEADING(0.00, 5)
Returns 35500:
LEADING(3.55, 5)
Remarks
Use LEADING( ) to extract digits from a numeric field as a string, and filter out non-digit elements such as
decimals or dollar signs.
LENGTH( ) function
Returns the number of characters in a string.
Syntax
LENGTH(string)
Parameters
Name Type Description
Output
Numeric.
Examples
Basic examples
Returns 15:
LENGTH("ABC Corporation")
Returns the length in characters of the Description field in the table layout:
LENGTH(Description)
Advanced examples
Displaying the length of each address in an address field
Create a computed field that displays the length in characters of each address in the Vendor_Street field.
Leading and trailing blank spaces are first trimmed from the address values so they are not counted in the
length.
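A sketch of such a computed field (the field name Address_Length is illustrative):
DEFINE FIELD Address_Length COMPUTED LENGTH(ALLTRIM(Vendor_Street))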
Remarks
How it works
The LENGTH( ) function counts the number of characters in string, including any spaces, and returns the
number.
Trailing spaces
Trailing spaces are counted as characters. If you do not want trailing spaces to be counted, use the TRIM( )
or ALLTRIM( ) functions to remove them. For example:
LENGTH(TRIM(Vendor_Street))
If you create a computed field to display the length of the values in a field, and you do not remove trailing
spaces, the maximum length of the field is displayed for each value.
LEVDIST( ) function
Returns the Levenshtein distance between two specified strings, which is a measurement of how much the
two strings differ.
Syntax
LEVDIST(string1, string2 <,case_sensitive>)
Parameters
Name Type Description
Output
Numeric. The value is the Levenshtein distance between two strings.
Examples
Basic examples
Returns 3, because two substitutions and one insertion are required to transform "smith" into "Smythe":
LEVDIST("smith","Smythe")
Returns 2, because case is ignored, so only two substitutions are required to transform "smith's" into
"Smythes":
LEVDIST("smith's","Smythes",F)
Returns the Levenshtein distance between each value in the Last_Name field and the string "Smith":
LEVDIST(TRIM(Last_Name),"Smith")
Advanced examples
Ranking values against "Smith"
Create the computed field Lev_Dist to display the Levenshtein distance between "Smith" and each value in
the Last_Name field:
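A sketch of the computed field definition:
DEFINE FIELD Lev_Dist COMPUTED LEVDIST(TRIM(Last_Name), "Smith")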
Add the computed field Lev_Dist to the view, and then quick sort it in ascending order, to rank all values in
the Last_Name field by their amount of difference from "Smith".
Changing the number in the expression allows you to adjust the amount of Levenshtein distance in the
filtered values.
Remarks
When to use LEVDIST( )
Use the LEVDIST( ) function to find nearly identical values (fuzzy duplicates) or locate inconsistent spelling
in manually entered data. LEVDIST( ) also identifies exact duplicates.
How it works
The LEVDIST( ) function returns the Levenshtein distance between the two evaluated strings, which is a
value representing the minimum number of single character edits required to make one string identical to
the other string.
Each required edit increments the value of the Levenshtein distance by 1. The greater the Levenshtein dis-
tance, the greater the difference between the two strings. A distance of zero (0) means the strings are
identical.
Types of edits
The edits can be of three types:
l insertion
l deletion
l substitution
Transpositions (two adjacent letters reversed) are not recognized by the Levenshtein algorithm, and count
as two edits – specifically, two substitutions.
Non-alphanumeric characters
Punctuation marks, special characters, and blanks are treated as single characters, just like letters and
numbers.
LEVDIST("abc", "dec")
Returns 3:
LEVDIST("abc", "cde")
Related functions
l ISFUZZYDUP( ) – provides an alternate method for comparing strings based on Levenshtein dis-
tance.
Unlike the default behavior of LEVDIST( ), ISFUZZYDUP( ) is not case-sensitive.
l DICECOEFFICIENT( ) – de-emphasizes or completely ignores the relative position of characters or
character blocks when comparing strings.
l SOUNDSLIKE( ) and SOUNDEX( ) – compare strings based on a phonetic comparison (sound)
rather than on an orthographic comparison (spelling).
LOG( ) function
Returns the logarithm (base 10) of a numeric expression or field value with a specified number of decimal
places.
Syntax
LOG(number, decimals)
Parameters
Name Type Description
decimals numeric The number of decimal places for the return value.
Output
Numeric.
Examples
Basic examples
Returns 3.0000:
LOG(1000, 4)
Returns 4.86:
LOG(72443, 2)
Advanced examples
Finding the cube root
Creates a field that is the cube root of the field X to two decimal places:
Note
You determine the nth root by dividing the log of the value by n and taking the exponential
value of the result.
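A sketch of the expression, assuming a numeric field named X and using the EXP( ) function for the exponential step:
EXP(LOG(X, 6) / 3, 2)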
Remarks
How it works
The logarithm of a number is the exponent (or power) of 10 needed to generate that number. Therefore, the
logarithm of 1000 is 3.
Related functions
The LOG( ) function is the inverse of the EXP( ) function.
LOWER( ) function
Returns a string with alphabetic characters converted to lowercase.
Syntax
LOWER(string)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "abc":
LOWER("ABC")
LOWER("AbCd 12")
LOWER(Last_Name)
Remarks
How it works
The LOWER( ) function converts all alphabetic characters in string to lowercase. All non-alphabetic char-
acters are left unchanged.
LTRIM( ) function
Returns a string with leading spaces removed from the input string.
Syntax
LTRIM(string)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Note that in both examples the trailing spaces are not removed by the LTRIM( ) function.
Returns "Vancouver ":
LTRIM(" Vancouver ")
LTRIM(" New York ")
Advanced examples
Removing non-breaking spaces
Non-breaking spaces are not removed by the LTRIM( ) function.
If you need to remove leading non-breaking spaces, create a computed field using the following expression:
The REPLACE( ) function replaces any non-breaking spaces with regular spaces, and then LTRIM( )
removes any leading regular spaces.
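A sketch of the expression, assuming the field is named Description and using CHR(160) for the non-breaking space and CHR(32) for the regular space:
LTRIM(REPLACE(Description, CHR(160), CHR(32)))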
Remarks
How it works
The LTRIM( ) function removes leading spaces only. Spaces inside the string, and trailing spaces, are not
removed.
Related functions
LTRIM( ) is related to the TRIM( ) function, which removes any trailing spaces from a string, and to the
ALLTRIM( ) function, which removes both leading and trailing spaces.
MAP( ) function
Returns a logical value indicating if a character string matches a specified format string containing wildcard
characters, literal characters, or both.
Syntax
MAP(string, format)
Parameters
Name Type Description
string character The field, expression, or literal value to test for matches.
format character The data pattern, or character string, you want to compare with string.
format can contain wildcard characters, literal characters, or a com-
bination of the two:
"\9\9\9-999-9999"
Output
Logical. Returns T (true) if a match is found, and F (false) otherwise.
Examples
Basic examples
Simple search patterns
Returns T:
MAP("045", "9999")
Escaping a wildcard
If the goal is to return T for only those values that start with the literal character "X", followed by any second
letter, the format parameter "\XX" ensures that the first "X" in the parameter is interpreted literally and not as
a wildcard.
Returns T:
MAP("XA-123", "XX")
MAP("GC-123", "XX")
MAP("XA-123", "\XX")
Returns F:
MAP("GC-123", "\XX")
MAP(Invoice_Number, "XX99999")
Returns T for all records with invoice numbers that are exactly "AB12345", or that start with "AB12345".
Returns F otherwise:
MAP(Invoice_Number, "AB12345")
Returns T for all records with invoice numbers that consist of, or that start with, "AB" followed by five num-
bers. Returns F otherwise:
MAP(Invoice_Number, "AB99999")
Returns T for all records that do not match the standard format of social security numbers in the SSN field.
Returns F otherwise:
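A sketch of such a test, assuming social security numbers formatted as three, two, and four digits separated by hyphens:
NOT MAP(SSN, "999-99-9999")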
Advanced examples
Extracting records with 10-character product codes and with the leading
characters "859-"
Use an IF statement and the MAP( ) function to extract only those records that have product codes at least
10 characters long, and the leading characters "859-":
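A sketch of the extraction, assuming a Product_Code field and product codes made up of "859-" followed by six digits (the output table name is illustrative):
EXTRACT RECORD IF MAP(Product_Code, "859-999999") TO "Long_Product_Codes"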
Remarks
When to use MAP( )
Use the MAP( ) function to search for patterns or particular formats in alpha-numeric data. The patterns or
formats can be made up of wildcard characters, literal characters, or a combination of both.
Case sensitivity
The MAP( ) function is case-sensitive when comparing two literal characters. For example, "a" is not equi-
valent to "A".
If string includes character data with inconsistent case, you can use the UPPER( ) function to convert the
values to consistent case before using MAP( ).
For example:
MAP(UPPER(Invoice_Number), "AB99999")
Partial matching
MAP( ) supports partial matching in one situation but not in the other.
Partial matching in MAP( ) is not affected by the Exact Character Comparisons option (SET EXACT ON/OFF).
Partial matching is supported when the string value is longer than the format value. Returns T:
MAP("AB1234567", "AB99999")
Note
To return True, the format value must appear at the start of the string value.
Partial matching is not supported when the format value is longer than the string value. Returns F:
MAP("AB1234", "AB99999")
Concatenating fields
You can concatenate two or more fields in string if you want to search in more than one field in a table. The
concatenated fields are treated like a single field that includes leading and trailing blanks from the individual
fields, unless you use the ALLTRIM( ) function to remove them.
MASK( ) function
Performs a bitwise AND operation on the first bytes of two character strings.
Syntax
MASK(character_value, character_mask)
Parameters
Name Type Description
character_mask character The string with the byte to test against (the mask value).
Output
Character. The output is the character representation of the binary result of the bitwise AND operation.
Examples
Basic examples
Returns "2" (00110010), the result of a bitwise AND of 3 (00110011) and 6 (00110110):
MASK("3", "6")
Remarks
When to use MASK( )
Use MASK( ) to identify specific bit patterns in a byte of data, including whether or not a particular bit is set
to 1.
How it works
The MASK( ) function performs a bitwise AND operation on the binary representations of the first characters
of character_value and character_mask. The two comparison bytes are compared one bit at a time, res-
ulting in a third binary value.
The result of each comparison of corresponding bits is either 1 or 0:
character_value bit   character_mask bit   result bit
0   0   0
0   1   0
1   0   0
1   1   1
MATCH( ) function
Returns a logical value indicating whether the specified value matches any of the values it is compared
against.
Syntax
MATCH(comparison_value, test <,...n>)
Parameters
Name Type Description
comparison_value   character, numeric, or datetime   The field, expression, or literal value to test for matches.
test <,...n>   character, numeric, or datetime   Any field, expression, or literal value you want to compare with comparison_value. You can specify as many test values as necessary, but all specified values must be of the same data type.
Note
Inputs to the MATCH( ) function can be character, numeric, or datetime data. You cannot
mix data types. All inputs must belong to the same data category.
Output
Logical. Returns T (true) if at least one match is found, and F (false) otherwise.
Examples
Basic examples
Note
Return values for character comparisons assume that SET EXACT is OFF (the default set-
ting), except where noted.
Returns F:
Testing a field
Returns T for all records that contain "Phoenix", "Austin", or "Los Angeles" in the Vendor_City field. Returns
F otherwise:
Returns T for all records that do not contain "Phoenix", "Austin", or "Los Angeles" in the Vendor_City field.
Returns F otherwise:
Returns T for all records that contain "PHOENIX", "AUSTIN", or "LOS ANGELES" in the Vendor_City field,
regardless of the case of any of the characters in the field. Returns F otherwise:
Values in the Vendor_City field are converted to uppercase before being compared with the uppercase city
names.
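Sketches of calls matching the three descriptions above (Vendor_City is the field named in the descriptions):
MATCH(Vendor_City, "Phoenix", "Austin", "Los Angeles")
NOT MATCH(Vendor_City, "Phoenix", "Austin", "Los Angeles")
MATCH(UPPER(Vendor_City), "PHOENIX", "AUSTIN", "LOS ANGELES")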
SET EXACT behavior
Returns T for all records that have product codes "A", "D", or "F", or product codes beginning with "A", "D",
or "F", in the Product_Code field. Returns F otherwise:
Returns T for all records that have one-character product codes "A", "D", or "F" in the Product_Code field.
Returns F otherwise (SET EXACT must be ON):
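Both descriptions correspond to a call like the following sketch, using the Product_Code field named above; the result depends on the SET EXACT setting:
MATCH(Product_Code, "A", "D", "F")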
Comparing two fields
Returns T for all records that contain identical vendor and employee addresses. Returns F otherwise:
MATCH(Vendor_Address, Employee_Address)
Comparing dates
Returns T for all records with an invoice date of 30 Sep 2014 or 30 Oct 2014. Returns F otherwise:
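A sketch of the comparison, assuming an Invoice_Date field:
MATCH(Invoice_Date, `20140930`, `20141030`)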
Advanced examples
Extracting anomalous inventory records
Use an IF statement and the MATCH( ) function to extract records that contain different amounts in the
Inventory_Value_at_Cost field and the computed Cost_x_Quantity field:
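A sketch of the extraction (the output table name is illustrative):
EXTRACT RECORD IF NOT MATCH(Inventory_Value_at_Cost, Cost_x_Quantity) TO "Anomalies"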
Remarks
Use MATCH( ) instead of the OR operator
You can use the MATCH( ) function instead of expressions that use the OR operator.
For example:
MATCH(Vendor_City, "Phoenix", "Austin", "Los Angeles")
is equivalent to:
Vendor_City = "Phoenix" OR Vendor_City = "Austin" OR Vendor_City = "Los Angeles"
Returns F, because 1.23 does not equal 1.234 once the third decimal place is considered:
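A minimal sketch consistent with that description (the original call may have included additional test values):
MATCH(1.23, 1.234)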
Character parameters
Case sensitivity
The MATCH( ) function is case-sensitive when used with character data. When it compares characters, "a"
is not equivalent to "A".
Returns F:
MATCH("a","A","B","C")
If you are working with data that includes inconsistent case, you can use the UPPER( ) function to convert
the values to consistent case before using MATCH( ).
Returns T:
Partial matching
Partial matching is supported for character comparisons. Either value being compared can be contained
by the other value and be considered a match.
Both of these examples return T:
MATCH("AB", "ABC")
MATCH("ABC", "AB")
Note
The shorter value must appear at the start of the longer value to constitute a match.
Datetime parameters
A date, datetime, or time field specified as a function input can use any date, datetime, or time format, as
long as the field definition correctly defines the format.
Analytics uses serial number equivalents to process datetime calculations, so even if you are interested in
only the date portion of a datetime value, the time portion still forms part of the calculation.
Consider the following examples:
Returns T, because 31 December 2014 matches the second test value:
MATCH(`20141231`,`20141229`,`20141231`)
Returns F, even though the comparison_value and the second test value have an identical date of 31
December 2014:
MATCH(`20141231 120000`,`20141229`,`20141231`)
If we look at the serial number equivalent of these two expressions, we can see why the second one eval-
uates to false.
Returns T, because the serial number comparison_value is equal to the second serial number test:
Returns F, because the serial number comparison_value does not equal any of the test values:
The date portion of the serial numbers 42003.500000 and 42003.000000 match, but the time portions do
not. 0.500000 is the serial number equivalent of 12:00 PM.
Returns T, because DATE( ) extracts the date portion of the datetime as a character string, and CTOD( ) converts it back to a date, so the time portion no longer affects the comparison:
MATCH(CTOD(DATE(`20141231 120000`,"YYYYMMDD"),"YYYYMMDD"),`20141229`,`20141231`)
l Datetime values – you can use any combination of the date, separator, and time formats listed in
the table below. The date must precede the time, and you must use a separator between the two.
Valid separators are a single blank space, the letter 't', or the letter 'T'.
l Time values – you must specify times using the 24-hour clock. Offsets from Coordinated Universal
Time (UTC) must be prefaced by a plus sign (+) or a minus sign (-).
YYYYMMDD `20141231`
YYMMDD `141231`
YYYYMMDD hhmmss `20141231 235959`
YYMMDDthhmm `141231t2359`
YYYYMMDDThh `20141231T23`
YYYYMMDD hhmmss+/-hhmm `20141231 235959-0500`
(UTC offset)
YYMMDD hhmm+/-hh `141231 2359+01`
(UTC offset)
thhmmss `t235959`
Thhmm `T2359`
Note
Do not use hh alone in the main time
format with data that has a UTC offset.
For example, avoid: hh+hhmm. Results
can be unreliable.
MAXIMUM( ) function
Returns the maximum value in a set of numeric values, or the most recent value in a set of datetime values.
Syntax
MAXIMUM(value_1, value_2 <,...n>)
Parameters
Name Type Description
Output
Numeric or Datetime.
Examples
Basic examples
Literal numeric input
Returns 7:
MAXIMUM(4, 7)
Returns 8:
MAXIMUM(4, 7, 3, 8)
Returns 8.00:
MAXIMUM(4, 7.25, 3, 8)
Returns `23:59:59`:
Field input
Returns the most recent date among the three fields for each record:
Advanced examples
Creating a computed field with a minimum default value
If you have a table of overdue accounts, create an Interest_Due computed field with a minimum default
value of $1.00:
If the balance multiplied by the interest rate is less than one dollar, MAXIMUM( ) returns 1. Otherwise,
MAXIMUM( ) returns the calculated interest amount.
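A sketch of the computed field, assuming Balance and Annual_Rate fields:
DEFINE FIELD Interest_Due COMPUTED MAXIMUM(Balance * Annual_Rate, 1.00)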
Remarks
How decimal places work in sets of numeric values
If the numeric values being compared do not have the same number of decimal places, the result is adjusted
to the largest number of decimal places.
Returns 20.400:
You can use the DECIMALS( ) function to adjust the number of decimals for value parameters.
Returns 20.40:
MINIMUM( ) function
Returns the minimum value in a set of numeric values, or the oldest value in a set of datetime values.
Syntax
MINIMUM(value_1, value_2 <,...n>)
Parameters
Name Type Description
Output
Numeric or Datetime.
Examples
Basic examples
Literal numeric input
Returns 4:
MINIMUM(4, 7)
Returns 3:
MINIMUM(4, 7, 3, 8)
Returns 3.00:
MINIMUM(4, 7.25, 3, 8)
Returns `23:59:57`:
Field input
Returns the oldest date among the three fields for each record:
Advanced examples
Identifying the lowest value among multiple fields
Create a computed field to identify the lowest value among the Cost, Sale_Price, and Discount_Price
fields:
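A sketch of the computed field (the field name Lowest_Value is illustrative):
DEFINE FIELD Lowest_Value COMPUTED MINIMUM(Cost, Sale_Price, Discount_Price)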
Remarks
How decimal places work in sets of numeric values
If the numeric values being compared do not have the same number of decimal places, the result is adjus-
ted to the largest number of decimal places.
Returns 3.600:
MINIMUM(3.6,10.88, 20.482)
You can use the DECIMALS( ) function to adjust the number of decimals for value parameters.
Returns 3.60:
MINUTE( ) function
Extracts the minutes from a specified time or datetime and returns it as a numeric value.
Syntax
MINUTE(time/datetime)
Parameters
Name Type Description
time/datetime datetime The field, expression, or literal value to extract the minutes from.
Output
Numeric.
Examples
Basic examples
Returns 59:
MINUTE(`t235930`)
MINUTE(`20141231 235930`)
MINUTE(Call_start_time)
Remarks
Abbreviating MINUTE( ) in scripts
In ACLScript, if you abbreviate the MINUTE( ) function, you must use at least the first four letters ( MINU ).
Analytics reserves the abbreviation MIN for the MINIMUM( ) function.
Parameter details
A field specified for time/datetime can use any time or datetime format, as long as the field definition cor-
rectly defines the format.
Specifying a literal time or datetime value
When specifying a literal time or datetime value for time/datetime, you are restricted to the formats in the
table below, and you must enclose the value in backquotes – for example, `20141231 235959`.
Do not use any separators such as slashes (/) or colons (:) between the individual components of dates or
times.
l Time values – you can use any of the time formats listed in the table below. You must use a sep-
arator before a standalone time value for the function to operate correctly. Valid separators are the
letter 't', or the letter 'T'. You must specify times using the 24-hour clock. Offsets from Coordinated
Universal Time (UTC) must be prefaced by a plus sign (+) or a minus sign (-).
l Datetime values – you can use any combination of the date, separator, and time formats listed in
the table below. The date must precede the time, and you must use a separator between the two.
Valid separators are a single blank space, the letter 't', or the letter 'T'.
thhmmss `t235959`
Thhmm `T2359`
YYYYMMDD hhmmss `20141231 235959`
YYMMDDthhmm `141231t2359`
YYYYMMDDThh `20141231T23`
YYYYMMDD hhmmss+/-hhmm `20141231 235959-0500`
(UTC offset)
YYMMDD hhmm+/-hh `141231 2359+01`
(UTC offset)
Note
Do not use hh alone in the main time
format with data that has a UTC offset.
For example, avoid: hh+hhmm. Results
can be unreliable.
MOD( ) function
Returns the remainder from dividing two numbers.
Syntax
MOD(number, divisor_number)
Parameters
Name Type Description
Output
Numeric.
Examples
Basic examples
Returns 3:
MOD(93, 10)
Returns 2.0:
MOD(66, 16.00)
Returns 3.45:
MOD(53.45, 10)
Advanced examples
Calculating an anniversary date
Defines a field that shows the number of months since the last anniversary:
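A sketch of the computed field, assuming a numeric Months_of_Service field:
DEFINE FIELD Months_Since_Anniversary COMPUTED MOD(Months_of_Service, 12)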
Remarks
When to use MOD( )
Use the MOD( ) function to test whether two numbers divide evenly, or to isolate the remainder of a division
calculation. This function divides one number by another and returns the remainder.
MONTH( ) function
Extracts the month from a specified date or datetime and returns it as a numeric value (1 to 12).
Syntax
MONTH(date/datetime)
Parameters
Name Type Description
date/datetime datetime The field, expression, or literal value to extract the month from.
Output
Numeric.
Examples
Basic examples
Returns 12:
MONTH(`20141231`)
MONTH(`20141231 235959`)
MONTH(Invoice_date)
Remarks
Parameter details
A field specified for date/datetime can use any date or datetime format, as long as the field definition cor-
rectly defines the format.
YYYYMMDD `20141231`
YYMMDD `141231`
YYYYMMDD hhmmss `20141231 235959`
YYMMDDthhmm `141231t2359`
YYYYMMDDThh `20141231T23`
YYYYMMDD hhmmss+/-hhmm `20141231 235959-0500`
(UTC offset)
YYMMDD hhmm+/-hh `141231 2359+01`
(UTC offset)
Note
Do not use hh alone in the main time
format with data that has a UTC offset. For
example, avoid: hh+hhmm. Results can be
unreliable.
Related functions
If you need to return the name of the month of the year, use CMOY( ) instead of MONTH( ).
NOMINAL( ) function
Returns the nominal annual interest rate on a loan.
Syntax
NOMINAL(effective_rate, periods)
Parameters
Name Type Description
Output
Numeric. The rate is calculated to eight decimal places.
Examples
Basic examples
Returns 0.17998457 (18%), the nominal annual rate of interest on the unpaid balance of a credit card that
charges an effective annual rate of 19.56%:
NOMINAL(0.1956, 12)
Remarks
What is the nominal interest rate?
The nominal annual interest rate on a loan is the stated or posted rate of interest, without taking into
account interest on the remaining balance, compounded monthly or daily.
Related functions
The EFFECTIVE( ) function is the inverse of the NOMINAL( ) function.
NORMDIST( ) function
Returns the probability that a random variable from a normally distributed data set is less than or equal to a
specified value, or exactly equal to a specified value.
Syntax
NORMDIST(x, mean, standard_deviation, cumulative)
Parameters
Name Type Description
x numeric The value for which you want to calculate the probability.
standard_deviation numeric The standard deviation of the data set. The standard_deviation value
must be greater than 0.
cumulative logical Specify T to calculate the probability that a random variable is less
than or equal to x (cumulative probability), or F to calculate the prob-
ability that a random variable is exactly equal to x (simple probability).
Output
Numeric.
Examples
Basic examples
Returns 0.908788780274132:
Returns 0.109340049783996:
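Calls consistent with the two results above (a sketch; x = 42, mean = 40, and standard deviation = 1.5 are assumed inputs, with T returning the cumulative probability and F the simple probability):
NORMDIST(42, 40, 1.5, T)
NORMDIST(42, 40, 1.5, F)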
NORMSINV( ) function
Returns the z-score associated with a specified probability in a standard normal distribution. The z-score is
the number of standard deviations a value is from the mean of a standard normal distribution.
Syntax
NORMSINV(probability)
Parameters
Name Type Description
probability numeric The probability for which you want to calculate the z-score.
Output
Numeric.
Examples
Basic examples
Returns 1.333401745213610:
NORMSINV(0.9088)
NOW( ) function
Returns the current operating system time as a Datetime data type.
Syntax
NOW()
Parameters
This function does not have any parameters.
Output
Datetime.
Examples
Basic examples
Returns the current operating system time as a datetime value, for example, `t235959`, displayed using
the current Analytics time display format:
NOW()
Remarks
Related functions
If you need to return the current operating system time as a character string, use TIME( ) instead of NOW(
).
NPER( ) function
Returns the number of periods required to pay off a loan.
Syntax
NPER(rate, payment, amount <,type>)
Parameters
Name Type Description
Output
Numeric.
Examples
Basic examples
Returns 300.00, the number of months required to pay off a $275,000 loan at 6.5% per annum, with pay-
ments of $1,856.82 due at the end of each month:
Returns 252.81, the number of months required to pay off the same loan, with payments of $2,000 due at
the end of each month:
Returns 249.92, the number of months required to pay off the same loan, with payments of $2,000 due at
the beginning of each month:
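Sketches of the three calls described above (0.065/12 as the monthly rate):
NPER(0.065/12, 1856.82, 275000)
NPER(0.065/12, 2000, 275000)
NPER(0.065/12, 2000, 275000, 1)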
Advanced examples
Annuity calculations
Annuity calculations involve four variables:
l present value, or future value – $21,243.39 and $ 26,973.46 in the examples below
l payment amount per period – $1,000.00 in the examples below
l interest rate per period – 1% per month in the examples below
l number of periods – 24 months in the examples below
If you know the value of three of the variables, you can use an Analytics function to calculate the fourth.
PVANNUITY( )
Returns 21243.39:
FVANNUITY( )
Returns 26973.46:
PMT( )
Returns 1000:
RATE( )
Returns 0.00999999 (1%):
NPER( )
Returns 24.00:
Annuity formulas
The formula for calculating the present value of an ordinary annuity (payment at the end of a period):
The formula for calculating the future value of an ordinary annuity (payment at the end of a period):
OCCURS( ) function
Returns a count of the number of times a substring occurs in a specified character value.
Syntax
OCCURS(string, search_for)
Parameters
Name Type Description
OCCURS(First_Name+Last_Name,"John")
Output
Numeric.
Examples
Basic examples
Returns 2:
OCCURS("abc/abc/a","ab")
Returns 3:
OCCURS("abc/abc/a","a")
Returns the number of times a hyphen occurs in each value in the Invoice_Number field:
OCCURS(Invoice_Number, "-")
Advanced examples
Finding invoice numbers with more than one hyphen
If invoice numbers in a table should have only one hyphen, use the OCCURS( ) function to create a filter that
isolates invoice numbers that have two or more hyphens:
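A sketch of such a filter:
SET FILTER TO OCCURS(Invoice_Number, "-") > 1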
Including the ALLTRIM( ) function in the expression removes any leading or trailing spaces from the Last_
Name field, ensuring that only text values are compared.
If you want to find all occurrences of "United Equipment" regardless of casing, use the UPPER( ) function to
convert the search field values to uppercase:
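A sketch of such a filter, assuming the values are searched in a Vendor_Name field:
SET FILTER TO OCCURS(UPPER(Vendor_Name), "UNITED EQUIPMENT") > 0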
OFFSET( ) function
Returns the value of a field with the starting position offset by a specified number of bytes.
Syntax
OFFSET(field, number_of_bytes)
Parameters
Name Type Description
Output
The return value is the same data type as the input field parameter.
Examples
Basic examples
If you have a field called "Number" that contains the value "1234567890" and you define an overlapping
field called "Offset_Number" that has a starting position of 1, a length of 3, and no decimals places, you
can use the OFFSET( ) function to shift the numbers in the field.
Returns 123:
OFFSET(Offset_Number,0)
Returns 234:
OFFSET(Offset_Number,1)
Returns 789:
OFFSET(Offset_Number,6)
Remarks
You can use this function to temporarily offset the starting position of a field. This is useful when you are pro-
cessing data where the field starting position is variable.
If you use the OFFSET( ) function with conditional computed fields, any fields referenced in the IF test will
also be offset.
OMIT( ) function
Returns a string with one or more specified substrings removed.
Syntax
OMIT(string1, string2 <,case_sensitive>)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Literal character input
Returns "Intercity Couriers":
Note
The Levenshtein distance between the returned values in the first two examples is 1. If the
generic elements are not removed, the distance between the two examples is 8, which
could allow the values to escape detection as fuzzy duplicates.
Field input
Returns all the values in the Vendor_Name field with generic elements such as "Corporation" and "Inc."
removed:
Returns all the values in the Vendor_Name field with generic elements such as "Corporation" and "Inc."
removed:
OMIT(Vendor_Name," ,.,Corporation,Corp,Inc,Ltd")
Note
The two preceding examples both return the same results but the syntax for the second
example is more efficient.
Returns all the values in the Vendor_Name field with "Corporation" and "Corp" removed, and all commas
removed:
Remarks
OMIT( ) can remove substrings as units
The OMIT( ) function removes one or more substrings from a string. It differs from functions such as CLEAN
( ), EXCLUDE( ), INCLUDE( ), and REMOVE( ) because it matches and removes characters on a substring
basis rather than on a character-by-character basis. Substring removal allows you to remove specific
words, abbreviations, or repeated sequences of characters from a string without affecting the remainder of
the string.
PACKED( ) function
Returns numeric data converted to the Packed data type.
Syntax
PACKED(number, length_of_result)
Parameters
Name Type Description
Output
Numeric.
Examples
Basic examples
Integer and decimal input
Returns 00075C:
PACKED(75, 3)
PACKED(7.5, 3)
PACKED(-12.456, 6)
Returns 456D:
PACKED(-12.456, 2)
Advanced examples
Creating an 8-byte field to update a mainframe
You need to create an 8-byte field containing each employee's salary as a PACKED number for uploading
to a mainframe:
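A sketch of the extraction, assuming a numeric Salary field; the output field and table names are illustrative:
EXTRACT FIELDS PACKED(Salary, 8) AS "Salary_Packed" TO "Export_Salary"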
Remarks
What is Packed data?
The Packed data type is used by mainframe operating systems to store numeric values in a format that uses
minimal storage space. The Packed data type stores two digits in each byte, and the final half-byte of the last byte indicates
whether the value is positive or negative.
PI( ) function
Returns the value of pi to 15 decimal places.
Syntax
PI( )
Parameters
This function does not have any parameters.
Output
Numeric.
Examples
Basic examples
Returns 3.141592653589793 (the value of pi to 15 decimal places):
PI( )
Returns 1.047197551196598 (60 degrees expressed in radians):
60 * PI( )/180
Advanced examples
Using degrees as input
Returns 0.866025403784439 (the sine of 60 degrees):
SIN(60 * PI( )/180)
Remarks
When to use PI( )
Use PI( ) to convert degrees to radians: (degrees * PI( )/180) = radians. Radians are the required input for
three of Analytics's math functions: SIN( ), COS( ), and TAN( ).
PMT( ) function
Returns the amount of the periodic payment (principal + interest) required to pay off a loan.
Syntax
PMT(rate, periods, amount <,type>)
Parameters
Name Type Description
Note
You must use consistent time periods when specifying rate and periods to ensure that you
are specifying interest rate per period.
For example:
l for monthly payments on a two-year loan or investment with interest of 5% per
annum, specify 0.05/12 for rate and 2 * 12 for periods
l for annual payments on the same loan or investment, specify 0.05 for rate and 2 for
periods
Output
Numeric.
Examples
Basic examples
Returns 1856.82, the monthly payment (principal + interest) required to pay off a twenty-five year, $275,000
loan at 6.5% per annum, with payments due at the end of the month:
Returns 1846.82, the monthly payment (principal + interest) required to pay off the same loan, with pay-
ments due at the beginning of the month:
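Sketches of the two calls described above (0.065/12 as the monthly rate, 300 monthly periods):
PMT(0.065/12, 300, 275000)
PMT(0.065/12, 300, 275000, 1)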
Advanced examples
Annuity calculations
Annuity calculations involve four variables:
l present value, or future value – $21,243.39 and $ 26,973.46 in the examples below
l payment amount per period – $1,000.00 in the examples below
l interest rate per period – 1% per month in the examples below
l number of periods – 24 months in the examples below
If you know the value of three of the variables, you can use an Analytics function to calculate the fourth.
PVANNUITY( )
Returns 21243.39:
FVANNUITY( )
Returns 26973.46:
PMT( )
Returns 1000:
RATE( )
Returns 0.00999999 (1%):
NPER( )
Returns 24.00:
Annuity formulas
The formula for calculating the present value of an ordinary annuity (payment at the end of a period):
The formula for calculating the future value of an ordinary annuity (payment at the end of a period):
PPMT( ) function
Returns the principal paid on a loan for a single period.
Syntax
PPMT(rate, specified_period, periods, amount <,type>)
Parameters
Name Type Description
specified_period numeric The period for which you want to find the principal payment.
Note
You must use consistent time periods when specifying rate and periods to ensure that you
are specifying interest rate per period.
For example:
l for monthly payments on a two-year loan or investment with interest of 5% per
annum, specify 0.05/12 for rate and 2 * 12 for periods
l for annual payments on the same loan or investment, specify 0.05 for rate and 2 for
periods
Output
Numeric.
Examples
Basic examples
Returns 367.24, the principal paid in the first month of a twenty-five year, $275,000 loan at 6.5% per
annum, with payments due at the end of the month:
Returns 1846.82, the principal paid on the same loan in the last month of the loan:
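Sketches of the two calls described above (0.065/12 as the monthly rate, 300 monthly periods):
PPMT(0.065/12, 1, 300, 275000)
PPMT(0.065/12, 300, 300, 275000)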
Remarks
Related functions
The IPMT( ) function is the complement of the PPMT( ) function.
The CUMPRINC( ) function calculates principal paid during a range of periods.
PROPER( ) function
Returns a string with the first character of each word set to uppercase and the remaining characters set to
lowercase.
Syntax
PROPER(string)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "John Doe":
PROPER("JOHN DOE")
PROPER("john doe")
PROPER("BILL O'HARA")
Returns all the values in the Company_Name field converted to proper case:
PROPER(Company_Name)
Remarks
How it works
The PROPER( ) function converts the first character in string, and any subsequent character preceded by
a blank, to uppercase.
Subsequent characters preceded by a hyphen, an apostrophe, an ampersand (&), and several other punc-
tuation marks and special characters are also converted to uppercase. All other alphabetic characters are
converted to lowercase.
PROPERTIES( ) function
Returns properties information for the specified Analytics project item.
Syntax
PROPERTIES(name, obj_type, info_type)
Parameters
Name Type Description
name character The name of the Analytics project item you want information about.
name is not case-sensitive.
If the project item is an Analytics table, specify the table layout name,
not the data file name. For example: "Invoices", not "january_invoices.-
fil"
If you are using the PROPERTIES( ) function to return the name of the
active table, specify the name "activetable"
info_type character The type of information you want about the Analytics project item.
For more information, see "Types of properties information" on
page 687.
Output
Character. The maximum length of the output string is 260 characters. If properties information cannot be
found, an empty string is returned.
Examples
Basic examples
Information about the Analytics data file (.fil)
Returns "Ap_Trans.fil":
Returns "EXCEL":
Remarks
File information
Information types starting with "file" provide information about the Analytics data file (.fil) associated with
an Analytics table.
Source information
Information types starting with "source" provide information about external data sources that can be asso-
ciated with an Analytics table. Only those external data sources that support refreshing an Analytics table
can be reported on using the PROPERTIES( ) function:
l Microsoft Excel
l Microsoft Access
l Delimited text
l Adobe Acrobat (PDF)
l Print Image (Report)
l SAP private file format/DART
l XML
l XBRL
l ODBC data sources
obj_type   info_type   Returns:
"table" "filename" The name of the data file associated with the Analytics table.
"filepath" The path of the data file associated with the Analytics table.
"filesize" The size, in KB, of the data file associated with the Analytics table.
"filemodifiedat" The time and date that the data file associated with the Analytics table was last mod-
ified.
"sourcename" The name of the data source associated with the Analytics table.
Data sources can be external files such as Excel, Access, PDF, XML, or delimited text
files, or ODBC data sources.
"sourcepath" The path of the data source associated with the Analytics table.
Not supported for ODBC data sources.
"sourcetype" The type of the data source associated with the Analytics table.
"sourcesize" The size, in KB, of the data source associated with the Analytics table.
Not supported for ODBC data sources.
"sourcemodifiedat" The time and date that the data source associated with the Analytics table was last
modified.
Not supported for ODBC data sources.
PVANNUITY( ) function
Returns the present value of a series of future payments calculated using a constant interest rate. Present
value is the current, lump-sum value.
Syntax
PVANNUITY(rate, periods, payment <,type>)
Parameters
Name Type Description
Note
You must use consistent time periods when specifying rate, periods, and payment to
ensure that you are specifying interest rate per period.
For example:
l for a monthly payment on a two-year loan or investment with interest of 5% per
annum, specify 0.05/12 for rate and 2 * 12 for periods
l for an annual payment on the same loan or investment, specify 0.05 for rate and 2 for
periods
Output
Numeric. The result is calculated to two decimal places.
Examples
Basic examples
Monthly payments
Returns 21455.82, the present value of $1,000 paid at the beginning of each month for 2 years at 1% per
month, compounded monthly:
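A sketch of the call (1% per month, 24 periods, and type 1 for payments at the beginning of the period):
PVANNUITY(0.01, 24, 1000, 1)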
Annual payments
Returns 20280.61, the present value of $12,000 paid at the end of each year for 2 years at 12% per
annum, compounded annually:
PVANNUITY(0.12, 2, 12000, 0)
Advanced examples
Annuity calculations
Annuity calculations involve four variables:
l present value, or future value – $21,243.39 and $ 26,973.46 in the examples below
l payment amount per period – $1,000.00 in the examples below
l interest rate per period – 1% per month in the examples below
l number of periods – 24 months in the examples below
If you know the value of three of the variables, you can use an Analytics function to calculate the fourth.
PVANNUITY( )
Returns 21243.39:
FVANNUITY( )
Returns 26973.46:
PMT( )
Returns 1000:
RATE( )
Returns 0.00999999 (1%):
NPER( )
Returns 24.00:
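Sketches of the five calls above, using the values listed (1% per month, 24 periods, $1,000 payments, and a present value of $21,243.39):
PVANNUITY(0.01, 24, 1000)
FVANNUITY(0.01, 24, 1000)
PMT(0.01, 24, 21243.39)
RATE(24, 1000, 21243.39)
NPER(0.01, 1000, 21243.39)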
Annuity formulas
The formula for calculating the present value of an ordinary annuity (payment at the end of a period):
The formula for calculating the future value of an ordinary annuity (payment at the end of a period):
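In standard notation, where PMT is the payment per period, i the interest rate per period, and n the number of periods, the usual forms of these formulas are:
Present value = PMT × (1 - (1 + i)^-n) / i
Future value = PMT × ((1 + i)^n - 1) / i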
Remarks
Related functions
The FVANNUITY( ) function is the inverse of the PVANNUITY( ) function.
PVLUMPSUM( ) function
Returns the present value required to generate a specific future lump sum calculated using a constant
interest rate. Present value is the current, lump-sum value.
Syntax
PVLUMPSUM(rate, periods, amount)
Parameters
Name Type Description
amount numeric The value of the future lump sum at the end of the last period.
Note
You must use consistent time periods when specifying rate and periods to ensure that you
are specifying interest rate per period.
For example:
l for monthly payments on a two-year loan or investment with interest of 5% per
annum, specify 0.05/12 for rate and 2 * 12 for periods
l for annual payments on the same loan or investment, specify 0.05 for rate and 2 for
periods
Output
Numeric. The result is calculated to two decimal places.
Examples
Basic examples
Interest compounded monthly
Returns 1000.00, the initial investment principal required to generate a future lump sum of $1,269.73, when
invested for 2 years at 1% per month, compounded monthly:
Returns 787.57, the initial investment principal required to generate a future lump sum of $1,000, when
invested for 2 years at 1% per month, compounded monthly:
Returns 21455.82, the initial investment principal required to generate a future lump sum of $27,243.20,
when invested for 2 years at 1% per month, compounded monthly:
PVLUMPSUM(0.01, 24, 27243.20)
Interest compounded annually
Returns 797.19, the initial investment principal required to generate a future lump sum of $1,000, when invested for 2 years at 12% per annum, compounded annually:
PVLUMPSUM(0.12, 2, 1000)
Remarks
What is present value?
The present value of an invested lump sum is the initial principal required to generate a specific future
lump sum, within a particular time frame. The future value is the principal plus the accumulated compound
interest.
Related functions
The FVLUMPSUM( ) function is the inverse of the PVLUMPSUM( ) function.
PYDATE( ) function
Returns a date value calculated by a function in an external Python script. Data processing in Python is
external to Analytics.
Syntax
PYDATE("PyFile,PyFunction" <, field|value <,...n>>)
Parameters
Name Type Description
PyFile,PyFunction character The name of the Python script to run followed by a comma and then
the name of the function that returns the value:
"myScript,myFunction"
When specifying the Python script, omit the file extension. The function
you call may call other functions within the script or within other scripts,
however all scripts that run must be placed inside a folder in the
PYTHONPATH system environment variable prior to running.
For more information, see "Configuring Python for use with Analytics"
on page 905.
Note
Your PyFunction must return a Python datetime.date
object.
field |value <,...n> character This list of fields, expressions, or literal values to use as arguments for
the Python function. The values are passed into the function you call in
optional numeric
the order you specify them.
datetime
You may include as many arguments as necessary to satisfy the func-
logical tion definition in the Python script.
Note
Use the ALLTRIM() function to remove any leading or
trailing spaces from character input: ALLTRIM(str). For
more information, see "ALLTRIM( ) function" on
page 461.
Output
Datetime.
Examples
Basic examples
Returns `20160630`:
External Python script that accepts a date and a grace period as a number of days and calculates the date
the invoice is due. For an invoice date of 2016-05-31 and a period of 30 days: "2016-06-30":
#! python
from datetime import timedelta
def due_date(invoice_date, pay_period):
    return invoice_date + timedelta(days=pay_period)
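A sketch of the corresponding Analytics call, assuming the script above is completed as shown, saved as hello.py in a folder on the PYTHONPATH, and called with the function name used in the advanced example below:
PYDATE("hello,due_date", `20160531`, 30)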
Advanced examples
Defining a computed field
Defines a computed field in the Ap_Trans table using the Python script that calculates due date:
OPEN Ap_Trans
DEFINE FIELD due_date COMPUTED
WIDTH 27
PYDATE( "hello,due_date" ,Invoice_Date, Pay_Period)
PYDATETIME( ) function
Returns a datetime value calculated by a function in an external Python script. Data processing in Python is
external to Analytics.
Syntax
PYDATETIME("PyFile,PyFunction" <, field|value <,...n>>)
Parameters
Name Type Description
PyFile,PyFunction character The name of the Python script to run followed by a comma and then
the name of the function that returns the value:
"myScript,myFunction"
When specifying the Python script, omit the file extension. The function
you call may call other functions within the script or within other scripts,
however all scripts that run must be placed inside a folder in the
PYTHONPATH system environment variable prior to running.
For more information, see "Configuring Python for use with Analytics"
on page 905.
Note
Your PyFunction must return a Python datetime object.
field |value <,...n> character This list of fields, expressions, or literal values to use as arguments for
the Python function. The values are passed into the function you call in
optional numeric
the order you specify them.
datetime
You may include as many arguments as necessary to satisfy the func-
logical tion definition in the Python script.
Note
Use the ALLTRIM() function to remove any leading or
trailing spaces from character input: ALLTRIM(str). For
more information, see "ALLTRIM( ) function" on
page 461.
Output
Datetime.
Examples
Basic examples
Returns `20170101t0500`:
External Python script that accepts a date argument and a time argument and returns a combined Dat-
etime object:
# hello.py content
from datetime import datetime
def combine_date_time(d,t):
return datetime.combine(d,t)
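A sketch of the corresponding Analytics call, assuming the script is saved as hello.py:
PYDATETIME("hello,combine_date_time", `20170101`, `t0500`)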
Advanced examples
Adding time to a datetime
Returns `20160101t2230`:
External Python script that accepts a datetime and a time and adds the time to the datetime: 2016-01-01
15:00:00 + 7 hours, 30 minutes, 00 seconds = 2016-01-01 22:30:00.
# hello.py content
from datetime import timedelta
from datetime import datetime
from datetime import time
def add_time(start, time_to_add):
return start + timedelta(hours=time_to_add.hour, minutes=time_to_add.minute, seconds=time_to_
add.second)
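A sketch of the corresponding Analytics call, using the datetime and time described above:
PYDATETIME("hello,add_time", `20160101 150000`, `t073000`)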
PYLOGICAL( ) function
Returns a logical value calculated by a function in an external Python script. Data processing in Python is
external to Analytics.
Syntax
PYLOGICAL("PyFile,PyFunction" <, field|value <,...n>>)
Parameters
Name Type Description
PyFile,PyFunction character The name of the Python script to run followed by a comma and then
the name of the function that returns the value:
"myScript,myFunction"
When specifying the Python script, omit the file extension. The function
you call may call other functions within the script or within other scripts,
however all scripts that run must be placed inside a folder in the
PYTHONPATH system environment variable prior to running.
For more information, see "Configuring Python for use with Analytics"
on page 905.
Note
Your PyFunction must return a Python truth value.
field |value <,...n> character This list of fields, expressions, or literal values to use as arguments for
the Python function. The values are passed into the function you call in
optional numeric
the order you specify them.
datetime
You may include as many arguments as necessary to satisfy the func-
logical tion definition in the Python script.
Note
Use the ALLTRIM() function to remove any leading or
trailing spaces from character input: ALLTRIM(str). For
more information, see "ALLTRIM( ) function" on
page 461.
Output
Logical.
Examples
Basic examples
Returns F:
External Python script that compares str1 and str2 using the count of the character that is passed in as
char:
# hello.py content
def str_compare(str1, str2, char):
return str1.count(char) > str2.count(char)
Advanced examples
Using fields
Returns a truth value when comparing Vendor_Name and Vendor_City:
External Python script that compares str1 and str2 using the count of the character that is passed in as
char:
# hello.py content
def str_compare(str1, str2, char):
return str1.count(char) > str2.count(char)
PYNUMERIC( ) function
Returns a numeric value calculated by a function in an external Python script. Data processing in Python is
external to Analytics.
Syntax
PYNUMERIC(PyFile,PyFunction, decimal <, field|value <,...n>>)
Parameters
Name Type Description
PyFile,PyFunction character The name of the Python script to run followed by a comma and then
the name of the function that returns the value:
"myScript,myFunction"
When specifying the Python script, omit the file extension. The function
you call may call other functions within the script or within other scripts,
however all scripts that run must be placed inside a folder in the
PYTHONPATH system environment variable prior to running.
For more information, see "Configuring Python for use with Analytics"
on page 905.
Note
Your PyFunction must return a Python numeric type.
decimal numeric The number of decimal places to include in the return value. Must be a
positive integer.
field |value <,...n> character This list of fields, expressions, or literal values to use as arguments for
the Python function. The values are passed into the function you call in
optional numeric
the order you specify them.
datetime
You may include as many arguments as necessary to satisfy the func-
logical tion definition in the Python script.
Note
Use the ALLTRIM() function to remove any leading or
trailing spaces from character input: ALLTRIM(str). For
more information, see "ALLTRIM( ) function" on
page 461.
Output
Numeric.
Examples
Basic examples
Returns 35.00:
External Python script that returns the value at the requested percentile from a dynamically sized list of val-
ues:
# hello.py content
from math import ceil
def get_nth_percent(percentage, *values):
input_length = len(values)
position = ceil((percentage/100.00) * input_length)
return values[position-1]
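A sketch of a call consistent with this result; the list of values is illustrative (with four values, the 50th-percentile position is 2, and the second value is 35, returned to two decimal places):
PYNUMERIC("hello,get_nth_percent", 2, 50, 10, 35, 60, 90)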
PYSTRING( ) function
Returns a character value calculated by a function in an external Python script. Data processing in Python is
external to Analytics.
Syntax
PYSTRING("PyFile,PyFunction", length <,field|value <,...n>>)
PyFile,PyFunction character The name of the Python script to run followed by a comma and then
the name of the function that returns the value:
"myScript,myFunction"
When specifying the Python script, omit the file extension. The function
you call may call other functions within the script or within other scripts,
however all scripts that run must be placed inside a folder in the
PYTHONPATH system environment variable prior to running.
For more information, see "Configuring Python for use with Analytics"
on page 905.
Note
Your PyFunction must return a Python string object.
field |value <,...n> character This list of fields, expressions, or literal values to use as arguments for
the Python function. The values are passed into the function you call in
optional numeric
the order you specify them.
datetime
You may include as many arguments as necessary to satisfy the func-
logical tion definition in the Python script.
Note
Use the ALLTRIM() function to remove any leading or
trailing spaces from character input: ALLTRIM(str). For
more information, see "ALLTRIM( ) function" on
page 461.
Output
Character.
Examples
Basic examples
Returns "my test":
External Python script that accepts a string and concatenates " test" to the string:
#! python
# hello.py content
def main(str):
str2 = str + ' test'
return(str2)
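A sketch of the corresponding Analytics call ("my test" is seven characters, so 7 is used for the length parameter):
PYSTRING("hello,main", 7, "my")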
Advanced examples
Returning a substring
This example removes the last two characters from the Vendor Name field and returns the substring:
External Python script that accepts a string, a string length, and two character positions. The function
returns a substring between position one and position two:
#hello.py content
def sub_set(str, length, p1, p2):
if p1 >= 0 and p2 < length and p1 < p2:
str2 = str[p1:p2]
else:
str2 = str
return str2
PYTIME( ) function
Returns a time value calculated by a function in an external Python script. Data processing in Python is
external to Analytics.
Syntax
PYTIME("PyFile,PyFunction" <, field|value <,...n>>)
Parameters
Name Type Description
PyFile,PyFunction character The name of the Python script to run followed by a comma and then
the name of the function that returns the value:
"myScript,myFunction"
When specifying the Python script, omit the file extension. The function
you call may call other functions within the script or within other scripts,
however all scripts that run must be placed inside a folder in the
PYTHONPATH system environment variable prior to running.
For more information, see "Configuring Python for use with Analytics"
on page 905.
Note
Your PyFunction must return a Python datetime.time
object.
field | value <,...n>
optional
character | numeric | datetime | logical
The list of fields, expressions, or literal values to use as arguments for the Python function. The values are passed into the function you call in the order you specify them.
You may include as many arguments as necessary to satisfy the function definition in the Python script.
Note
Use the ALLTRIM() function to remove any leading or
trailing spaces from character input: ALLTRIM(str). For
more information, see "ALLTRIM( ) function" on
page 461.
Output
Datetime.
Examples
Basic examples
Returns `t2122`:
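For example, assuming the script below is saved as hello.py, a call of the following form (the datetime literal is illustrative) produces this result:
PYTIME("hello,get_time", `20160531t2122`)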
# hello.py content
from datetime import time
from datetime import date

def get_time(timestamp):
    return timestamp.time()
RAND( ) function
Returns a random number that falls within a specified boundary.
Syntax
RAND(number)
Parameters
Name Type Description
number numeric The boundary within which the random number falls. For example:
RAND(100)
RAND(-100)
Output
Numeric.
Examples
Basic examples
Returns 278.61:
RAND(1000.00)
Returns 3781:
RAND(10000)
Note
The return value will differ with each execution of the function.
Remarks
RAND( ) cannot replicate results
If you use the RAND( ) function consecutively with the same number value, it produces different results.
Unlike the RANDOM command, the RAND( ) function has no seed value.
RATE( ) function
Returns the interest rate per period.
Syntax
RATE(periods, payment, amount)
Parameters
Name Type Description
Note
The RATE( ) function assumes that payments are made at the end of each period.
Output
Numeric. The rate is calculated to eight decimals places.
Examples
Basic examples
Returns 0.00541667 (0.54%), the monthly interest rate implied by a twenty-five year, $275,000 loan with
monthly payments of $1,856.82:
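For example (25 years of monthly payments gives 300 periods):
RATE(25 * 12, 1856.82, 275000)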
Returns 0.06500004 (6.5%), the annual interest rate implied by the same loan:
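Multiplying the monthly rate by 12 gives the annual rate:
RATE(25 * 12, 1856.82, 275000) * 12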
Advanced examples
Converting the nominal rate to the effective rate
The RATE( ) function calculates the nominal interest rate. You can use the EFFECTIVE( ) function to con-
vert the result of RATE( ) to the effective interest rate.
Returns 0.06715155 (6.7%), the effective annual interest rate implied by the loan in the examples above:
Annuity calculations
Annuity calculations involve four variables:
l present value, or future value – $21,243.39 and $ 26,973.46 in the examples below
l payment amount per period – $1,000.00 in the examples below
l interest rate per period – 1% per month in the examples below
l number of periods – 24 months in the examples below
If you know the value of three of the variables, you can use an Analytics function to calculate the fourth.
PVANNUITY( )
Returns 21243.39:
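For example, assuming the signature PVANNUITY(rate, periods, payment):
PVANNUITY(0.01, 24, 1000)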
FVANNUITY( )
Returns 26973.46:
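For example, assuming the signature FVANNUITY(rate, periods, payment):
FVANNUITY(0.01, 24, 1000)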
PMT( )
Returns 1000:
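For example, assuming the signature PMT(rate, periods, amount):
PMT(0.01, 24, 21243.39)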
RATE( )
Returns 0.00999999 (1%):
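For example:
RATE(24, 1000, 21243.39)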
Returns 24.00:
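The number of periods can be calculated with the NPER( ) function, for example, assuming the signature NPER(rate, payment, amount):
NPER(0.01, 1000, 21243.39)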
Annuity formulas
The formula for calculating the present value of an ordinary annuity (payment at the end of a period):
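In standard notation this is PV = PMT * (1 - (1 + r)^-n) / r, where PMT is the payment per period, r is the interest rate per period, and n is the number of periods.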
The formula for calculating the future value of an ordinary annuity (payment at the end of a period):
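In the same notation, FV = PMT * ((1 + r)^n - 1) / r.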
RDATE( ) function
Returns a date value calculated by an R function or script. Data processing in R is external to Analytics.
Syntax
RDATE(rScript|rCode <,field|value <,...n>>)
Parameters
Name Type Description
rScript | rCode character The full or relative path to the R script, or a snippet of R code to run.
If you enter R code directly rather than use an external file, you can-
not use the enclosing quotation character in your code, even if you
escape it:
o valid – 'var <- "\"test\"" '
o invalid – 'var <- "\'test\'" '
field | value <,...n>
optional
character | numeric | datetime | logical
The list of fields, expressions, or literal values to use as arguments for the R script or code snippet.
The values are passed into the function you call in the order you specify them, and you reference them using value1, value2 ... valueN in the R code.
You may include as many arguments as necessary to satisfy the function definition in the R code.
Note
Use the ALLTRIM() function to remove any leading or
trailing spaces from character input: ALLTRIM(str). For
more information, see "ALLTRIM( ) function" on
page 461.
Output
Datetime.
Examples
Basic examples
Returns `20160530`:
RDATE("as.Date(value1,'%m-%d-%Y')", "05-30-16")
Advanced examples
Using an external R script
Converts a string to a date and returns it:
RDATE("a<-source('c:\\scripts\\r_scripts\\sample.r');a[[1]]", dateText)
Remarks
Returning data from R
When calling R scripts, use the source function and assign the return object to a variable. You can then
access the value returned from your R function from the return object:
# 'a' holds the response object and a[[1]] access the data value
"a<-source('c:\\scripts\\r_scripts\\sample.r');a[[1]]"
R log file
Analytics logs R language messages to an aclrlang.log file in the project folder. Use this log file for
debugging R errors.
Tip
The log file is available in the Results folder of Analytics Exchange analytic jobs.
RDATETIME( ) function
Returns a datetime value calculated by an R function or script. Data processing in R is external to Analytics.
Syntax
RDATETIME(rScript|rCode <,field|value <,...n>>)
Parameters
Name Type Description
rScript | rCode character The full or relative path to the R script, or a snippet of R code to run.
If you enter R code directly rather than use an external file, you cannot
use the enclosing quotation character in your code, even if you escape
it:
o valid – 'var <- "\"test\"" '
o invalid – 'var <- "\'test\'" '
field | value <,...n>
optional
character | numeric | datetime | logical
The list of fields, expressions, or literal values to use as arguments for the R script or code snippet.
The values are passed into the function you call in the order you specify them, and you reference them using value1, value2 ... valueN in the R code.
You may include as many arguments as necessary to satisfy the function definition in the R code.
Note
Use the ALLTRIM() function to remove any leading or
trailing spaces from character input: ALLTRIM(str). For
more information, see "ALLTRIM( ) function" on
page 461.
Output
Datetime.
Examples
Basic examples
Adds 45 minutes to the current date and time:
RDATETIME("Sys.time() + value1",2700)
Advanced examples
Using an external R script
Adds 45 minutes to a datetime field by passing a field and a literal value to an external R function:
Remarks
Returning data from R
When calling R scripts, use the source function and assign the return object to a variable. You can then
access the value returned from your R function from the return object:
# 'a' holds the response object and a[[1]] access the data value
"a<-source('c:\\scripts\\r_scripts\\sample.r');a[[1]]"
R log file
Analytics logs R language messages to an aclrlang.log file in the project folder. Use this log file for
debugging R errors.
Tip
The log file is available in the Results folder of Analytics Exchange analytic jobs.
RECLEN( ) function
Returns the length of the current record.
Syntax
RECLEN( )
Parameters
This function does not have any parameters.
Output
Numeric.
Examples
Basic examples
The following example extracts all records where the length is exactly 110:
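A minimal sketch (the output table name is illustrative):
EXTRACT RECORD IF RECLEN( ) = 110 TO "Length_110"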
Remarks
You can use the RECLEN( ) function to identify records of a particular length, or to test for shorter than
expected records. This function is useful if you are working with Print Image (Report) files because it
provides an easy way to examine the record lengths:
l For fixed-length records, the return value is a constant value (the record length).
l For variable-length records, the return value changes for each record.
RECNO( ) function
Returns the current record number.
Syntax
RECNO( )
Parameters
This function does not have any parameters.
Output
Numeric.
Examples
Basic examples
The following example extracts records numbered 10 through 20 to a new Analytics table:
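A minimal sketch (the output table name is illustrative):
EXTRACT RECORD IF BETWEEN(RECNO( ), 10, 20) TO "Records_10_to_20"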
Remarks
You can use the RECNO( ) function to output record numbers to a table, or to determine the relative location
of a particular record within a table.
Reordering records
When you reorder the records in a table, the record numbers generated by RECNO( ) are not reordered.
To keep the record numbers with the records that they were originally associated with, extract the data to a
new table using the Fields option before you reorder the records.
RECOFFSET( ) function
Returns a field value from a record that is a specified number of records from the current record.
Syntax
RECOFFSET(field, number_of_records)
Parameters
Name Type Description
field character | numeric | datetime The name of the field to retrieve the value from.
number_of_records numeric The number of records from the current record. A positive number spe-
cifies a record after the current record, and a negative number spe-
cifies a record before the current record.
Output
Character, Numeric, or Datetime. The return value belongs to the same data category as the input field para-
meter.
Examples
Basic examples
Returns an Amount value from the next record:
RECOFFSET(Amount,1)
Returns an Amount value from the previous record:
RECOFFSET(Amount, -1)
Advanced examples
Using RECOFFSET in a computed field
The computed field Next_Amount shows the value of the Amount field in the next record only if the next
record has the same customer number.
To define this computed field in a script, use the following syntax:
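A sketch of the definition, assuming the customer number field is named Customer_Number:
DEFINE FIELD Next_Amount COMPUTED
RECOFFSET(Amount, 1) IF Customer_Number = RECOFFSET(Customer_Number, 1)
0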
Next_Amount is the value of the next record's Amount field only if the customer number in the next record
is the same as the customer number in the current record. Otherwise, Next_Amount is assigned a value of
zero.
Remarks
The RECOFFSET( ) function returns a field value from a record that is a specified number of records
above or below the current record.
REGEXFIND( ) function
Returns a logical value indicating whether the pattern specified by a regular expression occurs in a string.
Syntax
REGEXFIND(string, pattern)
Parameters
Name Type Description
string character The field, expression, or literal value to test for a matching pattern.
Output
Logical. Returns T (true) if the specified pattern value is found, and F (false) otherwise.
Examples
Basic examples
Alpha character patterns
Returns T for all records that contain "Phoenix", "Austin", or "Los Angeles" in the Vendor_City field. Returns
F otherwise:
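For example, using the alternation metacharacter:
REGEXFIND(Vendor_City, "Phoenix|Austin|Los Angeles")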
Returns T for all last names that start with "John" or "Jon". For example: John, Jon, Johnson, Johnston, Jonson, Jonston, Jones, and so on. Returns F otherwise:
REGEXFIND(Last_Name,"^Joh?n")
Returns T for only those last names that are "John" or "Jon". Returns F otherwise:
REGEXFIND(Last_Name,"^Joh?n\b")
REGEXFIND(Invoice_Number, "98")
Returns T for all records with invoice numbers that begin with "98". Returns F otherwise:
REGEXFIND(Invoice_Number, "\b98")
Returns T for all records with invoice numbers that end with "98". Returns F otherwise:
REGEXFIND(Invoice_Number, "98\b")
Returns T for all records with invoice numbers that contain "98" in the 5th and 6th positions. Returns F oth-
erwise:
REGEXFIND(Invoice_Number, "\b\d\d\d\d98")
REGEXFIND(Invoice_Number, "\b\d{4}98")
Returns T for all records with product codes that contain a 3-digit number, followed by a hyphen and 6 letters, as a standalone unit. Returns F otherwise:
REGEXFIND(Product_Code, "\b\d{3}-[a-zA-Z]{6}\b")
Returns T for all records with product codes that start with 3 or more numbers, followed by a hyphen and 6
or more letters. Returns F otherwise:
REGEXFIND(Product_Code, "\b\d{3,}-[a-zA-Z]{6}")
Returns T for all records with alphanumeric invoice identifiers that contain "98" in the 5th and 6th positions.
Returns F otherwise:
REGEXFIND(Invoice_Number, "\b\w{4}98")
Returns T for all records with invoice identifiers that contain both of the following, otherwise returns F:
l any character in the first four positions
l "98" in the 5th and 6th positions
REGEXFIND(Invoice_Number, "\b.{4}98")
Returns T for all records with invoice identifiers that contain "98" preceded by 1 to 4 initial characters.
Returns F otherwise:
REGEXFIND(Invoice_Number, "\b.{1,4}98")
Returns T for all records with invoice identifiers that contain all of the following, otherwise returns F:
l any character in the first three positions
l "5" or "6" in the 4th position
l "98" in the 5th and 6th positions
REGEXFIND(Invoice_Number, "\b.{3}[56]98")
Returns T for all records with invoice identifiers that contain all of the following, otherwise returns F:
l any character in the first two positions
l "55" or "56" in the 3rd and 4th positions
l "98" in the 5th and 6th positions
REGEXFIND(Invoice_Number, "\b.{2}(55|56)98")
Remarks
How it works
The REGEXFIND( ) function uses a regular expression to search data in Analytics.
Regular expressions are powerful and flexible search strings that combine literal characters and metachar-
acters, which are special characters that perform a wide variety of search operations.
For example:
REGEXFIND(Last_Name,"Sm(i|y)the{0,1}")
uses the group ( ) , alternation | , and quantifier { } metacharacters to create a regular expression that finds
"Smith", "Smyth", "Smithe", or "Smythe" in the Last_Name field.
Concatenating fields
You can concatenate two or more fields in string if you want to search across multiple fields sim-
ultaneously.
For example:
REGEXFIND(Vendor_Name+Vendor_Street,"Hardware.*Main")
searches both the Vendor_Name and the Vendor_Street fields for the words "Hardware" and "Main" sep-
arated by zero or more characters.
A business with the word "Hardware" in its name, located on a street called "Main", matches the regular
expression. So does a business called "Hardware on Main".
The concatenated fields are treated like a single field that includes leading and trailing spaces from the indi-
vidual fields, unless you use the ALLTRIM( ) function to remove spaces.
Metacharacter Description
{} Matches the specified number of occurrences of the immediately preceding literal, metacharacter, or
element. You can specify an exact number, a range, or an open-ended range.
For example:
o a{3} matches "aaa"
o X{0,2}L matches "L", "XL", and "XXL"
o AB-\d{2,}-YZ matches any alphanumeric identifier with the prefix "AB-", the suffix "-YZ", and two or
more numbers in the body of the identifier
() Creates a group that defines a sequence or block of characters, which can then be treated as a single
unit.
For example:
o S(ch)?mid?th? matches "Smith" or "Schmidt"
o (56A.*){2} matches any alphanumeric identifier in which the sequence "56A" occurs at least twice
o (56A).*-.*\1 matches any alphanumeric identifier in which the sequence "56A" occurs at least
twice, with a hyphen located between two of the occurrences
\ An escape character that specifies that the character immediately following is a literal. Use the escape
character if you want to literally match metacharacters. For example, \( finds a left parenthesis, and \\
finds a backslash.
Use the escape character if you want to literally match any of the following characters:
^ $ . * + ? = ! : | \ ( ) [ ] { }
Other punctuation characters such as the ampersand (&) or the 'at sign' (@) do not require the escape
character.
\int Specifies that a group, previously defined with parentheses ( ), recurs. int is an integer that identifies
the sequential position of the previously defined group in relation to any other groups. This metachar-
acter can be used in the pattern parameter in both REGEXFIND( ) and REGEXREPLACE( ).
For example:
o (123).*\1 matches any identifier in which the group of digits "123" occurs at least twice
o ^(\d{3}).*\1 matches any identifier in which the first 3 digits recur
o ^(\d{3}).*\1.*\1 matches any identifier in which the first 3 digits recur at least twice
o ^(\D)(\d)-.*\2\1 matches any identifier in which the alphanumeric prefix recurs with the alpha and
numeric characters reversed
$int Specifies that a group found in a target string is used in a replacement string. int is an integer that iden-
tifies the sequential position of the group in the target string in relation to any other groups. This
metacharacter can only be used in the new_string parameter in REGEXREPLACE( ).
For example:
o If the pattern (\d{3})[ -]?(\d{3})[ -]?(\d{4}) is used to match a variety of different telephone number formats, the new_string value ($1)-$2-$3 can be used to replace the numbers with themselves and standardize the formatting: 999 123-4567 and 9991234567 both become (999)-123-4567.
| Matches the character, block of characters, or expression before or after the pipe (|)
For example:
o a|b matches a or b
o abc|def matches "abc" or "def"
o Sm(i|y)th matches Smith or Smyth
o [a-c]|[Q-S]|[x-z] matches any of the following letters: a, b, c, Q, R, S, x, y, z
o \s|- matches a space or a hyphen
\b Matches a word boundary, such as the start or end of a string, a space, or a punctuation character. For example, the following returns T because "example" occurs as a whole word:
REGEXFIND("jsmith@example.net", "\bexample\b")
Related functions
If you want to find and replace matching patterns, use the "REGEXREPLACE( ) function" on the next page.
REGEXREPLACE( ) function
Replaces all instances of strings matching a regular expression with a new string.
Syntax
REGEXREPLACE(string, pattern, new_string)
Parameters
Name Type Description
string character The field, expression, or literal value to test for a matching pattern.
new_string character The string to use to replace all values matching pattern.
The replacement string can contain literal characters, groups of char-
acters from the original string (using the $int element), or a com-
bination of the two.
Output
Character.
Examples
Basic examples
Working with spaces
Returns "AB CD EF", by replacing multiple spaces between text characters with a single space:
Returns the character field data with the spacing between words standardized on a single space:
Returns the character field data with the spacing between words standardized on a single space. Using the
BLANKS( ) function in new_string, rather than a literal space, makes spaces easier to read and less likely to
be overlooked:
Returns the numbers in the Telephone_Number field and standardizes their formatting:
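For example, using the grouping and $int metacharacters described below:
REGEXREPLACE(Telephone_Number, "(\d{3})[ -]?(\d{3})[ -]?(\d{4})", "($1)-$2-$3")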
Extracts telephone numbers from surrounding text in the Comment field and standardizes their formatting:
REGEXREPLACE(REGEXREPLACE(REGEXREPLACE("1ABC-123aa","\d","9"),"[a-z]","x"),"[A-Z]",
"X")
Returns the values in the Invoice_Number field with each number replaced by "9", each lowercase letter by "x", and each uppercase letter by "X":
REGEXREPLACE(REGEXREPLACE(REGEXREPLACE(Invoice_Number,"\d","9"),"[a-z]","x"),"[A-Z]", "X")
Returns the names in the Full_Name field in their regular order: First (Middle) (Middle) Last:
Note
Name data can present various complications, such as apostrophes in names. Account-
ing for variations in name data typically requires more complex regular expressions than
the one provided in the example above.
Remarks
How it works
The REGEXREPLACE( ) function uses a regular expression to find matching patterns in data, and
replaces any matching values with a new string.
For example:
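assuming a character field named Character_Field, a call such as
REGEXREPLACE(Character_Field, " +", " ")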
standardizes spacing in character data by replacing one or more spaces between text characters with a
single space.
The search portion of REGEXREPLACE( ) is identical to the REGEXFIND( ) function. For detailed inform-
ation about the search capability common to both functions, see "REGEXFIND( ) function" on page 723.
REGEXREPLACE(REGEXREPLACE("123ABC","\d","9"),"[A-Z]","X")
Returns "9X9X9X":
REGEXREPLACE(REGEXREPLACE("1A2B3C","\d","9"),"[A-Z]","X")
REGEXREPLACE("x123x","123","ABCDE")
Returns "xABCDEx", which includes all replacement characters and unreplaced existing characters:
REGEXREPLACE(SUBSTR("x123x",1,10),"123","ABCDE")
Metacharacter Description
{} Matches the specified number of occurrences of the immediately preceding literal, metacharacter, or
element. You can specify an exact number, a range, or an open-ended range.
For example:
o a{3} matches "aaa"
o X{0,2}L matches "L", "XL", and "XXL"
o AB-\d{2,}-YZ matches any alphanumeric identifier with the prefix "AB-", the suffix "-YZ", and two
or more numbers in the body of the identifier
() Creates a group that defines a sequence or block of characters, which can then be treated as a
single unit.
For example:
o S(ch)?mid?th? matches "Smith" or "Schmidt"
o (56A.*){2} matches any alphanumeric identifier in which the sequence "56A" occurs at least
twice
o (56A).*-.*\1 matches any alphanumeric identifier in which the sequence "56A" occurs at least
twice, with a hyphen located between two of the occurrences
\ An escape character that specifies that the character immediately following is a literal. Use the
escape character if you want to literally match metacharacters. For example, \( finds a left par-
enthesis, and \\ finds a backslash.
Use the escape character if you want to literally match any of the following characters:
^ $ . * + ? = ! : | \ ( ) [ ] { }
Other punctuation characters such as the ampersand (&) or the 'at sign' (@) do not require the
escape character.
\int Specifies that a group, previously defined with parentheses ( ), recurs. int is an integer that identifies
the sequential position of the previously defined group in relation to any other groups. This
metacharacter can be used in the pattern parameter in both REGEXFIND( ) and REGEXREPLACE
( ).
For example:
o (123).*\1 matches any identifier in which the group of digits "123" occurs at least twice
o ^(\d{3}).*\1 matches any identifier in which the first 3 digits recur
o ^(\d{3}).*\1.*\1 matches any identifier in which the first 3 digits recur at least twice
o ^(\D)(\d)-.*\2\1 matches any identifier in which the alphanumeric prefix recurs with the alpha and
numeric characters reversed
$int Specifies that a group found in a target string is used in a replacement string. int is an integer that
identifies the sequential position of the group in the target string in relation to any other groups. This
metacharacter can only be used in the new_string parameter in REGEXREPLACE( ).
For example:
o If the pattern (\d{3})[ -]?(\d{3})[ -]?(\d{4}) is used to match a variety of different telephone number formats, the new_string value ($1)-$2-$3 can be used to replace the numbers with themselves and standardize the formatting: 999 123-4567 and 9991234567 both become (999)-123-4567.
| Matches the character, block of characters, or expression before or after the pipe (|)
For example:
o a|b matches a or b
o abc|def matches "abc" or "def"
o Sm(i|y)th matches Smith or Smyth
o [a-c]|[Q-S]|[x-z] matches any of the following letters: a, b, c, Q, R, S, x, y, z
o \s|- matches a space or a hyphen
REGEXFIND("jsmith@example.net", "\bexample\b")
Metacharacter Description
Related functions
If you want to find matching patterns without replacing them, use the "REGEXFIND( ) function" on
page 723.
REMOVE( ) function
Returns a string that includes only the specified characters.
Syntax
REMOVE(string, valid_characters)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "ABC123 ":
Returns "ABC123XX ":
Returns "1234 ":
Returns all the values in the Product_Number field with all non-numeric characters removed:
REMOVE(Product_Number,"0123456789")
Remarks
Note
The REMOVE( ) function has been superseded by the INCLUDE( ) and EXCLUDE( ) func-
tions.
REMOVE( ) is still available in the current version of Analytics for backwards compatibility
with earlier versions.
How it works
The REMOVE( ) function removes unwanted characters from character data and returns a fixed length
string.
Case sensitivity
The REMOVE( ) function is case-sensitive. If you specify "ID" in valid_characters, the lowercase "id" in a value such as "id#94022" is not included in the output. If there is a chance the case may be mixed, use the UPPER( ) function to convert string to uppercase.
For example:
REMOVE(UPPER("id#94022"), "ID0123456789")
Related functions
REMOVE( ) is similar to the INCLUDE( ) function, with the following difference:
l REMOVE( ) adds blanks to the end of the output to replace the characters that have been removed.
The original length of string is retained.
l INCLUDE( ) does not add any blanks.
REPEAT( ) function
Returns a string that repeats a substring a specified number of times.
Syntax
REPEAT(string, count)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "ABCABCABC":
REPEAT("ABC",3)
Returns "000000000":
REPEAT("0",9)
Remarks
When to use REPEAT( )
Use the REPEAT( ) function to initialize a variable with constant values or blanks, or to set a default value
for a computed field.
REPLACE( ) function
Replaces all instances of a specified character string with a new character string.
Syntax
REPLACE(string, old_text, new_text)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "a12345efg":
REPLACE("abcdefg","bcd","12345")
Returns "Rd.":
REPLACE("Road","Road","Rd.")
Returns "ac":
REPLACE("abc","b","")
Advanced examples
Removing specified characters
Use REPLACE( ) to remove a specified character string from a source string, by replacing it with an empty
character string ( "" ).
Returns "1234 Scott":
The field length is not automatically increased for subsequent replacements, and truncation can result if
the field is not long enough to accommodate all the new characters.
Returns "9ABC9A":
To avoid truncation, you can increase the length of string using the BLANKS( ) function, or literal blank
spaces.
Returns "9ABC9ABC":
If the resulting string is shorter than string, the resulting string is padded with blanks to maintain the same
field length.
Returns "9X9 ":
Remarks
How it works
The REPLACE( ) function replaces every instance of an existing string with a new string.
Returns "1234 Scott Road":
Case sensitivity
The REPLACE( ) function is case-sensitive. If you specify "RD." in old_text and the values in string are lower-
case, the new_text value will not be substituted because no matches will be found.
If there is a chance the case in string may be mixed, first use the UPPER( ) function to convert all characters
to uppercase.
Returns "1234 SCOTT ROAD":
By adding both a leading space and a trailing space to the value in old_text ( " rd " ), you prevent the function
from replacing instances where "rd" occurs in a name, because in these instances there are no leading
spaces.
Returns "645 Richard Road":
REVERSE( ) function
Returns a string with the characters in reverse order.
Syntax
REVERSE(string)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "E DCBA":
REVERSE("ABCD E")
RJUSTIFY( ) function
Returns a right-justified string the same length as a specified string, with any trailing spaces moved to the left
of the string.
Syntax
RJUSTIFY(string)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns " ABC":
RJUSTIFY("ABC ")
Remarks
When to use RJUSTIFY( )
Use the RJUSTIFY( ) function to right-justify a character field.
RLOGICAL( ) function
Returns a logical value calculated by an R function or script. Data processing in R is external to Analytics.
Syntax
RLOGICAL(rScript|rCode <,field|value <,...n>>)
Parameters
Name Type Description
rScript | rCode character The full or relative path to the R script, or a snippet of R code to run.
If you enter R code directly rather than use an external file, you can-
not use the enclosing quotation character in your code, even if you
escape it:
o valid – 'var <- "\"test\"" '
o invalid – 'var <- "\'test\'" '
field | value <,...n>
optional
character | numeric | datetime | logical
The list of fields, expressions, or literal values to use as arguments for the R script or code snippet.
The values are passed into the function you call in the order you specify them, and you reference them using value1, value2 ... valueN in the R code.
You may include as many arguments as necessary to satisfy the function definition in the R code.
Note
Use the ALLTRIM() function to remove any leading or
trailing spaces from character input: ALLTRIM(str). For
more information, see "ALLTRIM( ) function" on
page 461.
Output
Logical.
Examples
Basic examples
Returns T:
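For example, an R snippet that compares two input values (the values are illustrative):
RLOGICAL("value1 > value2", 100, 42)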
Advanced examples
Using an external R script
Accepts an amount, and an upper and lower threshold value. The function returns a truth value based on a
series of logical comparisons:
Remarks
Returning data from R
When calling R scripts, use the source function and assign the return object to a variable. You can then
access the value returned from your R function from the return object:
# 'a' holds the response object and a[[1]] access the data value
"a<-source('c:\\scripts\\r_scripts\\sample.r');a[[1]]"
R log file
Analytics logs R language messages to an aclrlang.log file in the project folder. Use this log file for
debugging R errors.
Tip
The log file is available in the Results folder of Analytics Exchange analytic jobs.
RNUMERIC( ) function
Returns a numeric value calculated by an R function or script. Data processing in R is external to Analytics.
Syntax
RNUMERIC(rScript|rCode, decimals <,field|value <,...n>>)
Parameters
Name Type Description
rScript | rCode character The full or relative path to the R script, or a snippet of R code to run.
If you enter R code directly rather than use an external file, you cannot
use the enclosing quotation character in your code, even if you escape
it:
o valid – 'var <- "\"test\"" '
o invalid – 'var <- "\'test\'" '
decimals numeric The number of decimal places to include in the return value. Must be a
positive integer.
field | value <,...n>
optional
character | numeric | datetime | logical
The list of fields, expressions, or literal values to use as arguments for the R script or code snippet.
The values are passed into the function you call in the order you specify them, and you reference them using value1, value2 ... valueN in the R code.
You may include as many arguments as necessary to satisfy the function definition in the R code.
Note
Use the ALLTRIM() function to remove any leading or
trailing spaces from character input: ALLTRIM(str). For
more information, see "ALLTRIM( ) function" on
page 461.
Output
Numeric.
Examples
Basic examples
Returns 100 with 10 decimals (100.0000000000):
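For example:
RNUMERIC("print(value1)", 10, 100)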
Advanced examples
Storing R code as a variable
Returns 100 with 10 decimals (100.0000000000):
ASSIGN v_rcode = "print(value1)"
RNUMERIC(v_rcode, 10, 100)
Remarks
Returning data from R
When calling R scripts, use the source function and assign the return object to a variable. You can then
access the value returned from your R function from the return object:
# 'a' holds the response object and a[[1]] access the data value
"a<-source('c:\\scripts\\r_scripts\\sample.r');a[[1]]"
R log file
Analytics logs R language messages to an aclrlang.log file in the project folder. Use this log file for
debugging R errors.
Tip
The log file is available in the Results folder of Analytics Exchange analytic jobs.
ROOT( ) function
Returns the square root of a numeric expression.
Syntax
ROOT(number, decimals)
Parameters
Name Type Description
number numeric The numeric expression to find the square root of.
This function returns zero if number is a negative number.
Output
Numeric.
Examples
Basic examples
Returns 10.00:
ROOT(100, 2)
Returns 31.6228:
ROOT(1000, 4)
Remarks
How it works
The ROOT( ) function returns the square root of the numeric expression or field value with the specified num-
ber of decimal places. The result is rounded appropriately.
ROUND( ) function
Returns a rounded whole number for a numeric value.
Syntax
ROUND(number)
Parameters
Name Type Description
Output
Numeric.
Examples
Basic examples
Returns 7:
ROUND(7.2)
Returns 8:
ROUND(7.5)
Returns -8:
ROUND(-7.5)
Advanced examples
Rounding monetary values
Creates a field that is equal to the balance rounded to the nearest dollar value:
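A minimal sketch, assuming the balance field is named Balance:
DEFINE FIELD Balance_rounded COMPUTED
ROUND(Balance)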
Remarks
How it works
ROUND( ) returns a number equal to the number value rounded to the nearest integer:
ROUND(number)
is equivalent to:
DEC(number, 0)
RSTRING( ) function
Returns a string value calculated by an R function or script. Data processing in R is external to Analytics.
Syntax
RSTRING(rScript|rCode, length <,field|value <,...n>>)
Parameters
Name Type Description
rScript | rCode character The full or relative path to the R script, or a snippet of R code to run.
If you enter R code directly rather than use an external file, you can-
not use the enclosing quotation character in your code, even if you
escape it:
o valid – 'var <- "\"test\"" '
o invalid – 'var <- "\'test\'" '
field | value <,...n>
optional
character | numeric | datetime | logical
The list of fields, expressions, or literal values to use as arguments for the R script or code snippet.
The values are passed into the function you call in the order you specify them, and you reference them using value1, value2 ... valueN in the R code.
You may include as many arguments as necessary to satisfy the function definition in the R code.
Note
Use the ALLTRIM() function to remove any leading or
trailing spaces from character input: ALLTRIM(str). For
more information, see "ALLTRIM( ) function" on
page 461.
Output
Character.
Examples
Basic examples
Returns "abc123":
RSTRING("print(paste(value1,value2,sep=""))",6,"abc","123")
Advanced examples
Using an external R script
Concatenates x and y into a single string delimited by a space character:
Tip
To install the uuid package, open R.exe and execute the following command:
install.packages("uuid")
Remarks
Returning data from R
When calling R scripts, use the source function and assign the return object to a variable. You can then
access the value returned from your R function from the return object:
# 'a' holds the response object and a[[1]] access the data value
"a<-source('c:\\scripts\\r_scripts\\sample.r');a[[1]]"
R log file
Analytics logs R language messages to an aclrlang.log file in the project folder. Use this log file for
debugging R errors.
Tip
The log file is available in the Results folder of Analytics Exchange analytic jobs.
RTIME( ) function
Returns a time value calculated by an R function or script. Data processing in R is external to Analytics.
Syntax
RTIME(rScript|rCode <,field|value <,...n>>)
Parameters
Name Type Description
rScript | rCode character The full or relative path to the R script, or a snippet of R code to run.
If you enter R code directly rather than use an external file, you cannot
use the enclosing quotation character in your code, even if you escape
it:
o valid – 'var <- "\"test\"" '
o invalid – 'var <- "\'test\'" '
field | value <,...n>
optional
character | numeric | datetime | logical
The list of fields, expressions, or literal values to use as arguments for the R script or code snippet.
The values are passed into the function you call in the order you specify them, and you reference them using value1, value2 ... valueN in the R code.
You may include as many arguments as necessary to satisfy the function definition in the R code.
Note
Use the ALLTRIM() function to remove any leading or
trailing spaces from character input: ALLTRIM(str). For
more information, see "ALLTRIM( ) function" on
page 461.
Output
Datetime.
Examples
Basic examples
Returns `t0545`:
RTIME("value1+2700",`t0500`)
Advanced examples
Using an external R script
Adds 45 minutes to a time field by passing a field and a literal value to an external R function:
Remarks
Returning data from R
When calling R scripts, use the source function and assign the return object to a variable. You can then
access the value returned from your R function from the return object:
# 'a' holds the response object and a[[1]] access the data value
"a<-source('c:\\scripts\\r_scripts\\sample.r');a[[1]]"
R log file
Analytics logs R language messages to an aclrlang.log file in the project folder. Use this log file for
debugging R errors.
Tip
The log file is available in the Results folder of Analytics Exchange analytic jobs.
SECOND( ) function
Extracts the seconds from a specified time or datetime and returns it as a numeric value.
Syntax
SECOND(time/datetime)
Parameters
Name Type Description
time/datetime datetime The field, expression, or literal value to extract the seconds from.
Output
Numeric.
Examples
Basic examples
Returns 30:
SECOND(`t235930`)
SECOND(`20141231 235930`)
Returns the seconds for each value in the Call_start_time field:
SECOND(Call_start_time)
Remarks
Parameter details
A field specified for time/datetime can use any time or datetime format, as long as the field definition cor-
rectly defines the format.
Specifying a literal time or datetime value
When specifying a literal time or datetime value for time/datetime, you are restricted to the formats in the
table below, and you must enclose the value in backquotes – for example, `20141231 235959`.
Do not use any separators such as slashes (/) or colons (:) between the individual components of dates or
times.
l Time values – you can use any of the time formats listed in the table below. You must use a separator
before a standalone time value for the function to operate correctly. Valid separators are the letter 't',
or the letter 'T'. You must specify times using the 24-hour clock. Offsets from Coordinated Universal
Time (UTC) must be prefaced by a plus sign (+) or a minus sign (-).
l Datetime values – you can use any combination of the date, separator, and time formats listed in the
table below. The date must precede the time, and you must use a separator between the two. Valid
separators are a single blank space, the letter 't', or the letter 'T'.
thhmmss `t235959`
Thhmm `T2359`
YYYYMMDD hhmmss `20141231 235959`
YYMMDDthhmm `141231t2359`
YYYYMMDDThh `20141231T23`
YYYYMMDD hhmmss+/-hhmm `20141231 235959-0500`
(UTC offset)
YYMMDD hhmm+/-hh `141231 2359+01`
(UTC offset)
Note
Do not use hh alone in the main time
format with data that has a UTC offset. For
example, avoid: hh+hhmm. Results can be
unreliable.
SHIFT( ) function
Returns a single character string with the bits in the first character of the input value shifted to the left or
right.
Syntax
SHIFT(character, number_of_bits_to_left)
Parameters
Name Type Description
number_of_bits_to_left numeric Specifies the number of bits to shift the character value.
o If the value is positive – character is shifted to the left
o If the value is negative – character is shifted to the right
If the specified value is greater than 15 or less than -15, the result is binary zero, CHR(0).
Output
Character.
Examples
Basic examples
Returns the letter "X", or CHR(88) (00010110 becomes 01011000):
SHIFT(CHR(22), 2)
Returns CHR(8), a non-printing character (00010000 becomes 00001000):
SHIFT(CHR(16), -1)
SHIFT(CHR(155), 5)
Remarks
When to use SHIFT( )
Use the SHIFT( ) function in conjunction with the BYTE( ), CHR( ) and MASK( ) functions to isolate and
move individual bits in a record.
SIN( ) function
Returns the sine of an angle expressed in radians, with a precision of 15 decimal places.
Syntax
SIN(radians)
Parameters
Name Type Description
Output
Numeric.
Examples
Basic examples
Returns 0.500000000000000 (the sine of the specified number of radians, equivalent to 30 degrees):
SIN(0.523598775598299)
SIN(30 * PI( )/180)
Advanced examples
Using degrees as input
Returns 0.500 (the sine of 30 degrees, rounded to 3 decimal places):
DEC(SIN(30 * PI( )/180),3)
Remarks
Performing the Mantissa Arc Test
The three trigonometric functions in Analytics – SIN( ), COS( ), and TAN( ) – support performing the Mantissa
Arc Test associated with Benford's Law.
SOUNDEX( ) function
Returns the soundex code for the specified string, which can be used for phonetic comparisons with other
strings.
Syntax
SOUNDEX(name)
Parameters
Name Type Description
Output
Character. Returns a four-character soundex code.
Examples
Basic examples
Words that sound the same but are spelled differently
The two examples below return the same soundex code because they sound the same even though they
are spelled differently.
Returns F634:
SOUNDEX("Fairdale")
Returns F634:
SOUNDEX("Faredale")
SOUNDEX("Jonson")
Returns J523:
SOUNDEX("Jonston")
SOUNDEX("Smith")
Returns M235:
SOUNDEX("MacDonald")
Field input
Returns the soundex code for each value in the Last_Name field:
SOUNDEX(Last_Name)
Advanced examples
Identifying matching soundex codes
Create the computed field Soundex_Code to display the soundex code for each value in the Last_Name
field:
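A minimal sketch of the definition:
DEFINE FIELD Soundex_Code COMPUTED
SOUNDEX(Last_Name)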
Add the computed field Soundex_Code to the view, and then perform a duplicates test on the computed
field to identify any matching soundex codes:
Matching soundex codes indicate that the associated character values in the Last_Name field are pos-
sible duplicates.
Remarks
When to use SOUNDEX( )
Use the SOUNDEX( ) function to find values that sound similar. Phonetic similarity is one way of locating
possible duplicate values, or inconsistent spelling in manually entered data.
How it works
SOUNDEX( ) returns the American Soundex code for the evaluated string. All codes are one letter followed by three numbers. For example: "F634".
The first letter of the code is taken directly from the start of the evaluated string and is not encoded phonetically. As a result, a word that begins with "F" and a word that begins with "Ph" could sound the same, but they will never be matched.
Related functions
l SOUNDSLIKE( ) – an alternate method for phonetically comparing strings.
l ISFUZZYDUP( ) and LEVDIST( ) – compare strings based on an orthographic comparison (spelling) rather than on a phonetic comparison (sound).
l DICECOEFFICIENT( ) – de-emphasizes or completely ignores the relative position of characters or
character blocks when comparing strings.
SOUNDSLIKE( ) function
Returns a logical value indicating whether a string phonetically matches a comparison string.
Syntax
SOUNDSLIKE(name, sounds_like_name)
Parameters
Name Type Description
Output
Logical. Returns T (true) if the values being compared phonetically match, and F (false) otherwise.
Examples
Basic examples
Returns T, because "Fairdale" and "Faredale" both have a soundex code of F634:
SOUNDSLIKE("Fairdale","Faredale")
Returns F, because "Jonson" has a soundex code of J525, and "Jonston" has a soundex code of J523:
SOUNDSLIKE("Jonson","Jonston")
Returns a logical value (T or F) indicating whether the soundex code for each value in the Last_Name field
matches the soundex code for the string "Smith":
SOUNDSLIKE(Last_Name,"Smith")
Advanced examples
Isolating values that sound like "Smith"
Create a filter that isolates all values in the Last_Name field that sound like "Smith":
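A minimal sketch:
SET FILTER TO SOUNDSLIKE(Last_Name, "Smith")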
Remarks
When to use SOUNDSLIKE( )
Use the SOUNDSLIKE( ) function to find values that sound similar. Phonetic similarity is one way of locating
possible duplicate values, or inconsistent spelling in manually entered data.
How it works
SOUNDSLIKE( ) converts the comparison strings to four-character American Soundex codes, which are
based on the first letter, and the first three consonants after the first letter, in each string.
The function then compares each string's code and returns a logical value indicating whether they match.
For more information about soundex codes, see "SOUNDEX( ) function" on page 770.
Case sensitivity
The function is not case-sensitive, so "SMITH" is equivalent to "smith."
Related functions
l SOUNDEX( ) – an alternate method for phonetically comparing strings.
l ISFUZZYDUP( ) and LEVDIST( ) – compare strings based on an orthographic comparison (spelling) rather than on a phonetic comparison (sound).
SPLIT( ) function
Returns a specified segment from a string.
Syntax
SPLIT(string, separator, segment <,text_qualifier>)
Parameters
Name Type Description
text_qualifier
optional
character
The character or characters that indicate the start and end of segments of text.
If the separator character occurs inside a paired set of text qualifiers, it is read as text and not as a separator.
You must enclose the text qualifier in quotation marks. A single quotation mark text qualifier must be enclosed in double quotation marks, and a double quotation mark text qualifier must be enclosed in single quotation marks.
Tip
This optional parameter can be useful when working with imported source data that retains separators and text qualifiers.
Output
Character.
Examples
Basic examples
Comma-delimited segments
Returns "seg1":
SPLIT("seg1,seg2,seg3", ",", 1)
Returns "seg3":
SPLIT("seg1,seg2,seg3", ",", 3)
SPLIT("seg1,seg2,,seg4", ",", 3)
SPLIT("seg1/*seg2/*seg3", "/*", 3)
Returns "Doe":
Advanced examples
Extracting digits from a credit card number
Use the SPLIT( ) function to remove dashes from a credit card number.
Variables are used to capture each segment of the credit card number, and then the segments are concatenated together in an additional variable, as shown in the sketch below.
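A minimal sketch, assuming a card number field named Card_Number stored in the form "9999-9999-9999-9999":
ASSIGN v_seg1 = SPLIT(Card_Number, "-", 1)
ASSIGN v_seg2 = SPLIT(Card_Number, "-", 2)
ASSIGN v_seg3 = SPLIT(Card_Number, "-", 3)
ASSIGN v_seg4 = SPLIT(Card_Number, "-", 4)
ASSIGN v_card_number = v_seg1 + v_seg2 + v_seg3 + v_seg4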
Remarks
How it works
The SPLIT( ) function breaks character data into segments based on separators such as spaces or com-
mas and returns a specified segment.
SPLIT("seg1,seg2,seg3", ",", 1)
If the source string begins with a separator, the segment that follows the separator is treated as segment 2.
Returns "seg1":
SPLIT(",seg1,seg2,seg3", ",", 2)
Case sensitivity
If separator or text_qualifier specify characters that have both an uppercase and a lowercase version, the
case used must match the case of the separator or text qualifier in the data.
Related functions
SPLIT( ) and SUBSTR( ) both return a segment of data from a longer source string.
l SPLIT( ) identifies the segment based on a separator character.
l SUBSTR( ) identifies the segment based on a numeric character position.
STOD( ) function
Converts a serial date – that is, a date expressed as an integer – to a date value. Abbreviation for "Serial to
Date".
Syntax
STOD(serial_date <,start_date>)
Parameters
Name Type Description
start_date
optional
datetime
The start date from which serial dates are calculated. If omitted, the default start date of 01 January 1900 is used.
Output
Datetime. The date value is output using the current Analytics date display format.
Examples
Basic examples
Returns `20141231` displayed as 31 Dec 2014 assuming a current Analytics date display format of DD
MMM YYYY:
STOD(42003)
Returns `20181231` displayed as 31 Dec 2018 assuming a current Analytics date display format of DD
MMM YYYY:
STOD(42003, `19040101`)
Returns the equivalent date for each serial date value in the Invoice_Date field:
STOD(Invoice_Date)
Advanced examples
Adjusting for a start date before 1900-01-01
Use date arithmetic to adjust the start date to a value that is earlier than the Analytics minimum date of
January 1, 1900:
1. Convert the serial date using the default start date.
2. Subtract the number of days before 1900-01-01 that the actual start date falls.
To use 1899-01-01 as the start date (evaluates to `20131231`):
STOD(42003) - 365
Remarks
How it works
The STOD( ) function allows you to convert serial dates to regular dates. Analytics serial dates represent
the number of days that have elapsed since 01 January 1900.
Serial date Equivalent date
1 02 January 1900
0 not valid
Point of similarity
Both Analytics and Excel treat the year 1900 as a leap year, with 366 days. Although 1900 was not in fact a
leap year, Excel treated it as one in order to maintain compatibility with Lotus 1-2-3.
Point of difference
Analytics serial dates are offset from Excel serial dates by one day. In Excel, 01 January 1900 has a serial
date of '1'. In Analytics, 01 January 1900 is not counted, and 02 January 1900 has a serial date of '1'.
The start_date
Some source data files may use a start date other than 01 January 1900. The start_date allows you to
match the start date in a source data file. The start date is the date from which serial dates are calculated.
Start date in source data file: 01 January 1901
Specify: STOD(date_field, `19010101`)
Details: You specify a start_date of `19010101` to match the start date of 01 January 1901 used in the source data file.
Start date in source data file: 01 January 1899
Specify: STOD(date_field) - 365
Details: You cannot specify a start_date earlier than 01 January 1900. If a source data file uses a start date earlier than 01 January 1900, you can create a datetime expression that subtracts an appropriate number of days from the output results of the STOD( ) function.
Related functions
STODT( ) Converts a serial datetime – that is, a datetime expressed as an integer, and a fractional portion of 24 hours – to a datetime value. Abbreviation for "Serial to Datetime".
STOT( ) Converts a serial time – that is, a time expressed as a fractional portion of 24 hours, with 24
hours equaling 1 – to a time value. Abbreviation for "Serial to Time".
CTOD( ) Converts a character or numeric date value to a date. Can also extract the date from a char-
acter or numeric datetime value and return it as a date. Abbreviation for "Character to Date".
CTODT( ) Converts a character or numeric datetime value to a datetime. Abbreviation for "Character to
Datetime".
CTOT( ) Converts a character or numeric time value to a time. Can also extract the time from a char-
acter or numeric datetime value and return it as a time. Abbreviation for "Character to Time".
DATE( ) Extracts the date from a specified date or datetime and returns it as a character string. Can
also return the current operating system date.
DATETIME( ) Converts a datetime to a character string. Can also return the current operating system dat-
etime.
TIME( ) Extracts the time from a specified time or datetime and returns it as a character string. Can
also return the current operating system time.
STODT( ) function
Converts a serial datetime – that is, a datetime expressed as an integer, and a fractional portion of 24 hours –
to a datetime value. Abbreviation for "Serial to Datetime".
Syntax
STODT(serial_datetime <,start_date>)
Parameters
Name Type Description
start_date
optional
datetime
The start date from which serial dates are calculated. If omitted, the default start date of 01 January 1900 is used.
Output
Datetime. The datetime value is output using the current Analytics date and time display formats.
Examples
Basic examples
Unadjusted start dates
Returns `20141231t060000` displayed as 31 Dec 2014 06:00:00 AM assuming current Analytics date and
time display formats of DD MMM YYYY and hh:mm:ss PM:
STODT(42003.25000)
Returns `20141231t191530` displayed as 31 Dec 2014 07:15:30 PM assuming current Analytics date and
time display formats of DD MMM YYYY and hh:mm:ss PM:
STODT(42003.802431)
Adjusted start dates
Returns `20181231t120000` displayed as 31 Dec 2018 12:00:00 PM assuming current Analytics date and time display formats of DD MMM YYYY and hh:mm:ss PM:
STODT(42003.50000, `19040101`)
Fields as input
Returns the equivalent datetime for each serial datetime value in the Receipt_datetime field:
STODT(Receipt_datetime)
Advanced examples
Adjusting for a start date before 1900-01-01
Use date arithmetic to adjust the start date to a value that is earlier than the Analytics minimum date of
January 1, 1900:
1. Convert the serial datetime using the default start date.
2. Subtract the number of days before 1900-01-01 that the actual start date falls.
To use 1899-01-01 as the start date (evaluates to `20131231t180000`):
STODT(42003.75000) - 365
Remarks
How it works
The STODT( ) function allows you to convert serial datetimes to regular datetimes. Analytics serial dat-
etimes represent the number of days that have elapsed since 01 January 1900, and following the decimal
point, represent a fractional portion of 24 hours, with 24 hours equaling 1.
Point of similarity
Both Analytics and Excel treat the year 1900 as a leap year, with 366 days. Although 1900 was not in fact a
leap year, Excel treated it as one in order to maintain compatibility with Lotus 1-2-3.
Point of difference
Analytics serial dates are offset from Excel serial dates by one day. In Excel, 01 January 1900 has a serial
date of '1'. In Analytics, 01 January 1900 is not counted, and 02 January 1900 has a serial date of '1'.
The start_date
Some source data files may use a start date other than 01 January 1900. The start_date allows you to
match the start date in a source data file. The start date is the date from which serial datetimes are cal-
culated.
Start date in source data file: 01 January 1901
Specify: STODT(datetime_field, `19010101`)
Details: You specify a start_date of `19010101` to match the start date of 01 January 1901 used in the source data file.
Start date in source data file: 01 January 1899
Specify: STODT(datetime_field) - 365
Details: You cannot specify a start_date earlier than 01 January 1900. If a source data file uses a start date earlier than 01 January 1900, you can create a datetime expression that subtracts an appropriate number of days from the output results of the STODT( ) function.
Related functions
STOD( ) Converts a serial date – that is, a date expressed as an integer – to a date value. Abbreviation for "Serial to Date".
STOT( ) Converts a serial time – that is, a time expressed as a fractional portion of 24 hours, with 24
hours equaling 1 – to a time value. Abbreviation for "Serial to Time".
CTOD( ) Converts a character or numeric date value to a date. Can also extract the date from a char-
acter or numeric datetime value and return it as a date. Abbreviation for "Character to Date".
CTODT( ) Converts a character or numeric datetime value to a datetime. Abbreviation for "Character to
Datetime".
CTOT( ) Converts a character or numeric time value to a time. Can also extract the time from a char-
acter or numeric datetime value and return it as a time. Abbreviation for "Character to Time".
DATE( ) Extracts the date from a specified date or datetime and returns it as a character string. Can
also return the current operating system date.
DATETIME( ) Converts a datetime to a character string. Can also return the current operating system dat-
etime.
TIME( ) Extracts the time from a specified time or datetime and returns it as a character string. Can
also return the current operating system time.
STOT( ) function
Converts a serial time – that is, a time expressed as a fractional portion of 24 hours, with 24 hours equaling 1
– to a time value. Abbreviation for "Serial to Time".
Syntax
STOT(serial_time)
Parameters
Name Type Description
Output
Datetime. The time value is output using the current Analytics time display format.
Examples
Basic examples
Returns `t060000` displayed as 06:00:00 AM assuming a current Analytics time display format of hh:mm:ss
PM:
STOT(0.25000)
Returns `t191530` displayed as 07:15:30 PM assuming a current Analytics time display format of hh:mm:ss
PM:
STOT(0.802431)
Returns the equivalent regular time for each serial time value in the Login_time field:
STOT(Login_time)
Remarks
When to use STOT( )
Use the STOT( ) function to convert serial times to regular times.
Serial time Regular time equivalent
0.00 12:00:00 AM
0.0006945 12:01:00 AM
0.04167 01:00:00 AM
0.0423645 01:01:00 AM
0.042998 01:01:55 AM
0.25 06:00:00 AM
0.50 12:00:00 PM
0.75 06:00:00 PM
0.79167 07:00:00 PM
0.802431 07:15:30 PM
1.00 12:00:00 AM
Related functions
STOD( ) Converts a serial date – that is, a date expressed as an integer – to a date value. Abbreviation for "Serial to Date".
STODT( ) Converts a serial datetime – that is, a datetime expressed as an integer, and a fractional portion
of 24 hours – to a datetime value. Abbreviation for "Serial to Datetime".
CTOD( ) Converts a character or numeric date value to a date. Can also extract the date from a character
or numeric datetime value and return it as a date. Abbreviation for "Character to Date".
CTODT( ) Converts a character or numeric datetime value to a datetime. Abbreviation for "Character to
Datetime".
CTOT( ) Converts a character or numeric time value to a time. Can also extract the time from a character
or numeric datetime value and return it as a time. Abbreviation for "Character to Time".
DATE( ) Extracts the date from a specified date or datetime and returns it as a character string. Can also
return the current operating system date.
DATETIME( ) Converts a datetime to a character string. Can also return the current operating system dat-
etime.
TIME( ) Extracts the time from a specified time or datetime and returns it as a character string. Can also
return the current operating system time.
STRING( ) function
Converts a numeric value to a character string.
Syntax
STRING(number, length <,format>)
Parameters
Name Type Description
format
optional
character
The formatting to apply to the output string. For example, "(9,999.99)"
Output
Character.
Examples
Basic examples
Unformatted strings
Returns " 125.2":
STRING(125.2, 6)
Returns "25.2" (-1 is truncated because length is less than the number of digits and formatting characters in
number):
STRING(-125.2, 4)
Returns " -125.2" (a leading space is added because length is greater than the number of digits and formatting characters in number):
STRING(-125.2, 7)
Formatted strings
Returns " (125.20)":
Returns "25.20" (1 is truncated because length is less than the number of digits and formatting characters
in number):
STRING(125.2, 6, "(9,999.99)")
Field input
Returns numeric values in the Employee_number field as character strings with a length of 10 characters.
If required, the return value is padded or truncated:
STRING(Employee_number, 10)
Remarks
Padded and truncated return values
STRING( ) converts number into a character string of the length specified in length:
l If number is shorter than length, leading spaces are added to the return value
l If number is longer than length, the return value is truncated from the left side
Related functions
The STRING( ) function is the opposite of the VALUE( ) function, which converts character data to numeric
data.
SUBSTR( ) function
Returns a specified substring from a string.
Syntax
SUBSTR(string, start, length)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Literal character input
Returns "BCD":
SUBSTR("ABCDEF", 2, 3)
Returns "EF":
SUBSTR("ABCDEF", 5, 10)
SUBSTR("***189543***", 4, 6)
Returns the four-digit year out of a character field containing dates formatted as "MM/DD/YYYY":
SUBSTR(DATE, 7, 4)
Advanced examples
Increasing field length
Use SUBSTR( ) to increase the length of a character field. Increasing the length of a field is a common har-
monization task that you may need to perform before joining or appending two fields.
The example below pads the Product_Description field with blank spaces to create the computed field
Product_Description_Long with a length of 50 characters.
Creates the computed field Product_Description_Long, with a length of 50 characters, based on the physical field Product_Description, with a length of 24 characters.
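A minimal sketch of how the computed field might be defined; the DEFINE FIELD syntax shown here is a general illustration rather than the exact code from the original example:
DEFINE FIELD Product_Description_Long COMPUTED
SUBSTR(Product_Description, 1, 50)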
Note
Even though SUBSTR( ) specifies a length of 50 characters, the output is limited to the
field length of Product_Description.
Remarks
How it works
The SUBSTR( ) function returns characters from the string value starting at the character position specified by start. The number of characters returned is specified by length.
Related functions
SUBSTR( ) and SPLIT( ) both return a segment of data from a longer source string.
l SUBSTR( ) identifies the segment based on a numeric character position.
l SPLIT( ) identifies the segment based on a separator character.
TAN( ) function
Returns the tangent of an angle expressed in radians, with a precision of 15 decimal places.
Syntax
TAN(radians)
Parameters
Name Type Description
Output
Numeric.
Examples
Basic examples
Returns 0.999999999999999 (the tangent of the specified number of radians, equivalent to 45 degrees):
TAN(0.785398163397448)
TAN(45 * PI( )/180)
Advanced examples
Using degrees as input
Returns 1.000 (the tangent of 45 degrees, rounded to 3 decimal places):
DEC(TAN(45 * PI( )/180),3)
Remarks
Performing the Mantissa Arc Test
The three trigonometric functions in Analytics – SIN( ), COS( ), and TAN( ) – support performing the Man-
tissa Arc Test associated with Benford's Law.
TEST( ) function
Returns a logical value indicating whether a specified string occurs at a specific byte position in a record.
Syntax
TEST(byte_position, string)
Parameters
Name Type Description
byte_position numeric The sequential number from the left in the table layout that identifies
the location of the first character of string.
The function evaluates to F if the start of string is not identified at this
position, even if string appears at another position in the record.
Output
Logical. Returns T (true) if the specified string starts at the specified byte location within a record, and F
(false) otherwise.
Examples
Basic examples
Given a record in which the string "Department: Marketing" begins at the fifth byte position:
Department: Marketing
....|....|....|....|....|
Returns T:
TEST(5, "Department")
Returns F, because in the record, "Department" starts at the fifth byte position, not the sixth:
TEST(6, "Department")
TEST(5, "DEPARTMENT")
Advanced examples
Isolating records that are page headings
Use TEST( ) to create a filter that isolates all records that start with "Page:":
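A minimal sketch of such a filter, assuming the heading text begins at the first byte position of the record:
SET FILTER TO TEST(1, "Page:")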
TIME( ) function
Extracts the time from a specified time or datetime and returns it as a character string. Can also return the
current operating system time.
Syntax
TIME(< time/datetime> <,format>)
Parameters
Name Type Description
time/datetime datetime optional The field, expression, or literal value to extract the time from. If omitted, the current operating system time is returned in the format hh:mm:ss.
format character optional The format to apply to the output string, for example "hh:mm:ss". If omitted, the current Analytics time display format is used. You cannot specify format if you have omitted time/datetime.
Output
Character.
Examples
Basic examples
Literal input values
Returns "23:59:59" assuming an Analytics time display format of hh:mm:ss:
TIME(`20141231 235959`)
Returns the current operating system time as a character string in hh:mm:ss format (24-hour clock):
TIME()
Returns a character string for each value in the Receipt_timestamp field, using the current Analytics time display format:
TIME(Receipt_timestamp)
Returns a character string for each value in the Receipt_timestamp field, using the specified time display
format:
TIME(Receipt_timestamp, "hh:mm:ss")
Advanced examples
Calculating the elapsed time for a command or a script to execute
Use the TIME( ) function to help calculate the amount of time a particular Analytics command, or an entire
script, takes to execute.
Immediately before the command you want to time, or at the start of the script, specify this line to create a
variable that stores the current operating system time:
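For example, a sketch using an illustrative variable name (v_start_time is an assumption, not a required name):
ASSIGN v_start_time = TIME()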
Immediately after the command, or at the end of the script, specify the two lines below.
The first line creates a variable that stores the operating system time after the command or script com-
pletes. The second line calculates the difference between the finish and start times, and converts the result
to an easily readable format.
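A sketch of the two lines, assuming the start time was stored in v_start_time as above. TIME( ) returns a character value, so CTOT( ) converts the stored values to times before the subtraction, and STOT( ) converts the serial result back to a readable time:
ASSIGN v_end_time = TIME()
CALCULATE STOT(CTOT(v_end_time) - CTOT(v_start_time))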
Tip
You can double-click the CALCULATE log entry to see the elapsed time for the command
or the script.
If the command or script will run over the boundary of midnight, use this second line instead:
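One possible form, assuming the same variables; adding 1 (one full day expressed serially) compensates for the end time rolling past midnight:
CALCULATE STOT(CTOT(v_end_time) - CTOT(v_start_time) + 1)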
Remarks
Output string length
The length of the output string is always 14 characters. If the specified output format, or the Analytics time
display format, is less than 14 characters, the output string is padded with trailing blank spaces.
Parameter details
A field specified for time/datetime can use any time or datetime format, as long as the field definition cor-
rectly defines the format.
If you use format to control how the output string is displayed, you are restricted to the formats in the table
below. You can use any combination of time and AM/PM formats. The AM/PM format is optional, and is
placed last.
Specify format using single or double quotation marks. For example: "hh:mm:ss AM".
hhmm
hh
A UTC offset must be prefaced by a plus sign (+) or a minus sign (-).
l Datetime values – you can use any combination of the date, separator, and time formats listed in
the table below. The date must precede the time, and you must use a separator between the two.
Valid separators are a single blank space, the letter 't', or the letter 'T'.
Format Example
thhmmss `t235959`
Thhmm `T2359`
YYYYMMDD hhmmss `20141231 235959`
YYMMDDthhmm `141231t2359`
YYYYMMDDThh `20141231T23`
YYYYMMDD hhmmss+/-hhmm (UTC offset) `20141231 235959-0500`
YYMMDD hhmm+/-hh (UTC offset) `141231 2359+01`
Note
Do not use hh alone in the main time
format with data that has a UTC offset.
For example, avoid: hh+hhmm. Results
can be unreliable.
Related functions
If you need to return the current operating system time as a datetime value, use NOW( ) instead of TIME(
).
DATE( ) Extracts the date from a specified date or datetime and returns it as a character string. Can
also return the current operating system date.
DATETIME( ) Converts a datetime to a character string. Can also return the current operating system dat-
etime.
CTOD( ) Converts a character or numeric date value to a date. Can also extract the date from a character
or numeric datetime value and return it as a date. Abbreviation for "Character to Date".
CTODT( ) Converts a character or numeric datetime value to a datetime. Abbreviation for "Character to
Datetime".
CTOT( ) Converts a character or numeric time value to a time. Can also extract the time from a character
or numeric datetime value and return it as a time. Abbreviation for "Character to Time".
STOD( ) Converts a serial date – that is, a date expressed as an integer – to a date value. Abbreviation
for "Serial to Date".
STODT( ) Converts a serial datetime – that is, a datetime expressed as an integer, and a fractional portion
of 24 hours – to a datetime value. Abbreviation for "Serial to Datetime".
STOT( ) Converts a serial time – that is, a time expressed as a fractional portion of 24 hours, with 24
hours equaling 1 – to a time value. Abbreviation for "Serial to Time".
TODAY( ) function
Returns the current operating system date as a Datetime data type.
Syntax
TODAY()
Parameters
This function does not have any parameters.
Output
Datetime.
Examples
Basic examples
Returns the current operating system date as a datetime value, for example `20141231`, displayed using
the current Analytics date display format:
TODAY()
Remarks
Related functions
If you need to return the current operating system date as a character string, use DATE( ) instead of
TODAY( ).
TRANSFORM( ) function
Reverses the display order of bi-directional text within a specified string.
Syntax
TRANSFORM(original_string)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
In the input string, the characters "XZQB" represent Hebrew/bidirectional characters in an input string that
otherwise contains regular characters.
In the output string, the direction of "XZQB" is reversed, and returns "BQZX". The other characters are
unmodified.
Returns "ABC BQZX 123":
Remarks
How it works
The TRANSFORM( ) function identifies bi-directional data and displays it correctly in the view, from right to left.
All other characters processed by the function are unmodified and continue to display from left to right.
TRIM( ) function
Returns a string with trailing spaces removed from the input string.
Syntax
TRIM(string)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Note that in both examples the leading spaces and spaces between words are not removed by the TRIM( )
function.
Returns " Vancouver":
TRIM(" Vancouver ")
TRIM(" New York")
Advanced examples
Removing non-breaking spaces
The REPLACE( ) function replaces any non-breaking spaces with regular spaces, and then TRIM( )
removes any trailing regular spaces.
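A minimal sketch, assuming a character field named Address and using CHR(160) for the non-breaking space character:
TRIM(REPLACE(Address, CHR(160), " "))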
Remarks
How it works
The TRIM( ) function removes trailing spaces only. Spaces inside the string, and leading spaces, are not
removed.
Related functions
TRIM( ) is related to the LTRIM( ) function, which removes any leading spaces from a string, and to the
ALLTRIM( ) function, which removes both leading and trailing spaces.
UNSIGNED( ) function
Returns numeric data converted to the Unsigned data type.
Syntax
UNSIGNED(number, length_of_result)
Parameters
Name Type Description
Output
Numeric.
Examples
Basic examples
Returns 000075 (the negative sign and the decimal point, if present, are not stored):
UNSIGNED(75, 3)
UNSIGNED(-75, 3)
UNSIGNED(7.5, 3)
Returns 2456 (1 is truncated because only 4 digits can be stored when the length_of_result is 2):
UNSIGNED(12456, 2)
Returns 000000012456:
UNSIGNED(-12.456, 6)
Remarks
What is Unsigned data?
The Unsigned data type is used by mainframe operating systems to store numeric values in a format that
uses minimal space, storing two digits in each byte. The Unsigned data type is the same as the Packed
data type, but it does not use the last byte to specify whether the value is positive or negative.
UPPER( ) function
Returns a string with alphabetic characters converted to uppercase.
Syntax
UPPER(string)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Returns "ABC":
UPPER("abc")
UPPER("AbCd 12")
UPPER(Last_Name)
Remarks
How it works
The UPPER( ) function converts all alphabetic characters in string to uppercase. All non-alphabetic char-
acters are left unchanged.
UTOD( ) function
Converts a Unicode string containing a formatted date to an Analytics date value. Abbreviation for "Unicode
to Date".
Note
This function is specific to the Unicode edition of Analytics. It is not a supported function in
the non-Unicode edition.
Use this function when working with dates in languages and formats that are different from
your default installation. If the string you want to convert is in your default language, use
CTOD( ) instead.
Syntax
UTOD(string <,locale> <,style>)
Parameters
Name Type Description
locale character optional The code that specifies the language and locale of the output string, and optionally the version of the language associated with a particular country or region.
For example, "zh" specifies Chinese, and "pt_BR" specifies Brazilian Portuguese.
If omitted, the default locale for your computer is used. If a language is specified, but no country is specified, the default country for the language is used.
You cannot specify locale if you have not specified date.
For more information about locale codes, see www.unicode.org.
style numeric optional The date format style to use for the Unicode string. The format style matches the standard for the locale you specify:
o 0 – full specification format, such as "Sunday, September 18, 2016"
o 1 – long format, such as "September 18, 2016"
o 2 – medium format
o 3 – short format
Output
Datetime. The date value is output using the current Analytics date display format.
Examples
Basic examples
Note
All examples assume a current Analytics date display format of DD MMM YYYY.
In the examples below, the locale code for Chinese ( "zh" ) and Simplified Chinese ( "zh_
CN" ) match different input strings and are not interchangeable.
You must also specify the correct style. A long Unicode date string (that is, style is 1 ) does
not return an Analytics date if you specify a style of 2.
UTOD(Invoice_date, "zh", 1)
UTOD("2014年12月31日星期三", "zh_CN", 0)
UTOD("2014年12月31日", "zh_CN", 1)
Remarks
Converting Unicode strings successfully
To successfully convert Unicode strings containing dates into Analytics dates you must specify locale and
style parameters that match the language country/region (if applicable), and style of the date in the Unicode
string.
Related functions
UTOD( ) is the inverse of DTOU( ), which converts a date to a Unicode string. If you are uncertain which
country/region and style to specify for UTOD( ), you can use DTOU( ) and experiment with different para-
meters to produce an output Unicode string that matches the form of the input Unicode strings you want to
convert with UTOD( ).
VALUE( ) function
Converts a character string to a numeric value.
Syntax
VALUE(string, decimals)
Parameters
Name Type Description
Output
Numeric.
Examples
Basic examples
Returns -123.400:
VALUE("123.4-", 3)
Returns 123456.00:
VALUE("$123,456", 2)
Returns -77.45:
VALUE("77.45CR", 2)
Returns -123457:
VALUE(" (123,456.78)", 0)
Field input
Returns character values in the Salary field as numbers without any decimal places:
VALUE(Salary, 0)
Remarks
How it works
This function converts character data to numeric data. You can use the VALUE( ) function if you need to
convert character expressions or field values to numeric values for use in Analytics commands.
Negative values
The VALUE( ) function can interpret different indicators of negative values such as parentheses and the
minus sign. It can also interpret CR (credit) and DR (debit). For example:
Returns -1000.00:
VALUE("(1000)", 2)
VALUE("1000CR", 2)
VALUE("123", 2)
If the number of decimals specified by decimals is less than the number in the field or expression, the result
is rounded. For example:
Returns "10.6":
VALUE("10.56", 1)
Related functions
The VALUE( ) function is the opposite of the STRING( ) function, which converts numeric data to character
data.
VERIFY( ) function
Returns a logical value indicating whether the data in a physical data field is valid.
Syntax
VERIFY(field)
Parameters
Name Type Description
Output
Logical. Returns T (true) if data in the field is valid, and F (false) otherwise.
Examples
Basic examples
Extracts any records where the VERIFY( ) function evaluates to false to a new Analytics table:
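A minimal sketch, using an illustrative field name and output table name:
EXTRACT RECORD IF NOT VERIFY(Invoice_Amount) TO "Invalid_records"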
Remarks
The VERIFY( ) function determines whether the data in a field is consistent with the specified data type for
the field.
WORKDAY( ) function
Returns the number of workdays between two dates.
Syntax
WORKDAY(start_date, end_date <,nonworkdays>)
Parameters
Name Type Description
start_date datetime The start date of the period for which workdays are calculated. The
start date is included in the period.
end_date datetime The end date of the period for which workdays are calculated. The
end date is included in the period.
nonworkdays character optional The days of the week that are weekend days, or non workdays, and excluded from the calculation. If nonworkdays is omitted, Saturday and Sunday are used as the default non workdays.
Enter nonworkdays using the following abbreviations, separated by a space or a comma:
o Mon
o Tue
o Wed
o Thu
o Fri
o Sat
o Sun
nonworkdays is not case-sensitive. The abbreviations must be entered in English even if you are using a non-English version of Analytics.
Note
You can specify a datetime value for start_date or end_date but the time portion of the
value is ignored.
If start_date is more recent than end_date, a negative value is returned.
Output
Numeric. The number of workdays in the period for which workdays are calculated.
Examples
Basic examples
Literal input values
Returns 5 (the number of workdays between Monday, March 02, 2015 and Sunday, March 08, 2015 inclus-
ive):
WORKDAY(`20150302`, `20150308`)
Returns 6 (the number of workdays between Monday, March 02, 2015 and Sunday, March 08, 2015 inclusive, when Sunday is the only non workday):
WORKDAY(`20150302`, `20150308`, "Sun")
Returns 5 (the number of workdays between Sunday, March 01, 2015 and Saturday, March 07, 2015 inclusive, when Friday and Saturday are the non workdays):
WORKDAY(`20150301`, `20150307`, "Fri Sat")
Field input values
Returns the number of workdays between each date in the Start_date field and December 31, 2015, inclusive:
WORKDAY(Start_date, `20151231`)
Returns the number of workdays between each date in the Start_date field and a corresponding date in the
End_date field inclusive:
l Statutory holidays are included in the workdays total and may need to be factored out using a sep-
arate calculation
l A negative return value indicates a start date that is more recent than an end date
WORKDAY(Start_date, End_date)
Remarks
Date formats
A field specified for start_date or end_date can use any date format, as long as the field definition correctly
defines the format.
When specifying a literal date value for start_date or end_date, you are restricted to the formats
YYYYMMDD and YYMMDD, and you must enclose the value in backquotes – for example, `20141231`.
Adjusting the workdays total for statutory holidays
You first create a computed field, for example Workdays, that calculates the workdays for a specified period during the quarter:
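A minimal sketch, assuming the period is defined by Start_date and End_date fields:
DEFINE FIELD Workdays COMPUTED
WORKDAY(Start_date, End_date)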
You then create a conditional computed field, for example Workdays_no_holidays, that adjusts the value returned by the first computed field (Workdays):
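A minimal sketch, assuming a single statutory holiday on 7 September 2015; the holiday date and the one-day adjustment are illustrative only:
DEFINE FIELD Workdays_no_holidays COMPUTED
Workdays - 1 IF BETWEEN(`20150907`, Start_date, End_date)
Workdays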
Note
The order of the conditions in the conditional computed field is important.
Analytics evaluates multiple conditions starting at the top. The first condition that evaluates
to true for a record assigns the value of the conditional computed field for that record. A sub-
sequent condition that evaluates to true does not change the assigned value.
YEAR( ) function
Extracts the year from a specified date or datetime and returns it as a numeric value using the YYYY
format.
Syntax
YEAR(date/datetime)
Parameters
Name Type Description
date/datetime datetime The field, expression, or literal value to extract the year from.
Output
Numeric.
Examples
Basic examples
Returns 2014:
YEAR(`20141231`)
YEAR(`141231 235959`)
Returns the year for each value in the Invoice_date field:
YEAR(Invoice_date)
Remarks
Parameter details
A field specified for date/datetime can use any date or datetime format, as long as the field definition cor-
rectly defines the format.
Format Example
YYYYMMDD `20141231`
YYMMDD `141231`
YYYYMMDD hhmmss `20141231 235959`
YYMMDDthhmm `141231t2359`
YYYYMMDDThh `20141231T23`
YYYYMMDD hhmmss+/-hhmm (UTC offset) `20141231 235959-0500`
YYMMDD hhmm+/-hh (UTC offset) `141231 2359+01`
Note
Do not use hh alone in the main time
format with data that has a UTC offset. For
example, avoid: hh+hhmm. Results can be
unreliable.
ZONED( ) function
Converts numeric data to character data and adds leading zeros to the output.
Syntax
ZONED(number, length)
Parameters
Name Type Description
Output
Character.
Examples
Basic examples
Integer input
Returns "235":
ZONED(235, 3)
Returns "00235", because length is greater than the number of digits in number so two leading zeros are
added to the result:
ZONED(235, 5)
Returns "35", because length is less than the number of digits in number so the leftmost digit is truncated
from the result:
ZONED(235, 2)
Decimal input
Returns "23585", because the zoned data format does not support a decimal point:
ZONED(235.85, 5)
Negative input
Returns "64489M", because the number is negative and "M" represents the final digit 4:
ZONED(-6448.94, 6)
Returns "489J", because length is less than the number of digits in number so the two leftmost digits are trun-
cated from the result, and the number is negative and "J" represents the final digit 1:
ZONED(-6448.91, 4)
Advanced examples
Adding leading zeros to a character field containing numbers
The Employee_Number field contains the value "254879". You need to convert the value to a 10-digit string
with leading zeros.
Tip
You must use the VALUE( ) function to convert the character to numeric data before using
the numeric as input for ZONED( ).
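For example, a sketch combining the two functions as the tip describes:
ZONED(VALUE(Employee_Number, 0), 10)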
Harmonizing two key fields to perform a join
The two tables each have a CustNo field, but the data format is different:
l Ar – numeric field (for example, 235)
l Customer – 5 character field that pads numbers with leading zeros (for example, "00235")
To harmonize the fields when joining so that the data types and lengths are equal, you use the ZONED( )
function to convert the Ar key field CustNo to a character field of length 5 so that it matches the format of
the key field in Customer:
OPEN Ar PRIMARY
OPEN Customer SECONDARY
JOIN PKEY ZONED(CustNo,5) FIELDS CustNo Due Amount SKEY CustNo UNMATCHED TO Ar_Cust OPEN PRESORT SECSORT
Remarks
How it works
This function converts numeric data to character data and adds leading zeros to the output. The function is
commonly used to harmonize fields that require leading zeros, for example, check number, purchase
order number, and invoice number fields.
Decimal numbers
The zoned data format does not include an explicit decimal point.
Negative numbers
If the input number is negative, the rightmost digit is displayed as a character in the result:
l "}" for 0
l a letter between "J" and "R" for the digits 1 to 9
ZSTAT( ) function
Returns the standard Z-statistic.
Syntax
ZSTAT(actual, expected, population)
Parameters
Name Type Description
population numeric The total number of items being tested. This parameter must be a pos-
itive whole number greater than 0.
Output
Numeric.
Examples
Advanced examples
Parameters expressed as numbers
Based on 10 years of previous data, you know that the distribution of worker disability claims per month is
normally highly uniform. In April, May, and June of this year, claims were higher by about 10 percent, aver-
aging 220 per month instead of 200. Claims in July and August were slightly low, at 193 and 197. The total
claims for the year were 2,450. To test whether these high and low results were significant, use the Z-stat-
istic.
The actual number of claims for April to June is higher than expected at 660. The expected number of
claims for this period is 25 percent of the 2,450 annual claims, or 612.5. The Z-statistic for these counts is cal-
culated as 2.193:
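A sketch of the corresponding call using the counts given above:
ZSTAT(660, 612.5, 2450)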
A Z-statistic of 1.96 has a significance of 0.05, and 2.57 has a significance of 0.01. Thus, the probability that
the higher rates of claims are due to chance is between 1:20 and 1:100.
The actual number of claims for July and August is lower than expected at 390. The expected number of
claims for this period is one sixth of the 2,450 annual claims, or 408.33. The Z-statistic for these proportions
is calculated as 0.967:
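A sketch of the corresponding call:
ZSTAT(390, 408.33, 2450)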
This is not a very significant result. Z-statistics of 1.000 and less are very common and can typically be
ignored.
Parameters expressed as proportions
The actual number of claims for July and August is low at 390. The expected number of claims for this period
should be one sixth, or 16.6667 percent of the 2,450 annual claims. The Z-statistic for these proportions is
0.967:
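A sketch of the call with the counts expressed as proportions (the decimal rounding shown is illustrative):
ZSTAT(0.159184, 0.166667, 2450)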
This is not a very significant result. Z-statistics of 1.000 and less are very common and can typically be
ignored.
Remarks
How it works
The ZSTAT( ) function calculates the standard Z-statistic for use in many problem-solving tasks, including
digital analysis. It outputs the result with a precision of three decimal places.
Using ZSTAT( )
Use ZSTAT( ) to evaluate the likely frequency of occurrence of a given result in a specified period or cat-
egory. The larger the resulting Z-statistic, the more unlikely the occurrence.
For example, a Z-statistic of 1.96 has a significance of 0.05, representing the likelihood of a one time in 20
occurrence, whereas a Z-statistic of 2.57 has a significance of 0.01, representing the likelihood of a one
time in 100 occurrence. For information on the Z-statistic, consult a statistics textbook.
Analytic scripts
Scripts are not limited to running in Analytics only. By converting regular scripts to analytic scripts, you can schedule and run scripts in the Robots module of HighBond, or in Analytics Exchange. You can also run analytic scripts in the Analysis App window, a freestanding component of Analytics.
COMMENT
//ANALYTIC Identify missing checks
This analytic script identifies missing check numbers
END
Robots module of HighBond o commit one or more analytic scripts as a version to development mode in Robots, and schedule and run an activated version in production mode
Analysis App window o package the project into a compressed analysis app file (.aclapp file), open the project as an analysis app (.aclx file), and run the analytic script in the Analysis App window
For more information, see "Packaging analysis apps" on page 891.
FTYPE("ax_main") = "b"
If the script is running in either Analytics Exchange or the Analysis App window, the expression evaluates to
true (T). For scripts running in Analytics, the expression evaluates to false (F). For more information, see
"FTYPE( ) function" on page 572.
Note
AXRunByUser is only available when running analytic scripts on AX Server. The variable
is unrecognized when running scripts in Analytics.
Example
This analytic header identifies a table and field to use in the script, as well as a start date parameter:
COMMENT
//ANALYTIC Identify missing checks
This analytic identifies missing check numbers
//TABLE v_table_payments Payments Table
Select a table that lists payments and includes a check number column
//FIELD v_check_num CN Check Number
Select the field that contains the check numbers
//PARAM v_start_date D OPTIONAL Start Date (Optional)
Enter the start date for the analysis
END
Tag format
Each tag in the header uses the following format:
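The general shape is sketched below; the tag name, attributes, and description are placeholders rather than literal syntax:
//TAGNAME attribute_1 attribute_2
optional description of the tag, entered on the following line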
The // tag indicator must be the first non-whitespace character on the script line. The tag must immediately
follow the tag indicator, without any space or characters in between.
Tag conventions
Component Convention
Tag attributes When specifying attribute values for a tag, you may include spaces and optionally enclose the
value in quotation marks.
Tag descriptions Descriptions are optional. If a description is specified it can be multi-line, but line breaks are
not preserved in client applications.
Test values
When the script runs in Analytics, the parameter takes the value specified in the assignment. When the
analytic runs in a client application, the test value is ignored and the user-defined input parameters are
used.
You must leave a space between the assignment operator and the tag syntax preceding it. Assignment val-
ues must use the correct qualifier for the data type as required throughout Analytics. For more information,
see "Data types" on page 21.
"ANALYTIC" on Designates an Analytics script as an analytic that can run in Robots, on AX Server, or in the
page 846 Analysis App window.
Input tags
Tag Description
"FILE" on page 848 Specifies a non-Analytics file, such as an Excel file, or a delimited file, that provides input for
an analytic running in Robots, or on AX Server.
o Robots – the file must be located in the Input/Output tab in the same robot as the analytic
o AX Server – the file must be located in the Related Files subfolder in the folder where the
analytic is located
"TABLE" on Defines an Analytics table that the user selects as input for an analytic.
page 850
The TABLE tag can be followed by zero or more FIELD tags entered on sequential lines.
"FIELD" on Defines a field that the user selects as input for the analytic.
page 852
The field must be part of the table defined in the preceding TABLE tag. The first FIELD tag
must immediately follow a TABLE tag, and can be followed by additional FIELD tags entered
on sequential lines.
"PARAM" on Creates an input parameter for an analytic, and defines the requirements for the input value.
page 854
An input parameter is a placeholder that allows the user to specify the actual value when
scheduling or running an analytic.
"PASSWORD" on Creates a password input parameter for an analytic. The parameter provides encrypted stor-
page 866 age of a password for subsequent use in an ACLScript command.
The user is prompted to specify the required password value when they schedule or start an
analytic so that user intervention is not required as the analytic is running.
Output tags
"DATA" on page 869 Specifies that an Analytics table output by an analytic is copied to a data subfolder (a storage
location) in the deployment environment.
Typically, you store Analytics tables so that they can be used as input tables for subsequent
analytics.
"RESULT" on Specifies the analytic output results that you want to make available to end users in client
page 873 applications.
Output results, even if they exist, are not automatically made available. You must use a sep-
arate RESULT tag for each result item that you want to make available.
"PUBLISH" on Specifies a file that contains metadata defining which Analytics tables to publish to AX Excep-
page 877 tion when an analytic is finished processing.
ANALYTIC
Designates an Analytics script as an analytic that can run in Robots, on AX Server, or in the Analysis App
window.
Syntax
//ANALYTIC <TYPE IMPORT|PREPARE|ANALYSIS> name
<description>
Parameters
Name Description
TYPE optional Identifies the kind of analytic script as one of the following three types:
o IMPORT – retrieves data from a data source. The output of an import analytic is a raw data table.
o PREPARE – transforms raw data in whatever way is necessary to make it suitable for
analysis. The output of a preparation analytic is an analysis table.
o ANALYSIS – performs tests on data in analysis tables. The output of an analysis ana-
lytic is one or more results tables.
Analytic scripts with a specified TYPE are organized in the corresponding Import, Preparation, or Analysis areas in Robots, AX Web Client, and the Analysis App window. This placement guides the user in the appropriate sequence for running the analytics. The sequence is not enforced, nor is the type of functionality within the analytic.
If you omit TYPE, the analytic appears in the Analysis section.
description optional A description of the analytic or other information that the user might need to run the analytic successfully.
The description appears with the analytic in client applications. The description can be multiline, but it cannot skip lines. The description must be entered on the line below the associated ANALYTIC tag.
Examples
Basic analytic header
The following analytic header contains a name and a description of the analytic:
COMMENT
//ANALYTIC Identify missing checks
This analytic identifies missing check numbers.
END
Analytic header with a type designation
COMMENT
//ANALYTIC TYPE PREPARE Standardize address data
This analytic cleans and standardizes the address field in preparation for duplicates analysis.
END
Remarks
An ACLScript COMMENT command must be entered on the first line in the script, followed by the
ANALYTIC tag on the second line. If the ANALYTIC tag is used in any other location it is ignored.
One or more scripts in an Analytics project can include an ANALYTIC tag.
FILE
Specifies a non-Analytics file, such as an Excel file, or a delimited file, that provides input for an analytic
running in Robots, or on AX Server.
l Robots – the file must be located in the Input/Output tab in the same robot as the analytic
l AX Server – the file must be located in the Related Files subfolder in the folder where the analytic is
located
Note
To specify a non-Analytics input file for an analytic run in the Analysis App window, see
"PARAM" on page 854.
Syntax
//FILE filename
Parameters
Name Description
filename The name of the item in the robot, or in the Related Files subfolder, to use as an input
file for an analytic. filename cannot include a path.
Wildcards are supported when specifying the file name. Use a single asterisk ( * ) to sub-
stitute for zero or more characters.
For example:
o Inv12* matches all of the following: Inv12, Inv123, and Inv1234
o *.* matches all files of all extensions in the robot or the Related Files folder
o Inv_*.* matches Inv_Jan.pdf and Inv_Feb.xls
Tip
You can use the //FILE tag to reference a .prf Analytics preferences file.
When you do this, the preferences file in the Related Files subfolder is
used to set the runtime environment settings rather than the global pref-
erences file on AX Server. The preferences file must be from the latest
version of Analytics that is compatible with your Analytics Exchange
installation.
Examples
Basic examples
Specifies a specific file:
//FILE FlaggedAccounts.csv
Specifies any file with a name that starts with "Flagged" and a .csv extension:
//FILE Flagged*.csv
Specifies all files of all extensions in the robot or the Related Files folder:
//FILE *.*
Advanced examples
Import data from an included file
You run a monthly analysis of employee data on AX Server. One of the analytics in the analysis app imports
the data to analyze from a delimited file that is placed in the Related Files folder on monthly basis:
COMMENT
//ANALYTIC TYPE IMPORT employee_import
Imports employee records from delimited file stored in Related Files folder.
//FILE Employees.csv
END
IMPORT DELIMITED TO Employees "Employees.fil" FROM "Employees.csv" 0 SEPARATOR ","
QUALIFIER '"' CONSECUTIVE STARTLINE 1 KEEPTITLE FIELD "First_Name" C AT 1 DEC 0 WID
11 PIC "" AS "First Name" FIELD "Last_Name" C AT 12 DEC 0 WID 12 PIC "" AS "Last Name"
Remarks
The FILE tag is not supported for use in analytics run in the Analysis App window. To specify an input file for
analytics run in the Analysis App window, use the PARAM tag. For more information, see "PARAM" on
page 854.
TABLE
Defines an Analytics table that the user selects as input for an analytic.
The TABLE tag can be followed by zero or more FIELD tags entered on sequential lines.
Note
The TABLE tag requires that a table pre-exists in the storage location in order to be avail-
able to be selected. For more information, see "DATA" on page 869.
Syntax
//TABLE id name
<description>
Parameters
Name Description
id The variable that stores the input table name selected by the user. Use this value in the
analytic script to reference the table.
description optional Descriptive text that specifies the purpose of the table. The description can be multiline, but it cannot skip lines.
The value is displayed in client applications when the user is prompted to select the table. The description can prompt the user to select the correct table. For example, "Select the table that includes payroll information".
The description must be entered on the line below the associated TABLE tag.
Examples
Basic examples
TABLE tag with description to help user select the correct input table:
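For example (modeled on the advanced example that follows; the variable name and description are illustrative):
//TABLE v_table_payments Payments Table
Select a table that lists payments and includes a check number column.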
Advanced examples
Using a table defined in a TABLE tag in the script
The following script runs an AGE command on a table that is selected by the user from the data tables in the
project:
COMMENT
//ANALYTIC example_script
//TABLE v_table_payments Payments Table
Select a table that lists payments and includes a check number column.
END
OPEN %v_table_payments%
AGE ON payment_date CUTOFF 20141231 INTERVAL 0,30,60,90,120,10000 SUBTOTAL Pay-
ment_Amount TO r_output
CLOSE %v_table_payments%
FIELD
Defines a field that the user selects as input for the analytic.
The field must be part of the table defined in the preceding TABLE tag. The first FIELD tag must imme-
diately follow a TABLE tag, and can be followed by additional FIELD tags entered on sequential lines.
Note
The TABLE tag requires that a table pre-exists in the storage location in order to be avail-
able to be selected. For more information, see "DATA" on page 869.
Syntax
//FIELD id type name
<description>
Parameters
Name Description
id The variable that stores the input field name selected by the user. Use this value in the
analytic script to reference the field.
type The types of fields that can be selected. Any type, or combination of types, from the fol-
lowing list can be selected:
o C – character data
o N – numeric data
o D – date, datetime, or time subtype of datetime data
o L – logical data
Any computed fields in a table can be selected regardless of the type specified.
description optional Descriptive text that specifies the purpose of the field. The description can be multiline, but it cannot skip lines.
The value is displayed in client applications when the user is prompted to select the field. The description can prompt the user to select the correct field. For example, "Select the field that includes payment amount".
The description must be entered on the line below the associated FIELD tag.
Examples
Basic examples
Specifies a character field:
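For example (modeled on the analytic header shown earlier in this guide; the variable name and label are illustrative):
//FIELD v_check_num C Check Number
Select the field that contains the check numbers.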
Advanced Examples
TABLE with two accompanying FIELD tags
The following analytic header allows the user to specify two input fields from the v_table_payments table
when the script runs:
COMMENT
//ANALYTIC test analytic
//TABLE v_table_payments Payments Table
Select a table that lists payments and includes a check number column.
//FIELD v_check_num CN Check Number Field
//FIELD v_payment_date D Payment Date Field
Select the column that contains the check payment date.
END
OPEN %v_table_payments%
EXTRACT FIELDS %v_check_num%, %v_payment_date% TO t_analyze
PARAM
Creates an input parameter for an analytic, and defines the requirements for the input value.
An input parameter is a placeholder that allows the user to specify the actual value when scheduling or run-
ning an analytic.
Syntax
//PARAM id type <OPTIONAL> <MULTI> <SEPARATOR value> <QUALIFIER value>
<VALUES value_list> label
<description>
Parameters
Name Description
id The variable that stores the analytic input value(s) selected or specified by the user.
For example:
o v_start_date
o v_regions
o v_input_file
Also serves as the unique identifier for the parameter.
Note
When an analytic is run, the variable is created only if the user provides
an input value. If a parameter is optional, and the user skips it, the vari-
able is not created.
If subsequent logic in the analytic requires the variable to exist, you can
test for its existence, and if it does not exist, create it and initialize it. For
more information, see "Designing optional input parameters" on
page 860.
type The data type of the parameter, which controls what sort of input values can be entered.
The following types can be specified using uppercase letters:
o C – character data
o N – numeric data
o D – date subtype of datetime data
o DT – datetime subtype of datetime data
o T – time subtype of datetime data
o L – logical data
Note
Qualifying character input values is required for an analytic to run suc-
cessfully.
You can also specify that a file upload utility, or a Windows file browser, opens:
o F – opens a file upload utility, or a Windows file browser, and allows a user to select a
non-Analytics input file for the analytic when running in AX Web Client or the Ana-
lysis App window
Upon selection, the file name is automatically entered as a Character input value.
Specify F only. Do not specify F C.
For example:
For more information, see "Specifying or selecting a non-Analytics input file for an ana-
lytic" on page 864.
Note
A type of F is not supported for use in analytics run in Robots or AX Cli-
ent. To specify an input file for these environments, use the FILE tag. For
more information, see "FILE" on page 848.
OPTIONAL optional Specifies that the parameter is optional and the user does not need to enter a value. For more information, see "Designing optional input parameters" on page 860.
MULTI optional The user can select one or more values from a list of values specified with the VALUES option.
For more information, see "Summary of the MULTI and VALUES options" on page 861.
MULTI cannot be used if type is L (Logical).
Multiple character input values
If you specify MULTI , and type is C (Character), you can also specify the SEPARATOR
and QUALIFIER options to automatically insert separators (delimiters) and text qualifiers
in a string of input values.
Note
Delimiting and qualifying multiple character input values is required for
an analytic to run successfully. The separators and qualifiers can be
inserted automatically, or manually by the user.
SEPARATOR value optional SEPARATOR can be used only when MULTI is specified, and type is C (Character).
Specifies that a separator character is automatically inserted between multiple character input values, creating a delimited list that is passed to the analytic for processing.
value specifies the separator character to use. A commonly used separator, or delimiter, is the comma ( , ).
If SEPARATOR is omitted, a single space is used as the separator by default. The space character cannot be specified as value.
For more information, see "Delimiting and qualifying character input values" on page 862.
QUALIFIER value optional QUALIFIER can be used only when MULTI is specified, and type is C (Character).
Specifies that a text qualifier character is automatically inserted at the start and end of each character input value in a delimited list that is passed to the analytic for processing. Any text enclosed within the qualifier characters is treated as plain text.
value specifies the qualifier character to use. A commonly used qualifier is the single quotation mark ( ' ).
If QUALIFIER is omitted, there is no default qualifier used. You cannot specify a space character as value.
For more information, see "Delimiting and qualifying character input values" on page 862.
Note
Analytic input parameters currently do not support the use of the double
quotation mark (") as a text qualifier. You can use the single quotation
mark (') instead. Specifying a double quotation mark qualifier will cause
the PARAM tag to malfunction.
VALUES value_list optional A list of values that the user can select from when running the analytic.
Use the following syntax to specify the values:
VALUES with MULTI The user can select one or more values from the list of values.
VALUES without MULTI The user can select a single value from the list of values.
For more information, see "Summary of the MULTI and VALUES options" on page 861.
Format of values in value_list
As shown in the example later in this topic, the values are specified as a list delimited with pipe characters, for example: |entity1|entity2|
description optional Descriptive text that provides additional information about the parameter. In client applications, description is displayed with the input field.
description can provide instructions that assist the user. For example, "Enter the cutoff date for the payroll period".
description must be entered on the next line after the associated PARAM tag. The text can be multiline, but it cannot skip lines. Line breaks are not preserved when displayed in client applications.
Examples
Basic examples
Allows the user to optionally specify a date range:
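For example (modeled on the start date parameter shown earlier in this guide; the end date parameter is an illustrative addition):
//PARAM v_start_date D OPTIONAL Start Date (Optional)
Enter the start date for the analysis
//PARAM v_end_date D OPTIONAL End Date (Optional)
Enter the end date for the analysis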
Advanced examples
Require a user to specify an amount range
You need to classify the records in a table that fall between a minimum and maximum amount range. This
range changes occasionally, so you provide input parameters that allow the user who runs the analytic to
define the range when scheduling or running the script:
COMMENT
//ANALYTIC test_analytic
//PARAM v_min_amount N Minimum Amount
Enter a minimum amount
//PARAM v_max_amount N Maximum Amount
Enter a maximum amount
END
Allow the user to optionally exclude one or more customer numbers
COMMENT
//ANALYTIC test_analytic
//PARAM v_cust_no C OPTIONAL MULTI SEPARATOR , QUALIFIER ' Customer Number(s) to exclude (optional)
Specify one or more customer numbers. Press "Enter" after each number, so that each number is on a
separate line. Do not enclose numbers in quotation marks.
END
Allow the user to select an input file (AX Web Client or the Analysis App win-
dow only)
You are distributing an analysis app to colleagues who will run it in the Analysis App window. When they run
the analytic script in the app, you want to provide them with a Windows file browser to select a Microsoft
Excel file to import data from:
COMMENT
//ANALYTIC test_analytic
//PARAM v_input_file F Input File
Select an input file
END
Require the user to specify an input file path and file name (the Analysis App
window only)
You are distributing an analysis app to colleagues who will run it in the Analysis App window. When they run
the analytic script in the app, you want them to specify a filepath and filename to use as an import file:
COMMENT
//ANALYTIC test_analytic
//PARAM v_input_file C Input File Path and Name
Enter an absolute file path and a file name, for example: C:\Users\username\Documents\ACL Data\Sample Data Files\Trans_May.xls
END
COMMENT
//ANALYTIC test
This analytic tests the PARAM
//RESULT TABLE t_results
//PARAM v_start_date D OPTIONAL Enter Start Date
//PARAM v_end_date D OPTIONAL Enter End Date
//PARAM v_entity_list C MULTI OPTIONAL |entity1|entity2|
END
Remarks
Designing optional input parameters
If you use OPTIONAL with the PARAM tag, the variable associated with the analytic input parameter may
or may not be created when the analytic runs:
l variable automatically created – if the user specifies an input value
l variable not created – if the user skips the optional parameter and does not specify an input value
Use SEPARATOR and QUALIFIER Include the SEPARATOR and QUALIFIER options in the PARAM tag.
Manually specify separators and qualifiers Require the user of the analytic to manually specify separators and qualifiers in addition to the actual input values.
For example:
'North America','Europe','Asia'
Include qualifiers in the value_list Include qualifiers with each value in the value_list specified with the VALUES option.
Enclose the parameter variable in qualifiers In the syntax of the Analytics script, enclose the parameter variable in text qualifiers.
For example:
IF MATCH(REGIONS, "%v_regions%")
Use this last method only if you are using VALUES without MULTI.
Note
Analytic input parameters currently do not support the use of the double quotation mark (") as a
text qualifier. You can use the single quotation mark (') instead with the QUALIFIER option, in
the value_list, or when manually specifying qualifiers around input values. Double quotation
marks can be used as text qualifiers in the body of an Analytics script.
Method Details
PARAM tag with type of 'F'
o AX Web Client – user selects the input file using a file upload utility
The file name is automatically specified as the analytic input value. The file is automatically uploaded to the appropriate Related Files subfolder on AX Server.
o Analysis App window – user selects the input file using the Windows file browser
The file path and the file name are automatically specified as the analytic input value.
This method is the best option because it combines flexibility, ease of use, and precision.
PARAM tag with type of 'C'
The user manually specifies an input file path and file name as an analytic input value.
This method provides flexibility because the file path and the file name are not specified in advance. However, it is laborious and error prone because it requires the user to manually enter these values.
Input file path and file name hard-coded in the analytic
This method avoids use of the PARAM tag, however it is the least flexible. On every computer where the analytic is run, the user must ensure that the input file has a file path and a file name identical to those specified in the analytic.
PASSWORD
Creates a password input parameter for an analytic. The parameter provides encrypted storage of a pass-
word for subsequent use in an ACLScript command.
The user is prompted to specify the required password value when they schedule or start an analytic so
that user intervention is not required as the analytic is running.
Syntax
//PASSWORD index name
<description>
Parameters
Name Description
index The numerical identifier associated with the password. The value must be from 1 to 10.
name The label for the password prompt. name is displayed in the client application when the
application prompts the user to enter a password.
Examples
Create a password input parameter for a Direct Link SAP query
The analytic header specifies a password input parameter that prompts the user to enter an
SAP password. The stored password is used in the subsequent RETRIEVE command in the body of the
script.
COMMENT
//ANALYTIC SAP Password Example
//PASSWORD 1 SAP Password:
//DATA RSADMIN
END
Note
The password input parameter and the password parameter in the RETRIEVE command
are linked by using the same numerical identifier:
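For example, a sketch of the linkage; the RETRIEVE syntax and table name shown here are illustrative rather than taken from the original example:
RETRIEVE RSADMIN PASSWORD 1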
Create a password input parameter for exporting results to HighBond
COMMENT
//ANALYTIC HighBond Password Example
//PASSWORD 3 HighBond Password:
END
SET SAFETY OFF
OPEN AR_Exceptions
EXPORT FIELDS No Due Date Ref Amount Type ACLGRC PASSWORD 3 TO "10926@us"
SET SAFETY ON
Remarks
Password storage and encryption
Password values are associated with individual users, and are encrypted at rest. Passwords remain secure
throughout analytic processing, and are encrypted in any temporary files created in the deployment envir-
onment.
Testing in Analytics
If you test an analytic that has one or more PASSWORD tags in Analytics, Analytics automatically gen-
erates a PASSWORD command and prompts you to enter the appropriate password. This auto-generated
command saves you the labor of inserting PASSWORD commands in the script portion of an analytic for
the purposes of testing, and then removing them again before delivering the analytic to users.
The auto-generated PASSWORD command is saved in the log, without the password value.
Password values are not saved when you run an analytic in Analytics, so you must specify the password or
passwords each time you run the analytic, including running or stepping through the analytic from the
cursor position.
DATA
Specifies that an Analytics table output by an analytic is copied to a data subfolder (a storage location) in the
deployment environment.
Typically, you store Analytics tables so that they can be used as input tables for subsequent analytics.
Note
ACL Robotics with a cloud-based Robots Agent does not include a storage location for Ana-
lytics tables. The //DATA tag is ignored in analytics run with a cloud-based agent.
Syntax
//DATA name
Parameters
Name Description
name The name of the Analytics table to be stored. The value of name cannot contain any
spaces.
Note
The value of name must exactly match the name of the Analytics output
table in the analytic script. You are not naming a table with name, you are
matching a name specified in the script.
You must specify the table name, not the source data file name.
Correct:
//DATA Missing_Checks
Incorrect:
//DATA Missing_Checks.fil
Note
Any existing Analytics table with the same name as the value you specify
is overwritten.
Wildcard characters
You can use wildcard characters in name if part of the table name may change. For example, if the table name depends on the month (invoices-jan, invoices-feb, and so on), specifying invoices-* ensures that the table is copied to the data subfolder regardless of the month suffix.
//DATA *
Caution
Be careful when using wildcards characters. You may unintentionally
overwrite existing data tables if the wildcard pattern that you specify
matches unintended tables.
As a best practice, make the value of name as specific as possible. Use
wildcard characters only where they are required.
Uploads to Robots
For information about uploads to Robots, see "Uploads to the cloud-based Robots mod-
ule" on page 872.
Examples
Copying an Analytics table to the storage location
The following analytic header specifies that the Invoices table, which is output in the associated script, is
copied to the storage location:
COMMENT
//ANALYTIC Import Table
//DATA Invoices
END
IMPORT DELIMITED TO Invoices "Invoices.fil" FROM "Invoices.csv" 0 SEPARATOR "," QUALIFIER
'"' CONSECUTIVE STARTLINE 1 KEEPTITLE ALLCHAR ALLFIELDS
Remarks
Storing output tables
Output tables are not automatically copied to the storage location. You must use a DATA tag for each
table that you want to store. You can include multiple DATA tags in an analytic header if necessary.
Robots (Enterprise Edition only)
Use the DATA tag if: an Analytics table output in one robot task is required as input in another robot task
Do not need the DATA tag if: an Analytics table is output and subsequently input during a sequence of analytic scripts run in a single robot task, or an entire data analysis process is completed using a single analytic script
AX Server
Use the DATA tag if: an Analytics table output by one analytic script is required as input for another analytic script
Do not need the DATA tag if: an entire data analysis process is completed using a single analytic script
Analysis App window
Use the DATA tag if: an Analytics table output by one analytic script is required as input for another analytic script
Do not need the DATA tag if: an entire data analysis process is completed using a single analytic script
//DATA src_Invoices
You must add the prefix to the table name in both the //DATA tag and in the accompanying script.
The Source tables section allows you to visually separate tables that provide input for subsequent scripts.
If no output table names have the src_ prefix, the Source tables section does not appear in the Input/Out-
put tab and all tables are located by default in the Other tables section.
RESULT
Specifies the analytic output results that you want to make available to end users in client applications.
Output results, even if they exist, are not automatically made available. You must use a separate RESULT
tag for each result item that you want to make available.
Syntax
//RESULT type name
<description>
Parameters
Name Description
name The name of the result item. The name value cannot contain any spaces.
Note
The name value must exactly match the name of the result item in the ana-
lytic script. You are not naming an item with name, you are matching a
name specified in the script.
Table name
The name value specifies an Analytics table name. You must specify the table name, not
the source data file name.
Correct:
Incorrect:
Wildcard characters
You can use wildcard characters in name if part of the table name may change. For
example, if the table name depends on the month (invoices-jan, invoices-feb, and so on),
specifying invoices-* ensures that the table is made available in the results regardless of
the month suffix.
Log name
Optional. The name value specifies an analytic log file name. If you do not specify name,
the default log name is used: analytic_name.log.
Note
If you specify a log name, SET LOG TO log_name must appear in the
script.
File name
The name value specifies a non-Analytics file name.
You must specify the appropriate file extension for the type of non-Analytics file being out-
put.
Correct:
Incorrect:
Wildcard characters
You can use wildcard characters for all or part of the name value to specify all files with a
specific extension ( *.xlsx ), or to specify files where part of the file name may change.
For example, if the file name depends on the month (invoices-jan.xlsx, invoices-feb.xlsx,
and so on), specifying invoices-*.xlsx ensures that the file is made available in the results
regardless of the month suffix.
description optional Descriptive text about the result or other information. The description can be multiline, but it cannot skip lines.
Examples
RESULT tag for the analytic log:
//RESULT LOG
Remarks
Uploads to the cloud-based Robots module
With analytic scripts run in Robots installations, specifying RESULT LOG or RESULT FILE uploads analytic
log files or non-Analytics files from the on-premise Robots Agent to the cloud-based Robots module in
HighBond.
For more information about logs, see "How log files are output" below.
Specifying RESULT TABLE uploads the table layout only (field name, data type, field length). Result table
data remains on your organization's network, within the Robots Agent directory.
All information is encrypted in transit.
How log files are output
When an analytic script fails, the RESULT LOG tag is not considered in any environment:
o Robots Agent – the log file is automatically output to the Robots Agent base data folder (configuration setting = "false"), or automatically uploaded to the cloud-based Robots module (configuration setting = "true", the default). See the configuration setting UploadLogsWhenFailed in Configuring a Robots Agent.
o AX Server – the log file is automatically output to AX Server (available in client applications).
o Analysis App window – the log file is automatically output to the Results tab.
PUBLISH
Specifies a file that contains metadata defining which Analytics tables to publish to AX Exception when an
analytic is finished processing.
Syntax
//PUBLISH filename
Parameters
Name Description
filename The name of the file containing publishing metadata for AX Exception.
Examples
The analytic definition and the text file that specifies the publishing details for
the analytic
The FILE tag is required if the publish file is stored in the AX folder, so that the file is retrieved when the analytic is processed.
COMMENT
//ANALYTIC Publish Results
//RESULT TABLE Results
//FILE ex_publish.txt
//PUBLISH ex_publish.txt
END
EXTRACT RECORD TO Results
The ex_publish.txt file uploaded to the Related Files subfolder in the collection contains the following
line of text. The values must be quoted, and must use the following syntax: "table name","entity name","analytic name"
"Results","MyEntity","MyAnalytic"
Developing analytics
To facilitate debugging and to isolate errors, develop the script body before adding the analytic header.
Once you add the analytic header, use the log file and temporary test values to test how the analytic
executes. Finally, deploy the analytic to the target environment.
Note
The following workflow is a suggested approach for developing analytics, however you are
free to develop analytics in whatever manner best suits you.
For more information about assigning temporary test values, see "Specifying test input values in Analytics"
on page 844.
COMMENT
Analytic tags go here.
END
Use the PARAM tag to add input parameters that accept user-specified input values and store them in vari-
ables. For example, if you want an analytic to select data based on a date range, you need to add Start Date
and End Date input parameters that allow users to specify these dates.
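For example, a pair of date parameters might look like the following in the header (a minimal sketch; the variable names and prompts are illustrative):
//PARAM v_start_date D Start Date
Select the start of the date range
//PARAM v_end_date D End Date
Select the end of the date range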
3. If the analytic header contains an error, correct the error and click Validate Analytic Header
again to ensure that there are no further errors.
Tip
If the nature of the error is not apparent based on the error message, review the Help
topic for the associated analytic tag. Carefully compare the syntax in the topic with the
syntax in the line of the analytic header. Errors can be caused by minor discrepancies
in the analytic header syntax.
You can also perform the validation manually if you add the Check Scripts button to the Analytics toolbar.
3. If the analytic headers contain an error, correct the error and click Check Scripts again to
ensure there are no further errors.
Test locally
Test all analytics locally before deploying them to the target environment. Ensure that analytics run as expected, and that they do not require user interaction.
For more information, see "Developing analytics" on page 878.
l PASSWORD – replaced by the //PASSWORD analytic tag
l PAUSE – no equivalent
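For example, rather than prompting interactively with the PASSWORD command, a header might collect the value with a PASSWORD tag (a sketch; the identifier and prompt are illustrative):
//PASSWORD 1 Password for the external database connection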
Guidelines
l to prevent analytic processing failures, remove all interactive commands
l to ensure files can be overwritten as necessary without displaying a confirmation dialog box, add the
SET SAFETY OFF command at the beginning of an analytic and then add the SET SAFETY ON
command at the end of the analytic to restore the default behavior
l to prevent confirmation dialogs from crashing the analytic, add the OK parameter after any commands that normally display a confirmation dialog box:
l RENAME
l DELETE
l Correct any script syntax that generates a warning, and click Check Scripts again to ensure
that the warnings no longer appear.
l Ensure that the deployment environment contains a directory structure, or external scripts, that
align with the paths or external scripts specified in the analytic.
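The following two scripts produce the same classification results. The first runs each command directly against the filtered large table; the second extracts the required fields and records to a smaller analysis table first, which is generally faster when several commands operate on the same subset of a large table: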
OPEN LargeTable
SET FILTER TO trans_date >= `20091201` AND trans_date < `20100101`
COUNT
TOTAL amount
CLASSIFY ON account ACCUMULATE amount TO TransClassAccount
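The second version, which runs the same commands against the smaller extracted table: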
OPEN LargeTable
SET FILTER TO trans_date >= `20091201` AND trans_date < `20100101`
EXTRACT FIELDS trans_date desc account type amount TO AnalysisTable
OPEN AnalysisTable
COUNT
TOTAL amount
CLASSIFY ON account ACCUMULATE amount TO TransClassAccount
2. In Analytics, right-click the project entry in the Overview tab of the Navigator and select Package
Analysis App.
The Analytics project is the top-level folder in the tree view.
3. In the Select Tables dialog box, do the following:
a. If you want to include one or more of the project tables and associated data files in the analysis
app, select the table(s) and the data file(s) to include.
Note
Generally you should include only static tables and data files that are required by
one or more of the analytics in the analysis app, such as a master vendor table, or
a list of merchant category codes.
b. Optional. To include the interpretations from the existing analysis app, select Include Interpretations.
Interpretations that are associated with tables or scripts that do not exist in the new package are not
included.
c. Click To and navigate to the location where you want to save the packaged analysis app.
d. In the Save As dialog box, enter a File name with the .aclapp file extension and click Save.
e. Click OK.
Result – the packaged analysis app is saved to the location you specified. Other users can retrieve the packaged analysis app from this location, or you can distribute it by email, or by other appropriate method. You
can also import this file into AX Server.
COMMENT
//ANALYTIC TYPE PREPARE Sample Preparation analytic
This analytic prepares the raw data table for analysis and saves it to the new Analytics table "Trans_May_prepared" (the analysis table). The analytic defines a shorter version of the "Description" field because classifying only supports field lengths up to 64 characters.
//TABLE v_RawTable Table to prepare
Select the raw data table you want to prepare
//RESULT TABLE Trans_*_prepared
//DATA Trans_*_prepared
//RESULT LOG
END
COMMENT
//ANALYTIC TYPE ANALYSIS Sample Analysis analytic
This analytic classifies the analysis table and outputs the results to the new Analytics table "Classified_Trans_May_prepared" (the results table). You can specify merchant category codes, customer numbers, and date and transaction amount ranges, to restrict which records are processed.
//TABLE v_AnalysisTable Table to classify
Select the analysis table you want to classify
//FIELD v_FieldA C Field to classify on
Select the field you want to classify on
//PARAM v_codes C MULTI SEPARATOR , QUALIFIER ' VALUES |4112 Passenger Railways|4121
Taxis/Limousines|4131 Bus travel|4215 Courier Services - Air or Ground|4411 Cruise Lines|4457
Boat Leases and Boat Rentals|4722 Travel Agencies and Tour Operations|4814 Local/long-distance
calls|5812 Restaurants|5813 Drinking Places (Alcoholic Beverages)|5814 Fast food restaurants|5921
Package Stores, Beer, Wine, Liquor|5993 Cigar Stores & Stands|5994 Newsstand|7216 Dry cleaners|
MC code(s) to include
Specify one or more merchant category codes to include
//PARAM v_cust_no C OPTIONAL MULTI SEPARATOR , QUALIFIER ' Customer Number(s) to
exclude (optional)
Specify one or more customer numbers to exclude. Press "Enter" after each number, so that each number is on a separate line. Do not enclose numbers in quotation marks.
//PARAM v_start_date D VALUES
|05/01/2003|05/02/2003|05/03/2003|05/04/2003|05/05/2003|05/06/2003|05/07/2003|05/08/2003|
05/09/2003|05/10/2003|05/11/2003|05/12/2003|05/13/2003|05/14/2003|05/15/2003|05/16/2003|
05/17/2003|05/18/2003|05/19/2003|05/20/2003|05/21/2003|05/22/2003|05/23/2003|05/24/2003|
05/25/2003|05/26/2003|05/27/2003|05/28/2003|05/29/2003|05/30/2003|05/31/2003|Start Date
Select a start date
//PARAM v_end_date D End Date
Enter an end date or pick one from the calendar
//PARAM v_min_amount N Minimum Amount
Enter a minimum amount
//PARAM v_max_amount N Maximum Amount
Enter a maximum amount
//RESULT TABLE Classified_*
//RESULT LOG
END
Appendix
System requirements
Before installing Analytics, ensure that your computer meets the minimum software and hardware requirements.
Software requirements
Note
Some software prerequisites are automatically installed if not already present on your computer. For a complete list of automatically installed prerequisites, see the online documentation.
Operating system
One of the following operating systems:
o Microsoft Windows 10 (64-bit)
o Microsoft Windows 8.1 (64-bit)
o Microsoft Windows 7 Service Pack 1 (SP1) (32-bit/64-bit)
ACL for Windows is a 32-bit application that can run on the 64-bit versions of Windows.
Note
To install ACL for Windows on Windows 7, you must have Service Pack 1 installed. ACL for Windows requires Microsoft .NET 4.6.x, which cannot be installed on versions of Windows 7 prior to SP1.
Windows XP is no longer a supported operating system.
R
To use Analytics functions that integrate with the R programming language, you must install and configure R (32-bit/64-bit). You can use either CRAN R (32-bit/64-bit) or Microsoft R (64-bit only).
The following versions of R have been tested and work with Analytics:
o 3.3.2
o 3.3.1
o 3.2.5
o 3.2.3
Note
Other versions of R may work as well. However, they are not guaranteed to work. Currently, R 3.5.0 and later are not supported for use with Analytics.
The bitness of the installed version of R must match the bitness of the operating system.
If you are using one of the CRAN R packages, you may need to add the path to the R binary folder to the PATH environment variable on your computer. For example:
o C:\Program Files\R\R-<version>\bin\i386 (32-bit)
o C:\Program Files\R\R-<version>\bin\x64 (64-bit)
Note
You do not need to install R if you do not intend to use the Analytics functions integrated with this language.
Python
To use Analytics functions that integrate with the Python programming language, you must install and configure:
o Python version 3.3.x or later (32-bit)
o PYTHONPATH environment variable
o ACLPYTHONDLL environment variable
Python version 3.5.x is fully tested and supported. You may use a different version such as 3.3.x or 3.6.x, however these versions do not offer the same guarantee of testing and support with Analytics as 3.5.x.
When installing Python, you must also configure it to run on your system. For more information, see "Configuring Python for use with Analytics" on page 905.
Note
You do not need to install Python if you do not intend to use the Analytics functions integrated with this language.
The local copy of the Python Engine contained in the Analytics installation directory is not intended for use with the Analytics Python functions, or for general Python use. You must install a separate instance of Python for these purposes.
ACL Connector for Oracle
To use the ACL Connector for Oracle, you must install Oracle Instant Client 11g or 12c.
o You do not need to install Oracle Instant Client if you do not intend to use the ACL Connector for Oracle.
o The bitness of Oracle Instant Client must match your operating system's bitness. If the 32-bit Instant Client is installed on a 64-bit machine, the connection fails.
o If you are using the connector with Analytics Exchange and you install the Oracle Instant Client after AX Server, you must restart the Analytics Exchange Service before you can use the connector.
Hardware requirements
Note
The best Analytics performance in a production environment may require greater
resources than the minimum specification.
l Non-Unicode
l Unicode
For more information about Unicode and non-Unicode editions, see "Unicode versus non-Unicode editions" on page 908.
Start Analytics
To start Analytics, do one of the following:
l To create a new, empty Analytics project – under Create, click Analytic Project
l To open a recent or a sample Analytics project (.acl) – under Recent Analytics Files or Sample Files, click the name of a project
How it works
To run Python scripts, Analytics must be able to call the Python executable and find the scripts it is instructed to run. Analytics uses the PATH environment variable to locate Python and the PYTHONPATH environment variable to locate scripts.
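For example, assuming Python is installed in C:\Python35 and your Python scripts are stored in C:\python_scripts, PATH would need to include C:\Python35 and PYTHONPATH would be set to C:\python_scripts. These paths are illustrative and depend on your installation.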
3. In the System variables section, click New and enter the following variables:
Variable name Variable value Example
4. To save the variable, click OK and then in the System Properties dialog box, click OK.
Note
If you make any edits to a Python script, you must refresh the view in your Analytics project
to use the latest version of the Python script. The simplest way to refresh the view is to close
the table you are working with and then re-open it.
What is Unicode?
Unicode is a standard for encoding text that uses two or more bytes to represent each character, and characters for all languages are contained in a single character set. The Unicode editions of Galvanize products allow you to view and work with files and databases that contain Unicode-encoded data in all modern languages.
Note
Analytics and the AX Engine support little-endian (LE) encoded Unicode data. These
products cannot be used to analyze big-endian (BE) encoded data.
Conversion functions
l PACKED( )
l STRING( )
l UNSIGNED( )
l VALUE( )
l ZONED( )
String functions
l AT( )
l BLANKS( )
l INSERT( )
l LAST( )
l LENGTH( )
l REPEAT( )
l SUBSTRING( )
Miscellaneous functions
l FILESIZE( )
l LEADING( )
l OFFSET( )
l RECLEN( )
Prerequisites
To run R scripts on AX Server, you must:
1. Install a supported version of the R scripting language on your AX Server computer.
2. Add the .r extension to the file extension whitelist on AX Server.
3. In Analytics, create a project to work with and import into AX Server.
Note
For help completing these prerequisites, contact your Analytics Exchange administrator
and see:
l AX Server requirements
l Whitelisting file extensions
Example R files
The following example R files contain trivial scripts that concatenate two strings and return a single string
joined by a space character. These examples are intended to show how R scripts run on AX Server, not
how to analyze data with R.
analysis_a.r
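# value1 and value2 are not defined in this file; they are supplied by the Analytics script that calls it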
conc<-function(x, y) {
paste(x, y, sep=" ")
}
print(conc(value1, value2))
analysis_b.r
conc<-function(x, y) {
paste(y, x, sep=" ")
}
print(conc(value1, value2))
COMMENT
//ANALYTIC R integration test
verify R integration on AX Server
//DATA t_tmp
//FILE analysis_a.r
//FILE analysis_b.r
//RESULT TABLE results
END
OPEN t_tmp
CLOSE t_tmp
Note
Make sure you select the R files in the project folder as well as the Analytics project
using Ctrl+click so that they are imported into AX Server. You must also import the
source data files for the t_tmp table.
c. Click Open.
l Analysis Apps
l ACLProjectName
l analyticScriptName
l Data
l t_tmp
l Related Files
l analysis_a.r
l analysis_b.r
Results
Server explorer after running the analytic
l collectionName
l folderName
l Analysis Apps
l ACLProjectName
l analyticScriptName
l Data
l results
l Related Files
l analysis_a.r
l analysis_b.r
Results table
l value
l test value
l value test
Prerequisites
To run Python scripts on AX Server, you must:
1. Install a supported version of the Python scripting language on your AX Server computer.
2. Set the PYTHONPATH environment variable on AX Server.
3. In Analytics, create a project to work with and import into AX Server.
Note
For help completing these prerequisites, contact your Analytics Exchange administrator
and see:
l AX Server requirements
l Configuring Python for use with AX Server
Filename: lambda_example.py
COMMENT
//ANALYTIC Python integration test
verify Python integration on AX Server
//DATA py
//DATA results
//RESULT TABLE results
END
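COMMENT Loop ten times, passing the counter to the Python function myFunc in lambda_example.py and extracting each returned value to the results table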
OPEN py
GROUP
ASSIGN v_max = 11
ASSIGN v_counter = 1
LOOP WHILE v_counter < v_max
EXTRACT PYNUMERIC("lambda_example,myFunc",0,v_counter) AS "Results value" TO "results.fil"
v_counter = v_counter + 1
END
END
CLOSE py
l collectionName
l folderName
l Analysis Apps
l ACLProjectName
l analyticScriptName
l Data
l py
l Related Files
Results
Server explorer after running the analytic
l collectionName
l folderName
l Analysis Apps
l ACLProjectName
l analyticScriptName
l Data
l py
l results
l Related Files
Results table
l Results value
l 1
l 4
l 9
l 16
l 25
l 36
l 49
l 64
l 81
l 100
1000 No preference file was specified. A new default preference file was created.
1001 There is a problem with the preference file. A new default preference file was created.
1002 The project has been upgraded from an earlier version. A copy was saved with a .old extension.
1003 The project file could not be processed. The last saved project was used.
1008 The specified .old project file cannot be used. You must specify a project file with the .ACL extension.
Command errors
The following table lists the error codes that are returned when an analytic fails because of an invalid
ACLScript command. The error code number returned identifies the command that failed.
Error
Code Command
1 SAMPLE
2 EXTRACT
3 LIST
4 TOTAL
5 DEFINE
6 COMMENT
7 QUIT
8 STOP
9 BYE
10 USE
11 OPEN
12 SAVE
13 DISPLAY
14 ACTIVATE
15 CLOSE
16 HELP
17 COUNT
18 STATISTICS
19 HISTOGRAM
20 STRATIFY
21 SUMMARIZE
22 EXPLAIN
23 GROUP
24 ELSE
25 END
26 CANCEL
27 SUBMIT
28 DELETE
29 RANDOM
30 SORT
31 FIND
32 DIRECTORY
33 TYPE
34 DUMP
35 INDEX
37 SET
40 DO
41 TOP
42 EXPORT
43 VERIFY
44 SEEK
45 JOIN
46 MERGE
47 SEQUENCE
48 CALCULATE
49 PRINT
50 LOCATE
51 RENAME
54 COPY
55 REPORT
56 EJECT
58 LET
59 ACCUMULATE
63 ACCEPT
64 ASSIGN
65 AGE
66 CLASSIFY
67 PROFILE
68 DO REPORT
69 LOOP
70 PAUSE
71 SIZE
72 EVALUATE
73 DIALOG
74 IF
75 GAPS
76 DUPS
77 SQLOPEN
78 PASSWORD
79 IMPORT
80 REFRESH
81 NOTIFY
82 CONNECT
83 RETRIEVE
84 FIELDSHIFT
85 BENFORD
86 CROSSTAB
87 (not used)
88 ESCAPE
89 NOTES
90 FUZZY DUPLICATE
91 EXECUTE
Error
Code Error message
-10 The analytic results could not be saved because the destination results folder was deleted after the analytic started running.
-23 Publish failed. One or more of the table's column names are too long.
-24 Publish failed. Invalid values within data cells within an Analytics table.
-25 Publish failed. Not supported data types within table fields.
-27 Job did not run. The user was removed or does not have permission.
-28 Job did not run. Unexpected error. See the server log and Analytics log for details.
-29 Could not copy data files. The analytic failed because the required data files could not be copied to the jobs
directory.
-31 Publish failed. The exception mapping file could not be located.
-34 Failed to store job results. Check that there is sufficient space on the drive storing the jobs folder and that no
data files are locked.
l Click Display Variables on the toolbar (requires that you first add the button to the toolbar)
The second and third methods also display the remaining memory available to store variables.
WRITEn
o Created by: any command that outputs a table; Verify; Sequence
o Contains: the number of records in the output table, the number of data validity errors (Verify), or the number of sequence errors (Sequence)
OUTPUTFOLDER
o Created by: any command that outputs an Analytics table
o Contains: the path to the Analytics project folder in the Navigator that contains the output table. This is a DOS-style path that uses the format /foldername/subfoldername, in which the initial slash (/) indicates the root level in the Overview tab.
Tip
Use the SET FOLDER command to specify a different output folder or to create a new output folder.
ACL_Ver_Major, ACL_Ver_Minor, ACL_Ver_Patch
o Created by: Display Version
o Contain: the major, minor, and patch version of Analytics that is currently running (Analytics version numbers are in the format major.minor.patch)
Note
When Analytics identifies the lowest value, duplicate values are not factored out. For example, if values in ascending order are 1, 1, 2, 3, the 3rd lowest value is 2, not 3.
MODEn
o Contains: the most frequently occurring value in the first specified field
Q25n
o Contains: the first quartile value (lower quartile value) in the first specified field
Q75n
o Contains: the third quartile value (upper quartile value) in the first specified field
TOTALn
o Created by: Total; Statistics
o Contains: the sum total of the values in the first specified field
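As a rough sketch of how a script might read some of these system variables (the table and field names are illustrative):
OPEN Invoices
TOTAL amount
COMMENT TOTAL1 now holds the sum of the amount field
ASSIGN v_amount_total = TOTAL1
EXTRACT RECORD IF amount >= 5000 TO Flagged_Invoices
COMMENT WRITE1 now holds the number of records written to Flagged_Invoices
ASSIGN v_flagged_count = WRITE1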
Reserved keywords
Analytics reserves certain keywords for special purposes. You cannot name fields or variables with these
reserved keyword values.
If you add a suffix to a reserved keyword, then you can use it as a field or variable name. For example, the
name "Field" is not permitted, but "Field_1" or "Field_2" are permitted.
Note
In some cases, you are also prevented from using abbreviations of the reserved keywords,
such as "Can" (CANCEL), "Form" (FORMAT), or "Rec" (RECORD).
AXRunByUser A system variable that stores the username of the user running an analytic script on AX Server
in the format "domain\username".
D Specifies a descending sort order for the preceding expression or field name.
END Concludes the input stream and acts like a null line.
IF Specifies a condition.
LINE Used by the DEFINE COLUMN command to specify whether a field breaks over a specified
number of lines.
OTHER Indicates which fields or expressions to include, but not subtotal, in the output of the
SUMMARIZE command.