Data Management
As you manage your data, you may be required to modify your server and communications configuration files. The first step is understanding how and where data is described and the roles of the server and adapters in managing the processing flow.
Describing Data Sources
To access a table or view, you must first describe it using two files: a Master File and an associated Access file.
Master Files and Access Files can represent an entire table or part of a table. Also, several pairs of Master and Access Files can define different subsets of columns for the same table, or one pair of Master and Access Files can describe several tables.
Processing Requests
When requests are processed, control is passed from the server to an adapter and back. During the process, selected information is read from the Master and Access Files as described below.
The server processes a request as follows:
- The request is parsed to identify the table.
- The Master File for the table is read.
- The SUFFIX value in the Master File is checked (SUFFIX indicates the type of data source).
- Control is passed to the appropriate adapter.
The adapter then:
- Locates the corresponding Access File.
- Uses the information contained in the Master and Access Files to generate the DML statements (if necessary) required to accomplish the request.
- Passes the DML statements to the data source.
- Retrieves the answer set generated by the data source.
- Returns control to the server.
Depending on the requirements of the request, additional processing may be performed on the returned data.
Master File
A Master File describes a logical data source. A logical data source can be made up of one or more physical data sources of the same type. Each segment is a physical data source.
Master Files contain three types of declarations:
Declaration Type | Description
---|---
File | Names the file and describes the type of data source.
Segment | Identifies a table, file, view, or segment.
Field | Describes the columns of the table, view, or fields in the file.
The following guidelines apply:
- A declaration consists of attribute-value pairs separated by commas.
- Each declaration must begin on a separate line. A declaration can span as many lines as necessary, as long as no single keyword-value pair spans two lines.
- Do not use system or reserved words as names for files, segments, fields, or aliases. Specifying a reserved word generates syntax errors.
Specify a File Declaration in a Master File
A Master File begins with a file declaration, which has at least two attributes:
- FILENAME. Identifies the Master File.
- SUFFIX. Identifies the adapter needed to interpret the request.
The syntax for a file declaration is
FILE[NAME]=file, SUFFIX=suffix [,$]
where:

- file. Is the file name for the Master File. The file name should start with a letter and be representative of the table or view contents. The actual file must have a .mas extension, but the value for this attribute should not include the extension. The file name without the .mas extension can consist of a maximum of eight alphanumeric characters.
- suffix. Identifies the adapter needed to interpret the request. For example, SQLORA is the value for the Adapter for Oracle.
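For example, a file declaration for a Master File describing a hypothetical Oracle table might look like this (the name EMPINFO is illustrative):

```
FILENAME=EMPINFO, SUFFIX=SQLORA ,$
```

This declaration would be stored in a file named empinfo.mas.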
Specify a Segment Declaration in a Master File
Each table described in a Master File requires a segment declaration, which consists of at least two attributes:
- SEGNAME. Identifies one table.
- SEGTYPE. Identifies the physical storage of rows and the uniqueness of column values.
The syntax for a segment declaration is
SEGNAME=segname, SEGTYPE=S0 [,$]
where:

- segname. Is the segment name that serves as a link to the actual table name. It may be the same as the name chosen for FILENAME, the actual table name, or an arbitrary name. It can consist of a maximum of 8 alphanumeric characters. The SEGNAME value in the Master File must be the same as the SEGNAME value specified in the Access File, where the TABLENAME portion of the segment declaration contains the fully qualified name of the table.
- S0. Indicates that the RDBMS is responsible for both the physical storage of rows and the uniqueness of column values (if a unique index or constraint exists). SEGTYPE always has a value of S0 (S zero).
Specify a Field Declaration in a Master File
Each row in a table may consist of one or more columns. These columns are described in the Master File as fields with the following primary field attributes:
- FIELDNAME. Identifies the name of a field.
- ALIAS. Identifies the full column name.
- USAGE. Identifies how to display a field on reports.
- ACTUAL. Identifies the data type and length in bytes for a field.
- MISSING. Identifies whether a field supports null data.
You can obtain values for these attributes by using the system catalog table ALL_TAB_COLUMNS for the existing table or view you wish to describe.
The syntax for a field declaration is
FIELD[NAME]=fieldname, [ALIAS=]sqlcolumn, [USAGE=]display_format, [ACTUAL=]storage_format [,MISSING={ON|OFF}], $
where:

- fieldname. Is the name of the field. This value must be unique within the Master File. The name can consist of a maximum of 48 alphanumeric characters, including letters, digits, and underscores, and must begin with a letter. Special characters and embedded blanks are not recommended. The order of field declarations in the Master File is significant with regard to the specification of key columns. For more information, see Primary Key. It is not necessary to describe all the columns of the table in your Master File.
- sqlcolumn. Is the full column name (the adapter uses it to generate SQL statements). This value must comply with the naming conventions for identifiers, where a name should start with a letter and may be followed by any combination of letters, digits, or underscores. Embedded spaces are not allowed.
- display_format. Is the display format. The value must include the field type and length and may contain edit options. The data type of the display format must be identical to that of the ACTUAL format. For example, a field with an alphanumeric USAGE data type must have an alphanumeric ACTUAL data type. Fields or columns with decimal or floating-point data types must be described with the correct scale (s) and precision (p). Scale is the number of positions to the right of the decimal point. Precision is the total length of the field. For the server, the total display length of the field or column includes the decimal point and negative sign. In SQL, the total length of the field or column excludes the decimal point and negative sign. For example, a column defined as DECIMAL(5,2) would have a USAGE attribute of P7.2 to allow for the decimal point and a possible negative sign.
- storage_format. Is the storage format of the data type and length in bytes.
- ON. Displays the character specified by the NODATA parameter for missing data. For related information, see MISSING Attribute.
- OFF. Displays blanks or zeroes for fields having no value. This is the default. For more information, see MISSING Attribute.
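Putting the file, segment, and field declarations together, a minimal Master File for a hypothetical EMPINFO table might look like the following sketch (all names, formats, and the SQLORA suffix are illustrative):

```
FILENAME=EMPINFO, SUFFIX=SQLORA ,$
 SEGNAME=EMPINFO, SEGTYPE=S0 ,$
  FIELDNAME=EMP_ID ,ALIAS=EMP_ID ,USAGE=A9    ,ACTUAL=A9 ,MISSING=OFF ,$
  FIELDNAME=SALARY ,ALIAS=SALARY ,USAGE=P12.2 ,ACTUAL=P8 ,MISSING=ON  ,$
```

Because field order is significant, the first field declared (EMP_ID) would be treated as the primary key if the Access File specifies KEYS=1.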
MISSING Attribute
In a table, a null value represents a missing or unknown value. It is not the same as a blank or a zero. For example, a column specification that allows null values is used where a column need not have a value in every row (such as a raise amount in a table containing payroll data).
- The default NODATA character is a period.
- A column in a table that allows null data does not need to include the NULL clause in its table definition, since that is the default.
- In the Master File for that table, the column that allows null data must be described with the MISSING attribute value ON. The default for this attribute is OFF, which corresponds to the NOT NULL attribute in the table definition.
If the column allows null data but the corresponding field in the Master File is described with the MISSING attribute value OFF, null data values appear as zeroes or blanks.
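For example, a nullable raise-amount column (names and formats are illustrative) would be described with MISSING=ON so that null values display as the NODATA character rather than as zeroes or blanks:

```
FIELDNAME=RAISE_AMT ,ALIAS=RAISE_AMT ,USAGE=P10.2 ,ACTUAL=P8 ,MISSING=ON ,$
```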
Access File
Each Master File may have a corresponding Access File. The name of the Access File must be identical to that of the Master File, but the extension will be .acx instead of .mas.
The Access File serves as a link between the server and the data source by providing the means to associate a segment in the Master File with the table it describes. The Access File minimally identifies the table and primary key (if there is one). It may also indicate the logical sort order of data and identify storage areas for the table.
Specify a Segment Declaration in an Access File
The segment declaration in the Access File establishes the link between one segment of the Master File and the actual table or view. Attributes that constitute the segment declaration are:
- SEGNAME. Identifies one table.
- TABLENAME. Identifies the table or view. It may contain the owner ID as well as the table name.
- KEYS. Identifies how many columns constitute the primary key.
- KEYORDER. Identifies the logical sort sequence of data by the primary key.
The syntax for a segment declaration in an Access File is
SEGNAME=segname, TABLENAME=owner.tablename databaselink [,KEYS={n|0}] [,KEYORDER={LOW|HIGH}] ,$
where:

- segname. Is the same value as the SEGNAME value in the Master File.
- owner. Is the user ID by default.
- tablename. Is the name of the table or view.
- databaselink. Is the DATABASE LINK name to be used in the currently connected database server.
- n. Is the number of columns that constitute the primary key. It can be a value from 0 to 16. The default value is 0. For more information, see Primary Key.
- LOW. Indicates an ascending primary key logical sort order. This value is the default.
- HIGH. Indicates a descending primary key logical sort order.
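For example, an Access File segment declaration for a hypothetical EMPINFO table owned by SCOTT (both names are illustrative) might be:

```
SEGNAME=EMPINFO, TABLENAME=SCOTT.EMPINFO, KEYS=1, KEYORDER=LOW ,$
```

The SEGNAME value must match the SEGNAME in the Master File, and KEYS=1 declares that the first field described in the Master File is the primary key.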
Primary Key
A primary key consists of the column or combination of columns whose values uniquely identify each row of the table. In the employee table, for example, every employee is assigned a unique employee identification number. Each employee is represented by one and only one row of the table, and is uniquely identified by that identification number.
The primary key is defined partly in the Master File and partly in the Access File:
- The order of field declarations in the Master File is significant to the specification of key columns. To define the primary key in a Master File, describe its component fields immediately after the segment declaration. You can specify the remaining fields in any order. In the Access File, the KEYS attribute completes the process of defining the primary key.
- To identify the primary key, the adapter uses the number of columns (n) indicated by the KEYS attribute in the Access File and the first n fields described in the Master File.
Typically, the primary key is supported by a unique index that prevents the insertion of duplicate key values. The adapter itself does not require any index on the columns comprising the primary key, although a unique index is certainly desirable for both data integrity and performance.
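As a sketch of this two-file definition (all names are illustrative), a two-column primary key is declared by listing its component fields first in the Master File and setting KEYS=2 in the Access File:

```
Master File:
 SEGNAME=ORDERS, SEGTYPE=S0 ,$
  FIELDNAME=ORDER_NO ,ALIAS=ORDER_NO ,USAGE=A10 ,ACTUAL=A10 ,$
  FIELDNAME=LINE_NO  ,ALIAS=LINE_NO  ,USAGE=I5  ,ACTUAL=I4  ,$
  FIELDNAME=QTY      ,ALIAS=QTY      ,USAGE=I7  ,ACTUAL=I4  ,$

Access File:
 SEGNAME=ORDERS, TABLENAME=SCOTT.ORDERS, KEYS=2 ,$
```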
Creating Virtual Fields
A virtual field is a field whose value is not stored in the data source but is calculated, when the data is retrieved, from fields that are stored there. You create virtual fields with the DEFINE command.
Create Virtual Fields With the DEFINE Command
DEFINE fieldname/format [WITH fieldname]=expression ;$
where:

- fieldname. Is the name of the virtual field. It can consist of 1 to 48 characters. You must not qualify the field name.
- format. Provides the display format for the field and follows the rules for USAGE formats. This operand is optional. If not specified, the default value is D12.2.
- WITH fieldname. Must be coded when the expression is a constant. Any real field can be chosen from the same segment with which the DEFINE is associated.
- expression. Can be either a mathematical or a logical statement. It can consist of constants, database fields, and virtual fields. The expression must end with a semicolon followed by a dollar sign (;$).
Place your DEFINE statements after all of the field descriptions in the segment. If you are using the DESCRIPTION or TITLE attributes with virtual fields, you must place these attributes on a separate line.
Defining a Virtual Field in a Master File
In the example that follows, the virtual field PROFIT is defined at the end of the segment named BODY.
SEGMENT=BODY, SEGTYPE=S0, PARENT=CARREC,$
 FIELDNAME=BODYTYPE    ,ALIAS=BODYTYPE    ,A12 ,A12 ,$
 FIELDNAME=DEALER_COST ,ALIAS=DEALER_COST ,D8  ,D8  ,$
 FIELDNAME=RETAIL_COST ,ALIAS=RETAIL_COST ,D8  ,D8  ,$
 DEFINE PROFIT/D8 = RETAIL_COST - DEALER_COST ;DESC=NET_COST,
  TITLE='NET,COST' ,$
As a result of this DEFINE statement, you can use PROFIT as a field name in reports. PROFIT is treated as a field with a value equal to the value of RETAIL_COST minus DEALER_COST.
- Since the complete data source needs to be read to calculate virtual fields, screening conditions on virtual fields may incur additional overhead.
- Virtual fields in the Master File for relational and remote data sources will, if referenced in a query, disable Automatic Passthru.
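Once defined in the Master File, the virtual field can be referenced in a report request like any stored field. Assuming the Master File above is named CAR, a request might look like the following sketch:

```
TABLE FILE CAR
PRINT BODYTYPE DEALER_COST RETAIL_COST PROFIT
END
```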
Cross-Century Dates
Many existing business applications use two digits to designate a year, instead of four digits. When they receive a value for a year, such as 00, they typically interpret it as 1900, assuming that the first two digits are 19, for the twentieth century. There is considerable risk that date-sensitive calculations in existing applications are wrong unless a mechanism is provided for determining the century in question. This impacts almost every type of application, including those that process mortgages, insurance policies, anniversaries, bonds, inventory replenishment, contracts, leases, pensions, receivables, and customer records.
The cross-century dates feature enables you to solve this problem at the file and field level of your applications. You can retain your global settings while changing the file-level settings for greater flexibility.
You can enable this feature:
- Using SET commands.
- At the file level in a Master File.
- At the field level in a Master File.
Cross-Century Dates SET Commands
The server delivers SET commands that provide a means of interpreting the century if the first two digits of the year are not provided:
SET DEFCENT
SET YRTHRESH
If the first two digits are provided, they are simply accepted and validated.
Implement a Cross-Century Date
The DEFCENT syntax is

SET DEFCENT=nn

where:

- nn. Is the default century. It is 19 unless otherwise specified.

The YRTHRESH syntax is

SET YRTHRESH=nn

where:

- nn. Is the year threshold. It is zero unless otherwise specified.

The combination of DEFCENT and YRTHRESH establishes a base year for a 100-year window. Any 2-digit year is assumed to fall within that window, and the first two digits are set accordingly. Years outside the declared window must be handled by user coding.

The default values for the two commands are SET DEFCENT=19 and SET YRTHRESH=00. When you provide a year threshold, 2-digit years greater than or equal to that threshold assume the century assigned by DEFCENT. Years lower than the threshold assume DEFCENT plus 1.
To see how DEFCENT and YRTHRESH are applied to interpret 2-digit years, consider the following:
SET DEFCENT=19, SET YRTHRESH=80
This set of commands describes a range from 1980 to 2079. If a 2-digit year field contains the value 99, then the server interprets the year as 1999. If the year field is 79, then the year is interpreted as 2079. If the year field is 00, then the year is interpreted as 2000.
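As another sketch, a window that lies entirely in the twenty-first century can be declared as follows:

```
SET DEFCENT=20, SET YRTHRESH=00
```

This set of commands describes a range from 2000 to 2099, so a 2-digit year of 45 is interpreted as 2045 and a year of 99 as 2099.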
Master File Syntax
Instead of using SET commands, you can include settings at the file level in a Master File, or at the field level in a Master File.
Add Cross-Century Date Settings at the File Level
The FDEFCENT syntax is

{FDEFCENT|FDFC}=nn

where:

- nn. Is 19, unless otherwise specified.

The FYRTHRESH syntax is

{FYRTHRESH|FYRT}=nn

where:

- nn. Is zero, unless otherwise specified.
Add Cross-Century Date Settings at the Field Level
At the field level, DEFCENT and YRTHRESH can be added. The DEFCENT syntax is

{DEFCENT|DFC}=nn

where:

- nn. Is 19, unless otherwise specified.

The YRTHRESH syntax is

{YRTHRESH|YRT}=nn

where:

- nn. Is zero, unless otherwise specified.
Add Cross-Century Dates Using a DEFINE Command
DEFINE FILE EMPLOYEE
fld/fmt [{DEFCENT|DFC} nn {YRTHRESH|YRT} nn] [MISSING...] = expression;
END
The DFC and YRT syntax must follow the field format information.
Implementing Cross-Century Dates
The following example illustrates how century interpretation is implemented at both the file level and field level in a Master File.
FILENAME=EMPLOYEE, SUFFIX=FOC, FDEFCENT=20, FYRTHRESH=66,$
 SEGNAME=EMPINFO, SEGTYPE=S1
  FIELDNAME=EMP_ID,     ALIAS=EID, FORMAT=A9, $
  FIELDNAME=LAST_NAME,  ALIAS=LN,  FORMAT=A15, $
  FIELDNAME=FIRST_NAME, ALIAS=FN,  FORMAT=A10, $
  FIELDNAME=HIRE_DATE,  ALIAS=HDT, FORMAT=I6YMD, DEFCENT=19, YRTHRESH=75,$
The next example illustrates the conversion of a 2-digit year field with the DEFINE command. The format of HIRE_DATE is I6YMD:

DEFINE FILE EMPLOYEE
ESHIRE_DATE/YYMD = HIRE_DATE;
ESHIRE/YYMD DFC 19 YRT 80 = HIRE_DATE;
END
Synonym Management Options
You can right-click a synonym in the Application navigation pane of either the WebFOCUS Reporting Server browser interface or the ibi™ Data Migrator desktop interface to access the following options.
Option | Description
---|---
Open | Opens the Master File for viewing and editing using a graphical interface. If an Access File is used, it is also available.
Edit as Text | Enables you to view and manually edit the Master File synonym. Note: To update the synonym, it is strongly recommended that you use the graphical interface provided by the Open option, rather than manually editing the Master File.
Edit Access File as Text | Enables you to view and manually edit the Access File synonym. Note: This option is available only when an Access File is created as part of the synonym.
Sample Data | Retrieves up to 20 rows from the associated data source.
Data Profiling | Provides the data characteristics for synonym columns. For alphanumeric columns: the count of distinct values, total count, maximum, minimum, average length, and number of nulls. For numeric columns: the count of distinct values, total count, maximum, minimum, average value, and number of nulls.
Refresh Synonym (if applicable) | Regenerates the synonym. Use this option if the underlying object has been altered.
Data Management | Followed by these options, if applicable: Recreate DBMS Table. Recreates the data source table. You are asked to confirm this selection before the table is regenerated. (Note that the table will be dropped and recreated. During the process, data may be lost.) Delete All Data. Deletes all existing data. You are asked to confirm this selection before the data is deleted. Drop Table. Drops the table so that it is removed from the DBMS. Insert Sample Data. Inserts the specified number of sample records, populating all fields with counter values. Show/Modify Data. Opens a window that shows the data in the data source with buttons you can click to insert values, filter values, reload the data source, and customize the view. Reorganize. Recreates the data source table, preserving original data. Note: This option is not available in the WebFOCUS Reporting Server browser interface.
Impact Analysis | Generates a report showing where this synonym is stored and used, with links to the synonym instances. Impact Analysis reports enable you to evaluate changes before they are made by showing which components will be affected. See the ibi™ WebFOCUS® Reporting Server Administration manual for details about Impact Analysis reports.
Dependencies Analysis | Generates a report showing information about the synonym and other synonyms and objects that are referenced within it.
Copy | Copies the synonym to the clipboard.
Delete | Deletes the synonym. You are asked to confirm this selection before the synonym is deleted.
Cut | Deletes the synonym and places it on the clipboard.
Privileges | Shows the security subjects on the server and the privileges that they have to this synonym.
Properties | Displays the properties of the synonym, including physical location, last modified date, description, and privileges.
Data Type Support Report
SQL data type mapping options are shown in a report that is available from the WebFOCUS Reporting Server browser interface.
Access the Data Type Report
To access the Data Type Report:

- Select Get Data from the main menu.
- From the side menu, click the Data Types button. The Filter Data Types Report page opens.
- Select an Adapter Subcategory from the corresponding list box.
- Select an Adapter from the corresponding list box.
- Select a Server Data Type from the corresponding list box.
- Click Show Report. The Filter Data Types Report is displayed.
Changing the Precision and Scale of Numeric Columns
You can alter the length and scale of numeric columns returned by a SELECT request to the server by creating different specifications in your login profile or in a stored procedure. The conversion settings are reflected in the Master File in the USAGE and ACTUAL formats of the fields generated by CREATE SYNONYM. This affects how the fields are processed and formatted by the server.
Override the Default Precision and Scale
ENGINE ADAPTER_ID SET CONVERSION RESET
ENGINE ADAPTER_ID SET CONVERSION format RESET
ENGINE ADAPTER_ID SET CONVERSION format [PRECISION precision [scale]]
ENGINE ADAPTER_ID SET CONVERSION format [PRECISION MAX]
where:
- ADAPTER_ID. Indicates the adapter. You can omit this value if you previously issued the SET SQLENGINE command.
The adapter values are:
Adapter | Adapter ID
---|---
Cache | dbsqlism
Db2 | dbdb2
Excel | dbsqlexc
HP Vertica | dbsqlvrt
Hyperstage PG | dbsqlhyp
i Access | dbsqliia
CA-IDMS/DB | dbidmsr
Informix | dbsqlinf
JDBC | dbsqljdb
MariaDB | dbmariadb
Microsoft Access | dbsqlmac
Microsoft SQL Server | dbsqlmss
Microsoft SQL Server ODBC | dbmsodbc
MySQL | dbsqlmys
Netezza | dbsqlnez
ODBC | dbsqlodb
Oracle | dbsqlora
PostgreSQL | dbsqlpst
Progress | dbsqlpro
PSQL | dbsqlpsq
SAP Hana | dbsqlhan
Remote Servers | dbeda
Sybase | dbsqlsyb
Teradata | dbsqldbc
UniVerse | dbsqluv
UniData | dbsqlund
XML | dbxml
- RESET. Returns any previously specified precision and scale values to the adapter defaults. If you specify RESET immediately following the SET CONVERSION command, all data types return to the defaults. If you specify RESET following a particular data type, only columns of that data type are reset.
- format. Is any valid format supported by the data source. Possible values are:
  - INTEGER. The command applies only to INTEGER columns.
  - DECIMAL. The command applies only to DECIMAL columns.
  - REAL. The command applies only to single-precision floating-point columns. Only applies to Db2, CA-IDMS/SQL, Microsoft SQL Server, and Sybase.
  - FLOAT. The command applies only to double-precision floating-point columns.
- precision. Is the precision. Must be greater than 1 and less than or equal to the maximum allowable value for the data type (see the description of MAX).
- scale. Is the scale. This is valid with DECIMAL, FLOAT, and REAL data types. If you do not specify a value for scale, the current scale setting remains in effect. The default scale value is 2. If the scale is not required, you must set it to 0 (zero).
- MAX. Sets the precision to the maximum allowable value for the indicated data type:
Data Type | MAX Precision
---|---
INTEGER | 11
DECIMAL | 18
REAL | 9
FLOAT | 20
However, when issuing a SELECT statement, the answer set description does not use the information in the Master File. The length and scale used for the answer set description depends on whether a CONVERSION command is in effect.
If a CONVERSION command is in effect, the answer set description uses the length and scale that is set by the CONVERSION command.
If a CONVERSION command is not in effect, the answer set description uses the actual length and scale of the data.
Setting the Precision and Scale Attributes
The following example shows how to set the precision attribute for all INTEGER and SMALLINT fields to 7:
ENGINE ADAPTER_ID SET CONVERSION INTEGER PRECISION 7
The following example shows how to set the precision attribute for all DOUBLE PRECISION fields to 14 and the scale attribute to 3:
ENGINE ADAPTER_ID SET CONVERSION FLOAT PRECISION 14 3
The following example shows how to set the precision attribute for all INTEGER and SMALLINT fields to the default:
ENGINE ADAPTER_ID SET CONVERSION INTEGER RESET
The following example shows how to set the precision and scale attributes for all fields to the defaults:
ENGINE ADAPTER_ID SET CONVERSION RESET
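As a further sketch, the following would describe all DECIMAL fields with a precision of 8 and no decimal places (recall that scale must be explicitly set to 0 when it is not required):

```
ENGINE ADAPTER_ID SET CONVERSION DECIMAL PRECISION 8 0
```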