Oracle® Fusion Middleware Connectivity and Knowledge Modules Guide for Oracle Data Integrator
11g Release 1 (11.1.1)

Part Number E12644-03

13 IBM DB2 for iSeries

This chapter describes how to work with IBM DB2 for iSeries in Oracle Data Integrator.

This chapter includes the following sections:

  • Section 13.1, "Introduction"

  • Section 13.2, "Installation and Configuration"

  • Section 13.3, "Setting up the Topology"

  • Section 13.4, "Setting Up an Integration Project"

  • Section 13.5, "Creating and Reverse-Engineering an IBM DB2/400 Model"

  • Section 13.6, "Setting up Changed Data Capture"

  • Section 13.7, "Setting up Data Quality"

  • Section 13.8, "Designing an Interface"

  • Section 13.9, "Specific Considerations with DB2 for iSeries"

  • Section 13.10, "Troubleshooting"

13.1 Introduction

Oracle Data Integrator (ODI) seamlessly integrates data in IBM DB2 for iSeries. Oracle Data Integrator features are designed to work best with IBM DB2 for iSeries, including reverse-engineering, changed data capture, data integrity check, and integration interfaces.

13.1.1 Concepts

The IBM DB2 for iSeries concepts map the Oracle Data Integrator concepts as follows: An IBM DB2 for iSeries server corresponds to a data server in Oracle Data Integrator. Within this server, a collection or schema maps to an Oracle Data Integrator physical schema. A set of related objects within one schema corresponds to a data model, and each table, view or synonym will appear as an ODI datastore, with its attributes, columns and constraints.

Oracle Data Integrator uses Java Database Connectivity (JDBC) to connect to IBM DB2 for iSeries.

13.1.2 Knowledge Modules

Oracle Data Integrator provides the Knowledge Modules (KM) listed in Table 13-1 for handling IBM DB2 for iSeries data. In addition to these specific IBM DB2 for iSeries Knowledge Modules, it is also possible to use the generic SQL KMs with IBM DB2 for iSeries. See Chapter 4, "Generic SQL" for more information.

Table 13-1 IBM DB2 for iSeries Knowledge Modules

Knowledge Module Description

IKM DB2 400 Incremental Update

Integrates data in an IBM DB2 for iSeries target table in incremental update mode.

IKM DB2 400 Incremental Update (CPYF)

Integrates data in an IBM DB2 for iSeries target table in incremental update mode. This IKM is similar to the "IKM DB2 400 Incremental Update" except that it uses the CPYF native OS/400 command to write to the target table, instead of set-based SQL operations.

IKM DB2 400 Slowly Changing Dimension

Integrates data in an IBM DB2 for iSeries target table used as a Type II Slowly Changing Dimension in your Data Warehouse.

JKM DB2 400 Consistent

Creates the journalizing infrastructure for consistent journalizing on IBM DB2 for iSeries tables using triggers.

JKM DB2 400 Simple

Creates the journalizing infrastructure for simple journalizing on IBM DB2 for iSeries tables using triggers.

JKM DB2 400 Simple (Journal)

Creates the journalizing infrastructure for simple journalizing on IBM DB2 for iSeries tables using the journals.

LKM DB2 400 Journal to SQL

Loads data from an IBM DB2 for iSeries source to an ANSI SQL-92 compliant staging area database. This LKM can source from tables journalized with the JKM DB2 400 Simple (Journal) as it refreshes the CDC infrastructure from the journals.

LKM DB2 400 to DB2 400

Loads data from an IBM DB2 for iSeries source database to an IBM DB2 for iSeries staging area database using CRTDDMF to create a DDM file on the target and transfer data from the source to this DDM file using CPYF.

LKM SQL to DB2 400 (CPYFRMIMPF)

Loads data from an ANSI SQL-92 compliant source database to an IBM DB2 for iSeries staging area database using a temporary file loaded into the DB2 staging area with CPYFRMIMPF.

RKM DB2 400

Retrieves metadata for IBM DB2 for iSeries: physical files, tables, views, foreign keys, unique keys.


13.2 Installation and Configuration

Make sure you have read the information in this section before you start working with the IBM DB2 for iSeries technology:

  • Section 13.2.1, "System Requirements and Certifications"

  • Section 13.2.2, "Technology Specific Requirements"

  • Section 13.2.3, "Connectivity Requirements"

13.2.1 System Requirements and Certifications

Before performing any installation you should read the system requirements and certification documentation to ensure that your environment meets the minimum installation requirements for the products you are installing.

The list of supported platforms and versions is available on Oracle Technology Network (OTN):

http://www.oracle.com/technology/products/oracle-data-integrator/index.html.

13.2.2 Technology Specific Requirements

Some of the Knowledge Modules for IBM DB2 for iSeries use specific features of this database. The following restrictions apply when using these Knowledge Modules.

See the IBM DB2 for iSeries documentation for additional information on these topics.

Using System commands

This section describes the requirements that must be met before using iSeries specific commands in the knowledge modules for IBM DB2 for iSeries:

  • Knowledge modules using system commands such as CPYF or CPYFRMIMPF require that the agent runs on the iSeries system.

Using CDC with Journals

This section describes the requirements that must be met before using the Journal-based Change Data Capture with IBM DB2 for iSeries:

  • This journalizing method requires that a specific program is installed and runs on the iSeries system. See Setting up Changed Data Capture for more information.

13.2.3 Connectivity Requirements

This section lists the requirements for connecting to an IBM DB2 for iSeries system.

JDBC Driver

Oracle Data Integrator is installed with a default IBM DB2 DataDirect driver. This driver directly uses the TCP/IP network layer and requires no other installed component or configuration. Alternatively, you can use the drivers provided by IBM, such as the Native driver, when installing the agent on iSeries.

13.3 Setting up the Topology

Setting up the Topology consists of:

  1. Creating a DB2/400 Data Server

  2. Creating a DB2/400 Physical Schema

13.3.1 Creating a DB2/400 Data Server

An IBM DB2/400 data server corresponds to an iSeries server connected with a specific user account. This user will have access to several databases in this server, corresponding to the physical schemas in Oracle Data Integrator created under the data server.

13.3.1.1 Creation of the Data Server

Create a data server for the IBM DB2/400 technology using the standard procedure, as described in "Creating a Data Server" of the Oracle Fusion Middleware Developer's Guide for Oracle Data Integrator. This section details only the fields required or specific for defining an IBM DB2/400 data server:

  1. In the Definition tab:

    • Name: Name of the data server that will appear in Oracle Data Integrator

    • Server: Physical name of the data server

    • User/Password: DB2 user with its password

  2. In the JDBC tab:

    • JDBC Driver: weblogic.jdbc.db2.DB2Driver

    • JDBC URL: jdbc:weblogic:db2://hostname:port[;property=value[;...]]
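As an illustration of this URL format, the following sketch assembles a DataDirect DB2 URL from a host name, port, and optional properties. The host name, port number, and DatabaseName property below are hypothetical examples, not values prescribed by the driver; check your own topology settings.

```python
# Minimal sketch: assembling a jdbc:weblogic:db2 URL from its parts.
# Host name, port, and property values are illustrative examples only.

def build_db2_url(hostname, port, **properties):
    """Build a DataDirect DB2 JDBC URL with optional ;key=value properties."""
    url = f"jdbc:weblogic:db2://{hostname}:{port}"
    for key, value in properties.items():
        url += f";{key}={value}"
    return url

# Example values (hypothetical host and database name):
url = build_db2_url("iseries1", 446, DatabaseName="MYDB")
print(url)  # jdbc:weblogic:db2://iseries1:446;DatabaseName=MYDB
```

This URL string is what you would paste into the JDBC URL field of the data server definition.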

13.3.2 Creating a DB2/400 Physical Schema

Create an IBM DB2/400 physical schema using the standard procedure, as described in "Creating a Physical Schema" of the Oracle Fusion Middleware Developer's Guide for Oracle Data Integrator.

The work schema and data schema in this physical schema each correspond to a schema (collection or library). The work schema should point to a temporary schema, and the data schema should point to the schema hosting the data to integrate.

Create a logical schema for this physical schema using the standard procedure, as described in "Creating a Logical Schema" of the Oracle Fusion Middleware Developer's Guide for Oracle Data Integrator, and associate it in a given context.

13.4 Setting Up an Integration Project

Setting up a project using the IBM DB2 for iSeries database follows the standard procedure. See "Creating an Integration Project" of the Oracle Fusion Middleware Developer's Guide for Oracle Data Integrator.

It is recommended to import the following knowledge modules into your project for getting started with IBM DB2 for iSeries:

  • IKM DB2 400 Incremental Update

  • IKM DB2 400 Incremental Update (CPYF)

  • IKM DB2 400 Slowly Changing Dimension

  • JKM DB2 400 Consistent

  • JKM DB2 400 Simple

  • JKM DB2 400 Simple (Journal)

  • LKM DB2 400 Journal to SQL

  • LKM DB2 400 to DB2 400

  • LKM SQL to DB2 400 (CPYFRMIMPF)

  • RKM DB2 400

13.5 Creating and Reverse-Engineering an IBM DB2/400 Model

This section contains the following topics:

  • Section 13.5.1, "Create an IBM DB2/400 Model"

  • Section 13.5.2, "Reverse-engineer an IBM DB2/400 Model"

13.5.1 Create an IBM DB2/400 Model

Create an IBM DB2/400 Model using the standard procedure, as described in "Creating a Model" of the Oracle Fusion Middleware Developer's Guide for Oracle Data Integrator.

13.5.2 Reverse-engineer an IBM DB2/400 Model

IBM DB2 for iSeries supports both Standard reverse-engineering - which uses only the abilities of the JDBC driver - and Customized reverse-engineering, which uses a RKM to retrieve the metadata.

In most cases, consider starting with the standard JDBC reverse-engineering.

Consider switching to customized reverse-engineering for retrieving more metadata. IBM DB2 for iSeries customized reverse-engineering retrieves the physical files, database tables, database views, columns, foreign keys and primary and alternate keys.

Standard Reverse-Engineering

To perform a Standard Reverse-Engineering on IBM DB2 for iSeries use the usual procedure, as described in "Reverse-engineering a Model" of the Oracle Fusion Middleware Developer's Guide for Oracle Data Integrator.

Customized Reverse-Engineering

To perform a Customized Reverse-Engineering on IBM DB2 for iSeries with a RKM, use the usual procedure, as described in "Reverse-engineering a Model" of the Oracle Fusion Middleware Developer's Guide for Oracle Data Integrator. This section details only the fields specific to the IBM DB2/400 technology:

In the Reverse tab of the IBM DB2/400 Model, select the KM: RKM DB2 400.<project name>.

13.6 Setting up Changed Data Capture

Oracle Data Integrator handles Changed Data Capture on iSeries with two methods:

  • Trigger-based CDC. See Section 13.6.1, "Setting up Trigger-Based CDC".

  • Log-based CDC, using the native iSeries journals. See Section 13.6.2, "Setting up Log-Based CDC".

13.6.1 Setting up Trigger-Based CDC

This method supports Simple Journalizing and Consistent Set Journalizing. The IBM DB2 for iSeries JKMs use triggers to capture data changes on the source tables.

Oracle Data Integrator provides the Knowledge Modules listed in Table 13-2 for journalizing IBM DB2 for iSeries tables using triggers.

See Chapter "Working with Changed Data Capture" of the Oracle Fusion Middleware Developer's Guide for Oracle Data Integrator for details on how to set up journalizing and how to use captured changes.

Table 13-2 IBM DB2 for iSeries Journalizing Knowledge Modules

KM Notes

JKM DB2 400 Consistent

Creates the journalizing infrastructure for consistent journalizing on IBM DB2 for iSeries tables using triggers.

JKM DB2 400 Simple

Creates the journalizing infrastructure for simple journalizing on IBM DB2 for iSeries tables using triggers.


13.6.2 Setting up Log-Based CDC

This method is set up with the JKM DB2/400 Journal Simple and used by the LKM DB2/400 Journal to SQL. It also uses an RPG program to retrieve the journal content.

13.6.2.1 How does it work?

An iSeries transaction journal contains the entire history of the data changes for a given period. It is handled by the iSeries system for tables that are journaled. A journaled table is either a table from a collection, or a table for which a journal receiver and a journal have been created and journaling started.

Reading the transaction journal is performed by the CDCRTVJRN journal retriever RPG program provided with Oracle Data Integrator. This program loads, on demand, the tables of the Oracle Data Integrator CDC infrastructure (J$ tables) with the contents of the transaction journal.

This program can be either scheduled on the iSeries system or called by the KMs through a stored procedure also called CDCRTVJRN. This stored procedure is automatically created by the JKM DB2/400 Journal Simple and invoked by the LKM DB2/400 Journal to SQL when data extraction is needed.

13.6.2.2 CDCRTVJRN Program Details

This program connects to the native iSeries journal for a given table, and captures changed data information into the Oracle Data Integrator Journal (J$).

The program works as follows:

  1. Journalized table attributes retrieval:

    1. Table attributes retrieval: PK columns, J$ table name, last journal reading date.

    2. Attributes enrichment (short names, record size, etc.) using the QSYS.QADBXREF system table.

    3. Location of the iSeries journal using the QADBRTVFD() API.

  2. PK columns information retrieval:

    1. PK columns attributes (short name, data types etc.) using the QSYS.QADBIFLD system table.

    2. Attributes enrichment (real physical length) using the QUSLFLD() API.

    3. Data preprocessing (RPG to SQL datatype conversion) for the primary key columns.

  3. Extraction of the native journal information into the J$ table:

    1. Native journal reading using the QJoRetrieveJournalEntries() API.

    2. Conversion of the raw data to native SQL data and capture into the J$ table.

    3. Update of the changes count.

This program accepts the parameters listed in Table 13-3.

Table 13-3 CDCRTVJRN Program Parameters

Parameter RPG Type SQL Type Description

SbsTName

A138

Char(138)

Full name of the subscribers table in the following format: <Lib>.<Table>.

Example: ODILIB.SNP_SUBSCRIBERS

JrnTName

A138

Char(138)

Full name of the table for which the extract is done from the journal.

Example: FINANCE.MY_COMPANY_ORDERS

JrnSubscriber

A50

Char(50)

Name of the current subscriber. It must previously have been added to the list of subscribers.

LogMessages

A1

Char(1)

Flag activating logging in a spool file. Possible values are: Y to enable logging, and N to disable logging.


13.6.2.3 Installing the CDC Components on iSeries

There are two major components installed on the iSeries system to enable native journal reading:

  • The CDCRTVJRN Program. This program is provided in an archive that should be installed on the iSeries system. The installation process is described below.

  • The CDC Infrastructure. It includes the standard CDC objects (J$ tables, views, ...) and the CDCRTVJRN Stored Procedure created by the JKM and used by the LKM to read journals. This stored procedure executes the CDCRTVJRN program.

Note:

The program must be set up in a library defined in the Topology as the default work library for this iSeries data server. In the examples below, this library is called ODILIB.

Installing the CDCRTVJRN Program

To install the CDCRTVJRN program:

  1. Identify the location of the program's SAVF file. It is located in the ODI_HOME/setup/manual/cdc-iseries directory, and is also available on the Oracle Data Integrator Companion CD.

  2. Connect to the iSeries system.

  3. Create the default work library if it does not exist yet. You can use, for example, the following command to create an ODILIB library:

    CRTLIB LIB(ODILIB)
    
  4. Create in this library an empty save file that has the same name as the SAVF file (mandatory). For example:

    CRTSAVF FILE(ODILIB/SAVPGM0110)
    
  5. Upload the local SAVF file on the iSeries system in the library and on top of the file you have just created. Make sure that the upload process is performed in binary mode.

    An FTP command sequence performing the upload is given below as an example.

    FTP 192.168.0.13
    LCD /oracle/odi/setup/manual/cdc-iseries/
    BI
    CD ODILIB
    PUT SAVPGM0110
    BYE
    
  6. Restore the objects from the save file, using the RSTOBJ command. For example:

    RSTOBJ OBJ(*ALL) SAVLIB(CDCODIRELE) DEV(*SAVF) OBJTYPE(*ALL) SAVF(ODILIB/SAVPGM0110) RSTLIB(ODILIB)
    
  7. Check that the objects are correctly restored. The target library should contain a program object called CDCRTVJRN.

    Use the following command to view it:

    WRKOBJ OBJ(ODILIB/CDCRTVJRN)
    

The CDCRTVJRN Stored Procedure

This procedure is used to call the CDCRTVJRN program. It is automatically created by the JKM DB2/400 Journal Simple KM when journalizing is started. Journalizing startup is described in the Change Data Capture topic.

The syntax for the stored procedure is provided below for reference:

create procedure ODILIB.CDCRTVJRN(
   SbsTName char(138), /* Qualified Subscriber Table Name */
   JrnTName char(138), /* Qualified Table Name */
   Subscriber char(50) , /* Subscriber Name */
   LogMessages char(1) /* Create a Log (Y - Yes, N - No) */
)
language rpgle
external name 'ODILIB/CDCRTVJRN'
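Once created, the procedure can be invoked directly for testing, passing the parameters described in Table 13-3. The following call is a sketch using the example values from that table; the subscriber name SUNOPSIS is purely illustrative:

```sql
call ODILIB.CDCRTVJRN(
   'ODILIB.SNP_SUBSCRIBERS',    /* Qualified subscribers table name */
   'FINANCE.MY_COMPANY_ORDERS', /* Qualified journaled table name   */
   'SUNOPSIS',                  /* Subscriber (illustrative name)   */
   'Y'                          /* Log messages to a spool file     */
)
```

In normal operation you do not call the procedure yourself: the LKM DB2/400 Journal to SQL issues an equivalent call when data extraction is needed.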

Note:

The stored procedure and the program are installed in a library defined in the Topology as the default work library for this iSeries data server.

13.6.2.4 Using the CDC with the Native Journals

Once the program is installed and the CDC is set up, using the native journals consists of using the LKM DB2/400 Journal to SQL to extract journalized data from the iSeries system. The retrieval process is triggered if the RETRIEVE_JOURNAL_ENTRIES option is set to true for the LKM.

13.6.2.5 Problems While Reading Journals

This section lists possible issues encountered when using this changed data capture method.

CDCRTVJRN Program Limits

The following limits exist for the CDCRTVJRN program:

  • The source table should be journaled and the iSeries journal should be readable by the user specified in the iSeries data server.

  • The source table should have one PK defined in Oracle Data Integrator.

  • The PK declared in Oracle Data Integrator should be within the first 4096 octets of the physical record of the data file.

  • The number of columns in the PK should not exceed 16.

  • The total number of characters in the PK column names, plus the number of PK columns, should not exceed 255.

  • Large object datatypes are not supported in the PK. Only the following SQL types are supported in the PK: SMALLINT, INTEGER, BIGINT, DECIMAL (Packed), NUMERIC (Zoned), FLOAT, REAL, DOUBLE, CHAR, VARCHAR, CHAR VARYING, DATE, TIME, TIMESTAMP and ROWID.

  • Several instances of CDCRTVJRN should not be started simultaneously on the same system.

  • Reinitializing the sequence number in the iSeries journal may have a critical impact on the program (program hangs) if the journal entries consumption date (SNP_SUBSCRIBERS.JRN_CURFROMDATE) is before the sequence initialization date. To work around this problem, you should manually set a later date in SNP_SUBSCRIBERS.JRN_CURFROMDATE.
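As a sketch of the workaround above, the consumption date can be moved forward with a SQL statement against the subscribers table. The library name, the subscriber-identifying column (JRN_SUBSCRIBER), the subscriber value, and the timestamp are all assumptions for illustration; check the actual structure and contents of SNP_SUBSCRIBERS on your system before running anything similar:

```sql
update ODILIB.SNP_SUBSCRIBERS
set    JRN_CURFROMDATE = '2010-01-01-00.00.00.000000'
where  JRN_SUBSCRIBER  = 'SUNOPSIS'
```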

Troubleshooting the CDCRTVJRN Program

The journal reading process can be put in trace mode:

  • either by calling, from your query tool, the CDCRTVJRN stored procedure with the LogMessages parameter set to Y,

  • or by forcing the CREATE_SPOOL_FILE LKM option to 1 then restarting the interface.

The reading process logs are stored in a spool file which can be reviewed using the WRKSPLF command.

You can also review the raw contents of the iSeries journal using the DSPJRN command.

13.7 Setting up Data Quality

Oracle Data Integrator provides the generic CKM SQL for checking data integrity against constraints defined in DB2/400. See "Set up Flow Control and Post-Integration Control" in the Oracle Fusion Middleware Developer's Guide for Oracle Data Integrator for details.

See Chapter 4, "Generic SQL" for more information.

13.8 Designing an Interface

You can use IBM DB2 for iSeries as a source, staging area or a target of an integration interface.

The KM choice for an interface or a check determines the abilities and performance of this interface or check. The recommendations in this section help in the selection of the KM for different situations concerning an IBM DB2 for iSeries data server.

13.8.1 Loading Data from and to IBM DB2 for iSeries

IBM DB2 for iSeries can be used as a source, target or staging area of an interface. The LKM choice in the Interface Flow tab to load data between IBM DB2 for iSeries and another type of data server is essential for the performance of an interface.

13.8.1.1 Loading Data from IBM DB2 for iSeries

Oracle Data Integrator provides Knowledge Modules that implement optimized methods for loading data from IBM DB2 for iSeries to a target or staging area database. These optimized IBM DB2 for iSeries KMs are listed in Table 13-4.

In addition to these KMs, you can also use the Generic SQL KMs or the KMs specific to the other technology involved to load data from IBM DB2 for iSeries to a target or staging area database.

Table 13-4 KMs for loading data from IBM DB2 for iSeries

Source or Staging Area Technology KM Notes

IBM DB2 for iSeries

LKM DB2 400 to DB2 400

Loads data from an IBM DB2 for iSeries source database to an IBM DB2 for iSeries staging area database using CRTDDMF to create a DDM file on the target and transfer data from the source to this DDM file using CPYF.

IBM DB2 for iSeries

LKM DB2 400 Journal to SQL

Loads data from an IBM DB2 for iSeries source to an ANSI SQL-92 compliant staging area database. This LKM can source from tables journalized with the JKM DB2 400 Simple (Journal) as it refreshes the CDC infrastructure from the journals.


13.8.1.2 Loading Data to IBM DB2 for iSeries

Oracle Data Integrator provides Knowledge Modules that implement optimized methods for loading data from a source or staging area into an IBM DB2 for iSeries database. These optimized IBM DB2 for iSeries KMs are listed in Table 13-5.

In addition to these KMs, you can also use the Generic SQL KMs or the KMs specific to the other technology involved.

Table 13-5 KMs for loading data to IBM DB2 for iSeries

Source or Staging Area Technology KM Notes

IBM DB2 for iSeries

LKM DB2 400 to DB2 400

Loads data from an IBM DB2 for iSeries source database to an IBM DB2 for iSeries staging area database using CRTDDMF to create a DDM file on the target and transfer data from the source to this DDM file using CPYF.

SQL

LKM SQL to DB2 400 (CPYFRMIMPF)

Loads data from an ANSI SQL-92 compliant source database to an IBM DB2 for iSeries staging area database using a temporary file loaded into the DB2 staging area with CPYFRMIMPF.


13.8.2 Integrating Data in IBM DB2 for iSeries

Oracle Data Integrator provides Knowledge Modules that implement optimized data integration strategies for IBM DB2 for iSeries. These optimized IBM DB2 for iSeries KMs are listed in Table 13-6.

In addition to these KMs, you can also use the Generic SQL KMs.

The IKM choice in the Interface Flow tab determines the performance and possibilities of the integration.

Table 13-6 KMs for integrating data to IBM DB2 for iSeries

KM Notes

IKM DB2 400 Incremental Update

Integrates data in an IBM DB2 for iSeries target table in incremental update mode.

IKM DB2 400 Incremental Update (CPYF)

Integrates data in an IBM DB2 for iSeries target table in incremental update mode. This IKM is similar to the "IKM DB2 400 Incremental Update" except that it uses the CPYF native OS/400 command to write to the target table, instead of set-based SQL operations.

IKM DB2 400 Slowly Changing Dimension

Integrates data in an IBM DB2 for iSeries target table used as a Type II Slowly Changing Dimension in your Data Warehouse.


Using Slowly Changing Dimensions

For using slowly changing dimensions, make sure to set the Slowly Changing Dimension value for each column of the target datastore. This value is used by the IKM DB2 400 Slowly Changing Dimension to identify the Surrogate Key, Natural Key, Overwrite or Insert Column, Current Record Flag and Start/End Timestamps columns.

13.9 Specific Considerations with DB2 for iSeries

This section provides specific considerations when using Oracle Data Integrator in an iSeries environment.

13.9.1 Installing the Run-Time Agent on iSeries

The Oracle Data Integrator Standalone Agent can be installed on iSeries.

See the Oracle Fusion Middleware Installation Guide for Oracle Data Integrator for more information.

13.9.2 Alternative Connectivity Methods for iSeries

It is preferable to use the built-in IBM DB2 DataDirect driver in most cases. This driver directly uses the TCP/IP network layer and requires no other components installed on the client machine. Other methods exist to connect to DB2 on iSeries.

13.9.2.1 Using Client Access

It is also possible to connect through ODBC with the IBM Client Access component installed on the machine. This method has poor performance and does not support reverse-engineering and some other features. It is therefore not recommended.

13.9.2.2 Using the IBM JT/400 and Native Drivers

The IBM JT/400 driver is delivered as a jt400.zip file that you must copy into the drivers directory of your Oracle Data Integrator installation.

To connect DB2 for iSeries with a Java application installed on the iSeries machine, IBM recommends that you use the JT/400 Native driver (jt400native.jar) instead of the JT/400 driver (jt400.jar). The Native driver provides optimized access to the DB2 system, but works only from the iSeries machine.

To support seamlessly both drivers with one connection, Oracle Data Integrator has a built-in Driver Wrapper for AS/400. This wrapper connects through the Native driver if possible, otherwise it uses the JT/400 driver. It is recommended that you use this wrapper if running agents installed on AS/400 systems.

To configure a data server with the driver wrapper:

  1. Change the driver and URL to your AS/400 server with the following information:

    • Driver: com.sunopsis.jdbc.driver.wrapper.SnpsDriverWrapper

    • URL: jdbc:snps400:<machine_name>[;param1=value1[;param2=value2...]]

  2. Set the following Java properties for the Java machine running the run-time agent deployed on iSeries:

    • HOST_NAME: Comma-separated list of host names identifying the current machine.

    • HOST_IP: IP Address of the current machine.

    These values allow the wrapper to identify whether this data server is accessed from the iSeries machine or from a remote machine.
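For example, when the agent is started manually with a java command, these properties can be passed as -D options. The host names and IP address below are examples only, and the rest of the startup command depends on your agent installation:

```
java -DHOST_NAME=ISERIES1,ISERIES1.MYCOMPANY.COM -DHOST_IP=192.168.0.13 ...
```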

13.10 Troubleshooting

This section provides information on how to troubleshoot problems that you might encounter when using Oracle Knowledge Modules. It contains the following topics:

  • Section 13.10.1, "Troubleshooting Error messages"

  • Section 13.10.2, "Common Problems and Solutions"

13.10.1 Troubleshooting Error messages

Errors in Oracle Data Integrator often appear in the following way:

java.sql.SQLException: The application server rejected the connection. (Signon was canceled.)
    at ...
    at ...

The java.sql.SQLException code simply indicates that a query was made to the database through the JDBC driver, which has returned an error. This error is frequently a database or driver error, and must be interpreted in this light.

Only the part of the text in bold needs to be taken into account at first. Search for it in the DB2 or iSeries documentation. It sometimes contains an error code specific to your system, with which the error can be immediately identified.

If such an error is identified in the execution log, it is necessary to analyze the SQL code sent to the database to find the source of the error. The code is displayed in the description tab of the erroneous task.

13.10.2 Common Problems and Solutions

This section describes common problems and solutions.

13.10.2.1 Connection Errors

  • UnknownDriverException

    The JDBC driver is incorrect. Check the name of the driver.

  • The application requester cannot establish the connection.(<name or IP address>) Cannot open a socket on host: <name or IP address>, port: 8471 (Exception: java.net.UnknownHostException:<name or IP address>)

    Oracle Data Integrator cannot connect to the database. Either the machine name or IP address is invalid, the DB2/400 Services are not started or the TCP/IP interface on AS/400 is not started. Try to ping the AS/400 machine using the same machine name or IP address, and check with the system administrator that the appropriate services are started.

  • Datasource not found or driver name not specified

    The ODBC Datasource specified in the JDBC URL is incorrect.

  • The application server rejected the connection.(Signon was canceled.) Database login failed, please verify userid and password. Communication Link Failure. Comm RC=8001 - CWBSY0001 - ...

    The user profile used is not valid. This error occurs when typing an invalid user name or an incorrect password.

  • Communication Link Failure

    An error occurred with the ODBC connectivity. Refer to the Client Access documentation for more information.

  • SQL5001 - Column qualifier or table &2 undefined. SQL5016 - Object name &1 not valid for naming convention

    Your JDBC connection or ODBC Datasource is configured to use the wrong naming convention. Use the ODBC Administrator to change your datasource to use the proper (*SQL or *SYS) naming convention, or use the appropriate option in the JDBC URL to force the naming convention (for instance, jdbc:as400://195.10.10.13;naming=system). Note that if using the system naming convention in the Local Object Mask of the Physical Schema, you must enter %SCHEMA/%OBJECT instead of %SCHEMA.%OBJECT.

    "*SQL" should always be used unless your application is specifically designed for *SYS. Oracle Data Integrator uses the *SQL naming convention by default.

  • SQL0204 &1 in &2 type *&3 not found

    The table you are trying to access does not exist. This may be linked to an error in the context choice, or in the sequence of operations (E.g.: The table is a temporary table which must be created by another interface).

  • Hexadecimal characters appear in the target tables. Accented characters are incorrectly transferred.

    The iSeries computer attaches a language identifier, or CCSID, to files, tables and even fields (columns). CCSID 65535 is a generic code that identifies a file or field as being language-independent, that is, hexadecimal data. By definition, no translation is performed by the drivers. If you do not wish to update the CCSID of the file, translation can be forced in the JDBC URL using the flags ccsid=<ccsid code> and convert_ccsid_65535=yes|no. See the driver's documentation for more information.

  • SQL0901 SQL system error

    This error is an internal error of the DB2/400 system.

  • SQL0206 Column &1 not in specified tables

    Keying error in a mapping/join/filter. A string which is not a column name is interpreted as a column name, or a column name is misspelled.

    This error may also appear when accessing an error table associated with a datastore whose structure was recently modified. It is necessary to propagate the modification to the error table, or to drop the error table and let Oracle Data Integrator recreate it at the next execution.
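As an example of the CCSID translation flags described earlier, a JT/400-style URL forcing translation of CCSID 65535 data might look as follows. The host address reuses the example above, and the CCSID value 297 is an example only; use the CCSID appropriate to your system:

```
jdbc:as400://195.10.10.13;ccsid=297;convert_ccsid_65535=yes
```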