Table of Contents
- List of Examples
- List of Figures
- List of Tables
- Title and Copyright Information
- Preface
- Changes in This Release for Oracle Database Utilities
- Part I Oracle Data Pump
- 1 Overview of Oracle Data Pump
- Data Pump Components
- How Does Data Pump Move Data?
- Using Data Pump With CDBs
- Required Roles for Data Pump Export and Import Operations
- What Happens During Execution of a Data Pump Job?
- Monitoring Job Status
- File Allocation
- Exporting and Importing Between Different Database Releases
- SecureFiles LOB Considerations
- Data Pump Exit Codes
- Auditing Data Pump Jobs
- How Does Data Pump Handle Timestamp Data?
- Character Set and Globalization Support Considerations
- Oracle Data Pump Behavior with Data-Bound Collation
- 2 Data Pump Export
- What Is Data Pump Export?
- Invoking Data Pump Export
- Filtering During Export Operations
- Parameters Available in Export's Command-Line Mode
- ABORT_STEP
- ACCESS_METHOD
- ATTACH
- CLUSTER
- COMPRESSION
- COMPRESSION_ALGORITHM
- CONTENT
- DATA_OPTIONS
- DIRECTORY
- DUMPFILE
- ENCRYPTION
- ENCRYPTION_ALGORITHM
- ENCRYPTION_MODE
- ENCRYPTION_PASSWORD
- ENCRYPTION_PWD_PROMPT
- ESTIMATE
- ESTIMATE_ONLY
- EXCLUDE
- FILESIZE
- FLASHBACK_SCN
- FLASHBACK_TIME
- FULL
- HELP
- INCLUDE
- JOB_NAME
- KEEP_MASTER
- LOGFILE
- LOGTIME
- METRICS
- NETWORK_LINK
- NOLOGFILE
- PARALLEL
- PARFILE
- QUERY
- REMAP_DATA
- REUSE_DUMPFILES
- SAMPLE
- SCHEMAS
- SERVICE_NAME
- SOURCE_EDITION
- STATUS
- TABLES
- TABLESPACES
- TRANSPORT_FULL_CHECK
- TRANSPORT_TABLESPACES
- TRANSPORTABLE
- VERSION
- VIEWS_AS_TABLES
- Commands Available in Export's Interactive-Command Mode
- Examples of Using Data Pump Export
- Syntax Diagrams for Data Pump Export
- 3 Data Pump Import
- What Is Data Pump Import?
- Invoking Data Pump Import
- Filtering During Import Operations
- Parameters Available in Import's Command-Line Mode
- ABORT_STEP
- ACCESS_METHOD
- ATTACH
- CLUSTER
- CONTENT
- DATA_OPTIONS
- DIRECTORY
- DUMPFILE
- ENCRYPTION_PASSWORD
- ENCRYPTION_PWD_PROMPT
- ESTIMATE
- EXCLUDE
- FLASHBACK_SCN
- FLASHBACK_TIME
- FULL
- HELP
- INCLUDE
- JOB_NAME
- KEEP_MASTER
- LOGFILE
- LOGTIME
- MASTER_ONLY
- METRICS
- NETWORK_LINK
- NOLOGFILE
- PARALLEL
- PARFILE
- PARTITION_OPTIONS
- QUERY
- REMAP_DATA
- REMAP_DATAFILE
- REMAP_DIRECTORY
- REMAP_SCHEMA
- REMAP_TABLE
- REMAP_TABLESPACE
- SCHEMAS
- SERVICE_NAME
- SKIP_UNUSABLE_INDEXES
- SOURCE_EDITION
- SQLFILE
- STATUS
- STREAMS_CONFIGURATION
- TABLE_EXISTS_ACTION
- REUSE_DATAFILES
- TABLES
- TABLESPACES
- TARGET_EDITION
- TRANSFORM
- TRANSPORT_DATAFILES
- TRANSPORT_FULL_CHECK
- TRANSPORT_TABLESPACES
- TRANSPORTABLE
- VERSION
- VIEWS_AS_TABLES (Network Import)
- VIEWS_AS_TABLES (Non-Network Import)
- Commands Available in Import's Interactive-Command Mode
- Examples of Using Data Pump Import
- Syntax Diagrams for Data Pump Import
- 4 Data Pump Legacy Mode
- 5 Data Pump Performance
- 6 The Data Pump API
- Part II SQL*Loader
- 7 SQL*Loader Concepts
- SQL*Loader Features
- SQL*Loader Parameters
- SQL*Loader Control File
- Input Data and Data Files
- LOBFILEs and Secondary Data Files (SDFs)
- Data Conversion and Data Type Specification
- Discarded and Rejected Records
- Log File and Logging Information
- Conventional Path Loads, Direct Path Loads, and External Table Loads
- Loading Objects, Collections, and LOBs
- Partitioned Object Support
- Application Development: Direct Path Load API
- SQL*Loader Case Studies
- 8 SQL*Loader Command-Line Reference
- Invoking SQL*Loader
- Command-Line Parameters for SQL*Loader
- BAD
- BINDSIZE
- COLUMNARRAYROWS
- CONTROL
- DATA
- DATE_CACHE
- DEFAULTS
- DEGREE_OF_PARALLELISM
- DIRECT
- DIRECT_PATH_LOCK_WAIT
- DISCARD
- DISCARDMAX
- DNFS_ENABLE
- DNFS_READBUFFERS
- EMPTY_LOBS_ARE_NULL
- ERRORS
- EXTERNAL_TABLE
- FILE
- HELP
- LOAD
- LOG
- MULTITHREADING
- NO_INDEX_ERRORS
- PARALLEL
- PARFILE
- PARTITION_MEMORY
- READSIZE
- RESUMABLE
- RESUMABLE_NAME
- RESUMABLE_TIMEOUT
- ROWS
- SDF_PREFIX
- SILENT
- SKIP
- SKIP_INDEX_MAINTENANCE
- SKIP_UNUSABLE_INDEXES
- STREAMSIZE
- TRIM
- USERID
- Exit Codes for Inspection and Display
- 9 SQL*Loader Control File Reference
- Control File Contents
- Specifying Command-Line Parameters in the Control File
- Specifying File Names and Object Names
- Identifying XMLType Tables
- Specifying Field Order
- Specifying Data Files
- Specifying CSV Format Files
- Identifying Data in the Control File with BEGINDATA
- Specifying Data File Format and Buffering
- Specifying the Bad File
- Specifying the Discard File
- Specifying a NULLIF Clause At the Table Level
- Specifying Datetime Formats At the Table Level
- Handling Different Character Encoding Schemes
- Interrupted Loads
- Assembling Logical Records from Physical Records
- Loading Logical Records into Tables
- Index Options
- Benefits of Using Multiple INTO TABLE Clauses
- Bind Arrays and Conventional Path Loads
- 10 SQL*Loader Field List Reference
- Field List Contents
- Specifying the Position of a Data Field
- Specifying Columns and Fields
- SQL*Loader Data Types
- Nonportable Data Types
- Portable Data Types
- Data Type Conversions
- Data Type Conversions for Datetime and Interval Data Types
- Specifying Delimiters
- How Delimited Data Is Processed
- Conflicting Field Lengths for Character Data Types
- Specifying Field Conditions
- Using the WHEN, NULLIF, and DEFAULTIF Clauses
- Examples of Using the WHEN, NULLIF, and DEFAULTIF Clauses
- Loading Data Across Different Platforms
- Byte Ordering
- Loading All-Blank Fields
- Trimming Whitespace
- How the PRESERVE BLANKS Option Affects Whitespace Trimming
- How [NO] PRESERVE BLANKS Works with Delimiter Clauses
- Applying SQL Operators to Fields
- Using SQL*Loader to Generate Data for Input
- 11 Loading Objects, LOBs, and Collections
- Loading Column Objects
- Loading Object Tables
- Loading REF Columns
- Loading LOBs
- Loading BFILE Columns
- Loading Collections (Nested Tables and VARRAYs)
- Dynamic Versus Static SDF Specifications
- Loading a Parent Table Separately from Its Child Table
- 12 Conventional and Direct Path Loads
- Data Loading Methods
- Conventional Path Load
- Direct Path Load
- Data Conversion During Direct Path Loads
- Direct Path Load of a Partitioned or Subpartitioned Table
- Direct Path Load of a Single Partition or Subpartition
- Advantages of a Direct Path Load
- Restrictions on Using Direct Path Loads
- Restrictions on a Direct Path Load of a Single Partition
- When to Use a Direct Path Load
- Integrity Constraints
- Field Defaults on the Direct Path
- Loading into Synonyms
- Using Direct Path Load
- Optimizing Performance of Direct Path Loads
- Optimizing Direct Path Loads on Multiple-CPU Systems
- Avoiding Index Maintenance
- Direct Path Loads, Integrity Constraints, and Triggers
- Parallel Data Loading Models
- Concurrent Conventional Path Loads
- Intersegment Concurrency with Direct Path
- Intrasegment Concurrency with Direct Path
- Restrictions on Parallel Direct Path Loads
- Initiating Multiple SQL*Loader Sessions
- Parameters for Parallel Direct Path Loads
- Enabling Constraints After a Parallel Direct Path Load
- PRIMARY KEY and UNIQUE KEY Constraints
- General Performance Improvement Hints
- 13 SQL*Loader Express
Part III External Tables
- 14 External Tables Concepts
- 15 The ORACLE_LOADER Access Driver
- access_parameters Clause
- record_format_info Clause
- FIXED length
- VARIABLE size
- DELIMITED BY
- XMLTAG
- CHARACTERSET
- EXTERNAL VARIABLE DATA
- PREPROCESSOR
- LANGUAGE
- TERRITORY
- DATA IS...ENDIAN
- BYTEORDERMARK (CHECK | NOCHECK)
- STRING SIZES ARE IN
- LOAD WHEN
- BADFILE | NOBADFILE
- DISCARDFILE | NODISCARDFILE
- LOGFILE | NOLOGFILE
- SKIP
- FIELD NAMES
- READSIZE
- DISABLE_DIRECTORY_LINK_CHECK
- DATE_CACHE
- string
- condition_spec
- [directory object name:] [filename]
- condition
- IO_OPTIONS clause
- DNFS_DISABLE | DNFS_ENABLE
- DNFS_READBUFFERS
- field_definitions Clause
- column_transforms Clause
- Parallel Loading Considerations for the ORACLE_LOADER Access Driver
- Performance Hints When Using the ORACLE_LOADER Access Driver
- Restrictions When Using the ORACLE_LOADER Access Driver
- Reserved Words for the ORACLE_LOADER Access Driver
- 16 The ORACLE_DATAPUMP Access Driver
- access_parameters Clause
- Unloading and Loading Data with the ORACLE_DATAPUMP Access Driver
- Supported Data Types
- Unsupported Data Types
- Performance Hints When Using the ORACLE_DATAPUMP Access Driver
- Restrictions When Using the ORACLE_DATAPUMP Access Driver
- Reserved Words for the ORACLE_DATAPUMP Access Driver
- 17 ORACLE_HDFS and ORACLE_HIVE Access Drivers
- 18 External Tables Examples
- Using the ORACLE_LOADER Access Driver to Create Partitioned External Tables
- Using the ORACLE_DATAPUMP Access Driver to Create Partitioned External Tables
- Using the ORACLE_HDFS Access Driver to Create Partitioned External Tables
- Using the ORACLE_HIVE Access Driver to Create Partitioned External Tables
- Loading LOBs From External Tables
- Loading CSV Files From External Tables
- Part IV Other Utilities
- 19 ADRCI: ADR Command Interpreter
- About the ADR Command Interpreter (ADRCI) Utility
- Definitions
- Starting ADRCI and Getting Help
- Setting the ADRCI Homepath Before Using ADRCI Commands
- Viewing the Alert Log
- Finding Trace Files
- Viewing Incidents
- Packaging Incidents
- ADRCI Command Reference
- CREATE REPORT
- ECHO
- EXIT
- HOST
- IPS
- Using the <ADR_HOME> and <ADR_BASE> Variables in IPS Commands
- IPS ADD
- IPS ADD FILE
- IPS ADD NEW INCIDENTS
- IPS COPY IN FILE
- IPS COPY OUT FILE
- IPS CREATE PACKAGE
- IPS DELETE PACKAGE
- IPS FINALIZE
- IPS GENERATE PACKAGE
- IPS GET MANIFEST
- IPS GET METADATA
- IPS PACK
- IPS REMOVE
- IPS REMOVE FILE
- IPS SET CONFIGURATION
- IPS SHOW CONFIGURATION
- IPS SHOW FILES
- IPS SHOW INCIDENTS
- IPS SHOW PACKAGE
- IPS UNPACK FILE
- PURGE
- QUIT
- RUN
- SELECT
- SET BASE
- SET BROWSER
- SET CONTROL
- SET ECHO
- SET EDITOR
- SET HOMEPATH
- SET TERMOUT
- SHOW ALERT
- SHOW BASE
- SHOW CONTROL
- SHOW HM_RUN
- SHOW HOMEPATH
- SHOW HOMES
- SHOW INCDIR
- SHOW INCIDENT
- SHOW LOG
- SHOW PROBLEM
- SHOW REPORT
- SHOW TRACEFILE
- SPOOL
- Troubleshooting ADRCI
- 20 DBVERIFY: Offline Database Verification Utility
- 21 DBNEWID Utility
- 22 Using LogMiner to Analyze Redo Log Files
- LogMiner Benefits
- Introduction to LogMiner
- Using LogMiner in a CDB
- LogMiner Dictionary Files and Redo Log Files
- Starting LogMiner
- Querying V$LOGMNR_CONTENTS for Redo Data of Interest
- Filtering and Formatting Data Returned to V$LOGMNR_CONTENTS
- Reapplying DDL Statements Returned to V$LOGMNR_CONTENTS
- Calling DBMS_LOGMNR.START_LOGMNR Multiple Times
- Supplemental Logging
- Accessing LogMiner Operational Information in Views
- Steps in a Typical LogMiner Session
- Typical LogMiner Session Task 1: Enable Supplemental Logging
- Typical LogMiner Session Task 2: Extract a LogMiner Dictionary
- Typical LogMiner Session Task 3: Specify Redo Log Files for Analysis
- Typical LogMiner Session Task 4: Start LogMiner
- Typical LogMiner Session Task 5: Query V$LOGMNR_CONTENTS
- Typical LogMiner Session Task 6: End the LogMiner Session
- Examples Using LogMiner
- Examples of Mining by Explicitly Specifying the Redo Log Files of Interest
- Example 1: Finding All Modifications in the Last Archived Redo Log File
- Example 2: Grouping DML Statements into Committed Transactions
- Example 3: Formatting the Reconstructed SQL
- Example 4: Using the LogMiner Dictionary in the Redo Log Files
- Example 5: Tracking DDL Statements in the Internal Dictionary
- Example 6: Filtering Output by Time Range
- Examples of Mining Without Specifying the List of Redo Log Files Explicitly
- Example Scenarios
- Supported Data Types, Storage Attributes, and Database and Redo Log File Versions
- 23 Using the Metadata APIs
- Why Use the DBMS_METADATA API?
- Overview of the DBMS_METADATA API
- Using the DBMS_METADATA API to Retrieve an Object's Metadata
- Using the DBMS_METADATA API to Re-Create a Retrieved Object
- Using the DBMS_METADATA API to Retrieve Collections of Different Object Types
- Using the DBMS_METADATA_DIFF API to Compare Object Metadata
- Performance Tips for the Programmatic Interface of the DBMS_METADATA API
- Example Usage of the DBMS_METADATA API
- Summary of DBMS_METADATA Procedures
- Summary of DBMS_METADATA_DIFF Procedures
- 24 Original Export
- What Is the Export Utility?
- Before Using Export
- Invoking Export
- Export Modes
- Export Parameters
- BUFFER
- COMPRESS
- CONSISTENT
- CONSTRAINTS
- DIRECT
- FEEDBACK
- FILE
- FILESIZE
- FLASHBACK_SCN
- FLASHBACK_TIME
- FULL
- GRANTS
- HELP
- INDEXES
- LOG
- OBJECT_CONSISTENT
- OWNER
- PARFILE
- QUERY
- RECORDLENGTH
- RESUMABLE
- RESUMABLE_NAME
- RESUMABLE_TIMEOUT
- ROWS
- STATISTICS
- TABLES
- TABLESPACES
- TRANSPORT_TABLESPACE
- TRIGGERS
- TTS_FULL_CHECK
- USERID (username/password)
- VOLSIZE
- Example Export Sessions
- Warning, Error, and Completion Messages
- Exit Codes for Inspection and Display
- Conventional Path Export Versus Direct Path Export
- Invoking a Direct Path Export
- Network Considerations
- Character Set and Globalization Support Considerations
- Using Instance Affinity with Export and Import
- Considerations When Exporting Database Objects
- Exporting Sequences
- Exporting LONG and LOB Data Types
- Exporting Foreign Function Libraries
- Exporting Offline Locally-Managed Tablespaces
- Exporting Directory Aliases
- Exporting BFILE Columns and Attributes
- Exporting External Tables
- Exporting Object Type Definitions
- Exporting Nested Tables
- Exporting Advanced Queue (AQ) Tables
- Exporting Synonyms
- Possible Export Errors Related to Java Synonyms
- Support for Fine-Grained Access Control
- Transportable Tablespaces
- Exporting From a Read-Only Database
- Using Export and Import to Partition a Database Migration
- Using Different Releases of Export and Import
- 25 Original Import
- What Is the Import Utility?
- Before Using Import
- Importing into Existing Tables
- Effect of Schema and Database Triggers on Import Operations
- Invoking Import
- Import Modes
- Import Parameters
- BUFFER
- COMMIT
- COMPILE
- CONSTRAINTS
- DATA_ONLY
- DATAFILES
- DESTROY
- FEEDBACK
- FILE
- FILESIZE
- FROMUSER
- FULL
- GRANTS
- HELP
- IGNORE
- INDEXES
- INDEXFILE
- LOG
- PARFILE
- RECORDLENGTH
- RESUMABLE
- RESUMABLE_NAME
- RESUMABLE_TIMEOUT
- ROWS
- SHOW
- SKIP_UNUSABLE_INDEXES
- STATISTICS
- STREAMS_CONFIGURATION
- STREAMS_INSTANTIATION
- TABLES
- TABLESPACES
- TOID_NOVALIDATE
- TOUSER
- TRANSPORT_TABLESPACE
- TTS_OWNERS
- USERID (username/password)
- VOLSIZE
- Example Import Sessions
- Exit Codes for Inspection and Display
- Error Handling During an Import
- Table-Level and Partition-Level Import
- Controlling Index Creation and Maintenance
- Network Considerations
- Character Set and Globalization Support Considerations
- Using Instance Affinity
- Considerations When Importing Database Objects
- Importing Object Identifiers
- Importing Existing Object Tables and Tables That Contain Object Types
- Importing Nested Tables
- Importing REF Data
- Importing BFILE Columns and Directory Aliases
- Importing Foreign Function Libraries
- Importing Stored Procedures, Functions, and Packages
- Importing Java Objects
- Importing External Tables
- Importing Advanced Queue (AQ) Tables
- Importing LONG Columns
- Importing LOB Columns When Triggers Are Present
- Importing Views
- Importing Partitioned Tables
- Support for Fine-Grained Access Control
- Snapshots and Snapshot Logs
- Transportable Tablespaces
- Storage Parameters
- Read-Only Tablespaces
- Dropping a Tablespace
- Reorganizing Tablespaces
- Importing Statistics
- Using Export and Import to Partition a Database Migration
- Tuning Considerations for Import Operations
- Using Different Releases of Export and Import
- Part V Appendixes
- A SQL*Loader Syntax Diagrams
- B Instant Client for SQL*Loader, Export, and Import
- What Is the Tools Instant Client?
- Choosing the Instant Client to Install
- Installing Tools Instant Client by Downloading from OTN
- Installing Tools Instant Client from the 12c Client Release Media
- Configuring Tools Instant Client Package
- Connecting to a Database with the Tools Instant Client Package
- Uninstalling Instant Client
- Index