4 Data Pump Legacy Mode
Data Pump legacy mode lets you use original Export and Import parameters on the Data Pump Export and Data Pump Import command lines.
- Oracle Data Pump Legacy Mode Use Cases
Oracle Data Pump enters legacy mode when it encounters legacy export or import parameters, so that you can continue using existing scripts.
- Parameter Mappings
Describes how original Export and Import parameters map to the Data Pump Export and Import parameters that supply similar functionality.
- Management of File Locations in Data Pump Legacy Mode
Original Export and Import and Data Pump Export and Import differ on where dump files and log files can be written to and read from, because the original utilities are client-based and Data Pump is server-based.
- Adjusting Existing Scripts for Data Pump Log Files and Errors
Describes how to adjust existing scripts for Data Pump log files and errors.
Parent topic: Oracle Data Pump
4.1 Oracle Data Pump Legacy Mode Use Cases
Oracle Data Pump enters legacy mode when it encounters legacy export or import parameters, so that you can continue using existing scripts.
If you use original Export (exp) and Import (imp), then you probably have scripts that you have been using for many years. Data Pump provides a legacy mode, which allows you to continue to use your existing scripts with Oracle Data Pump.
Oracle Data Pump enters legacy mode when it determines that a parameter unique to original Export or Import is present, either on the command line, or in a script. As Data Pump processes the parameter, the analogous Oracle Data Pump Export or Oracle Data Pump Import parameter is displayed. Oracle strongly recommends that you view the new syntax and make script changes as time permits.
Note:
The Oracle Data Pump Export and Import utilities create and read dump files and log files in Oracle Data Pump format only. They never create or read dump files compatible with original Export or Import. If you have a dump file created with original Export, then you must use original Import to import the data into the database.
Parent topic: Data Pump Legacy Mode
4.2 Parameter Mappings
Describes how original Export and Import parameters map to the Data Pump Export and Import parameters that supply similar functionality.
- Using Original Export Parameters with Data Pump
Data Pump Export accepts original Export parameters when they map to a corresponding Data Pump parameter. - Using Original Import Parameters with Data Pump
Data Pump Import accepts original Import parameters when they map to a corresponding Data Pump parameter.
Parent topic: Data Pump Legacy Mode
4.2.1 Using Original Export Parameters with Data Pump
Data Pump Export accepts original Export parameters when they map to a corresponding Data Pump parameter.
This table describes how Data Pump Export interprets original Export parameters. Parameters that have the same name and functionality in both original Export and Data Pump Export are not included in this table.
Table 4-1 How Data Pump Export Handles Original Export Parameters
Original Export Parameter | Action Taken by Data Pump Export Parameter
---|---
BUFFER | This parameter is ignored.
COMPRESS | This parameter is ignored. In original Export, the COMPRESS parameter affected how the initial extent was managed. The Data Pump Export COMPRESSION parameter is unrelated; it specifies how data is compressed in the dump file.
CONSISTENT | Data Pump Export determines the current time and uses FLASHBACK_TIME.
CONSTRAINTS | If original Export used CONSTRAINTS=n, then Data Pump Export uses EXCLUDE=CONSTRAINTS. The default behavior is to include constraints as part of the export.
DIRECT | This parameter is ignored. Data Pump Export automatically chooses the best export method.
FEEDBACK | The Data Pump Export STATUS parameter is used. This is not a direct mapping: in original Export, feedback was given after a certain number of rows, as specified with the FEEDBACK parameter, whereas Data Pump Export reports job status at the interval, in seconds, specified with STATUS.
FILE | Data Pump Export attempts to determine the path that was specified or defaulted to for the FILE parameter, and also to determine whether a directory object exists to which the schema has read and write access. See Management of File Locations in Data Pump Legacy Mode for more information about how Data Pump handles the original Export FILE parameter.
GRANTS | If original Export used GRANTS=n, then Data Pump Export uses EXCLUDE=GRANT. If original Export used GRANTS=y, then the parameter is ignored because that is the Data Pump Export default behavior.
INDEXES | If original Export used INDEXES=n, then Data Pump Export uses EXCLUDE=INDEX. If original Export used INDEXES=y, then the parameter is ignored because that is the Data Pump Export default behavior.
LOG | Data Pump Export attempts to determine the path that was specified or defaulted to for the LOG parameter, and also to determine whether a directory object exists to which the schema has read and write access. See Management of File Locations in Data Pump Legacy Mode for more information about how Data Pump handles the original Export LOG parameter. The contents of the log file will be those of a Data Pump Export operation. See Log Files for information about log file location and content.
OBJECT_CONSISTENT | This parameter is ignored because Data Pump Export processing ensures that each object is in a consistent state when being exported.
OWNER | The Data Pump SCHEMAS parameter is used.
RECORDLENGTH | This parameter is ignored because Data Pump Export automatically takes care of buffer sizing.
RESUMABLE | This parameter is ignored because Data Pump Export automatically provides this functionality to users who have been granted the EXP_FULL_DATABASE role.
RESUMABLE_NAME | This parameter is ignored because Data Pump Export automatically provides this functionality to users who have been granted the EXP_FULL_DATABASE role.
RESUMABLE_TIMEOUT | This parameter is ignored because Data Pump Export automatically provides this functionality to users who have been granted the EXP_FULL_DATABASE role.
ROWS | If original Export used ROWS=y, then Data Pump Export uses CONTENT=ALL. If original Export used ROWS=n, then Data Pump Export uses CONTENT=METADATA_ONLY.
STATISTICS | This parameter is ignored because statistics are always saved for tables as part of a Data Pump export operation.
TABLESPACES | If original Export also specified TRANSPORT_TABLESPACE=n, then Data Pump Export ignores the TABLESPACES parameter. If original Export also specified TRANSPORT_TABLESPACE=y, then Data Pump Export takes the names listed for TABLESPACES and uses them on the TRANSPORT_TABLESPACES parameter.
TRANSPORT_TABLESPACE | If original Export used TRANSPORT_TABLESPACE=n (the default), then Data Pump Export uses the TABLESPACES parameter. If original Export used TRANSPORT_TABLESPACE=y, then Data Pump Export uses the TRANSPORT_TABLESPACES parameter and only the metadata is exported.
TRIGGERS | If original Export used TRIGGERS=n, then Data Pump Export uses EXCLUDE=TRIGGER. If original Export used TRIGGERS=y, then the parameter is ignored because that is the Data Pump Export default behavior.
TTS_FULL_CHECK | If original Export used TTS_FULL_CHECK=y, then Data Pump Export uses TRANSPORT_FULL_CHECK. If original Export used TTS_FULL_CHECK=n, then the parameter is ignored because that is the Data Pump Export default behavior.
VOLSIZE | When the original Export VOLSIZE parameter is used, it means the location specified for the dump file is a tape device. The Data Pump Export dump file format does not support tape devices. Therefore, this operation terminates with an error.
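Several of these mappings are mechanical, so the translation can be sketched as a small lookup. The helper below is hypothetical (the function name is illustrative, and it covers only a handful of the mappings in Table 4-1); in practice, expdp performs this translation itself when legacy mode is triggered.

```shell
#!/bin/sh
# Hypothetical helper: translate a few common original Export flags to
# their Data Pump Export equivalents, following Table 4-1. Illustrative
# only -- expdp already performs this translation itself in legacy mode.
translate_exp_param() {
  case "$1" in
    OWNER=*)       echo "SCHEMAS=${1#OWNER=}" ;;   # OWNER maps to SCHEMAS
    ROWS=n)        echo "CONTENT=METADATA_ONLY" ;; # metadata only
    ROWS=y)        echo "CONTENT=ALL" ;;           # data and metadata
    GRANTS=n)      echo "EXCLUDE=GRANT" ;;
    INDEXES=n)     echo "EXCLUDE=INDEX" ;;
    TRIGGERS=n)    echo "EXCLUDE=TRIGGER" ;;
    CONSTRAINTS=n) echo "EXCLUDE=CONSTRAINTS" ;;
    BUFFER=*|DIRECT=*|STATISTICS=*) : ;;           # ignored by Data Pump
    *)             echo "$1" ;;                    # pass through unchanged
  esac
}
```

For example, `translate_exp_param OWNER=hr` prints `SCHEMAS=hr`, while an ignored parameter such as `BUFFER=4096` produces no output at all.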
Parent topic: Parameter Mappings
4.2.2 Using Original Import Parameters with Data Pump
Data Pump Import accepts original Import parameters when they map to a corresponding Data Pump parameter.
This table describes how Data Pump Import interprets original Import parameters. Parameters that have the same name and functionality in both original Import and Data Pump Import are not included in this table.
Table 4-2 How Data Pump Import Handles Original Import Parameters
Original Import Parameter | Action Taken by Data Pump Import Parameter
---|---
BUFFER | This parameter is ignored.
CHARSET | This parameter was desupported several releases ago and should no longer be used. It will cause the Data Pump Import operation to abort.
COMMIT | This parameter is ignored. Data Pump Import automatically performs a commit after each table is processed.
COMPILE | This parameter is ignored. Data Pump Import compiles procedures after they are created. A recompile can be executed if necessary for dependency reasons.
CONSTRAINTS | If original Import used CONSTRAINTS=n, then Data Pump Import uses EXCLUDE=CONSTRAINT. If original Import used CONSTRAINTS=y, then the parameter is ignored because that is the Data Pump Import default behavior.
DATAFILES | The Data Pump Import TRANSPORT_DATAFILES parameter is used.
DESTROY | If original Import used DESTROY=y, then Data Pump Import uses REUSE_DATAFILES=y. If original Import used DESTROY=n, then the parameter is ignored because that is the Data Pump Import default behavior.
FEEDBACK | The Data Pump Import STATUS parameter is used. This is not a direct mapping: in original Import, feedback was given after a certain number of rows, as specified with the FEEDBACK parameter, whereas Data Pump Import reports job status at the interval, in seconds, specified with STATUS.
FILE | Data Pump Import attempts to determine the path that was specified or defaulted to for the FILE parameter, and also to determine whether a directory object exists to which the schema has read and write access. See Management of File Locations in Data Pump Legacy Mode for more information about how Data Pump handles the original Import FILE parameter.
FILESIZE | This parameter is ignored because the information is already contained in the Data Pump dump file set.
FROMUSER | The Data Pump Import SCHEMAS parameter is used.
GRANTS | If original Import used GRANTS=n, then Data Pump Import uses EXCLUDE=OBJECT_GRANT. If original Import used GRANTS=y, then the parameter is ignored because that is the Data Pump Import default behavior.
IGNORE | If original Import used IGNORE=y, then Data Pump Import uses TABLE_EXISTS_ACTION=APPEND. If original Import used IGNORE=n, then the parameter is ignored because that is the Data Pump Import default behavior (TABLE_EXISTS_ACTION=SKIP).
INDEXES | If original Import used INDEXES=n, then Data Pump Import uses EXCLUDE=INDEX. If original Import used INDEXES=y, then the parameter is ignored because that is the Data Pump Import default behavior.
INDEXFILE | The Data Pump Import SQLFILE and INCLUDE=INDEX parameters are used. The same method and attempts made when looking for a directory object described for the FILE parameter also take place for the INDEXFILE parameter. If no directory object was specified on the original Import, then Data Pump Import uses the directory object specified with the DIRECTORY parameter.
LOG | Data Pump Import attempts to determine the path that was specified or defaulted to for the LOG parameter, and also to determine whether a directory object exists to which the schema has read and write access. See Management of File Locations in Data Pump Legacy Mode for more information about how Data Pump handles the original Import LOG parameter. The contents of the log file will be those of a Data Pump Import operation. See Log Files for information about log file location and content.
RECORDLENGTH | This parameter is ignored because Data Pump handles issues about record length internally.
RESUMABLE | This parameter is ignored because this functionality is automatically provided for users who have been granted the IMP_FULL_DATABASE role.
RESUMABLE_NAME | This parameter is ignored because this functionality is automatically provided for users who have been granted the IMP_FULL_DATABASE role.
RESUMABLE_TIMEOUT | This parameter is ignored because this functionality is automatically provided for users who have been granted the IMP_FULL_DATABASE role.
ROWS | If original Import used ROWS=n, then Data Pump Import uses CONTENT=METADATA_ONLY. If original Import used ROWS=y, then Data Pump Import uses CONTENT=ALL.
SHOW | If SHOW=y is specified, then the Data Pump Import SQLFILE parameter is used to write the DDL for the import operation to a file. The name of the file will be the file name specified on the FILE parameter, with an extension of .sql added.
STATISTICS | This parameter is ignored because statistics are always saved for tables as part of a Data Pump Import operation.
STREAMS_CONFIGURATION | This parameter is ignored because Data Pump Import automatically determines it; it does not need to be specified.
STREAMS_INSTANTIATION | This parameter is ignored because Data Pump Import automatically determines it; it does not need to be specified.
TABLESPACES | If original Import also specified TRANSPORT_TABLESPACE=n, then Data Pump Import ignores the TABLESPACES parameter. If original Import also specified TRANSPORT_TABLESPACE=y, then Data Pump Import takes the names supplied for TABLESPACES and applies them to the TRANSPORT_TABLESPACES parameter.
TOID_NOVALIDATE | This parameter is ignored. OIDs are no longer used for type validation.
TOUSER | The Data Pump Import REMAP_SCHEMA parameter is used. The FROMUSER parameter must also have been specified.
TRANSPORT_TABLESPACE | The TRANSPORT_TABLESPACE parameter is ignored, but if you also specified the DATAFILES parameter, then the import job continues to load the metadata.
TTS_OWNERS | This parameter is ignored because this information is automatically stored in the Data Pump dump file set.
VOLSIZE | When the original Import VOLSIZE parameter is used, it means the location specified for the dump file is a tape device. The Data Pump Import dump file format does not support tape devices. Therefore, this operation terminates with an error.
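The IGNORE mapping is the one most often hard-coded in old import scripts, so it is worth spelling out. The helper below is hypothetical (the function name is illustrative, and impdp performs this translation itself in legacy mode); it only models the two documented values.

```shell
#!/bin/sh
# Hypothetical helper: map the original Import IGNORE flag to the Data
# Pump Import TABLE_EXISTS_ACTION parameter, following Table 4-2.
# Illustrative only -- impdp performs this translation itself.
map_ignore_flag() {
  case "$1" in
    IGNORE=y) echo "TABLE_EXISTS_ACTION=APPEND" ;;  # keep loading rows
    IGNORE=n) echo "TABLE_EXISTS_ACTION=SKIP" ;;    # Data Pump default
    *)        echo "unrecognized: $1" >&2; return 1 ;;
  esac
}
```

For example, `map_ignore_flag IGNORE=y` prints `TABLE_EXISTS_ACTION=APPEND`.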
Parent topic: Parameter Mappings
4.3 Management of File Locations in Data Pump Legacy Mode
Original Export and Import and Data Pump Export and Import differ on where dump files and log files can be written to and read from, because the original utilities are client-based and Data Pump is server-based.
Original Export and Import use the FILE and LOG parameters to specify dump file and log file names, respectively. These file names always refer to files local to the client system, and they may also contain a path specification.
Data Pump Export and Import use the DUMPFILE and LOGFILE parameters to specify dump file and log file names, respectively. These file names always refer to files local to the server system and cannot contain any path information. Instead, a directory object is used to indirectly specify path information. The path value defined by the directory object must be accessible to the server. The directory object is specified for a Data Pump job through the DIRECTORY parameter. It is also possible to prepend a directory object to the file names passed to the DUMPFILE and LOGFILE parameters. For privileged users, Data Pump supports the use of a default directory object if one is not specified on the command line. This default directory object, DATA_PUMP_DIR, is set up at installation time.
If Data Pump legacy mode is enabled and the original Export FILE=filespec parameter, the LOG=filespec parameter, or both are present on the command line, then the following rules of precedence are used to determine a file's location:
Note:
If the FILE parameter and LOG parameter are both present on the command line, then the rules of precedence are applied separately to each parameter.
Also, when a mix of original Export/Import and Data Pump Export/Import parameters are used, separate rules apply to them. For example, suppose you have the following command:
expdp system FILE=/user/disk/foo.dmp LOGFILE=foo.log DIRECTORY=dpump_dir
The Data Pump legacy mode file management rules, as explained in this section, would apply to the FILE parameter. The normal (that is, non-legacy mode) Data Pump file management rules, as described in Default Locations for Dump, Log, and SQL Files, would apply to the LOGFILE parameter.
- If a path location is specified as part of the file specification, then Data Pump attempts to look for a directory object accessible to the schema executing the export job whose path location matches the path location of the file specification. If such a directory object cannot be found, then an error is returned. For example, assume that a server-based directory object named USER_DUMP_FILES has been defined with a path value of '/disk1/user1/dumpfiles/' and that read and write access to this directory object has been granted to the hr schema. The following command causes Data Pump to look for a server-based directory object whose path value contains '/disk1/user1/dumpfiles/' and to which the hr schema has been granted read and write access:

  expdp hr FILE=/disk1/user1/dumpfiles/hrdata.dmp

  In this case, Data Pump uses the directory object USER_DUMP_FILES. The path value, in this example '/disk1/user1/dumpfiles/', must refer to a path on the server system that is accessible to Oracle Database.

  If a path location is specified as part of the file specification, then any directory object provided using the DIRECTORY parameter is ignored. For example, if the following command is issued, then Data Pump does not use the DPUMP_DIR directory object for the file parameter, but instead looks for a server-based directory object whose path value contains '/disk1/user1/dumpfiles/' and to which the hr schema has been granted read and write access:

  expdp hr FILE=/disk1/user1/dumpfiles/hrdata.dmp DIRECTORY=dpump_dir
- If no path location is specified as part of the file specification, then the directory object named by the DIRECTORY parameter is used. For example, if the following command is issued, then Data Pump applies the path location defined for the DPUMP_DIR directory object to the hrdata.dmp file:

  expdp hr FILE=hrdata.dmp DIRECTORY=dpump_dir
- If no path location is specified as part of the file specification and no directory object is named by the DIRECTORY parameter, then Data Pump does the following, in the order shown:

  a. Data Pump looks for the existence of a directory object of the form DATA_PUMP_DIR_schema_name, where schema_name is the schema that is executing the Data Pump job. For example, the following command would cause Data Pump to look for the existence of a server-based directory object named DATA_PUMP_DIR_HR:

     expdp hr FILE=hrdata.dmp

     The hr schema also must have been granted read and write access to this directory object. If such a directory object does not exist, then the process moves to step b.

  b. Data Pump looks for the existence of the client-based environment variable DATA_PUMP_DIR. For instance, assume that a server-based directory object named DUMP_FILES1 has been defined and the hr schema has been granted read and write access to it. Then, on the client system, the environment variable DATA_PUMP_DIR can be set to point to DUMP_FILES1 as follows:

     setenv DATA_PUMP_DIR DUMP_FILES1
     expdp hr FILE=hrdata.dmp

     Data Pump then uses the server-based directory object DUMP_FILES1 for the hrdata.dmp file. If a client-based environment variable DATA_PUMP_DIR does not exist, then the process moves to step c.

  c. If the schema that is executing the Data Pump job has DBA privileges, then the default Data Pump directory object, DATA_PUMP_DIR, is used. This default directory object is established at installation time. For example, the following command causes Data Pump to attempt to use the default DATA_PUMP_DIR directory object, assuming that system has DBA privileges:

     expdp system FILE=hrdata.dmp
See Also:
Default Locations for Dump, Log, and SQL Files for information about Data Pump file management rules of precedence under normal Data Pump conditions (that is, non-legacy mode)
Parent topic: Data Pump Legacy Mode
4.4 Adjusting Existing Scripts for Data Pump Log Files and Errors
Describes how to adjust existing scripts for Data Pump log files and errors.
Data Pump legacy mode requires that you review and update your existing scripts written for original Export and Import because of differences in file format and error reporting.
- Log Files
Data Pump Export and Import do not generate log files in the same format as those created by original Export and Import.
- Error Cases
Data Pump Export and Import may not produce the same errors as those generated by original Export and Import.
- Exit Status
Data Pump Export and Import have enhanced exit status values to allow scripts to better determine the success or failure of export and import jobs.
Parent topic: Data Pump Legacy Mode
4.4.1 Log Files
Data Pump Export and Import do not generate log files in the same format as those created by original Export and Import.
Any scripts you have that parse the output of original Export and Import must be updated to handle the log file format used by Data Pump Export and Import. For example, the message Successfully Terminated does not appear in Data Pump log files.
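A script that previously searched for the original utilities' closing message needs a new pattern. The sketch below looks for the Data Pump job-completion line instead; the exact wording of that line can vary by release and job outcome, so treat the pattern (and the sample log text in the comment) as an assumption to verify against your own logs.

```shell
#!/bin/sh
# Check a Data Pump log for successful completion. The old test,
#   grep -q "Successfully Terminated" "$log"
# never matches a Data Pump log. Data Pump logs instead end with a
# job-completion line such as (illustrative sample):
#   Job "HR"."SYS_EXPORT_SCHEMA_01" successfully completed at ...
dp_log_ok() {
  grep -q "successfully completed" "$1"
}
```

A wrapper script would call `dp_log_ok "$logfile"` and branch on its return status, just as it previously branched on the original Export message.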
4.4.2 Error Cases
Data Pump Export and Import may not produce the same errors as those generated by original Export and Import.
For example, if a parameter that is ignored by Data Pump Export would have had an out-of-range value in original Export, then an informational message is written to the log file stating that the parameter is being ignored. No value checking is performed, so no error message is generated.
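Because error reporting differs between the utilities, wrapper scripts are more robust when they branch on the exit status rather than on message text. The sketch below assumes the commonly documented Data Pump exit codes (0 for success, 5 for completed with errors, nonzero otherwise for failure); verify the codes against your release, and note that the expdp invocation in the comment is illustrative.

```shell
#!/bin/sh
# Branch on the Data Pump exit status instead of parsing the log file.
# Assumed codes: 0 = success, 5 = completed with errors, others = failure.
check_dp_status() {
  case "$1" in
    0) echo "export succeeded" ;;
    5) echo "export completed with errors; review the log" ;;
    *) echo "export failed (exit $1)" ;;
  esac
}

# Illustrative usage (requires an Oracle client):
#   expdp hr DUMPFILE=hr.dmp SCHEMAS=hr
#   check_dp_status $?
```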