Import Export FAQ
These utilities can be used to move data between different machines, databases or schemas.
However, as they use a proprietary binary file format, they can only be used between Oracle
databases. One cannot export data and expect to import it into a non-Oracle database.
Various parameters are available to control which objects are exported or imported. To get a list
of available parameters, run the exp or imp utilities with the help=yes parameter.
The export/import utilities are commonly used to perform the following tasks:
• Backup and recovery (small databases only, say < 50 GB; if bigger, use RMAN instead)
• Move data between Oracle databases on different platforms (for example, from Solaris to Windows)
• Reorganize data / eliminate database fragmentation (export, drop and re-import tables)
• Upgrade databases from extremely old versions of Oracle (when in-place upgrades are no longer supported by the Database Upgrade Assistant)
• Detect database corruption (ensure that all the data can be read)
• Transport tablespaces between databases
• Etc.
From Oracle 10g, users can choose between the old exp/imp utilities and the newly
introduced Data Pump utilities, called expdp and impdp. These new utilities bring much-needed
performance improvements, network-based exports and imports, and more.
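As a sketch, a schema-level Data Pump export and import could look like this (the directory object dmpdir, its path, and the file names are illustrative; the directory object must be created by a DBA and be readable and writable by the user):
sqlplus / as sysdba
SQL> CREATE DIRECTORY dmpdir AS '/u01/app/oracle/dpdump';
SQL> GRANT READ, WRITE ON DIRECTORY dmpdir TO scott;
SQL> EXIT
expdp scott/tiger SCHEMAS=scott DIRECTORY=dmpdir DUMPFILE=scott.dmp LOGFILE=expdp_scott.log
impdp scott/tiger SCHEMAS=scott DIRECTORY=dmpdir DUMPFILE=scott.dmp LOGFILE=impdp_scott.log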
NOTE: It is generally advised not to use exports as the only means of backing up a database.
Physical backup methods (for example, RMAN) are normally much quicker and
support point-in-time recovery (applying archived redo logs after restoring the database). Also,
exp/imp is not practical for large database environments.
The following example demonstrates parameters that can be passed to the exp utility:
BUFFER=100000
FILE=account.dmp
FULL=n
OWNER=scott
GRANTS=y
COMPRESS=y
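Assuming the parameters above are saved in a file (the name params.dat is illustrative), they can be passed to exp with the PARFILE parameter:
exp scott/tiger PARFILE=params.dat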
NOTE: If you do not like command-line utilities, you can import and export data with the
"Schema Manager" GUI that ships with Oracle Enterprise Manager (OEM).
Can one monitor how fast a table is imported?
Method 1:
Query the system views to see how far the import's INSERT statement has progressed (see the
sketch below). For this to work one needs to be on Oracle 7.3 or higher (7.2 might also be OK).
If the import has more than one table, this statement will only show information about the current
table being imported.
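A sketch of such a statement (a reconstruction, not verbatim; it assumes the importing session's INSERT is still present in sys.v_$sqlarea, where command_type = 2 denotes an INSERT):
select substr(sql_text, instr(sql_text, 'INTO "'), 30) table_name,
       rows_processed,
       round((sysdate - to_date(first_load_time, 'yyyy-mm-dd hh24:mi:ss')) * 24 * 60, 1) minutes,
       trunc(rows_processed /
             ((sysdate - to_date(first_load_time, 'yyyy-mm-dd hh24:mi:ss')) * 24 * 60)) rows_per_minute
  from sys.v_$sqlarea
 where sql_text like 'INSERT %INTO "%'
   and command_type = 2
   and open_versions > 0;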
Method 2:
Use the FEEDBACK=n import parameter. This parameter tells imp to display a dot for every
n rows imported. For example, FEEDBACK=1000 will show a dot after every 1000 rows.
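For example (the user and file name are illustrative):
imp scott/tiger FILE=scott.dmp FULL=y FEEDBACK=1000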
Note: It is also advisable to drop indexes before importing, to speed up the import process.
Indexes can easily be recreated after the data has been successfully imported.
Oracle also ships catexpX.sql scripts that can be executed as user SYS to enable
older exp/imp versions to work against newer databases (for backwards compatibility). For example, one can run
$ORACLE_HOME/rdbms/admin/catexp7.sql on an Oracle 8 database to allow the Oracle 7.3
exp/imp utilities to run against it.
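For example, the script can be run from SQL*Plus as user SYS (your_sys_password is a placeholder; in SQL*Plus, ? expands to $ORACLE_HOME):
sqlplus sys/your_sys_password
SQL> @?/rdbms/admin/catexp7.sql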
Can one export to multiple files? / Can one beat the Unix 2 Gig limit?
From Oracle8i, the export utility supports multiple output files. This feature enables large exports
to be divided into files whose sizes will not exceed any operating system limits (the FILESIZE=
parameter). When importing from a multi-file export, you must provide the same filenames in the
same sequence in the FILE= parameter. Look at this example:
exp SCOTT/TIGER FILE=D:\F1.dmp,E:\F2.dmp FILESIZE=10m LOG=scott.log
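The corresponding import must list the same files in the same order (a sketch; the log file name is illustrative):
imp SCOTT/TIGER FILE=D:\F1.dmp,E:\F2.dmp FILESIZE=10m FULL=y LOG=scott_imp.log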
Use the following technique if you use an Oracle version prior to 8i:
Create a compressed export on the fly. Depending on the type of data, you can probably export
up to 10 gigabytes to a single file. This example uses gzip. It offers the best compression I know
of, but you can also substitute it with zip, compress or whatever.
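A sketch of the technique on Unix, using a named pipe so that gzip compresses the dump while exp writes it (file names are illustrative; on some systems mkfifo replaces mknod):
# create a named pipe
mknod exp.pipe p
# read the pipe and compress its contents in the background
gzip < exp.pipe > scott.exp.gz &
# feed the pipe
exp scott/tiger FILE=exp.pipe OWNER=scott
To import, reverse the process: decompress into a pipe in the background and point imp at it:
mknod imp.pipe p
gunzip < scott.exp.gz > imp.pipe &
imp scott/tiger FILE=imp.pipe FULL=y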
How can one improve Import/Export performance?
EXPORT:
• Set the BUFFER parameter to a high value (e.g. 2 MB, entered as an integer: 2000000)
• Set the RECORDLENGTH parameter to a high value (e.g. 64 KB, entered as an integer: 64000)
• Use DIRECT=yes (direct path export)
• Stop unnecessary applications to free up resources for your job.
• If you run multiple export sessions, ensure they write to different physical disks.
• DO NOT export to an NFS-mounted filesystem. It will take forever.
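Put together, a tuned export might look like this (a sketch; names are illustrative; note that BUFFER applies only to conventional-path exports and is ignored when DIRECT=yes, where RECORDLENGTH applies instead):
exp scott/tiger OWNER=scott FILE=scott.dmp LOG=scott.log DIRECT=yes RECORDLENGTH=64000
or, for a conventional-path export:
exp scott/tiger OWNER=scott FILE=scott.dmp LOG=scott.log BUFFER=2000000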
IMPORT:
• Create an indexfile so that you can create indexes AFTER you have imported data. Do this by setting INDEXFILE to a filename and then import. No data will be imported, but a file containing index definitions will be created. You must edit this file afterwards and supply the passwords for the schemas on all CONNECT statements (see the sketch after this list).
• Place the file to be imported on a separate physical disk from the Oracle data files.
• Increase DB_CACHE_SIZE (DB_BLOCK_BUFFERS prior to 9i) considerably in the init$SID.ora file.
• Set the LOG_BUFFER to a big value and restart Oracle.
• Stop redo log archiving if it is running (ALTER DATABASE NOARCHIVELOG;).
• Create a BIG tablespace with a BIG rollback segment inside. Set all other rollback segments offline (except the SYSTEM rollback segment, of course). The rollback segment must be as big as your biggest table (I think?).
• Use COMMIT=N in the import parameter file if you can afford it.
• Use STATISTICS=NONE in the import parameter file to avoid the time-consuming import of statistics.
• Remember to run the indexfile created earlier.
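A sketch of the indexfile approach (user and file names are illustrative; STATISTICS=NONE assumes a reasonably recent imp):
# pass 1: write index definitions to a file; no data is imported
imp scott/tiger FILE=scott.dmp FULL=y INDEXFILE=scott_indexes.sql
# pass 2: import the data without building indexes
imp scott/tiger FILE=scott.dmp FULL=y INDEXES=n COMMIT=n STATISTICS=NONE BUFFER=10000000
# pass 3: rebuild the indexes (edit the file first to supply passwords)
sqlplus scott/tiger @scott_indexes.sql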
What are the common Import/Export problems?
• ORA-00001: unique constraint violated - You are importing duplicate rows. Use IGNORE=YES to skip tables that already exist (imp will give an error if the object is re-created).
• ORA-01555: snapshot too old - Ask your users to STOP working while you are exporting, or try using the parameter CONSISTENT=NO.
• IMP-00015: statement failed because the object already exists - Use the IGNORE=Y import parameter to ignore these errors, but be careful as you might end up with duplicate rows.