
Posts from the ‘Datapump’ Category

Datapump filesize parameter

Default: 0 (unlimited)

Purpose

Specifies the maximum size of each dump file. If the size is reached for any member of the dump file set, that file is closed and an attempt is made to create a new file, provided the file specification contains a substitution variable.

Syntax and Description

FILESIZE=integer[B | K | M | G]

The integer can be followed by B, K, M, or G (indicating bytes, kilobytes, megabytes, and gigabytes respectively). Bytes is the default. The actual size of the resulting file may be rounded down slightly to match the size of the internal blocks used in dump files.

Restrictions

The minimum size for a file is ten times the default Data Pump block size, which is 4 kilobytes.
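If the FILESIZE limit may be reached, the DUMPFILE value should contain a substitution variable such as %U so that Data Pump can open additional files instead of aborting. A minimal parameter file sketch (the file and table names here are illustrative, not from the example below):

```
# filesize.par -- hypothetical parameter file,
# invoked as: expdp testnet/testnet parfile=filesize.par
tables=mytest
directory=data_pump_dir
dumpfile=filesize_%U.dmp
logfile=filesize.log
filesize=50k
```

With %U in place, each time a 50 KB file fills up, Data Pump creates the next file in the sequence (filesize_01.dmp, filesize_02.dmp, and so on).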

Example :


C:\Users\mazar>expdp testnet/testnet tables=mytest  directory=data_pump_dir dumpfile=filesize.dmp logfile=estimatest.log filesize=50k

Export: Release 11.2.0.1.0 - Production on Tue Nov 9 10:53:50 2010

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "TESTNET"."SYS_EXPORT_TABLE_01":  testnet/******** tables=mytest directory=data_pump_dir dumpfile=filesize.dmp logfile=estimatest.log filesize=50k
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "TESTNET"."MYTEST"                          6.625 KB       1 rows
ORA-31694: master table "TESTNET"."SYS_EXPORT_TABLE_01" failed to load/unload
ORA-39095: Dump file space has been exhausted: Unable to allocate 4096 bytes
Job "TESTNET"."SYS_EXPORT_TABLE_01" stopped due to fatal error at 10:54:43

The job fails with ORA-39095 because the 50 KB limit was reached and the DUMPFILE specification contains no substitution variable, so Data Pump could not create an additional file. Re-running with a FILESIZE large enough to hold the data succeeds:


C:\Users\mazar>expdp testnet/testnet tables=mytest  directory=data_pump_dir dumpfile=filesizenew.dmp logfile=estimatestnew.log filesize=100k

Export: Release 11.2.0.1.0 - Production on Tue Nov 9 12:26:27 2010

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "TESTNET"."SYS_EXPORT_TABLE_02":  testnet/******** tables=mytest directory=data_pump_dir dumpfile=filesizenew.dmp logfile=estimatestnew.log filesize=100k
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "TESTNET"."MYTEST"                          6.625 KB       1 rows
Master table "TESTNET"."SYS_EXPORT_TABLE_02" successfully loaded/unloaded
******************************************************************************
Dump file set for TESTNET.SYS_EXPORT_TABLE_02 is:
  C:\APP\ORACLE\MAZAR\ADMIN\AZARDB\DPDUMP\FILESIZENEW.DMP
Job "TESTNET"."SYS_EXPORT_TABLE_02" successfully completed at 12:28:17

Ref :

http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_export.htm

Datapump Estimate parameter

ESTIMATE

Default: BLOCKS

Purpose

Specifies the method that Export will use to estimate how much disk space each table in the export job will consume (in bytes). The estimate is printed in the log file and displayed on the client’s standard output device. The estimate is for table row data only; it does not include metadata.

Syntax and Description

ESTIMATE={BLOCKS | STATISTICS}
  • BLOCKS – The estimate is calculated by multiplying the number of database blocks used by the source objects, times the appropriate block sizes.
  • STATISTICS – The estimate is calculated using statistics for each table. For this method to be as accurate as possible, all tables should have been analyzed recently.

Restrictions

If the Data Pump export job involves compressed tables, the default size estimation given for the compressed table is inaccurate when ESTIMATE=BLOCKS is used. This is because the size estimate does not reflect that the data was stored in a compressed form. To get a more accurate size estimate for compressed tables, use ESTIMATE=STATISTICS.

Example : ESTIMATE=STATISTICS


C:\Users\mazar>expdp testnet/testnet tables=mytest estimate=statistics directory=data_pump_dir dumpfile=estimate_Stat.dmp logfile=estimate.log

Export: Release 11.2.0.1.0 - Production on Tue Nov 9 09:53:33 2010

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "TESTNET"."SYS_EXPORT_TABLE_01":  testnet/******** tables=mytest estimate=statistics directory=data_pump_dir dumpfile=estimate_Stat.dmp logfile=estimate.log
Estimate in progress using STATISTICS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
.  estimated "TESTNET"."MYTEST"                          6.582 KB
Total estimation using STATISTICS method: 6.582 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "TESTNET"."MYTEST"                          6.625 KB       1 rows
Master table "TESTNET"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for TESTNET.SYS_EXPORT_TABLE_01 is:
  C:\APP\ORACLE\MAZAR\ADMIN\AZARDB\DPDUMP\ESTIMATE_STAT.DMP
Job "TESTNET"."SYS_EXPORT_TABLE_01" successfully completed at 09:54:23

Example : ESTIMATE=BLOCKS


C:\Users\mazar>expdp testnet/testnet tables=mytest estimate=blocks directory=data_pump_dir dumpfile=estimatestat.dmp logfile=estimatest.log

Export: Release 11.2.0.1.0 - Production on Tue Nov 9 09:55:26 2010

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "TESTNET"."SYS_EXPORT_TABLE_01":  testnet/******** tables=mytest estimate=blocks directory=data_pump_dir dumpfile=estimatestat.dmp logfile=estimatest.log
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
.  estimated "TESTNET"."MYTEST"                             64 KB
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "TESTNET"."MYTEST"                          6.625 KB       1 rows
Master table "TESTNET"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for TESTNET.SYS_EXPORT_TABLE_01 is:
  C:\APP\ORACLE\MAZAR\ADMIN\AZARDB\DPDUMP\ESTIMATESTAT.DMP
Job "TESTNET"."SYS_EXPORT_TABLE_01" successfully completed at 09:55:49
C:\Users\mazar>

ESTIMATE_ONLY

Default: n

Purpose

Instructs Export to estimate the space that a job would consume, without actually performing the export operation.

Syntax and Description

ESTIMATE_ONLY={y | n}

If ESTIMATE_ONLY=y, then Export estimates the space that would be consumed, but quits without actually performing the export operation.

Example :


C:\Users\mazar>set oracle_sid=azardb

C:\Users\mazar>expdp testnet/testnet estimate_only=y nologfile=y

Export: Release 11.2.0.1.0 - Production on Tue Nov 9 09:47:32 2010

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "TESTNET"."SYS_EXPORT_SCHEMA_01":  testnet/******** estimate_only=y nologfile=y
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
.  estimated "TESTNET"."A"                                   9 MB
.  estimated "TESTNET"."DEPT"                               64 KB
.  estimated "TESTNET"."MYTEST"                             64 KB
.  estimated "TESTNET"."TEST"                               64 KB
.  estimated "TESTNET"."EMP"                                 0 KB
.  estimated "TESTNET"."MYVAR"                               0 KB
Total estimation using BLOCKS method: 9.187 MB
Job "TESTNET"."SYS_EXPORT_SCHEMA_01" successfully completed at 09:50:00

Ref :

http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_export.htm

Datapump CONTENT parameter

This post gives you an easy understanding of the CONTENT Data Pump parameter.

Default : ALL

Purpose :

Enables you to filter what Export unloads: data only, metadata only, or both.

Syntax and Description :

CONTENT={ALL | DATA_ONLY | METADATA_ONLY}
  • ALL unloads both data and metadata. This is the default.
  • DATA_ONLY unloads only table row data; no database object definitions are unloaded.
  • METADATA_ONLY unloads only database object definitions; no table row data is unloaded.

Restrictions

The CONTENT=METADATA_ONLY parameter cannot be used in conjunction with the parameter TRANSPORT_TABLESPACES (transportable-tablespace-mode).

See Example :


SQL> conn sys/Admin123@azardb as sysdba
Connected.
SQL> create user dpuser identified by dpuser;

User created.

SQL> grant connect,resource,dba to dpuser;

Grant succeeded.

SQL> grant read,write on directory data_pump_dir to dpuser;

Grant succeeded.

SQL> conn dpuser/dpuser;
Connected.
SQL> create table employee as select  * from scott.emp;

Table created.

Now I am going to export only the metadata:


C:\Users\mazar>expdp dpuser/dpuser@azardb directory=data_pump_dir dumpfile=contentmeta.dmp  log=contentmeta.log content=metadata_only

Export: Release 11.2.0.1.0 - Production on Mon Nov 8 10:10:33 2010

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Legacy Mode Active due to the following parameters:
Legacy Mode Parameter: "log=contentmeta.log" Location: Command Line, Replaced with: "logfile=contentmeta.log"
Legacy Mode has set reuse_dumpfiles=true parameter.
Starting "DPUSER"."SYS_EXPORT_SCHEMA_01":  dpuser/********@azardb directory=data_pump_dir dumpfile=contentmeta.dmp logfile=contentmeta.log content=metadata_only reuse_dumpfiles=true
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Master table "DPUSER"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for DPUSER.SYS_EXPORT_SCHEMA_01 is:
  C:\APP\ORACLE\MAZAR\ADMIN\AZARDB\DPDUMP\CONTENTMETA.DMP
Job "DPUSER"."SYS_EXPORT_SCHEMA_01" successfully completed at 10:14:10

You can see that no row data was exported; only the metadata was exported.

C:\Users\mazar>sqlplus

SQL*Plus: Release 11.2.0.1.0 Production on Mon Nov 8 10:53:55 2010

Copyright (c) 1982, 2010, Oracle.  All rights reserved.

Enter user-name: dpuser/dpuser

Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options

SQL> drop table employee;

Table dropped.

I dropped the table and then imported what I had exported (metadata only).

SQL> select * from tab;

TNAME                          TABTYPE  CLUSTERID
------------------------------ ------- ----------
BIN$F3pNm4h2RDelNFamba47MA==$0 TABLE

C:\Users\mazar>impdp dpuser/dpuser@azardb directory=data_pump_dir dumpfile=contentmeta.dmp  log=contentmeta.log

Import: Release 11.2.0.1.0 - Production on Mon Nov 8 10:55:05 2010

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Legacy Mode Active due to the following parameters:
Legacy Mode Parameter: "log=contentmeta.log" Location: Command Line, Replaced with: "logfile=contentmeta.log"
Master table "DPUSER"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "DPUSER"."SYS_IMPORT_FULL_01":  dpuser/********@azardb directory=data_pump_dir dumpfile=contentmeta.dmp logfile=contentmeta.log
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"DPUSER" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Job "DPUSER"."SYS_IMPORT_FULL_01" completed with 1 error(s) at 10:55:12

You can see that no rows were imported, only the metadata:


 C:\Users\mazar>sqlplus

SQL*Plus: Release 11.2.0.1.0 Production on Mon Nov 8 10:55:18 2010

Copyright (c) 1982, 2010, Oracle.  All rights reserved.

Enter user-name: dpuser/dpuser

Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options

SQL> select * from tab;

TNAME                          TABTYPE  CLUSTERID
------------------------------ ------- ----------
EMPLOYEE                       TABLE

SQL> select * from employee;

no rows selected

CONTENT=DATA_ONLY :

SQL> create table employee as select * from scott.emp;

Table created.

SQL> exit
Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options



C:\Users\mazar>expdp dpuser/dpuser@azardb directory=data_pump_dir dumpfile=contentmeta.dmp  log=contentmeta.log content=data_only

Export: Release 11.2.0.1.0 - Production on Mon Nov 8 10:56:38 2010

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Legacy Mode Active due to the following parameters:
Legacy Mode Parameter: "log=contentmeta.log" Location: Command Line, Replaced with: "logfile=contentmeta.log"
Legacy Mode has set reuse_dumpfiles=true parameter.
Starting "DPUSER"."SYS_EXPORT_SCHEMA_01":  dpuser/********@azardb directory=data_pump_dir dumpfile=contentmeta.dmp logfile=contentmeta.log content=data_only reuse_dumpfiles=true
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
. . exported "DPUSER"."EMPLOYEE"                         8.585 KB      14 rows
Master table "DPUSER"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for DPUSER.SYS_EXPORT_SCHEMA_01 is:
  C:\APP\ORACLE\MAZAR\ADMIN\AZARDB\DPDUMP\CONTENTMETA.DMP
Job "DPUSER"."SYS_EXPORT_SCHEMA_01" successfully completed at 10:56:57



Enter user-name: dpuser/dpuser

Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options

SQL> drop table employee;

Table dropped.

SQL> commit;

Commit complete.

SQL> exit
Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options


C:\Users\mazar>impdp dpuser/dpuser@azardb directory=data_pump_dir dumpfile=contentmeta.dmp  log=contentmeta.log content=data_only

Import: Release 11.2.0.1.0 - Production on Mon Nov 8 11:11:08 2010

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Legacy Mode Active due to the following parameters:
Legacy Mode Parameter: "log=contentmeta.log" Location: Command Line, Replaced with: "logfile=contentmeta.log"
Master table "DPUSER"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "DPUSER"."SYS_IMPORT_FULL_01":  dpuser/********@azardb directory=data_pump_dir dumpfile=contentmeta.dmp logfile=contentmeta.log content=data_only
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
ORA-39126: Worker unexpected fatal error in KUPW$WORKER.UPATE_TD_ROW_IMP [62]
TABLE_DATA:"DPUSER"."EMPLOYEE"
ORA-31603: object "EMPLOYEE" of type TABLE not found in schema "DPUSER"

ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105
ORA-06512: at "SYS.KUPW$WORKER", line 8171

----- PL/SQL Call Stack -----
  object      line  object
  handle    number  name
290C5B9C     18990  package body SYS.KUPW$WORKER
290C5B9C      8192  package body SYS.KUPW$WORKER
290C5B9C     18552  package body SYS.KUPW$WORKER
290C5B9C      4105  package body SYS.KUPW$WORKER
290C5B9C      8875  package body SYS.KUPW$WORKER
290C5B9C      1649  package body SYS.KUPW$WORKER
2A56420C         2  anonymous block

ORA-39126: Worker unexpected fatal error in KUPW$WORKER.UPATE_TD_ROW_IMP [62]
TABLE_DATA:"DPUSER"."EMPLOYEE"
ORA-31603: object "EMPLOYEE" of type TABLE not found in schema "DPUSER"

ORA-06512: at "SYS.DBMS_SYS_ERROR", line 105
ORA-06512: at "SYS.KUPW$WORKER", line 8171

----- PL/SQL Call Stack -----
  object      line  object
  handle    number  name
290C5B9C     18990  package body SYS.KUPW$WORKER
290C5B9C      8192  package body SYS.KUPW$WORKER
290C5B9C     18552  package body SYS.KUPW$WORKER
290C5B9C      4105  package body SYS.KUPW$WORKER
290C5B9C      8875  package body SYS.KUPW$WORKER
290C5B9C      1649  package body SYS.KUPW$WORKER
2A56420C         2  anonymous block

Job "DPUSER"."SYS_IMPORT_FULL_01" stopped due to fatal error at 11:11:13
You can see that the table no longer exists, so the data could not be imported without the table definition in place.

Ref :

http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_export.htm

ORA-39170: Schema expression ‘AZAR$RIY’ does not correspond to any schemas

SQL> create user "azar$riy" identified by azar;

User created.

SQL> grant connect,resource,dba to "azar$riy";

Grant succeeded.

SQL> grant read,write on directory data_pump_dir to "azar$riy";

Grant succeeded.

SQL> conn "azar$riy"/azar;
Connected.
SQL> create table a as select * from scott.emp;

Table created.

SQL> commit;

Commit complete.

SQL> exit
Disconnected from Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options


C:\Users\mazar>expdp system/Admin123@azardb directory=data_pump_dir schemas="azar$riy" dumpfile=azartest.dmp log=azartest.log 
Export: Release 11.2.0.1.0 - Production on Sun Nov 7 15:42:09 2010

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39001: invalid argument value
ORA-39170: Schema expression 'AZAR$RIY' does not correspond to any schemas.


C:\Users\mazar>expdp system/Admin123@azardb directory=data_pump_dir schemas='\"azar$riy\"' dumpfile=azartest.dmp log=azartest.log

Export: Release 11.2.0.1.0 - Production on Sun Nov 7 15:42:20 2010

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Legacy Mode Active due to the following parameters:
Legacy Mode Parameter: "log=azartest.log" Location: Command Line, Replaced with: "logfile=azartest.log"
Legacy Mode has set reuse_dumpfiles=true parameter.
Starting "SYSTEM"."SYS_EXPORT_SCHEMA_01":  system/********@azardb directory=data_pump_dir dumpfile=azartest.dmp logfile=azartest.log schemas='"azar$riy"' reuse_dumpfiles=true
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
. . exported "azar$riy"."A"                              8.578 KB      14 rows
Master table "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_SCHEMA_01 is:
  C:\APP\ORACLE\MAZAR\ADMIN\AZARDB\DPDUMP\AZARTEST.DMP
Job "SYSTEM"."SYS_EXPORT_SCHEMA_01" successfully completed at 15:43:10
C:\Users\mazar>
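The backslash-escaped quotes shown above are what the Windows command shell requires; the exact escaping is platform-dependent. An alternative sketch, assuming a parameter file is acceptable, avoids shell escaping entirely because Data Pump reads the parfile without shell interpretation:

```
# azartest.par -- hypothetical parameter file,
# invoked as: expdp system/Admin123@azardb parfile=azartest.par
directory=data_pump_dir
schemas="azar$riy"
dumpfile=azartest.dmp
logfile=azartest.log
```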

Datapump network_link export and import

This post gives you an easy understanding of the Datapump NETWORK_LINK parameter.

I have used databases on two sites: Broker and ACE.

Step 1:

Source Database

Go to Broker database,

SQL> create user nettest identified by nettest;

User created.

SQL> grant connect,resource,dba to nettest;

Grant succeeded.

SQL> conn nettest/nettest@broker.world;
Connected.
SQL> create table mynet(empno number,empname varchar2(20));

Table created.

SQL> insert into mynet values(001,'azar');

1 row created.

SQL> insert into mynet values(002,'jahir');

1 row created.

SQL> commit;

Commit complete.

Step 2:

Target Database

Go to ACE database

SQL> conn / as sysdba
Connected.
SQL> create user mynet identified by mynet;

User created.

SQL> grant connect,resource,dba to mynet;

Grant succeeded.

SQL> grant read,write on directory data_pump_dir to mynet;

Grant succeeded.

SQL> commit;

Commit complete.

SQL> conn mynet/mynet;
Connected.

SQL> select * from tab;

no rows selected

SQL> create database link useme connect to nettest identified by nettest using 'BROKER.world';

Database link created.

SQL> commit;

Commit complete.

SQL> select * from dba_directories where directory_name like 'DATA_PUMP_DIR%';

OWNER                          DIRECTORY_NAME
------------------------------ ------------------------------
DIRECTORY_PATH
--------------------------------------------------------------------------------
SYS                            DATA_PUMP_DIR
c:\oracle\product\10.2.0\admin\ace\dpdump\
SQL> select * from tab;

no rows selected

SQL>

 

SQL> select * from dba_db_links where owner='MYNET';

OWNER
------------------------------
DB_LINK
----------------------------------------------------
USERNAME
------------------------------
HOST
----------------------------------------------------
CREATED
---------
MYNET
USEME.REGRESS.RDBMS.DEV.US.ORACLE.COM
NETTEST
BROKER.world
27-OCT-10

Step 3:

Go to the target server (the ACE DB server) and execute the export command there. You can then see the Broker DB schema's (nettest) dump file stored in the ACE DB server's Data Pump directory, accessible to the mynet schema.

C:\Documents and Settings\Administrator>expdp mynet/mynet@ace directory=data_pump_dir network_link=USEME.REGRESS.RDBMS.DEV.US.ORACLE.COM logfile=testnet.log dumpfile=testnet.dmp tables=mynet

Export: Release 10.2.0.1.0 - 64bit Production on Wednesday, 27 October, 2010 16:53:52

Copyright (c) 2003, 2005, Oracle.  All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "MYNET"."SYS_EXPORT_TABLE_01":  mynet/********@ace directory=data_pump_dir network_link=USEME.REGRESS.RDBMS.DEV.US.ORACLE.COM logfile=testnet.log dumpfile=testnet.dmp tables=mynet
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "NETTEST"."MYNET"                            5.25 KB       2 rows
Master table "MYNET"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for MYNET.SYS_EXPORT_TABLE_01 is:
C:\ORACLE\PRODUCT\10.2.0\ADMIN\ACE\DPDUMP\TESTNET.DMP
Job "MYNET"."SYS_EXPORT_TABLE_01" successfully completed at 16:55:02

The export finished. If needed, you can then import the dump from the Broker DB server into the ACE DB server.

For a network-mode import, the DUMPFILE parameter is not needed, because the data is read directly from the source database over the database link.
C:\Documents and Settings\Administrator>impdp mynet/mynet@ace directory=data_pump_dir network_link=USEME.REGRESS.RDBMS.DEV.US.ORACLE.COM logfile=testnetimp.log remap_schema=nettest:mynet tables=mynet

Import: Release 10.2.0.1.0 - 64bit Production on Wednesday, 27 October, 2010 17:00:45

Copyright (c) 2003, 2005, Oracle.  All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "MYNET"."SYS_IMPORT_TABLE_01":  mynet/********@ace directory=data_pump_dir network_link=USEME.REGRESS.RDBMS.DEV.US.ORACLE.COM logfile=testnetimp.log remap_schema=nettest:mynet tables=mynet
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
. . imported "MYNET"."MYNET"                                  2 rows
Job "MYNET"."SYS_IMPORT_TABLE_01" successfully completed at 17:02:15
C:\Documents and Settings\Administrator.ACE-INS>

 Step 4:

SQL> conn mynet/mynet@ace
Connected.
SQL> select * from tab;

TNAME                          TABTYPE  CLUSTERID
------------------------------ ------- ----------
MYNET                          TABLE

SQL> select * from mynet;

     EMPNO EMPNAME
---------- --------------------
         1 azar
         2 jahir

SQL>

 

If you need a reference for the NETWORK_LINK parameter, see:

http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/dp_export.htm#i1011008

Datapump retrieve original data using FLASHBACK_SCN

Retrieve original data:

It may be useful to export the image of a table the way it existed before a change was committed. If the database is properly configured, the flashback query facility, which is also integrated with expdp, can be used.

SQL> conn / as sysdba
Connected.
SQL> create restore point original_emp;

Restore point created.

SQL> select scn,name from v$restore_point;

       SCN NAME
---------- ------------------------------
    518527 ORIGINAL_EMP

SQL> conn scott/tiger;
Connected.

SQL> select sum(sal) from emp;

  SUM(SAL)
----------
     29025

SQL> update test set sal=sal*1.1;

14 rows updated.

Here the TEST table is a copy of the EMP table.

SQL> commit;

Commit complete.

SQL> select sum(sal) from test;

  SUM(SAL)
----------
   31927.5

[oracle@localhost rlwrap-0.30]$ expdp scott/tiger dumpfile=original directory=test_dir tables=test flashback_scn=518527

Export: Release 10.2.0.1.0 - Production on Friday, 07 May, 2010 13:33:02

Copyright (c) 2003, 2005, Oracle.  All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "SCOTT"."SYS_EXPORT_TABLE_01":  scott/******** dumpfile=original directory=test_dir tables=test flashback_scn=518527
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type TABLE_EXPORT/TABLE/TABLE
. . exported "SCOTT"."TEST"                              7.820 KB      14 rows
Master table "SCOTT"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SCOTT.SYS_EXPORT_TABLE_01 is:
/home/oracle/datapumptest/original.dmp
Job "SCOTT"."SYS_EXPORT_TABLE_01" successfully completed at 13:33:11

SQL> drop table test;

Table dropped.

SQL> commit;

Commit complete.

[oracle@localhost rlwrap-0.30]$ impdp scott/tiger dumpfile=original directory=test_dir tables=test

Import: Release 10.2.0.1.0 - Production on Friday, 07 May, 2010 13:34:57

Copyright (c) 2003, 2005, Oracle.  All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "SCOTT"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "SCOTT"."SYS_IMPORT_TABLE_01":  scott/******** dumpfile=original directory=test_dir tables=test
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "SCOTT"."TEST"                              7.820 KB      14 rows
Job "SCOTT"."SYS_IMPORT_TABLE_01" successfully completed at 13:35:00

[oracle@localhost rlwrap-0.30]$

SQL> select sum(sal) from test;

  SUM(SAL)
----------
     29025
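The restore point above is only one way to record the SCN. Assuming the necessary privileges, the SCN can also be captured directly before the change and passed to FLASHBACK_SCN:

```sql
-- Sketch: capture the current SCN (requires EXECUTE on DBMS_FLASHBACK,
-- or SELECT access on V$DATABASE for the second form)
SELECT dbms_flashback.get_system_change_number FROM dual;
SELECT current_scn FROM v$database;
```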

DBMS SCHEDULER for Data Pump

How to take a logical database backup automatically using Data Pump.

There are many methods to achieve this; I chose to use DBMS_SCHEDULER.

Step 1:

Connect as the SYS user.

SQL> grant execute on dbms_lock to system;

Connect as the SYSTEM user.

Step 2 : Create Directory

SQL> CREATE DIRECTORY NEW_DIR AS 'D:\ORACLE\DUMP';

Directory created.

SQL> GRANT READ,WRITE ON DIRECTORY NEW_DIR TO SYSTEM;

Grant succeeded.

SQL> COMMIT;

Commit complete.

Step 3:  Create Package

SQL> CREATE OR REPLACE PACKAGE fullexport
  2  IS
  3  PROCEDURE exp;
  4  END fullexport;
  5  /

Package created.

Step 4 : Create  Package Body

SQL> CREATE OR REPLACE PACKAGE BODY fullexport
  2  IS
  3  h1 NUMBER := NULL; PROCEDURE exp
  4  IS
  5  BEGIN
  6  BEGIN
  7  DBMS_LOCK.sleep(30);
  8  END;
  9  --DO THE EXPORT
 10  BEGIN
 11  h1 := DBMS_DATAPUMP.open(
 12  operation => 'EXPORT',
 13  job_mode => 'FULL',
 14  job_name => 'MYTEST',
 15  version => 'COMPATIBLE');
 16  DBMS_DATAPUMP.add_file(
 17  handle => h1,
 18  filename => 'FULLEXP.LOG',
 19  directory => 'NEW_DIR',
 20  filetype => 3);
 21  DBMS_DATAPUMP.add_file(
 22  handle => h1,
 23  filename => 'MYTEST.DMP',
 24  directory => 'NEW_DIR',
 25  filetype => 1);
 26  DBMS_DATAPUMP.START_JOB(h1);
 27  END;
 28  END;
 29  END fullexport;
 30  /

Package body created.

Step 5: Run procedure

SQL> exec fullexport.exp;

PL/SQL procedure successfully completed.

Step 6: Create DBMS_SCHEDULER

SQL> BEGIN
  2  DBMS_SCHEDULER.CREATE_JOB_CLASS(
  3  job_class_name => 'BACKUP',
  4  resource_consumer_group => 'DEFAULT_CONSUMER_GROUP',
  5  comments => 'BACKUP, EXPORT ETC');
  6  END;
  7  /

PL/SQL procedure successfully completed.

SQL> BEGIN
  2  DBMS_SCHEDULER.CREATE_JOB(
  3  job_name => 'TEST',
  4  job_type => 'STORED_PROCEDURE',
  5  job_action => 'FULLEXPORT.EXP',
  6  start_date => '16-FEB-10 01:00:00 PM',
  7  repeat_interval => 'FREQ=DAILY;BYHOUR=13',
  8  job_class => 'BACKUP',
  9  comments => 'FULL EXPORT');
 10  END;
 11  /

PL/SQL procedure successfully completed.

SQL> BEGIN
  2  DBMS_SCHEDULER.ENABLE('TEST');
  3  END;
  4  /

PL/SQL procedure successfully completed.

SQL>

Views :

DBA_DATAPUMP_JOBS

USER_TAB_PRIVS
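For example, while the scheduled export is running, the job state can be checked with a query such as:

```sql
-- Sketch: list Data Pump jobs and their current state
SELECT owner_name, job_name, operation, job_mode, state
FROM   dba_datapump_jobs;
```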