Note: Proofread any scripts before using them, and always try them on a test instance first. This blog is not responsible for any damage.
Starting with Oracle 10g, Oracle introduced an enhanced version of the EXPORT and IMPORT utilities known as Data Pump. Data Pump is similar to the original EXPORT and IMPORT utilities, but it has many advantages. Some of the advantages are:
1. Most Data Pump export and import operations occur on the Oracle database server, i.e., all dump files are created on the server even if you run the Data Pump utility from a client machine. This improves performance because data is not transferred over the network.
2. You can stop and restart export and import jobs. This is particularly useful if you have started an export or import job and after some time need to attend to other urgent work.
3. The ability to detach from and reattach to long-running jobs without affecting the job itself. This allows DBAs and other operations personnel to monitor jobs from multiple locations.
4. The ability to estimate how much space an export job would consume, without actually performing the export (see the sketch after this list).
5. Support for an interactive-command mode that allows monitoring of and interaction with ongoing jobs.
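For example, the space estimate in point 4 can be obtained with the ESTIMATE_ONLY parameter, which performs no export at all. A minimal sketch, reusing the placeholder credentials and schema from the examples below:
$expdp system/****** SCHEMAS=SCOTT ESTIMATE_ONLY=y ESTIMATE=BLOCKS
The estimate is printed to the screen and the log; no dump file is written, so no DUMPFILE parameter is given.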
Using the Data Pump Export Utility
To use Data Pump, the DBA has to create a directory on the server machine and a directory object in the database that maps to the directory created in the file system.
Creating a Directory Alias
To create a directory alias, you must have the CREATE ANY DIRECTORY system privilege.
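If your account lacks this privilege, a DBA can grant it; a minimal sketch (the grantee scott is just an example):
SQL> grant create any directory to scott;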
Use a CREATE DIRECTORY statement to create a new directory alias, or CREATE OR REPLACE DIRECTORY to redefine an existing directory alias, using the following SQL syntax:
CREATE [OR REPLACE] DIRECTORY alias AS 'pathname';
Where:
alias is the name of the directory alias.
pathname is the physical directory path.
Granting Access Rights to a Directory Alias
After you create a directory alias, grant users and roles access rights to the files contained in that directory, using the following SQL syntax:
GRANT permission ON DIRECTORY alias TO {user | role | PUBLIC};
Where:
permission is one of the following:
READ for read-only access
WRITE for write-only access
ALL for read and write access
alias is the name of the directory alias.
user is a database user name. That user gets immediate access rights.
role is a database role. All users who have been granted that role get immediate access rights.
PUBLIC means all database users; every user gets immediate access rights.
Example: Creating and Using a Directory Alias
The following commands create a directory alias named data_pump_dir for the directory /oracle/admin/dpdump and grant read and write access to all users.
$mkdir -p /oracle/admin/dpdump
$sqlplus /nolog
SQL*Plus: Release 10.2.0.5.0 - Production on Mon Apr 15 05:33:05 2013
Copyright (c) 1982, 2010, Oracle. All Rights Reserved.
SQL> connect /as sysdba
Connected.
SQL>
SQL> create directory data_pump_dir as '/oracle/admin/dpdump';
Now grant access on this directory object to PUBLIC or to any specific user:
SQL> grant read,write on directory data_pump_dir to public;
or
SQL> grant read,write on directory data_pump_dir to system;
To cross-check on the database side, use the following commands:
SQL> desc dba_directories
SQL> set linesize 121
SQL> col owner format a15
SQL> col directory_name format a20
SQL> col directory_path format a70
SQL> SELECT * FROM dba_directories;
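The output will look roughly like the following (the directory created above is owned by SYS because it was created while connected as SYSDBA; depending on the version, Oracle may pre-create additional directory objects, so more rows may appear):
OWNER           DIRECTORY_NAME       DIRECTORY_PATH
--------------- -------------------- ------------------------------
SYS             DATA_PUMP_DIR        /oracle/admin/dpdump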
Example of Exporting a Full Database
To export the full database, give the following command:
$expdp system/****** FULL=y DIRECTORY=data_pump_dir DUMPFILE=full.dmp
LOGFILE=fullexp.log JOB_NAME=expdp_fulldb_job
The above command will export the full database and create the dump file full.dmp in the directory on the server (/oracle/admin/dpdump/).
In cases where the database is terabytes in size, the above command will not be feasible, since the dump file would be larger than the operating system's file size limit and the export would fail. In this situation you can create multiple dump files by typing the following command:
$expdp system/****** FULL=y DIRECTORY=data_pump_dir DUMPFILE=full%U.dmp
FILESIZE=5G LOGFILE=fullexp.log JOB_NAME=expdp_fulldb_job
This will create multiple dump files named full01.dmp, full02.dmp, full03.dmp, and so on. The FILESIZE parameter specifies the maximum size of each dump file.
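When you use the %U substitution variable, you can also add the PARALLEL parameter so that multiple worker processes write dump files concurrently. A minimal sketch (the degree of 4 is an arbitrary example):
$expdp system/****** FULL=y DIRECTORY=data_pump_dir DUMPFILE=full%U.dmp
FILESIZE=5G PARALLEL=4 LOGFILE=fullexp.log JOB_NAME=expdp_fulldb_job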
Example of Exporting a Schema
To export all the objects of the SCOTT schema, you can run the following Data Pump export command:
$expdp scott/****** DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
SCHEMAS=SCOTT
You can omit the SCHEMAS parameter, since the default mode of Data Pump Export is schema mode, which exports the schema of the user running the job.
If you want to export the objects of multiple schemas, you can specify the following command:
$expdp system/****** DIRECTORY=data_pump_dir DUMPFILE=HED_schema.dmp
SCHEMAS=HR,EMP,DEPT
Exporting Individual Tables using Data Pump Export
You can use the Data Pump Export utility to export individual tables. The following example shows the syntax for exporting tables:
$expdp hr/****** DIRECTORY=data_pump_dir DUMPFILE=tables.dmp
TABLES=employees,jobs,departments
Exporting Tables located in a Tablespace
If you want to export tables located in a particular tablespace, you can type the following command:
$expdp hr/****** DIRECTORY=data_pump_dir DUMPFILE=tbs.dmp
TABLESPACES=tbs_4,tbs_5,tbs_6
The above command will export all the objects located in tbs_4, tbs_5, and tbs_6.
Excluding and Including Objects during Export
You can exclude objects while performing an export by using the EXCLUDE option of the Data Pump utility. For example, if you are exporting a schema and don't want to export tables whose names start with "A", you can type the following command:
$expdp scott/****** DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
SCHEMAS=SCOTT EXCLUDE=TABLE:"LIKE 'A%'"
Then all tables in Scott's schema whose names start with "A" will not be exported.
Similarly, you can use the INCLUDE option to export only certain objects, like this:
$expdp scott/****** DIRECTORY=data_pump_dir DUMPFILE=scott_schema.dmp
SCHEMAS=SCOTT INCLUDE=TABLE:"LIKE 'A%'"
This is the opposite of the EXCLUDE option, i.e., it will export only those tables of Scott's schema whose names start with "A".
Similarly, you can also exclude other object types such as INDEX, CONSTRAINT, GRANT, USER, and SCHEMA.
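For instance, the following sketch exports the SCOTT schema while skipping all indexes and object grants (the dump file name is arbitrary):
$expdp scott/****** DIRECTORY=data_pump_dir DUMPFILE=scott_noidx.dmp
SCHEMAS=SCOTT EXCLUDE=INDEX,GRANT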
Using Query to Filter Rows during Export
You can use the QUERY option to export only the required rows. For example, the following command exports only those rows of the emp table whose sal is above 10000 and whose dept_id is greater than 10.
expdp hr/****** QUERY=emp:'"WHERE dept_id > 10 AND sal > 10000"'
NOLOGFILE=y DIRECTORY=dpump_dir1 DUMPFILE=exp1.dmp
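Because the quoting needed by QUERY is shell-dependent, it is often easier to place such parameters in a parameter file, where no shell escaping is required. A minimal sketch (exp.par is a hypothetical file name):
$cat exp.par
QUERY=emp:"WHERE dept_id > 10 AND sal > 10000"
NOLOGFILE=y
DIRECTORY=dpump_dir1
DUMPFILE=exp1.dmp
$expdp hr/****** PARFILE=exp.par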
Suspending and Resuming Export Jobs (Attaching and Re-attaching to Jobs)
Using Data Pump Export, you can suspend running export jobs and later resume or kill them. You can start a job on one client machine, suspend it when other work comes up, and once that work is finished, continue the job from the same client or restart it from another client machine.
For example, suppose a DBA starts a full database export by typing the following command on client machine CLNT1:
$expdp scott/******@mydb FULL=y DIRECTORY=data_pump_dir
DUMPFILE=full.dmp LOGFILE=myfullexp.log JOB_NAME=expdp_fulldb_job
After some time, the DBA wants to stop this job temporarily, so he presses CTRL+C to enter interactive mode. He then gets the Export> prompt, where he can type interactive commands.
To stop the export job, he types the following command:
Export> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([y]/n): y
The job is placed in a stopped state, and the client exits.
After finishing his other work, the DBA wants to resume the export job, but the client machine from which he started it is inaccessible because the user has locked the cabin. So the DBA goes to another client machine and reattaches to the job by typing the following command:
$expdp scott/******@mydb ATTACH=expdp_fulldb_job
After the job status is displayed, he can issue the CONTINUE_CLIENT command to resume logging mode and restart the expdp_fulldb_job job.
Export> CONTINUE_CLIENT
A message is displayed that the job has been reopened, and processing status is output to the client.
Note: After reattaching to the job, the DBA can instead kill it by typing KILL_JOB if he does not want to continue with the export.
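If you do not remember the job name, you can look it up in the DBA_DATAPUMP_JOBS view, and after reattaching you can check progress with the interactive STATUS command. A minimal sketch:
SQL> SELECT owner_name, job_name, state FROM dba_datapump_jobs;
Export> STATUS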
Invoking Export and Import As SYSDBA
SYSDBA is used internally and has specialized functions; its behavior is not the same as for general users. Therefore, you should not typically need to invoke Export or Import as SYSDBA, except in the following situations:
At the request of Oracle technical support
When importing a transportable tablespace set
To invoke Export or Import as SYSDBA, use the following syntax (substitute exp for imp if you are using Export), adding any desired parameters or parameter file names:
imp \'username/password AS SYSDBA\'
Optionally, you could also specify an instance name:
imp \'username/password@instance AS SYSDBA\'
The script below takes a full consistent backup while the database is running; it can be scheduled in crontab for daily or weekly backups.
CRON ENTRY for the script:
# Export FULL backups (keeps 7 days of backup files) - runs at 23:30 every day
30 23 * * * /oracle/admin/expdp_full_backup.sh > /tmp/expdp_full_backup.log 2>&1
Script:
#!/bin/bash
# Script to Perform Datapump Export backup Every Day
# If any database name passed as parameter then it runs only for that database
# If no parameter passed along with script then it runs for all running databases in the server
#
# Change History
# ================
#
# DATE AUTHOR DESCRIPTION
# ----------- ------- -----------------------------------------------
# 14-APR-2013 DBA Initial Version for Full DB export using EXPDP
#
#
#Backup Path
BACKUP_PATH=/oracle/admin
MAIL_LIST="abc@xyz.com"
HOSTNAME=`hostname`
DB_SID=$1
#
#Preserve the current ORACLE_SID, PATH and ORACLE_HOME.
#
old_oracle_sid=$ORACLE_SID
old_oracle_home=$ORACLE_HOME
old_path=$PATH
#
# Establish the number of instances
#
if [ $# -eq 0 ] ; then
echo "No arguments supplied"
instances=`ps -ef | grep pmon | grep -v "grep pmon" | cut -d _ -f3-`
else
instances=$DB_SID
fi
for i in $instances
do
instance_name=$i
echo "" >> /tmp/oracle_backup_log.rpt
echo "" >> /tmp/oracle_backup_log.rpt
echo -e "\nInstance Name : $instance_name " >> /tmp/oracle_backup_log.rpt
echo ".........................................." >> /tmp/oracle_backup_log.rpt
#
# Validate and initialize values for the instance name.
#
# Skip this instance if it is not registered in oratab; without the
# continue, expdp below would run with a stale environment.
if [[ $(grep -c "^${instance_name}:" /var/opt/oracle/oratab) -eq 0 ]] ; then
echo "The instance id specified does not exist in ORATAB" >> /tmp/oracle_backup_log.rpt
continue
fi
oracle_home=$(grep "^${instance_name}:" /var/opt/oracle/oratab | awk -F: '{print $2}')
ORACLE_SID=$instance_name ; export ORACLE_SID
ORACLE_HOME=$oracle_home ; export ORACLE_HOME
PATH=${ORACLE_HOME}/bin:$old_path ; export PATH
# Export full consistent backup.
# Note: this assumes the DATA_PUMP_DIR directory object points to
# $BACKUP_PATH/${ORACLE_SID}/dpdump, which the chmod/bzip2 steps below rely on.
export DATE=`date "+%Y-%m-%d %H:%M:%S"`
# Compute the timestamp once so the dump file and log file names always match
TIMESTAMP=`date '+%d%m%Y_%H%M%S'`
${ORACLE_HOME}/bin/expdp \"/ as sysdba\" dumpfile=expdp_${ORACLE_SID}_${TIMESTAMP}.dmp directory=DATA_PUMP_DIR logfile=expdp_${ORACLE_SID}_${TIMESTAMP}.log FULL=Y EXCLUDE=STATISTICS CONTENT=ALL PARALLEL=2 JOB_NAME=EXPDP_FULL_BACKUP FLASHBACK_TIME=\"to_timestamp\(\'$DATE\',\'yyyy-mm-dd hh24:mi:ss\'\)\" >> /tmp/oracle_backup_log.rpt
#Granting permissions for Dump files
/usr/bin/chmod -R 755 $BACKUP_PATH/${ORACLE_SID}/dpdump/expdp_${ORACLE_SID}_*.dmp >> /tmp/oracle_backup_log.rpt
#Compressing files
/usr/bin/bzip2 --small --verbose $BACKUP_PATH/${ORACLE_SID}/dpdump/expdp_${ORACLE_SID}_*.dmp >> /tmp/oracle_backup_log.rpt
latest="$(ls -lrt $BACKUP_PATH/${ORACLE_SID}/dpdump/expdp_${ORACLE_SID}_*.log | tail -n 1 | awk '{print $(NF)}')"
# Data Pump logs report errors with the ORA- prefix (EXP- belongs to the
# original exp utility), so check the latest log for ORA- errors
WORDCOUNT=$(grep -c "ORA-" "$latest")
if [ $WORDCOUNT -eq 0 ]; then
#Removing 7 days old dump files
find $BACKUP_PATH/${ORACLE_SID}/dpdump -name "expdp_${instance_name}_*.bz2" -mtime +7 -exec rm {} \;
find $BACKUP_PATH/${ORACLE_SID}/dpdump -name "expdp_${instance_name}_*.log" -mtime +7 -exec rm {} \;
fi
#
#Restore Previous Variable Values.
#
ORACLE_SID=$old_oracle_sid ; export ORACLE_SID
ORACLE_HOME=$old_oracle_home ; export ORACLE_HOME
PATH=$old_path ; export PATH
done
#
#End script
#
echo "JOB COMPLETE" >> /tmp/oracle_backup_log.rpt
mailx -s "Full Exportdump - $HOSTNAME : $(date +%Y%m%d'-'%T)" $MAIL_LIST > /dev/null < /tmp/oracle_backup_log.rpt
rm /tmp/oracle_backup_log.rpt
exit
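To deploy the script, make it executable and test it against a single instance before adding the cron entry (the path and SID below are placeholders):
$chmod +x /oracle/admin/expdp_full_backup.sh
$/oracle/admin/expdp_full_backup.sh ORCL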