Extract(Read) > Transform(Test&Apply) > Load(Learn) Blog

June 25, 2008

Additional Backup Needs and Some Thoughts Which May Help

Filed under: Oracle Best Practices,Oracle How To — H.Tonguç Yılmaz @ 7:57 am

If you are the DBA of a mission critical database, which usually means there is no tolerance for downtime or data loss and the mean time to recover must be minimal, then ARCHIVELOG mode and an RMAN hot backup are a mandatory starting point for your strategy.
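If the database is not there yet, switching to ARCHIVELOG mode and taking a first hot backup is only a few commands; a minimal sketch (a real backup script will of course be more complete):

SQL> SHUTDOWN IMMEDIATE
SQL> STARTUP MOUNT
SQL> ALTER DATABASE ARCHIVELOG;
SQL> ALTER DATABASE OPEN;

RMAN> # connected to the target database; backs up the datafiles plus the archived logs
RMAN> BACKUP DATABASE PLUS ARCHIVELOG;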

The easiest way to schedule this kind of backup is through Database (or Grid) Control; as of 10.2, the Oracle-suggested backup strategy option, a Flash Recovery Area based incremental RMAN backup, is what I advise if you already have enough disk space for the FRA. Lutz has not been blogging for a while and I miss his RMAN posts, so I guess this one can serve as another good quick guide.

Why restore before recover? The new Oracle 10g Backup Strategy
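For reference, what Enterprise Manager schedules behind the Oracle-suggested strategy is an incrementally updated image copy kept in the FRA; roughly, the underlying RMAN commands look like this (the tag name here is only illustrative):

RMAN> # roll yesterday's level 1 changes into the image copy, then take today's level 1
RMAN> RECOVER COPY OF DATABASE WITH TAG 'DAILY_INCR';
RMAN> BACKUP INCREMENTAL LEVEL 1 FOR RECOVER OF COPY WITH TAG 'DAILY_INCR' DATABASE;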

But what if you only want to restore a parameter table with, let's say, 86 rows, or the previous release of an APEX or PL/SQL application, or just the metadata of a simple database object that has since been changed or dropped? Here we all have our own workarounds, mostly depending on the database version and the tools we use around the database. Data Guard and the Flashback options, if you have them, may help, or you can customize your backup strategy for these kinds of alternative needs.
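For example, if the Flashback features are in place, a small table can often be rewound without touching any backup at all; a minimal sketch, with a made-up table name and assuming row movement can be enabled on it:

SQL> ALTER TABLE app_parameters ENABLE ROW MOVEMENT;
SQL> FLASHBACK TABLE app_parameters TO TIMESTAMP SYSTIMESTAMP - INTERVAL '1' HOUR;
SQL> -- or, if the table was dropped and is still in the recycle bin
SQL> FLASHBACK TABLE app_parameters TO BEFORE DROP;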

Restoring a parameter table with 86 rows from your RMAN backup would be like throwing a huge rock at a frog, so picking out these kinds of parameter tables and including them in a daily binary table-level export may be the quickest way to revert them.
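A sketch of such a daily table-level Data Pump export; the schema and table names below are illustrative and the directory object is the same backup_dir used later in this post:

expdp $BACKUP_USERNAME/$BACKUP_PASSWORD DIRECTORY=backup_dir \
  DUMPFILE=daily_param_tables.dmp.$BACKUP_DATE LOGFILE=daily_param_tables.log.$BACKUP_DATE \
  TABLES=APP_OWNER.APP_PARAMETERS,APP_OWNER.APP_LOOKUPS

Reverting is then only an impdp of the same dump with TABLE_EXISTS_ACTION=REPLACE.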

What about a PL/SQL based need? Here again a daily norows (metadata only) binary export backup may help a lot.


A data-pump norows backup example

# the directory object is created once beforehand:
# CREATE OR REPLACE DIRECTORY backup_dir AS '/ods01/backup/daily_norows/dp';
/dwh0/app/oracle/product/11.1.0.1/bin/expdp $BACKUP_USERNAME/$BACKUP_PASSWORD \
  DUMPFILE=daily_full_norows_dp_backup.dmp.$BACKUP_DATE LOGFILE=daily_full_norows_dp_backup.log.$BACKUP_DATE \
  DIRECTORY=backup_dir FULL=y EXCLUDE=TABLE_DATA

A traditional export norows backup example

/dwh0/app/oracle/product/11.1.0.1/bin/exp $BACKUP_USERNAME/$BACKUP_PASSWORD \
  FILE=/ods01/backup/daily_norows/exp/daily_full_norows_dp_backup.dmp.$BACKUP_DATE \
  LOG=/ods01/backup/daily_norows/exp/daily_full_norows_dp_backup.log.$BACKUP_DATE \
  FULL=y ROWS=n
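When the day comes, a single object's DDL can be pulled back out of the Data Pump norows dump without importing anything, thanks to the SQLFILE option; a sketch with an illustrative package name and date suffix:

fatih@oracle $ cat restore_ddl.par
DIRECTORY=backup_dir
DUMPFILE=daily_full_norows_dp_backup.dmp.20080624
SQLFILE=restore_ddl.sql
INCLUDE=PACKAGE:"IN ('APP_UTIL_PKG')"

fatih@oracle $ impdp $BACKUP_USERNAME/$BACKUP_PASSWORD PARFILE=restore_ddl.par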
   

And what if you are using APEX applications or a Warehouse Builder repository? Then again, taking daily tool-level export backups will help.

OWB 11g Repository Shell Script Backup Example
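For APEX, the APEXExport command line utility shipped under apex/utilities can be put into a similar daily job; a hedged sketch from memory, where the connect string, credentials and application id are all made up:

cd $APEX_HOME/utilities
java -cp .:$ORACLE_HOME/jdbc/lib/ojdbc5.jar oracle.apex.APEXExport \
  -db dwhhost:1521:DWH -user apex_backup -password apex_backup_pwd -applicationid 100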

There can be additional thoughts of course, like saving the database structure as a SQL file daily or immediately after any structural change.
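DBMS_METADATA is one easy way to capture object structure as SQL; a minimal sketch with illustrative owner and object names, which could be spooled from the same nightly script:

SET LONG 100000 PAGESIZE 0 LINESIZE 200
spool /ods01/backup/daily_norows/app_parameters_ddl.sql.&1
select dbms_metadata.get_ddl('TABLE', 'APP_PARAMETERS', 'APP_OWNER') from dual;
spool off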


A Simple Backup Control File to Trace Shell Script Example

# sqlplus -s reads the username/password from the first line of the here-document below
/dwh0/app/oracle/product/11.1.0.1/bin/sqlplus -s > /ods01/backup/daily_norows/backup_control_file_to_trace.log.$BACKUP_DATE \
  2> /ods01/backup/daily_norows/backup_control_file_to_trace.err.$BACKUP_DATE <<EOF
$BACKUP_USERNAME/$BACKUP_PASSWORD
@/ods01/backup/daily_norows/backup_control_file_to_trace.sql $BACKUP_DATE 
exit ;
EOF

fatih@oracle $ cat /ods01/backup/daily_norows/backup_control_file_to_trace.sql
alter session set tracefile_identifier = backup_controlfile_to_trace;
alter database backup controlfile to trace;
SET ECHO OFF              
SET FEEDBACK OFF
SET PAGESIZE 0
SET LINESIZE 1000
spool /ods01/backup/daily_norows/backup_control_file_to_trace.name.&1
-- available after 11.1
select value from v$diag_info where name like 'Default Trace%' ;
spool off
SET FEEDBACK ON
SET PAGESIZE 24
SET ECHO ON
   

But I think the critical point of all is that whatever you do, do not forget to use a scheduler to automate these jobs and to check their logs periodically :)


Simple unix cron settings to automate the daily backups

fatih@oracle $ crontab -l
40 17 * * * /ods01/backup/daily_norows/daily_norows.sh > /ods01/backup/daily_norows/daily_norows.log 2> /ods01/backup/daily_norows/daily_norows.err
00 18 * * * /ods01/backup/backup_repos.sh > /ods01/backup/backup_repos.log 2> /ods01/backup/backup_repos.err
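And for the periodic log control, even something this small can be dropped into the same crontab (the mail address is of course made up):

# mail the error log to the team only when last night's job produced one
if [ -s /ods01/backup/daily_norows/daily_norows.err ]; then
  mailx -s "daily norows backup error" dba-team@example.com < /ods01/backup/daily_norows/daily_norows.err
fi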
   

2 Comments »

  1. With the new scheduler feature of 10.1 (dbms_scheduler), operating system commands can be run via the database. And the dbms_datapump package allows Data Pump exports and imports to be done inside the database with PL/SQL scripts. After version 10.1, less operating system interaction is needed to schedule jobs and export/import operations.

    Comment by ergemp — June 27, 2008 @ 10:45 am | Reply

  2. Automate Oracle Warehouse Builder Repository export with OMB+ – http://www.business-intelligence-quotient.com/?p=11

    Comment by H.Tonguç Yılmaz — July 17, 2009 @ 6:25 am | Reply

