Extract(Read) > Transform(Test&Apply) > Load(Learn) Blog

March 17, 2007

Some migration experiences to share

Filed under: Oracle How To,Oracle Performance — H.Tonguç Yılmaz @ 3:39 pm

Let's say you have a 24*7*365 available subscription database application holding ~31 million customers' data. A new version of the application has been designed around the new needs and wants of the marketing team, so a new to-be table design was developed, coded and tested.

Now it is time to migrate the data from the as-is table structure to the to-be table structure. Since the operation's availability is critical, you may have a very limited downtime window, or maybe none at all, so you may have to design a migration method which does not require any downtime.

There are three important experiences I want to mention here related to the topic;

1- Never make assumptions or promises to the internal customer groups based on the timing of a partial data migration (let's say 20%) done on a pre-production or test environment. Oracle may do things slightly differently on your production environment because of your database settings, especially the cost based optimizer and parallelism settings. Strangely, these differences are usually not in your best interest :)

2- Migration means lots of I/O, especially in terms of redo and undo, so the size of the redo log buffer, the size and number of the redo log files, and whether you are archiving or not will have an important impact on the total timing of the migration.
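Before sizing the migration window it pays to look at these settings first; a minimal sketch of the checks (run from SQL*Plus with sufficient privileges) might be:

```sql
-- Is the database archiving? Archiving every redo log file
-- during a bulk migration adds significant I/O.
SELECT log_mode FROM v$database;

-- Size and number of the online redo log groups; small or few
-- log files mean frequent log switches under heavy DML.
SELECT group#, bytes, members FROM v$log;

-- Redo log buffer size (SQL*Plus command).
SHOW PARAMETER log_buffer
```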

You have to choose a migration methodology which reduces the I/O, and remember the mantra of data intensive processing: Create Table As Select (CTAS), the MERGE statement or conditional multi-table inserts, and DBMS_ERRLOG combined with the NOLOGGING and PARALLEL options will be of great help. But even then you may have to use PL/SQL, and in that case prefer bulk operations, trust me on this :)
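As a minimal sketch of the direct-path approach (all table and column names here are hypothetical, adjust them to your own schema), a parallel, nologging insert that diverts bad rows to a DBMS_ERRLOG error table could look like:

```sql
-- Hypothetical as-is/to-be tables: customers_asis -> customers_tobe.
ALTER SESSION ENABLE PARALLEL DML;

-- Create the error table once (10gR2 DML error logging).
EXEC DBMS_ERRLOG.CREATE_ERROR_LOG('CUSTOMERS_TOBE', 'ERR$_CUSTOMERS_TOBE');

ALTER TABLE customers_tobe NOLOGGING;

-- Direct-path (APPEND), parallel load; rejected rows go to the
-- error table instead of aborting the whole statement.
INSERT /*+ APPEND PARALLEL(customers_tobe, 4) */ INTO customers_tobe
SELECT /*+ PARALLEL(customers_asis, 4) */ cust_id, cust_name, segment
  FROM customers_asis
  LOG ERRORS INTO err$_customers_tobe ('migration') REJECT LIMIT UNLIMITED;

COMMIT;
```

Note that NOLOGGING only helps direct-path operations, and after the load the affected objects are unrecoverable from the redo stream, so take a backup when the migration finishes.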

But remember, bulk operations still produce redo; NOLOGGING is not an option there;

Code listing 71a : PL/SQL Bulk Insert(FORALL) and Redo amount produced demo
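A minimal sketch of such a bulk insert (again with hypothetical table names); because FORALL performs conventional-path inserts, it generates redo even when the target table is set to NOLOGGING:

```sql
-- BULK COLLECT the source keys, then FORALL-insert them;
-- conventional-path DML like this is always logged.
DECLARE
  TYPE t_ids IS TABLE OF customers_asis.cust_id%TYPE;
  l_ids t_ids;
BEGIN
  SELECT cust_id BULK COLLECT INTO l_ids FROM customers_asis;

  FORALL i IN 1 .. l_ids.COUNT
    INSERT INTO customers_tobe (cust_id) VALUES (l_ids(i));

  COMMIT;
END;
/
```

For very large sources you would also add a LIMIT clause to BULK COLLECT and commit in batches, to keep PGA memory and undo usage under control.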

And as you know, any huge UPDATE or DELETE may be converted into a CTAS with the NOLOGGING and PARALLEL options;

Code listing 71b : Converting an Update or Delete into a NoLogging CTAS demo
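The idea, as a sketch with hypothetical names: instead of updating millions of rows in place, rebuild the table with the change already applied, then swap the tables:

```sql
-- Instead of: UPDATE customers SET segment = 'GOLD' WHERE spend > 1000;
-- rebuild the table with the change applied via direct-path CTAS.
CREATE TABLE customers_new NOLOGGING PARALLEL 4 AS
SELECT cust_id,
       cust_name,
       CASE WHEN spend > 1000 THEN 'GOLD' ELSE segment END AS segment,
       spend
  FROM customers;

-- Recreate the indexes, constraints and grants on customers_new,
-- then swap:
-- DROP TABLE customers;
-- RENAME customers_new TO customers;
```

A DELETE converts the same way: the CTAS simply selects only the rows you want to keep.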

3- Oracle 9iR2 and 10gR2 are two very different platforms; things that seem like common sense on one may not work on 9i, so be careful and check this for example

So I am waiting for your comments and experiences on the topic. Lately I did some research on the "near-real time data warehousing" concept and fell in love with Oracle's Change Data Capture feature, which is based on the LogMiner and Streams technologies. I will be sharing my experiences in my Oracle Information Integration Series.

Testing Information: the scripts mentioned were tested on Oracle Database 10g Enterprise Edition Release
