Extract(Read) > Transform(Test&Apply) > Load(Learn) Blog

April 13, 2008

A DDL tracking application, whether the CASCADE option of DBMS_STATS is needed every time, an ORA_HASH post update, and others

Filed under: Oracle How To,Oracle Performance — H.Tonguç Yılmaz @ 1:06 pm

This post is a summary of my last two weeks. First of all, yesterday we celebrated my company‘s 15th anniversary; I feel very lucky to have been part of this success story for the last 8 years :)

Lately I have been, and will continue to be, working mostly on two projects in parallel:

1- Migrating our billing/crm database from the Tru64 to the AIX platform; there are several important lessons learned in this cross-platform xx TB Oracle database migration, and I will blog about them in detail later. For now, in this post I will share an Oracle-L discussion of mine: a DDL tracking application that was requested to audit and control especially the database objects which are not supported by GoldenGate, the change data capture product we use.

- Oracle 10gR2 Multi-Terabyte Database Cross-Platform Migration Method
- Cross-Platform Tablespace Transportation from Oracle documentation
- Oracle 10gR2 Multi-Terabyte Database Cross-Platform Migration a Case Study

You may find the DDL (Data Definition Language, the statements used to define the database structure or schema) types in Oracle 10gR2 here. In this small application I used a database trigger and an autonomous-transaction-based procedure to log the operation, db_name, login_user, object_owner, object_name, object_type, sql_text, attempt_by and attempt_date information.

A custom DDL tracking application for Oracle database
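The original scripts are linked above; a minimal sketch of the trigger-plus-autonomous-transaction idea might look like the following. All object names here are illustrative, not the ones from the linked application:

```sql
-- Hypothetical log table; column list follows the fields named in the post.
CREATE TABLE ddl_audit_log (
  operation     VARCHAR2(30),
  db_name       VARCHAR2(30),
  login_user    VARCHAR2(30),
  object_owner  VARCHAR2(30),
  object_name   VARCHAR2(128),
  object_type   VARCHAR2(30),
  sql_text      CLOB,
  attempt_by    VARCHAR2(30),
  attempt_date  DATE DEFAULT SYSDATE
);

CREATE OR REPLACE PROCEDURE log_ddl_attempt (
  p_operation    IN VARCHAR2,
  p_object_owner IN VARCHAR2,
  p_object_name  IN VARCHAR2,
  p_object_type  IN VARCHAR2,
  p_sql_text     IN CLOB
) IS
  PRAGMA AUTONOMOUS_TRANSACTION;  -- the log row commits independently of the DDL
BEGIN
  INSERT INTO ddl_audit_log
         (operation, db_name, login_user, object_owner,
          object_name, object_type, sql_text, attempt_by)
  VALUES (p_operation,
          SYS_CONTEXT('USERENV', 'DB_NAME'),
          SYS_CONTEXT('USERENV', 'SESSION_USER'),
          p_object_owner, p_object_name, p_object_type, p_sql_text, USER);
  COMMIT;
END;
/

CREATE OR REPLACE TRIGGER trg_track_ddl
AFTER DDL ON DATABASE
DECLARE
  l_pieces  ora_name_list_t;  -- event attribute functions supply the details
  l_text    CLOB;
  l_cnt     PLS_INTEGER;
BEGIN
  l_cnt := ora_sql_txt(l_pieces);           -- DDL text arrives in chunks
  FOR i IN 1 .. l_cnt LOOP
    l_text := l_text || l_pieces(i);
  END LOOP;
  log_ddl_attempt(ora_sysevent, ora_dict_obj_owner,
                  ora_dict_obj_name, ora_dict_obj_type, l_text);
END;
/
```

The autonomous transaction matters here: it lets the attempt be recorded even when the triggering DDL itself fails or is disallowed.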

Also, in one of my previous posts I discussed 10g’s ORA_HASH SQL function as a data validation tool. During a row-by-row migration (strategies such as unload on source > ftp the text files to target > load the files in parallel on target, or, as in our case, insert into … select from over a dblink, plus applying the changes captured by a CDC tool on production during this logical data transfer), a validation step becomes very important.

This validation must involve three major parts, in my point of view:

1- the metadata validation,
2- the data validation,
3- the performance validation.

I made several updates on the previous post; if you are interested in this topic please follow this link.
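For the data validation part, a rough sketch of the ORA_HASH idea is to compare per-row hashes across the database link. The table, column, and dblink names below are hypothetical:

```sql
-- Hypothetical table ACCOUNTS and dblink TARGET_DB, for illustration only.
-- Rows whose concatenated column values hash differently (or that exist on
-- only one side) are returned; an empty result suggests the copies match.
SELECT account_id,
       ORA_HASH(account_id || '|' || status || '|' ||
                TO_CHAR(updated_at, 'YYYYMMDDHH24MISS')) AS row_hash
  FROM accounts
MINUS
SELECT account_id,
       ORA_HASH(account_id || '|' || status || '|' ||
                TO_CHAR(updated_at, 'YYYYMMDDHH24MISS')) AS row_hash
  FROM accounts@target_db;
```

In a real comparison, NULL columns and delimiter collisions in the concatenation need care, and the MINUS should be run in both directions to catch rows missing on either side.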

2- Design and development of an operational data store in front of our datawarehouse; this environment will differ from the datawarehouse in two major areas. First, it will not be denormalized like a typical dwh and will be very similar to the operational database’s tables, so transformation will be very limited compared to a standard ETL. Second, customers want near-real-time reporting from this environment, so Change Data Capture (CDC) style data transfer methods are needed.

We will first implement an operational data store for our CRM database, which runs the Siebel application. There are over 60 typical operational near-real-time reports needed. The Oracle Warehouse Builder 11g release has a new feature for integrating a warehouse with Siebel source databases, and we will also be testing the performance and capabilities of this new feature.

But as I commented here, we will also be working on a logical standby implementation to have the ETL-window flexibility needed in order to be near real-time.

During this project we also had an interesting discussion on the DBMS_STATS package’s two options, CASCADE and GRANULARITY. I commented about this discussion on Greg Rahn’s post Choosing An Optimal Stats Gathering Strategy. For a 10g Release 2 datawarehouse the CASCADE option may be set to FALSE, since after each index rebuild at the end of a typical daily ETL the statistics are already collected for the optimizer. And for the GRANULARITY option, leaving it at its default will usually be enough. These two things may help you avoid unnecessary gathering and, of course, reduce total ETL time. Below is a small demo for the CASCADE option.

Does the DBMS_STATS package’s CASCADE option always need to be set to TRUE?
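The demo is linked above; the core of the point can be sketched as below. With CASCADE => FALSE the index statistics are not re-gathered by DBMS_STATS, and since on 10g an index REBUILD already computes its statistics as part of the rebuild, gathering them again in the ETL window is redundant. The table name here is hypothetical:

```sql
-- After the daily ETL has rebuilt the indexes (10g collects index stats
-- during CREATE INDEX / ALTER INDEX ... REBUILD automatically), gather
-- only the table statistics and skip the indexes.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname          => USER,
    tabname          => 'SALES_FACT',            -- hypothetical ETL target
    cascade          => FALSE,                   -- do not re-gather index stats
    granularity      => 'AUTO',                  -- the default; usually enough
    estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE);
END;
/
```

Comparing LAST_ANALYZED in USER_INDEXES before and after such a call shows that the index statistics are left untouched, which is exactly the time saving discussed above.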

And for the OTHERS part, my three highlights are as follows:

1- My team Fenerbahçe defeated England’s Chelsea in Istanbul, but in London we couldn’t make it to the Champions League semi-finals. In my opinion Chelsea isn’t playing well this year, and we missed a very important opportunity. My team has young and inexperienced players, but with this much hunger for success I am nearly sure we will push for the top four again and again in the coming years :)

2- This week, at last, the WordPress domain is back in my country; for nearly a year now we have been playing these hacking games.

3- Last words are for the Oracle Critical Patch Update Announcement April 2008, which also includes two nasty vulnerabilities for APEX.

Testing Information: the scripts mentioned were tested on Oracle Database 10g Enterprise Edition Release 10.2.0.3.0.

References Used :

N/A


2 Comments »

  1. [...] it dawned on me the other day after reading a post by H.Tonguç Yılmaz that it would be a good idea to keep track of all ddl on our production system for accountability [...]

    Pingback by Big Brother (with Little Brother props) « Aspiring Oracle Nerd — May 15, 2008 @ 8:31 pm | Reply

  2. Tracking DDL changes in 11g – http://oraclue.com/2009/08/20/tracking-ddl-changes-in-11g/

    Comment by H.Tonguç Yılmaz — August 21, 2009 @ 5:10 am | Reply

