Wednesday, August 30, 2006

Aloha

October 1997 was the last time I took more than a week off for vacation. At that time, I was somewhere south of Summerlands watching Fairy Penguins come ashore at dusk. Anybody (except HJR) know where I was?

Anyway, the moons have aligned and I am taking off to a remote spot in the South Pacific for two weeks and a little R&R. No cell phone, no laptop. I'm not even sure about TV. Since I won't have connectivity for a while, I've invited a few people to be "Guest Bloggers" to fill in while I am away. I know the readers will get a new perspective, and I hope the guest bloggers have a little fun.

Joey Vayo
Actually, this is all Joey's idea. He came up with the idea of "guest blogger" a little while ago and I thought it was a good one. Of course, he was the first one I asked. He's been a regular on DBASupport.com for years and always gives helpful advice.

Steve Prior
Steve's brainchild is http://www.geekster.com. Although he is a developer by trade (boo, hiss), I think he is a tinkerer by heart. A usual conversation with Steve might start out with "I am looking for ODBC drivers for a PROLOG engine that I am building into my smart-home system to automatically flush a toilet..."

Brian Byrd

Another DBASupport.com regular, Brian is a master at making SQL more efficient both from a coding perspective and from the perspective of the database. While Brian and I don't always agree, I certainly respect his knowledge and ability to think about problems using a different approach.

Jeffrey M. Hunter
OK everybody, there is more than one Jeff Hunter in the Oracle world. I usually differentiate us by saying he's the guy with the RAC Paper and I'm the guy with the blog. Someday we're going to run off and create our own consulting company, but until then, I'm sure Jeffrey will check in with some very exciting information.

David Pittuck
Last, but not least, we have the world renowned DaPi. David likes to be known as the "Old Cranky Philosopher", but is quite enlightened in the world of IT. (He's also proud of the fact he has a NerdScore of 93.) You can check him out at http://www.pittuck.com.

A big thanks to my guest bloggers. I hope you guys have as much fun blogging as I do.

Regularly scheduled programming resumes around September 18th. Take it away guys...

Tuesday, August 22, 2006

Fun Blogs

I really enjoy peeking into other people's lives for a brief second. I've been reading a couple blogs lately that are very well written and entertaining. If you like reading the likes of Heather B. Armstrong, LeahPeah, or WaiterRant, I think you'll enjoy New York Hack and Everything is Wrong With Me.

Friday, August 18, 2006

9.2.0.8 Patchset

Has anybody heard when the 9.2.0.8 patchset for Solaris and/or Linux will be available? All I see on Metalink is 2006Q3, which we're about half way through...

Tuesday, August 15, 2006

Compute or Estimate

A question came by me the other day that I had to do a little head-scratching on.

"How can you tell if your table was analyzed with COMPUTE or ESTIMATE?"

Good question. In my environment, I know it was COMPUTE because that's how I do my stats. But how would I know otherwise? There's a column called sample_size in dba_tables that indicates how many rows were used for the last sample. Ah, just a simple calculation.

Let's see how it works out. First, create a decent-sized table:

SQL> create table big_table as select * from dba_objects;

Table created.

SQL> insert into big_table select * from big_table;

42396 rows created.

SQL> /

84792 rows created.

SQL> /

169584 rows created.

SQL> /

339168 rows created.

SQL> /

678336 rows created.

SQL> commit;

Commit complete.

Of course, we don't have any stats yet, but let's just check it anyway.

SQL> select table_name, last_analyzed, global_stats, sample_size,
num_rows, decode(num_rows, 0, 1, (sample_size/num_rows)*100) est_pct
from dba_tables
where table_name = 'BIG_TABLE'
/

TABLE_NAME LAST_ANAL GLO SAMPLE_SIZE NUM_ROWS EST_PCT
------------------------------ --------- --- ----------- ---------- ----------
BIG_TABLE NO


Sounds good. Let's analyze our table using dbms_stats and see what we get:

SQL> exec dbms_stats.gather_table_stats('JH','BIG_TABLE');

PL/SQL procedure successfully completed.

SQL> select table_name, last_analyzed, global_stats,
sample_size, num_rows,
decode(num_rows, 0, 1, (sample_size/num_rows)*100) est_pct
from dba_tables
where table_name = 'BIG_TABLE'
/

TABLE_NAME LAST_ANAL GLO SAMPLE_SIZE NUM_ROWS EST_PCT
------------------------------ --------- --- ----------- ---------- ----------
BIG_TABLE 14-AUG-06 YES 1356672 1356672 100


Cool, that's what we expect. Now let's analyze and estimate with 1% of the rows:

SQL> exec dbms_stats.gather_table_stats('JH','BIG_TABLE', estimate_percent=>1);

PL/SQL procedure successfully completed.

SQL> select table_name, last_analyzed, global_stats, sample_size,
num_rows, decode(num_rows, 0, 1, (sample_size/num_rows)*100) est_pct
from dba_tables
where table_name = 'BIG_TABLE'
/

TABLE_NAME LAST_ANAL GLO SAMPLE_SIZE NUM_ROWS EST_PCT
------------------------------ --------- --- ----------- ---------- ----------
BIG_TABLE 14-AUG-06 YES 13371 1337100 1


That definitely is what I was expecting.

I think I remember that if you estimate more than 25%, Oracle will just COMPUTE. Let's see what happens:

SQL> exec dbms_stats.gather_table_stats('JH','BIG_TABLE', estimate_percent=>33);

PL/SQL procedure successfully completed.

SQL> select table_name, last_analyzed, global_stats, sample_size,
num_rows, decode(num_rows, 0, 1, (sample_size/num_rows)*100) est_pct
from dba_tables
where table_name = 'BIG_TABLE'
/

TABLE_NAME LAST_ANAL GLO SAMPLE_SIZE NUM_ROWS EST_PCT
------------------------------ --------- --- ----------- ---------- ----------
BIG_TABLE 14-AUG-06 YES 447748 1356812 33.0000029



Here I'm on 9.2.0.5. Prior to 9.2, this would have done 100% of the rows, but in 9.2, it will estimate all the way up to 99%. (Why you would estimate 99% vs. 100%, I have no clue, but it can be done.)

Let's see what block sampling does:


SQL> exec dbms_stats.gather_table_stats('JH','BIG_TABLE',
estimate_percent=>10, block_sample=>true);

PL/SQL procedure successfully completed.

SQL> select table_name, last_analyzed, global_stats, sample_size,
num_rows, decode(num_rows, 0, 1, (sample_size/num_rows)*100) est_pct
from dba_tables
where table_name = 'BIG_TABLE'
/

TABLE_NAME LAST_ANAL GLO SAMPLE_SIZE NUM_ROWS EST_PCT
------------------------------ --------- --- ----------- ---------- ----------
BIG_TABLE 14-AUG-06 YES 132743 1327430 10



Eh, nothing new there.

Lastly, let's see what dbms_stats.auto_sample_size does.

SQL> exec dbms_stats.gather_table_stats('JH','BIG_TABLE',
estimate_percent=>dbms_stats.auto_sample_size);

PL/SQL procedure successfully completed.

SQL> select table_name, last_analyzed, global_stats, sample_size,
num_rows, decode(num_rows, 0, 1, (sample_size/num_rows)*100) est_pct
from dba_tables
where table_name = 'BIG_TABLE'
/

TABLE_NAME LAST_ANAL GLO SAMPLE_SIZE NUM_ROWS EST_PCT
------------------------------ --------- --- ----------- ---------- ----------
BIG_TABLE 14-AUG-06 YES 1356672 1356672 100



That's interesting: 9.2 thinks my best method is to compute. I tried this on several 9.2 databases with different sized tables and it picked 100% every time. So I went to the friendly Metalink and found a couple of bugs indicating auto_sample_size doesn't exactly work in 9.2.

Not sure where I'll use this in the future, but it was an interesting experiment anyway.

Monday, August 14, 2006

Done tweaking

I think I'm done tweaking for a while.  The format is a little less fancy than I envisioned, but to me, it's just as easy to read now as the white & grey.  Plus I like that the text area grows and shrinks to the size of your browser.  When I'm on my desktop with higher resolution I can read more.

Sunday, August 13, 2006

Why I got out of the GUI business

I started out life in the IT world as a programmer banging out GUIs for every type of system you can imagine. Every user wanted something different; this one wants green fields, this one wants white. Another wants fonts bigger, another wants them smaller.

On a totally unrelated note, I'm tempted to put the blog back to Grey & White.

Saturday, August 12, 2006

Waiting

What do you do when you're waiting for a 17 hour import to finish? Tweak your blog, of course!

Platform migration in a 9i world

One of my 4-way Solaris boxes just wasn't cutting it anymore. We've tuned what we can tune and tweaked what we can tweak. The box was running with an average run queue of 8 lately. Time to upgrade the hardware. We chose a nice fast Linux box from HP and hooked it to an HP EVA disk array.

So how do you move 660GB of data from Solaris to Linux? On Oracle 9i?

Moving to 10g was considered for about 30 seconds, but we decided it was too many changes at one time for one of our most important applications. The next logical choice was export/import. On the first round, it took about 14 hours to export and 4 days to import. OK, that can't be done in a weekend.

Then we decided to break the tables up into four somewhat equally sized groups and run export/import in four threads. This actually worked pretty well for the overwhelming majority of tables. About 4 hours for the longest group to export, and 43 hours for the longest thread to import. Still not doable in a weekend, but we're closing in on a reasonable timeframe.
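The four threads were just concurrent exp/imp sessions, each fed its own list of tables. Each pair looked roughly like this (the username, table groups, and file names are made up for illustration):

exp jh/password tables=orders,order_lines,order_hist file=group1.dmp log=group1_exp.log
imp jh/password file=group1.dmp log=group1_imp.log ignore=y

Kick off all four pairs at once and the wall-clock time is driven by the slowest group, which is why we tried to keep the groups roughly the same size.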

We noticed a lot of time was being taken up creating indexes. So we decided to pre-create the tables without indexes and create indexes in parallel with nologging. This shaved about 6 hours off our total time.
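The index builds themselves looked something like this (index name, column, and degree of parallelism are hypothetical; the point is that PARALLEL spreads the work across slaves and NOLOGGING skips most of the redo):

SQL> create index big_table_ix1 on big_table (object_id)
  2  parallel 8 nologging;

Just remember to alter the index back to NOPARALLEL and LOGGING afterward so queries don't inherit the build-time degree.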

Then we looked and saw two Index Organized Tables were taking a lot of time relative to the number of rows they contained. I then did some testing with import on IOT tables vs. heap tables and didn't see any significant difference. In fact, the testing I did proved the two IOT tables shouldn't have taken as long as they did. I looked at them a little more closely and found the tables contained a TIMESTAMP field. I did some more testing and compared import with a DATE field vs. import with a TIMESTAMP field and found the TIMESTAMP field was causing the slowdown.

Once we identified the TIMESTAMP field was the problem, my boss found note 305211.1 on Metalink which explained our problem. We then timed the import with COMMIT=N for those two tables and found the time went from 26 hours to 17 hours. I thought, "let's try to fill the tables through a database link". The database link yielded the best results of all in about 6 hours.
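The database link approach boiled down to a straight insert-select over the wire, something like this (the link name, credentials, and table name are made up for illustration):

SQL> create database link soldb connect to jh identified by secret using 'SOLDB';
SQL> insert into my_iot select * from my_iot@soldb;
SQL> commit;

One insert, one commit at the end, and no dump file sitting in the middle.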

The last piece of the pie was the FK constraints. We didn't want to re-validate all the constraints since we knew all the data was going to be copied over (after manual verification, that is). So we created the constraints with NOVALIDATE, which took next to no time.
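The NOVALIDATE trick looks like this (table, column, and constraint names are hypothetical): the constraint is enforced for new DML, but Oracle skips scanning the existing rows, which is why it's nearly instant.

SQL> alter table order_lines
  2  add constraint order_lines_fk foreign key (order_id)
  3  references orders (order_id)
  4  enable novalidate;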

After everything was back in, we estimated statistics on our biggest tables and computed statistics on our smaller tables. It took about three weeks to figure out the method, but the time researching the project paid off when the database was available after 34 hours.

Thursday, August 10, 2006

The Cashier and the Cell Phone

The cell phone is everywhere today. You can't walk two blocks without running into 3 people with something stuck to their ear. It's just part of today's world. I'm just as guilty as the next guy of using my cell phone when I probably shouldn't. But the other day, I experienced a person so rude, I just had to tell you about it.

I just finished grocery shopping at my local big-box store. All my items were on the belt for the cashier and he was flicking them by the scanner one by one. The particular market I frequent has about 1 bagger for every 3 cashiers. I'm perfectly able, so I don't mind starting to bag my own groceries while the cashier finishes scanning my order. After all, I'd rather have the dedicated bagger help an old lady than me. I have a mountain of groceries stacked up, but I figure the cashier will help once he's done running my stuff through the scanner.

He scans my last item, tells me $93.14, and whips open his Nextel phone. I continue to bag thinking he'll get right back to me in a second.

"Hey girl" he talks to the phone after the familiar Nextel beep-beep-beep.

"You comin' over?", she asks.

"Who's dere?"

"X and Y are already here, we're waitin' for Z and you."

"Oh. I don't get outa here 'til 8."

I'm done bagging at this point, just a little ticked off. I go to the EFT pad, swipe my ATM card and key in my PIN. The display reads "Waiting for Cashier". Yeah, no kidding.

Their teen-age banter goes back and forth for another two minutes. The yuppie woman behind me gives the cashier the "What the F*$K are you doing?" look while her significant other blatantly flashes his wrist with an obvious look at his Fauxlex.

Two minutes later, we hear "OK, we'll hookup afta I get outa dis place."

He hits the magical combination of buttons to let me pay and I'm off just shaking my head at the poor customer service.

Tuesday, August 08, 2006

Anti-Midas

Why is it that every time I touch something with a Microsoft tool, it breaks? For example, I composed my last blog entry using MS Word. Instead of just publishing from Word, I highlighted the whole thing, copied it, and pasted it into blogger's editor. Everything looked fine to me, so I published.

On Monday, I realized I didn't get my feed via RSS. My feed provider said the feed had an invalid format and was complaining about invalid namespaces that were "Mso" something or another. I checked the HTML for my post and sure enough, there were these little tags with proprietary Microsoft stuff in them. Nice.

I edited the HTML and took all the MS garbage out and voila, my feed was back in business.

Friday, August 04, 2006

Funk

I’m in a funk.

A lot of small projects had to take a backseat while I was finishing up my two year project. Now it’s payback time. I have a list of about 20 things that I am working on that aren’t what I’d call real interesting, but they need to get done anyway. Things like cloning Oracle Applications. Yuck. But a necessary evil.

I thought getting some reading under my belt would get me back into the swing of things. I picked up Expert Oracle Database Architecture: 9i and 10g Programming Techniques and Solutions three times in the last two weeks, but just couldn’t get into it. Ventured to a different Barnes and Noble last night to peruse their computer section, but came out with People Magazine.

I get started on a bunch of things, but just can't seem to finish. In fact, I started on this post about 4 days ago, but just never got around to finishing it. I have four good ideas in my blog "Drafts" area, but just can't muster the energy to fully develop them. Sometimes I feel like Sisyphus rolling my stone.

This too shall pass.