We had been getting a pretty persistent "WARNING: inbound connection timed out (ORA-3136)" error in the alert.log on one of our 10.2.0.4 databases. It wasn't showing up just once or twice, but about 40 times within a two-minute window.
I did some research and found sources suggesting my SQLNET.INBOUND_CONNECT_TIMEOUT needed to be adjusted upwards from the default of 60 seconds to 120 or higher. I was pretty sure that wasn't the case, as it was already set to 120. For the sake of progress, I bumped the timeout to 300 (5 minutes) and monitored closely.
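For anyone playing along at home, those settings live in sqlnet.ora and listener.ora on the database server. This is just a sketch of the 120-second setup I already had in place (it assumes a default listener named LISTENER, so adjust the suffix to your own listener name):

    # $ORACLE_HOME/network/admin/sqlnet.ora (database server side)
    SQLNET.INBOUND_CONNECT_TIMEOUT = 120

    # $ORACLE_HOME/network/admin/listener.ora
    INBOUND_CONNECT_TIMEOUT_LISTENER = 120

The listener.ora change needs an "lsnrctl reload" before it takes effect.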
About two days later, we received the same warning in the alert.log, except this time it showed up about 90 times in a two-minute period. Aha, a change. That told me something was going wrong during the authentication phase, since increasing the connection timeout only made the problem worse.
During the last occurrence, I also noticed that the mman background process was consuming 100% of the CPU. At the time I wasn't sure whether it was the cause or just a byproduct of the problem. However, as soon as the problem went away, ora_mman dropped back down to nearly 0%.
I set up a script to automatically take three systemstate dumps, a minute apart, whenever the mman process went to 100%. Oracle Support was able to tell me that during this time period a bunch of my SQL had been invalidated and was waiting on a latch to be loaded back into the shared pool. They also indicated that this massive amount of reparsing could happen because somebody ran DDL against a popular object or somebody flushed the shared pool.
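That script was nothing fancy. The dump portion is just a few oradebug calls from a sysdba SQL*Plus session; a rough sketch (the level 10 here is only an example, use whatever level Support asks for):

    -- connect / as sysdba once the script sees mman pegged at 100%
    oradebug setmypid
    oradebug unlimit
    oradebug dump systemstate 10
    -- wait about a minute and repeat the dump twice more, then grab
    -- the trace file from user_dump_dest and attach it to the TAR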
I was pretty sure nobody had flushed the shared pool, but that got me thinking: what would happen if Oracle shrank my shared pool on its own through automatic memory management? I checked v$sga_resize_ops and found that, right around the times of my warning messages, the shared pool was being resized downward. I brought this up to the support analyst, and he suggested I set the shared pool to the maximum size Oracle had resized it to.
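If you want to check your own instance, the query is nothing special; something along these lines (not my exact SQL, and the 800M below is a made-up placeholder, not our real value):

    select component, oper_type, initial_size, target_size, final_size,
           start_time, status
      from v$sga_resize_ops
     where component = 'shared pool'
     order by start_time;

    -- with sga_target set, shared_pool_size acts as a floor, so pin it
    -- at the high-water mark the query shows
    alter system set shared_pool_size = 800M scope=both;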
That was three days ago, and we haven't had a warning since.
The theory is that while the shared pool was being resized within the SGA, Oracle grabbed a latch. The resize operation took a while, and as long as Oracle held that latch, nobody could log in. Interesting theory; we'll see if it holds.
Thursday, August 06, 2009
PC Support for Oracle
So I'm working with Oracle Support on an Oracle Applications R12 issue. They want to see some data from one of our tables and give me a query to run. I run the query in SQL Developer and export the results as a .csv file and upload it to the TAR.
Three days later they get back to me and say they can't open the files. Hmm, that's funny, let me try the same thing.
I load it into Open Office in about 2 seconds, no problem.
Then I load it into a database using SQL*Loader with no issues.
Same thing with MySQL and LOAD DATA, no issues.
I mail the file to myself and again it opens with no problem.
I send the file to a colleague and Open Office works for them as well.
The support analyst asks for a plain .xls worksheet, so I try to load it into MS Excel. Bingo, I get an error:
This error is usually encountered when an attempt to open a file with more than 65,536 rows or 256 columns is made. Excel is limited to 65,536 rows of data and 256 columns per worksheet. You can have many worksheets with this number of rows and columns, but they are usually capable of fitting into one workbook (file). The number of worksheets you can have per workbook is limited only by the amount of available memory your system has. By default, Excel can manage 3 worksheets, more if there is available memory to support the quantity of data.
Truncation of rows or columns in excess of the limit is automatic and is not configurable. This issue can usually be remedied by opening the source file with a text editor, such as Microsoft Office Word, and then saving the file off into multiple files with row or column counts within the limits of an Excel worksheet. These files can then be opened or imported into Excel worksheets.
If you are using a data format that does not support use of a text editor, it may be easier to import the data into Microsoft Office Access and then use the export feature of Access to import the data to an Excel format. Other methods of importing large source material into multiple worksheets are available, but may be more complex than using either a text editor or Access.
Aha! MS Excel can't handle anything with more than 256 columns of data, and my data has 325 columns! (Not my design, it comes from Oracle Applications.)
I post my findings to the TAR and suggest they load the data into a piece of software that can handle 325 columns, like maybe Oracle. (I didn't have the heart to suggest MySQL, but I guess they wouldn't take that as a slight anymore ...)