Quick Question: Check for performance issues

Hello Everyone,

Thanks to Satish / Shanawaz for sharing this piece.

What if you were assigned the following:

A brand new database with a newly designed application, new tables, and a small volume of data, and you were asked to verify the database performance and provide suggestions and recommendations for any improvements. You can assume it has just passed the development stage and entered UAT.

  • Where do you start?
  • What would you like to check?
  • Which approach would you follow?
  • What kind of tools would you use?

Please comment with your thoughts. Your comments may well provide more information than I have.

Stay tuned….

Update 18-Mar-2013,

Sorry, friends, for the delay.

Here are the answers (these may vary from environment to environment, but this is how I start).

Where do you start?

Memory advisories, OS-level checks, and individual statements to tune

What would you like to check?

Statements that need tuning

AWR reports for the period when the testing happened

ADDM reports for the same period
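
For example, the AWR and ADDM reports for the test window can be pulled straight from SQL*Plus. A minimal sketch; the DBID (1234567890), instance number and snapshot IDs (100/101) are placeholders for your own values:

    -- Find the snapshots that bracket the test run
    SELECT snap_id, begin_interval_time
      FROM dba_hist_snapshot
     ORDER BY snap_id;

    -- Text AWR report between two snapshots (dbid, instance, begin_snap, end_snap)
    SELECT output
      FROM TABLE(DBMS_WORKLOAD_REPOSITORY.awr_report_text(1234567890, 1, 100, 101));

    -- Or simply run the interactive scripts
    -- @?/rdbms/admin/awrrpt.sql
    -- @?/rdbms/admin/addmrpt.sql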

Which approach would you follow?

Top-down approach

Start at the OS level and drill down to individual SQL tuning

What kind of tools would you use?

AWR, ADDM, OS tools (orca, plexus), SQL Tuning Advisor, SQL Performance Analyzer to compare changes, etc.
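
For SQL Performance Analyzer, here is a minimal sketch of a before/after comparison using DBMS_SQLPA; the tuning set name UAT_STS and the task name are assumptions, and the change you are testing (parameter, index, statistics) is applied between the two test executions:

    DECLARE
      l_task VARCHAR2(128);
    BEGIN
      -- Assumes a SQL tuning set called UAT_STS already exists
      l_task := DBMS_SQLPA.create_analysis_task(sqlset_name => 'UAT_STS',
                                                task_name   => 'uat_spa_task');
      -- Capture performance before the change
      DBMS_SQLPA.execute_analysis_task(task_name      => l_task,
                                       execution_type => 'TEST EXECUTE',
                                       execution_name => 'before_change');
      -- ... apply the change here (parameter, index, statistics) ...
      -- Capture performance after the change
      DBMS_SQLPA.execute_analysis_task(task_name      => l_task,
                                       execution_type => 'TEST EXECUTE',
                                       execution_name => 'after_change');
      -- Compare the two runs
      DBMS_SQLPA.execute_analysis_task(task_name      => l_task,
                                       execution_type => 'COMPARE PERFORMANCE',
                                       execution_name => 'compare_runs');
    END;
    /
    SELECT DBMS_SQLPA.report_analysis_task('uat_spa_task', 'TEXT') FROM dual;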

Well, out of all the above, there are two things I would do for sure:

1) Memory Advisories

http://db.geeksinsight.com/tools-scripts/memory-advisories/

This should be run when you have a peak load, or you can get the same information from the AWR report. With this we can find whether the SGA is adequate and whether any increase in SGA would help reduce physical reads or buffer gets.

We have SGA, buffer cache, PGA, shared pool, and streams pool advisories, etc.
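
For example, the SGA and PGA advisories can be read directly from the advisory views. A minimal sketch; run it during or just after the peak load so the estimates are meaningful:

    -- SGA advice: estimated DB time and physical reads at candidate SGA sizes
    SELECT sga_size, sga_size_factor, estd_db_time, estd_physical_reads
      FROM v$sga_target_advice
     ORDER BY sga_size_factor;

    -- PGA advice: estimated cache hit % and over-allocations at candidate targets
    SELECT pga_target_for_estimate/1024/1024 AS pga_target_mb,
           pga_target_factor,
           estd_pga_cache_hit_percentage,
           estd_overalloc_count
      FROM v$pga_target_advice
     ORDER BY pga_target_factor;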

2) Analyze the database SQL performance, as it is a new design (probably a prototype product database), using the SQL Tuning Advisor or the SQL Access Advisor.

Basically, the SQL Tuning Advisor can run for a single statement or a set of SQL statements and gives you recommendations to create indexes, add parallelism, gather statistics, or create profiles, and shows you the expected benefit of those recommendations once implemented.

The SQL Access Advisor, in contrast, does not give you that; it mainly shows recommendations to create indexes or materialized views, without even the percentage benefit you would get.

Let's check how to use STA/SAA (thanks, Shanawaz, for providing these steps).

Steps here: http://db.geeksinsight.com/tools-scripts/sql-tuning-advisory-sql-access-advisory-steps/
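
In short, the flow for a single statement looks like this. A minimal sketch; the sql_id and task name are placeholders for your own values:

    DECLARE
      l_task VARCHAR2(128);
    BEGIN
      -- 'abcd1234wxyz9' is a placeholder sql_id taken from the UAT workload
      l_task := DBMS_SQLTUNE.create_tuning_task(
                  sql_id      => 'abcd1234wxyz9',
                  time_limit  => 600,
                  task_name   => 'uat_sta_task',
                  description => 'STA check of one UAT statement');
      DBMS_SQLTUNE.execute_tuning_task(task_name => l_task);
    END;
    /
    -- Recommendations: indexes, profiles, stale statistics, restructuring advice
    SELECT DBMS_SQLTUNE.report_tuning_task('uat_sta_task') FROM dual;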

If you look at the outputs provided, more than 149 statements were analyzed, 40+ index creation recommendations were provided, gathering statistics was recommended for 60+ objects, and some statements errored out.

With this you can analyze how the queries are performing from the optimizer's perspective, not just from the database/developer perspective, since the developer may not know how the optimizer chooses its plan based on the best available statistics.

-Geek DBA

9 comments to Quick Question: Check for performance issues

  • As the database is new with a low volume of data, we cannot drill down much, but the view v$sysmetric gives the system metric values and v$system_event gives the wait events.
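
    For example, a quick look at both views (a minimal sketch):

      -- System metrics over the last 60-second interval
      SELECT metric_name, ROUND(value, 2) AS value, metric_unit
        FROM v$sysmetric
       WHERE group_id = 2   -- long-duration (60s) metric group
       ORDER BY metric_name;

      -- Top 10 non-idle waits since instance startup
      SELECT *
        FROM (SELECT event, total_waits,
                     ROUND(time_waited_micro / 1e6, 1) AS seconds_waited
                FROM v$system_event
               WHERE wait_class <> 'Idle'
               ORDER BY time_waited_micro DESC)
       WHERE ROWNUM <= 10;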

  • Reblogged this on Geek DBA and commented:

    Updated answers

  • To what extent can we rely on AWR reports when the volume of data is low, and how can we rely on buffer cache advisories when the data has only just come in? Unless and until a whole cycle of operation is completed, the AWR reports do not shed much light on performance.

    • Regarding advisories, that's why I said to run them during the testing or peak period of the database; you probably have to sit with the developers, analyze the database over a period of time, and graph it. Even for this I would rather depend on AWR than write my own tables and insert data into them daily.

      Secondly, AWR reports: again, the same applies; generate them for the period when there is a load/stress test, which may give clues about your database performance. In general you generate an AWR report when asked about a performance issue or when you want to look at database performance over a period of time.

      Thirdly, tuning advisories will be very useful in this case because the database is relatively new (a prototype database for a new product); the developers will not be sure how their queries performed, or even how their physical schema design behaves with the optimizer. The tuning advisor gives very important information on indexes, statistics, profiles, etc., which can be put in place and tested again.

      Further, the approach may vary; there is no rule-of-thumb approach for performance tuning, and it will likely vary depending on the situation.

  • Your approach is quite right when the DBA is involved during the testing/designing.
    Suppose the DBA is not involved in testing/design and is given a brand new database with a newly designed application, new tables, and a small volume of data, and is asked to verify the database performance; then what would you check?

    • Still I will stick to the above, because the tools are no different whether the DBA is closely monitoring or not. I will rely on the historical views rather than current system statistics, which are not accurate enough for me right now. I will ask the application team to tell me about the timings, etc., or just gather a set of SQL IDs, say for a day, and run the tuning advisor (a sketch of building such a SQL tuning set follows this reply); what else can we do in this case, right?

      Upon receiving the recommendations, as another blog follower said, we can also review parameters like cursor_sharing and other optimizer settings, depending on the behaviour of the application.

      -Thanks
      Geek DBA
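
      A minimal sketch of loading a day's worth of SQL from AWR into a tuning set; the tuning-set name, the snapshot IDs (100..124), and the APPUSER schema filter are placeholders:

        DECLARE
          l_cur DBMS_SQLTUNE.sqlset_cursor;
        BEGIN
          DBMS_SQLTUNE.create_sqlset(
            sqlset_name => 'UAT_DAY_STS',
            description => 'Top SQL captured for one UAT day');
          -- Snapshots 100..124 are placeholders bracketing the day of interest
          OPEN l_cur FOR
            SELECT VALUE(p)
              FROM TABLE(DBMS_SQLTUNE.select_workload_repository(
                           begin_snap   => 100,
                           end_snap     => 124,
                           basic_filter => 'parsing_schema_name = ''APPUSER''')) p;
          DBMS_SQLTUNE.load_sqlset(sqlset_name     => 'UAT_DAY_STS',
                                   populate_cursor => l_cur);
        END;
        /
        -- The set can then be fed to the advisor:
        -- DBMS_SQLTUNE.create_tuning_task(sqlset_name => 'UAT_DAY_STS', ...)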

  • vijay

    We can add: 1) a review of the parameter file; 2) do we have shareable SQLs (bind variables) for the given load? Going forward, as the load increases, the shared pool will be brought to its knees if this is ignored (a sketch of such a check follows below); 3) check disk rates for the current load from AWR and see if there is any deviation from standard numbers (what are those? I am really confused when I talk about it and about the different parameters used to measure disk performance); 4) the AWR for the current load will point us to current bottlenecks, and these may pose bigger problems as load increases.

    Maybe more to add… I will add after more discussions with my colleagues.
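
    For point 2, one way to spot candidates is v$sql: statements that differ only in their literals share a FORCE_MATCHING_SIGNATURE. A minimal sketch; the threshold of 10 copies is an arbitrary assumption:

      SELECT force_matching_signature,
             COUNT(*)    AS versions,
             MIN(sql_id) AS sample_sql_id
        FROM v$sql
       WHERE force_matching_signature <> 0
         AND force_matching_signature <> exact_matching_signature
       GROUP BY force_matching_signature
      HAVING COUNT(*) > 10
       ORDER BY versions DESC;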

    • Yes Vijay,

      There are quite a number of things we need to do, but before that I want to find out whether my SQL statements are good enough; then I will look at parameters, optimizer changes, parallelism, and I/O (again, this is less relevant here since the production environment may be different, but we can keep baseline I/O statistics to compare against at a later stage).

      -Geek DBA