Re: Force specific plan to be used

From: Mladen Gogala <gogala.mladen_at_gmail.com>
Date: Tue, 5 Nov 2019 22:38:13 -0500
Message-ID: <bf59cbb0-a7d8-ed46-7357-80572c77f321_at_gmail.com>



On 10/31/19 2:36 PM, Chris Taylor wrote:
> That literally cannot make it better if the plans are identical. :/
>
> Its more likely the results are cached or other executions experienced
> some type of wait event that slowed them down.
>
> You can look at DBA_HIST_ACTIVE_SESS_HISTORY for the SQL_IDs and
> session_id/session_serial#s and look at EVENT, WAIT_CLASS, WAIT_TIME,
> SESSION_STATE, BLOCKING_SESSION_STATUS, BLOCKING_SESSION,
> BLOCKING_SESSION_SERIAL#.
>
> It could ALSO be DYNAMIC SAMPLING slowing it down - do you have stats
> on the tables involved in the query and how old are they (are they
> stale)?  What is your optimizer_dynamic_sampling parameter set to?
>
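For reference, the ASH lookup Chris describes might look something like this. This is a sketch: the SQL_ID is a placeholder you'd substitute, and querying the DBA_HIST_* views requires the Diagnostics Pack license.

```sql
-- Pull the wait picture for one statement out of the ASH history.
-- '&sql_id' is a substitution variable; supply the SQL_ID in question.
SELECT sample_time,
       session_id,
       session_serial#,
       session_state,
       event,
       wait_class,
       wait_time,
       blocking_session_status,
       blocking_session,
       blocking_session_serial#
  FROM dba_hist_active_sess_history
 WHERE sql_id = '&sql_id'
 ORDER BY sample_time;
```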

Speaking of collecting statistics and dynamic sampling, those are two different methods of achieving the same goal: giving the optimizer enough data to optimize your SQL. Gathering stats using DBMS_STATS is horrendously expensive and time consuming. DBMS_STATS.AUTO_SAMPLE_SIZE means that the optimizer will use the new algorithm for the number of distinct values (NDV) and that it will read the entire table. Now, I work for a company which occasionally encounters tables larger than 1 TB. Fancy reading the entire table to collect stats!

Of course, the table is partitioned by a date column, and when the optimizer figures out that the maximum date in the table is larger than the maximum date in the histogram, it stops using the histogram. You have to redo your stats all over again. You can deal with this calamity in several ways:

  • Decide that the benefits of the new NDV algorithm are not worth the time and resources needed for the FTS of a ginormous table, and revert to an estimate percent of 1 with block sampling.
  • Drop all statistics on the table and allow dynamic sampling to take its course.
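The two options above can be sketched with DBMS_STATS. The table name BIG_TABLE is hypothetical; the procedures and parameters (GATHER_TABLE_STATS, DELETE_TABLE_STATS, LOCK_TABLE_STATS, ESTIMATE_PERCENT, BLOCK_SAMPLE) are the standard DBMS_STATS ones.

```sql
-- Option 1: cheap stats, a 1 percent estimate with block sampling
-- instead of AUTO_SAMPLE_SIZE's full scan.
BEGIN
  dbms_stats.gather_table_stats(
    ownname          => user,
    tabname          => 'BIG_TABLE',   -- hypothetical table
    estimate_percent => 1,
    block_sample     => TRUE);
END;
/

-- Option 2: drop the stats and lock them empty, so the nightly
-- auto-stats job doesn't regather and dynamic sampling takes over.
BEGIN
  dbms_stats.delete_table_stats(user, 'BIG_TABLE');
  dbms_stats.lock_table_stats(user, 'BIG_TABLE');
END;
/
```

Locking the deleted stats is the part people forget; without it the maintenance window quietly puts the statistics back.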

I had mostly been using smaller estimate percents until approximately a year ago, when I tested the dynamic sampling method. To my immense surprise, only a few statements needed manual hinting. An optimizer_dynamic_sampling setting of 11 is pretty good. There was a bad bug in 12.1.0.2 that caused latch contention and essentially killed the system, but as of 12.2 I have more confidence in it. I hope that in the not so distant future, optimizer dynamic sampling will replace DBMS_STATS the same way that DBMS_STATS replaced "ANALYZE TABLE ESTIMATE STATISTICS SAMPLE X PERCENT". Essentially, dynamic sampling is a sort of autopilot: you don't really know what it's doing, but you don't care as long as the results are good.
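For what it's worth, level 11 can be set session-wide or per statement. The table in the hinted query is a made-up example; the parameter and the DYNAMIC_SAMPLING hint are standard.

```sql
-- Session-wide: level 11 means "automatic", i.e. the optimizer
-- decides when dynamic sampling is worthwhile.
ALTER SESSION SET optimizer_dynamic_sampling = 11;

-- Or per statement, with the cursor-level hint form:
SELECT /*+ dynamic_sampling(11) */ COUNT(*)
  FROM big_table;   -- hypothetical table
```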

-- 
Mladen Gogala
Database Consultant
Tel: (347) 321-1217


--
http://www.freelists.org/webpage/oracle-l
Received on Wed Nov 06 2019 - 04:38:13 CET