I have a job setup to run a procedure in a database. The user account
sa is the owner. The actual database resides on an NT Server 4.0
Terminal Services PC using SQL Server 7.
The first time I scheduled it, the job took approx 6 minutes to
complete, and subsequent runs averaged approx 6 minutes.
I realised that I was running SQL version 7.00.623 i.e. SQL Server 7
with no Service Packs applied. So I applied the Service Packs to the
environment and then restored the database I was using.
Running the same job again, which executes the procedure, took approx 20
minutes, and subsequent runs averaged approx 20 minutes.
I need to find out what difference the Service Packs make to the SQL
environment as this is the only thing that has changed. I'm not saying
this is a SQL issue as it may be that the procedure needs amending in
some way to make it run more efficiently.
Any feedback or advice would be gratefully received.

Try updating statistics on your tables, and see if that makes a difference.
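A rough sketch of what I mean (the database and table names here are placeholders, not taken from your post):

```sql
-- Refresh distribution statistics for a single table,
-- then for every user table in the current database.
USE MyDatabase          -- placeholder database name
GO
UPDATE STATISTICS MyTable   -- placeholder table; rebuilds stats for all its indexes
GO
EXEC sp_updatestats         -- runs UPDATE STATISTICS against all user tables
GO
```

On SQL Server 7, sp_updatestats is the quickest way to hit every table at once; UPDATE STATISTICS on a named table gives you finer control.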
Just a stab in the dark.
Wayne Snyder, MCDBA, SQL Server MVP
Mariner, Charlotte, NC
www.mariner-usa.com
(Please respond only to the newsgroups.)
I support the Professional Association of SQL Server (PASS) and its
community of SQL Server professionals.
www.sqlpass.org
"rzirpolo" <robert.zirpolo@.phoenixdb.co.uk> wrote in message
news:5320d330.0408050041.5c815b24@.posting.google.com...
> <snip>
> Any feedback or advice would be gratefully received.

Looks a very good stab in the dark, thank you.
Upon enabling the job again the job has now reduced back down to 6
minutes.
I'm not too clued up on the SQL Server admin side, and am wondering what
exactly these statistics are and what this procedure did, i.e. what
does updating them do?
*** Sent via Developersdex http://www.codecomments.com ***
Don't just participate in USENET...get rewarded for it!

From SQL 2000 BOL:
"The distribution statistics are used by Microsoft SQL Server™ to optimize
navigation through tables during the processing of Transact-SQL statements.
To build the distribution statistics automatically, SQL Server periodically
samples a percentage of the data in the corresponding table for each index."
From "Distribution Statistics" in SQL 2000 BOL:
"All indexes have distribution statistics that describe the selectivity and
distribution of the key values in the index. Selectivity is a property that
relates to how many rows are typically identified by a key value. A unique
key has high selectivity; a key value found in 1,000 rows has poor
selectivity. The selectivity and distribution statistics are used by
Microsoft SQL Server™ 2000 to optimize its navigation through tables and
indexed views when processing Transact-SQL statements. The distribution
statistics are used to estimate how efficient an index would be in
retrieving data associated with a key value or range specified in the query.
The statistics for each index are not limited to a single page but are
stored as a long string of bits across multiple pages in the same way image
data is stored. The column sysindexes.statblob points to this distribution
data. You can use the DBCC SHOW_STATISTICS statement to get a report on the
distribution statistics for an index.
Distribution statistics may also be maintained for unindexed columns. These
can be defined manually using the CREATE STATISTICS statement or created
automatically by the query optimizer. Statistics on unindexed columns count
against the limit of 249 nonclustered indexes allowed on a table.
To be useful to the query optimizer, distribution statistics must be kept
reasonably current. The distribution statistics should be refreshed anytime
significant numbers of changes to keys occur in the index. Distribution
statistics can be updated manually using the UPDATE STATISTICS statement.
SQL Server 2000 can also detect when distribution statistics are out of date
and update the statistics automatically. This update is performed by the
task that detected that the statistics needed to be updated. The update is
performed using a complex sampling method that minimizes the effect of the
update on transaction throughput."
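To tie the quoted passage to concrete statements, something like the following works on both 7.0 and 2000 (the table, index, and column names are placeholders):

```sql
-- Placeholder names: an Orders table with index IX_Orders_CustomerID.

-- Report the distribution statistics stored for one index:
DBCC SHOW_STATISTICS ('Orders', 'IX_Orders_CustomerID')

-- Create column statistics manually on an unindexed column:
CREATE STATISTICS st_Orders_ShipCity ON Orders (ShipCity)

-- Refresh the table's statistics after significant data changes:
UPDATE STATISTICS Orders
```

If you want the automatic maintenance described above, check that the database options are on, e.g. EXEC sp_dboption 'MyDatabase', 'auto update statistics', 'true' (and likewise 'auto create statistics').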
Applies to SQL 7 and 2000 alike.
--
Mike Epprecht, Microsoft SQL Server MVP
Johannesburg, South Africa
Mobile: +27-82-552-0268
IM: mike@.epprecht.net
MVP Program: http://www.microsoft.com/mvp
Blog: http://www.msmvps.com/epprecht/
"microsoft.public.sqlserver.server" <robert.zirpolo@.phoenixdb.co.uk> wrote
in message news:OHmihz7eEHA.2916@.TK2MSFTNGP12.phx.gbl...
> <snip>