Adam Thurgar

Statistics sampling


DBAs and developers know the importance of having up-to-date statistics to ensure that query performance is optimal. Most of the time the statistics sampling percentage is left at the default.

To understand how the default sampling rate is calculated, see the following:

https://blogs.msdn.microsoft.com/srgolla/2012/09/04/sql-server-statistics-explained/
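If you want to see what sample rate your existing statistics were last updated with, you can query sys.dm_db_stats_properties (available from SQL Server 2008 R2 SP2 and 2012 SP1 onwards). A minimal sketch, assuming a hypothetical table named dbo.Orders:

    -- Compare rows sampled against total rows for each statistic
    -- on the (hypothetical) dbo.Orders table.
    SELECT s.name AS stats_name,
           sp.last_updated,
           sp.rows,
           sp.rows_sampled,
           CAST(100.0 * sp.rows_sampled / NULLIF(sp.rows, 0)
                AS decimal(5, 2)) AS sample_pct
    FROM sys.stats AS s
    CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
    WHERE s.object_id = OBJECT_ID('dbo.Orders');

A low sample_pct on a large table is a candidate for the FULLSCAN approach discussed below.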

We don't usually change this, but there may be circumstances that benefit from doing a FULLSCAN (especially if you have a maintenance window available). For large or heavily used tables this can improve query performance, but remember that moving away from the default sample will take longer and use more resources to update the statistics.
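As a sketch, updating statistics with a full scan looks like this (again using the hypothetical dbo.Orders table and a hypothetical index name):

    -- Update every statistic on the table by reading all rows,
    -- instead of the default sample.
    UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

    -- Or target a single statistic if only one needs the full scan.
    UPDATE STATISTICS dbo.Orders IX_Orders_OrderDate WITH FULLSCAN;

If a full scan is too expensive, WITH SAMPLE n PERCENT offers a middle ground between the default and FULLSCAN.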

Read more here:

https://docs.microsoft.com/en-us/sql/t-sql/statements/update-statistics-transact-sql?view=sql-server-2017

