Database Optimization

Overview

To maintain healthy database performance, or if your database is experiencing slow response times due to heavy usage, we suggest adding a SQL Server scheduled job (i.e. an automated task) that defragments the database table indexes and shrinks the SQL log file to free up disk space.

Applies to

  • SQL Server

Procedure

  1. In the Object Explorer of SQL Server Management Studio (SSMS), expand the SQL Server Agent section.

  2. Right-click the Jobs folder and choose New Job…

  3. Configure the new job’s General tab:

    • Insert a representative name such as Defrag and shrink log.
    • Leave the Owner and Category as-is.
    • Insert a representative description, such as Defrag the WorkflowGen database and shrink the WorkflowGen database log.
  4. After setting the general configuration, go to the Steps page and click New…

  5. On the General tab:

    • Give the step a name (e.g. Launch defrag).
    • Leave the Type value set to Transact-SQL script (T-SQL).
    • Leave Run as empty.
    • Select the database you want the job to run on.
    • Insert the following command:
    DECLARE @TableName nvarchar(255), @SqlQuery nvarchar(max)

    -- Start with the first table (alphabetically) in the database
    SELECT @TableName = MIN(TABLE_NAME) FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'

    WHILE @TableName IS NOT NULL
    BEGIN
        -- Reorganize all indexes on the current table
        SET @SqlQuery = 'ALTER INDEX ALL ON [' + @TableName + '] REORGANIZE;'
        EXEC sp_executesql @SqlQuery
        -- Move on to the next table
        SELECT @TableName = MIN(TABLE_NAME) FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_NAME > @TableName
    END
    
  6. On the Advanced tab:

    • For Success action, select the Go to the next step option.
    • Retry attempts and Retry interval can be set to 0.
    • For Failure action, select Quit the job reporting failure.
    • The rest can be left empty.
  7. On the Steps page, add a second step.

  8. On the General tab:

    • Give the step a name (e.g. Shrink log).
    • Leave Type set to Transact-SQL script (T-SQL).
    • Leave Run as empty.
    • Select the database you want the job to run on.
    • Insert the following command:
    DBCC SHRINKFILE (WFGEN_Log, 1)
    
  9. On the Advanced tab:

    • For Success action, select the Quit the job reporting success option.
    • Retry attempts and Retry interval can be set to 0.
    • For Failure action, select Quit the job reporting failure.
    • The rest can be left empty.
  10. Go to the Schedules page and click New…, then configure as follows:

    • Insert a representative name for this new schedule.
    • Make sure the Schedule type is set to Recurring and Enabled is checked.
    • We recommend setting Frequency to Daily and Recurs every to a value between 1 and 3 days, depending on the usage volume of the database. (The higher the volume, the shorter the interval should be.)
    • We recommend setting Daily frequency to Occurs once at a time outside of business hours.
    • We recommend setting Duration to No end date.
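
The Shrink log step above assumes that WFGEN_Log is the logical name of the database's transaction log file (the WorkflowGen default). If you are unsure of the logical name in your environment, you can look it up with a query such as the following, run against the target database:

```sql
-- List the logical and physical file names of the current database;
-- type_desc = 'LOG' identifies the transaction log file
SELECT name AS logical_name, physical_name, type_desc
FROM sys.database_files;
```

Use the value from the name column of the LOG row as the first argument to DBCC SHRINKFILE.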
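
If you prefer to script the schedule instead of configuring it in the SSMS dialog, the same recurring schedule can be sketched with the msdb.dbo.sp_add_jobschedule stored procedure. The job name, schedule name, and start time below are assumptions; adjust them to match your job:

```sql
-- Attach a daily recurring schedule to the job created above.
-- @freq_type = 4 means daily; @freq_interval = 1 means every 1 day;
-- @active_start_time = 020000 runs the job at 2:00 AM (outside business hours).
EXEC msdb.dbo.sp_add_jobschedule
    @job_name = N'Defrag and shrink log',
    @name = N'Daily maintenance',
    @enabled = 1,
    @freq_type = 4,
    @freq_interval = 1,
    @active_start_time = 020000;
```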

Summary of the scheduled job

I see your defrag command is using REORGANIZE vs REBUILD. I’m not a DBA but is there a preference between the two?

Hello, hcampeau,

The reason we are using REORGANIZE over REBUILD is mostly to minimize downtime.

REORGANIZE is a lightweight operation that doesn't require a schema modification lock and can be used no matter which SQL Server edition you have. If it is interrupted or stopped, it simply stops reorganizing at the point it had reached, without rolling back.

REBUILD is a heavier operation and also has restrictions depending on the SQL Server edition. If the rebuild is stopped, it rolls back, which can impact performance and downtime.

  • With the Standard edition, the rebuild is done as an offline operation and is single-threaded.
  • With the Enterprise edition, the rebuild can be done online, but it uses a schema modification lock, which can impact the downtime of the SQL server.

You may decide to use REBUILD instead of REORGANIZE, but do take into consideration the different impacts.
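
If you do opt for REBUILD, the loop in step 5 only needs its ALTER INDEX command changed. As a sketch (the ONLINE = ON option requires Enterprise edition, as noted above):

```sql
-- Enterprise edition: rebuild all indexes on the current table online to reduce blocking
SET @SqlQuery = 'ALTER INDEX ALL ON [' + @TableName + '] REBUILD WITH (ONLINE = ON);'
-- Standard edition: the rebuild runs as an offline operation
-- SET @SqlQuery = 'ALTER INDEX ALL ON [' + @TableName + '] REBUILD;'
```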

Regards,
Lynn