Category Archives: tsql

How to write T-SQL Geography data to a table

Below is some example code for writing the SQL Server geography data type to a table. Note that by default geography data is stored in a binary format, but it can be converted to a string to make it human readable. Also note that in the well-known text (WKT) POINT syntax the longitude comes first, then the latitude.

/*Demo of geo data*/
DECLARE @g GEOGRAPHY;

SET @g = GEOGRAPHY::STPointFromText('POINT(-6.611670 53.578741)', 4326);

/*Geography data is in binary format*/
SELECT @g AS 'GeoBinaryFormat';

/*Convert binary data to a string*/
SELECT @g.ToString() AS 'ConvertingDataToString';


/*Inserting geo data into Table*/
CREATE TABLE #GeoTest ([CoordinateLocation] [geography] NULL);

INSERT INTO #GeoTest (CoordinateLocation)
SELECT GEOGRAPHY::STPointFromText('POINT(-6.611670 53.578741)', 4326);

SELECT *
FROM #GeoTest;
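
/*Optional extra (not in the original post): read the value back
from the table as a human-readable string*/
SELECT CoordinateLocation.ToString() AS 'ReadableLocation'
FROM #GeoTest;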

DROP TABLE #GeoTest;

How to sum time with T-SQL

The time data type cannot be summed directly in T-SQL. To sum times they first need to be given a date: when a time value is cast to datetime, because time has no date element, the date defaults to 1900-01-01.

Since every value then shares the same base date, the number of milliseconds between that base date and each datetime value represents just the time portion. Summing those milliseconds and adding the total back onto the base date gives the summed time, which is what the example below does.

Below is example T-SQL:

IF OBJECT_ID('tempdb..#TimeTable', 'U') IS NOT NULL
BEGIN
	DROP TABLE #TimeTable;
END

CREATE TABLE #TimeTable(
	id INT
	,TimeRecord TIME(0)
	);

INSERT INTO #TimeTable
VALUES (
	1
	,'00:00:10'
	);

INSERT INTO #TimeTable
VALUES (
	1
	,'00:14:00'
	);

INSERT INTO #TimeTable
VALUES (
	2
	,'00:00:10'
	);

INSERT INTO #TimeTable
VALUES (
	2
	,'00:35:10'
	);

SELECT id
,TimeRecord
FROM #TimeTable;

/*demo of time converted to datetime*/
SELECT CAST(TimeRecord AS DATETIME) AS DateTimeRecord
FROM #TimeTable

SELECT id
	,CAST(DATEADD(MILLISECOND, SUM(DATEDIFF(MILLISECOND, 0, CAST(TimeRecord AS DATETIME))), 0) AS TIME(0)) AS SummedTime
FROM #TimeTable
GROUP BY id;
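
With the sample rows above, the final query should return a summed time of 00:14:10 for id 1 and 00:35:20 for id 2.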

How to get a substring between two characters with T-SQL

This is a very common activity in the data world: there is some data in a text string that you need, and the rest of the string is just in your way. Some use cases might be a reference in a filename you need to extract, a snippet of data needed to create a composite key, or an order number surrounded by other data that is not relevant to your needs.

The following is some simple T-SQL that will extract the data you want from a text string, provided the data has specific delimiting characters on each side of it.

/*Delimiter variables, first and second position*/
DECLARE @dfp AS CHAR(1);
DECLARE @dsp AS CHAR(1);
DECLARE @text VARCHAR(MAX);

SET @dfp = ';';
SET @dsp = '@';
SET @text = 'I want you to ;Extract this@ substring for me please.';

/*Return the text between the first and second delimiters (delimiters excluded)*/
SELECT SUBSTRING(@text, CHARINDEX(@dfp, @text) + 1, CHARINDEX(@dsp, @text) - CHARINDEX(@dfp, @text) - 1) AS ExtractedText;
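
For the sample string above the query returns “Extract this”, i.e. the text between the ‘;’ and ‘@’ delimiters with the delimiters themselves excluded.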

How to pick random numbers between two numbers with T-SQL

Below is a T-SQL example that will pick a random number between 1 and 50.

SELECT CAST(RAND() * (51 - 1) + 1 AS INT) AS Random#

That’s a bit boring though.
What about parameters?
What about a use case?
Where’s the familiar glamour of coding with T-SQL?

I hear ya.

Below is a T-SQL example that could make you a multimillionaire!

This T-SQL code will pick random numbers for the Euromillions lottery.

Good luck.

DECLARE @a INT;
DECLARE @b INT;
DECLARE @count INT;
DECLARE @pick INT;

DROP TABLE IF EXISTS #Num;

CREATE TABLE #Num (
	Number INT
	,NumberType VARCHAR(255)
	);

SET @a = 1
SET @b = 51
SET @count = 1

WHILE @count < 6
BEGIN
	SET @pick = CAST(RAND() * (@b - @a) + @a AS INT)

	IF (
			SELECT Number
			FROM #Num
			WHERE Number = @pick
			AND NumberType = 'Main'
			) IS NULL
	BEGIN
		INSERT INTO #Num (
			Number
			,NumberType
			)
		SELECT @pick
			,'Main';

		SET @count = @count + 1
	END
END

SET @a = 1
SET @b = 13
SET @count = 1

WHILE @count < 3
BEGIN
	SET @pick = CAST(RAND() * (@b - @a) + @a AS INT)

	IF (
			SELECT Number
			FROM #Num
			WHERE Number = @pick
			AND NumberType = 'Lucky'
			) IS NULL
	BEGIN
		INSERT INTO #Num (
			Number
			,NumberType
			)
		SELECT @pick
			,'Lucky';

		SET @count = @count + 1
	END
END

SELECT Number
	,NumberType
FROM #Num
ORDER BY NumberType DESC
,Number ASC;
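
Note the range arithmetic: CAST(RAND() * (@b - @a) + @a AS INT) with @a = 1 and @b = 51 produces whole numbers from 1 to 50 for the main numbers, and with @b = 13 produces 1 to 12 for the Lucky Stars, matching the EuroMillions ranges.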

If you found this post helpful please like/share/subscribe.


How to create a job that will test whether SQL Server database mail is working

The following script will create a job that runs every minute to test whether database mail can be sent from a job scheduled by the SQL Server Agent.

Simply find and replace the email address below with the email address you want to target:

testoperator@mail.com

Then run the script.

The operator ‘Test Operator’ and job ‘MailTest’ will be created.

The job is disabled by default, enable it to begin testing.

When you are finished run the commented out section at the bottom of the script to remove the test operator and job.

If you have just set up database mail for the first time, the SQL Server Agent will need to be restarted.
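
Separately from the job, you can sanity check Database Mail directly with sp_send_dbmail. Below is a minimal sketch; it assumes a Database Mail profile already exists and the profile name shown is just a placeholder.

/*Quick manual Database Mail test (the profile name is a placeholder)*/
EXEC msdb.dbo.sp_send_dbmail @profile_name = N'YourMailProfile'
	,@recipients = N'testoperator@mail.com'
	,@subject = N'Database Mail manual test'
	,@body = N'If you can read this, Database Mail is working.';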

/*
FIND AND REPLACE

testoperator@mail.com

*/
USE msdb;
GO

EXEC dbo.sp_add_operator @name = N'Test Operator'
	,@enabled = 1
	,@email_address = N'testoperator@mail.com'
GO

USE [msdb]
GO

BEGIN TRANSACTION

DECLARE @ReturnCode INT

SELECT @ReturnCode = 0

/****** Object:  JobCategory [[Uncategorized (Local)]]    Script Date: 31/07/2019 11:35:43 ******/
IF NOT EXISTS (
		SELECT NAME
		FROM msdb.dbo.syscategories
		WHERE NAME = N'[Uncategorized (Local)]'
			AND category_class = 1
		)
BEGIN
	EXEC @ReturnCode = msdb.dbo.sp_add_category @class = N'JOB'
		,@type = N'LOCAL'
		,@name = N'[Uncategorized (Local)]'

	IF (
			@@ERROR <> 0
			OR @ReturnCode <> 0
			)
		GOTO QuitWithRollback
END

DECLARE @jobId BINARY (16)

EXEC @ReturnCode = msdb.dbo.sp_add_job @job_name = N'MailTest'
	,@enabled = 0
	,@notify_level_eventlog = 0
	,@notify_level_email = 3
	,@notify_level_netsend = 0
	,@notify_level_page = 0
	,@delete_level = 0
	,@description = N'No description available.'
	,@category_name = N'[Uncategorized (Local)]'
	,@owner_login_name = N'sa'
	,@notify_email_operator_name = N'Test Operator'
	,@job_id = @jobId OUTPUT

IF (
		@@ERROR <> 0
		OR @ReturnCode <> 0
		)
	GOTO QuitWithRollback

/****** Object:  Step [Step 1]    Script Date: 31/07/2019 11:35:44 ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id = @jobId
	,@step_name = N'Step 1'
	,@step_id = 1
	,@cmdexec_success_code = 0
	,@on_success_action = 1
	,@on_success_step_id = 0
	,@on_fail_action = 2
	,@on_fail_step_id = 0
	,@retry_attempts = 0
	,@retry_interval = 0
	,@os_run_priority = 0
	,@subsystem = N'TSQL'
	,@command = N'SELECT 1'
	,@database_name = N'master'
	,@flags = 0

IF (
		@@ERROR <> 0
		OR @ReturnCode <> 0
		)
	GOTO QuitWithRollback

EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId
	,@start_step_id = 1

IF (
		@@ERROR <> 0
		OR @ReturnCode <> 0
		)
	GOTO QuitWithRollback

EXEC @ReturnCode = msdb.dbo.sp_add_jobschedule @job_id = @jobId
	,@name = N'Job Schedule'
	,@enabled = 1
	,@freq_type = 4
	,@freq_interval = 1
	,@freq_subday_type = 4
	,@freq_subday_interval = 1
	,@freq_relative_interval = 0
	,@freq_recurrence_factor = 0
	,@active_start_date = 20190731
	,@active_end_date = 99991231
	,@active_start_time = 0
	,@active_end_time = 235959
	,@schedule_uid = N'f0741db6-488e-44da-8f5e-a3f0ed13835e'

IF (
		@@ERROR <> 0
		OR @ReturnCode <> 0
		)
	GOTO QuitWithRollback

EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId
	,@server_name = N'(local)'

IF (
		@@ERROR <> 0
		OR @ReturnCode <> 0
		)
	GOTO QuitWithRollback

COMMIT TRANSACTION

GOTO EndSave

QuitWithRollback:

IF (@@TRANCOUNT > 0)
	ROLLBACK TRANSACTION

EndSave:
GO

/*
REMOVE OPERATOR AND JOB
*/
/*
USE msdb;
GO

EXEC sp_delete_operator @name = 'Test Operator';

EXEC sp_delete_job @job_name = N'MailTest';

GO
*/

How to get SQL Server Network Information using SSMS

The following code will work for a remote client request to SQL Server 2008 and newer.

Note: the local machine address (local_net_address) is that of the SQL Server, while client_net_address is the address of the remote computer you have used to make the request.

SELECT @@SERVERNAME AS ServerName
	,CONNECTIONPROPERTY('net_transport') AS net_transport
	,CONNECTIONPROPERTY('protocol_type') AS protocol_type
	,CONNECTIONPROPERTY('auth_scheme') AS auth_scheme
	,CONNECTIONPROPERTY('local_net_address') AS local_net_address
	,CONNECTIONPROPERTY('local_tcp_port') AS local_tcp_port
	,CONNECTIONPROPERTY('client_net_address') AS client_net_address
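
For context, auth_scheme will return SQL for SQL Server authentication or NTLM/KERBEROS for Windows authentication, and net_transport will typically show TCP for remote connections.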

How to find Missing Indexes for all databases in a SQL Server instance

This script is for SQL Server 2005 and up. The script will return all the missing indexes for a SQL Server instance, rating their impact and providing the T-SQL to create them.

SQL Server 2005 was the first version of SQL Server to add DMVs (Dynamic Management Views) and DMOs (Dynamic Management Objects), which this script requires to function.
DMVs & DMOs provide useful information about SQL Server such as expensive queries, wait types, missing indexes etc.

WARNING!
Before you create the missing indexes on the referenced tables you must consider the following essential points:
• Find and assess all the queries that are using the referenced table. If the table has a heavy amount of Data Manipulation Language (DML) operations against it (SELECT, INSERT, UPDATE, or DELETE) then you must analyse what impact adding the missing index will have before you create it on the table. For example, INSERTs on tables are slowed down by nonclustered indexes.
• You need to make sure that by creating the missing indexes you are not going to end up with duplicate indexes on tables. Duplicate or unwanted indexes can kill your database performance. Search for the blog “over-indexing can hurt your SQL Server performance” for more information.
• If there is already an existing index that has most of the columns of the highlighted missing index, you should consider adding the missing columns to that index rather than creating another index on the table (a query to list a table’s existing indexes is sketched after this list). FYI, making an index wider does not mean adding all columns from the table into the current index.
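
To help with the last two points, below is a minimal sketch that lists the indexes already on a table before you create anything new; the schema and table name used are placeholders.

/*List existing indexes and their columns for a table (schema and table names are placeholders)*/
SELECT i.[name] AS IndexName
	,i.type_desc AS IndexType
	,c.[name] AS ColumnName
	,ic.key_ordinal AS KeyOrdinal
	,ic.is_included_column AS IsIncluded
FROM sys.indexes AS i
INNER JOIN sys.index_columns AS ic ON i.[object_id] = ic.[object_id]
	AND i.index_id = ic.index_id
INNER JOIN sys.columns AS c ON ic.[object_id] = c.[object_id]
	AND ic.column_id = c.column_id
WHERE i.[object_id] = OBJECT_ID('dbo.YourTableName')
ORDER BY i.[name]
	,ic.is_included_column
	,ic.key_ordinal;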

/*Script to find Missing Indexes for all databases in SQL Server*/
/*
This script is for SQL Server 2005 and up. 
The script will return all the missing indexes for a SQL Server instance, rating their impact 
and provide the T-SQL to create the missing indexes.

SQL Server 2005 was the first version of SQL Server to add DMVs (Dynamic Management Views) 
and DMOs (Dynamic Management Objects) which this script requires to function. 
DMVs & DMOs provide useful information about SQL Server like expensive queries, wait types, missing indexes etc.

WARNING!
Before you create the missing indexes on the referenced tables you must consider the following essential points:
• Find and assess all the queries that are using the table referenced. If the table has a heavy amount of Data Manipulation Language (DML) 
operations against it (SELECT, INSERT, UPDATE, or DELETE) then you must analyse what impact adding the missing index will have before you create it on the table. 
INSERTs on tables are slowed down by nonclustered indexes for example.
• You need to make sure that by creating the missing indexes you are not going to end up with duplicate indexes on tables. 
The duplicate or unwanted indexes can kill your database performance. Search for the blog “over-indexing can hurt your SQL Server performance” for more information.
• If you find there is already an existing index that has most of the columns of the missing index highlighted you should consider adding the missing columns to 
the current index rather than creating another index on the table. FYI making an index wider does not mean adding all columns from a table into the current index.
*/
SELECT [EstIndexUses]
	,[EstIndexImpact%]
	,[EstAvgQueryCost]
	,[DbName]
	,[SchemaName]
	,[TableName]
	,[CreateIndex]
	,[EqualityColumns]
	,[InequalityColumns]
	,[IncludedColumns]
	,[UniqueCompiles]
	,[LastUserSeek]
FROM (
	SELECT migs.user_seeks AS [EstIndexUses]
		,migs.avg_user_impact AS [EstIndexImpact%]
		,migs.avg_total_user_cost AS [EstAvgQueryCost]
		,db_name(mid.database_id) AS [DbName]
		,OBJECT_SCHEMA_NAME(mid.OBJECT_ID, mid.database_id) AS [SchemaName]
		,OBJECT_NAME(mid.OBJECT_ID, mid.database_id) AS [TableName]
		,'CREATE INDEX [IX_' + OBJECT_NAME(mid.OBJECT_ID, mid.database_id) + '_' + REPLACE(REPLACE(REPLACE(ISNULL(mid.equality_columns, ''), ', ', '_'), '[', ''), ']', '') + CASE 
			WHEN mid.equality_columns IS NOT NULL
				AND mid.inequality_columns IS NOT NULL
				THEN '_'
			ELSE ''
			END + REPLACE(REPLACE(REPLACE(ISNULL(mid.inequality_columns, ''), ', ', '_'), '[', ''), ']', '') + ']' + ' ON ' + mid.statement + ' (' + ISNULL(mid.equality_columns, '') + CASE 
			WHEN mid.equality_columns IS NOT NULL
				AND mid.inequality_columns IS NOT NULL
				THEN ','
			ELSE ''
			END + ISNULL(mid.inequality_columns, '') + ')' + ISNULL(' INCLUDE (' + mid.included_columns + ') WITH (MAXDOP =?, FILLFACTOR=?, ONLINE=?, SORT_IN_TEMPDB=?);', '') AS [CreateIndex]
		,mid.equality_columns AS EqualityColumns
		,mid.inequality_columns AS InequalityColumns
		,mid.included_columns AS IncludedColumns
		,migs.unique_compiles AS UniqueCompiles
		,migs.last_user_seek AS LastUserSeek
	FROM sys.dm_db_missing_index_group_stats AS migs WITH (NOLOCK)
	INNER JOIN sys.dm_db_missing_index_groups AS mig WITH (NOLOCK) ON migs.group_handle = mig.index_group_handle
	INNER JOIN sys.dm_db_missing_index_details AS mid WITH (NOLOCK) ON mig.index_handle = mid.index_handle
	) AS a
WHERE 1 = 1
--AND [EstIndexUses] > 1000
--AND [EstIndexImpact%] > 10
--AND [EstAvgQueryCost] > 1
--AND DbName IN ('DatabaseName')
ORDER BY [EstIndexUses] DESC
	,[EstAvgQueryCost] DESC
	,[EstIndexImpact%] DESC
OPTION (RECOMPILE);
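
Note that the generated CreateIndex commands contain placeholder options (MAXDOP =?, FILLFACTOR=?, ONLINE=?, SORT_IN_TEMPDB=?); set or remove these options before running any of the statements.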

How to grant a User SELECT permission on multiple tables in SQL Server using T-SQL and Excel Formulas

Maybe you get emails from time to time saying something like “Hey can you grant so-and-so SELECT permission on” and then they list a few dozen tables.

There’s a couple of issues with this.

Firstly, you shouldn’t be getting that as a simple email; it should come in as a formal access request.

Secondly, user access should be defined in roles (or group logins if you want to manage access at an Active Directory level) that tie back to departments and seniority. Different roles have different permissions on different objects, which makes access more manageable and easier to audit. The access request should then be “can you add so-and-so to this role”, and ideally people should only exist in one role.

Thirdly, this would be really tedious and, depending on the length of the table list, take too long to do via the SSMS GUI.

So if you are getting emails like the above, try to move your organisation along with regard to the first two points. But to help you action the email, I’ve created the Excel file DbaScripts_GrantSelect which can be downloaded here.

Grant Select Excel Sheet Snippet

The DbaScripts_GrantSelect file allows you to enter the Login (user name), Database name and Schema name in the first three columns. You can then copy and paste the table names into the fourth column, called Table. Drag the first three columns down for as many table name entries as there are, then drag the SQL Command formula column down the same number of rows; this will create the commands to grant SELECT on each table to the user specified.

If you can’t download the file above you can recreate it.

In an empty Excel sheet write the following into the cells as directed.

A1: Login
B1: Database
C1: Schema
D1: Table
E1: SQL Command

In E2 paste the following formula:

="GRANT SELECT ON ["&B2&"].["&C2&"].["&D2&"] TO ["&A2&"];"
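
With a row filled in, the formula produces a command like GRANT SELECT ON [Database].[Schema].[Table] TO [Login];. If you would rather skip Excel entirely, the sketch below generates the same commands in T-SQL; the login and table names used are placeholders.

/*Alternative sketch: generate the GRANT statements in T-SQL
(the login and table names below are placeholders)*/
SELECT 'GRANT SELECT ON ' + QUOTENAME(s.[name]) + '.' + QUOTENAME(t.[name]) + ' TO [DOMAIN\SomeUser];' AS [--SqlCommand]
FROM sys.tables AS t
INNER JOIN sys.schemas AS s ON t.schema_id = s.schema_id
WHERE t.[name] IN ('TableOne', 'TableTwo');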

Find and drop every table in a SQL Server Database that contains specific text in the table name

GDPR compliance has given people working in the DBA space the exciting opportunity to drop tables! Tables once considered gold mines are now being treated like live grenades that management want rid of fast. This is a prudent stance: if a table contains personal data and is not being used for some vital business process, why keep it around? If the wrong person somehow gained access to the data, it could have severe reputational and financial consequences. Of course a business should never have kept unneeded personal data in the first place, but in truth most companies have gathered as much data as they could up to this point, even if it wasn’t used, on the assumption that it might be needed later.

Before dropping tables, though, it is still good practice to rename them first for a period of time to make sure nothing breaks. Once sufficient time has passed and you are confident the tables can be dropped without adverse effects, the script below can help drop the newly unwanted tables.
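
As a rough illustration of that renaming step (the schema, table and prefix below are placeholder names):

/*Illustrative only: prefix a table so it can be monitored before being dropped
(schema, table and prefix are placeholder names)*/
EXEC sp_rename 'dbo.CustomerArchive', '_DropThis_CustomerArchive';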

If you’ve followed a standard naming convention for renaming unwanted tables, for example prefixing all the targeted tables with “_DropThis_” or something to that effect, this script will provide commands to:

  • Count the number of rows in each targeted table
  • Drop each targeted table
  • Confirm each targeted table has been dropped.

Simply find & replace the text “DatabaseName” with the name of the database that contains the tables to be dropped and “TextTarget” with the text each table name should contain and run the script.

Running the script will produce two result sets. The first contains the commands to count the number of rows in each targeted table; the second contains the commands to drop each targeted table. Open new windows in SSMS, referencing the database the commands will run against, and copy and paste each result set into a window. Obviously, run the count script before the drop script. Once the drop script has been run you can run the commented-out query at the end of the script below to confirm the tables have been dropped.

You can take screenshots and/or copy and paste the results of the commands (and the commands themselves) into an email or text document as a simple report confirming the number of rows dropped and that the tables have been dropped.

/*
Find & Replace:
DatabaseName
TextTarget
*/
USE [DatabaseName];
GO

DECLARE @keyword AS VARCHAR(MAX);
DECLARE @MaxRow AS INT;

SET @keyword = 'TextTarget'

IF OBJECT_ID('tempdb.dbo.#TableStats', 'U') IS NOT NULL
	DROP TABLE #TableStats;

IF OBJECT_ID('tempdb.dbo.#CountCommand', 'U') IS NOT NULL
	DROP TABLE #CountCommand;

/*
Confirm Table Existence
*/
SELECT s.NAME AS SchemaName
	,t.*
INTO #TableStats
FROM sys.tables AS t
LEFT JOIN sys.schemas AS s ON t.schema_id = s.schema_id
WHERE t.NAME LIKE '%' + @keyword + '%'
ORDER BY s.NAME ASC
	,t.NAME ASC;

/*
Create Count SQL Commands
*/
SELECT ROW_NUMBER() OVER (
		ORDER BY [--SqlCommand]
		) AS Row#
	,[--SqlCommand]
INTO #CountCommand
FROM (
	SELECT 'SELECT ''' + s.NAME + '.' + t.NAME + ''' AS SchemaTableName, COUNT (*) AS [TableRowCount] FROM ' + QUOTENAME(s.NAME) + '.' + QUOTENAME(t.NAME) + ' WITH (NOLOCK) UNION ALL' AS [--SqlCommand]
	FROM sys.tables AS t
	LEFT JOIN sys.schemas AS s ON t.schema_id = s.schema_id
	WHERE t.NAME LIKE '%' + @keyword + '%'
	) AS SqlCommand
ORDER BY [--SqlCommand];

SET @MaxRow = (
		SELECT MAX(Row#)
		FROM #CountCommand
		);

UPDATE #CountCommand
SET [--SqlCommand] = LEFT([--SqlCommand], LEN([--SqlCommand]) - 9)
WHERE Row# = @MaxRow;

UPDATE #CountCommand
SET [--SqlCommand] = [--SqlCommand] + ';'
WHERE Row# = @MaxRow;

SELECT *
FROM #CountCommand
ORDER BY [--SqlCommand];

/*
Create Drop Table SQL Commands
*/
SELECT 'DROP TABLE ' + QUOTENAME(s.NAME) + '.' + QUOTENAME(t.NAME) + ';' AS '--DropTableSqlCommand'
FROM sys.tables AS t
LEFT JOIN sys.schemas AS s ON t.schema_id = s.schema_id
WHERE t.NAME LIKE '%' + @keyword + '%'
ORDER BY s.NAME ASC
	,t.NAME ASC;
/*
Run separately
Confirm Tables are dropped
*/
/*
SELECT s.NAME AS SchemaName
	,t.NAME AS TableName
FROM sys.tables AS t
LEFT JOIN sys.schemas AS s ON t.schema_id = s.schema_id
INNER JOIN #TableStats AS ts ON s.NAME = ts.SchemaName
	AND t.NAME = ts.NAME
ORDER BY s.NAME ASC
	,t.NAME ASC;
*/

How to schedule a job to restore the last backup made of a SQL Server database

This post provides you with a script that will generate a restore script for a database using the latest full backup file that exists in a directory. (No need to state the filename explicitly)

You need to provide the following at the start of the script:

  • The target database i.e. the database you will restore to
  • The directory where the backup file is saved

If you are using the excellent Ola Hallengren maintenance solution (see link) the directory path will look something like below. If you’re not using Ola’s solution, you should be.

\\ServerWhereBackupsAreSaved\DriveName\InstanceName\TargetDatabase\Full\

This restore script is designed to work with Ola’s solution as it segregates the backup directory structure such that each database has an allocated folder and each full backup file is named with the date and time of the file creation.

The restore script determines which backup file is the latest based on the maximum file name. So for the script to work it is assumed you have an appropriate backup strategy (i.e. using Ola’s solution) where backup types are segregated into different folders, backup names contain a date reference, and the backup location is dedicated to backups and nothing else, i.e. no stray files in the location.

Some use cases for this solution might be:

  • Restoring a nightly backup to another instance for reporting purposes
  • Restoring backups to a development environment
  • Restoring backups to another server to test the backups

You can use the logic in a stored procedure or as the T-SQL in a job step and schedule accordingly.

/*
You need to reference the following:
* The target database i.e. the database you will restore to.
* The directory where the backup file is saved.
If you are using the Ola Hallengren backup scripts the directory path will look 
something like below.
\\ServerWhereBackupsAreSaved\DriveName\InstanceName\TargetDatabase\Full\

Find & Replace the follow text for the target database and directory:

TARGET_DATABASE
DIR_PATH
*/
/*
Declare Variables
*/
DECLARE @DatabaseToRestore AS VARCHAR(MAX);
DECLARE @DirToSearch AS VARCHAR(MAX);
DECLARE @ShellCommand AS VARCHAR(MAX);
DECLARE @BackupFile AS VARCHAR(MAX);
DECLARE @Sql AS VARCHAR(MAX);

/*
Set User Variables
*/
SET @DatabaseToRestore = 'TARGET_DATABASE';
SET @DirToSearch = 'DIR_PATH';
SET @ShellCommand = 'dir ' + @DirToSearch;

/*
Create Temp Table To Hold xp_cmdshell Output
*/
IF OBJECT_ID('tempdb..#DirList') IS NOT NULL
	DROP TABLE #DirList;

CREATE TABLE #DirList (
	Id INT identity(1, 1)
	,line NVARCHAR(1000)
	);

/*
Enable Advanced Options To Enable xp_cmdshell Temporarily
*/
EXEC master.dbo.sp_configure 'show advanced options'
	,1;

RECONFIGURE
WITH OVERRIDE;

EXEC master.dbo.sp_configure 'xp_cmdshell'
	,1;

RECONFIGURE
WITH OVERRIDE;

/*
Run The Shell Command To Capture And Write Dir Info To Temp Table
*/
SET @Sql = '
INSERT INTO #DirList (line)
EXEC xp_cmdshell ' + '''' + @ShellCommand + '''' + ';';

EXEC (@Sql);

EXEC master.dbo.sp_configure 'xp_cmdshell'
	,0;

/*
Disable Advanced Options And xp_cmdshell Again
*/
RECONFIGURE
WITH OVERRIDE;

EXEC master.dbo.sp_configure 'show advanced options'
	,0;

RECONFIGURE
WITH OVERRIDE;

/*
Get The Last Backup File Name And Save To A Variable
*/
WITH CTE
AS (
	SELECT SUBSTRING(line, 37, 100) AS [FileName] /*assumes the file name starts at character 37 of the dir output lines*/
	FROM #DirList
	WHERE Id > (
			SELECT MIN(Id)
			FROM #DirList
			WHERE line LIKE '%<DIR>%..%'
			)
		AND Id < (
			SELECT MAX(Id) - 2
			FROM #DirList
			)
	)
SELECT @BackupFile = [FileName]
FROM CTE
WHERE [FileName] = (
		SELECT MAX(FileName)
		FROM CTE
		);

/*
Create The Restore Script
*/
SET @BackupFile = @DirToSearch + @BackupFile
SET @Sql = '
ALTER DATABASE ' + QUOTENAME(@DatabaseToRestore) + ' SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
RESTORE DATABASE ' + QUOTENAME(@DatabaseToRestore) + ' FROM DISK = ' + '''' + @BackupFile + '''' + '
WITH NORECOVERY
,REPLACE;
RESTORE DATABASE ' + QUOTENAME(@DatabaseToRestore) + ' WITH RECOVERY;
'

/*
To Directly Execute The Script Uncomment The EXEC Statement And Delete The Select Statement
*/
/*
EXEC (@Sql)
*/
SELECT @Sql
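
For reference, with the placeholder names above the script generated by the final SELECT looks roughly like this (the backup file name shown is only illustrative):

/*Illustrative output only; the database and file names are placeholders*/
ALTER DATABASE [TARGET_DATABASE] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
RESTORE DATABASE [TARGET_DATABASE] FROM DISK = '\\ServerWhereBackupsAreSaved\DriveName\InstanceName\TargetDatabase\Full\TargetDatabase_FULL_20190731_233000.bak'
WITH NORECOVERY
,REPLACE;
RESTORE DATABASE [TARGET_DATABASE] WITH RECOVERY;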