Category Archives: How To

How to convert a Markdown file to Word document with PowerShell: A One-Line Solution

Overview

Need to convert Markdown files to Word documents quickly and efficiently? 

This one-line PowerShell script leverages the Word COM object to transform your .md files into professionally formatted .docx documents – no external tools or dependencies required beyond PowerShell and an installed copy of Word.

The Solution

This script provides a Markdown-to-Word converter that handles the most common formatting elements: headings, bold and italic text, bullet and numbered lists, and horizontal rules. Simply update the file path, paste into PowerShell, and execute.

$mdPath = "C:\Users\Documents\report.md"; if (-not (Test-Path $mdPath)) { Write-Error "❌ Markdown file not found at: $mdPath"; return }; $docPath = [System.IO.Path]::ChangeExtension($mdPath, ".docx"); function Process-InlineFormatting { param($range, $text); $position = 0; $segments = @(); while ($position -lt $text.Length) { if ($text.Substring($position) -match '^\*\*\*(.+?)\*\*\*') { $segments += @{Text = $matches[1]; Bold = $true; Italic = $true}; $position += $matches[0].Length } elseif ($text.Substring($position) -match '^\*\*(.+?)\*\*') { $segments += @{Text = $matches[1]; Bold = $true; Italic = $false}; $position += $matches[0].Length } elseif ($text.Substring($position) -match '^\*(.+?)\*') { $segments += @{Text = $matches[1]; Bold = $false; Italic = $true}; $position += $matches[0].Length } else { $nextAsterisk = $text.IndexOf('*', $position); if ($nextAsterisk -eq -1) { $segments += @{Text = $text.Substring($position); Bold = $false; Italic = $false}; break } else { $segments += @{Text = $text.Substring($position, $nextAsterisk - $position); Bold = $false; Italic = $false}; $position = $nextAsterisk } } }; foreach ($segment in $segments) { if ($segment.Text) { $range.Text = $segment.Text; $range.Font.Bold = $segment.Bold; $range.Font.Italic = $segment.Italic; $range.Collapse(0) } } }; try { $word = New-Object -ComObject Word.Application; $word.Visible = $false; $doc = $word.Documents.Add(); $lines = Get-Content $mdPath -Encoding UTF8; $wdCollapseEnd = 0; $wdBorderBottom = 4; $wdLineStyleSingle = 1; $wdColorBlack = 0; $wdLineWidth150pt = 6; foreach ($line in $lines) { $range = $doc.Content; $range.Collapse($wdCollapseEnd); if ($line -match '^\s*$') { $range.Text = "`r"; $range.Style = "Normal"; continue }; if ($line -match '^\s*[-]{3,}\s*$') { $range.Text = "`r"; $range.Style = "Normal"; $range.ParagraphFormat.Borders.Item($wdBorderBottom).LineStyle = $wdLineStyleSingle; $range.ParagraphFormat.Borders.Item($wdBorderBottom).Color = $wdColorBlack; $range.ParagraphFormat.Borders.Item($wdBorderBottom).LineWidth = $wdLineWidth150pt } elseif ($line -match '^(#{1,6})\s+(.+)') { $headerLevel = $matches[1].Length; $headerText = $matches[2]; Process-InlineFormatting -range $range -text $headerText; $range.Style = "Heading $headerLevel"; $range.Collapse($wdCollapseEnd); $range.Text = "`r" } elseif ($line -match '^\s*[-*+]\s+(.+)') { $listText = $matches[1]; Process-InlineFormatting -range $range -text $listText; $range.Style = "List Bullet"; $range.Collapse($wdCollapseEnd); $range.Text = "`r" } elseif ($line -match '^\s*\d+\.\s+(.+)') { $listText = $matches[1]; Process-InlineFormatting -range $range -text $listText; $range.Style = "List Number"; $range.Collapse($wdCollapseEnd); $range.Text = "`r" } else { Process-InlineFormatting -range $range -text $line; $range.Style = "Normal"; $range.Collapse($wdCollapseEnd); $range.Text = "`r" } }; $doc.SaveAs([ref] $docPath); $doc.Close(); $word.Quit(); [System.Runtime.Interopservices.Marshal]::ReleaseComObject($word) | Out-Null; [System.GC]::Collect(); [System.GC]::WaitForPendingFinalizers(); Write-Host "✅ Markdown converted to Word document at $docPath" } catch { Write-Error "❌ An error occurred: $_"; if ($doc) { $doc.Close($false) }; if ($word) { $word.Quit(); [System.Runtime.Interopservices.Marshal]::ReleaseComObject($word) | Out-Null } }

How to Use

Step 1: Update the file path at the beginning of the script

$mdPath = "C:\Users\Documents\report.md"

Step 2: Copy the entire one-line script above

Step 3: Paste into PowerShell and press Enter

Step 4: Your Word document will be created in the same directory with a .docx extension

Supported Markdown Features

The script converts the following Markdown elements to their Word equivalents:

Headers

# Heading 1
## Heading 2
### Heading 3

Converts to Word’s built-in Heading 1, 2, 3 styles (up to Heading 6)

Text Formatting

  • Bold: **text** → bold text
  • Italic: *text* → italic text
  • Bold + Italic: ***text*** → bold italic text

Lists

Bullet Lists:

- Item one
* Item two
+ Item three

Numbered Lists:

1. First item
2. Second item
3. Third item

Horizontal Rules

---

Creates a horizontal line with bottom border formatting

Special Characters

UTF-8 encoding support ensures proper handling of international characters, accents, and special symbols.

How It Works

Inline Formatting Parser

The script includes a custom function that processes inline formatting by:

  1. Parsing text character-by-character
  2. Identifying markdown formatting markers (***, **, and *)
  3. Extracting the text within markers
  4. Applying appropriate Word formatting (Bold, Italic, or both)
  5. Removing the markdown syntax
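Order matters here: the parser tests for the longest marker first (*** before ** before *) so bold+italic text is not misread as plain bold. A minimal sketch of that matching order, using the same regexes as the one-liner:

# Longest marker wins: test ***...*** before **...** before *...*
$text = '***both*** and **bold** and *italic*'
if ($text -match '^\*\*\*(.+?)\*\*\*') { "Bold+Italic: $($matches[1])" }
elseif ($text -match '^\*\*(.+?)\*\*') { "Bold: $($matches[1])" }
elseif ($text -match '^\*(.+?)\*') { "Italic: $($matches[1])" }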

Document Processing

The script:

  • Reads the markdown file with UTF-8 encoding
  • Processes each line sequentially
  • Applies regex pattern matching to identify markdown elements
  • Uses Word COM object methods to apply appropriate styles
  • Handles blank lines to preserve document spacing
  • Properly manages Word styles to prevent formatting conflicts

COM Object Management

The script properly initializes and cleans up Word COM objects to prevent memory leaks and ensure smooth execution.
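Stripped of the conversion logic, the lifecycle pattern looks like this (a minimal sketch; the one-liner performs the same cleanup inside its try/catch rather than a finally block):

$word = New-Object -ComObject Word.Application
try {
    $word.Visible = $false
    # ... create, populate, and save the document here ...
}
finally {
    # Quit Word and release the COM reference so no orphaned
    # WINWORD.EXE process is left behind
    $word.Quit()
    [System.Runtime.InteropServices.Marshal]::ReleaseComObject($word) | Out-Null
    [System.GC]::Collect()
    [System.GC]::WaitForPendingFinalizers()
}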

Key Features

✅ No External Dependencies – Uses built-in PowerShell and Word COM automation
✅ UTF-8 Support – Handles international characters correctly
✅ Inline Formatting – Processes bold and italic within any text element
✅ Professional Output – Uses Word’s native styles for consistent formatting
✅ One-Line Execution – Paste and run without saving to a file
✅ Error Handling – Includes try-catch blocks for robust execution
✅ Memory Management – Properly releases COM objects after execution

Use Cases

This script is perfect for:

  • Documentation Automation – Convert markdown documentation to Word format for distribution
  • Report Generation – Transform markdown reports into professional Word documents
  • Content Migration – Batch convert markdown files to Word format (see the sketch after this list)
  • Workflow Integration – Incorporate into larger PowerShell automation scripts
  • Quick Conversions – Ad-hoc conversion without installing additional tools
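For batch conversion, one approach is to save the one-liner as a parameterised script and call it from a pipeline. The wrapper below is hypothetical (the published one-liner hardcodes $mdPath, so you would adapt it to accept a parameter first):

# Convert every .md file under C:\Docs using a hypothetical
# Convert-MdToDocx.ps1 wrapper that accepts the source path
Get-ChildItem -Path "C:\Docs" -Filter *.md -Recurse |
    ForEach-Object { & "C:\Scripts\Convert-MdToDocx.ps1" -mdPath $_.FullName }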

Customization Options

Change Output Location

Modify the $docPath variable to specify a different output location:

$docPath = "C:\Output\CustomName.docx"

Make Word Visible

Set $word.Visible = $true to watch the conversion process in real-time

Adjust Border Styling

Modify the Word constants to customize horizontal rule appearance:

$wdLineWidth150pt = 6  # Change border thickness
$wdColorBlack = 0 # Change border color

Requirements

  • Windows operating system
  • Microsoft Word installed
  • PowerShell 5.1 or later
  • Appropriate permissions to create COM objects

Technical Details

Word COM Constants Used:

  • wdCollapseEnd = 0 – Collapse range to end position
  • wdBorderBottom = 4 – Bottom border identifier
  • wdLineStyleSingle = 1 – Single line style
  • wdColorBlack = 0 – Black color value
  • wdLineWidth150pt = 6 – Border width (1.5pt)

Regex Patterns:

  • Headers: '^(#{1,6})\s+(.+)' – Requires space after #
  • Bullet Lists: '^\s*[-*+]\s+(.+)' – Supports -, *, + markers
  • Numbered Lists: '^\s*\d+\.\s+(.+)' – Matches numbered items
  • Horizontal Rules: '^\s*[-]{3,}\s*$' – Three or more hyphens

Conclusion

This PowerShell one-liner provides a powerful solution for converting Markdown to Word documents with no third-party dependencies. Whether you’re automating documentation workflows or need a quick conversion tool, this script delivers professional results with minimal setup.

Simply update the file path, paste the script, and let PowerShell handle the rest!

Comprehensive Guide to Helping an AI Coding Agent Identify and Avoid Common Coding Bad Practices

Introduction

In large projects, subtle anti-patterns can slip through reviews, like importing modules mid-file or conditionally. These non-standard import placements obscure dependencies, make static analysis unreliable, and lead to unpredictable runtime errors. This article dives into that practice, outlines a broader set of coding bad practices, and provides a ready-to-use AI coding agent prompt to catch these issues across your codebase.

What Is Non-Standard Import Placement?

Imports or require statements buried inside functions, conditional branches, or midway through a file violate expectations of where dependencies live. Best practices and most style guides mandate that:

  • All imports sit at the top of the file, immediately after any module docstring or comments.
  • Conditional or lazy loading only happens with clear justification and documentation.

When imports are scattered:

  1. Static analysis tools can’t reliably determine your project’s dependency graph.
  2. Developers hunting for missing or outdated modules lose time tracing hidden import logic.
  3. You risk circular dependencies, initialization bugs, or runtime surprises.
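To make the pattern concrete, here is a minimal Python illustration (hypothetical module and function names):

# bad_style.py
def load_report(path):
    import json  # buried import: hidden from static analysis and readers
    with open(path) as f:
        return json.load(f)

# good_style.py
import json  # declared up front, immediately after any module docstring

def load_report(path):
    with open(path) as f:
        return json.load(f)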

A Broader List of Coding Bad Practices

Below is a table of widespread anti-patterns—some classic hygiene issues and others that modern AI agents might inject or overlook:

Bad Practice – Description
Spaghetti Code – Code with no clear structure, making maintenance difficult.
Hardcoding Values – Embedding constants directly instead of using config or constants.
Magic Numbers/Strings – Using unexplained literals instead of named constants.
Global State Abuse – Overusing global variables, causing unpredictable side effects.
Poor Naming Conventions – Using vague or misleading variable and function names.
Lack of Modularity – Writing large monolithic blocks instead of reusable functions.
Copy-Paste Programming – Duplicating code rather than abstracting shared logic.
No Error Handling – Ignoring exceptions or failing to validate inputs.
Overengineering – Adding unnecessary complexity or abstraction.
Under-documentation – Failing to comment or explain non-obvious logic.
Tight Coupling – Making modules overly dependent on each other.
Ignoring Style Guides – Not following language-specific conventions or style guides.
Dead Code – Leaving unused or unreachable code paths in the codebase.
Inconsistent Formatting – Mixing indentation styles or inconsistent code layout.
Not Using Version Control Properly – Committing broken code, poor commit messages, ignoring branching.
Non-standard Import Placement – Placing imports mid-file or conditionally instead of at the top.
Missing Security Checks – Omitting authentication, authorization, or input sanitization.
Inefficient Algorithms – Using suboptimal logic that hurts performance.
Hallucinated Dependencies – Referencing non-existent libraries or methods from AI suggestions.
Incomplete Code Generation – Leaving functions or loops unfinished due to AI cutoffs.
Prompt-biased Solutions – Generating code that only fits the prompt and fails general cases.
Missing Corner Cases – Overlooking edge cases and error conditions in logic.
Incorrect Error Messages – Providing vague or misleading error feedback to users.
Logging Sensitive Data – Writing confidential information to logs without sanitization.
Violating SOLID Principles – Breaking single responsibility or open/closed design rules.
Race Conditions – Failing to handle concurrency, leading to unpredictable bugs.

Crafting an AI Coding Agent Prompt

To ensure an AI auditor doesn’t skip files, ignore edge cases, or take shortcuts, use the following prompt. It instructs the agent to comprehensively scan every line, record each finding, and tally occurrences of every bad practice.

## Prompt

You are an expert AI code auditor. Your mission is to exhaustively scan every file and line of the codebase and uncover all instances of known bad practices. Do not skip or shortcut any part of the project, even if the code is large or complex. Report every finding with precise details and clear remediation guidance.

## Scope
- Analyze every source file, configuration, script, and module.
- Treat all code as in-scope; do not assume any file is irrelevant.

## Bad Practices to Detect
- Spaghetti Code
- Hardcoding Values
- Magic Numbers/Strings
- Global State Abuse
- Poor Naming Conventions
- Lack of Modularity
- Copy-Paste Programming
- No Error Handling
- Overengineering
- Under-documentation
- Tight Coupling
- Ignoring Style Guides
- Dead Code
- Inconsistent Formatting
- Improper Version Control Usage
- Non-standard Import Placement
- Missing Security Checks
- Inefficient Algorithms
- Hallucinated Dependencies
- Incomplete Code Generation
- Prompt-biased Solutions
- Missing Corner Cases
- Incorrect Error Messages
- Logging Sensitive Data
- Violating SOLID Principles
- Race Conditions

## Analysis Instructions
1. Traverse the entire directory tree and open every file.
2. Inspect every line—do not skip blank or comment lines.
3. Identify code snippets matching any bad practice.
4. For each instance, document:
   - File path
   - Line number(s)
   - Exact snippet
   - Bad practice name
   - Explanation of why it’s problematic
   - Suggested refactoring

5. Keep a running tally of occurrences per bad practice.

## Output Requirements
- Use Markdown with a section per file.
- Subheadings for each issue.
- End with a summary table listing each bad practice and its total count.
- If the repo is too large, process in ordered batches (e.g., by folder), confirming coverage before proceeding.
- Do not conclude until every file has been reviewed.

Begin the full project audit now, acknowledging you will not take shortcuts.

Next Steps

  • Integrate this prompt into your AI workflow or CI pipeline.
  • Pair it with linters and static analyzers (ESLint, Flake8, Prettier) for automated, real-time checks.
  • Enforce code review policies that catch both human and AI-introduced anti-patterns.

By combining clear style guidelines, automated linting, and an uncompromising AI audit prompt, you’ll dramatically improve code quality, maintainability, and security—project-wide.

How to fix no audio issues with RetroArch on Windows 10 or 11

To troubleshoot a sound issue with RetroArch, follow these steps:

  1. Check that DirectX 9 is installed on the PC. DirectX 9 is not part of Windows 11 by default, and if you are running RetroArch from an external drive (e.g. a USB key) on a new machine, the necessary runtime may not be installed. It can be downloaded here: LINK Note: when the installer is run it will ask what folder to use. This folder is just a temp folder; essentially it is asking where the contents should be extracted. Once the process is finished, navigate to the folder you chose and run the DXSETUP.exe file. Once this file has run you can delete the temp folder.
  2. Check whether RetroArch has been muted; this can happen by accidentally pressing hotkeys on your keyboard. If it’s muted, go to Settings > Audio and turn off Mute.
  3. If the sound issue persists, try setting the default audio driver to WASAPI and unchecking WASAPI Exclusive Mode and WASAPI Float Format.
  4. Check the volume mixer: if RetroArch is listed there, check whether it was set to low volume or muted. If it’s not listed, try downloading a new copy of RetroArch from their website and extracting it into a new directory.
  5. Restart after trying the solutions above.

How to write T-SQL Geography data to a table

Below is some example code for writing the SQL Server geography data type to a table. Note that by default geography data is stored in a binary format, but it can be converted to a string to make it human-readable.

Note: Pass in Longitude and Latitude values in that order.

/*Demo of geo data*/
DECLARE @g GEOGRAPHY;

SET @g = GEOGRAPHY::STPointFromText('POINT(-6.611670 53.578741)', 4326);

/*Geography data is in binary format*/
SELECT @g AS 'GeoBinaryFormat';

/*Convert binary data to a string*/
SELECT @g.ToString() AS 'ConvertingDataToString';


/*Inserting geo data into Table*/
CREATE TABLE #GeoTest ([CoordinateLocation] [geography] NULL);

INSERT INTO #GeoTest (CoordinateLocation)
SELECT GEOGRAPHY::STPointFromText('POINT(-6.611670 53.578741)', 4326);

SELECT *
FROM #GeoTest;

DROP TABLE #GeoTest;
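If you find the longitude-first ordering of the WKT string error-prone, the geography::Point() static method takes latitude first, which some find more natural:

/*Same point using Point(Lat, Long, SRID)*/
DECLARE @p GEOGRAPHY = GEOGRAPHY::Point(53.578741, -6.611670, 4326);

SELECT @p.ToString() AS 'PointMethod';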

How to format SQL Server datetime as dd/mm/yyyy hh:mm:ss

If you are exporting the results of a SQL Server query to Excel, typically the recipient of the file wants the dates referenced in the format “dd/mm/yyyy hh:mm:ss” or “dd/mm/yyyy”, not in the usual database format yyyy-mm-dd.

The below query formats the datetime as desired. Note that the letters MM representing the month are capitalised. If they are not, the engine will interpret lowercase mm as minutes, so you will end up with days, minutes, years.

Also note that the letters HH representing the hours are capitalised. Capitalising the H makes the time output use the 24-hour format; lowercase h gives the 12-hour format. It is highly recommended not to use the lowercase h, as without an AM/PM designator the output is ambiguous.

SELECT FORMAT(GETDATE(), 'dd/MM/yyyy HH:mm:ss', 'en-us')

If you only want the date and not time just remove the relevant text, i.e. just date dd/MM/yyyy or datetime without second dd/MM/yyyy HH:mm.
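For example:

/*Date only*/
SELECT FORMAT(GETDATE(), 'dd/MM/yyyy', 'en-us');

/*Datetime without seconds*/
SELECT FORMAT(GETDATE(), 'dd/MM/yyyy HH:mm', 'en-us');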

How to set a database to single user and back to multiuser

Here’s a quick script to set a database to single user and then back to multi user.

You’ll need to do this when altering the database in situations like database name changes, file name changes or even restores.

/*
Set Database to Single User
*/
USE [master];
GO

ALTER DATABASE [my_db]
SET SINGLE_USER
WITH ROLLBACK IMMEDIATE;
GO

/*
Set Database back to Multi User
*/
USE [master];
GO

ALTER DATABASE [my_db]
SET MULTI_USER
WITH ROLLBACK IMMEDIATE;
GO
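To confirm which mode the database is in after running either statement, you can query sys.databases (a quick sanity check, not part of the original script):

/*Check the current user access mode*/
SELECT name
	,user_access_desc
FROM sys.databases
WHERE name = 'my_db';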


How to get the date a web page was published

This posting is particularly useful for anyone looking to use the date a web page was published as part of academic referencing.

Step 1. In your browser open https://www.google.com/.

Step 2. Copy the url of the web page you want to get a publication date for.

Step 3. Paste this url into the search bar of the google.com page and search. The search should return a series of results.

Step 4. In the address bar of the results page go to the very end of the url text and paste &as_qdr=y15 and hit return.

This should update the results page with each link including a publication date.
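For illustration, the final address bar contents will look something like the following (the exact URL Google generates will vary):

https://www.google.com/search?q=https://example.com/some-article&as_qdr=y15

The &as_qdr=y15 parameter restricts results to the past 15 years, which has the side effect of making Google show a date beside each result.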

How to find Missing Indexes for all databases in a SQL Server instance

This script is for SQL Server 2005 and up. The script will return all the missing indexes for a SQL Server instance, rating their impact and provide the T-SQL to create the missing indexes.

SQL Server 2005 was the first version of SQL Server to add DMVs (Dynamic Management Views) and DMOs (Dynamic Management Objects), which this script requires to function.
DMVs & DMOs provide useful information about SQL Server like expensive queries, wait types, missing indexes etc.

WARNING!
Before you create the missing indexes on the referenced tables you must consider the following essential points:
• Find and assess all the queries that are using the table referenced. If the table has a heavy amount of Data Manipulation Language (DML) operations against it (SELECT, INSERT, UPDATE, or DELETE) then you must analyse what impact adding the missing index will have before you create it on the table. INSERTs on tables are slowed down by nonclustered indexes for example.
• You need to make sure that by creating the missing indexes you are not going to end up with duplicate indexes on tables. The duplicate or unwanted indexes can kill your database performance. Search for the blog “over-indexing can hurt your SQL Server performance” for more information.
• If you find there is already an existing index that has most of the columns of the missing index highlighted you should consider adding the missing columns to the current index rather than creating another index on the table. FYI making an index wider does not mean adding all columns from a table into the current index.
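To help with the last two points, the query below (a general-purpose check, not part of the missing-index script) lists the indexes already on a table so you can spot overlap before creating anything. Replace 'dbo.YourTable' with the table referenced by the missing index:

/*List existing indexes and their columns for a given table*/
SELECT i.name AS IndexName
	,i.type_desc AS IndexType
	,COL_NAME(ic.object_id, ic.column_id) AS ColumnName
	,ic.is_included_column
FROM sys.indexes AS i
INNER JOIN sys.index_columns AS ic ON i.object_id = ic.object_id
	AND i.index_id = ic.index_id
WHERE i.object_id = OBJECT_ID('dbo.YourTable')
ORDER BY i.name
	,ic.key_ordinal;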

/*Script to find Missing Indexes for all databases in SQL Server*/
/*
This script is for SQL Server 2005 and up. 
The script will return all the missing indexes for a SQL Server instance, rating their impact 
and provide the T-SQL to create the missing indexes.

SQL Server 2005 was the first version of SQL Server to add DMVs (Dynamic Management Views) 
and DMOs (Dynamic Management Objects), which this script requires to function. 
DMVs & DMOs provide useful information about SQL Server like expensive queries, wait types, missing indexes etc.

WARNING!
Before you create the missing indexes on the referenced tables you must consider the following essential points:
• Find and assess all the queries that are using the table referenced. If the table has a heavy amount of Data Manipulation Language (DML) 
operations against it (SELECT, INSERT, UPDATE, or DELETE) then you must analyse what impact adding the missing index will have before you create it on the table. 
INSERTs on tables are slowed down by nonclustered indexes for example.
• You need to make sure that by creating the missing indexes you are not going to end up with duplicate indexes on tables. 
The duplicate or unwanted indexes can kill your database performance. Search for the blog “over-indexing can hurt your SQL Server performance” for more information.
• If you find there is already an existing index that has most of the columns of the missing index highlighted you should consider adding the missing columns to 
the current index rather than creating another index on the table. FYI making an index wider does not mean adding all columns from a table into the current index.
*/
SELECT [EstIndexUses]
	,[EstIndexImpact%]
	,[EstAvgQueryCost]
	,[DbName]
	,[SchemaName]
	,[TableName]
	,[CreateIndex]
	,[EqualityColumns]
	,[InequalityColumns]
	,[IncludedColumns]
	,[UniqueCompiles]
	,[LastUserSeek]
FROM (
	SELECT migs.user_seeks AS [EstIndexUses]
		,migs.avg_user_impact AS [EstIndexImpact%]
		,migs.avg_total_user_cost AS [EstAvgQueryCost]
		,db_name(mid.database_id) AS [DbName]
		,OBJECT_SCHEMA_NAME(mid.OBJECT_ID, mid.database_id) AS [SchemaName]
		,OBJECT_NAME(mid.OBJECT_ID, mid.database_id) AS [TableName]
		,'CREATE INDEX [IX_' + OBJECT_NAME(mid.OBJECT_ID, mid.database_id) + '_' + REPLACE(REPLACE(REPLACE(ISNULL(mid.equality_columns, ''), ', ', '_'), '[', ''), ']', '') + CASE 
			WHEN mid.equality_columns IS NOT NULL
				AND mid.inequality_columns IS NOT NULL
				THEN '_'
			ELSE ''
			END + REPLACE(REPLACE(REPLACE(ISNULL(mid.inequality_columns, ''), ', ', '_'), '[', ''), ']', '') + ']' + ' ON ' + mid.statement + ' (' + ISNULL(mid.equality_columns, '') + CASE 
			WHEN mid.equality_columns IS NOT NULL
				AND mid.inequality_columns IS NOT NULL
				THEN ','
			ELSE ''
			END + ISNULL(mid.inequality_columns, '') + ')' + ISNULL(' INCLUDE (' + mid.included_columns + ') WITH (MAXDOP =?, FILLFACTOR=?, ONLINE=?, SORT_IN_TEMPDB=?);', '') AS [CreateIndex]
		,mid.equality_columns AS EqualityColumns
		,mid.inequality_columns AS InequalityColumns
		,mid.included_columns AS IncludedColumns
		,migs.unique_compiles AS UniqueCompiles
		,migs.last_user_seek AS LastUserSeek
	FROM sys.dm_db_missing_index_group_stats AS migs WITH (NOLOCK)
	INNER JOIN sys.dm_db_missing_index_groups AS mig WITH (NOLOCK) ON migs.group_handle = mig.index_group_handle
	INNER JOIN sys.dm_db_missing_index_details AS mid WITH (NOLOCK) ON mig.index_handle = mid.index_handle
	) AS a
WHERE 1 = 1
--AND [EstIndexUses] > 1000
--AND [EstIndexImpact%] > 10
--AND [EstAvgQueryCost] > 1
--AND DbName IN ('DatabaseName')
ORDER BY [EstIndexUses] DESC
	,[EstAvgQueryCost] DESC
	,[EstIndexImpact%] DESC
OPTION (RECOMPILE);


How to schedule a job to restore the last backup made of a SQL Server database

This post provides you with a script that will generate a restore script for a database using the latest full backup file that exists in a directory. (No need to state the filename explicitly)

You need to provide the following at the start of the script:

  • The target database i.e. the database you will restore to
  • The directory where the backup file is saved

If you are using the excellent Ola Hallengren maintenance solution (see link) the directory path will look something like below. If you’re not using Ola’s solution, you should be.

\\ServerWhereBackupsAreSaved\DriveName\InstanceName\TargetDatabase\Full\

This restore script is designed to work with Ola’s solution as it segregates the backup directory structure such that each database has an allocated folder and each full backup file is named with the date and time of the file creation.

The restore script determines which backup file is the latest backup file based on the max name. So for the script to work it is assumed you have an appropriate backup strategy (i.e. using Ola’s solution) where backup types are segregated into different folders, backup names have a date reference and the backup location is dedicated to backups and nothing else, i.e. no trash files in the location.

Some use cases for this solution might be:

  • Restoring a nightly backup to another instance for reporting purposes
  • Restoring backups to a development environment
  • Restoring backups to another server to test the backups

You can use the logic in a stored procedure or as the T-SQL in a job step and schedule accordingly.

/*
You need to reference the following:
* The target database i.e. the database you will restore to.
* The directory where the backup file is saved.
If you are using the Ola Hallengren backup scripts the directory path will look 
something like below.
\\ServerWhereBackupsAreSaved\DriveName\InstanceName\TargetDatabase\Full\

Find & Replace the follow text for the target database and directory:

TARGET_DATABASE
DIR_PATH
*/
/*
Declare Variables
*/
DECLARE @DatabaseToRestore AS VARCHAR(MAX);
DECLARE @DirToSearch AS VARCHAR(MAX);
DECLARE @ShellCommand AS VARCHAR(MAX);
DECLARE @BackupFile AS VARCHAR(MAX);
DECLARE @Sql AS VARCHAR(MAX);

/*
Set User Variables
*/
SET @DatabaseToRestore = 'TARGET_DATABASE';
SET @DirToSearch = 'DIR_PATH';
SET @ShellCommand = 'dir ' + @DirToSearch;

/*
Create Temp Table To Hold xp_cmdshell Output
*/
IF OBJECT_ID('tempdb..#DirList') IS NOT NULL
	DROP TABLE #DirList;

CREATE TABLE #DirList (
	Id INT identity(1, 1)
	,line NVARCHAR(1000)
	);

/*
Enable Advanced Options To Enable xp_cmdshell Temporarily
*/
EXEC master.dbo.sp_configure 'show advanced options'
	,1;

RECONFIGURE
WITH OVERRIDE;

EXEC master.dbo.sp_configure 'xp_cmdshell'
	,1;

RECONFIGURE
WITH OVERRIDE;

/*
Run The Shell Command To Capture And Write Dir Info To Temp Table
*/
SET @Sql = '
INSERT INTO #DirList (line)
EXEC xp_cmdshell ' + '''' + @ShellCommand + '''' + ';';

EXEC (@Sql);

/*
Disable xp_cmdshell And Advanced Options Again
*/
EXEC master.dbo.sp_configure 'xp_cmdshell'
	,0;

RECONFIGURE
WITH OVERRIDE;

EXEC master.dbo.sp_configure 'show advanced options'
	,0;

RECONFIGURE
WITH OVERRIDE;

/*
Get The Last Backup File Name And Save To A Variable
*/
WITH CTE
AS (
	SELECT SUBSTRING(line, 37, 100) [FileName]
	FROM #DirList
	WHERE Id > (
			SELECT MIN(Id)
			FROM #DirList
			WHERE line LIKE '%<DIR>%..%'
			)
		AND Id < (
			SELECT MAX(Id) - 2
			FROM #DirList
			)
	)
SELECT @BackupFile = [FileName]
FROM CTE
WHERE [FileName] = (
		SELECT MAX(FileName)
		FROM CTE
		);

/*
Create The Restore Script
*/
SET @BackupFile = @DirToSearch + @BackupFile;
SET @Sql = '
ALTER DATABASE ' + QUOTENAME(@DatabaseToRestore) + ' SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
RESTORE DATABASE ' + QUOTENAME(@DatabaseToRestore) + ' FROM DISK = ' + '''' + @BackupFile + '''' + '
WITH NORECOVERY
,REPLACE;
RESTORE DATABASE ' + QUOTENAME(@DatabaseToRestore) + ' WITH RECOVERY;
';

/*
To Directly Execute The Script Uncomment The EXEC Statement And Delete The Select Statement
*/
/*
EXEC (@Sql)
*/
SELECT @Sql;



How to move/rename Database files in SQL Server

An example use case for the process below could be that you need to move database files to a new drive. Another example might be that your organisation intends to run a legacy database alongside a new, updated database, with both initially sharing the same database name in the same instance and the files located in the same directory under the same names. Obviously this cannot be done: the database names must differ and the files must be renamed or must not exist in the same directory.

For example AdventureWorks might become AdventureWorks_Legacy while a new and improved AdventureWorks database retains the original database name. The associated database file names would also need to be changed/moved to reflect this.

Someone might also want to do something like this for test purposes, but having test resources in a live environment is obviously not recommended if avoidable.

The first step to moving and renaming the files is to copy and modify the script below. Note the script below assumes you want to move and change the names of the files. To avoid any database conflicts you only need to do one or the other.

/* 
Find & Replace DbName with the name of the Database you are working with
*/
USE [DbName];

/*
Changing Physical names and paths
Replace 'C:\...\NewDbName.mdf' with full path of new Db file to be used
*/
ALTER DATABASE DbName MODIFY FILE (
	NAME = 'DbName'
	,FILENAME = 'C:\...\NewDbName.mdf'
	);

/*
Replace 'C:\...\NewDbName_log.ldf' with full path of new Db log file to be used
*/
ALTER DATABASE DbName MODIFY FILE (
	NAME = 'DbName_log'
	,FILENAME = 'C:\...\NewDbName_log.ldf'
	);

/*
Changing logical names
*/
ALTER DATABASE DbName MODIFY FILE (
	NAME = DbName
	,NEWNAME = NewDbName
	);

ALTER DATABASE DbName MODIFY FILE (
	NAME = DbName_log
	,NEWNAME = NewDbName_log
	);
Once the script has been set up as desired, follow the steps below:
  1. Open Microsoft SQL Server Management Studio (SSMS).
  2. Connect to the server that houses the Db you are working with.
  3. Run the modified script.
  4. Right click on the Db in SSMS and select Tasks > Take Offline.
  5. If you are moving the database files log into the server that houses the database files and copy and move the MDF and LDF files to the location you specified in first two alter commands. If the script specifies new names rename the copied files to match the names given in the script exactly.
  6. Go back to SSMS and right click on the Db and select Tasks > Bring Online.
  7. If you have moved the files once the database is back online and confirmed working as expected the unused original files can be deleted.
  8. Now you can rename the Db to the new name if you wish using SSMS.
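After step 8 you can verify the logical and physical file names with a quick query against sys.master_files (a sanity check, not part of the original steps):

/*Confirm the database file names and paths*/
SELECT name AS LogicalName
	,physical_name AS PhysicalName
	,state_desc
FROM sys.master_files
WHERE database_id = DB_ID('NewDbName');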