Below is some example code for writing the SQL Server geography data type to a table. Note that by default geography data is stored in a binary format, but it can be converted to a string to make it human-readable.
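A minimal sketch of the idea (the table name and coordinates are made up for illustration):

```sql
-- Hypothetical table for storing spatial data
CREATE TABLE dbo.SpatialSample
(
    Id       INT IDENTITY(1,1) PRIMARY KEY,
    Location GEOGRAPHY
);

-- Insert a point using geography::Point(Latitude, Longitude, SRID);
-- 4326 is the WGS 84 spatial reference system
INSERT INTO dbo.SpatialSample (Location)
VALUES (geography::Point(53.3498, -6.2603, 4326));

-- The Location column comes back as binary by default;
-- ToString() converts it to readable well-known text (WKT)
SELECT Id,
       Location,                            -- binary representation
       Location.ToString() AS LocationText  -- e.g. POINT (-6.2603 53.3498)
FROM dbo.SpatialSample;
```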
The script below will provide SQL Server instance login details such as:
- The login name
- The role memberships the login belongs to, such as sysadmin etc.
- Whether or not the login is enabled
- Whether the login is a SQL Server login or a Windows login etc.
- Login created date
- Login modified date
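A sketch of that output can be produced from the system catalog views, along these lines:

```sql
-- Logins and their server role memberships, state, type and dates
SELECT sp.name AS LoginName,
       ISNULL(STUFF((SELECT ', ' + r.name
                     FROM sys.server_role_members rm
                     JOIN sys.server_principals r
                          ON r.principal_id = rm.role_principal_id
                     WHERE rm.member_principal_id = sp.principal_id
                     FOR XML PATH('')), 1, 2, ''), 'public') AS RoleMembership,
       CASE WHEN sp.is_disabled = 1 THEN 'No' ELSE 'Yes' END AS IsEnabled,
       sp.type_desc   AS LoginType,     -- SQL_LOGIN, WINDOWS_LOGIN, WINDOWS_GROUP
       sp.create_date AS CreatedDate,
       sp.modify_date AS ModifiedDate
FROM sys.server_principals sp
WHERE sp.type IN ('S', 'U', 'G')        -- SQL logins, Windows logins, Windows groups
ORDER BY sp.name;
```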
The following T-SQL, when executed, will generate the T-SQL needed to set all databases on a SQL Server instance to read-only mode.
Typical use cases for this might include creating a copy of databases for reporting purposes or when migrating databases from one server to another.
Before you run the generated script you may need to kill open connections to the databases or certain running activities. Obviously you would want to make sure it is safe to do so before killing any activity.
To kill activities en masse you can use the script in this post.
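The generator itself can be as simple as the following sketch; it only prints the ALTER statements, so you can review them before running anything:

```sql
-- Generates ALTER DATABASE statements to set each user database read-only.
-- Review the output before executing it.
SELECT 'ALTER DATABASE ' + QUOTENAME(name) + ' SET READ_ONLY WITH NO_WAIT;'
FROM sys.databases
WHERE database_id > 4          -- skip the system databases
  AND state_desc = 'ONLINE'
  AND is_read_only = 0;        -- skip databases already read-only
```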
The following script will create a job that will run every minute to test whether Database Mail can be sent from a job scheduled by the SQL Server Agent.
Simply find and replace the email address below with the email address you want to target:
Then run the script.
The operator ‘Test Operator’ and job ‘MailTest’ will be created.
The job is disabled by default, enable it to begin testing.
When you are finished run the commented out section at the bottom of the script to remove the test operator and job.
If you have just set up Database Mail for the first time, the SQL Server Agent will need to be restarted.
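At its core, the job step runs a call like the one below (the profile name and address are placeholders you would replace with your own):

```sql
-- Sends a test email via Database Mail; requires an existing mail profile
EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'YourMailProfile',   -- placeholder: your Database Mail profile
     @recipients   = 'you@example.com',   -- placeholder: target email address
     @subject      = 'SQL Server Agent mail test',
     @body         = 'If you received this, Database Mail works from the Agent job.';
```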
If you have a password protected SSIS package you may encounter this error.
First off, this seems like a very buggy problem and the solution may not work for everyone. Also, as is typical, you may find that you are trying to solve more than one problem at once; for instance, moving from one domain to another seems to confuse matters. So if this solution doesn’t work, it may be just part of the solution to your particular puzzle.
If you go to the job step properties you actually see that there are two spaces between /DECRYPT and /CHECKPOINTING.
If you reduce the space to one and run the job again you might get this error:
“Enter decryption password: Missing argument for option “decrypt””
This is because you’ve removed the password from the job associated with the package.
I’ve found that editing the command manually, first removing the extra space and then retyping the command with the package password, solves the error:
/DECRYPT YourPackagePassword /CHECKPOINTING
If this doesn’t work try scripting out the job, deleting it, and then running the script again.
The following code will work for a remote client request to SQL Server 2008 and newer.
Note: The local machine address (local_net_address) is that of the SQL Server while client_net_address is the address of the remote computer you have used to make the request.
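The addresses described above come from the sys.dm_exec_connections DMV; a sketch for the current connection looks like this:

```sql
-- Local and client network addresses for the current connection
SELECT c.session_id,
       c.client_net_address,   -- address of the remote computer making the request
       c.local_net_address,    -- address of the SQL Server itself
       c.local_tcp_port
FROM sys.dm_exec_connections AS c
WHERE c.session_id = @@SPID;   -- restrict to the current session
```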
Here’s a quick script to set a database to single user and then back to multi user.
You’ll need to do this when altering the database in situations like database name changes, file name changes, or even restores.
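The pattern is short enough to show in full (YourDatabase is a placeholder name):

```sql
-- Take the database to single-user mode, rolling back any open transactions
ALTER DATABASE YourDatabase SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

-- ... perform the rename, file change or restore here ...

-- Return the database to normal multi-user access
ALTER DATABASE YourDatabase SET MULTI_USER;
```

WITH ROLLBACK IMMEDIATE disconnects other sessions straight away, so make sure nothing important is running before you use it.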
This script is for SQL Server 2005 and up. The script will return all the missing indexes for a SQL Server instance, rating their impact and provide the T-SQL to create the missing indexes.
SQL Server 2005 was the first version of SQL Server to add Dynamic Management Views (DMVs) and Dynamic Management Objects (DMOs), which this script requires to function.
DMVs and DMOs provide useful information about SQL Server such as expensive queries, wait types, missing indexes etc.
Before you create the missing indexes on the referenced tables you must consider the following essential points:
• Find and assess all the queries that use the referenced table. If the table has a heavy volume of Data Manipulation Language (DML) operations against it (SELECT, INSERT, UPDATE, or DELETE), then you must analyse the impact of adding the missing index before you create it on the table. INSERTs on tables are slowed down by nonclustered indexes, for example.
• You need to make sure that by creating the missing indexes you are not going to end up with duplicate indexes on tables. The duplicate or unwanted indexes can kill your database performance. Search for the blog “over-indexing can hurt your SQL Server performance” for more information.
• If you find there is already an existing index that has most of the columns of the missing index highlighted you should consider adding the missing columns to the current index rather than creating another index on the table. FYI making an index wider does not mean adding all columns from a table into the current index.
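With those caveats in mind, the core of such a script is a join across the missing-index DMVs, roughly along these lines (the generated index names here are arbitrary and the impact figure is only an estimate):

```sql
-- Estimated impact and a suggested CREATE INDEX statement per missing index
SELECT ROUND(migs.avg_total_user_cost * migs.avg_user_impact
             * (migs.user_seeks + migs.user_scans), 0) AS ImprovementMeasure,
       mid.statement AS TableName,
       'CREATE NONCLUSTERED INDEX IX_Missing_'
       + CONVERT(VARCHAR(20), mig.index_group_handle) + '_'
       + CONVERT(VARCHAR(20), mid.index_handle)
       + ' ON ' + mid.statement
       + ' (' + ISNULL(mid.equality_columns, '')
       + CASE WHEN mid.equality_columns IS NOT NULL
                   AND mid.inequality_columns IS NOT NULL THEN ',' ELSE '' END
       + ISNULL(mid.inequality_columns, '') + ')'
       + ISNULL(' INCLUDE (' + mid.included_columns + ')', '') AS CreateIndexStatement
FROM sys.dm_db_missing_index_groups mig
JOIN sys.dm_db_missing_index_group_stats migs
     ON migs.group_handle = mig.index_group_handle
JOIN sys.dm_db_missing_index_details mid
     ON mid.index_handle = mig.index_handle
ORDER BY ImprovementMeasure DESC;
```

Remember the DMV contents reset when the instance restarts, so the suggestions only reflect the workload since the last restart.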
So the vendors for the call system software ran the code below on the live system on a Friday night. Thanks guys. I’ve changed the names of the tables in the code for security reasons, in case anyone would like to use it as an example of how not to write T-SQL code!
The vendors claim the code came from the company that developed the call system and if this is the case I think someone in head office is trying to get fired or get someone fired.
Scroll down for a review of why the code is such a mess.
Here are some of the issues with the above in no particular order.
The command is running against a 312 million row table, a 55 million row table and a 22 million row table. Although it was only planned to run once, IT STILL NEEDED TO BE EFFICIENT!
It uses SELECT * (select star), which raises the probability that SQL Server will scan the whole table rather than accessing the data through indexes. Only an Id is needed to begin with.
There are subqueries used when a join would be much faster.
The subqueries are completely unnecessary when WHERE conditions could have been listed in an IN statement.
Correlated subquery!!! Literally designed to bring a server to its knees! It is evaluated once for each row processed. See more on correlated subqueries here https://en.wikipedia.org/wiki/Correlated_subquery
There are EXISTS Statements used due to the poor design of the command.
A ridiculously giant CASE statement is used! When there is an excessive number of CASE branches, the comparison data should typically be written to a temp table and handled as a join instead.
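To make the correlated-subquery point concrete, here is a contrived before-and-after (the table and column names are invented, not the vendor's):

```sql
-- Correlated subquery: the inner query is re-evaluated for every outer row
SELECT c.CallId
FROM dbo.Calls AS c
WHERE c.AgentId = (SELECT a.AgentId
                   FROM dbo.Agents AS a
                   WHERE a.AgentId = c.AgentId
                     AND a.IsActive = 1);

-- Equivalent join: the optimiser can process it as a single set-based operation
SELECT c.CallId
FROM dbo.Calls AS c
JOIN dbo.Agents AS a
  ON a.AgentId = c.AgentId
WHERE a.IsActive = 1;
```

On tables with hundreds of millions of rows, the difference between these two shapes is the difference between a query that finishes and one that doesn't.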
Needless to say this query ballooned the tempdb and I’ll let you guess what happened next.
The lesson for today’s posting, trust no one!
The code below will run against every online local (i.e. not Azure SQL Database) database, excluding the system databases apart from master, and return the size of each index in each database. Knowing these values is especially important because it is recommended that the free space available on disk be at least 1.5 times the size of the largest index. This is only one guideline regarding disk space, however, and use cases vary from database to database.
The output will include:
- Database Name
- Schema Name
- Table Name
- Index Id
- Index Name
- Index Size (MB)
You can uncomment the last three lines to focus on nonclustered indexes.
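One way to sketch this is a cursor over sys.databases that runs the same sys.dm_db_partition_stats query in each database via dynamic SQL:

```sql
-- Collects index sizes from every online local database into a temp table
IF OBJECT_ID('tempdb..#IndexSizes') IS NOT NULL DROP TABLE #IndexSizes;
CREATE TABLE #IndexSizes
(
    DatabaseName SYSNAME,
    SchemaName   SYSNAME,
    TableName    SYSNAME,
    IndexId      INT,
    IndexName    SYSNAME NULL,        -- NULL for heaps (index_id = 0)
    IndexSizeMb  DECIMAL(18, 2)
);

DECLARE @Db SYSNAME, @Sql NVARCHAR(MAX);

DECLARE DbCursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT name
    FROM sys.databases
    WHERE state_desc = 'ONLINE'
      AND (database_id = 1 OR database_id > 4);  -- master plus user databases

OPEN DbCursor;
FETCH NEXT FROM DbCursor INTO @Db;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @Sql = N'
    INSERT INTO #IndexSizes
    SELECT ' + QUOTENAME(@Db, '''') + N', s.name, t.name, i.index_id, i.name,
           SUM(ps.used_page_count) * 8 / 1024.0   -- 8 KB pages to MB
    FROM ' + QUOTENAME(@Db) + N'.sys.dm_db_partition_stats ps
    JOIN ' + QUOTENAME(@Db) + N'.sys.indexes i
         ON i.object_id = ps.object_id AND i.index_id = ps.index_id
    JOIN ' + QUOTENAME(@Db) + N'.sys.tables t  ON t.object_id = i.object_id
    JOIN ' + QUOTENAME(@Db) + N'.sys.schemas s ON s.schema_id = t.schema_id
    GROUP BY s.name, t.name, i.index_id, i.name;';
    EXEC sys.sp_executesql @Sql;
    FETCH NEXT FROM DbCursor INTO @Db;
END
CLOSE DbCursor;
DEALLOCATE DbCursor;

SELECT *
FROM #IndexSizes
-- WHERE IndexId > 1   -- uncomment to focus on nonclustered indexes
ORDER BY IndexSizeMb DESC;
```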