Tracking activity to a Word document for audit purposes or to prove compliance

More than once we need to provide proof of some of the steps, if not all, that we took while implementing a change or solving a ticket. Sometimes this can be accomplished easily with screen capture software or the Print Screen function built into Windows, while in other cases, depending on the length of the activity we need to track, it becomes heavily intrusive.

Because of this I started to look for an easier way to do it by means of automation and PowerShell. The problems/questions to solve were:

  • how often I should take a capture
  • what the capture size should be
  • which captures I should write to the document

The answers to those questions were:

  • every second the script checks whether the mouse moved, and if it did, a capture is taken at the position where the mouse was
  • I chose a size that shows enough of the area surrounding the mouse. This area can be changed to fit your requirements, but it will affect the document size.
  • all captures taken when the mouse moves; movement of the mouse means we did something. I do not use the keyboard that much while performing my day-to-day tasks.

We need, of course, to have Word installed, and we need to stop the script execution before saving the document.
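
The downloadable script is the complete solution; purely to illustrate the mechanism described above, a minimal sketch could look like the lines below. The 400x300 capture size and the temporary file path are arbitrary example values, not the ones used in the real script.

Add-Type -AssemblyName System.Windows.Forms
Add-Type -AssemblyName System.Drawing

$word = New-Object -ComObject Word.Application
$word.Visible = $true
$doc  = $word.Documents.Add()

$last   = [System.Windows.Forms.Cursor]::Position
$width  = 400
$height = 300

while ($true) {
    Start-Sleep -Seconds 1
    $pos = [System.Windows.Forms.Cursor]::Position
    if ($pos -eq $last) { continue }                       # no mouse movement, nothing to capture
    $last = $pos

    # grab a rectangle centered on the cursor (no screen-edge handling in this sketch)
    $bmp = New-Object System.Drawing.Bitmap $width, $height
    $gfx = [System.Drawing.Graphics]::FromImage($bmp)
    $gfx.CopyFromScreen($pos.X - ($width / 2), $pos.Y - ($height / 2), 0, 0, $bmp.Size)
    $gfx.Dispose()

    $tmp = Join-Path $env:TEMP 'activity_capture.png'      # example path
    $bmp.Save($tmp, [System.Drawing.Imaging.ImageFormat]::Png)
    $bmp.Dispose()

    # append a timestamp and the capture to the Word document
    $range = $doc.Content
    $range.Collapse(0)                                     # 0 = wdCollapseEnd
    $range.InsertAfter("$(Get-Date) - mouse at $($pos.X),$($pos.Y)`r`n")
    $range.Collapse(0)
    $doc.InlineShapes.AddPicture($tmp, $false, $true, $range) | Out-Null
}
# stop the loop (Ctrl+C) before saving the document from Word, as mentioned above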

Below is how the document looks.

The script can be downloaded from here

Display folder size in a graphical way using SQL Server

Goal

To obtain the size of folders, together with their paths, in a graphical way: to have the output of the tree command, the dir command and the folder size in a single output.

My way of achieving it

Using T-SQL and xp_cmdshell to call the tree, dir and powershell.exe commands. The desire to have something like this came when I had to troubleshoot free space issues and see which folder is the biggest consumer. Of course multiple approaches and solutions can answer this question, but I settled on the tree command because I liked the look of its output. The first output shows the information the way the tree command does, while the second tries to sort the folders by size to make the biggest consumer easy to spot. One problem is that for locations where SQL Server does not have access the size returned is 0. Even so, I decided to finish the script and use it as is because of the output. The script also has a depth switch that lets you decide how deep you want to traverse the folders; the maximum is 3.

The output of the script looks like this

sqlservertreelikeoutput
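
The full script is the T-SQL download below; just to illustrate the size-calculation part it shells out to, a rough PowerShell sketch could look like this. The path and depth are example values, and -Depth needs PowerShell 5 or later.

$root  = 'C:\Temp'          # example path
$depth = 2                  # similar in spirit to the script's depth switch (maximum 3)

Get-ChildItem -Path $root -Directory -Recurse -Depth $depth |
    ForEach-Object {
        # sum the size of every file under the folder; locations we cannot read count as 0,
        # just like the original script for locations SQL Server cannot access
        $bytes = (Get-ChildItem -Path $_.FullName -File -Recurse -ErrorAction SilentlyContinue |
                  Measure-Object -Property Length -Sum).Sum
        [pscustomobject]@{ Folder = $_.FullName; SizeMB = [math]::Round($bytes / 1MB, 2) }
    } |
    Sort-Object SizeMB -Descending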

The code can be downloaded from here

Load balancing database files among mount points

Goal

To load balance the database files among the mount points or paths that we have available for the database.

My way of achieving it

Using T-SQL, xp_cmdshell and PowerShell. From time to time, because databases grow and more databases are created, we need more and more storage, and because of this we end up adding more disks or mount points (the case this script and post address), but these are not used evenly by the databases on our instance. Some of the databases that were created in the beginning will use only the mount points that were available when they were created. Because of this, from time to time we need a method to rebalance the files among the mount points. The script below is my attempt to load balance those files. By default it displays information for databases that have 4 database files. Be aware that the script does not check the available free space at the destination. The script only displays the information without executing anything. The output of the script for a test database will look like the picture below.

loadbalancingdbfiles_17July2016
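
The download below is the actual T-SQL script; the core balancing idea, stripped down to a sketch with made-up file names and mount points, is simply a round-robin assignment:

$mountPoints = 'E:\SQLData1', 'E:\SQLData2', 'E:\SQLData3'                     # example mount points
$dbFiles     = 'TestDb.mdf', 'TestDb_1.ndf', 'TestDb_2.ndf', 'TestDb_3.ndf'    # example files

$i = 0
foreach ($file in $dbFiles) {
    # display only, like the original script: no move is executed and no free space is checked
    $target = $mountPoints[$i % $mountPoints.Count]
    '{0,-15} -> {1}' -f $file, $target
    $i++
}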

The code can be downloaded from here

Duplicate objects from one schema to another schema

Goal

To duplicate all the tables, views, stored procedures and user-defined functions that belong to one schema into another schema. Of course there are more object types that may need to be duplicated, but as a start those are the main ones that the script handles. Other objects can be added with little modification to the script.

My way of achieving it

Using PowerShell and scripting.

The script tries to do what the Generate Scripts option at the database level does, but on top of that it tries to address the following:


  • It tries to resolve the object names inside all of the generated code so that they point to the objects in the right (new) schema – this is where the script might have an advantage over the Generate Scripts option at the database level.
    • Meaning that if an object definition in the dbo schema contained – SELECT * from software_definition – it will be transformed into SELECT * from testuser.software_definition. As you can guess, this string replacement might take a lot of time if many objects are scripted and the object definitions are long. More than that, the replacement might sometimes be wrong, which is why more tests need to be conducted. A rough sketch of this scripting-and-replace step is shown after this list.
  • It tries to decrease the need for manual work.
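
As announced in the list above, here is a rough sketch of that scripting-and-replace step, assuming the SqlServer PowerShell module (for SMO) is available and reusing the dbo/testuser example. It is not the downloadable script, and it keeps the same naive string replacement the post warns about.

Import-Module SqlServer                              # assumption: the SqlServer module is installed

$server       = New-Object Microsoft.SqlServer.Management.Smo.Server 'localhost'
$db           = $server.Databases['AdventureWorks']  # example database
$sourceSchema = 'dbo'
$targetSchema = 'testuser'

$objects = @($db.Tables) + @($db.Views) + @($db.StoredProcedures) + @($db.UserDefinedFunctions) |
    Where-Object { $_.Schema -eq $sourceSchema -and -not $_.IsSystemObject }

foreach ($obj in $objects) {
    # Script() returns the CREATE statement(s) for the object
    $script = ($obj.Script() -join "`r`n")
    # naive schema replacement, exactly the part that can sometimes be wrong
    $script = $script -replace "\[?$sourceSchema\]?\.", "[$targetSchema]."
    $script, 'GO' | Out-File -Append -FilePath ".\duplicate_to_$targetSchema.sql"
}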

The output of the full script looks like the pictures below.

duplschema2 duplschema1

Please use it first on your virtual machines or your test instances, for example against the AdventureWorks sample databases. As I said, it is not a replacement for the Generate Scripts option; it is just an attempt to avoid some of the following:

– clicking operations

– selecting some scripting options

– finding and replacing some strings, schema.name to newschema.name

– adding the new schema name for the objects that are missing the schema prefix (this still needs to be adjusted to work as well as possible)

The code can be downloaded from here

Check what type of storage your SQL Server is attached to

Goal

To group the storage of our SQL Server into 4 main categories that will help us when looking for information about the speed of the storage and whether and how the storage can be increased.

My way of achieving it

Using powershell.

The easiest and most convenient way was to create a one-liner that we can run in PowerShell to output the desired information. I do not know how well it will work with other hardware configurations, since the information I used to create the one-liner contained only the following:

Disks

  • NETAPP LUN  Multi-Path Disk Device
  • HP LOGICAL VOLUME SCSI Disk Device
  • PowerDevice by PowerPath
  • HP LOGICAL VOLUME SCSI Disk Device
  • SanDisk SD7TB3Q-256G-1006 ATA Device
  • HGST HTS545050A7E380
  • VMware Virtual disk SCSI Disk Device

Controllers

  • QLogic Fibre Channel Adapter
  • Emulex OneConnect OCe11100, iSCSI Storport Miniport Driver
  • QLogic Fibre Channel Adapter
  • EMS Device Bus
  • Microsoft Multi-Path Bus Driver
  • Smart Array P410i Controller
  • Microsoft Storage Spaces Controller

but I will update the script in time.

The one-liner is this:

Get-WmiObject Win32_DiskDrive | group-object model | foreach {if($_.Name -like "*virtual*") {"1. Virtual Machine"} else {Get-WmiObject Win32_SCSIController | group-object name |foreach {switch -Wildcard ($_.Name) {"*smart array*" {"2. Smart Array"}; "*fibre*" {"3. HBA's"}; default {"4. Built-in Controllers"}}}}} | group-object $_ | select Name |sort-object Name

Below is a picture of running the script on 3 different servers

groupcontrollers

SQL Server database restore – preparations and tests that we must perform before starting the restore

Goal

To obtain information about the target database files, to check the backup file that we have to use, to estimate the space required and to have a list of steps done and a list of steps still to do.

My way of achieving it

Using a linked server and xp_cmdshell. Restoring a SQL Server database is something that most of the time requires careful attention, especially for big databases where the restore operation is time consuming and we would like to have everything in place before it starts. Not only does it require our full attention, the restore also requires preparation, which is in many cases time consuming and prone to errors and mistakes, or something that I usually forget to check. Because of this I decided to create a script that checks some of the things required for a potentially successful restore. The script performs the following verifications:
– checks if the bak file is accessible from the target instance
– checks the drive information of the target instance for space available for the restore
– checks if the target instance is older than the source instance, since backups from newer versions of SQL Server cannot be restored on older versions
– checks the full bak file for information about the size of the files that will be restored
– checks if the target database exists on the target instance
– checks if we have space for the individual database file restores and, if not, provides some alternative locations for relocation
– generates the restore command; it is a very basic one that needs to be changed in case the target database does not exist on the target instance

Before using the script we must modify @sourceinstance, @targetinstance, @sourcedb and @targetdb with the right information.
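
The post's script is pure T-SQL (linked server plus xp_cmdshell); purely as an illustration of a couple of the checks, a PowerShell sketch could look like the lines below. The paths and instance name are placeholders, it needs the SqlServer module for Invoke-Sqlcmd, and note that Test-Path checks accessibility from the machine running PowerShell rather than from the target instance's service account as the real script does.

$bakFile        = '\\backupserver\share\sourcedb.bak'     # placeholder path
$targetInstance = 'TARGETSRV\INST1'                       # placeholder instance

# 1. is the backup file reachable?
if (-not (Test-Path -Path $bakFile)) { throw "Backup file $bakFile is not accessible" }

# 2. which files would be restored and how big are they?
Import-Module SqlServer
Invoke-Sqlcmd -ServerInstance $targetInstance -Query "RESTORE FILELISTONLY FROM DISK = N'$bakFile';" |
    Select-Object LogicalName, PhysicalName,
                  @{ Name = 'SizeMB'; Expression = { [math]::Round($_.Size / 1MB, 0) } }

# 3. free space on the target server's local drives
Get-WmiObject -Class Win32_LogicalDisk -ComputerName ($targetInstance -split '\\')[0] -Filter 'DriveType=3' |
    Select-Object DeviceID, @{ Name = 'FreeGB'; Expression = { [math]::Round($_.FreeSpace / 1GB, 1) } }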

The output of the script is below.

restoreprerequistes

The script can be downloaded from here

Display group membership (including nested groups) of a domain account using SQL Server

Goal

To obtain a list of groups that a domain account is member of.

My way of achieving it

Using SSMS and Ad Hoc Distributed Queries. In order to use Ad Hoc Distributed Queries you must temporarily enable it using sp_configure.

Sometimes, when we need to troubleshoot why a domain account does not have the requested permissions, we have to check its group membership. Most of the time access to SQL Server is granted by making a user a member of a specific Active Directory domain group and adding that group as a login, or by adding the domain account itself as a login. Although most of the time this action will solve the request, there might be cases where users say they do not have the same rights as other users. This post addresses this situation by discovering the group membership of those domain accounts. Although we can use the net user command with the domain switch in a command prompt window

– net user domain_account /domain

the problem is that this will not return any nested groups.

When comparing 2 domain accounts for differences they might have in SQL Server with regard to permissions, we want to be sure that those accounts have the same group membership.
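
The script itself stays inside SSMS and uses Ad Hoc Distributed Queries; just for comparison, a small PowerShell sketch that resolves nested membership through the account's tokenGroups attribute might look like this (domain_account is the same placeholder as above, and the machine running it has to be domain joined):

# find the account in Active Directory
$searcher = [adsisearcher]"(&(objectCategory=user)(sAMAccountName=domain_account))"
$user     = $searcher.FindOne().GetDirectoryEntry()

# tokenGroups contains the full, nested group membership as SIDs
$user.RefreshCache('tokenGroups')
foreach ($sidBytes in $user.Properties['tokenGroups']) {
    $sid = New-Object System.Security.Principal.SecurityIdentifier($sidBytes, 0)
    try   { $sid.Translate([System.Security.Principal.NTAccount]).Value }
    catch { $sid.Value }    # SIDs that cannot be translated are shown as-is
}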

The output of the script is below, and the script can be downloaded from here

groupmembership


SQL Server troubleshooting – what happened with my database

Goal

Obtaining, in the same SSMS window, information about why a database might have been unavailable or had its status changed.

My way of achieving it

Almost all monitoring solutions deployed these days raise tickets every time a database becomes unavailable, and from the DBA operational point of view this means that we must connect, check what happened with the database and take the required actions. Of course, depending on the monitoring implementation, some tickets will be closed automatically if the database is back online during the second check, but this does not mean we should not try to see why the database status changed. Although we can do our investigation using the GUI tools that SQL Server provides, this approach has some limitations that the script used here tries to overcome. Below are some of them:

– the account we use to connect to the server where the SQL Server instance is running might not have rights to access the location where the default trace files are stored, which makes using SQL Server Profiler almost impossible

– SQL Server Log File Viewer was not designed to filter or search the SQL Server error log files on multiple strings at the same time, which makes filtering or searching the logs for string a and string b together impossible

Because of this and other limitations I turned my attention to finding other ways of searching and filtering the SQL Server error log files and the default trace, and displaying the required information in only one window.

In some cases the output of the script will provide enough information to see what happened, while in other cases it might only give us the names of the logins that were performing activities during that time.

dbstatus

The main benefit of this approach is that we can have, in one window, the information pertaining to that database from both the SQL Server error log files and the default trace.
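
The downloadable script does all of this in one SSMS window with T-SQL; as a rough PowerShell-flavoured sketch of the same two lookups (the SqlServer module and the database name MyDB are assumptions, not part of the original script):

Import-Module SqlServer
$instance = 'localhost'       # example instance
$db       = 'MyDB'            # example database name

# entries from the current SQL Server error log that mention the database
Invoke-Sqlcmd -ServerInstance $instance -Query "EXEC xp_readerrorlog 0, 1, N'$db';"

# events from the default trace that touched the database
$query = @"
DECLARE @path NVARCHAR(260);
SELECT @path = path FROM sys.traces WHERE is_default = 1;
SELECT t.StartTime, te.name AS event_name, t.DatabaseName, t.LoginName, t.HostName
FROM sys.fn_trace_gettable(@path, DEFAULT) AS t
JOIN sys.trace_events te ON t.EventClass = te.trace_event_id
WHERE t.DatabaseName = N'$db'
ORDER BY t.StartTime DESC;
"@
Invoke-Sqlcmd -ServerInstance $instance -Query $query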

The script can be downloaded from here

SQL Server troubleshooting using performance counters

Goal

Sometimes when I need to troubleshoot a local or remote SQL Server instance I need information from inside the SQL Server instance, but also information from outside of it, in order to start my investigation or draw some conclusions. Most of the time this is a very time-consuming operation and not always straightforward, hence the need to have something that I can reuse over and over again in these situations.

My way of achieving it

After searching and seeing what other people do in this area, I decided I could combine some of the ideas I found with my own in one script that gives me access to performance counters that live outside of SQL Server but are obtainable from SQL Server. I already mentioned in another post, and I would like to mention it again, that every script I post might have flaws or shortcomings and should be perceived as such. More than that, the scripts can be considered the result of the collective effort of different people from the internet, since I am taking and using what they have shared.

The output of the script provides the following type of information, but the script can be modified to return the kind of information that you would like to have and use. For me it is important, when troubleshooting a SQL Server instance, to know:

– the processor utilization

– available memory

– disk utilization for the drives where sql server has files

– network utilization

Counter    Value
"\\WIN-666BDQE0KVL\Memory\Commit Limit"    4292546560
"\\WIN-666BDQE0KVL\Memory\Available MBytes"    1511
"\\WIN-666BDQE0KVL\Network Interface(Intel[R] PRO_1000 MT Network Connection)\Output Queue Length"    0
"\\WIN-666BDQE0KVL\Network Interface(isatap.{F5634C4F-D7A9-4921-924B-C112B6BC5377})\Output Queue Length"    0
"\\WIN-666BDQE0KVL\Network Interface(Local Area Connection* 11)\Output Queue Length"    0
"\\WIN-666BDQE0KVL\Network Interface(Intel[R] PRO_1000 MT Network Connection)\Bytes Total/sec"    0
"\\WIN-666BDQE0KVL\Network Interface(isatap.{F5634C4F-D7A9-4921-924B-C112B6BC5377})\Bytes Total/sec"    0
"\\WIN-666BDQE0KVL\Network Interface(Local Area Connection* 11)\Bytes Total/sec"    0
"\\WIN-666BDQE0KVL\Processor(_Total)\% User Time"    -1
"\\WIN-666BDQE0KVL\Processor(_Total)\% Privileged Time"    0
"\\WIN-666BDQE0KVL\Processor(_Total)\% Processor Time"    0
"\\WIN-666BDQE0KVL\LogicalDisk(C:)\Current Disk Queue Length"    0
"\\WIN-666BDQE0KVL\LogicalDisk(G:)\Current Disk Queue Length"    0
"\\WIN-666BDQE0KVL\Process(_Total)\Page File Bytes"    2915581952
"\\WIN-666BDQE0KVL\Process(sqlservr#3)\Page File Bytes"    434040832
"\\WIN-666BDQE0KVL\Process(sqlservr#3)\% User Time"    0
"\\WIN-666BDQE0KVL\Process(sqlservr#3)\% Privileged Time"    0
"\\WIN-666BDQE0KVL\Process(sqlservr#3)\% Processor Time"    0

typeperf

I will not explain here how the above output should be used, since the goal of the post was only to provide a method to obtain performance counters from outside of SQL Server.
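
For readers who only need the raw idea rather than the full script from the link below, the core of it can be approximated with Get-Counter; the computer name and counter paths here are just examples covering the four areas listed above.

$computer = $env:COMPUTERNAME        # or a remote server name
$counters = @(
    '\Memory\Available MBytes',
    '\Processor(_Total)\% Processor Time',
    '\LogicalDisk(*)\Current Disk Queue Length',
    '\Network Interface(*)\Bytes Total/sec'
)

Get-Counter -ComputerName $computer -Counter $counters |
    Select-Object -ExpandProperty CounterSamples |
    Select-Object Path, CookedValue |
    Format-Table -AutoSize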

The script can be downloaded from here

Interpreting sp_WhoIsActive stored procedure output for beginners

I will start my first post by apologizing in advance for any mistakes that I will most probably make, but I hope that I will learn and educate myself along the way.

The idea behind this post came to me after I first heard about the stored procedure that Adam Machanic wrote and after I saw how useful it was for troubleshooting or seeing what is happening on a SQL Server. Depending on our role in the company and our day-to-day activities, I realized that I needed something to refresh the meaning of the output until it becomes second nature to me. This was my attempt to take some of the information he already made available on his blog ( http://sqlblog.com/blogs/adam_machanic/archive/2011/04/01/a-month-of-monitoring-part-1-of-30-a-brief-history-of-monitoring.aspx ) and put it in a format that was, in the beginning, easier for me to understand and communicate to other colleagues. As with everything I will post here, this was my attempt, and of course it has some flaws and shortcomings, but overall I believe it makes sense for someone who is learning the output of this stored procedure and how to use it. In order to use it we have to follow these steps:

1. Create the stored procedure using the latest version of the SP from here http://sqlblog.com/files/default.aspx

2. Create a powershell script using the code provided at the end of this post.

3. Run the stored procedure with the parameters that you want, but using this output column list. This is a prerequisite because, depending on the parameters received by the SP, the last columns can contain more than one line, which the script cannot parse correctly. We want to have those columns at the end so we can select all the columns except those ones.

exec sp_whoisactive
@output_column_list = '[session_id][dd hh:mm:ss.mss][dd hh:mm:ss.mss (avg)][physical_io][reads][physical_reads][writes][tempdb_allocations][tempdb_current][CPU][context_switches][used_memory]
[physical_io_delta][reads_delta][physical_reads_delta][writes_delta][tempdb_allocations_delta][tempdb_current_delta][CPU_delta][context_switches_delta][used_memory_delta][tasks]
[status][wait_info][locks][tran_start_time][tran_log_writes][open_tran_count][blocking_session_id][blocked_session_count][percent_complete][host_name][login_name][database_name]
[program_name][start_time][login_time][request_id][collection_time][additional_info][sql_text][sql_command]'

4. Run the PowerShell script, but not from the ISE. We will be prompted to select the command that we ran in SSMS and to press Enter after making it available in the clipboard.

PS C:\Users\Administrator\Downloads> .\sp_whoisactive10august2015.ps1
Select the command that you ran in SSMS and copy it in order to be available in the clipboard (Ctrl+C).
Press Enter to continue …:

5. After pressing Enter we have to provide the output of the command, including the column names. Usually I select only one row and all the column headers except additional_info, sql_text and sql_command since, as mentioned before, these sometimes span more lines and are not parsed correctly by the script.

6. After pressing Enter again the output is parsed and more information is provided about the columns and what they mean.
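
Roughly, the prompt-and-clipboard handoff in steps 4 and 5 works like the following sketch; it is not the downloadable script itself, and Get-Clipboard needs PowerShell 5 or later.

Read-Host 'Select the command that you ran in SSMS, copy it (Ctrl+C) and press Enter'
$whoIsActiveCommand = Get-Clipboard -Raw          # the exec sp_whoisactive statement

Read-Host 'Now copy one result row together with the column headers and press Enter'
$whoIsActiveOutput  = Get-Clipboard -Raw          # headers plus one row, typically tab separated when copied from SSMS

# from here the real script splits the header row and maps each column to its explanation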

Below is a picture with steps 4, 5 and 6. Because the script does not display the query, the first line may seem out of context, since we are not pasting the query captured in the sql_text or sql_command columns.

sp_whoisactive

The script can be downloaded from here