Over the past six months I have been pushing myself to use Extended Events (XEvents) instead of SQL Trace, and once you get past the learning curve it is probably the way to go. If you operate in the Azure space then you have no choice: Extended Events are what you will need to use if you want to collect information against SQL Database. There are some subtle differences in how you write the T-SQL between SQL Database (Azure) and a locally installed SQL Server.
This is a new feature that lets you issue T-SQL commands via an editor straight from the Azure portal. This is great because sometimes you do not want to keep switching between your SSMS (SQL Server Management Studio) connection and the Azure portal to issue basic commands, and that assumes you have the right version of the tools installed and the correct ports opened locally.
This is in public preview and you can do the following:
- Query dynamic management views for real-time workload insights (which is what I will be doing).
- Issue ad-hoc queries.
- Manage your user authentication.
To access the editor, go to your SQL database, select the Tools button, and then select the Query editor button.
Here you will see the Query editor option.
You will need to log in. I only have SQL authentication set up, but if you use Active Directory then you have the choice of the other options.
If authentication fails, you will obviously be told.
You want to see a blue authentication image!
Now let’s issue a query instead of switching over to my local SSMS connection. How about some database-level wait stats? I really need to look into the IO_COMPLETION wait type!
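For example, a database-level wait stats query against sys.dm_db_wait_stats might look like the sketch below; the TOP clause, filter, and ordering are my own choices, not from a specific script:

```sql
-- Database-level waits in Azure SQL Database. Note that
-- sys.dm_db_wait_stats is scoped to the current database,
-- unlike sys.dm_os_wait_stats on a locally installed server.
SELECT TOP (10)
       wait_type,
       waiting_tasks_count,
       wait_time_ms,
       signal_wait_time_ms
FROM   sys.dm_db_wait_stats
WHERE  wait_time_ms > 0
ORDER  BY wait_time_ms DESC;
```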
Then, if you really want, you can save your query (which saves it to your local machine as a .sql file):
Saved locally as per below.
Or you can open a local SQL file (the Open query option) and have it imported into your Azure portal session. For example, you could have the wait stats script pre-written and just open it whenever you like.
It will not create a new window; it basically overwrites the contents of the currently open query window. I can work with that.
I was building some basic queries around time zone manipulation, and I am happy to say that I enjoyed myself, as I found a way to get a time based on a geographic region.
This new feature is available from SQL Server 2016 onwards, which obviously includes SQL Database (Azure). It is called AT TIME ZONE.
The AT TIME ZONE implementation relies on the following registry hive, where time zones are installed: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Time Zones. This is exposed via the following query:
SELECT * FROM sys.time_zone_info
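If you only care about one zone, you can filter on the name column; the zone picked here is just an illustration:

```sql
-- Look up the current offset and DST status for a single time zone.
SELECT name, current_utc_offset, is_currently_dst
FROM   sys.time_zone_info
WHERE  name = 'GMT Standard Time';
```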
Looking at Books Online via https://msdn.microsoft.com/en-us/library/mt612790.aspx, the documentation is wrong.
Look at the screenshot: it states it is supported from SQL Server 2008, but it is not. It only works on 2016 and later.
Running it on a pre-2016 SQL Server you will get the following message:
Msg 208, Level 16, State 1, Line 1 Invalid object name ‘sys.time_zone_info’.
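If you need a script that degrades gracefully across versions, one option (my own sketch, not part of the original error handling) is to check for the catalog view before using it:

```sql
-- Guard against pre-2016 servers where sys.time_zone_info does not exist.
IF OBJECT_ID(N'sys.time_zone_info') IS NOT NULL
    SELECT COUNT(*) AS TimeZoneCount FROM sys.time_zone_info
ELSE
    PRINT 'AT TIME ZONE requires SQL Server 2016 or later.';
```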
On a correct SQL Server version it will return the following (snippet).
This information is key, as we will use it to build the time zone queries (which are very basic).
SELECT SYSDATETIMEOFFSET() AT TIME ZONE 'GMT Standard Time' AS [My Location]
GO
SELECT SYSDATETIMEOFFSET() AT TIME ZONE 'US Mountain Standard Time' AS [US Mountain Standard Time]
GO
SELECT SYSDATETIMEOFFSET() AT TIME ZONE 'Central Asia Standard Time' AS [Central Asia Standard Time]
GO
SELECT SYSDATETIMEOFFSET() AT TIME ZONE 'AUS Central Standard Time' AS [AUS Central Standard Time]
Based on the zone offset you will get the relevant local time.
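AT TIME ZONE also works on plain datetime values that carry no offset, which is handy for converting stored UTC data. In this sketch the variable and its value are just an assumed example:

```sql
-- A datetime2 with no offset is first labelled as UTC,
-- then converted to the target time zone.
DECLARE @UtcValue DATETIME2 = '2016-07-01 12:00:00';

SELECT (@UtcValue AT TIME ZONE 'UTC') AT TIME ZONE 'US Mountain Standard Time'
       AS [Mountain Local Time];
```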
Have you ever wanted to start a SQL Trace (a server-side trace!) from SQL Agent, let it stop after X minutes, and write the TRC file to a location? Well, this is how you would do it:
ALTER PROCEDURE [dbo].[ScriptTrace]
    @RunTime  INT,
    @FileName NVARCHAR(100)
AS
SET NOCOUNT ON

DECLARE @rc          INT
DECLARE @TraceID     INT
DECLARE @maxfilesize BIGINT -- specify max file size
DECLARE @Now         DATETIME
DECLARE @StopTime    DATETIME
DECLARE @FileStamp   NVARCHAR(25)
DECLARE @COMMAND     NVARCHAR(125)

SET @Now      = GETDATE()
SET @StopTime = DATEADD(MI, @RunTime, @Now)
SET @FileName = @FileName + '_'
SET @FileStamp = CAST(DATEPART(YEAR,   GETDATE()) AS NVARCHAR)
               + CAST(DATEPART(MONTH,  GETDATE()) AS NVARCHAR)
               + CAST(DATEPART(DAY,    GETDATE()) AS NVARCHAR)
               + CAST(DATEPART(HOUR,   GETDATE()) AS NVARCHAR)
               + CAST(DATEPART(MINUTE, GETDATE()) AS NVARCHAR)
SET @FileName    = @FileName + @FileStamp
SET @maxfilesize = 18000

-- @FileName (rather than the undeclared @FQFileName in the original
-- listing) is the fully stamped file name passed to the trace.
EXEC @rc = sp_trace_create @TraceID OUTPUT, 0, @FileName, @maxfilesize, @StopTime
IF (@rc != 0) GOTO error

SET @COMMAND = 'SQL Trace Log @' + @FileName + ' activated for '
             + CAST(@RunTime AS VARCHAR) + ' minute(s)'
PRINT @COMMAND

-- Set the events, for example
DECLARE @on BIT
SET @on = 1
EXEC sp_trace_setevent @TraceID, 14, 1,  @on
EXEC sp_trace_setevent @TraceID, 14, 9,  @on
EXEC sp_trace_setevent @TraceID, 14, 6,  @on
EXEC sp_trace_setevent @TraceID, 14, 10, @on
EXEC sp_trace_setevent @TraceID, 14, 14, @on
EXEC sp_trace_setevent @TraceID, 14, 11, @on
EXEC sp_trace_setevent @TraceID, 15, 10, @on
EXEC sp_trace_setevent @TraceID, 15, 14, @on
EXEC sp_trace_setevent @TraceID, 15, 18, @on
EXEC sp_trace_setevent @TraceID, 15, 11, @on
EXEC sp_trace_setevent @TraceID, 15, 12, @on

-- Set the filters
DECLARE @intfilter INT
SET @intfilter = 5
-- filter 1: include DatabaseID 5
EXEC sp_trace_setfilter @TraceID, 3, 1, 0, @intfilter
-- filter 2: exclude SQL Profiler itself
EXEC sp_trace_setfilter @TraceID, 10, 0, 7, N'SQL Profiler'

EXEC sp_trace_setstatus @TraceID, 1
SELECT TraceID = @TraceID
GOTO finish

error:
SELECT ErrorCode = @rc

finish:
GO

-- Now call it
EXEC dbo.ScriptTrace 1, N'G:\Logs\Trace'
Output example: SQL Trace Log @G:\Logs\Trace_20153101419 activated for 1 minute(s)
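Once the trace has stopped, the TRC file can be read back into a result set with sys.fn_trace_gettable. The file name below is just an example matching the stamp format the procedure generates:

```sql
-- Read a completed server-side trace file back as a table.
SELECT TextData, Duration, Reads, Writes, StartTime
FROM   sys.fn_trace_gettable('G:\Logs\Trace_20153101419.trc', DEFAULT)
ORDER  BY StartTime;
```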