Do you want to identify the correct Service Tier and Compute Size (formerly known as performance level) for your Azure SQL Database? How would you go about it? Would you use the DTU (Database Transaction Unit) calculator? What about the newer vCore pricing model? How would you translate your current on-premises workload to the cloud?
It can be a matter of trial and error, especially if you are new to this, but I really do recommend trying out the PowerShell script that you get once you have installed DMA (Database Migration Assistant).
The process is two-fold. First, run the PowerShell script, which captures important data via perfmon counters and even via T-SQL. Then take its output, a CSV file, and run a second command to generate a report. You can find the script at the following location: C:\Program Files\Microsoft Data Migration Assistant\, which you will need to CD to.
First step: run the counter collector.
.\SkuRecommendationDataCollectionScript.ps1 -ComputerName Arunsql1 -OutputFilePath c:\localcounters.csv -CollectionTimeInSeconds 600 -DbConnectionString 'Server=Arunsql1;Initial Catalog=master;Integrated Security=SSPI;'
The script is pretty cool, even digging into the plan cache to get some details around physical and logical I/O. This is a snippet of the code, in case you are interested.
SELECT qs.sql_handle, qs.plan_generation_num, qs.creation_time, qs.last_execution_time,
       qs.plan_handle, qs.query_hash, qs.query_plan_hash,
       DB_NAME(CONVERT(int, pa.value)) AS db_name,
       qs.execution_count,
       qs.total_worker_time, qs.min_worker_time, qs.max_worker_time,
       qs.total_physical_reads, qs.min_physical_reads, qs.max_physical_reads,
       qs.total_logical_reads, qs.min_logical_reads, qs.max_logical_reads,
       qs.total_logical_writes, qs.min_logical_writes, qs.max_logical_writes,
       qs.total_clr_time, qs.min_clr_time, qs.max_clr_time,
       qs.total_elapsed_time, qs.min_elapsed_time, qs.max_elapsed_time
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_plan_attributes(qs.plan_handle) AS pa
Anyway, back to the main post. You will want to run this during a window that truly reflects your on-premises SQL Server workload, to paint an accurate picture. Once it completes, expect to see a CSV file. We need that CSV file as input for the next step.
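Before feeding the CSV into DmaCmd, it can be worth a quick sanity check that the collector actually captured something. Here is a minimal Python sketch that inspects whatever columns and rows the file contains; note that the exact column layout DMA emits is not shown here, so this deliberately makes no assumptions about it.

```python
import csv

def summarize_counters(path):
    """Print a quick summary of a collector CSV: column count and row count.

    The exact columns DMA writes are not assumed here; this just
    inspects whatever header row and data rows the file contains.
    """
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        rows = list(reader)          # each row becomes a dict keyed by header
        columns = reader.fieldnames  # header row as a list of names
    print(f"{len(columns)} columns, {len(rows)} rows collected")
    return columns, rows
```

If the row count is near zero, the collection window was probably too short or the connection string was wrong, and the recommendation step will have little to work with.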
.\DmaCmd.exe /Action=SkuRecommendation /SkuRecommendationInputDataFilePath="c:\localcounters.csv" /SkuRecommendationOutputResultsFilePath="C:\localprices.html" /SkuRecommendationTsvOutputResultsFilePath="C:\localprices.tsv" /SkuRecommendationPreventPriceRefresh=true
This is what you can expect from the HTML file. These are minimum recommended levels.
I have 2 databases: one is recommended to be set at S0 (10 DTUs) under the DTU model, and the other at General Purpose Gen 5 under the vCore model. If you want to see some reasoning, or should I say the tool's thought process, check the TSV file. For example, it says of my rsc_test database that "this database appears to have very light traffic. It will be most cost-effective to host it on the Standard Tier". I know this is quite true.
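If you have many databases, scanning the TSV by eye gets tedious. A small Python sketch like the one below pulls out the rows for a single database. Note the column name "DatabaseName" is an assumption for illustration; check the header row of your own TSV and pass the actual name through the name_column parameter.

```python
import csv

def rows_for_database(tsv_path, db_name, name_column="DatabaseName"):
    """Return the recommendation rows for one database from a TSV report.

    NOTE: "DatabaseName" is an assumed column name, used for
    illustration only; adjust name_column to match the header row
    in the TSV your version of DMA actually produces.
    """
    with open(tsv_path, newline="") as f:
        reader = csv.DictReader(f, delimiter="\t")
        return [row for row in reader if row.get(name_column) == db_name]
```

For example, rows_for_database("C:\\localprices.tsv", "rsc_test") would return just the reasoning rows for that one database.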
You can also use the sliders to get an idea of the costs. Let's say that for my rsc_test database I want a P1 service tier and compute size; the pricing will adjust accordingly.
I like this as a really solid starting point from which to fine-tune.