It has been nearly six months since I last presented my Azure SQL Database Monitoring and Performance Tuning session at the Oregon SQL Server User Group. I have since redelivered this session a couple more times and accumulated another set of interesting questions from SQL Saturday Redmond (Apr 2, 2016) and SQL Saturday Madison (Apr 9, 2016). Continue reading below to find out what’s new with Index Advisor and QPI, as well as more DTU-related questions.
This is a follow-up to Part 1 of my Azure SQL Database Q & A with the SQL community.
Q & A
Q: Would turning on Index Advisor cost me anything?
The computation that produces the recommendations is free and does not consume your database’s resources. Applying and verifying the recommended actions, however, is billed against your DB resources (the same behavior as when you apply the recommendations yourself).
Note that the benefits gained by applying the recommendations are generally high relative to the cost, including:
- Reduced resource utilization.
- Increased query throughput / reduced query latency.
Note: Since the talks at SQL Saturday Redmond and Madison, Index Advisor has been revamped. There is no official announcement from Microsoft that I can find at this stage, but I will blog about it as soon as I hear from the team. The new experience is displayed under Settings > Performance > Recommendations, as shown below.
Q: What do the High, Substantial, Moderate and Low impact ratings mean in Index Advisor?
The rating is determined by Azure SQL DB tuning models, and is based on the expected benefit that this index will provide to the overall performance of the workload.
In the image above, the Impact column reads “Low Impact” in the Recommendations blade. This is what the new Index Advisor looks like now.
Q: Does QPI cost me any DTUs?
Yes. Query Performance Insights (QPI) queries data from the Query Store for your database, and these queries are billed against your database’s resources.
Note: Since the talk at SQL Saturday Redmond and Madison, QPI is now displayed under Settings > Performance > Queries.
Q: The DTU Calculator cannot narrow its estimate down to a single database’s workload on my on-premises server, is that right?
Yes, but it’s a good start. Remember that you can scale up or down anytime you need to, and the bill is pro-rated on an hourly basis.
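If you prefer T-SQL over the portal, a service-tier change can also be issued directly against your logical server’s master database. A minimal sketch, assuming a database named MyDatabase and a target of Standard S2 (both placeholders):

```sql
-- Run against the logical server's master database.
-- Moves MyDatabase to the Standard tier at the S2 performance level.
ALTER DATABASE MyDatabase
MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S2');
```

The statement returns immediately; the actual tier change proceeds in the background, as described in the next answer.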
Q: Is there going to be down time when I scale up/down? What’s going to happen to my existing connections?
Note that changing the service tier and/or performance level of a database creates a replica of the original database at the new performance level, and then switches connections over to the replica. No data is lost during this process but during the brief moment when we switch over to the replica, connections to the database are disabled, so some transactions in flight may be rolled back. This window varies, but is on average under 4 seconds, and in more than 99% of cases is less than 30 seconds. Very infrequently, especially if there are large numbers of transactions in flight at the moment connections are disabled, this window may be longer.
The duration of the entire scale-up process depends on both the size and service tier of the database before and after the change. For example, a 250 GB database that is changing to, from, or within a Standard service tier, should complete within 6 hours. For a database of the same size that is changing performance levels within the Premium service tier, it should complete within 3 hours.
A video by Joe Idziorek on service tiers and how to scale up and down using the Azure Portal is available here.
Q: Is there a way to automatically scale up / down?
Not out of the box. You will need to combine a few techniques:
- Using Azure Automation. A sample script is available.
- Using Azure SQL DB Alert. This allows administrators (or any designated email address recipients) to get notified.
Note: Azure SQL DB Alert can also be linked to webhooks.
Q: How do you know what query gets that execution plan?
You can query it through the DMVs. A sample query to get which query id and query plan:
SELECT TOP 100
    q.query_id,
    p.plan_id,
    p.query_plan,
    qt.query_text_id,
    qt.query_sql_text
FROM sys.query_store_plan AS p
INNER JOIN sys.query_store_query AS q
    ON p.query_id = q.query_id
INNER JOIN sys.query_store_query_text AS qt
    ON q.query_text_id = qt.query_text_id;
Read Monitoring Performance By Using the Query Store for more details.
Tip: If you are using SSMS, you can also view the top resource-consuming queries, as shown below:
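If you prefer a query over the SSMS report, similar information can be pulled from the Query Store runtime statistics views. A sketch that ranks queries by average CPU time (the aggregation choices here are my own; adjust to taste):

```sql
-- Top queries by average CPU time, from Query Store runtime statistics.
SELECT TOP 10
    q.query_id,
    qt.query_sql_text,
    SUM(rs.count_executions)     AS total_executions,
    AVG(rs.avg_cpu_time)         AS avg_cpu_time,
    AVG(rs.avg_logical_io_reads) AS avg_logical_io_reads
FROM sys.query_store_runtime_stats AS rs
INNER JOIN sys.query_store_plan AS p
    ON rs.plan_id = p.plan_id
INNER JOIN sys.query_store_query AS q
    ON p.query_id = q.query_id
INNER JOIN sys.query_store_query_text AS qt
    ON q.query_text_id = qt.query_text_id
GROUP BY q.query_id, qt.query_sql_text
ORDER BY AVG(rs.avg_cpu_time) DESC;
```

Swapping the ORDER BY to avg_logical_io_reads gives you the top IO consumers instead.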
Q: I have a database of over 1 TB with 2,000 transactions per second. Is there anything in Azure SQL DB that can fit it?
Currently no – the highest offering is P11, which supports 1 TB. We recommend assessing whether more compression or further optimization of the database design is possible. The service and performance tiers have changed over time (for the better) in response to customer feedback. Please stay tuned to Azure Service Updates, or check Azure SQL DB Service Tiers for the latest offerings.
Q: Is there a program that I can use to try out Azure SQL DB for a large database and heavy workload like the above?
If you have an MSDN subscription, you can use the credit that comes with it. There are also other programs, such as free trials.
Q: Can DTU Calculator give a good estimate at database granularity?
Yes, it works reasonably well and provides a good estimate to start with. Remember, you can easily scale up or down in Azure SQL Database. This post by Justin Henriksen (the author of the DTU Calculator) describes what you need to capture (essentially what the estimate will be based on); pay particular attention to the Resource Utilization section of the post. Note that if you are extracting resource utilization from a multi-purpose server (read: a laptop), the DTU Calculator will take that into account in the analysis. Check out SQL Azure Performance Benchmarking by Paul Brewer, which reviews the quality of the DTU Calculator.
Q: What’s the max storage per pool in an Elastic Pool?
Currently 750 GB per pool. Please check Elastic pool service tiers and performance in eDTUs for updates.
Q: Is there a SQL Server Agent Job?
Not in Azure SQL Database – SQL Server Agent is not available there. For scheduled tasks, look at alternatives such as Azure Automation runbooks or the Elastic Database Jobs service.
Q: Does DTU represent CPU, read, write and memory?
In the Azure SQL DB Benchmark overview, DTU is described as follows:
DTUs provide a way to describe the relative capacity of a performance level based on a blended measure of CPU, memory, and read and write rates offered by each performance level. Doubling the DTU rating of a database equates to doubling the database power.
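You can watch the individual dimensions that feed into the DTU blend for your own database via the sys.dm_db_resource_stats DMV, which reports each dimension as a percentage of the limit for your performance level, one row per 15-second interval over roughly the last hour:

```sql
-- Recent resource use as a percentage of the current performance level's limits.
SELECT end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent,
       avg_memory_usage_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;
```

If any one dimension sits near 100% for sustained periods, that dimension is what your DTU ceiling feels like in practice.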
Special thanks to…
My colleagues at Microsoft, with whom I often share feedback from SQL Saturday events:
- Vladimir Ivanovic (Program Manager) – follow him on Azure blog for interesting discussion on Azure SQL Database Performance Recommendations.
- Srini Acharya (Program Manager) – interact with him at StackOverflow on Azure SQL Database topics especially Performance Tuning.
- Shantanu Khurekar (Program Manager) – follow him on Azure blog and interact with him at StackOverflow on Azure SQL Database topics.
Continuous innovation is at the core of Azure SQL Database. Some features and offerings evolve, so the above is accurate at the time of writing. Thanks to the cloud infrastructure, user feedback can be received and acted on quickly. I hope the Q & A above has been helpful. Remember, there is a Part 1 of this series with more Q & A.