What level of concurrency performance can I expect using Presto as part of the AWS Athena service?

I’m getting a lot of my workloads queued up when I use AWS Athena. What are my other options?

Background:

Concurrency is an important factor when you need to submit many query workloads. Many data platform teams depend on being able to handle requests from multiple users at the same time. In AWS Athena you can submit queries, but submitted queries are not always running; the number of queries that actually run concurrently is what determines your overall throughput. As you may know, AWS Athena is serverless, so you don't see much behind the curtains. Depending on how much load other customers are placing on the shared service, you'll see different operating characteristics, leading to non-deterministic performance.

When your number of running queries hits a limit, any additional queries are placed in a queue to wait for running queries to finish. We've heard of queuing occurring with as few as 3 to 5 concurrently running queries. This can make it difficult to provide consistent levels of performance to your users.
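You can observe this behavior yourself. Here is a minimal sketch using boto3 (the database, table, and S3 output location below are placeholders) that submits a batch of queries and then counts how many Athena reports as RUNNING versus QUEUED:

```python
# Sketch: submit a batch of Athena queries and count their states.
# Replace my_db.my_table and the S3 output path with your own values.
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Submit a batch of identical queries.
query_ids = []
for _ in range(20):
    resp = athena.start_query_execution(
        QueryString="SELECT count(*) FROM my_db.my_table",
        ResultConfiguration={"OutputLocation": "s3://my-bucket/athena-results/"},
    )
    query_ids.append(resp["QueryExecutionId"])

# Poll the states: Athena reports QUEUED, RUNNING, SUCCEEDED, FAILED, or CANCELLED.
states = {}
for qid in query_ids:
    status = athena.get_query_execution(QueryExecutionId=qid)
    state = status["QueryExecution"]["Status"]["State"]
    states[state] = states.get(state, 0) + 1

print(states)  # e.g. {'RUNNING': 5, 'QUEUED': 15} once the concurrency limit is hit
```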


But I thought you could increase the service limits to address the queuing behavior?

Yes and no. AWS Athena has soft limits on the number of active queries. But it turns out that raising those limits doesn't change the number of concurrently running queries; instead, it appears to let you submit more requests without hitting the "TooManyRequestsException" error, and you may simply see more queries sitting in a RUNNING/0KB state, meaning they're effectively queued.
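If you do hit that throttling error, the usual workaround is to retry submissions with backoff. A hedged sketch, again using boto3 with a placeholder query and output location:

```python
# Sketch: retry StartQueryExecution with exponential backoff when Athena
# throttles submissions with TooManyRequestsException.
import time
import boto3
from botocore.exceptions import ClientError

athena = boto3.client("athena", region_name="us-east-1")

def submit_with_backoff(sql, output_location, max_retries=5):
    delay = 1.0
    for _ in range(max_retries):
        try:
            resp = athena.start_query_execution(
                QueryString=sql,
                ResultConfiguration={"OutputLocation": output_location},
            )
            return resp["QueryExecutionId"]
        except ClientError as err:
            if err.response["Error"]["Code"] != "TooManyRequestsException":
                raise
            time.sleep(delay)  # back off and retry the submission
            delay *= 2
    raise RuntimeError("Query submission kept getting throttled")

qid = submit_with_backoff("SELECT 1", "s3://my-bucket/athena-results/")
```

Note that this only helps you get queries accepted; it does nothing to raise the number of queries actually running at once.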

Other alternatives

If you need consistent performance across a large number of queries, another approach is to run your own Presto service. AWS Athena is a serverless version of Presto but, as mentioned, has limits on concurrency. In addition, Athena's pay-per-TB-scanned pricing model can be cost prohibitive. There are two options for running your own query service:

  1. Operating your own Presto clusters with AWS EMR or by yourself using AMIs
  2. Using a managed service like Ahana Cloud for Presto

On #2, Ahana Cloud for Presto gives you a scalable amount of concurrent query performance without the operational burden of managing your own clusters. Because Presto is a distributed SQL query engine, adding more Presto instances increases the number of queries that can run at the same time.
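As a minimal sketch, assuming the presto-python-client package and a self-managed cluster (the coordinator host, catalog, schema, and SQL below are placeholders), you can fire many queries at your own Presto coordinator; sustained concurrency is then bounded by the workers you provision rather than by a shared-service queue:

```python
# Sketch: run many queries in parallel against your own Presto cluster.
# Host, user, catalog, schema, and table names are placeholders.
from concurrent.futures import ThreadPoolExecutor
import prestodb

def run_query(sql):
    conn = prestodb.dbapi.connect(
        host="presto-coordinator.internal",  # your coordinator endpoint
        port=8080,
        user="analyst",
        catalog="hive",
        schema="default",
    )
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchall()

# Fire off 20 queries at once; add Presto workers to raise sustained concurrency.
queries = ["SELECT count(*) FROM my_table"] * 20
with ThreadPoolExecutor(max_workers=20) as pool:
    results = list(pool.map(run_query, queries))
```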


To compare the performance of a set of query workloads, it's best to look at overall throughput, which means including the wait time that queued workloads spend in Athena. It is for this reason that running your own Presto service gives you higher performance and more flexibility than AWS Athena.
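As a toy illustration (the numbers below are made up, not benchmarks), end-to-end throughput is driven by how many queries can run at once, because everything beyond that limit simply waits:

```python
# Toy calculation: 100 queries, each taking 30 seconds of actual execution time.
total_queries = 100
exec_seconds = 30

def throughput(concurrency):
    # Queries beyond the concurrency limit sit in the queue, so wall-clock time
    # for the whole batch is driven by how many can run at once.
    wall_clock = (total_queries / concurrency) * exec_seconds
    return total_queries / wall_clock  # queries per second, end to end

print(throughput(5))   # ~0.17 q/s when only 5 queries run concurrently
print(throughput(50))  # ~1.7 q/s with a cluster sized for 50 concurrent queries
```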

There are price-performance considerations as well: once you account for the wait time of queued workloads, your own Presto service delivers a much better price-performance ratio than AWS Athena.

Check out the case study from ad tech company Carbon on why they moved from AWS Athena to Ahana Cloud for better query performance and more control over their deployment.