If yes, then you have likely been experiencing a lot of issues with data appearing in your dashboards.
A few weeks back, our team noticed some GA4 issues with data loading into Looker Studio.
And by issues, well, we mean none of the data was loading.
Instead of seeing your pretty reports in Looker Studio, are you toggling between your dashboards and receiving error messages like these?
- The underlying data quota limit was exceeded. Please try again later.
- Failed to fetch the data from the underlying dataset.
- Quota Error: the data set has been accessed too many times.
- Exhausted concurrent request Quota, please send fewer requests concurrently.
- Sorry we have encountered an error and were unable to complete your request.
If so, then your dashboard errors are due to Looker Studio being unable to load data: your GA4 property has hit its Data API quota limit.
Learn everything you want to know about these Looker Studio and GA4 issues and what options we see as potential long-term and short-term solutions for you below.
So What Caused These Looker Studio/GA4 Errors To Happen?
We know the biggest question you probably have right now: how can you get your Looker Studio reports that use GA4 data working again?
Sadly, the answer is not as simple as “change this to that and you’ll get your reports back.”
The week of November 14 – 18, 2022, Looker Studio users noticed they were receiving error messages in their reports that use GA4 data. Specifically, users began receiving data set configuration errors indicating that Looker Studio users could not connect to their data sets.
Coupled with the upcoming transition to GA4 as the default Google Analytics platform, this has left many of you (and basically all of us in the analytics industry) looking for solutions before the hard deadline in July 2023.
Why Are These GA4 Errors Happening?
Many analysts first thought the errors were caused by Looker Studio. However, with more time to investigate, it became clear that the source is actually the GA4 API.
More specifically, the cause of these errors is that Google started enforcing their Google Analytics Data API (GA4) quota limits without prior notice. This means that the errors are not unique to Looker Studio, but are applicable to any product using the GA4 API – even Google Sheets.
While the issues apply to other tools, they are exacerbated by how Looker Studio processes requests.
Every element (that’s right – every single chart, table, and scorecard) in the report represents one separate API request. These requests fire every time the data has to be refreshed.
If you (or anyone with access to the dashboards on your team) engage in normal user behaviors such as:
- Changing report date ranges
- Toggling filters on and off
- Refreshing your report throughout the day
You are very likely to burn through your allotted API calls in short order.
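To see how quickly those calls add up, here is a back-of-envelope estimate. All of the numbers below are illustrative assumptions, not figures from Google:

```python
# Rough estimate of GA4 Data API requests generated by one Looker Studio
# report page. Every number here is an illustrative assumption.
elements_per_page = 12      # charts, tables, and scorecards on one page
refreshes_per_user = 5      # date-range changes, filter toggles, reloads
viewers_per_day = 8         # teammates opening the dashboard each day

# Each element fires its own API request on every refresh.
requests_per_day = elements_per_page * refreshes_per_user * viewers_per_day
print(requests_per_day)  # 480 requests per day from a single report page
```

Even a modest dashboard can generate hundreds of API requests a day, which is why the quota enforcement hit so hard.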
GA4 Quota Limits
For the purposes of addressing the errors you are likely facing with your reports, we will only be covering quota limits for standard GA4 properties. Google’s GA4 API quota documentation can be found here.
The quota limit can be broken down into two main categories:
- Concurrent requests
- Tokens used per time period (hour/day)
GA4 Concurrent Request Quota Errors
Errors related to concurrent requests are the most common quota errors. They are measured by the number of requests being simultaneously executed, which is limited to 10. Since every element in Looker Studio represents a separate API request, having 10 different elements on a report page can be enough to trigger a concurrent request error.
GA4 Tokens Used Per Time Period Errors
The other category of quota errors is based on the number of tokens used per hour or per day.
Every API query costs a certain number of tokens based on factors such as the complexity of the queries or the size of the source data table. The more data Google has to process in your query, the higher the expense to your API quota.
Token limits exist at both the property and project level, and each has hourly and daily caps. These limits are compounded by the fact that API requests are processed every time data has to be refreshed.
So if you have several robust Looker Studio reports being used by many users, you can hit these limits very quickly.
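For custom integrations, you can track token spend yourself and stop querying before the API cuts you off. The sketch below assumes an illustrative hourly cap and per-query costs; real costs vary with query complexity (the API can report actual consumption if you request property quota details in the response):

```python
# Sketch of a client-side token budget tracker. The hourly limit and
# per-query costs below are illustrative assumptions only.
class TokenBudget:
    def __init__(self, hourly_limit: int):
        self.hourly_limit = hourly_limit
        self.used = 0

    def can_spend(self, cost: int) -> bool:
        return self.used + cost <= self.hourly_limit

    def spend(self, cost: int) -> None:
        if not self.can_spend(cost):
            raise RuntimeError("hourly token quota would be exceeded")
        self.used += cost

budget = TokenBudget(hourly_limit=1250)  # example hourly cap
budget.spend(300)  # a complex query against a large table
budget.spend(50)   # a simple scorecard query
print(budget.can_spend(1000))  # False: only 900 tokens remain
```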
What Google Is Saying About These GA4 Errors
On November 23, 2022, Google provided an update on its Looker Studio community forum to help address issues related to the quota limits.
Their plan has two components: changing how Looker Studio retries queries to mitigate the impact of concurrent requests on the quota limits, and showing users the number of quota tokens consumed by different report components.
They have also provided a troubleshooting guide.
Short Term Solution To Your GA4 Errors: Intermediary Storage
Unfortunately, analysts are currently limited on steps they can take to eliminate errors related to quota limits as Looker Studio is currently configured. The steps Google outlined above can potentially help mitigate the impact of the query limits on your reporting, but the underlying issue – the large number of queries Looker Studio makes to the GA4 API – still remains.
If you wish to continue to use the GA4 API for your reporting, the best scalable way to circumvent the quota issue is to leverage an intermediary storage solution to limit the number of API queries. This method reduces the number of queries by essentially creating a static snapshot of the data which you will then use as the data source for your reporting.
This can be accomplished using Google Sheets or Looker Studio’s Extract Data native connector.
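The core idea can be sketched in a few lines. In this hypothetical example, `fetch_ga4_report()` stands in for a single GA4 Data API call and returns synthetic data; the point is that the report reads the saved snapshot, not the live API:

```python
import pandas as pd

# Sketch of the intermediary-storage pattern: pull the data once, save a
# static snapshot, and point the report at the snapshot instead of the
# live API. fetch_ga4_report() is a hypothetical stand-in for one GA4
# Data API call; here it just returns synthetic data.
def fetch_ga4_report() -> pd.DataFrame:
    return pd.DataFrame({
        "date": ["2022-11-14", "2022-11-15", "2022-11-16"],
        "sessions": [120, 95, 143],
    })

# One API call per refresh cycle, instead of one per report element.
snapshot = fetch_ga4_report()
snapshot.to_csv("ga4_snapshot.csv", index=False)

# The dashboard then reads the static file at zero API token cost.
cached = pd.read_csv("ga4_snapshot.csv")
print(cached["sessions"].sum())  # 358
```

Google Sheets or the Extract Data connector plays the role of the CSV file here: a static copy of the data that Looker Studio can query freely.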
This solution has two major drawbacks: date range limitations and user duplication.
Date Range Limitations
Both Google Sheets and the Extract Data connector have limits on the size of the data (Google Sheets is limited to 10 million cells, while Extract Data can hold only 100 MB of data).
While these limits may be manageable for basic reporting, they are insufficient for the robust reporting that users have come to expect from GA4.
The other, arguably more troublesome, drawback to using an intermediary storage solution is that static data sources will not dedupe users across days.
For example, if you visit a site three days in a reporting period (from the same device & browser, assuming cookies persist), you should be counted as one (1) user. But in a static data source, you will be counted as three (3) users.
The only way to solve this? Count the number of unique User IDs across those days. However, this compounds the previously mentioned data-size issues and also introduces issues with cardinality.
Because of these limitations, if you are looking to utilize an intermediary storage solution, we recommend exploring more robust third-party solutions such as Analytics Canvas.
Long Term Solution: Ditch The API And Integrate GA4 With BigQuery
This change to the GA4 API signals to a broader audience what many analysts have already realized:
Google BigQuery must be leveraged in order to get the most out of your GA4 implementation.
While the landscape of digital analytics is rapidly changing, it is clear that analysts, agencies, and businesses as a whole will need to come up with a viable solution for data storage and data analysis that isn’t dependent only on GA4.
How Can You Get Your Reports Working Again?
As documented above, you really only have two viable options for avoiding these GA4 errors in your Looker Studio reports.
- Intermediary storage
- BigQuery integration
If you already have GA4 set up – have no fear, it is still collecting data and can provide you with insights. The challenge you are facing right now is extracting the data and visualizing it in Looker Studio.
That said, if you are involved in any element of web analytics at a company that wants to make actionable, data-driven decisions using GA4 and Looker Studio – all you likely have right now are errors in your reports.
To get granular insights from your large, complex datasets, your list of options really boils down to one option: get your data into BigQuery.
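The BigQuery export writes raw event data into daily `events_YYYYMMDD` tables, where fields like `user_pseudo_id` are part of the standard GA4 export schema, so deduplicated user counts become a single SQL query rather than an API quota problem. The project and dataset names below are placeholders:

```python
# Example of the kind of query the BigQuery export enables: deduplicated
# user counts straight from the raw GA4 event data. The project and
# dataset names are placeholders; events_* and user_pseudo_id are part
# of the standard GA4 BigQuery export schema.
query = """
SELECT
  COUNT(DISTINCT user_pseudo_id) AS users
FROM `my-project.analytics_123456.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20221114' AND '20221118'
"""

# With the google-cloud-bigquery client installed, this would run as:
# from google.cloud import bigquery
# client = bigquery.Client()
# users = next(iter(client.query(query).result())).users
print("user_pseudo_id" in query)  # True
```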
While the short term solutions may work for now, it is probably best to get connected with an analytics team that knows how to handle the challenges you are currently facing.