GoogleAdsService.Search supports paging in fixed page sizes of 10,000 rows. In Google Ads API v19 and later, the page_size field is removed and cannot be set. The result set of a query is split into multiple responses, each containing up to 10,000 objects.
As an example, consider the following query:
SELECT
ad_group.id,
ad_group_criterion.type,
ad_group_criterion.criterion_id,
ad_group_criterion.keyword.text,
ad_group_criterion.keyword.match_type
FROM ad_group_criterion
WHERE ad_group_criterion.type = KEYWORD
If your account contains 50,000 keywords, the result set will contain 10,000 GoogleAdsRow objects in the first response, along with a next_page_token.

To retrieve the next 10,000 rows, send the request again, but update the request's page_token to the response's next_page_token.

Note that next_page_token is not populated in the response that contains the last batch of rows.
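The loop below is a minimal sketch of this token flow for callers who build their own pagination. It assumes the v19 REST endpoint customers/{customer_id}/googleAds:search; CUSTOMER_ID, ACCESS_TOKEN, and DEVELOPER_TOKEN are hypothetical placeholders you would replace with your own values.

import requests

# Hypothetical placeholders; substitute your own values.
CUSTOMER_ID = "1234567890"
ACCESS_TOKEN = "INSERT_OAUTH2_ACCESS_TOKEN"
DEVELOPER_TOKEN = "INSERT_DEVELOPER_TOKEN"

URL = f"https://googleads.googleapis.com/v19/customers/{CUSTOMER_ID}/googleAds:search"

QUERY = """
    SELECT
      ad_group.id,
      ad_group_criterion.type,
      ad_group_criterion.criterion_id,
      ad_group_criterion.keyword.text,
      ad_group_criterion.keyword.match_type
    FROM ad_group_criterion
    WHERE ad_group_criterion.type = KEYWORD
"""

headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "developer-token": DEVELOPER_TOKEN,
    # A login-customer-id header may also be required when
    # authenticating through a manager account.
}

rows = []
page_token = None
while True:
    # The query string must stay exactly the same on every request;
    # only pageToken changes between pages.
    body = {"query": QUERY}
    if page_token:
        body["pageToken"] = page_token

    response = requests.post(URL, headers=headers, json=body)
    response.raise_for_status()
    payload = response.json()

    rows.extend(payload.get("results", []))

    # nextPageToken is absent in the response holding the last batch of rows.
    page_token = payload.get("nextPageToken")
    if not page_token:
        break

print(f"Retrieved {len(rows)} rows in total.")

Because the loop reuses the exact same query string on every request, follow-up pages can be served from the cached result set described below.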
Our client libraries handle paging automatically. You only have to iterate through the rows of the response. When all rows in the current page have been returned, the client library fetches a new page of rows automatically on your behalf until the entire dataset is retrieved. If using REST instead of gRPC, you must explicitly make a request for each new page.
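As a sketch of the client-library flow with the Python client library, iterating the pager returned by GoogleAdsService.Search is enough; the library requests each subsequent page for you. The google-ads.yaml path and the customer ID 1234567890 below are placeholders, not values from this guide.

from google.ads.googleads.client import GoogleAdsClient

# Loads credentials from a google-ads.yaml configuration file
# (path and customer ID are placeholders).
client = GoogleAdsClient.load_from_storage("google-ads.yaml")
ga_service = client.get_service("GoogleAdsService")

query = """
    SELECT
      ad_group.id,
      ad_group_criterion.type,
      ad_group_criterion.criterion_id,
      ad_group_criterion.keyword.text,
      ad_group_criterion.keyword.match_type
    FROM ad_group_criterion
    WHERE ad_group_criterion.type = KEYWORD
"""

# search() returns a pager; iterating it fetches new 10,000-row pages
# behind the scenes until the entire result set has been read.
results = ga_service.search(customer_id="1234567890", query=query)

for row in results:
    print(
        f"Ad group {row.ad_group.id}: "
        f'keyword "{row.ad_group_criterion.keyword.text}" '
        f"({row.ad_group_criterion.keyword.match_type.name})"
    )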
The Google Ads API internally caches the entire dataset, so subsequent requests are faster than the initial one.
Your query must remain exactly the same in subsequent requests to take advantage of the cached data. These requests don't count towards your quota, which is particularly relevant for the Basic access level. If you change the query and send it with the same page token, an error is returned.