google-indexing-script
429 error
My site has about 6 million web pages, and a 429 error occurs during batch processing.
Same problem for me. My site has reached around 50 million pages as well, all in the sitemap. Maybe we can implement a rate limit? Say, allow this script to send only 200 requests per minute, since that's the maximum allowed request rate. I will run this script indefinitely, so I'd appreciate it if we could limit the requests.
The API is limited to 200 requests per day (not per minute), per the Google documentation.
Since this quota is at the service account level, one option would be for the script to support multiple service account credentials.
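Rotating across several credentials could look roughly like this. This is a sketch only: `CredentialPool`, `DAILY_QUOTA`, and the key-file handling are illustrative names, not the script's actual API.

```typescript
// Sketch: rotate across several service-account key files so each
// credential's 200-requests/day quota is consumed in turn.
type Credential = { keyFile: string; used: number };

const DAILY_QUOTA = 200; // documented Indexing API default per day

class CredentialPool {
  private pool: Credential[];

  constructor(keyFiles: string[]) {
    this.pool = keyFiles.map((keyFile) => ({ keyFile, used: 0 }));
  }

  // Return the next credential with quota left, or null if all are exhausted.
  next(): Credential | null {
    const c = this.pool.find((c) => c.used < DAILY_QUOTA);
    if (c) c.used++;
    return c ?? null;
  }

  // Total publish requests still available across all credentials today.
  remaining(): number {
    return this.pool.reduce((sum, c) => sum + (DAILY_QUOTA - c.used), 0);
  }
}
```

With N key files this would stretch the effective daily budget to N × 200, at the cost of managing N Cloud projects.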
50 million??? How?
6 million? That's the number of articles in English on Wikipedia 😆
50 million??? That's the number of all articles in all languages on Wikipedia lol
This was also a problem for a site with 15k pages; it couldn't get through all the batches. The script needs to be more robust, with long-term local caching of URLs and the 200-requests-per-day limit of the Indexing API respected.
@Pab450 @edoardolunardi
I own a math calculator website with solutions to almost all possible textbook equations. To address the skepticism above: I generate around 1 million pages a month and need an automated indexer, and I own several other websites with 10M+ pages as well.
So a good client-side rate limiter would be a huge help for sites like mine.
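A minimal client-side limiter along the requested lines might look like this. It is a sketch: `submit` is a hypothetical stand-in for the real publish call, and the cap reflects the documented 200 requests per day rather than per minute.

```typescript
// Sketch: pace publish calls and stop once the daily quota is spent,
// rather than hammering the API into a 429.
async function runLimited(
  urls: string[],
  submit: (url: string) => Promise<void>,
  perDay = 200,
  delayMs = 1000,
): Promise<number> {
  let sent = 0;
  for (const url of urls) {
    if (sent >= perDay) break; // daily quota reached: stop, don't retry
    await submit(url);
    sent++;
    // Gentle pacing between calls to avoid burst-rate errors.
    await new Promise<void>((resolve) => setTimeout(resolve, delayMs));
  }
  return sent; // caller can persist how far we got
}
```

Returning the count lets the caller record progress, so an indefinite schedule picks up where the previous day's quota ran out.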
Yes, we have the same problem with a 25k site: "429 error".
Is there a way to pay to increase the quota, from the maximum of 200 to more?
Is it normal that the Web Search Indexing API shows no "Traffic", only the Google Search Console API? Thanks
I noticed that too on my project; I guess it's normal, or a bug on Google's side.
You can ask Google for more quota, but I don't think 50M will be accepted lol
❌ Failed to request indexing.
Response was: 429
{
  "error": {
    "code": 429,
    "message": "Quota exceeded for quota metric 'Publish requests' and limit 'Publish requests per day' of service 'indexing.googleapis.com' for consumer 'project_number:xxx'.",
    "status": "RESOURCE_EXHAUSTED",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.ErrorInfo",
        "reason": "RATE_LIMIT_EXCEEDED",
        "domain": "googleapis.com",
        "metadata": {
          "quota_location": "global",
          "consumer": "projects/xxx",
          "quota_limit_value": "200",
          "service": "indexing.googleapis.com",
          "quota_limit": "DefaultPublishRequestsPerDayPerProject",
          "quota_metric": "indexing.googleapis.com/v3_publish_requests"
        }
      },
      {
        "@type": "type.googleapis.com/google.rpc.Help",
        "links": [
          {
            "description": "Request a higher quota limit.",
            "url": "https://cloud.google.com/docs/quota#requesting_higher_quota"
          }
        ]
      }
    ]
  }
}
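The script could recognize this specific failure by inspecting the error payload. A sketch, with the shape taken from the field names in the response above:

```typescript
// Sketch: detect a daily-quota 429 from the Indexing API error body.
// The interface mirrors the JSON shown above; it is not the script's code.
interface ApiError {
  error?: {
    code?: number;
    details?: Array<{ "@type"?: string; reason?: string }>;
  };
}

function isDailyQuotaExceeded(body: ApiError): boolean {
  if (body.error?.code !== 429) return false;
  return (body.error?.details ?? []).some(
    (d) =>
      d["@type"] === "type.googleapis.com/google.rpc.ErrorInfo" &&
      d.reason === "RATE_LIMIT_EXCEEDED",
  );
}
```

Matching on `reason` rather than just the HTTP status distinguishes the daily quota from other 429 causes, so the script can stop cleanly instead of retrying.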
A great feature to implement would be to:
- [ ] 1. Stop the process when this error occurs (#29)
- [ ] 2. Ask the user whether they want to continue or stop the process there
- [ ] 3. Tell the user they can request a higher quota limit or retry in xx hours
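The three steps above could be sketched roughly as follows. The `ask` callback is a hypothetical abstraction over a readline prompt, used so the behavior stays testable.

```typescript
// Sketch of the checklist: stop when the 429 occurs, tell the user their
// options, and ask whether to continue. Messages are illustrative.
async function onQuotaExceeded(
  ask: (question: string) => Promise<string>,
): Promise<boolean> {
  console.log("Daily quota exceeded (429). Stopping the batch.");
  console.log(
    "You can request a higher limit: https://cloud.google.com/docs/quota#requesting_higher_quota",
  );
  console.log("Or retry after the quota resets (roughly 24 hours).");
  const answer = await ask("Continue anyway? (y/N) ");
  return answer.trim().toLowerCase() === "y";
}
```

In the real CLI, `ask` could be wired to Node's `readline.question`; combined with saved progress, a "no" answer would leave the remaining URLs for the next run.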
Agreed @AntoineKM, I think option 1 is the best one, but also save the progress so the next run doesn't have to go through the same URLs.
Would you like to send a PR? I don't have such large sites to test it :)
I also agree with @AntoineKM that the first option is the best. I also made the script run on a schedule; it would be nice to run it and forget it.