
429 error

Open davidpan opened this issue 1 year ago • 12 comments

My site has about 6 million web pages, and a 429 error occurs during batch processing.

davidpan avatar Jan 26 '24 00:01 davidpan

Same problem for me; my site has reached around 50 million pages as well, all in the sitemap. Maybe we can implement a rate limit? Say, allow this script to send only 200 requests per minute, since that's the maximum allowed request speed. I will run this script indefinitely, so I'd appreciate it if we could limit the requests.

AlizerUncaged avatar Jan 26 '24 05:01 AlizerUncaged

The API is limited to 200 requests per day (not per minute) as per Google Documentation.

Since this quota is at the service account level, an option would be for the script to support multiple service account credentials.
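The multi-account idea could be sketched like this. Note this is a hypothetical rotator, not part of the script today; the key-file names and the class API are illustrative, and the actual publish call is left out:

```typescript
// Sketch: rotate across several service-account key files so each run can
// spend up to 200 publish requests per day per account (the quota is per
// service account / project, per the thread above).
type KeyFile = string;

class CredentialRotator {
  private used = new Map<KeyFile, number>();

  constructor(keys: KeyFile[], private dailyLimit = 200) {
    keys.forEach((k) => this.used.set(k, 0));
  }

  // Returns the next key file with remaining quota, or null when all
  // accounts are exhausted for the day.
  next(): KeyFile | null {
    for (const [key, count] of this.used) {
      if (count < this.dailyLimit) {
        this.used.set(key, count + 1);
        return key;
      }
    }
    return null;
  }
}
```

A caller would then authenticate with whichever key `next()` returns and halt (or sleep until the next day) when it returns `null`.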

tguillemaud avatar Jan 26 '24 06:01 tguillemaud

> Same problem for me; my site has reached around 50 million pages as well, all in the sitemap. Maybe we can implement a rate limit? Say, allow this script to send only 200 requests per minute, since that's the maximum allowed request speed. I will run this script indefinitely, so I'd appreciate it if we could limit the requests.

50 million ??? how?

Pab450 avatar Jan 26 '24 13:01 Pab450

> My site has about 6 million web pages, and a 429 error occurs during batch processing.

6 million? That's the number of articles in English on Wikipedia 😆

> Same problem for me; my site has reached around 50 million pages as well, all in the sitemap. Maybe we can implement a rate limit? Say, allow this script to send only 200 requests per minute, since that's the maximum allowed request speed. I will run this script indefinitely, so I'd appreciate it if we could limit the requests.

50 million??? That's the number of all articles in all languages on Wikipedia lol

edoardolunardi avatar Jan 26 '24 17:01 edoardolunardi

This was also a problem for a site with 15k pages. It couldn't get through all the batches. The script needs to be more robust, with long-term local caching of URLs and a cap of 200 requests per day to the Indexing API.
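The local-caching part could be sketched as follows. This is an assumption about how persistence might work, not existing behavior; the cache file name is made up:

```typescript
// Sketch: remember which URLs were already submitted so a rerun (e.g. after
// a 429) only sends the remaining ones instead of starting over.
import * as fs from "fs";

const CACHE_FILE = ".indexed-urls.json"; // hypothetical cache location

// Load the set of previously submitted URLs; an absent or corrupt cache
// simply yields an empty set.
function loadCache(file: string = CACHE_FILE): Set<string> {
  try {
    return new Set(JSON.parse(fs.readFileSync(file, "utf8")));
  } catch {
    return new Set();
  }
}

// Persist the submitted-URL set back to disk.
function saveCache(cache: Set<string>, file: string = CACHE_FILE): void {
  fs.writeFileSync(file, JSON.stringify([...cache]));
}

// Filter the sitemap URLs down to those not yet submitted.
function pending(all: string[], cache: Set<string>): string[] {
  return all.filter((u) => !cache.has(u));
}
```

Each successful publish would add the URL to the set and call `saveCache`, so progress survives a quota-exhausted abort.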

rmens avatar Jan 26 '24 23:01 rmens

@Pab450 @edoardolunardi

I own a math calculator website with solutions to almost every possible textbook equation. To address the doubt: I generate around 1 million pages a month and need an automated indexer. I also own several websites with 10M+ pages.


So a good client-side rate limiter would be a huge help for sites like mine.
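A minimal client-side limiter could look like the sketch below. One assumption worth flagging: per the correction earlier in the thread, the quota is 200 publish requests per day, not per minute, so this counts against a UTC-day window. The class name and API are illustrative:

```typescript
// Sketch: a daily request limiter. tryAcquire() returns false once the
// per-day budget is spent, and the counter resets when the UTC date rolls over.
class DailyLimiter {
  private count = 0;
  private day = new Date().toISOString().slice(0, 10); // "YYYY-MM-DD" (UTC)

  constructor(private limit = 200) {}

  tryAcquire(now: Date = new Date()): boolean {
    const today = now.toISOString().slice(0, 10);
    if (today !== this.day) {
      // New UTC day: the quota window has reset.
      this.day = today;
      this.count = 0;
    }
    if (this.count >= this.limit) return false;
    this.count++;
    return true;
  }
}
```

The batch loop would call `tryAcquire()` before each publish request and either sleep until the next day or exit cleanly when it returns `false`.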

AlizerUncaged avatar Jan 30 '24 03:01 AlizerUncaged

Yes, we have the same problem with 25k sites: the "429 error".

Is there a way to pay to increase the quota, from the maximum of 200 to something higher?

ingeniumdesign avatar Feb 07 '24 14:02 ingeniumdesign

Is it normal that the Web Search Indexing API shows no "Traffic", only the Google Search Console API? Thanks.

[screenshot: direct-indexing-01]

ingeniumdesign avatar Feb 11 '24 16:02 ingeniumdesign

I noticed that too on my project. I guess it's either normal or a bug on Google's side.

goenning avatar Feb 11 '24 17:02 goenning

You can ask Google for more quota, but I don't think 50M will be accepted lol

Noext avatar Feb 16 '24 09:02 Noext

❌ Failed to request indexing.
Response was: 429
{
  "error": {
    "code": 429,
    "message": "Quota exceeded for quota metric 'Publish requests' and limit 'Publish requests per day' of service 'indexing.googleapis.com' for consumer 'project_number:xxx'.",
    "status": "RESOURCE_EXHAUSTED",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.ErrorInfo",
        "reason": "RATE_LIMIT_EXCEEDED",
        "domain": "googleapis.com",
        "metadata": {
          "quota_location": "global",
          "consumer": "projects/xxx",
          "quota_limit_value": "200",
          "service": "indexing.googleapis.com",
          "quota_limit": "DefaultPublishRequestsPerDayPerProject",
          "quota_metric": "indexing.googleapis.com/v3_publish_requests"
        }
      },
      {
        "@type": "type.googleapis.com/google.rpc.Help",
        "links": [
          {
            "description": "Request a higher quota limit.",
            "url": "https://cloud.google.com/docs/quota#requesting_higher_quota"
          }
        ]
      }
    ]
  }
}

A great feature to implement would be to:

  • [ ] 1. Stop the process when this error occurs (#29)
  • [ ] 2. Ask the user whether to continue or stop the process here
  • [ ] 3. Tell the user they can request a higher quota limit or retry in xx hours
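Step 1 amounts to recognizing the exact error body pasted above. A sketch, with the type names being illustrative rather than anything the script defines:

```typescript
// Sketch: detect the daily-quota 429 from the Indexing API so the batch
// loop can halt instead of burning through retries. Field names mirror the
// error response shown above.
interface ApiError {
  error: {
    code: number;
    status: string;
    details?: { reason?: string }[];
  };
}

function isDailyQuotaExhausted(body: ApiError): boolean {
  return (
    body.error.code === 429 &&
    body.error.status === "RESOURCE_EXHAUSTED" &&
    (body.error.details ?? []).some((d) => d.reason === "RATE_LIMIT_EXCEEDED")
  );
}
```

When this returns `true`, the script could print the quota-increase link from the error's `Help` details and exit, satisfying steps 1 and 3.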

antoinekm avatar Feb 23 '24 18:02 antoinekm

Agreed @AntoineKM, I think option 1 is the best one, but also save the progress so the next run doesn't have to go through the same URLs.

Would you like to send a PR? I don't have such large sites to test it :)

goenning avatar Feb 23 '24 18:02 goenning

I also agree with @AntoineKM: the first option is the best. I also made the script run on a schedule. It would be nice to run it and forget about it.

Taimerlan avatar Feb 29 '24 12:02 Taimerlan