Hitting the rate limit skies
Heya. I just don't know where to start.
About 20-30 hours ago I realized I had 0 remaining requests to GitHub, just because of a few commits to some of my repos. Such a thing had never happened until now. I was so shocked and angry that I took a rest and slept for the last 20 hours. But before that I debugged the problem.
- I recently added a few more extensions to Chrome that wanted a personal access token. So I thought: okay, I'll disable and remove them.
Answer: That wasn't the problem. I even reviewed their code.
- Hm... it's probably semantic-release-cli. Okay, I'll go and review the code more deeply.
Answer: Hm, yeah, probably. I'll reimplement a small part so I can see.
- I just opened my editor in a fresh new folder and installed pify and travis-ci.
index.js
const pify = require('pify');
const delay = require('delay');
const Travis = require('travis-ci');

// Check whether Travis is currently syncing the user's repositories.
async function isSyncing (travis) {
  try {
    const res = await pify(travis.users.get.bind(travis))();
    return res.user.is_syncing;
  } catch (e) {}
  return false;
}

// Trigger a repository sync and wait until it finishes.
async function syncTravis (travis) {
  try {
    await pify(travis.users.sync.post.bind(travis))();
  } catch (e) {
    if (e.message !== 'Sync already in progress. Try again later.') throw e;
  }
  while (await isSyncing(travis)) {
    await delay(1000);
  }
}

async function init () {
  const travis = new Travis({ version: '2.0.0', headers: { 'User-Agent': 'foobar/1.0' } });
  await pify(travis.authenticate.bind(travis))({ github_token: 'my token' });
  await syncTravis(travis);
}

init().catch(console.error);
And what happened? I still got a very huge amount of requests. So... hm... okay, maybe it isn't exactly a semantic-release problem, but a travis-ci package problem.
But what really is the problem?
I have some repo - no matter which one, it happens with every repo - and I have two builds for two Node.js versions in the Travis yaml: node 6 and node 8. When the node 6 build ends (because that version is not supported), semantic-release makes only 2 requests, which is okay. But the interesting thing is when the node 8 build ends - it makes somewhere between 700 and 1500 requests. And how do I know this? I commit, open the https://api.github.com/rate_limit endpoint with an access_token=myToken query param, and keep refreshing while the build is doing its job. When the build ends, the remaining requests keep decreasing for a few more minutes.
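For reference, here is a minimal sketch of polling that endpoint from Node instead of refreshing it in the browser. It sends the token in an Authorization header rather than the access_token query param; the token and the polling interval are placeholders.

const https = require('https');

// Poll https://api.github.com/rate_limit and print the remaining quota.
// process.env.GITHUB_TOKEN is a placeholder for a personal access token.
function checkRateLimit (token) {
  const options = {
    hostname: 'api.github.com',
    path: '/rate_limit',
    headers: {
      'User-Agent': 'rate-limit-check',
      Authorization: `token ${token}`
    }
  };
  https.get(options, res => {
    let body = '';
    res.on('data', chunk => { body += chunk; });
    res.on('end', () => {
      const limits = JSON.parse(body);
      console.log('core remaining:', limits.resources.core.remaining);
      console.log('search remaining:', limits.resources.search.remaining);
    });
  });
}

// Re-check every 5 seconds while a build is running.
setInterval(() => checkRateLimit(process.env.GITHUB_TOKEN), 5000);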
So... at the end of the day, okay, it's probably not a Semantic Release problem, but I'm opening it here just as a start. We could probably fix this by using the API directly instead of that travis-ci package. I even tried with some plain request module like simple-get and request, but for some reason I got a "you are currently not allowed to perform this request. please contact [email protected]." response from the API. So I will probably contact them in the next days too, but that isn't related to this issue in any way.
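For the record, a rough sketch of what "using the API directly" could look like with the request module. The Travis v2 endpoints and headers below are my assumption of what the travis-ci package wraps; they are not taken from this issue.

const request = require('request');

// Rough sketch: exchange a GitHub token for a Travis access token, then
// trigger a repository sync, all with plain HTTP instead of the travis-ci
// package. Endpoints/headers are assumptions about the old Travis API v2.
const base = 'https://api.travis-ci.org';
const headers = {
  'User-Agent': 'foobar/1.0',
  Accept: 'application/vnd.travis-ci.2+json'
};

request.post({
  url: `${base}/auth/github`,
  headers,
  json: { github_token: process.env.GH_TOKEN } // placeholder token
}, (err, res, body) => {
  if (err) throw err;
  request.post({
    url: `${base}/users/sync`,
    headers: Object.assign({}, headers, { Authorization: `token ${body.access_token}` }),
    json: true
  }, (err2, res2) => {
    if (err2) throw err2;
    console.log('sync status:', res2.statusCode);
  });
});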
Anyway, it's freaking strange, because 1) I've been using semantic-release for the last few months, and 2) travis-ci hasn't had a release in the last year, so nothing has changed.
Duh... sorry, this issue was probably meant for https://github.com/semantic-release/semantic-release, not here, since it is executed on Travis.
Thanks for opening the issue with all the information. I'm sorry this all caused you so much trouble and frustration; we have all been there...
I can't tell when we will be able to look into it, but pull requests are always welcome. And the issue being here is fine, don't worry about it.
I use semantic-release extensively myself and have not run into this problem, but I hope we can find out what is causing the huge amount of requests in your case.
When the build ends, the remaining requests keep decreasing for a few more minutes.
Can you link to the build where you logged the requests? I think that might be the root of the problem: maybe https://github.com/semantic-release/travis-deploy-once keeps sending requests even though it has identified the build leader and should just stop.
Yeah. The last few builds of this one: https://travis-ci.org/tunnckoCore/rollup-config-tunnckocore/builds
And I don't think it is a travis-deploy-once problem.
I just ran https://travis-ci.org/tunnckoCore/hela-config-tunnckocore/builds/292912591 with version 8.0.0 and it is still making requests. Right now it's at 4800; it was at 4996.
4624...
4514...
4370...
It stopped making requests at 4301 :D Okay.
I don't see the request logs.
I'm talking about the rate_limit endpoint.
But anyway, the whole case is very strange, because it happens even with older versions of semantic-release, hm. I don't see how all this can be debugged...
Okay, one more thing to note. I just reinstalled the cli locally, to 3.1.0. And right after the commit, GitHub requests started to decrease, even before the build was triggered. So probably the problem really is here too.
After analyzing it, it doesn't seem there is any code in the cli or in the core that would make all those requests. I might be missing something though...
Do you still experience the issue? Did you figure out exactly where the requests come from? There seem to be different/conflicting possible sources mentioned throughout this issue.
Absolutely, still. And no, I didn't figure it out. Initially I was shocked, and while looking over the code I still couldn't see how this was possible. Then I realized it might be a travis-ci package problem.
I don't know. But I fixed everything by creating new packages for automating these things, and that works perfectly for now :) I'm talking about the new-release package and the new-release GitHub app.
@pvdlg @olstenlarck I ran into this issue just now, and have located the problem:
The problem occurs at exactly this line: https://github.com/semantic-release/github/blob/master/lib/success.js#L23
By mere chance I tried semantic-release on an old test repo of mine, which contains 200 commits and for which no tags were ever created. So, as you can imagine, the github success plugin makes ~200 requests while searching for the issues and hits the rate limit.
I added a console.log to the request library and here is the output:
[Semantic release]: Published GitHub release: https://github.com/Pavel910/webiny-lerna/releases/tag/pavel910-test%40v1.0.0
REQUESTING null://api.github.com/search/issues?q=repo%3APavel910%2Fwebiny-lerna+type%3Apr+5b1fa893430e13802ac9fb18370b6299c3f8a276+0d78ab9a3ea29dfee4c0cff3906e5b84db803179+264d8f4b5bcf3b8b2d97e377802e11a4e9deea1e+1e9448cf43a926bd82233466e69c98ed73fa3234+f12644c9c6a26f512421ea2953c0cb0cffb24148
REQUESTING null://api.github.com/search/issues?q=repo%3APavel910%2Fwebiny-lerna+type%3Apr+9c58c2cacbcb68dc39649d00efcfaef03a45b596+9a034543659a4e6ec75cc2064c05c355316b7e14+bbeafdc2b9461eb5290be8231f7c0c86e3b0287e+d2a62db7c4ebf5b360d39fe4c7810cbab584cd64+2f4d506a44dfa8261e91dcac95d60ddce37f566c
REQUESTING null://api.github.com/search/issues?q=repo%3APavel910%2Fwebiny-lerna+type%3Apr+358a67b01a4f44d78cf3c8fceb6dd757ba356eb9+5117a8c0a4d582a604f1c2c0b28edc5f6ca004b4+ad9c8b536b7b9328e894f8666db7eda6a9cbcf34+b57f456fadce0c0bef8210c98aace1c0f4ba8f26+e463a90d7ae3de71a5e3ad7a60bc5a2e7ead09d2
REQUESTING null://api.github.com/search/issues?q=repo%3APavel910%2Fwebiny-lerna+type%3Apr+8b8d765d2e969a6de4046a958baad6596fcfac67+ca05444775ed2066af10a0a1f0e9c5713d1963b4+e85266b2bbd54eb545c3090675aecd495d3a6816+813255538c5cad8b72425ae99d6d4789a2333406+b2554bca176905c3d477986c7d56063a1cf87bb2
REQUESTING null://api.github.com/search/issues?q=repo%3APavel910%2Fwebiny-lerna+type%3Apr+a7c3f4b3fb01ce5c4fd8ff17f2337fd1ae0b3cdf+740eb876f9566501fe5bb6da7ec9dae9f146b4eb+a648077a474a9425fcd109f50979160701e90707+d7ede76565944158c4d63c491435b42ff6f3f8dd+8d727bb1319ce5d77653a4ab6e697cef7e92e354
REQUESTING null://api.github.com/search/issues?q=repo%3APavel910%2Fwebiny-lerna+type%3Apr+849bc91319611c15efce74b7e887fbf94c2a15d3+4d3c0c1d66f5a1884f1b85fefb529aff9b0a57d6+7357b706471af67006434a245d91af1defb8a076+32858790ee9ef4a163afd90a70e42adafc5a3e24+333fc91f0d0109b683a225d2ffa180a67e4ab76b
REQUESTING null://api.github.com/search/issues?q=repo%3APavel910%2Fwebiny-lerna+type%3Apr+5b20bac02e953f64652b324c2c445b5b3948ada3+eaa137ebf54e12b05324af77e11f4888ebbe452f+8ee8f9bc7274b448851947c0bf9049d8eaba060b+e9d0990610a18ec6c818adc327c1e3aa3bb1a112+4eff6d57c9d9e366aeacc26f0b5dd4f9082e2a25
REQUESTING null://api.github.com/search/issues?q=repo%3APavel910%2Fwebiny-lerna+type%3Apr+862aa758d11d12dadfbd72e2a7ba342f838ab5dd+4cf0547a7f84f8eaa831aca645e1fcd612ba0fad+884de603fae56c6e05d53e9a4151f629ed8ad58b+aa2273d0e7d3cd110fef8a682befb25445dab987+8bab2276202504e83f475c33a61f50e785fb9e09
REQUESTING null://api.github.com/search/issues?q=repo%3APavel910%2Fwebiny-lerna+type%3Apr+653eabd62b8ed67befdaa0692bcd9b5c8f244ba1+f706505821cbece8d2f9ac142f60ecd937f77126+49ecb9643b49f7a4672b70b6fc0073d3e7c0178c+b17381b10f47bed16f45f4a38f3e8054b6e14fc4+796ccbccfaa6a2d3633275650c5ce8eae036ce4e
// ... and so on
until finally it throws
HttpError: {"message":"API rate limit exceeded for user ID XXXXXXXX.","documentation_url":"https://developer.github.com/v3/#rate-limiting"}
and in the output I found that my rate limit is 30 so I easily hit it:
...
'x-ratelimit-limit': '30',
'x-ratelimit-remaining': '0',
...
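For illustration, the pattern in the log above boils down to a loop like the following sketch. The chunk size of 5 and the query shape are inferred from the log output; the function name is made up and this is not the plugin's actual code.

const https = require('https');

// Split the release's commit SHAs into chunks and search for PRs/issues
// referencing them - one GitHub Search API call per chunk. With hundreds of
// unreleased commits this quickly exhausts the 30 requests/minute search limit.
function searchChunks (owner, repo, shas, token) {
  const chunkSize = 5; // inferred from the log output above
  for (let i = 0; i < shas.length; i += chunkSize) {
    const chunk = shas.slice(i, i + chunkSize);
    const q = encodeURIComponent(`repo:${owner}/${repo} type:pr ${chunk.join(' ')}`);
    https.get({
      hostname: 'api.github.com',
      path: `/search/issues?q=${q}`,
      headers: {
        'User-Agent': 'rate-limit-demo',
        Authorization: `token ${token}`
      }
    }, res => {
      console.log('search remaining:', res.headers['x-ratelimit-remaining']);
      res.resume(); // drain the response body
    });
  }
}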
People who start with semantic-release right away will probably not notice this behavior because they will never have that many unprocessed commits.
In my case I simply disabled the success plugin for the first release and everything is fine now.
@Pavel910 the code you mention was written a week ago, while this issue mentions the setup cli and was opened 4 months ago. So it's definitely not the same problem.
That said, the rate-limit issue you are bringing up is a real problem. It's going to be addressed in semantic-release/github#37 (we'll add throttling in addition to retries).
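As a rough idea of the direction, a minimal hand-rolled throttle could look like the sketch below; the actual implementation in semantic-release/github#37 may differ.

const delay = require('delay');

// Naive throttle: run the given request functions sequentially and never
// exceed `limit` calls per `interval` milliseconds (e.g. 30 per minute for
// the GitHub Search API). This is only an illustration of the idea.
async function throttled (requests, limit = 30, interval = 60 * 1000) {
  const results = [];
  for (let i = 0; i < requests.length; i++) {
    if (i > 0 && i % limit === 0) {
      await delay(interval); // wait out the rate-limit window
    }
    results.push(await requests[i]());
  }
  return results;
}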