Adrian
> I ran the whole thing now (removed stars:2), and ended up at a total of 89245 Clojure repositories.

So cool! How long did that take? Maybe that's an even...
Do you have an enterprise account? I thought the normal rate limit was around 5k requests/hr.
Does this method account for GitHub's rate limiting? If not, I'm trying to figure out how it gets the data so quickly while staying under GitHub's rate limit.
Oh, got it. I read 30 repos per second as 30 requests per second for some reason. It all makes sense now 👍
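The confusion above comes down to repos-per-second vs. requests-per-second: each API request returns a whole page of repositories, so throughput in repos is much higher than the request rate. A quick back-of-envelope sketch (assuming the 5k/hr limit mentioned above and a hypothetical `per_page=100` page size):

```python
# Why 30 repos/sec can stay under a 5000-requests/hr rate limit:
# one request returns a full page of repositories, not a single repo.
RATE_LIMIT_PER_HOUR = 5000   # authenticated request limit (from the discussion above)
REPOS_PER_REQUEST = 100      # assumed page size, e.g. per_page=100

max_repos_per_hour = RATE_LIMIT_PER_HOUR * REPOS_PER_REQUEST
max_repos_per_second = max_repos_per_hour / 3600
print(max_repos_per_second)  # ~138.9 repos/sec, comfortably above 30
```

The numbers here are illustrative, not a claim about how the actual crawler pages through results.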
This is really great stuff. I'm pretty excited about getting it integrated. This will expand the dataset quite a bit, which might require some additional changes. Just brainstorming a bit:...
> GitHub only allows for 100M per file.

I think that only applies to objects in a git repository. Currently, data dumps are only being uploaded as part of releases...
In version 1.0, support was added for marking bitfields in `easy-api` and making sure bitfields don't break anything, but accessing fields of structs that have bitfields is not yet supported...
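For readers unfamiliar with the C feature being discussed, here is a minimal sketch of a struct with bitfields. It uses Python's ctypes (not `easy-api` itself, whose bitfield API this comment says is incomplete) purely to show the packed layout that makes field access tricky:

```python
# A C-style struct with bitfields, expressed via ctypes.
# Equivalent C: struct flags { unsigned a:3; unsigned b:5; unsigned c:8; };
import ctypes

class Flags(ctypes.Structure):
    _fields_ = [
        ("a", ctypes.c_uint, 3),  # 3-bit field
        ("b", ctypes.c_uint, 5),  # 5-bit field
        ("c", ctypes.c_uint, 8),  # 8-bit field
    ]

f = Flags(a=5, b=17, c=200)
print(f.a, f.b, f.c)         # 5 17 200
print(ctypes.sizeof(Flags))  # all three fields pack into one 4-byte unsigned int
```

Because the three fields share a single storage unit, reading or writing one of them requires masking and shifting rather than a plain offset load, which is why supporting field access for bitfield structs is extra work for an FFI layer.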
I'm happy to be convinced otherwise, but here's how I was thinking about it:
- Except for `analysis.edn.gz`, all of the data files fit comfortably in memory.
- I think...
> would you be open to adding {:var-definitions {:meta [:arglists]}} to the current analysis config?

Yes!

> do you have any thoughts about addressing such requirement gaps in general?

perhaps...
I think there is a place for something like `:analysis :all`, but I'm not sure it's a good idea for dewey. I've been working on automating more of the dewey...