Support for local bulk file access
Nice library!
I’ve been playing around with the Scryfall API lately and noticed their mention of bulk JSON files being regenerated every 12 hours. The updates mostly contain price changes, apparently, since they claim card information hardly ever changes. If you only care about card information, you could likely get away with downloading one of these files once per release.
That raises the question of whether there should be a client option for loading this file once upfront (there are many ways to go about this) and then regularly accessing it as an in-memory database. There are of course more complex options too, such as a local SQLite file.
To start, I think it would be awesome to come up with some sort of in-memory client backed by a local JSON file. I have a hobby use case involving high-throughput Scryfall calls, and this seems like the way to go.
Have you put any thought toward something like this? Thoughts on expanding this to a Client interface and providing a secondary implementation?
thanks!
At the moment this library only provides the ability to get the URLs for bulk data but does not help you download them. We could provide an additional utility that helps download and cache them. Nothing is stopping people from doing that at a higher level, though.
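For context, Scryfall's documented `GET /bulk-data` endpoint returns a list object whose `data` array describes each bulk file, including its `download_uri`. A minimal sketch of picking out the URI for one kind, with the response shape taken from the API docs and the helper name being hypothetical:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// bulkList mirrors the relevant slice of Scryfall's GET /bulk-data
// response: a "data" array describing each available bulk file.
type bulkList struct {
	Data []struct {
		Type        string `json:"type"`
		DownloadURI string `json:"download_uri"`
	} `json:"data"`
}

// downloadURIFor returns the download URI for one bulk-data kind
// (e.g. "default_cards", "oracle_cards") from a decoded listing.
func downloadURIFor(raw []byte, kind string) (string, error) {
	var list bulkList
	if err := json.Unmarshal(raw, &list); err != nil {
		return "", err
	}
	for _, d := range list.Data {
		if d.Type == kind {
			return d.DownloadURI, nil
		}
	}
	return "", fmt.Errorf("bulk data kind %q not found", kind)
}

func main() {
	// In practice raw would come from fetching
	// https://api.scryfall.com/bulk-data over HTTP.
	raw := []byte(`{"data":[{"type":"default_cards","download_uri":"https://example.invalid/cards.json"}]}`)
	uri, err := downloadURIFor(raw, "default_cards")
	if err != nil {
		panic(err)
	}
	fmt.Println(uri)
}
```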
I think the download of the data is not really an issue; anyone could do this themselves. The more interesting question is how to interact fluently with the bulk data, including ideas such as:
- ClientOption to download and use bulk data as a cache, for each of the kinds of bulk data Scryfall provides
- ClientOption to use user-provided bulk data JSON files
- read-through cache (bulk data is stale for up to 12 hours):
  - for direct lookups only; indexing/implementing search API queries seems more complicated
There is currently no Go library that lets you interact with Scryfall bulk data in any way. Would you accept a pull request implementing the above if I were to provide it? For reference, here are all Go Scryfall repos with more than zero stars: