Scott Chamberlain
> Something large to exercise lazy loading

We could think about leveraging `dplyr` to deal with large files - it also has a nice summary print method for a dataset
See if this is gone in the new database version
from ropensci/rfishbase#90
cool, will make sure it won't break anything first
yeah, seems best to secure these services
Hmm. We could index the whole thing in Elasticsearch and search that way. May not be appropriate, just off the top of my head
Something like this, playing with my installation locally:

```sql
SELECT TABLE_NAME, COLUMN_NAME
FROM information_schema.`columns`
LIMIT 300, 5;
```

```
+-------------+--------------------+
| TABLE_NAME  | COLUMN_NAME        |
+-------------+--------------------+
| TABLESPACES | TABLESPACE_TYPE    |...
```
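The same table/column listing idea can be sketched in Python against an in-memory SQLite database (SQLite has no `information_schema`, so `sqlite_master` plus `PRAGMA table_info` stand in for it; the table names here are made up for illustration, not the actual FishBase schema):

```python
import sqlite3

# Throwaway in-memory database with a couple of example tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE species (SpecCode INTEGER, Genus TEXT, Species TEXT)")
conn.execute("CREATE TABLE ecology (SpecCode INTEGER, Herbivory TEXT)")

# For each table, list its columns -- the SQLite analogue of
# SELECT TABLE_NAME, COLUMN_NAME FROM information_schema.columns.
pairs = []
for (table,) in conn.execute("SELECT name FROM sqlite_master WHERE type = 'table'"):
    for row in conn.execute(f"PRAGMA table_info({table})"):
        pairs.append((table, row[1]))  # row[1] is the column name

print(pairs)
```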
@cboettig working on this, not quite sure what the use case is right now. Are users going to query for fields X, Y, and Z, and return those in any...
@cboettig that makes sense, I think. I already have a working route for getting all tables and fields. Will try to add support for a `fields` parameter passed in to filter...
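A minimal sketch of what that `fields` filtering could look like, assuming JSON-style records and a comma-separated `fields` query string (the helper name and record shape are invented for illustration, not the actual route):

```python
def filter_fields(records, fields=None):
    """Keep only the requested fields of each record.

    `fields` is a comma-separated string such as "SpecCode,Genus";
    when it is None or empty, records pass through unchanged.
    """
    if not fields:
        return records
    wanted = [f.strip() for f in fields.split(",")]
    return [{k: r[k] for k in wanted if k in r} for r in records]

# Example records, invented for illustration.
records = [
    {"SpecCode": 2, "Genus": "Oreochromis", "Species": "niloticus"},
    {"SpecCode": 3, "Genus": "Salmo", "Species": "trutta"},
]
print(filter_fields(records, "SpecCode,Genus"))
```

Unknown field names are simply dropped here; a real route might instead return an error listing the valid fields.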