Thomas Sibley
A couple of thoughts, re-reading through this now. I generally concur with most things above. +1 for supporting remote URLs (`s3://`, `https://`, etc.), but while the interfaces overlap, I see _implementing_...
> My thinking was that the sequences and metadata and their downstream processed files in `tests/functional/filter/` should be logically consistent such that you could always regenerate the downstream files from...
> I am confused as to how [we do this correctly for TB](https://nextstrain.org/tb/global?gmax=1693074&gmin=1665254) 👇

We did, apparently, but don't any more. That TB build hasn't been updated since 2018, and...
Looks like 74125f5cf9fdc1b2b9e20687be83c5d11ec3e580 is what broke that, first released in Augur 6.0.0, when [we moved from `+1`/`-1` in v1 JSONs to `+`/`-` in v2](https://docs.nextstrain.org/projects/augur/en/stable/releases/v6.html#move-to-gff-style-annotations).
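For anyone still carrying v1-era annotations, the change is mechanical: v1 JSONs used numeric `+1`/`-1` strands, v2 uses GFF-style `+`/`-`. A minimal hypothetical shim might look like this (the `strand_to_gff` name is mine, not Augur's):

```python
def strand_to_gff(strand):
    """Map a v1-style numeric strand (+1/-1) to a GFF-style symbol (+/-).

    Hypothetical helper for illustration; Augur's actual conversion
    happened internally as part of the 6.0.0 annotation change.
    """
    if strand in (1, "+1", "+"):
        return "+"
    if strand in (-1, "-1", "-"):
        return "-"
    raise ValueError(f"unrecognized strand: {strand!r}")

# v1 annotation fragment -> v2-style strand symbols
v1_genes = {"gene1": 1, "gene2": -1}
v2_genes = {name: strand_to_gff(s) for name, s in v1_genes.items()}
print(v2_genes)  # {'gene1': '+', 'gene2': '-'}
```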
@ammaraziz Could you share an example input tree, i.e. what you're passing for the `--tree` option to `augur refine` or `augur ancestral`?
Broadly agree with @emmahodcroft here. `record` is also about as generic as it gets, second only to `data`, so I'm reluctant to prefer that. If we do choose to further converge...
Upstream support in `xopen` (option 3) seems like the best option here, assuming a patch is accepted. It seems very much worth submitting a patch upstream sooner rather than later and...
A related issue which might push us away from the `lzma` stdlib for .xz files is supporting multi-threaded compression. But again, this could also fit into a patch for...
PR with a small patch for xopen to support `compresslevel`: https://github.com/pycompression/xopen/pull/102. Multi-threaded xz compression in xopen is already being worked on in https://github.com/pycompression/xopen/pull/101.
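For context on what `compresslevel` maps to underneath: the stdlib `lzma` module already exposes a compression preset (0 to 9), but offers no multi-threading, which is part of the motivation for routing through xopen instead. A quick stdlib-only sketch:

```python
import lzma

# The stdlib lzma module accepts a preset (0-9) controlling the
# compression level, analogous to a compresslevel parameter, but it
# has no threads option for multi-threaded .xz compression.
data = b"example payload " * 1000

fast = lzma.compress(data, preset=1)   # lower effort
small = lzma.compress(data, preset=9)  # higher effort

# Either preset round-trips losslessly.
assert lzma.decompress(fast) == data
assert lzma.decompress(small) == data
```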
@mvolz Thanks for your question. I don't believe such a command exists in Augur. All the information you need to produce a Newick tree is there though, and you could...
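As a starting point, here's a rough sketch of that conversion, assuming an Auspice v2-style tree JSON where each node carries a `name`, optional `children`, and a cumulative divergence under `node_attrs["div"]` (branch lengths are then the difference from the parent's divergence). This is my own illustrative code, not an Augur command:

```python
def to_newick(node, parent_div=0.0):
    """Recursively render an Auspice-v2-style tree node as a Newick string.

    Assumes cumulative divergence in node_attrs["div"]; each branch
    length is this node's divergence minus its parent's.
    """
    div = node.get("node_attrs", {}).get("div", 0.0)
    length = div - parent_div
    children = node.get("children")
    if children:
        inner = ",".join(to_newick(c, div) for c in children)
        return f"({inner}){node['name']}:{length:g}"
    return f"{node['name']}:{length:g}"

# Minimal example tree in the assumed shape.
tree = {
    "name": "root",
    "node_attrs": {"div": 0.0},
    "children": [
        {"name": "A", "node_attrs": {"div": 0.5}},
        {"name": "B", "node_attrs": {"div": 1.25}},
    ],
}
print(to_newick(tree) + ";")  # (A:0.5,B:1.25)root:0;
```

In practice you'd load the dataset JSON with `json.load` and pass its `tree` entry to a function like this.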