Nicolas Hug
Is speed absolutely critical in these scenarios? If it is, would it make sense to let users enable that kind of parallelism themselves? `joblib` seems like the perfect tool for...
I highly recommend you consider `joblib`, @vadimkantorov. I think it does what you need (and more) with 0.01% of the effort it would take to implement a parallel...
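For reference, a minimal sketch of the kind of user-side parallelism suggested above, assuming `joblib` is installed. The `decode_and_transform` function and the file paths are hypothetical placeholders, not torchvision API:

```python
from joblib import Parallel, delayed

def decode_and_transform(path):
    # Placeholder for whatever per-sample work needs to run in parallel
    # (decoding, augmentation, feature extraction, ...).
    return len(path)

paths = [f"img_{i}.jpg" for i in range(100)]

# n_jobs=-1 uses all available cores; prefer="threads" avoids pickling
# overhead when the work releases the GIL (e.g. I/O or native code).
results = Parallel(n_jobs=-1, prefer="threads")(
    delayed(decode_and_transform)(p) for p in paths
)
```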
@oliverdain reported in https://github.com/pytorch/vision/issues/7063 that there is a similar need for KeyPoint models
Thanks for the feedback @adamjstewart. The registration functions are private right now because they weren't intended to work with external packages. What kind of workflow would you like to enable?...
I'm closing this one and opening https://github.com/pytorch/vision/pull/7990 as a replacement so we can have the pytorch bot post at the top, instead of buried in the middle of the page...
Thanks for the feature request @vahvero. I think what you're trying to do should be reasonably achievable by: - manually looping over all images in the batch and calling `draw_bounding_boxes()` (sketched below)...
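A rough sketch of that per-image loop, assuming a batch of uint8 images and one box tensor per image; the shapes, colors, and batch contents are illustrative only:

```python
import torch
from torchvision.utils import draw_bounding_boxes

# Hypothetical batch: 4 RGB uint8 images and one (N_i, 4) box tensor per image,
# with boxes in (xmin, ymin, xmax, ymax) format.
images = torch.randint(0, 256, (4, 3, 224, 224), dtype=torch.uint8)
boxes_per_image = [torch.tensor([[10.0, 10.0, 100.0, 100.0]]) for _ in range(4)]

# Manually loop over the batch, drawing boxes one image at a time.
drawn = [
    draw_bounding_boxes(img, boxes, colors="red", width=2)
    for img, boxes in zip(images, boxes_per_image)
]
drawn = torch.stack(drawn)  # back to a (B, 3, H, W) uint8 batch
```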
@kartikayk this isn't related to testing and this isn't a "quality of life" thing; it's user-facing. It should probably not be part of https://github.com/pytorch/torchtune/issues/691.
@RdoubleA how would we set an env variable on install in a persistent way? @kartikayk should this issue be re-opened? I'm not sure torchtune needs to do much more than...
I don't understand. > Using /tmp to store temporary outputs such as model checkpoints, tokenizers, logs, etc is not good practice because /tmp is often deleted and shared across users...
The premise of this issue is that torchtune is storing stuff somewhere, and that: - "somewhere" is `/tmp` - "stuff" denotes models, temporary outputs, logs, checkpoints, etc. Is this not...