Geoffrey Angus
Adding a more permanent solution in this PR: #2328
Hi @fire! I worked on some of the torchscript stuff so can help out here a bit. Next step is probably checking that the exported model works as expected. You...
Hi @fire, I appreciate the detailed report – it's super helpful. Documentation for TorchScript is currently in progress. That said, I'm taking a look at what you have right now and...
Hi @fire, Thanks for your patience – I've put together a [sample notebook](https://gist.github.com/geoffreyangus/e65c1c93c4b3b65518fbf6cf3f6c7316) that demonstrates inference with the provided TorchScript model. Hopefully this is along the lines of what you're looking...
Hi @fire, Good question! For this particular use case, if you inspect the outputs of `to_inference_module_input_from_dataframe`, the only step taken by the function is creating a `List[str]` object of length...
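The conversion described above can be sketched in plain pandas. Note this is a simplified stand-in, not Ludwig's actual `to_inference_module_input_from_dataframe` implementation; the function name `dataframe_to_inference_input` and the `"review"` column are assumptions for illustration:

```python
import pandas as pd

# Hypothetical stand-in for the conversion step: for a single text input
# feature, the work amounts to pulling one column out of the DataFrame
# as a list of strings.
def dataframe_to_inference_input(df: pd.DataFrame, feature_name: str) -> list:
    # Cast to str so the TorchScript module receives a List[str], matching
    # the type annotation TorchScript modules expect for text inputs.
    return df[feature_name].astype(str).tolist()

df = pd.DataFrame({"review": ["great product", "would not buy again"]})
inputs = dataframe_to_inference_input(df, "review")
print(inputs)  # ['great product', 'would not buy again']
```

Because the output is an ordinary Python list, the same preprocessing can be reproduced without pandas at all when serving from a non-DataFrame source.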
From a simplicity standpoint, it would be nice to have one global dropout param... that said, we shouldn't constrain the user if they have a particular use case (i.e. reproducing...
Hi @fire, good to hear from you again 🙂 What version of Ludwig are you running?
Hi @fire! Sorry for the delay. I'm out of town this week, but am working on reproducing your issue and will get back to you shortly.
Hi @fire! Getting started on this again. Where does `mnist_dataset.csv` come from / what are some sample rows in there? I ran `ludwig datasets download mnist -o ~/Downloads/issue2292/mnist` in the...
Hi @fire, just wanted to follow up with a [sample notebook](https://gist.github.com/geoffreyangus/6996c1cd7b0311385673d876c94740c8) that goes through my own process of downloading MNIST, then training and exporting a TorchScript model. The main things...
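The export-and-reload cycle the notebook walks through can be sketched with plain PyTorch. This is a minimal, Ludwig-agnostic example; the `TinyClassifier` model and its dimensions are assumptions for illustration, not the notebook's actual MNIST model:

```python
import torch
import torch.nn as nn

# A toy model standing in for a trained classifier (784 inputs ~ flattened
# 28x28 MNIST images, 10 output classes).
class TinyClassifier(nn.Module):
    def __init__(self, in_dim: int = 784, num_classes: int = 10):
        super().__init__()
        self.fc = nn.Linear(in_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc(x)

model = TinyClassifier()
model.eval()

scripted = torch.jit.script(model)     # compile the module to TorchScript
scripted.save("model.pt")              # export a self-contained artifact

# The reloaded module runs without the original Python class definition,
# which is the point of shipping a TorchScript export.
reloaded = torch.jit.load("model.pt")
with torch.no_grad():
    logits = reloaded(torch.zeros(1, 784))
print(logits.shape)  # torch.Size([1, 10])
```

Once saved, `model.pt` can also be loaded from C++ via `torch::jit::load`, which is typically why an exported TorchScript model is requested in the first place.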