Pete Walsh
Hey @lgessler, we do have a mechanism for accessing a specific key in the output of another step. This works for things that act like dictionaries, tuples, or lists. For...
Now that I think of it, we could probably also support attributes in addition to keys. If anyone is interested in making a PR for that, have a look at...
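To make the idea concrete, here's a rough sketch of what resolving a key, index, or attribute from another step's output could look like. The function name and structure are hypothetical and just for illustration, not the actual implementation:

```python
from typing import Any

def resolve_accessor(output: Any, accessor: str) -> Any:
    """Drill into another step's output by key, index, or (proposed) attribute.

    Purely illustrative of the mechanism described above.
    """
    # Dict-style key access.
    if isinstance(output, dict) and accessor in output:
        return output[accessor]
    # Tuple/list-style index access.
    if isinstance(output, (list, tuple)) and accessor.lstrip("-").isdigit():
        return output[int(accessor)]
    # Attribute access -- the extension suggested above.
    return getattr(output, accessor)
```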
Absolutely! That would be great
Hey @critocrito, thanks for this! I would LOVE to get this working with async, but also feel strongly about keeping backwards compat by continuing to support sync. Maybe we use...
`maybe_async` looks promising! But if that proves too complex, I'm fine with just duplicating functions for now. Obviously that's not a great way to write code in general, but we...
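For reference, the "duplicating functions" fallback I have in mind is basically the pattern below (function names are hypothetical, and Python/asyncio is used purely for illustration):

```python
import asyncio

async def fetch_resource_async(url: str) -> bytes:
    """Async implementation (hypothetical; stands in for the real I/O)."""
    await asyncio.sleep(0)  # placeholder for actual async work
    return b"contents of " + url.encode()

def fetch_resource(url: str) -> bytes:
    """Sync counterpart kept for backwards compatibility; it just drives the async version."""
    # Note: this can't be called from inside an already-running event loop.
    return asyncio.run(fetch_resource_async(url))
```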
Hey @alexandre-dos-reis, right now there are a bunch of places in our codebase hardcoded to check for ".md" extensions. So we'd have to grep through all of those and add ".mdx". I'm...
Hey @dduenker, are you still planning on wrapping this up?
Hey @ljudina, not at the moment. Personally I just have my own shortcut command that I call periodically to create commits via fugitive.
Hey @pvcastro, a couple of questions:

1. In all experiments (BERT-AllenNLP, RoBERTa-AllenNLP, BERT-transformers, RoBERTa-transformers) were you using the same optimizer?
2. When you used transformers directly (for BERT-transformers and RoBERTa-transformers) was...
Gotcha! Oh yes, I meant `BertForTokenClassification`, not `BertForSequenceClassification` 🤦 So I think the most likely source for a bug would be in the `PretrainedTransformerMismatched(Embedder|TokenIndexer)`. And any differences between BERT and...
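In case it helps narrow things down, here's roughly how that mismatched indexer/embedder pair is typically instantiated (class paths are from memory, so double-check them against your AllenNLP version; `bert-base-cased` is just an example model name):

```python
from allennlp.data.token_indexers import PretrainedTransformerMismatchedIndexer
from allennlp.modules.token_embedders import PretrainedTransformerMismatchedEmbedder

# Both components should use the same model name so the wordpiece embeddings
# get pooled back to the original (word-level) tokens consistently.
indexer = PretrainedTransformerMismatchedIndexer(model_name="bert-base-cased")
embedder = PretrainedTransformerMismatchedEmbedder(model_name="bert-base-cased")
```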