
Small-vocabulary sequence-to-sequence generation with optional feature conditioning

Results: 50 yoyodyne issues

Throwing this WIP up to store all vocabs in a single embeddings matrix shared between source, target, and features. This will fix the current pointer-generator issues when we have disjoint...

PyTorch Lightning supports a mode where it automatically computes the maximum batch size on your accelerator (by collecting gradients over a few batches and then using binary search to find...

enhancement
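The search Lightning performs can be sketched in a few lines: grow the batch size geometrically until a trial batch fails, then binary-search between the last success and the first failure. The `fits_in_memory` predicate below is a hypothetical stand-in for Lightning actually running a few training batches on the accelerator.

```python
def find_max_batch_size(fits_in_memory, start=2, max_trials=25):
    """Sketch of Lightning-style batch-size scaling: double the batch size
    until a trial fails, then binary-search the gap for the largest size
    that still fits."""
    # Phase 1: grow geometrically until we hit a failing size.
    size = start
    last_good = 0
    for _ in range(max_trials):
        if fits_in_memory(size):
            last_good = size
            size *= 2
        else:
            break
    else:
        # Never failed within the trial budget; return the last success.
        return last_good
    # Phase 2: binary search in (last_good, size).
    lo, hi = last_good, size
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if fits_in_memory(mid):
            lo = mid
        else:
            hi = mid
    return lo


# Pretend the accelerator can hold at most 97 examples per batch.
print(find_max_batch_size(lambda b: b <= 97))  # → 97
```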

Wu and Cotterell's hard monotonic transduction. Closes https://github.com/CUNY-CL/yoyodyne/issues/165. It builds off the lstm module. To invoke it, pass `hmm_lstm` for `--arch`. The hard monotonic constraint is enforced with `--enable_monotonic`, while the context window...
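The monotonic constraint can be sketched greedily, independent of the model: at each decoder step, the hard alignment may not move left of the alignment chosen at the previous step. The function below is a toy illustration over raw attention scores, not the library's actual implementation.

```python
def monotonic_alignments(score_rows):
    """Greedy sketch of a hard monotonic constraint: at each decoder step,
    pick the best-scoring source position at or after the previous one."""
    alignments = []
    prev = 0
    for scores in score_rows:
        # Mask out positions to the left of the previous alignment.
        allowed = range(prev, len(scores))
        best = max(allowed, key=lambda j: scores[j])
        alignments.append(best)
        prev = best
    return alignments


# Toy attention scores for 3 decoder steps over 4 source positions.
scores = [
    [0.1, 0.7, 0.1, 0.1],   # best overall: position 1
    [0.9, 0.6, 0.05, 0.05], # unconstrained argmax is 0; monotonicity forbids it
    [0.0, 0.1, 0.2, 0.7],
]
print(monotonic_alignments(scores))  # → [1, 1, 3]
```

Note that the resulting alignment sequence is nondecreasing by construction.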

It would be convenient to allow the encoder [output_size](https://github.com/CUNY-CL/yoyodyne/blob/master/yoyodyne/models/modules/lstm.py#L99) to differ from the TransformerDecoder embedding size. To illustrate the issue, consider the code snippet below: ```python import torch...

enhancement
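One common way to bridge such a mismatch is a learned linear projection from the encoder's output width to the decoder's embedding width. The sketch below uses plain lists as a dependency-free stand-in for `torch.nn.Linear(enc_size, dec_size)`; it is only meant to show the shape arithmetic, not the proposed fix.

```python
def project(states, weight):
    """Map encoder states of width enc_size to dec_size via a linear map
    (a plain-list stand-in for torch.nn.Linear(enc_size, dec_size))."""
    enc_size = len(weight)
    dec_size = len(weight[0])
    projected = []
    for h in states:
        assert len(h) == enc_size, "state width must match the projection"
        projected.append(
            [sum(h[i] * weight[i][j] for i in range(enc_size))
             for j in range(dec_size)]
        )
    return projected


# Encoder emits width-4 states; decoder embeddings are width-2.
states = [[1.0, 0.0, 2.0, 0.0], [0.0, 1.0, 0.0, 3.0]]
weight = [[1.0, 0.0], [0.0, 1.0], [1.0, 0.0], [0.0, 1.0]]  # 4 x 2
out = project(states, weight)
print([len(row) for row in out])  # → [2, 2]
```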

It would be nice if our RNN encoders and decoders, which are currently LSTMs, could be replaced with GRUs. A simple CLI option would be something like this. Rename the...

enhancement
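One way the option could look is a `gru` value for the existing `--arch` flag. Everything below beyond `--arch` itself is an assumption for illustration: the `ARCHS` mapping, the `gru` value, and the default are hypothetical, not yoyodyne's current CLI.

```python
import argparse

# Hypothetical mapping from --arch values to recurrent cell names; only
# "lstm" exists in the library today, "gru" is the proposed addition.
ARCHS = {"lstm": "LSTM", "gru": "GRU"}


def parse_args(argv):
    parser = argparse.ArgumentParser(prog="yoyodyne-train")
    parser.add_argument("--arch", choices=sorted(ARCHS), default="lstm")
    return parser.parse_args(argv)


args = parse_args(["--arch", "gru"])
print(ARCHS[args.arch])  # → GRU
```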

We currently store separate vocabularies for source, target, and features, and have separate embeddings matrices for the encoder and the decoder. We propose to flatten this distinction in the following way:...

enhancement
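The flattening idea can be sketched as one shared symbol table: every symbol gets a single index regardless of whether it occurs in the source, target, or features, so one embeddings matrix can serve the encoder and decoder, and a symbol's source and target indices coincide (which is what pointer-generators need). The `[+PL]` feature symbol is a made-up example.

```python
def flatten_vocabularies(source, target, features):
    """One flat symbol table shared by source, target, and features: the
    same symbol always maps to the same index, so a single embeddings
    matrix can back the encoder, the decoder, and the feature encoder."""
    index = {}
    for symbols in (source, target, features):
        for symbol in symbols:
            if symbol not in index:
                index[symbol] = len(index)
    return index


index = flatten_vocabularies(["a", "b"], ["b", "c"], ["[+PL]"])
print(index)  # → {'a': 0, 'b': 1, 'c': 2, '[+PL]': 3}
```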

The library does not work with Lightning > 2.0.0 (and one suspects that Torch itself is also an issue). The first issue I encounter when running `yoyodyne-train` with no arguments...

bug
release blocker

Beam search is generally not supported by our models, though the flag exists. It appears to be supported in `lstm`; it is unclear to me whether it's supported by `feature_invariant_transformer`,...

enhancement
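For reference, the core of what the flag would have to implement can be sketched as a toy beam search over per-step probabilities, keeping the `beam_width` highest-scoring partial hypotheses at each step. This is a generic illustration, not the library's decoder.

```python
import math


def beam_search(step_scores, beam_width=2):
    """Toy beam search: extend every hypothesis with every candidate token,
    then keep only the beam_width best by cumulative log-probability."""
    beams = [([], 0.0)]  # (token sequence, cumulative log-prob)
    for scores in step_scores:
        candidates = [
            (seq + [tok], lp + math.log(p))
            for seq, lp in beams
            for tok, p in scores.items()
        ]
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]
    return beams


# Two decoding steps over a two-token vocabulary.
steps = [
    {"a": 0.6, "b": 0.4},
    {"a": 0.1, "b": 0.9},
]
best_seq, best_lp = beam_search(steps)[0]
print(best_seq)  # → ['a', 'b']
```

With `beam_width=1` this reduces to greedy decoding, which is what the models effectively do today.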

(Adding to the issues board for documentation; the PR will be out over the week.) Wu and Cotterell's papers on strong alignment seem right up our alley for the library. There...

enhancement