
Integrating the Best of TF into PyTorch, for Machine Learning, Natural Language Processing, and Text Generation. This is part of the CASL project: http://casl-project.ai/

36 texar-pytorch issues

I installed it from the command line. Now how do I use it? How do I run GPT-2?

Setting `pretrained_model_name` not only defines the model architecture but also loads the pre-trained checkpoint. We should have another `hparam` to control whether or not to load the pre-trained checkpoint.
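A minimal sketch of what the proposed flag might look like, assuming a plain hparams dict; `load_checkpoint` and `should_load_weights` are hypothetical names for illustration, not existing Texar options:

```python
# Hypothetical hparams for a pre-trained module. `load_checkpoint` is the
# proposed (not yet existing) flag that would decouple architecture selection
# from weight loading.
hparams = {
    "pretrained_model_name": "gpt2-small",  # today: picks the arch AND loads weights
    "load_checkpoint": False,               # proposed: keep the arch, skip the weights
}

def should_load_weights(hparams):
    # Default to loading, which preserves current behavior when the flag is absent.
    return hparams.get("load_checkpoint", True)
```

Defaulting the flag to `True` would keep existing configs working unchanged.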

enhancement
good first issue
topic: modules

https://github.com/asyml/texar-pytorch/blob/0ba18bff28cd8fff2640021354c15dfd4aef2f72/examples/vae_text/config_lstm_yahoo.py#L62 The output is the following: `RuntimeError: Input batch size 128 doesn't match hidden[0] batch size 256`. The issue is due to `initial_state=lstm_states` when the decoder is forwarded.
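The same error can be reproduced with plain PyTorch; the sizes below mirror the report, and all names are illustrative rather than taken from the example's config:

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)
inputs = torch.randn(128, 10, 32)  # batch size 128

# An initial state built for a different batch size (256) raises a
# RuntimeError analogous to the one reported above.
bad_state = (torch.zeros(1, 256, 64), torch.zeros(1, 256, 64))
try:
    lstm(inputs, bad_state)
except RuntimeError:
    pass  # expected: hidden state batch dim does not match the input

# Fix: size the initial state's batch dimension from the actual input batch.
batch = inputs.size(0)
good_state = (torch.zeros(1, batch, 64), torch.zeros(1, batch, 64))
outputs, _ = lstm(inputs, good_state)
```

The fix for the example is presumably the same idea: derive `lstm_states` from the decoder's actual input batch rather than a fixed size.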

question
topic: examples

A BART model (https://arxiv.org/pdf/1910.13461.pdf) is implemented here: https://github.com/tanyuqian/texar-pytorch/tree/master/examples/bart It passes the tests for text classification (MNLI) and summarization (CNN/DM) with greedy decoding, but it fails to run CNN/DM with...

question
topic: modules

The code blocks in the docstrings are not rendered as expected in the Sphinx docs. The code blocks in the following [example](https://github.com/asyml/texar-pytorch/blob/master/texar/torch/run/executor.py#L613) (and a few following) are not...
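A common cause of this symptom is a missing blank line or inconsistent indentation around the `.. code-block::` directive inside the docstring; a minimal docstring that does render correctly (a sketch, not the `executor.py` content):

```python
def example():
    r"""Summary line.

    .. code-block:: python

        # The blank line after the directive and the consistent indent
        # are both required for Sphinx to render this as a code block.
        x = 1
    """
```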

bug
topic: docs

* initial commit
* bug fixes and adjusting conv inputs
* separate forward function for Discriminator and Generator and disable Gen training for debugging
* remove debugger statement
* texar...

Requested by Forte Team. @hunterhector

enhancement
topic: modules

I'm really enjoying this library, thanks for your work. Just curious, are there any plans to implement some sort of copying mechanism for decoding, e.g. CopyNet (https://arxiv.org/abs/1603.06393)?
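For reference, the core of a copy mechanism can be sketched in a few lines. This is pointer-generator-style mixing (See et al., 2017) rather than exact CopyNet scoring, and all names are illustrative:

```python
def mix_copy_probs(p_vocab, p_copy, p_gen):
    """Blend a generation distribution with a copy distribution.

    Final probability of word w: p_gen * P_vocab(w) + (1 - p_gen) * P_copy(w).
    P_copy assigns mass to source tokens, so out-of-vocabulary source words
    can still be produced at decode time.
    """
    words = set(p_vocab) | set(p_copy)
    return {
        w: p_gen * p_vocab.get(w, 0.0) + (1 - p_gen) * p_copy.get(w, 0.0)
        for w in words
    }

p_vocab = {"the": 0.6, "cat": 0.4}          # decoder softmax over the vocab
p_copy = {"cat": 0.5, "Zylkor": 0.5}        # attention mass; "Zylkor" is OOV
mixed = mix_copy_probs(p_vocab, p_copy, p_gen=0.7)
```

In a full model, `p_gen` would itself be predicted per step from the decoder state; CopyNet instead scores generate and copy modes jointly under one softmax.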

enhancement
topic: modules

Adapted from [sequence_tagging](https://github.com/asyml/texar/tree/master/examples/sequence_tagging) in `texar-tf`.

Add a Texar-styled ELMo encoder adapted from AllenNLP. The corresponding tokenizer will be in another PR. Resolves some comments in #298. I checked the implementation of `ELMo` in `allennlp`; it seems...