
🦄 State-of-the-Art Conversational AI with Transfer Learning

72 transfer-learning-conv-ai issues

I downloaded the pretrained and fine-tuned model from https://s3.amazonaws.com/models.huggingface.co/transfer-learning-chatbot/finetuned_chatbot_gpt.tar.gz and ran `tokenizer_class, model_class = GPT2Tokenizer, GPT2DoubleHeadsModel`, `tokenizer = tokenizer_class.from_pretrained(args.model_checkpoint)`, then `tokenizer.encode('good morning')`. The output is `[3454, None, 1054, 40164]`. I get `None` for the space...
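A quick way to narrow this down is to compare against the stock `gpt2` vocabulary; a minimal sketch, assuming the `transformers` package rather than the older `pytorch-pretrained-bert`:

```python
from transformers import GPT2Tokenizer

# Load the stock GPT-2 vocabulary for comparison with the downloaded checkpoint.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

# GPT-2's byte-level BPE folds the leading space into the token itself
# ("Ġmorning"), so every piece should map to a valid integer id. A None in
# the output usually means a token is missing from the checkpoint's
# vocab/merges files.
print(tokenizer.tokenize("good morning"))   # e.g. ['good', 'Ġmorning']
print(tokenizer.encode("good morning"))     # a list of ints, no None entries
```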

How can I run it over localhost?
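One common approach is to wrap the model behind a small HTTP server; a minimal Flask sketch, where `generate_reply` is a hypothetical placeholder standing in for the repo's sampling loop in `interact.py`:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def generate_reply(history):
    # Hypothetical placeholder: call the model/sampling code from interact.py here.
    return "hello!"

@app.route("/chat", methods=["POST"])
def chat():
    # Expects JSON like {"history": ["hi there", "how are you?"]}
    history = request.get_json().get("history", [])
    return jsonify({"reply": generate_reply(history)})

if __name__ == "__main__":
    # Serve on localhost only.
    app.run(host="127.0.0.1", port=5000)
```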

The `input_ids` tensor that is provided as input for the language modeling task contains the ground-truth label. Doesn't this mean the model uses the label (in addition to the other...
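For context, GPT-2-style language models compute the loss with causal attention masking and a one-position shift, so the token at position i is only predicted from positions before i even though the labels live in the same tensor; a small sketch of that shift (not the repo's exact code):

```python
import torch
import torch.nn.functional as F

# Toy logits and labels: batch of 1, sequence length 5, vocabulary of 10.
logits = torch.randn(1, 5, 10)
labels = torch.randint(0, 10, (1, 5))

# Shift so the prediction at position i is scored against the token at i+1.
# Combined with the causal attention mask inside the model, the token being
# predicted is never visible to the positions that predict it.
shift_logits = logits[:, :-1, :]
shift_labels = labels[:, 1:]
loss = F.cross_entropy(shift_logits.reshape(-1, 10), shift_labels.reshape(-1))
print(loss)
```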

In your docs, you mention that whenever the GPT2 tokenizer is used, there should be a space prefixed to the input string: https://huggingface.co/transformers/main_classes/tokenizer.html#transformers.PreTrainedTokenizer. However, in the `get_dataset()` function in `utils.py`,...
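The space matters because GPT-2's byte-level BPE folds leading whitespace into the token itself, so a word with and without a preceding space encodes to different ids; a quick sketch to see the difference, assuming the `transformers` `GPT2Tokenizer`:

```python
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

# Leading whitespace becomes part of the byte-level BPE token ('Ġ' marks a
# space), so the two strings encode to different id sequences.
print(tokenizer.tokenize("morning"))
print(tokenizer.tokenize(" morning"))   # includes 'Ġmorning'
print(tokenizer.encode("morning") == tokenizer.encode(" morning"))  # False
```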

How can I use other pretrained models, like XLNet, from Hugging Face's collection of pretrained models?
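Swapping the model class is the easy part; a hedged sketch of loading XLNet with `transformers` (the training script itself would still need changes, since it is written around a GPT/GPT-2-style double-heads model and its special tokens):

```python
from transformers import XLNetTokenizer, XLNetLMHeadModel

# Load an alternative pretrained checkpoint directly.
tokenizer = XLNetTokenizer.from_pretrained("xlnet-base-cased")
model = XLNetLMHeadModel.from_pretrained("xlnet-base-cased")

# Note: the repo's training/interaction code assumes GPT2DoubleHeadsModel-style
# inputs (special tokens, token type segments, a multiple-choice head), so
# other architectures generally need adjustments beyond this swap.
```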

Is it possible to assign personal traits like a name, age, and interests? I currently use a pretrained model; will I need to retrain?
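For reference, in the PersonaChat format the persona is just a list of short first-person sentences that gets prepended to the dialogue history, so traits like a name or age can be expressed there; whether retraining helps depends on how far the traits drift from the training personas. A hypothetical example persona:

```python
# A hypothetical persona in the PersonaChat style: short first-person sentences.
personality = [
    "my name is alex.",
    "i am 25 years old.",
    "i like hiking and photography.",
    "i work as a teacher.",
]

# In the repo's scripts the persona sentences are tokenized and prepended to
# the dialogue history, so swapping in a custom list like this is the usual
# entry point.
```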

Hello, sorry if this is a silly question. Can I somehow use the multilingual model with this code? I changed the tokenizer and model to `tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")` and `model =`...
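One thing worth noting: `bert-base-multilingual-cased` is a masked language model, while this codebase is built around a causal, GPT-2-style decoder with a double-heads variant, so the substitution described above is unlikely to work as a drop-in. A sketch of that substitution, for reference only:

```python
from transformers import AutoTokenizer, AutoModel

# The swap described in the issue. BERT is a masked LM, not a causal decoder,
# so the repo's generation loop and double-heads training do not apply to it
# directly.
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")
```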

Hello, I followed the steps of your article and installed PyTorch with CUDA like this: `pip3 install torch torchvision`. I have Python 3.7, torch 1.1.0, Ubuntu 18.04....
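A quick sanity check after an install like this is to confirm that the wheel actually shipped with CUDA support; a minimal sketch:

```python
import torch

print(torch.__version__)          # e.g. 1.1.0
print(torch.cuda.is_available())  # False usually means a CPU-only wheel or a driver issue
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```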

When setting `args.num_candidates` to 1, and the actual length of the candidates list of each entry is 1, I get this error during validation: `ERROR:ignite.engine.engine.Engine:Current run is terminating due...`
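Before digging into ignite, it can help to verify how many candidates each entry actually has; a hypothetical checker over a PersonaChat-style JSON file (field names assumed from the dataset format this repo uses):

```python
import json
from collections import Counter

# Hypothetical path; adjust to your dataset file.
with open("personachat_self_original.json") as f:
    data = json.load(f)

counts = Counter()
for split in ("train", "valid"):
    for dialog in data.get(split, []):
        for utterance in dialog["utterances"]:
            counts[len(utterance["candidates"])] += 1

# If every entry has exactly one candidate, the multiple-choice head has no
# distractors to rank against, which is a likely place for validation to break.
print(counts)
```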

I'm playing around with this wonderful code but I'm running into a curious issue when I try to train the model with my own data. I replicated the `personachat_self_original.json` file...
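For anyone reproducing the format, a minimal sketch of the expected JSON layout, as described in the repo's README (the ground-truth reply is the last entry in `candidates`):

```python
# Minimal PersonaChat-style layout expected by the training script
# (structure as documented in the repo's README).
dataset = {
    "train": [
        {
            "personality": ["i like to ski.", "i have two dogs."],
            "utterances": [
                {
                    "history": ["hello , how are you ?"],
                    # Distractor replies first, ground-truth reply last.
                    "candidates": [
                        "i am a robot .",
                        "i am great , just got back from skiing .",
                    ],
                }
            ],
        }
    ],
    "valid": [],
}
```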
