Transformers-Tutorials
Did you work on fine-tuning TAPEX too? I see a fine-tuned model on Hugging Face.
Hi,
All examples regarding TAPEX can be found here: https://github.com/huggingface/transformers/tree/main/examples/research_projects/tapex
Thanks a lot, I'll take a look. Also, I was curious: have you figured out a way to get the logits and cell information from the TAPEX WTQ model?
You can get logits by passing output_scores=True to the generate method.
It doesn't have that parameter. I am using the latest transformers version. Any idea about an alternative?

```python
from transformers import TapexTokenizer, BartForConditionalGeneration

tokenizer_tapex = TapexTokenizer.from_pretrained("microsoft/tapex-large-finetuned-wtq")
model_tapex = BartForConditionalGeneration.from_pretrained("microsoft/tapex-large-finetuned-wtq")

# tmp_table is a pandas DataFrame, query a string (defined elsewhere)
encoding = tokenizer_tapex(table=tmp_table, query=query, return_tensors="pt")
outputs = model_tapex.generate(**encoding, output_scores=True)
```

Here, outputs doesn't contain the logits. @NielsRogge
@NielsRogge can you please help here, or at least point me in the right direction?
They include the scores, which are the raw logits in case you use greedy decoding. Note that you also need to pass return_dict_in_generate=True; otherwise generate only returns the generated token IDs.
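A minimal end-to-end sketch of the above, using the microsoft/tapex-large-finetuned-wtq checkpoint from the thread. The table and query here are illustrative placeholders, not data from the original discussion:

```python
import pandas as pd
from transformers import TapexTokenizer, BartForConditionalGeneration

tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large-finetuned-wtq")
model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-large-finetuned-wtq")

# Illustrative placeholder data, not from the original thread.
table = pd.DataFrame({"city": ["Paris", "London"], "population": ["2.1 million", "8.9 million"]})
query = "which city has the larger population?"

encoding = tokenizer(table=table, query=query, return_tensors="pt")

# return_dict_in_generate=True makes generate return a ModelOutput whose
# .scores field is a tuple with one logits tensor per generated token.
outputs = model.generate(**encoding, output_scores=True, return_dict_in_generate=True)

print(tokenizer.batch_decode(outputs.sequences, skip_special_tokens=True))
# One (batch_size, vocab_size) tensor per generated token:
print(len(outputs.scores), outputs.scores[0].shape)
```

With greedy decoding (the default), each tensor in outputs.scores holds the raw logits for that decoding step, so you can apply a softmax over the vocabulary dimension to get per-token probabilities.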