fairseq-detect-hallucination

Detect hallucinated tokens for conditional sequence generation.

4 issues

I tried to run the predict_hallucination_mt.py file after downloading the pre-trained XSum model. But when `xlmr = XLMRModel.from_pretrained(model_path, checkpoint_file='checkpoint.pt', data_name_or_path=datapath)` is called, a KeyError occurs as follows, ...
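
For reference, a minimal sketch of the intended call through fairseq's hub interface is below; the paths and file names are placeholders, and the exact checkpoint/data layout depends on the downloaded archive.

```python
# Minimal sketch of loading the XLM-R based predictor with fairseq's hub API.
# model_path and data_path are placeholders; they must point to the directory
# containing checkpoint.pt and the binarized dictionary/data, respectively.
from fairseq.models.roberta import XLMRModel

model_path = "/path/to/xsum_model"   # placeholder
data_path = "/path/to/data-bin"      # placeholder

xlmr = XLMRModel.from_pretrained(
    model_path,
    checkpoint_file="checkpoint.pt",
    data_name_or_path=data_path,
)
xlmr.eval()  # disable dropout for prediction
```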

Hi, I personally find your work very interesting! One small question, though: how much synthetic data did you generate to train the final predictor? Are the...

The xsum_wordnet.py script reads the "pos" files on lines 148-149:

```python
tgt_tok_sents, tgt_tok_pos_tags = read_pos_tags_and_tokenizations(os.path.join(root, prefix.lower()+"target.pos"))
src_tok_sents, src_tok_pos_tags = read_pos_tags_and_tokenizations(os.path.join(root, prefix.lower()+"source.pos"))
```

How were these files generated? Best
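
For illustration only, here is a hypothetical way such `.pos` files could be produced with an off-the-shelf tagger; the assumed output format (one sentence per line, tokens paired with tags as `token_TAG`) is a guess and may not match what `read_pos_tags_and_tokenizations` actually expects.

```python
# Hypothetical sketch, not the repo's pipeline: tag a raw text file with NLTK
# and write one "token_TAG ..." line per input line.
import nltk

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

def write_pos_file(in_path: str, out_path: str) -> None:
    with open(in_path, encoding="utf-8") as fin, open(out_path, "w", encoding="utf-8") as fout:
        for line in fin:
            tokens = nltk.word_tokenize(line.strip())
            tagged = nltk.pos_tag(tokens)  # [(token, tag), ...]
            fout.write(" ".join(f"{tok}_{tag}" for tok, tag in tagged) + "\n")

# write_pos_file("train.target", "train.target.pos")
```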

Hello @violet-zct, thanks for the repo. I am trying to understand how [`convert_spm_labels_to_raw_labels`](https://github.com/violet-zct/fairseq-detect-hallucination/blob/master/util_scripts/eval_predict_hallucination_mt.py) works in the util prediction scripts. None of the BPE tokens start with `\u2581`. Thus...
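
For context, sentencepiece prefixes word-initial pieces with U+2581 ("▁"). A minimal sketch of how subword-level labels could be collapsed to word-level labels under that convention follows; this is an illustration of the general idea, not the repo's actual `convert_spm_labels_to_raw_labels` implementation.

```python
# Illustrative only: map sentencepiece-level labels back to word-level labels.
# A piece starting with "\u2581" begins a new word; a word is marked
# hallucinated (1) if any of its pieces is labeled 1.
WORD_START = "\u2581"

def spm_labels_to_word_labels(pieces, piece_labels):
    word_labels = []
    for piece, label in zip(pieces, piece_labels):
        if piece.startswith(WORD_START) or not word_labels:
            word_labels.append(int(label))                      # first piece of a new word
        else:
            word_labels[-1] = max(word_labels[-1], int(label))  # continuation piece
    return word_labels

# Example: pieces "▁He ll o ▁world" with labels [1, 0, 0, 0] -> [1, 0]
print(spm_labels_to_word_labels(["\u2581He", "ll", "o", "\u2581world"], [1, 0, 0, 0]))
```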