Xuefeng Bai

17 comments by Xuefeng Bai

Hi @Zoher15, our AMRBART model requires pre- and post-processing steps when transforming a sentence into an AMR graph, so your code cannot work as expected. You can refer...

Hi @Zoher15, the simplest way to use our code is to convert your downstream task data into the jsonl format [here](https://github.com/goodbai-nlp/AMRBART/blob/ff004044e0d3e75f8356dccaca05318a20ed7eb7/examples/data4parsing.jsonl#LL1C1-L1C1), then run

```
bash inference_amr.sh "xfbai/AMRBART-large-finetuned-AMR3.0-AMRParsing-v2"
```

If you...
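A minimal sketch of preparing that jsonl file from raw sentences. The `"sent"`/`"amr"` field names are an assumption based on the linked `data4parsing.jsonl` example; check the actual file before relying on them.

```python
import json

# Hypothetical input: plain sentences you want parsed into AMR graphs.
sentences = [
    "The boy wants to go.",
    "AMRBART parses sentences into AMR graphs.",
]

# Write one JSON object per line. For parsing, the "amr" field is
# left empty; field names are assumed from the example file.
with open("data4parsing.jsonl", "w", encoding="utf-8") as f:
    for sent in sentences:
        f.write(json.dumps({"sent": sent, "amr": ""}) + "\n")
```

Point `inference_amr.sh` at the directory containing this file (see the script's data-path variable) and it should pick the sentences up.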

@Zoher15 Thanks, and hope your project goes well :)

Yes, you need to follow [AMR-process](https://github.com/goodbai-nlp/AMR-Process) to linearize AMRs before feeding them into AMRBART-AMR2Text.
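To illustrate what "linearize" means here: a PENMAN-notation AMR graph is flattened into a single-line token sequence before being fed to the model. The toy function below only collapses whitespace; the real AMR-process pipeline does more (variable handling, special tokens), so treat this purely as a sketch of the idea.

```python
# Toy illustration of AMR linearization: flatten a multi-line
# PENMAN graph into one line of tokens. NOT the actual AMR-process
# implementation, which applies additional transformations.
def linearize(amr: str) -> str:
    # Collapse all whitespace so the graph fits on a single line.
    return " ".join(amr.split())

graph = """
(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-02
            :ARG0 b))
"""
print(linearize(graph))
# -> (w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))
```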

Hi @ting-chih, our code does not support the ``transformers`` pipeline; you can try [inference-text.sh](https://github.com/goodbai-nlp/AMRBART/blob/main/fine-tune/inference-text.sh) to generate text from AMR graphs.

Hi @flipz357, AMRBART does not support the Hugging Face Inference API because we have modified the tokenizer and input format. You can follow the instructions [here](https://github.com/goodbai-nlp/AMRBART#inference-on-your-own-data) to run inference on your own...

Hi, thanks for your rapid reply, I'll try that. On 9/16/2019 14:34, Amazing-J wrote: First, you...