lanyihuai

17 comments by lanyihuai

We appreciate your suggestions for the toolkit!

If you want to combine a RoBERTa encoder with the GTS decoder, my suggestion is to write a new model class under the folder mwptoolkit/model/Seq2Tree; you can reuse most of the GTS code. Several...
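To make the suggestion concrete, here is a minimal sketch of what such a class might look like. Everything in it — the class names `RobertaGTS` and `TreeDecoder`, the config keys, the file path in the docstring — is a hypothetical placeholder to illustrate the structure, not the actual mwptoolkit API:

```python
# Hypothetical sketch: a Seq2Tree model pairing a RoBERTa encoder with a
# GTS-style tree decoder. All names here are illustrative placeholders,
# not real mwptoolkit classes.

class TreeDecoder:
    """Stand-in for the GTS goal-driven tree decoder."""
    def __init__(self, hidden_size, operator_vocab):
        self.hidden_size = hidden_size
        self.operator_vocab = operator_vocab


class RobertaGTS:
    """Would live under mwptoolkit/model/Seq2Tree/ (file name hypothetical)."""
    def __init__(self, config):
        self.hidden_size = config["hidden_size"]
        # In a real implementation the encoder would be a pretrained RoBERTa
        # model loaded from HuggingFace; here it is only recorded by name.
        self.encoder_name = config.get("pretrained_model", "roberta-base")
        self.decoder = TreeDecoder(self.hidden_size, config["operator_vocab"])

    def forward(self, token_ids):
        # Encoder: token ids -> contextual embeddings (omitted).
        # Decoder: embeddings -> prefix equation tree (omitted).
        raise NotImplementedError("training/decoding logic goes here")


config = {"hidden_size": 768, "operator_vocab": ["+", "-", "*", "/"]}
model = RobertaGTS(config)
print(model.encoder_name)  # roberta-base
```

The real work is wiring the encoder's hidden states into the decoder's goal vectors, which the GTS code in the toolkit already demonstrates.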

> Failure # 1 (occurred at 2022-05-23_16-22-42)
> Traceback (most recent call last):
>   File "/home/marzieh/anaconda3/envs/mwptoolkit/lib/python3.7/site-packages/ray/tune/ray_trial_executor.py", line 901, in get_next_executor_event
>     future_result = ray.get(ready_future)
>   File "/home/marzieh/anaconda3/envs/mwptoolkit/lib/python3.7/site-packages/ray/_private/client_mode_hook.py", line 105, in wrapper
>     return func(*args, **kwargs)
> ...

I recommend version 1.3.0 of `ray`.

Yes, mawps-s uses the 5-fold cross-validation setting.
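As a quick illustration of what the 5-fold setting means (a generic sketch, not mwptoolkit's actual splitting code): the dataset indices are partitioned into five folds, each fold serves as the test set once, and the remaining four folds form the training set.

```python
def k_fold_splits(n_examples, k=5):
    """Partition indices 0..n_examples-1 into k (train, test) splits.

    Generic illustration of k-fold cross-validation; mwptoolkit's own
    splitting logic may differ (e.g. shuffling, stratification).
    """
    # Fold i takes every k-th index starting at i.
    folds = [list(range(i, n_examples, k)) for i in range(k)]
    for test_fold in folds:
        held_out = set(test_fold)
        train = [i for i in range(n_examples) if i not in held_out]
        yield train, test_fold


# With 10 examples and k=5: five splits, each test fold holds 2 examples.
splits = list(k_fold_splits(10, k=5))
```

Reported k-fold accuracy is then the average over the five test folds.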

SVAMP is only a test dataset; according to the SVAMP paper, the training set consists of MAWPS and ASDiv-A, and the evaluation setting is a train-test split. Running it with k-fold cross-validation may not...
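Schematically, the protocol looks like this (a sketch with hypothetical `load_*` helpers standing in for the real data pipeline):

```python
# Schematic of the SVAMP evaluation protocol: train on MAWPS + ASDiv-A,
# test on SVAMP. The load_* functions are hypothetical placeholders that
# return toy records, not real dataset loaders.

def load_mawps():
    return [{"problem": "...", "equation": "..."}]

def load_asdiv_a():
    return [{"problem": "...", "equation": "..."}]

def load_svamp():
    return [{"problem": "...", "equation": "..."}]


train_set = load_mawps() + load_asdiv_a()  # fixed training set
test_set = load_svamp()                    # SVAMP is evaluation-only
# No k-fold rotation: SVAMP examples never enter the training set.
```

Because the test set is fixed, rotating folds over SVAMP would break the comparison with results reported in the SVAMP paper.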

You can change it on the command line, for example: `--model=MWPBert --dataset=math23k --task_type=single_equation --gpu_id=0 --equation_fix=prefix --pretrained_model=bert-base-chinese`. I hope this helps.

There may be something wrong with my code in the v0.0.6 update; I will check it. I'm sorry for that.

I got value accuracy 82.5, the latest result of MWPBert on math23k. Here is my instruction:

```
python run_mwptoolkit.py --model=MWPBert --dataset=math23k --equation_fix=prefix --task_type=single_equation --pretrained_model=hfl/chinese-bert-wwm-ext --test_step=5 --gpu_id=0 --train_batch_size=32...
```