Multi-CPR
Could you fix this bug? The script refers to ../data/"$data_name"/train/, but that folder does not exist.
I created a train folder under ecom and put the files into it, but it then fails with a JSON error:

ERROR - datasets.packaged_modules.json.json - Failed to read file '/home/xuwei/Downloads/Multi-CPR-main/data/ecom/train/corpus.tsv' with error <class 'pyarrow.lib.ArrowInvalid'>: JSON parse error: Column() changed from object to number in row 0
Traceback (most recent call last):
  File "/home/xuwei/anaconda3/lib/python3.8/site-packages/datasets/packaged_modules/json/json.py", line 144, in _generate_tables
    dataset = json.load(f)
  File "/home/xuwei/anaconda3/lib/python3.8/json/__init__.py", line 293, in load
    return loads(fp.read(),
  File "/home/xuwei/anaconda3/lib/python3.8/json/__init__.py", line 357, in loads
    return _default_decoder.decode(s)
  File "/home/xuwei/anaconda3/lib/python3.8/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 1 column 3 (char 2)
During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "run_training.py", line 152, in <module>
    main()
  File "run_training.py", line 76, in main
    train_set = TextMatchingDataset(load_dataset(
  File "/home/xuwei/anaconda3/lib/python3.8/site-packages/datasets/load.py", line 1691, in load_dataset
    builder_instance.download_and_prepare(
  File "/home/xuwei/anaconda3/lib/python3.8/site-packages/datasets/builder.py", line 605, in download_and_prepare
    self._download_and_prepare(
  File "/home/xuwei/anaconda3/lib/python3.8/site-packages/datasets/builder.py", line 694, in _download_and_prepare
    self._prepare_split(split_generator, **prepare_split_kwargs)
  File "/home/xuwei/anaconda3/lib/python3.8/site-packages/datasets/builder.py", line 1151, in _prepare_split
    for key, table in logging.tqdm(
  File "/home/xuwei/anaconda3/lib/python3.8/site-packages/tqdm/std.py", line 1183, in __iter__
    for obj in iterable:
  File "/home/xuwei/anaconda3/lib/python3.8/site-packages/datasets/packaged_modules/json/json.py", line 146, in _generate_tables
    raise e
  File "/home/xuwei/anaconda3/lib/python3.8/site-packages/datasets/packaged_modules/json/json.py", line 122, in _generate_tables
    pa_table = paj.read_json(
  File "pyarrow/_json.pyx", line 259, in pyarrow._json.read_json
  File "pyarrow/error.pxi", line 144, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 100, in pyarrow.lib.check_status
pyarrow.lib.ArrowInvalid: JSON parse error: Column() changed from object to number in row 0
ERROR:torch.distributed.elastic.multiprocessing.api:failed (exitcode: 1) local_rank: 0 (pid: 64255) of binary: /home/xuwei/anaconda3/bin/python
Traceback (most recent call last):
File "/home/xuwei/anaconda3/lib/python3.8/runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/home/xuwei/anaconda3/lib/python3.8/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/home/xuwei/anaconda3/lib/python3.8/site-packages/torch/distributed/launch.py", line 193, in
main()
File "/home/xuwei/anaconda3/lib/python3.8/site-packages/torch/distributed/launch.py", line 189, in main
launch(args)
File "/home/xuwei/anaconda3/lib/python3.8/site-packages/torch/distributed/launch.py", line 174, in launch
run(args)
File "/home/xuwei/anaconda3/lib/python3.8/site-packages/torch/distributed/run.py", line 715, in run
elastic_launch(
File "/home/xuwei/anaconda3/lib/python3.8/site-packages/torch/distributed/launcher/api.py", line 131, in call
return launch_agent(self._config, self._entrypoint, list(args))
File "/home/xuwei/anaconda3/lib/python3.8/site-packages/torch/distributed/launcher/api.py", line 245, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
run_training.py FAILED
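For context, the ArrowInvalid failure above is raised by the Hugging Face datasets "json" builder while it tries to parse corpus.tsv: that builder expects JSON / JSON Lines input, not tab-separated text. A minimal sketch of the same class of failure, assuming only what the log shows (the path is copied from the log and is purely illustrative; the exact exception surfaced can differ across datasets/pyarrow versions):

from datasets import load_dataset

# corpus.tsv is tab-separated text, so pyarrow's JSON reader cannot parse it;
# the "json" builder only accepts JSON / JSON Lines files.
try:
    load_dataset("json", data_files="/home/xuwei/Downloads/Multi-CPR-main/data/ecom/train/corpus.tsv")
except Exception as err:  # typically surfaces as pyarrow.lib.ArrowInvalid
    print(type(err).__name__, err)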
[retrieval] The directory referenced in run.sh holds the training data produced by the data preprocessing step. Please read the Data preprocess section of the README; for training on the e-commerce (ecom) data you can follow the steps below, with a loading sketch after the commands:
- mkdir ../data/ecom/train
- python create_train.py --qrels_file ../data/ecom/qrels.train.tsv --query_file ../data/ecom/train.query.txt --collection_file ../data/ecom/corpus.tsv --save_to ../data/ecom/train/train.json --tokenizer_name bert-base-chinese
- sh run_train.sh
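The traceback above shows that run_training.py wraps load_dataset(...) in a TextMatchingDataset, so once create_train.py has written train/train.json, the loading step should look roughly like the sketch below. This is only an assumed outline: the variable names and arguments are illustrative, and the authoritative call is the one in run_training.py.

from datasets import load_dataset

data_name = "ecom"  # e-commerce domain; the same flow applies to the other domains
train_file = f"../data/{data_name}/train/train.json"  # produced by create_train.py

# The "json" builder reads the preprocessed JSON Lines file. Keep raw TSV files
# such as corpus.tsv out of ../data/<data_name>/train/, otherwise the builder
# may pick them up and fail as in the traceback above.
train_data = load_dataset("json", data_files=train_file, split="train")
print(train_data)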