While integrating OneKE, running NNInvoker.from_config("local_sft.json").local_sft() raises an error. How can this be resolved? The error output is as follows:
Parameter 'function'=<bound method HFLLMExecutor.map_fn of <nn4k.executor.huggingface.hf_decode_only_executor.HFDecodeOnlyExecutor object at 0x7f3db6be1000>> of the transform datasets.arrow_dataset.Dataset._map_single couldn't be hashed properly, a random hash was used instead. Make sure your transforms and parameters are serializable with pickle or dill for the dataset fingerprinting and caching to work. If you reuse this transform, the caching mechanism will consider it to be different from the previous calls and recompute everything. This warning is only showed once. Subsequent hashing failures won't be showed.
Map: 0%| | 0/1 [00:00<?, ? examples/s]
Traceback (most recent call last):
File "/home/deploy/ai/oneke/code/config/NN4K.py", line 2, in
NNInvoker.from_config("local_sft.json").local_sft()
File "/home/deploy/ai/oneke/code/openspg/openspg-master/python/nn4k/nn4k/invoker/base.py", line 151, in local_sft
LLMExecutor.from_config(sft_args).execute_sft()
File "/home/deploy/ai/oneke/code/openspg/openspg-master/python/nn4k/nn4k/executor/huggingface/base/hf_llm_executor.py", line 63, in execute_sft
train_dataset, eval_dataset = self._init_dataset(hf_sft_args)
File "/home/deploy/ai/oneke/code/openspg/openspg-master/python/nn4k/nn4k/executor/huggingface/base/hf_llm_executor.py", line 163, in _init_dataset
self._load_dataset(args.train_dataset_path, "train")
File "/home/deploy/miniconda3/envs/onekey_v3/lib/python3.10/site-packages/datasets/arrow_dataset.py", line 591, in wrapper
out: Union["Dataset", "DatasetDict"] = func(self, *args, **kwargs)
File "/home/deploy/miniconda3/envs/onekey_v3/lib/python3.10/site-packages/datasets/arrow_dataset.py", line 556, in wrapper
out: Union["Dataset", "DatasetDict"] = func(self, *args, **kwargs)
File "/home/deploy/miniconda3/envs/onekey_v3/lib/python3.10/site-packages/datasets/arrow_dataset.py", line 3089, in map
for rank, done, content in Dataset._map_single(**dataset_kwargs):
File "/home/deploy/miniconda3/envs/onekey_v3/lib/python3.10/site-packages/datasets/arrow_dataset.py", line 3442, in _map_single
example = apply_function_on_filtered_inputs(example, i, offset=offset)
File "/home/deploy/miniconda3/envs/onekey_v3/lib/python3.10/site-packages/datasets/arrow_dataset.py", line 3345, in apply_function_on_filtered_inputs
processed_inputs = function(*fn_args, *additional_args, **fn_kwargs)
File "/home/deploy/ai/oneke/code/openspg/openspg-master/python/nn4k/nn4k/executor/huggingface/base/hf_llm_executor.py", line 144, in map_fn
output_text = dataset["output"]
File "/home/deploy/miniconda3/envs/onekey_v3/lib/python3.10/site-packages/datasets/formatting/formatting.py", line 270, in getitem
value = self.data[key]
KeyError: 'output'
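For reference, the KeyError comes from HFLLMExecutor.map_fn reading dataset["output"] (hf_llm_executor.py, line 144), so at least one record in the training file pointed to by train_dataset_path appears to lack an "output" field. Below is a minimal, illustrative sanity check of the training file; it is not part of NN4K. It assumes the file is a JSON array of dict records and uses "train.json" as a placeholder path (adjust for JSON Lines or a different path from local_sft.json):

```python
# Illustrative check: confirm every training record has the "output" key
# that HFLLMExecutor.map_fn reads, per the traceback above.
# "train.json" is a placeholder for the train_dataset_path in local_sft.json.
import json

with open("train.json", "r", encoding="utf-8") as f:
    records = json.load(f)  # assumed: a JSON array of dict records

missing = [i for i, rec in enumerate(records) if "output" not in rec]
print(f"{len(missing)} of {len(records)} records lack an 'output' field:", missing[:10])
```

If this reports missing records, adding (or renaming a field to) "output" in those samples should let _load_dataset and map_fn proceed past this point.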