openpi
Data conversion error occurred
The "raw_dataset = tfds.load(raw_dataset_name, data_dir=data_dir, split="train")" code in "convert_libero_data_to_lerobot.py" reports an error. How to solve it
What's the error?
tyro.cli(main)
tensorflow_datasets.core.registered.DatasetNotFoundError: Dataset libero_10_no_noops not found.
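For context, tfds.load only finds datasets that have already been downloaded or prepared under data_dir, so this DatasetNotFoundError usually means the raw LIBERO RLDS dataset is not in that directory. A minimal sketch of the check this implies (the data_dir path below is a placeholder, not the path the script actually uses):

import os

import tensorflow_datasets as tfds

data_dir = "/path/to/libero/rlds"  # placeholder: point this at wherever the raw RLDS data lives
raw_dataset_name = "libero_10_no_noops"

# tfds.load raises DatasetNotFoundError when no prepared dataset exists under data_dir,
# so verify the dataset directory is present before running the conversion script.
if not os.path.isdir(os.path.join(data_dir, raw_dataset_name)):
    raise FileNotFoundError(f"No prepared dataset found at {data_dir}/{raw_dataset_name}")

raw_dataset = tfds.load(raw_dataset_name, data_dir=data_dir, split="train")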
When converting the LIBERO dataset, I encountered "ValueError: Feature mismatch in frame dictionary: Missing features: {'task'}". How can I solve it?
I also encountered this error. Were you able to resolve it?
In examples/libero/convert_libero_data_to_lerobot.py, modify the code at lines 88 and 91 as follows:

for raw_dataset_name in RAW_DATASET_NAMES:
    raw_dataset = tfds.load(raw_dataset_name, data_dir=data_dir, split="train")
    for episode in raw_dataset:
        for step in episode["steps"].as_numpy_iterator():
            dataset.add_frame(
                {
                    "image": step["observation"]["image"],
                    "wrist_image": step["observation"]["wrist_image"],
                    "state": step["observation"]["state"],
                    "actions": step["action"],
                    "task": step["language_instruction"].decode(),  # add this line
                }
            )
        # dataset.save_episode(task=step["language_instruction"].decode())
        dataset.save_episode()  # remember the task=... argument here also has to be removed
What's the error?
tyro.cli(main)
tensorflow_datasets.core.registered.DatasetNotFoundError: Dataset libero_10_no_noops not found.
Did you find a solution to this?
When converting the LIBERO dataset, I encountered "ValueError: Feature mismatch in frame dictionary: Missing features: {'task'}". How can I solve it?
This seems to happen after the project owner bumped the lerobot version in 994a098 (pyproject.toml). The older lerobot version handles the task per episode; the newer version handles it per step.
I made the following changes and am able to at least run the conversion script:
for step in episode["steps"].as_numpy_iterator():
    dataset.add_frame(
        {
            "image": step["observation"]["image"],
            "wrist_image": step["observation"]["wrist_image"],
            "state": step["observation"]["state"],
            "actions": step["action"],
            "task": step["language_instruction"].decode(),
        }
    )
# dataset.save_episode(task=step["language_instruction"].decode())
dataset.save_episode()
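Whether this workaround is needed depends on which lerobot version is installed, since the per-step task handling only exists in newer releases. A quick way to check, assuming lerobot was installed as a regular package:

from importlib.metadata import PackageNotFoundError, version

# Older lerobot releases expect save_episode(task=...); newer ones expect a "task" key in every frame.
try:
    print("lerobot version:", version("lerobot"))
except PackageNotFoundError:
    print("lerobot is not installed in this environment")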
Sorry for the late response here -- just pushed a PR that should fix this: https://github.com/Physical-Intelligence/openpi/pull/569