ImportError: cannot import name 'clean_data_dir' from 'sklift.datasets'
🐛 Bug
To Reproduce
Steps to reproduce the behavior:
1. `from sklift.datasets import fetch_x5`
2. `dataset = fetch_x5()`
Expected behavior
Environment
- scikit-uplift version (e.g., 0.1.2):
- scikit-learn version (e.g., 0.22.2):
- Python version (e.g., 3.7):
- OS (e.g., Linux):
- Any other relevant information:
Additional context
```
ValueError                                Traceback (most recent call last)
Cell In[7], line 1
----> 1 dataset = fetch_x5()
      2 dataset.data.keys()

File ~/mambaforge/envs/main/lib/python3.11/site-packages/sklift/datasets/datasets.py:333, in fetch_x5(data_home, dest_subdir, download_if_missing)
    327 csv_purchases_path = _get_data(data_home=data_home, url=x5_metadata['url_purchases'], dest_subdir=dest_subdir,
    328                                dest_filename=file_purchases,
    329                                download_if_missing=download_if_missing,
    330                                desc=x5_metadata['desc_purchases'])
    332 if _get_file_hash(csv_purchases_path) != x5_metadata['hash_purchases']:
--> 333     raise ValueError(f"The {file_purchases} file is broken, please clean the directory "
    334                      f"with the clean_data_dir() function, and run the function again")
    336 purchases = pd.read_csv(csv_purchases_path)
    337 purchases_features = list(purchases.columns)

ValueError: The purchases.csv.gz file is broken, please clean the directory with the clean_data_dir() function, and run the function again
```
```python
from sklift.datasets import fetch_x5

dataset = fetch_x5(data_home='./my_data')
```
This is how the problem was resolved for me: fetching into a fresh `data_home` directory forces a clean re-download instead of reusing the broken cached file.
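Since `clean_data_dir` cannot be imported in this version, another workaround is to delete the corrupted cache directory manually before re-running `fetch_x5()`. Below is a minimal sketch; `clean_data_dir_manual` is a hypothetical helper (not part of sklift) that simply removes the cache folder with the standard library:

```python
import os
import shutil
import tempfile

def clean_data_dir_manual(data_home):
    """Hypothetical helper: delete a (possibly corrupted) dataset cache
    directory so that fetch_x5() re-downloads the files from scratch.
    Safe to call even if the directory does not exist."""
    if os.path.isdir(data_home):
        shutil.rmtree(data_home)

# Demo on a throwaway directory (stands in for the real sklift cache path):
cache = os.path.join(tempfile.gettempdir(), "sklift_cache_demo")
os.makedirs(cache, exist_ok=True)
clean_data_dir_manual(cache)
print(os.path.exists(cache))  # False: the stale cache is gone
```

After wiping the cache, calling `fetch_x5()` again (optionally with an explicit `data_home`) downloads fresh copies of the CSV files, which should then pass the hash check.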