AdaSeq
NotImplementedError
Is your feature request related to a problem?
2023-06-08 14:43:53,296 - INFO - modelscope - Checkpoints will be saved to NER/koubei/230608144347.471030
2023-06-08 14:43:53,296 - INFO - adaseq.training.hooks.text_logger_hook - Text logs will be saved to: NER/koubei/230608144347.471030/metrics.json
2023-06-08 14:43:53,369 - INFO - modelscope - tensorboard files will be saved to NER/koubei/230608144347.471030/tensorboard_output
/data/ningfeim/.myconda/envs/modelscope/lib/python3.7/site-packages/adaseq/modules/decoders/crf.py:293: UserWarning: where received a uint8 condition tensor. This behavior is deprecated and will be removed in a future version of PyTorch. Use a boolean condition instead. (Triggered internally at ../aten/src/ATen/native/TensorCompare.cpp:402.)
score = torch.where(mask[i].unsqueeze(1), next_score, score)
2023-06-08 14:44:00,504 - INFO - modelscope - epoch [1][50/478] lr: 5.000e-05, eta: 0:05:33, iter_time: 0.143, data_load_time: 0.013, memory: 2636, loss: 24.8958
2023-06-08 14:44:05,806 - INFO - modelscope - epoch [1][100/478] lr: 5.000e-05, eta: 0:04:44, iter_time: 0.106, data_load_time: 0.011, memory: 2689, loss: 6.3790
2023-06-08 14:44:11,301 - INFO - modelscope - epoch [1][150/478] lr: 5.000e-05, eta: 0:04:27, iter_time: 0.110, data_load_time: 0.012, memory: 2689, loss: 4.3439
2023-06-08 14:44:16,351 - INFO - modelscope - epoch [1][200/478] lr: 5.000e-05, eta: 0:04:11, iter_time: 0.101, data_load_time: 0.012, memory: 2689, loss: 3.1956
2023-06-08 14:44:22,130 - INFO - modelscope - epoch [1][250/478] lr: 5.000e-05, eta: 0:04:06, iter_time: 0.116, data_load_time: 0.014, memory: 2800, loss: 3.9149
2023-06-08 14:44:28,207 - INFO - modelscope - epoch [1][300/478] lr: 5.000e-05, eta: 0:04:02, iter_time: 0.122, data_load_time: 0.015, memory: 2800, loss: 3.5845
2023-06-08 14:44:33,858 - INFO - modelscope - epoch [1][350/478] lr: 5.000e-05, eta: 0:03:55, iter_time: 0.113, data_load_time: 0.014, memory: 2800, loss: 2.7284
2023-06-08 14:44:38,887 - INFO - modelscope - epoch [1][400/478] lr: 5.000e-05, eta: 0:03:46, iter_time: 0.101, data_load_time: 0.012, memory: 2800, loss: 2.6073
2023-06-08 14:44:44,202 - INFO - modelscope - epoch [1][450/478] lr: 5.000e-05, eta: 0:03:39, iter_time: 0.106, data_load_time: 0.012, memory: 2800, loss: 2.7159
Total test samples: 100%|██████████████████████████████████████████████████████████████████████████████████████| 463/463 [00:02<00:00, 227.71it/s]
Traceback (most recent call last):
File "/data/ningfeim/.myconda/envs/modelscope/bin/adaseq", line 8, in
Environment: Python 3.7, Ubuntu 22.04, CUDA 11.7
Name Version Build Channel
_libgcc_mutex 0.1 main
_openmp_mutex 5.1 1_gnu
absl-py 1.4.0 pypi_0 pypi
adaseq 0.6.2 pypi_0 pypi
addict 2.4.0 pypi_0 pypi
aiohttp 3.8.4 pypi_0 pypi
aiosignal 1.3.1 pypi_0 pypi
albumentations 1.3.0 pypi_0 pypi
aliyun-python-sdk-core 2.13.36 pypi_0 pypi
aliyun-python-sdk-kms 2.16.0 pypi_0 pypi
apex 0.1 pypi_0 pypi
async-timeout 4.0.2 pypi_0 pypi
asynctest 0.13.0 pypi_0 pypi
attrs 22.2.0 pypi_0 pypi
beautifulsoup4 4.11.2 pypi_0 pypi
blis 0.7.9 pypi_0 pypi
boto3 1.26.89 pypi_0 pypi
botocore 1.29.89 pypi_0 pypi
bs4 0.0.1 pypi_0 pypi
ca-certificates 2023.01.10 h06a4308_0
cachetools 5.3.0 pypi_0 pypi
catalogue 2.0.8 pypi_0 pypi
certifi 2022.12.7 py37h06a4308_0
cffi 1.15.1 pypi_0 pypi
charset-normalizer 3.1.0 pypi_0 pypi
click 8.1.3 pypi_0 pypi
confection 0.0.4 pypi_0 pypi
crcmod 1.7 pypi_0 pypi
cryptography 39.0.2 pypi_0 pypi
cycler 0.11.0 pypi_0 pypi
cymem 2.0.7 pypi_0 pypi
datasets 2.8.0 pypi_0 pypi
deepspeed 0.8.2 pypi_0 pypi
dill 0.3.6 pypi_0 pypi
einops 0.6.0 pypi_0 pypi
en-core-web-sm 3.5.0 pypi_0 pypi
filelock 3.9.0 pypi_0 pypi
fonttools 4.38.0 pypi_0 pypi
frozenlist 1.3.3 pypi_0 pypi
fsspec 2023.1.0 pypi_0 pypi
ftfy 6.1.1 pypi_0 pypi
gast 0.5.3 pypi_0 pypi
google-auth 2.16.2 pypi_0 pypi
google-auth-oauthlib 0.4.6 pypi_0 pypi
grpcio 1.52.0 pypi_0 pypi
hjson 3.1.0 pypi_0 pypi
huggingface-hub 0.13.1 pypi_0 pypi
idna 3.4 pypi_0 pypi
imageio 2.26.0 pypi_0 pypi
importlib-metadata 6.0.0 pypi_0 pypi
jieba 0.42.1 pypi_0 pypi
jinja2 3.1.2 pypi_0 pypi
jmespath 0.10.0 pypi_0 pypi
joblib 1.2.0 pypi_0 pypi
jsonplus 0.8.0 pypi_0 pypi
kiwisolver 1.4.4 pypi_0 pypi
langcodes 3.3.0 pypi_0 pypi
ld_impl_linux-64 2.38 h1181459_1
libffi 3.4.2 h6a678d5_6
libgcc-ng 11.2.0 h1234567_1
libgomp 11.2.0 h1234567_1
libstdcxx-ng 11.2.0 h1234567_1
markdown 3.4.1 pypi_0 pypi
markupsafe 2.1.2 pypi_0 pypi
matplotlib 3.5.3 pypi_0 pypi
megatron-util 1.3.1 pypi_0 pypi
mmdet 2.28.2 pypi_0 pypi
mock 5.0.1 pypi_0 pypi
modelscope 1.4.2 pypi_0 pypi
multidict 6.0.4 pypi_0 pypi
multiprocess 0.70.14 pypi_0 pypi
murmurhash 1.0.9 pypi_0 pypi
ncurses 6.4 h6a678d5_0
networkx 2.6.3 pypi_0 pypi
ninja 1.11.1 pypi_0 pypi
nltk 3.8.1 pypi_0 pypi
numpy 1.21.6 pypi_0 pypi
nvidia-cublas-cu11 11.10.3.66 pypi_0 pypi
nvidia-cuda-nvrtc-cu11 11.7.99 pypi_0 pypi
nvidia-cuda-runtime-cu11 11.7.99 pypi_0 pypi
nvidia-cudnn-cu11 8.5.0.96 pypi_0 pypi
oauthlib 3.2.2 pypi_0 pypi
opencv-python-headless 4.7.0.72 pypi_0 pypi
openssl 1.1.1t h7f8727e_0
oss2 2.17.0 pypi_0 pypi
packaging 23.0 pypi_0 pypi
pai-easynlp 0.0.7 pypi_0 pypi
pandas 1.3.5 pypi_0 pypi
pathy 0.10.1 pypi_0 pypi
pillow 9.4.0 pypi_0 pypi
pip 22.3.1 py37h06a4308_0
preshed 3.0.8 pypi_0 pypi
protobuf 3.19.0 pypi_0 pypi
psutil 5.9.4 pypi_0 pypi
py-cpuinfo 9.0.0 pypi_0 pypi
pyarrow 11.0.0 pypi_0 pypi
pyasn1 0.4.8 pypi_0 pypi
pyasn1-modules 0.2.8 pypi_0 pypi
pycocotools 2.0.6 pypi_0 pypi
pycparser 2.21 pypi_0 pypi
pycryptodome 3.17 pypi_0 pypi
pydantic 1.10.6 pypi_0 pypi
pymysql 1.0.2 pypi_0 pypi
pyparsing 3.0.9 pypi_0 pypi
pythainlp 3.1.1 pypi_0 pypi
python 3.7.16 h7a1cb2a_0
python-crfsuite 0.9.9 pypi_0 pypi
python-dateutil 2.8.2 pypi_0 pypi
pytz 2022.7.1 pypi_0 pypi
pyvi 0.1.1 pypi_0 pypi
pywavelets 1.4.0 pypi_0 pypi
pyyaml 6.0 pypi_0 pypi
qudida 0.0.4 pypi_0 pypi
readline 8.2 h5eee18b_0
regex 2022.10.31 pypi_0 pypi
requests 2.28.2 pypi_0 pypi
requests-oauthlib 1.3.1 pypi_0 pypi
responses 0.18.0 pypi_0 pypi
rouge 1.0.1 pypi_0 pypi
rsa 4.9 pypi_0 pypi
s3transfer 0.6.0 pypi_0 pypi
sacremoses 0.0.53 pypi_0 pypi
scikit-image 0.19.3 pypi_0 pypi
scikit-learn 1.0.2 pypi_0 pypi
scipy 1.7.3 pypi_0 pypi
sentencepiece 0.1.97 pypi_0 pypi
seqeval 1.2.2 pypi_0 pypi
setuptools 59.8.0 pypi_0 pypi
simplejson 3.18.3 pypi_0 pypi
six 1.16.0 pypi_0 pypi
sklearn-crfsuite 0.3.6 pypi_0 pypi
smart-open 6.3.0 pypi_0 pypi
sortedcontainers 2.4.0 pypi_0 pypi
soupsieve 2.4 pypi_0 pypi
spacy 3.5.1 pypi_0 pypi
spacy-legacy 3.0.12 pypi_0 pypi
spacy-loggers 1.0.4 pypi_0 pypi
sqlite 3.40.1 h5082296_0
srsly 2.4.6 pypi_0 pypi
subword-nmt 0.3.8 pypi_0 pypi
tabulate 0.9.0 pypi_0 pypi
tensorboard 2.11.2 pypi_0 pypi
tensorboard-data-server 0.6.1 pypi_0 pypi
tensorboard-plugin-wit 1.8.1 pypi_0 pypi
termcolor 2.2.0 pypi_0 pypi
terminaltables 3.1.10 pypi_0 pypi
thinc 8.1.9 pypi_0 pypi
threadpoolctl 3.1.0 pypi_0 pypi
tifffile 2021.11.2 pypi_0 pypi
tk 8.6.12 h1ccaba5_0
tokenizers 0.13.2 pypi_0 pypi
torch 1.12.1+cu113 pypi_0 pypi
torchvision 0.14.1 pypi_0 pypi
tornado 6.2 pypi_0 pypi
tqdm 4.65.0 pypi_0 pypi
transformers 4.26.1 pypi_0 pypi
typer 0.7.0 pypi_0 pypi
typing-extensions 4.4.0 pypi_0 pypi
urllib3 1.26.15 pypi_0 pypi
wasabi 1.1.1 pypi_0 pypi
wcwidth 0.2.6 pypi_0 pypi
werkzeug 2.2.3 pypi_0 pypi
wheel 0.38.4 py37h06a4308_0
xxhash 3.2.0 pypi_0 pypi
xz 5.2.10 h5eee18b_1
yapf 0.32.0 pypi_0 pypi
yarl 1.8.2 pypi_0 pypi
zhconv 1.4.3 pypi_0 pypi
zipp 3.15.0 pypi_0 pypi
zlib 1.2.13 h5eee18b_0
Describe the solution you'd like.
The last line of the traceback corresponds to this line in the dependency:
it is the code that checks the training set data type (conll).
The training YAML is as follows:

experiment:
  exp_dir: NER/                     # root directory for all experiments
  exp_name: koubei                  # experiment name for this config
  seed: 42                          # random seed

dataset:
  data_file:                        # data files
    train: '/data/ningfeim/test/NER/koubei/datas/train.txt'
    dev: '/data/ningfeim/test/NER/koubei/datas/dev.txt'
    test: '/data/ningfeim/test/NER/koubei/datas/test.txt'
  data_type: conll                  # data format

task: named-entity-recognition      # task name, used to load the built-in DatasetBuilder (if needed)

preprocessor:
  type: sequence-labeling-preprocessor  # preprocessor name
  model_dir: /data/ningfeim/project/Bert-ner-yiwei/bert  # huggingface/modelscope model name or path used to initialize the tokenizer; optional
  max_length: 512                   # maximum input length supported by the pretrained model

data_collator: SequenceLabelingDataCollatorWithPadding  # data_collator used to build batches

model:
  type: sequence-labeling-model     # model name
  embedder:
    model_name_or_path: /data/ningfeim/test/NER/koubei/nlp_raner_named-entity-recognition_chinese-base-news  # pretrained model name or path; a huggingface/modelscope backbone, or a task model from modelscope
  dropout: 0.1                      # dropout probability
  use_crf: true                     # whether to use a CRF

train:
  hooks:
    - type: TensorboardHook
  max_epochs: 5                     # maximum number of training epochs
  dataloader:
    batch_size_per_gpu: 8           # training batch size
  optimizer:
    type: AdamW                     # pytorch optimizer name
    lr: 5.0e-5                      # global learning rate
    param_groups:
      - regex: crf
        lr: 5.0e-1
  lr_scheduler:
    type: LinearLR                  # transformers or pytorch lr_scheduler name
    start_factor: 1.0
    end_factor: 0.0
    total_iters: 20

evaluation:
  dataloader:
    batch_size_per_gpu: 16          # evaluation batch size
  metrics:
    - type: ner-metric              # see the Metrics class in adaseq/metainfo.py for all implemented metrics
    - type: ner-dumper              # dump prediction results
      model_type: sequence_labeling
      dump_format: column
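For reference, data_type: conll generally refers to the column format: one token and its label per line, separated by whitespace, with a blank line between sentences. A minimal sketch of what train.txt might look like (the tokens and labels below are only illustrative, not taken from the actual dataset):

口 B-PRD
碑 I-PRD
不 O
错 O

服 O
务 O
很 O
好 O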
Describe alternatives you've considered.
No response
Additional context.
No response
Code of Conduct
- [X] I agree to follow this project's Code of Conduct
Could you tell me what this problem is? If there is a relevant discussion community, please share a link as well.
The config file above cannot be copied as-is: dump_format only supports two output formats, conll and jsonline, but the config uses column.
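Based on that, changing dump_format to one of the supported values should avoid the NotImplementedError. A minimal sketch of the adjusted metrics block, with everything else in the config unchanged:

metrics:
  - type: ner-metric
  - type: ner-dumper
    model_type: sequence_labeling
    dump_format: conll              # supported values are conll and jsonline; column is not implemented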