notebooks
(TF)DistilBertForTokenClassification requires the 🤗 Datasets library
The following assignment statement shadows the `datasets` library import, creating a name collision.
datasets = load_dataset("conll2003")
...
tokenized_datasets = datasets.map(tokenize_and_align_labels, batched=True)
...
train_set = model.prepare_tf_dataset(
    tokenized_datasets["train"],
    shuffle=True,
    batch_size=batch_size,
    collate_fn=data_collator,
)
"TFDistilBertForTokenClassification requires the 🤗 Datasets library"
Fixed simply by renaming the variable:
conll = load_dataset("conll2003")
...
tokenized_datasets = conll.map(tokenize_and_align_labels, batched=True)
Affected notebooks:
- notebooks/examples/token_classification-tf.ipynb
- notebooks/examples/token_classification.ipynb