NLP

Search results: 10 issues for "NLP"

**Describe the bug** Running the following command:

```
sparseml.transformers.token_classification \
  --output_dir models/teacher \
  --model_name_or_path zoo:nlp/masked_language_modeling/bert-base/pytorch/huggingface/wikipedia_bookcorpus/base-none \
  --recipe zoo:nlp/masked_language_modeling/bert-base/pytorch/huggingface/wikipedia_bookcorpus/base-none?recipe_type=transfer-token_classification \
  --recipe_args '{"init_lr":0.00003}' \
  --dataset_name conll2003 \
  --per_device_train_batch_size 32 \
  --per_device_eval_batch_size 32 \
  --preprocessing_num_workers 6 \
  ...
```

bug

When I run `p = Pipeline('auto')`:

```
>>> from trankit import Pipeline
2022-05-31 18:01:41.938559: I tensorflow/core/util/util.cc:169] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off...
```

Running `sh run_ner_msra.sh` in the roberta_wwm_ext directory fails: after `result = estimator.predict(input_fn=predict_input_fn)` executes, iterating over `result` raises an error. Error output: INFO:tensorflow:prediction_loop marked as finished I0625 11:05:48.747602 140735901344640 error_handling.py:101] prediction_loop marked as finished

When I run onnx_transformers\notebooks\benchmark_pipelines.ipynb, I get the error below: onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Unexpected input data type. Only when I use `nlp_torch = pipeline("feature-extraction", onnx=False)` do I get the inference-time figure.

Can I directly obtain the output word vectors from the first 4 or first 8 layers (or a vector for the input sentence)?
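The question above usually comes down to indexing the per-layer hidden states that BERT-style models can return. Below is a minimal, dependency-free sketch of that indexing only, with plain Python lists standing in for tensors; the assumed layout (index 0 = embedding layer, indices 1..num_layers = transformer layers, as with `output_hidden_states=True` in common BERT implementations) is not taken from the issue itself:

```python
# Stand-in for hidden_states: a sequence of (1 + num_layers) per-token
# vectors, where index 0 is the embedding-layer output. Here each "layer"
# is a fake seq_len-long vector filled with its own layer index.
num_layers = 12
seq_len = 5
hidden_states = [[float(layer)] * seq_len for layer in range(num_layers + 1)]

# First 4 transformer layers: skip index 0 (the embedding output).
first_4 = hidden_states[1:5]

# Average them per token position to get one vector per token.
avg_per_token = [sum(vals) / len(first_4) for vals in zip(*first_4)]
print(avg_per_token)  # (1+2+3+4)/4 = 2.5 at every position

# A simple sentence vector: mean-pool the averaged token values.
sentence_vec = sum(avg_per_token) / seq_len
print(sentence_vec)  # 2.5
```

For the first 8 layers, slice `hidden_states[1:9]` instead; with real tensors the per-position averaging would be a single stacked-mean operation.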

When asking a question, please provide as much of the following information as possible:

### Basic information
- OS: **ubuntu**
- Python version: **Python 3.6**
- TensorFlow version: **Tensorflow-gpu-1.14.0**
- Keras version: **Keras-2.3.1**
- **bert4keras** version:
- Using pure **tf.keras**:
- Pretrained model loaded: **trained with pretraining.py**

### Core code
```
#coding:utf-8
"""
Test MLM
"""
import numpy as np
from...
```

How do I use this? Thank you!

### Issue you'd like to raise. 2024-06-24 20:05:22,709 WARNING Failed to batch ingest runs: LangSmithConnectionError("Connection error caused failure to POST https://api.smith.langchain.com/runs/batch in LangSmith API. Please confirm your internet connection.. ConnectionError(ProtocolError('Connection...
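The warning above reports a transient connection failure during batch run ingestion. A common general-purpose mitigation for this class of error is retry with exponential backoff; the helper below is a hypothetical, dependency-free sketch of that pattern (it is not part of the LangSmith API, and `flaky_post` merely simulates an intermittently failing POST):

```python
import time

def retry_with_backoff(fn, retries=4, base_delay=0.5, retriable=(ConnectionError,)):
    """Call fn(), retrying on transient errors with exponential backoff."""
    for attempt in range(retries):
        try:
            return fn()
        except retriable:
            if attempt == retries - 1:
                raise  # out of attempts: surface the last error
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...

# Demo: a flaky call that fails twice, then succeeds.
calls = {"n": 0}
def flaky_post():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("simulated POST failure")
    return "ok"

print(retry_with_backoff(flaky_post, base_delay=0.01))  # "ok" after two retries
```

Note that this only helps with intermittent network drops; a persistent failure (DNS, proxy, firewall) will still raise after the final attempt.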