Koichi Yasuoka

Results: 19 comments by Koichi Yasuoka

Yes, yes @tiberiu44, it seems to give much better results, except for "松". But I could not download the improved model after I cleaned up `~/.nlpcube/3.0/lzh`. Well, has the new model been...

I've released https://huggingface.co/KoichiYasuoka/roberta-classical-chinese-large-sentence-segmentation for sentence segmentation of classical Chinese. You can use it with `transformers>=4.1`:

```py
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("KoichiYasuoka/roberta-classical-chinese-large-sentence-segmentation")
model = AutoModelForTokenClassification.from_pretrained("KoichiYasuoka/roberta-classical-chinese-large-sentence-segmentation")
s = "天平二年正月十三日萃于帥老之宅申宴會也于時初春令月氣淑風和梅披鏡前之粉蘭薰珮後之香加以曙嶺移雲松掛羅而傾盖夕岫結霧鳥封縠而迷林庭舞新蝶空歸故鴈於是盖天坐地促膝飛觴忘言一室之裏開衿煙霞之外淡然自放快然自足若非翰苑何以攄情詩紀落梅之篇古今夫何異矣宜賦園梅聊成短詠"
p = [model.config.id2label[q] for q in torch.argmax(model(tokenizer.encode(s, return_tensors="pt"))[0], dim=2)[0].tolist()[1:-1]]
```
...
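The label list `p` above lines up one-to-one with the characters of `s`, so segmenting comes down to inserting a break after each character whose predicted label marks a sentence end. A minimal sketch of that post-processing step (the end-of-sentence label names `"E"` and `"S"` are assumptions for illustration, not confirmed from the model's `id2label` config):

```py
def segment(chars, labels, end_labels=("E", "S")):
    """Join characters, starting a new segment after each character
    whose label is in end_labels ("E"/"S" are hypothetical names)."""
    out = []
    for c, lab in zip(chars, labels):
        out.append(c)
        if lab in end_labels:
            out.append("\n")
    return "".join(out).rstrip("\n")

# hypothetical labels for a 4-character string: two 2-character sentences
print(segment(list("春眠暁覚"), ["M", "E", "M", "E"]))  # → 春眠 / 暁覚 on two lines
```

With the real model one would call `segment(list(s), p)` after checking which label values the checkpoint actually uses.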

The models are distributed under the Apache License 2.0, so you can use them (almost) freely, except for the trademarks.

```py
!test -d transformers-4.19.2 || git clone -b v4.19.2 --depth=1 https://github.com/huggingface/transformers transformers-4.19.2
!test -d JGLUE || ( git clone --depth=1 https://github.com/yahoojapan/JGLUE && cat JGLUE/fine-tuning/patch/transformers-4.9.2_jglue-1.0.0.patch | ( cd transformers-4.19.2 && patch...
```

Thank you @tomohideshibata -san for confirming `transformers` v4.19.2. Here I realize that I need to replace [SEP] with another model's `sep_token` when I evaluate a model whose `sep_token` is not [SEP]....
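A minimal sketch of that substitution, assuming the dataset text contains the literal string `[SEP]` between sentence pairs (the function name and the example token `"</s>"` are illustrative assumptions, not taken from the JGLUE code):

```py
def replace_sep(text, sep_token):
    """Swap the literal [SEP] marker for the evaluated model's own
    sep_token, e.g. "</s>" for RoBERTa-style tokenizers (assumed format)."""
    return text.replace("[SEP]", sep_token)

# with a real tokenizer one would pass tokenizer.sep_token here
print(replace_sep("premise[SEP]hypothesis", "</s>"))  # → premise</s>hypothesis
```

In practice the replacement token would come from `tokenizer.sep_token` of the model being evaluated.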

Thank you @tomohideshibata -san for the information about [SEP]. Well, I've just made a tentative https://github.com/KoichiYasuoka/JGLUE/blob/main/fine-tuning/patch/transformers-4.19.2_jglue-1.0.0.patch for `transformers` v4.19.2, where I included `jsquad_metrics.py` instead of changing the original `squad_metrics.py`. But I couldn't...

I'm looking forward to...

Well @James-G-Hill, it seems a great idea, but I'm unsure whether it is really applicable with the default `port=5000`. Umm...

Hi @kaisugi -san, I needed some kind of conversion for `run_qa.py`. My tentative script on Google Colaboratory is below:

```py
!test -d transformers-4.19.2 || git clone -b v4.19.2 --depth=1 https://github.com/huggingface/transformers transformers-4.19.2...
```