wolf-li

6 issues by wolf-li

Suggestion: use this together with GitHub Actions, so there is no need to find a separate VPS for deployment.

Once the model file is loaded, using the pipeline is slower than using the Python Hugging Face transformers library's generate function in a CPU environment.
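
For reference, a minimal sketch of the Python-side baseline the comparison refers to, timing `generate` on CPU. The checkpoint name `Helsinki-NLP/opus-mt-en-zh` is an assumption for illustration; the issue does not state which model was benchmarked.

```python
# Minimal CPU timing sketch for the transformers `generate` baseline.
# The checkpoint name below is an assumption, not from the issue.
import time

import torch
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-zh"  # assumed checkpoint
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)
model.eval()

batch = tokenizer(
    ["The quick brown fox jumps over the lazy dog."],
    return_tensors="pt",
    padding=True,
)

start = time.perf_counter()
with torch.no_grad():
    generated = model.generate(**batch)  # runs on CPU by default
elapsed = time.perf_counter() - start

print(tokenizer.batch_decode(generated, skip_special_tokens=True))
print(f"generation time: {elapsed:.3f}s")
```

Timing only the `generate` call (after the model is already loaded) keeps the comparison focused on inference speed rather than model-loading overhead, which is what the issue describes.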

Fine-tuned a Marian model using the Hugging Face transformers library, then converted the model to the rust_model.ot format. The translation pipeline can't load it. This is the error message: >Tch tensor error: cannot find the tensor named model.decoder.embed_positions.weight...
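
Since the error reports a missing tensor, one hedged way to narrow it down is to check whether the fine-tuned checkpoint saved by transformers actually contains `model.decoder.embed_positions.weight` before converting it to rust_model.ot. One possible cause (an assumption here, not confirmed in the issue) is that some transformers versions do not write out Marian's sinusoidal position embeddings at save time. The directory name below is a placeholder.

```python
# Hedged diagnostic sketch: list the tensors in the fine-tuned Marian
# checkpoint and check whether the position-embedding weights the loader
# asks for were actually saved. "finetuned-marian/" is a hypothetical
# output directory from save_pretrained().
import torch

state_dict = torch.load("finetuned-marian/pytorch_model.bin", map_location="cpu")
# If the checkpoint was saved as model.safetensors instead, load it with
# safetensors.torch.load_file().

wanted = [
    "model.encoder.embed_positions.weight",
    "model.decoder.embed_positions.weight",
]
for key in wanted:
    print(key, "present" if key in state_dict else "MISSING")

# If these keys are missing, the position embeddings were dropped at save
# time and would have to be re-added to the state dict before running the
# .ot conversion.
```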

Starting with version 1.9.0, nginx added a new stream module for Layer 4 (TCP/UDP) forwarding, proxying, and load balancing.
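
For illustration, a minimal `stream` block that proxies and load-balances a TCP service at Layer 4; the addresses and ports are placeholders, not taken from the issue.

```nginx
# Sketch of the stream module: Layer-4 TCP load balancing.
# Addresses and ports are placeholders for illustration.
stream {
    upstream backend_tcp {
        server 192.168.1.10:3306;
        server 192.168.1.11:3306;
    }

    server {
        listen 3306;
        proxy_pass backend_tcp;
    }
}
```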

Hello author: I would like to ask about the mt5 few-shot training results shown in README.md ![image](https://github.com/bojone/t5_in_bert4keras/assets/59823739/94cb02a1-09e2-4cbf-a768-5ce050c2b5a3). How did you train them? Which mt5 model exactly did you use (small, base, large, xl, xxl)? Could you provide the specific training parameters and the detailed procedure?

Hi, I tried the script with the client command: python netcat.py -t 10.10.100.38 -p 5002. There is no response when sending Ctrl+D (EOF). Your code is not right. Because it's fine...
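
For context, a minimal sketch of what such a netcat-style client loop typically looks like: it reads stdin until EOF (Ctrl+D), sends the buffer, then prints the response. This is a generic sketch, not the author's netcat.py; the target address and port are taken from the command above only as an example.

```python
# Generic netcat-style TCP client sketch (not the netcat.py from the issue):
# read stdin until EOF (Ctrl+D), send it, then print whatever comes back.
import socket
import sys


def client(target: str, port: int) -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.connect((target, port))
        # sys.stdin.read() returns only after EOF (Ctrl+D on an empty line).
        buffer = sys.stdin.read()
        if buffer:
            sock.send(buffer.encode())
        # Read the response until the server stops sending.
        response = b""
        while True:
            data = sock.recv(4096)
            if not data:
                break
            response += data
            if len(data) < 4096:
                break
        print(response.decode(errors="replace"))


if __name__ == "__main__":
    client("10.10.100.38", 5002)
```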