TurboTransformers

A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT-2, decoders, etc.) on CPU and GPU.

51 TurboTransformers issues

Hi, thank you for your awesome work! I would like to know if there is any plan to support vision-based Transformers. As transformers are becoming popular in vision tasks, I...

It seems that the dependencies are not that many: 1. ninja (apt install ninja-build) 2. Intel MKL (pip install mkl), which is actually not needed if using GPU inference only,...

I want to run BERT on GPU with C++, and the default device is 0. Is there any way to select another device with the C++ API?

enhancement

Hello author: I'd like to ask whether there is any loss of precision in the output when using TurboTransformers for BERT model inference?

Hi, I am using your library with Roberta for sequence classification; the problem arises when I use the lib with a newer transformers (3.4.0). ``` python from transformers import AutoModelForSequenceClassification import turbo_transformers...

Thanks to the author for open-sourcing this. Two questions about the docker build: 1. The command `nvidia-docker run --gpus all` requires Docker engine version >= 19.03; otherwise it fails with `unknown flag: --gpus`. For docker < 19.03, drop `--gpus all` and just run `nvidia-docker run`. 2. After the docker image is built, you still have to enter the container and build, and that build pulls dependencies again; this is very inconvenient in a network-isolated environment. Personally I think...
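The version split described in the issue above can be sketched as two invocation forms (a hedged sketch of standard docker/nvidia-docker usage, not commands from this repository; `<image>` is a placeholder for the built image name):

```shell
# Docker engine >= 19.03: the --gpus flag is supported natively
docker run --gpus all <image>

# Docker engine < 19.03: use the nvidia-docker wrapper instead, which
# injects the NVIDIA runtime without needing the --gpus flag
nvidia-docker run <image>
```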

``` turbo_transformers.set_num_threads(?) ``` Whatever number I put in there, it always uses 4 threads.
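One common cause of this kind of behaviour, when the backend is built against OpenMP/MKL, is that `OMP_NUM_THREADS` is read once when the native library loads, so it must be set before the first import. This is a hedged sketch of that ordering, with the `turbo_transformers` calls commented out and shown only for context:

```python
import os

# Assumption: the backend honours OpenMP settings. The variable must be
# exported before the native library is first imported, or it is ignored.
os.environ["OMP_NUM_THREADS"] = "8"

# Import only after the variable is set (commented out here):
# import turbo_transformers
# turbo_transformers.set_num_threads(8)

print(os.environ["OMP_NUM_THREADS"])  # → 8
```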

I tried to install turbotransformers into my own docker image. There is no conda in my docker image; I installed the packages directly using apt-get and pip. However, when executing...

When I run "build_docker_gpu.sh", I get a permission denied error. The reason is that it is a shared server and I don't have root privileges. What I want to...