
CUDA ERROR

Open ifromeast opened this issue 3 years ago • 1 comment

CUDA version: 11.4
NVIDIA-SMI: 470.103.01
Driver Version: 470.103.01
lightseq version: 2.2.1

When I run python3 test/ls_bart.py, I get the following error:

***generator config***
beam size: 4
extra decode length(max decode length - src input length): 50
length penalty: 1
diverse lambda: 0
sampling method: beam_search
topk: 1
topp: 0.75
Allocated 1307MB GPU buffer for transformer
decoder buffer init start
decoder buffer init succeed
creating huggingface model...
====================START warmup====================
=========lightseq=========
lightseq generating...
Traceback (most recent call last):
  File "test/ls_bart.py", line 103, in <module>
    main()
  File "test/ls_bart.py", line 84, in main
    warmup(tokenizer, ls_model, hf_model, sentences)
  File "test/ls_bart.py", line 54, in warmup
    ls_generate(ls_model, tokenizer, inputs_id)
  File "test/ls_bart.py", line 30, in ls_generate
    ls_res_ids, ls_time = ls_bart(model, inputs_id)
  File "test/ls_bart.py", line 12, in ls_bart
    generated_ids = model.infer(inputs)
RuntimeError: [CUDA][ERROR] /tmp/build-via-sdist-t8nsnqiz/lightseq-2.2.1/lightseq/inference/model/encoder.cc.cu(168): CUBLAS_STATUS_NOT_SUPPORTED

How can I solve this problem?
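A minimal sketch that might help narrow this down: calling the lightseq engine directly on a dummy batch, so the failure can be separated from the tokenization and HuggingFace comparison code in the test script. The lsi.Transformer constructor arguments and the weight file name below are assumptions based on the usual lightseq examples; only model.infer() is confirmed by the traceback.

```python
# Isolation sketch (assumptions: lightseq.inference.Transformer and the exported
# weight file name; only model.infer() is confirmed by the traceback above).
import numpy as np
import lightseq.inference as lsi

# Hypothetical path to the exported BART weights that test/ls_bart.py loads.
model = lsi.Transformer("lightseq_bart_base.hdf5", 8)

# Tiny dummy batch of BART token ids, shape (batch, seq_len): <s> ... </s>.
dummy_ids = np.array([[0, 713, 16, 10, 1296, 2]], dtype=np.int32)

# If this call alone raises CUBLAS_STATUS_NOT_SUPPORTED, the problem is in the
# inference engine / CUDA setup rather than in the surrounding benchmark code.
print(model.infer(dummy_ids))
```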

ifromeast · Aug 04 '22 03:08

You may try downgrading the driver version or switching to a different CUDA version.
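Before reinstalling anything, it may be worth confirming which CUDA runtime the installed wheels were built against versus what the 470.103.01 driver supports. A minimal sketch, assuming PyTorch is available (test/ls_bart.py already uses it for the HuggingFace comparison):

```python
# Quick environment check (assumes PyTorch is installed and nvidia-smi is on PATH).
import subprocess

import torch

# CUDA runtime version the installed PyTorch build was compiled against.
print("torch CUDA runtime:", torch.version.cuda)

# Highest CUDA version the driver reports, taken from the nvidia-smi header line.
smi = subprocess.run(["nvidia-smi"], capture_output=True, text=True).stdout
print([line for line in smi.splitlines() if "CUDA Version" in line])

# GPU model and compute capability; CUBLAS_STATUS_NOT_SUPPORTED can also mean the
# requested GEMM configuration is not supported on this architecture.
print(torch.cuda.get_device_name(0), torch.cuda.get_device_capability(0))
```

If the runtime version is newer than what the driver reports, downgrading the CUDA toolkit (or upgrading the driver) is usually the first thing to try.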

zjersey · Aug 11 '22 02:08