
Error when trying to use Petals.

Open · d13g4 opened this issue 1 year ago · 1 comment

Expected Behavior

Text generation

Current Behavior

No generation

Steps to Reproduce

Using the (as of now) newest version of lollms from the git-repo.

  1. Binding: Petals (New)
  2. Model: petals-team/StableBeluga2
  3. Say "Hi" in the Chat

Context

Text generation requested by client: uApY5__1I2lZWiP5AAAD
Received message: hi!
Started generation task
Exception in thread Thread-11 (start_message_generation):
Traceback (most recent call last):
  File "/usr/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.10/threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "/home/diemo/lollms-webui/api/__init__.py", line 2040, in start_message_generation
    self.discussion_messages, self.current_message, tokens = self.prepare_query(client_id, message_id, is_continue, n_tokens=self.config.min_n_predict, generation_type=generation_type)
  File "/home/diemo/lollms-webui/api/__init__.py", line 1628, in prepare_query
    discussion_messages += self.model.detokenize(message_tokens)
  File "/home/diemo/lollms-webui/zoos/bindings_zoo/bs_petals/__init__.py", line 230, in detokenize
    t = torch.IntTensor([tokens_list])
NameError: name 'torch' is not defined
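The failure mode here is worth noting: a `NameError` on `torch` inside `detokenize` means the name was never bound in that module's scope, typically because `import torch` is missing or the import silently failed (e.g. torch was not installed in the binding's environment). The sketch below reproduces the mechanism with a hypothetical stand-in function (`detokenize_broken` is not the binding's real code); it shows that the error surfaces only when the function is called, not when the module loads.

```python
# Hypothetical stand-in for the binding's detokenize(): 'torch' is referenced
# but never imported, so Python raises NameError at call time, not at import
# time. The real fix is making sure `import torch` succeeds in the binding's
# module, i.e. that torch is actually installed in its environment.
def detokenize_broken(tokens_list):
    t = torch.IntTensor([tokens_list])  # NameError: 'torch' is not bound here
    return t

try:
    detokenize_broken([1, 2, 3])
except NameError as err:
    print(err)  # name 'torch' is not defined
```

This is why a bad or partial install of the binding produces a crash only on first generation: the module imports fine, and the missing dependency is hit only once a message triggers `detokenize`.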

d13g4 avatar Dec 23 '23 11:12 d13g4

Hi. If you are on Windows, you need to use WSL, and it should work there. If you are on Linux, this looks like the binding was not installed correctly; try installing it again and tell me if you get an error.

ParisNeo avatar Dec 26 '23 20:12 ParisNeo