Chenhui Zhang
Duplicate of [#556](https://github.com/THUDM/ChatGLM-6B/issues/556)
Duplicate of [#890](https://github.com/THUDM/ChatGLM-6B/issues/890)
Duplicate of [#3](https://github.com/THUDM/ChatGLM-6B/issues/3)
Duplicate of [#119](https://github.com/THUDM/ChatGLM-6B/issues/119)
Duplicate of [#3](https://github.com/THUDM/ChatGLM-6B/issues/3)
Duplicate of [#712](https://github.com/THUDM/ChatGLM-6B/issues/712)
Duplicate of [#517](https://github.com/THUDM/ChatGLM-6B/issues/517)
There is currently no API for directly obtaining embeddings. For now, you can obtain the hidden-layer representations by setting `output_hidden_states=True`; see the following code:

```python
from typing import Optional, Tuple

import torch
from transformers import PreTrainedModel, PreTrainedTokenizer


def get_hidden_states(
    text: str,
    model: PreTrainedModel,
    tokenizer: PreTrainedTokenizer,
) -> Optional[Tuple[torch.Tensor]]:
    """Return the per-layer hidden states for a single input text."""
    model = model.eval()
    inputs = tokenizer([text], return_tensors='pt').to(model.device)
    with torch.no_grad():
        out = model(**inputs, output_hidden_states=True)
    return out.hidden_states
```
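For illustration only (this is a minimal sketch, not an official embedding API), one common way to turn these hidden states into a single sentence vector is to mean-pool the last layer over the token dimension. The model loading follows the README; the pooling axis is an assumption, so check the tensor shape for your model first:

```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()

hidden_states = get_hidden_states("你好，ChatGLM", model, tokenizer)
last_layer = hidden_states[-1]  # final transformer layer

# Mean-pool over the token dimension to get one vector per input text.
# The layout is assumed here to be (batch, seq_len, hidden); if the model
# returns (seq_len, batch, hidden) instead, pool over dim=0.
sentence_embedding = last_layer.mean(dim=1)
print(sentence_embedding.shape)
```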
Can you please try the code mentioned in the [Multi-GPU Deployment](https://github.com/THUDM/ChatGLM-6B/blob/main/README_en.md#multi-gpu-deployment) section?

```python
from utils import load_model_on_gpus

model = load_model_on_gpus("THUDM/chatglm-6b", num_gpus=2)
```
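For completeness, a short usage sketch after loading across two GPUs, assuming the tokenizer is loaded as in the README (the prompt text is just an example):

```python
from transformers import AutoTokenizer
from utils import load_model_on_gpus

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = load_model_on_gpus("THUDM/chatglm-6b", num_gpus=2)
model = model.eval()

# Inference works the same as with single-GPU deployment.
response, history = model.chat(tokenizer, "你好", history=[])
print(response)
```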
Duplicate of [#17](https://github.com/THUDM/ChatGLM-6B/issues/17)