Semihal

Results 12 comments of Semihal

@syssi hi! Can you open a pull request?

Build and install `rotary` and `layer_norm` from [flash-attn repository](https://github.com/Dao-AILab/flash-attention/tree/23e8fa5a263d1c7122bc46a86ef32030ee7130f9/csrc).

> > Build and install `rotary` and `layer_norm` from [flash-attn repository](https://github.com/Dao-AILab/flash-attention/tree/23e8fa5a263d1c7122bc46a86ef32030ee7130f9/csrc).
>
> hi @Semihal , can you give the command to build that?

Clone the flash-attention repository with the...
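A minimal sketch of the build steps (assuming a working CUDA toolchain and PyTorch are already installed; the exact commit to check out is not specified here):

```shell
# Clone the flash-attention repository (the comment above pins a specific commit)
git clone https://github.com/Dao-AILab/flash-attention.git
cd flash-attention

# Build and install the two extensions from the csrc directory
cd csrc/rotary && pip install .
cd ../layer_norm && pip install .
```

Each `pip install .` compiles the corresponding CUDA extension against the locally installed PyTorch, so the PyTorch and CUDA versions must match.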

Hi! I try to load the model and apply `.to(device)`, but I receive an exception: `RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and...
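This error usually means the model's weights and the input tensors live on different devices. A minimal sketch (using a stand-in `torch.nn.Linear` module, since the actual model is not shown) of the usual fix, which is moving the inputs to the same device as the model:

```python
import torch

# Pick cuda:0 when available, otherwise fall back to CPU
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(4, 2).to(device)  # stand-in for the loaded model
x = torch.randn(1, 4)                     # inputs start on the CPU

# Calling model(x) here would raise:
#   RuntimeError: Expected all tensors to be on the same device, ...
# because the weights are on `device` while `x` is on the CPU.

y = model(x.to(device))  # move the inputs to the model's device first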

Hi! Please tell me how to install LaTeX support. How do I integrate LaTeX?

It is a pity that there is no LaTeX support. Without it, the engine doesn't make sense for me :(

@MikeyMJCO , hi! @toomastamm suggested a good option, or maybe just enable LaTeX support. I would like to write formulas inside `$latex$` or `$$latex$$` (for centered) delimiters, and after...
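A minimal sketch of the kind of pre-processing such delimiter support could use (the function name, regexes, and CSS class names are hypothetical, not part of any existing engine): find `$$...$$` (centered/display) and `$...$` (inline) spans and wrap them for a client-side renderer such as KaTeX or MathJax.

```python
import re

# Display math first, so $$...$$ is not consumed as two inline spans
DISPLAY = re.compile(r"\$\$(.+?)\$\$", re.DOTALL)
# Inline math: a single $ not adjacent to another $
INLINE = re.compile(r"(?<!\$)\$(?!\$)(.+?)(?<!\$)\$(?!\$)", re.DOTALL)

def wrap_math(text: str) -> str:
    """Wrap $...$ and $$...$$ spans in tags a math renderer can target."""
    text = DISPLAY.sub(r'<span class="math display">\1</span>', text)
    text = INLINE.sub(r'<span class="math inline">\1</span>', text)
    return text

print(wrap_math("Euler: $e^{i\\pi}+1=0$ and $$\\int_0^1 x\\,dx$$"))
```

The wrapped spans would then be rendered in the browser; the server-side step only marks where the formulas are.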

@ssddanbrown, hi! The workaround works, but otherwise it is inconvenient (when there are a lot of formulas).

This, unfortunately, does not work quite correctly :( ![test](https://user-images.githubusercontent.com/21113432/76327383-fc274700-62fa-11ea-84f6-303934162111.png)