Xie Zejian
## Deepspeed Engine not freeing GPU memory after moving to CPU

### Description

I am trying to move the entire Deepspeed engine to the CPU. As observed, the Deepspeed engine...
Thanks for creating this amazing plugin. I found that for some kinds of docstrings, lsp_signature raises an error:

```
Error executing vim.schedule lua callback: /usr/share/nvim/runtime/lua/vim/lsp/util.lua:1402: attempt to index...
```
Pressing `e` on a file in lazygit opens Neovim inside lazygit, instead of opening the file in the original Neovim the way ranger does.
I tried the example mentioned in the README, but the behavior does not seem right. I am using ConfirmBehavior.Replace and the default config for copilot-cmp.
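For reference, a minimal sketch of what I mean by "ConfirmBehavior.Replace with the default copilot-cmp config"; the `<CR>` mapping and the source list here are assumptions, not necessarily my exact setup:

```lua
-- Minimal sketch (assumed mapping/sources): copilot-cmp registered as an
-- nvim-cmp source, confirming completions with ConfirmBehavior.Replace.
local cmp = require("cmp")

require("copilot_cmp").setup() -- copilot-cmp with its default options

cmp.setup({
  sources = cmp.config.sources({
    { name = "copilot" },
    { name = "nvim_lsp" },
  }),
  mapping = {
    -- Replace the text after the cursor when confirming, as in the README example
    ["<CR>"] = cmp.mapping.confirm({
      behavior = cmp.ConfirmBehavior.Replace,
      select = false,
    }),
  },
})
```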
**Describe the bug** After modifying the file system on the host, the Windows guest has to refresh to see the change. **To Reproduce** Steps to reproduce the behaviour: write something on the host. **Expected behavior**...
If I set highlight overrides in the config, say

```lua
highlight_overrides = {
  all = function(cp)
    return {
      PmenuSel = { bg = cp.green, fg = cp.base },
    }
  end,
},
```
...
Sometimes I can find `sqlite3.operationalerror` in lsp.log, so I guess it's related to concurrent access. In most cases it just doesn't work.
For a large HTML file converted from a PDF, Jiffy Reader works but is very slow. Can I do the bionic-reading conversion "offline" so I don't have to redo it every time I open the HTML file?
I cannot install this because torch 1.4.0 no longer exists on PyPI. Have you tested it with newer torch, say torch 1.13 or even torch 2.0?
I took a look and it seems this URL no longer downloads? https://dldir1.qq.com/weixin/Window/WeChatSetup.ex