Nyx1197

12 comments of Nyx1197

> > > Hello, this issue has been fixed in the newly released version 10.2.0. Thanks again for the feedback, and enjoy! If you run into any other issues in the future, feel free to report them~ > > Release: https://github.com/LuRenJiasWorld/WP-Editor.md/releases/tag/v10.2.0 Hello, I updated to the newly released 10.2.0, but the problem still seems to exist. I am pasting the details below: Problem page URL: [https://www.wltechlab.top:8998/archives/431](https://www.wltechlab.top:8998/archives/431) The code in the post is wrapped in Markdown syntax (`*3 javascript). jjencode can encrypt JS, turning it into a string of kaomoji: ```JavaScript function jjencode( gv, text ) { var r=""; var n; var...

> > > > > Hello, this issue has been fixed in the newly released version 10.2.0. Thanks again for the feedback, and enjoy! If you run into any other issues in the future, feel free to report them~ > > > Release: https://github.com/LuRenJiasWorld/WP-Editor.md/releases/tag/v10.2.0 > > > > > > Hello, I updated to the newly released 10.2.0, but the problem still seems to exist. I am pasting the details below: > > Problem page URL: https://www.wltechlab.top:8998/archives/431 > > The code in the post is wrapped in Markdown syntax (`*3 javascript)...

> Sorry for the inconvenience. Thank you for your continued updates~ Looking forward to your 10.3.0 release.

> > > > > Sorry for the inconvenience > > > > > > Thank you for your continued updates~ > > Looking forward to your 10.3.0 release > > Hello, this issue has been fixed in [b77c3f8](https://github.com/LuRenJiasWorld/WP-Editor.md/commit/b77c3f81b2420850f631105308bd72eec208a144). Since the 10.3.0 release will still take some time, if you need the fix right away you can use WordPress's plugin editor to apply the relevant changes to your site, which will resolve the issue immediately. > > ![image](https://user-images.githubusercontent.com/29622423/93786971-806f6500-fc62-11ea-84d8-d8fb32590dee.png) Thank you very much, the article now displays correctly.

> For all of you who are running older hardware: ollama-rocm 0.12.2 still works on AMD gfx906. 0.12.3 is also supported, but 0.12.4 and later...
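Side note (not from the thread): for anyone unsure whether their card is a gfx906 part, a quick way to check which gfx target the ROCm runtime reports, assuming the ROCm userspace tools are installed, is:

```shell
# Hedged sketch: list the gfx target(s) the ROCm runtime sees.
# gfx906 corresponds to Vega 20 cards such as the Radeon VII and Instinct MI50/MI60.
rocminfo | grep -i "gfx"
```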

> Vulkan ([#11835](https://github.com/ollama/ollama/pull/11835)) will restore support. Will the upcoming Vulkan version be a standalone release, similar to ollama-rocm?

> [@Nyx1197](https://github.com/Nyx1197) for the Windows installer it will be bundled in once it's ready. On Linux, we may adjust the bundles to make it easier for users to pick and...

> In 0.12.11 Vulkan is now included in the official binaries, but still experimental. To enable, set OLLAMA_VULKAN=1 for the server. https://github.com/ollama/ollama/blob/main/docs/faq.mdx#how-do-i-configure-ollama-server and https://github.com/ollama/ollama/blob/main/docs/docker.mdx#vulkan-support Thank you for everything you've done;...
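As a minimal sketch of that setting, assuming a native install where the server is started by hand (the linked FAQ and Docker docs cover systemd and container setups):

```shell
# Experimental in 0.12.11+: enable the Vulkan backend for this server process only.
OLLAMA_VULKAN=1 ollama serve
```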

> > > In 0.12.11 Vulkan is now included in the official binaries, but still experimental. To enable, set OLLAMA_VULKAN=1 for the server. https://github.com/ollama/ollama/blob/main/docs/faq.mdx#how-do-i-configure-ollama-server and https://github.com/ollama/ollama/blob/main/docs/docker.mdx#vulkan-support > > > >...
