Zibri

201 comments by Zibri

Alternatively, I would like to have the same thing you have at https://demo.tabbyml.com/playground, but locally.

Also, make it start only after the user presses a button, because the newest Chrome does not allow an AudioContext to be started without user interaction.
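A minimal sketch of that gesture requirement (assuming a browser environment; the helper name and button element are placeholders, not code from the project). The idea is to create or resume the context only inside a click handler:

```javascript
// Sketch: resume an AudioContext only after a user gesture.
// `ctx` is any object with a `state` field and a `resume()` method
// (a real AudioContext in the browser, a mock elsewhere).
function startAudioOnGesture(ctx, button) {
  button.addEventListener('click', () => {
    // Chrome leaves the context in the 'suspended' state when it was
    // created without user interaction; resume() lifts that restriction.
    if (ctx.state === 'suspended') {
      ctx.resume();
    }
  });
}

// In a real page (hypothetical element id):
// const ctx = new AudioContext();
// startAudioOnGesture(ctx, document.getElementById('start-button'));
```

`resume()` returns a Promise, so any playback that must wait for the running state can be chained on it.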

I managed to invert the Y axis...

and in MSYS:

```
[ 3%] Built target ggml
[ 4%] Built target ggml_static
[ 4%] Building CXX object CMakeFiles/llama.dir/llama.cpp.o
/home/Zibri/llama.cpp/llama.cpp: In member function ‘bool llama_mlock::raw_lock(const void*, size_t) const’:
/home/Zibri/llama.cpp/llama.cpp:1559:34:...
```

This is the full script to compile it under MSYS2 CLANG:

```
#!/bin/bash
pacman -Suy mingw-w64-clang-x86_64-toolchain mingw-w64-clang-x86_64-cmake mingw-w64-clang-x86_64-openblas
git clone --recurse-submodules https://github.com/ggerganov/llama.cpp
cd llama.cpp
export CC=/clang64/bin/clang.exe
export CPP=/clang64/bin/clang++.exe
export LDFLAGS='-DLLAMA_NATIVE=ON...
```

This is my full script to compile it under Windows; adapt it to your needs: https://github.com/ggerganov/llama.cpp/issues/7275

Note: it would be great to have a feature to export the full assistant definition as a llama.cpp "main" command (or a GPT4All prompt).
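A hypothetical sketch of what such an exporter could look like. The shape of the `assistant` object and the helper name are assumptions for illustration; `-m`, `--temp`, `-n`, and `-p` are standard flags of the llama.cpp `main` example:

```javascript
// Sketch: turn an assistant definition into a llama.cpp "main" invocation.
// The `assistant` field names here are assumptions, not an existing schema.
function toLlamaCppCommand(assistant) {
  // Single-quote a value for the shell, escaping embedded single quotes.
  const quote = (s) => `'${String(s).replace(/'/g, `'\\''`)}'`;
  return [
    './main',
    '-m', quote(assistant.modelPath),   // model file
    '--temp', assistant.temperature,    // sampling temperature
    '-n', assistant.maxTokens,          // number of tokens to generate
    '-p', quote(assistant.systemPrompt) // prompt text
  ].join(' ');
}

// Example:
const cmd = toLlamaCppCommand({
  modelPath: 'models/model.gguf',
  temperature: 0.7,
  maxTokens: 256,
  systemPrompt: 'You are a helpful assistant.',
});
// cmd === "./main -m 'models/model.gguf' --temp 0.7 -n 256 -p 'You are a helpful assistant.'"
```

A GPT4All export would be the same idea with the prompt template swapped in instead of command-line flags.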

Done:

```
import('bing-chat').then(_ => {
  BingChat = _.BingChat;
  async function main() {
    // Initialize the API with your valid cookie
    const api = new BingChat({ cookie: process.env.BING_COOKIE });
    // Send a message to Bing...
```
