
[FEAT]: APIPie provider improvements

Open timothycarambat opened this issue 1 year ago • 2 comments

What would you like to see?

Refer to https://github.com/Mintplex-Labs/anything-llm/issues/2464#issuecomment-2458680527 and the discussion that follows it for more details on scope.

  1. Enable streaming on the APIPie provider.
  2. Filter chat model providers with the query param for chatx so we can reduce model selection for LLM Options (see the sketch after this list).
     • Keep the full `.cache` of models and apply the post-filter after that request on the resulting file contents. We can reuse this cache for embedding and voice support.
  3. Investigate voice and embedder model support (filter the appropriate models for each). If embedding, TTS, or STT are viable, break them into separate issues from this one.
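To make the cache-then-post-filter idea in item 2 concrete, here is a minimal TypeScript sketch. It assumes an OpenAI-style `/v1/models` listing endpoint, a `subtype` field whose chat value is `"chatx"`, and an on-disk cache path; those names, the response shape, and the TTL are assumptions for illustration, not the actual anything-llm or APIPie code.

```ts
// Sketch: cache the full APIPie model list once, then post-filter the cached
// contents per feature (chat, embedding, voice) instead of re-querying.
import fs from "fs";
import path from "path";

type ApipieModel = {
  id: string;
  provider?: string;
  subtype?: string; // e.g. "chatx", "embedding", "tts", "stt" (assumed values)
};

const CACHE_PATH = path.resolve(__dirname, "apipie", "models.json"); // assumed location
const CACHE_TTL_MS = 1000 * 60 * 60 * 24; // refresh once a day (arbitrary)

async function fetchAllModels(apiKey: string): Promise<ApipieModel[]> {
  // Assumed OpenAI-style listing route; adjust to the real APIPie endpoint.
  const res = await fetch("https://apipie.ai/v1/models", {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) throw new Error(`APIPie model list failed: ${res.status}`);
  const body = await res.json();
  return Array.isArray(body?.data) ? body.data : [];
}

async function getCachedModels(apiKey: string): Promise<ApipieModel[]> {
  // Serve from the on-disk cache while it is fresh; otherwise refetch the
  // full list so the embedding/voice pickers can reuse the same file later.
  if (fs.existsSync(CACHE_PATH)) {
    const { mtimeMs } = fs.statSync(CACHE_PATH);
    if (Date.now() - mtimeMs < CACHE_TTL_MS) {
      return JSON.parse(fs.readFileSync(CACHE_PATH, "utf-8"));
    }
  }
  const models = await fetchAllModels(apiKey);
  fs.mkdirSync(path.dirname(CACHE_PATH), { recursive: true });
  fs.writeFileSync(CACHE_PATH, JSON.stringify(models, null, 2));
  return models;
}

// Post-filter on the cached file contents rather than on the remote request,
// so a single cache serves the chat, embedding, and voice model dropdowns.
export async function chatModels(apiKey: string): Promise<ApipieModel[]> {
  const all = await getCachedModels(apiKey);
  return all.filter((m) => m.subtype === "chatx");
}
```

An `embeddingModels` or `voiceModels` helper would be the same one-line filter over the same cache, which is what makes item 3 cheap to investigate once item 2 lands.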

timothycarambat avatar Nov 06 '24 16:11 timothycarambat

@timothycarambat

Thanks, this would be amazing to see. One other feature that might be worth looking at: we also provide Pinecone vector DBs for our users, so if you are implementing TTS and embedding support, it could be worth looking at our Pinecone integration as well.

Docs are available here: https://apipie.ai/docs/Features/Pinecone

Toocky avatar Nov 08 '24 13:11 Toocky

We already have a Pinecone integration available for users to select as their vector DB if they have an account; that said, something like 97% of people never change from the default LanceDB!

timothycarambat avatar Nov 08 '24 17:11 timothycarambat