Philip May

Results 184 comments of Philip May

I think this is not supported at the moment. Can you please tell me which command would be needed to support it? I might implement it... Or you could come up with a...

I can confirm that `AzureChatOpenAI` does not use caching at the moment. Cache init by:

```python
import langchain
from langchain.cache import SQLiteCache

langchain.llm_cache = SQLiteCache(database_path="./langchain-cache.db")
```

LangChain version 0.0.166. Would...
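To make the missing cache hit visible, here is a minimal sketch (the deployment name, endpoint, and API version are placeholder assumptions) that times two identical calls; with a working cache the second call would return almost instantly:

```python
import time

import langchain
from langchain.cache import SQLiteCache
from langchain.chat_models import AzureChatOpenAI
from langchain.schema import HumanMessage

langchain.llm_cache = SQLiteCache(database_path="./langchain-cache.db")

# Placeholder Azure settings - adjust to your own deployment.
llm = AzureChatOpenAI(
    deployment_name="my-gpt-35-deployment",
    openai_api_base="https://my-resource.openai.azure.com/",
    openai_api_version="2023-05-15",
    openai_api_key="...",
)

for _ in range(2):
    start = time.time()
    llm([HumanMessage(content="Say hello.")])
    # With a working cache the second call would be near-instant;
    # with AzureChatOpenAI both calls hit the API.
    print(f"call took {time.time() - start:.2f}s")
```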

Here is the code to reproduce. It is taken from https://github.com/pinecone-io/examples/blob/master/generation/langchain/handbook/06-langchain-agents.ipynb

```python
from getpass import getpass

OPENAI_API_KEY = getpass()

from langchain import OpenAI

llm = OpenAI(
    openai_api_key=OPENAI_API_KEY,
    temperature=0
)
```
...

IMO, after this line: https://github.com/hwchase17/langchain/blob/acfd11c8e424a456227abde8df8b52a705b63024/langchain/chains/sql_database/base.py#L83 we should add a normalization step (see the sketch below) that:

1. `strip()`s the SQL string
2. checks whether quotation marks are at the beginning and the end, and then...
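A minimal sketch of what such a normalization could look like; the `normalize_sql` helper name is mine, not part of LangChain:

```python
def normalize_sql(sql_cmd: str) -> str:
    """Strip whitespace and remove quotation marks that wrap the whole SQL string."""
    sql_cmd = sql_cmd.strip()
    # Remove a matching pair of surrounding quotes, e.g. "SELECT ..." or 'SELECT ...'
    for quote in ('"', "'", "`"):
        if len(sql_cmd) >= 2 and sql_cmd.startswith(quote) and sql_cmd.endswith(quote):
            sql_cmd = sql_cmd[1:-1].strip()
            break
    return sql_cmd
```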

Started a PR: https://github.com/hwchase17/langchain/pull/3385

@cg123 can you please approve the workflow?

@cg123 hmm - others report that this no longer raises an exception but still does not work. I guess that MoE merging is very different when merging other models...

> can you elaborate what behavior you see from the resulting moe

The resulting model has the same size on disk as one source model. This indicates a bug.

> https://huggingface.co/mlabonne/phixtral-4x2_8

mlabonne said he used mergekit, but I am running into the same error, even when using the fork on his GitHub repo. I think this is from an...

I found a fix for this. Changing this line: https://github.com/arcee-ai/mergekit/blob/d0f5ad466ea9caaf3c997f27e1695a32d68e147f/mergekit/scripts/mixtral_moe.py#L324 to this:

```python
MISTRAL_INFO = mergekit.architecture.PHI2_INFO_AGAIN_BUT_DIFFERENT
```

solves the problem. It seems like the architecture is hard-wired. But how...
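A hypothetical sketch of how the hard-wired assignment could instead be keyed on the base model's type; only `MISTRAL_INFO` and `PHI2_INFO_AGAIN_BUT_DIFFERENT` are taken from the comment above, while the mapping, its keys, and the helper are illustrative assumptions, not mergekit API:

```python
import mergekit.architecture

# Illustrative mapping from config.model_type to the matching architecture info.
# The two constants are from the comment above; the keys and the rest of the
# sketch are assumptions.
ARCH_INFO_BY_MODEL_TYPE = {
    "mistral": mergekit.architecture.MISTRAL_INFO,
    "phi": mergekit.architecture.PHI2_INFO_AGAIN_BUT_DIFFERENT,
}

def select_arch_info(model_type: str):
    """Return the architecture info for the base model instead of assuming Mistral."""
    try:
        return ARCH_INFO_BY_MODEL_TYPE[model_type]
    except KeyError:
        raise ValueError(f"MoE merging does not support base models of type {model_type!r}")
```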