libra

28 comments by libra

Is there any conclusion on this issue?

This advice is a good solution. I'd be glad to see whether we can provide this feature.

Is there any update or plan for this topic? `bazel` is becoming more and more popular; it would be cool to support it.

Whatever the solution ends up being, this would be a good feature for users. One scenario is if we want everyone to use bazel for the LTS version, such as >...

> It is designed to improve speed of mainly sparse LLMs. It won't allow faster inference with dense LLMs.

But there are still many LLMs that use the ReLU activation function, so...
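
To make the point concrete, here is a rough sketch (purely illustrative, with made-up sizes) of why ReLU models count as sparse: ReLU zeroes out a large share of activations, which is exactly what sparse-inference tricks exploit.

```python
# Purely illustrative: ReLU zeroes out roughly half of random pre-activations,
# and that activation sparsity is what sparse-inference tricks exploit.
import numpy as np

rng = np.random.default_rng(0)
pre_activations = rng.standard_normal((1, 4096))      # hypothetical hidden layer
post_activations = np.maximum(pre_activations, 0.0)   # ReLU

zero_fraction = float((post_activations == 0.0).mean())
print(f"fraction of zeroed activations: {zero_fraction:.2f}")  # ~0.5 for N(0, 1) inputs
```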

It seems this happens because the LLM response text starts with "Thought", but the regex matching code in langchain does not account for this.
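
For illustration only, here is a minimal sketch of the kind of tolerant pattern I mean; the sample output and the regex below are my own assumptions, not the actual parser code in langchain:

```python
import re

# Hypothetical agent reply: it begins with a "Thought:" line before the
# "Action" / "Action Input" pair that the parser actually needs.
llm_output = (
    "Thought: I need to look this up.\n"
    "Action: Search\n"
    "Action Input: langchain output parser"
)

# A pattern that tolerates an optional leading "Thought:" line;
# re.DOTALL lets "." span newlines in the final capture group.
pattern = re.compile(
    r"(?:Thought:.*?\n)?Action\s*:\s*(.*?)\nAction\s*Input\s*:\s*(.*)",
    re.DOTALL,
)

match = pattern.search(llm_output)
if match is None:
    raise ValueError(f"Could not parse LLM output: `{llm_output}`")

action, action_input = match.group(1).strip(), match.group(2).strip()
print(action, "->", action_input)  # Search -> langchain output parser
```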

But the model may not follow your request. @alexprice12, this is about the standard.

@hwchase17 if you have time, can you give some extra info? If it's OK, I will try to fix this.

Same problem here: it is OK to use `OpenAI`, but not `ChatOpenAI`.
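
For reference, a minimal sketch of the two setups I'm comparing, assuming an older LangChain layout where `OpenAI`, `ChatOpenAI`, `load_tools`, and `initialize_agent` are importable as shown (names and arguments are illustrative and may differ in your version):

```python
# Illustrative comparison only; assumes an older LangChain layout where these
# imports exist. Adjust imports/arguments for your installed version.
from langchain.llms import OpenAI
from langchain.chat_models import ChatOpenAI
from langchain.agents import initialize_agent, load_tools

tools = load_tools(["llm-math"], llm=OpenAI(temperature=0))
question = "What is 2 to the 10th power?"

# Completion model: this setup parsed fine for me.
agent = initialize_agent(
    tools, OpenAI(temperature=0), agent="zero-shot-react-description", verbose=True
)
agent.run(question)

# Chat model: the same setup hit the parsing problem described in this issue.
chat_agent = initialize_agent(
    tools, ChatOpenAI(temperature=0), agent="zero-shot-react-description", verbose=True
)
chat_agent.run(question)
```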