Nick Byrne

Results: 44 comments by Nick Byrne

No problem. Happy for you to close or leave this issue open as you see fit. I'm not sure if it's useful to archive my current branch in some way, e.g. open...

> To model things similarly to LangChain, we would replace the `prompt` arg with a more generic input that would take a newly-defined class instance (i.e. `LlmInput`).

I like this,...

> this only works if we can assume that every LLM that we want to use for preprocessing can also be used for answering

This seems fine to me. I...

> If we do that, we could make it a protocol, i.e. document that if one receives a system message when one from the assistant is expected, this indicates an error...

> @nenb what do you think of removing the `None` option and instead using a string sentinel, e.g. `"default"`?

This seems fine to me. The rest of the issue also...
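A minimal sketch of the string-sentinel idea, assuming a hypothetical `resolve_model` helper (the function name and the configured default are inventions for illustration, not part of the project):

```python
DEFAULT = "default"  # string sentinel used instead of None


def resolve_model(name: str = DEFAULT) -> str:
    # Hypothetical resolver: the sentinel maps to whatever the configured
    # default is; any other string passes through unchanged.
    configured_default = "example-model"  # assumption, not a real default
    return configured_default if name == DEFAULT else name
```

Compared to `None`, the sentinel keeps the parameter's type a plain `str`, which simplifies signatures and serialization.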

> Maybe a combination of both is a good solution? `GET /corpuses/metadata` returns everything; `GET /corpuses/metadata?source_storage=Chroma` returns the same object, but only having Chroma as a single item in the outer...

> I would love to hear from @nenb @blakerosenthal @dillonroach how this is done in the existing deployment.

- We simplified things a little bit by passing a file that...

_(After a short offline chat with @pmeier)_ I had a vague concern around latency and storage for large corpuses (e.g. instantiating 100K `Document` instances might not be a great thing...

A lot of what I have read about query preprocessing seems to reduce to sending the prompt to an LLM and/or ML model for reformulation. (This is separate from the...
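That reformulation step can be sketched as a small function that is agnostic to the underlying model; `reformulate_query` and the instruction wording are assumptions for illustration, and `llm` stands in for any callable that returns a completion:

```python
from typing import Callable


def reformulate_query(query: str, llm: Callable[[str], str]) -> str:
    # Wrap the user query in a rewrite instruction, send it to the LLM,
    # and use the stripped completion as the retrieval query.
    instruction = (
        "Rewrite the following search query to be clearer and more "
        f"specific, and return only the rewritten query:\n{query}"
    )
    return llm(instruction).strip()
```

Because the model is injected as a callable, the same preprocessing code works whether the rewriter is an LLM or a smaller dedicated model.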

> test_set_orthogonal_selection_3d

Hey @agriyakhetarpal 👋 I had some questions related to your comments.

**My background**

I have a moderate amount of experience with the `zarr-python` codebase, but I am not...