Ruslan
@XAMPPRocky tests are fixed
I can make a PR with `max_results = limit`, if what I'm suggesting is relevant
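The change above, as I understand it, just forwards the user-facing `limit` into the client's `max_results` parameter instead of fetching a large page and truncating it afterwards. A minimal sketch with a stubbed search function (the names here are stand-ins, not the real `semanticscholar` API):

```python
def fake_search(query, max_results=100):
    # Stub standing in for the real search endpoint; the server would
    # return at most `max_results` hits.
    return [f"{query}-paper-{i}" for i in range(max_results)]

def search_paper(query, limit=5):
    # Forward the user-facing `limit` as the client's `max_results`,
    # so we never request more records than the caller asked for.
    return fake_search(query, max_results=limit)

print(len(search_paper("attention", limit=3)))  # 3
```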
Thanks for the clarification! Your explanation that `search_paper` has 3 different modes is very informative; I think it could go into the docs. Yes, I also think the `limit` documentation could...
I'm also worried: wouldn't this introduce a "Too many requests" error? (*I just had one, but it was unrelated to `semanticscholar`*)
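If extra requests do start triggering HTTP 429 "Too many requests" responses, the usual client-side mitigation is retry with exponential backoff. A minimal sketch, with a stand-in exception and endpoint (not the real `semanticscholar` client):

```python
import time

class RateLimitError(Exception):
    """Stand-in for a 429 'Too many requests' failure."""

def call_with_backoff(fn, max_attempts=5, base_delay=0.01):
    # Retry the call, doubling the delay after each rate-limit error.
    # base_delay is kept tiny here; a real client would start near 1s.
    for attempt in range(max_attempts):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Usage: a fake endpoint that fails twice before succeeding.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RateLimitError
    return "ok"

print(call_with_backoff(flaky))  # ok
```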
@langchain4j some strange stuff happening in this thread
@smileyboy2019 if you want to support a new LLM in your app, there are two ways based on your LLM: 1) If it's an LLM that is from the same...
@smileyboy2019 please don't click on the links provided by seeronline and smileyboy2019; they are not relevant and may contain malware
> If the model only allows API URLs and API keys to be provided, can it be called through openAPI

Of course, you can. But be sure that the...
Okay, after working on it a bit, now I see why it's a complicated approach. 1. Imagine a user wants to run a local model. The easiest way to do it...
It seems to me that this is a problem that the developers of OpenAI and other providers have overlooked. Tokenization is really part of the domain of LLMs; it shouldn't...