Ruslan

Results: 127 comments by Ruslan

I can make a PR with `max_results = limit`, if this is relevant.
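For context, a minimal sketch of the change being proposed in the comment above: forwarding a user-facing `limit` argument to the underlying client's `max_results` parameter. The function and client names here are assumptions for illustration, not taken from the actual codebase.

```python
# Hypothetical sketch: pass a user-facing `limit` through to the client's
# `max_results` parameter. `fake_client_search` stands in for the real
# semanticscholar client call, which is not reproduced here.

def fake_client_search(query: str, max_results: int) -> list[str]:
    # Stand-in for the real API call; returns dummy paper titles.
    return [f"{query} paper {i}" for i in range(max_results)]

def search_paper(query: str, limit: int = 10) -> list[str]:
    # The proposed change: forward `limit` as `max_results`.
    return fake_client_search(query, max_results=limit)

print(len(search_paper("graph neural networks", limit=3)))  # prints 3
```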

Thanks for the clarification! Your explanation that `search_paper` has 3 different modes is very informative; I think it could go in the docs. Yes, I also think the `limit` documentation could...

I'm also worried: wouldn't this introduce a "Too many requests" error? (*I just had one, but it was unrelated to `semanticscholar`*)

@langchain4j some strange stuff is happening in this thread

@smileyboy2019 if you want to support a new LLM in your app, there are two ways, depending on your LLM: 1) If it's an LLM that is from the same...

@smileyboy2019 please don't click the links posted by seeronline and smileyboy2019; they are not relevant and may contain malware.

> If the model only allows API URLs and API keys to be provided, can it be called through openAPI ![image](…) Of course, you can. But be sure that the...

Okay, after working on it a bit, now I see why this is a complicated approach. 1. Imagine a user wants to run a local model. The easiest way to do it...

It seems to me that this is a problem that the developers of OpenAI and other providers have overlooked. Tokenization is really part of the domain of LLMs; it shouldn't...
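To illustrate the point above: when a provider does not expose its real tokenizer, clients are forced into crude approximations like the word-count heuristic below. This is a sketch of the problem, not anyone's actual implementation; real BPE tokenizers split text into subwords and punctuation, so counts like this can diverge badly from the provider's billing-relevant token count.

```python
# Illustrative only: a naive word-based "token" count, the kind of
# approximation clients fall back on when the provider's tokenizer is
# not available. Real tokenizers (e.g. BPE) split text differently.

def approximate_token_count(text: str) -> int:
    # Crude heuristic: one "token" per whitespace-separated word.
    return len(text.split())

print(approximate_token_count("Tokenization is part of the LLM domain"))  # prints 7
```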