@albertogasparin let me know if this is helpful - would love to keep contributing.
Any updates on this? Thank you
@drarmstr @alunyov Would love to hear your thoughts + feedback! Also, let me know if you have any questions.
Bumping this up one more time. Def let me know if you have any questions! I would love some feedback if possible! Happy to iterate on it as well. Thank...
Huge! @tractorjuice can you send us an example of a URL where this shows up?
@tractorjuice Thanks! That's very helpful.
@rafaelsideguide we should also support partial docs on the webhook, just got a customer request for that!
@miracle777 If you'd like the output to be longer, you can increase the `max_tokens` parameter on your LLM (assuming you're using LangChain's `OpenAI` wrapper). Example below:

```
from langchain.llms import OpenAI

# Allow completions of up to 1500 tokens
llm = OpenAI(temperature=0, max_tokens=1500)
```

Hope this helps :)
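If useful, here's a minimal end-to-end sketch under the same assumption (LangChain's `OpenAI` wrapper, with `OPENAI_API_KEY` set in your environment); the prompt is just an illustration:

```
from langchain.llms import OpenAI

llm = OpenAI(temperature=0, max_tokens=1500)

# The completion can now run up to 1500 tokens before being cut off
print(llm("Explain how transformers process text, in detail."))
```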
Got it. I also agree that separating it more would be better, though I have some doubts about what this could look like. 1 - Would the run method of...