thepok
Using a notebook while developing reduces calls a lot too. A local SQLite database where every call is optionally saved could be an easy solution.
I'll do this next. Any suggestions on how to do it?
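One way the SQLite idea could look: key each entry on a hash of the prompt plus the model's serialized identifying params, so a change in e.g. temperature invalidates the cache. This is a minimal sketch, not langchain's actual implementation; the class name `LLMCache` and its methods are made up for illustration:

```python
import hashlib
import json
import sqlite3


class LLMCache:
    """Toy SQLite-backed cache for LLM responses (names are illustrative)."""

    def __init__(self, path=":memory:"):
        # A real setup would use a file path so the cache survives restarts.
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS cache (key TEXT PRIMARY KEY, response TEXT)"
        )

    def _key(self, prompt, params):
        # Hash prompt + sorted params together, so the same prompt with
        # different model settings gets a different cache entry.
        blob = prompt + json.dumps(params, sort_keys=True)
        return hashlib.sha256(blob.encode()).hexdigest()

    def get(self, prompt, params):
        row = self.conn.execute(
            "SELECT response FROM cache WHERE key = ?",
            (self._key(prompt, params),),
        ).fetchone()
        return row[0] if row else None

    def put(self, prompt, params, response):
        self.conn.execute(
            "INSERT OR REPLACE INTO cache VALUES (?, ?)",
            (self._key(prompt, params), response),
        )
        self.conn.commit()
```

Usage would be: check `get` before calling the LLM, and `put` the response on a miss.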
On inspection, at least `OpenAI()` implements an `_identifying_params` property that seems well suited for this.
```python
import json
from langchain.llms import OpenAI

openai = OpenAI(model="text-davinci-003")
json.dumps(openai._identifying_params, sort_keys=True)
```
ChatGPT is really smart: There are several systems commonly used for caching in Python. Here are a few examples: `functools.lru_cache`: This is a built-in Python module that provides a decorator...
It just wanted to help by showing options :) I have no preference; the current system seems fine. Maybe add some kind of option that only caches if the LLM is in...
You should be able to use -1; that automatically uses the max context size.
Related to https://github.com/hwchase17/langchain/issues/263
Maybe we should skip that and go directly to some kind of pseudocode. I made some great working examples:

```
Following is a list of sophisticated Functions you can use...
```
https://github.com/wyu97/GenRead