Logan
If that doesn't work, try generating a new API key.
I'm not sure what to think then 🤔 The best advice I can give (in addition to setting that variable at the top of your script) is generating a new...
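For reference, a minimal sketch of setting that variable at the top of a script (assuming the standard `OPENAI_API_KEY` environment variable; the key value shown is a placeholder):

```python
import os

# Set the key before any client code reads it. The value here is a
# placeholder; in practice load your real key from a secrets manager
# or .env file rather than hard-coding it.
os.environ["OPENAI_API_KEY"] = "sk-your-new-key"

# Downstream clients (openai, llama_index, etc.) pick this up from
# the environment at call time.
print(os.environ["OPENAI_API_KEY"])
```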
Going to close this issue for now; LangChain should be managing their Python version, not us :)
When loading from disk/storage, you'll need to pass in the service context again. The latest documentation is here: https://gpt-index.readthedocs.io/en/latest/guides/primer/usage_pattern.html#optional-save-the-index-for-future-use Going to close this issue for now!
We no longer use PyPDF2. Closing this issue for now.
@phiweger when using the ChatGPTLLMPredictor class, the messages get built like that for you under the hood 💪 so single-string inputs are fine
@phiweger technically, for list and vector indexes, the LLM is not used during construction, so it's only needed at `query()` time. But passing it to both won't hurt either. Also,...
@BrunoAPazetti the response string can be cut off, usually because the default `max_tokens` for a single OpenAI call is 256. See this page for extending that: https://gpt-index.readthedocs.io/en/latest/how_to/custom_llms.html#example-changing-the-number-of-output-tokens-for-openai-cohere-ai21
@jmcrook https://gpt-index.readthedocs.io/en/latest/how_to/customization/custom_llms.html#example-changing-the-number-of-output-tokens-for-openai-cohere-ai21
This is fixed, as noted by the PR above