llama_index
Is there any way to have llama_index access GPT-3 through the requests module instead of the openai module?
Long story short: due to firewall rules at my corporation, the GPT-3 endpoints cannot be reached through the openai module, but they do work if we make the request manually with the requests module in Python. For context, this is roughly the kind of call that gets through for us (the model name and payload are just placeholders):
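
```python
# Illustrative only: a direct call to the OpenAI completions endpoint using
# requests, which is what works behind our firewall. The model name and
# payload values are placeholders, not what we actually run.
import os
import requests

API_KEY = os.environ["OPENAI_API_KEY"]

resp = requests.post(
    "https://api.openai.com/v1/completions",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "model": "text-davinci-003",  # placeholder model name
        "prompt": "Hello, world",
        "max_tokens": 64,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["text"])
```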
Given these constraints, what will it take to get llama_index working on such a system?
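
My guess is that we would need to plug in some kind of custom LLM class that routes completions through requests instead of the openai client, roughly along the lines of the sketch below. The import paths, class names, and method signatures here are my assumptions based on the custom-LLM examples I've seen and may not match the current llama_index API, so please correct me if there is a better supported hook for this.

```python
from typing import Any
import os
import requests

# Import paths are an assumption on my part and differ between llama_index versions.
from llama_index.core.llms import (
    CustomLLM,
    CompletionResponse,
    CompletionResponseGen,
    LLMMetadata,
)
from llama_index.core.llms.callbacks import llm_completion_callback


class RequestsGPT3(CustomLLM):
    """Hypothetical wrapper that sends completion calls through requests."""

    model: str = "text-davinci-003"  # placeholder model name

    @property
    def metadata(self) -> LLMMetadata:
        return LLMMetadata(model_name=self.model)

    @llm_completion_callback()
    def complete(self, prompt: str, **kwargs: Any) -> CompletionResponse:
        # Same manual HTTP call that works behind our firewall.
        resp = requests.post(
            "https://api.openai.com/v1/completions",
            headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
            json={"model": self.model, "prompt": prompt, "max_tokens": 256},
            timeout=60,
        )
        resp.raise_for_status()
        text = resp.json()["choices"][0]["text"]
        return CompletionResponse(text=text)

    @llm_completion_callback()
    def stream_complete(self, prompt: str, **kwargs: Any) -> CompletionResponseGen:
        # No streaming in this sketch; yield the full response once.
        yield self.complete(prompt, **kwargs)
```

Is subclassing a custom LLM like this the intended route, or does llama_index also make calls to OpenAI elsewhere (e.g. for embeddings) that would still go through the openai module and hit the same firewall issue?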