openai-python
Memory Leak
I upgraded the openai package to 1.23.5 and I can still see the same issue with the same function; it has just moved to a different place:
```
/Users/jogireddy/PycharmProjects/flaskProject/venv/lib/python3.9/site-packages/openai/_legacy_response.py:347: size=1389 KiB (+1389 KiB), count=12109 (+12109), average=117 B
/Users/jogireddy/PycharmProjects/flaskProject/venv/lib/python3.9/site-packages/openai/_response.py:674: size=1278 KiB (+1278 KiB), count=11239 (+11239), average=116 B
/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/linecache.py:137: size=1232 KiB (+1232 KiB), count=12014 (+12014), average=105 B
/Users/jogireddy/PycharmProjects/flaskProject/venv/lib/python3.9/site-packages/httpx/_content.py:175: size=868 KiB (+868 KiB), count=83 (+83), average=10.5 KiB
/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/functools.py:58: size=769 KiB (+769 KiB), count=12308 (+12308), average=64 B
/Users/jogireddy/PycharmProjects/flaskProject/venv/lib/python3.9/site-packages/openai/_legacy_response.py:330: size=726 KiB (+726 KiB), count=6332 (+6332), average=117 B
/Users/jogireddy/PycharmProjects/flaskProject/venv/lib/python3.9/site-packages/openai/_response.py:653: size=676 KiB (+676 KiB), count=5931 (+5931), average=117 B
/Users/jogireddy/PycharmProjects/flaskProject/venv/lib/python3.9/site-packages/openai/_resource.py:34: size=358 KiB (+358 KiB), count=5088 (+5088), average=72 B
/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/abc.py:102: size=261 KiB (+261 KiB), count=2992 (+2992), average=89 B
/Users/jogireddy/PycharmProjects/flaskProject/venv/lib/python3.9/site-packages/httpx/_models.py:82: size=249 KiB (+249 KiB), count=3736 (+3736), average=68 B
```
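For context, output in this shape comes from Python's `tracemalloc` module when two snapshots are compared; a minimal sketch of how such a report is produced (the workload in the middle is a placeholder for the real API calls):

```python
import tracemalloc

tracemalloc.start()
baseline = tracemalloc.take_snapshot()

# ... run the workload suspected of leaking (e.g. repeated API calls) ...
workload = [object() for _ in range(1000)]  # placeholder allocation

after = tracemalloc.take_snapshot()
# Top 10 allocation sites by growth since the baseline snapshot
for stat in after.compare_to(baseline, "lineno")[:10]:
    print(stat)  # e.g. "<file>:<line>: size=... (+...), count=... (+...)"
```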
Originally posted by @rdy5644 in https://github.com/openai/openai-python/issues/1361#issuecomment-2076360795
@rdy5644 please take a look at #1246; it looks like the usual problem of not reusing the OpenAI client.
As the comments there suggested, I am reusing the OpenAI client:
```python
from openai import OpenAI

_client = OpenAI(
    api_key='<this client should never be used directly!>',
)

def get_openai(user: User) -> OpenAI:
    return _client.with_options(api_key=user.openai_api_key)
```
This is how I'm reusing the same client while overriding the api_key, but the problem persists. When I replaced it with plain HTTP calls using requests, memory utilisation was stable.
The `with_options` method actually makes a copy; you can see this in the source of `openai/_client.py`. You should cache the client instead, perhaps using the api_key as the cache key, either with a global variable or a separate class:
```python
openai_clients = {}

client = openai_clients.get(api_key)
if client is None:
    client = openai.OpenAI(api_key=api_key)
    openai_clients[api_key] = client
```
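Since Flask may serve requests from multiple threads, the lookup above can be wrapped in a small lock-guarded helper; a generic sketch (the class and names are illustrative, and `factory` would be something like `lambda key: OpenAI(api_key=key)` here):

```python
import threading

class ClientCache:
    """Caches one client per API key so HTTP connections are reused."""

    def __init__(self, factory):
        self._factory = factory  # e.g. lambda key: OpenAI(api_key=key)
        self._clients = {}
        self._lock = threading.Lock()

    def get(self, api_key):
        # Serialize lookups so two threads never build duplicate clients
        with self._lock:
            client = self._clients.get(api_key)
            if client is None:
                client = self._factory(api_key)
                self._clients[api_key] = client
            return client
```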
Yes, this is something I can try. I'll give it a try and get back.
When I tried the suggested approaches, neither the cache nor a global client worked as expected.
Reusing the same client with `with_options` to override the api_key made no difference to the memory issue. As for caching clients by api_key as described above: if we don't call `client.close()`, the number of open files keeps increasing and memory usage grows very rapidly.
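One way to address the growing open-file count is to bound the cache and close evicted clients explicitly; a sketch (assuming the cached client exposes a `close()` method, as `openai.OpenAI` does; the class and names are illustrative):

```python
from collections import OrderedDict

class BoundedClientCache:
    """LRU cache that closes evicted clients to cap open file descriptors."""

    def __init__(self, factory, max_size=16):
        self._factory = factory  # e.g. lambda key: OpenAI(api_key=key)
        self._max_size = max_size
        self._clients = OrderedDict()

    def get(self, api_key):
        client = self._clients.get(api_key)
        if client is None:
            client = self._factory(api_key)
            self._clients[api_key] = client
            if len(self._clients) > self._max_size:
                # Evict the least-recently-used client and release its
                # underlying connection pool instead of leaking it
                _, old = self._clients.popitem(last=False)
                old.close()
        else:
            self._clients.move_to_end(api_key)  # mark as recently used
        return client
```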
I don't see a clear problem or action item for us here, so I'm going to close this issue.