llama-cpp-python
TypeError: 'NoneType' object is not callable in __del__() when exiting
Prerequisites
- [x] I am running the latest code.
- [x] I followed the README.md.
- [x] I searched open/closed issues.
- [x] This is a reproducible error, and a proposed fix is included below.
Expected Behavior
When the model is closed or Python exits, it should clean up resources quietly.
Current Behavior
When using `Llama(...)`, the following error is printed on exit:

`TypeError: 'NoneType' object is not callable`

It is raised inside `__del__()` when it tries to close internal objects (e.g., the ExitStack or model context) that have already been set to `None`.
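For context, this is a general finalizer pitfall rather than anything specific to llama.cpp: if `__del__()` calls an attribute that has already been cleared, the exact same `TypeError` shows up as an "Exception ignored in `__del__`" message. A self-contained sketch (illustrative only, not the library's actual internals):

```python
# Illustrative sketch (not the actual llama_cpp internals): a __del__ that
# calls a cleanup callable raises this exact TypeError once that attribute
# has been set to None, e.g. after close() or during interpreter shutdown.
class Resource:
    def __init__(self):
        self._free = lambda: print("freed")

    def close(self):
        if self._free is not None:
            self._free()
            self._free = None

    def __del__(self):
        # self._free is None here if close() already ran, so calling it
        # fails; Python reports "Exception ignored in ... __del__" with
        # TypeError: 'NoneType' object is not callable.
        self._free()


r = Resource()
r.close()
del r  # prints the ignored TypeError from __del__
```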
Environment and Context
- Windows 11
- Python 3.11.3
- llama-cpp-python version: 0.3.8
- Using a quantized GGUF model via `Llama(...)`
Steps to Reproduce
- Load the model via `Llama(model_path=..., ...)`
- Let the script exit
- Observe the destructor crash in the logs
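A minimal script along these lines exercises the same path (the model path and prompt are placeholders; any quantized GGUF file should do):

```python
# Minimal reproduction sketch for the steps above; the model path is a
# placeholder -- substitute any quantized GGUF model.
from llama_cpp import Llama

llm = Llama(model_path="./models/model-q4_k_m.gguf")
out = llm("Q: Name the planets in the solar system. A:", max_tokens=16)
print(out["choices"][0]["text"])
# No explicit close() or del: the object is torn down at interpreter exit,
# which is when __del__() reports
#   TypeError: 'NoneType' object is not callable
```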
Recommended Fix
Wrap the `close()` call in `__del__()` in a try/except, like so:
```python
def __del__(self):
    try:
        self.close()
    except Exception:
        # Internal handles may already be None at interpreter shutdown;
        # suppress the noise instead of raising from the finalizer.
        pass
```
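Suppressing the exception in `__del__()` seems reasonable here: Python already ignores exceptions raised in finalizers and only prints them to stderr, so the broad `except` removes the shutdown noise without masking failures from an explicit `close()`. Until a patched release is available, one caller-side workaround sketch is to close the model explicitly so that little or no teardown is left for interpreter shutdown (the model path is a placeholder, and whether this silences the message in every case is an assumption):

```python
import contextlib

from llama_cpp import Llama

llm = Llama(model_path="./models/model-q4_k_m.gguf")  # placeholder path
try:
    llm("Q: Name the planets in the solar system. A:", max_tokens=16)
finally:
    # Free the llama.cpp context/model explicitly; suppress errors in
    # case internals have already been released.
    with contextlib.suppress(Exception):
        llm.close()
```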