nnsight
Debugging
New Feature!
NNsight now supports more user-friendly error handling, which shows the exact line in the tracing context that caused the exception to be raised.
Simply set debug=True on your .trace(...) call to activate this setting.
from nnsight import LanguageModel

lm = LanguageModel("openai-community/gpt2")
with lm.trace("Hello World", debug=True) as tracer:
    # index 100000 is out of bounds for the hidden dimension (768),
    # so the traceback points to this exact line
    lm.transformer.h[5].mlp.output[0][-1][100000].save()
Traceback (most recent call last):
File "traceback_test.py", line 7, in <module>
lm.transformer.h[5].mlp.output[0][-1][100000].save()
NNsightError: index 100000 is out of bounds for dimension 0 with size 768.
- If you are using a Session, you just need to pass debug=True to the .session(...) call and it will propagate to all the graphs defined within. (A short example is sketched after this list.)
- You can make this debug mode your default setting by calling:
import nnsight
from nnsight import CONFIG
CONFIG.set_default_app_debug(True)
- Or if you want to enable it only for the current run, use:
import nnsight
from nnsight import CONFIG
CONFIG.APP.DEBUG = True
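As referenced above, here is a minimal sketch of the Session case, reusing the gpt2 model from the first example (the variable names are illustrative):

from nnsight import LanguageModel

lm = LanguageModel("openai-community/gpt2")

# debug=True on the session propagates to every trace opened inside it
with lm.session(debug=True) as session:
    with lm.trace("Hello World"):
        out = lm.transformer.h[5].mlp.output.save()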
Improvement: Passing scan=True to the Tracer now propagates to all the Invokers defined within.
with lm.trace(scan=True) as tracer:
    with tracer.invoke("The Eiffel Tower is in the city of"):
        print(lm.transformer.h[2].mlp.output.shape)
    with tracer.invoke("Buckingham Palace is in the city of"):
        print(lm.transformer.h[2].mlp.output.shape)
>>> torch.Size([1, 10, 768])
>>> torch.Size([1, 9, 768])