transformer-debugger
error with /attention_head_record endpoint
Hi Team,
I hit an error when I sent the following request body to `/attention_head_record`:

```json
{
  "dst": "logits",
  "layerIndex": 9,
  "activationIndex": 8,
  "datasets": [
    "https://openaipublic.blob.core.windows.net/neuron-explainer/gpt2_small_data/collated-activations/"
  ]
}
```
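For reproduction, this is how I build the payload before posting it (a minimal sketch; the endpoint path and field names are exactly the ones above, and the round-trip through `json` just confirms the body is valid JSON):

```python
import json

# Request body sent to /attention_head_record
# (the same body works fine against /neuron_record).
payload = {
    "dst": "logits",
    "layerIndex": 9,
    "activationIndex": 8,
    "datasets": [
        "https://openaipublic.blob.core.windows.net/neuron-explainer/"
        "gpt2_small_data/collated-activations/"
    ],
}

# Serialize and parse back to confirm the body is well-formed JSON
# before handing it to an HTTP client.
assert json.loads(json.dumps(payload)) == payload
print(json.dumps(payload, indent=2))
```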
The error is below (traceback truncated where it was cut off):

```
File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 265, in attention_head_record
    ) = convert_activation_records_to_token_and_attention_activations_lists(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 191, in convert_activation_records_to_token_and_attention_activations_lists
    return normalize_attention_token_scalars(zipped_tokens_and_raw_attention_activations)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 455, in normalize_attention_token_scalars
    ) = compute_scalar_summary_and_scale(compute_max_scalar_out, list_of_sequence_lists)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 427, in compute_scalar_summary_and_scale
    scalar_indexed_by_token_sequence_list: list[list[list[float]]] = [
                                                                     ^
File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py", line 428, in
```
Any ideas? The same request works fine against `/neuron_record`. Could this be due to the wrong dataset being used? @WuTheFWasThat