transformer-debugger
While running inference on my Mac with macOS version 13.1, I received the following error: ``` RuntimeError: MPS does not support cumsum_out_mps op with int64 input. Support has been added...
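A common workaround for unsupported MPS ops (an assumption on my part, not something stated in the report) is to let PyTorch fall back to the CPU for those ops before launching the server. This trades some speed for compatibility; upgrading to a newer macOS/PyTorch where the op is supported is the other option.

```shell
# Assumption: enabling PyTorch's CPU fallback routes ops that MPS
# does not support (such as int64 cumsum) to the CPU instead of
# raising a RuntimeError. Set this before starting the process.
export PYTORCH_ENABLE_MPS_FALLBACK=1
```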
Is it possible to use this tool on models that are not from OpenAI? Would it be worth supporting models with different architectures?
minor fix
Impressive work, much to learn from here.
The citation for the package could be improved by using the [CITATION.cff format](https://citation-file-format.github.io/) and linking to a release with a DOI (perhaps connect to Zenodo?). You can generate the `cff`...
Hi team, I hit an error when I sent the following request body to /attention_head_record: { "dst": "logits", "layerIndex": 9, "activationIndex": 8, "datasets": [ "https://openaipublic.blob.core.windows.net/neuron-explainer/gpt2_small_data/collated-activations/" ] } The error is as follows: File "/home/ruibn/debug/transformer-debugger/neuron_explainer/activation_server/read_routes.py",...
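For anyone trying to reproduce this, the request body above can be rebuilt and serialized with the standard library before posting it to the activation server (the endpoint path and field names are as given in the report; the server URL is not shown here, since it depends on the local deployment):

```python
import json

# Request body from the report, reconstructed as a Python dict.
# Field names and the dataset URL are copied verbatim from the issue.
body = {
    "dst": "logits",
    "layerIndex": 9,
    "activationIndex": 8,
    "datasets": [
        "https://openaipublic.blob.core.windows.net/neuron-explainer/gpt2_small_data/collated-activations/"
    ],
}

# Serialize to the JSON payload that would be POSTed to
# /attention_head_record on the running activation server.
payload = json.dumps(body)
print(payload)
```

This only prepares the payload; sending it (e.g. with `urllib.request` or `curl`) requires an activation server that is actually serving the dataset listed in `datasets`.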
For example, the Llama series and the Qwen series.
Error: Unable to look up model info. Are you sure you're running an activation server for this dataset? Current URL: http://localhost:8002 The server and client are deployed on the same machine,...