Jay Alammar

Results 32 comments of Jay Alammar

Yeah, as @guustfranssensEY mentioned, make sure you have the prefix. This is an example from [model-config.yaml](https://github.com/jalammar/ecco/blob/main/src/ecco/model-config.yaml):

```yaml
roberta-base:
  embedding: 'embeddings.word_embeddings'
  type: 'mlm'
  activations:
    - '\d+\.output\.dense'
  token_prefix: 'Ġ'
  partial_token_prefix: ''
```
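The `activations` entry is a regular expression matched against the model's module names to decide which layers to collect activations from. A quick sketch of how that pattern selects layers, using Python's `re` and some module names in the style of Hugging Face's `roberta-base` (the names here are illustrative, not pulled from the library):

```python
import re

# Pattern from the config above: a layer number followed by '.output.dense',
# i.e. each transformer layer's FFN output projection.
pattern = re.compile(r"\d+\.output\.dense")

# Hypothetical module names in the style of Hugging Face's roberta-base.
names = [
    "embeddings.word_embeddings",
    "encoder.layer.0.output.dense",
    "encoder.layer.11.output.dense",
    "encoder.layer.0.attention.output.dense",  # attention output, not FFN
]

matched = [n for n in names if pattern.search(n)]
print(matched)
```

Note that the attention block's `attention.output.dense` is not matched, because the pattern requires the layer number to sit directly before `.output.dense`.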

That's an odd and unfamiliar error. Were you able to resolve it?

Just noting that recent PRs added the ability to add different models (including local models) more easily. Beyond this, AraBERT would need RTL support on the JavaScript side, which is...

Hi Anshoo, Thank you so much for your kind words! 1. We are currently working on BERT support for activation exploration. We don't have immediate plans for supporting other models...

Unfortunately BART support wasn't completely there yet. I just made a couple of updates, could you try it now? Install the latest version of the repo: `!pip install git+https://github.com/jalammar/ecco.git`

My guess would be that they put them in the same string separated by the [SEP] token. The paper or model docs should explain that, I think.

@Lim-Sung-Jun There are examples of using BERT for neuron activation, but not for primary feature attribution/saliency. That hasn't been built out yet. I have created issue #64 to track the...

The latest version in the repo allows you to use any model you want (if you provide some configuration info). Which one would you like to use?
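The configuration info mirrors the fields in model-config.yaml shown above. As a sketch, here is what an entry might look like expressed as a plain Python dict for a hypothetical GPT-2-style causal LM (the module paths and values here are illustrative assumptions, not a tested config):

```python
# Hypothetical config for a GPT-2-style causal LM, mirroring the fields
# used in ecco's model-config.yaml. Module paths are assumptions.
custom_model_config = {
    "embedding": "transformer.wte",        # embedding module path (assumed)
    "type": "causal",                      # 'causal' here vs. 'mlm' for BERT-style models
    "activations": [r"\d+\.mlp\.c_proj"],  # regex selecting FFN output layers (assumed)
    "token_prefix": "Ġ",                   # BPE marker for tokens that start a word
    "partial_token_prefix": "",
}

# The same field names appear in the roberta-base entry in model-config.yaml.
required = {"embedding", "type", "activations", "token_prefix", "partial_token_prefix"}
assert required <= set(custom_model_config)
```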

Thanks for sharing this @gorkemgoknar. I've created a notebook that guides you through finding the config for a model: https://github.com/jalammar/ecco/blob/main/notebooks/Identifying%20model%20configuration.ipynb