Jafar Isbarov
Following [this](https://github.com/FluxML/Metalhead.jl/pull/136) discussion, I closed that PR and opened this one, in which I have added the [FastAI implementation](https://github.com/FluxML/FastAI.jl/blob/67ab2ffd5170ff7e3b2efeb3609f5ab981990741/src/Vision/models/unet.jl) of UNet. The model itself seems to have some problems (they exist...
I am not sure whether this is intentional, but when the input table has missing values, the `assert_continuous` function throws `AssertionError: columns must hold continuous variables`, even though all other values...
The test directory mentioned in the README is apparently out of date. Sorry for such a minor pull request, but I suspect I am not the only one who will be...
`torch.load(model_file)` does not work if the device is set to CPU, so I replaced all of these calls with `torch.load(model_file, map_location=device)` which is more device-agnostic.
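A minimal sketch of the device-agnostic loading pattern described above; the `nn.Linear` model and the file path are hypothetical stand-ins for the repo's actual checkpoint:

```python
import os
import tempfile

import torch
import torch.nn as nn

# Hypothetical stand-in for the repo's model.
model = nn.Linear(4, 2)

with tempfile.TemporaryDirectory() as tmp:
    model_file = os.path.join(tmp, "model.pt")
    torch.save(model.state_dict(), model_file)

    device = torch.device("cpu")
    # Without map_location, torch.load tries to restore each tensor on the
    # device it was saved from, which fails on a CPU-only machine when the
    # checkpoint came from a GPU. map_location=device remaps all tensors.
    state = torch.load(model_file, map_location=device)
    model.load_state_dict(state)
```

After loading, every parameter lives on the requested device, regardless of where the checkpoint was written.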
TensorBoard.dev has been shut down as of January 1, 2024. How can we access the experiment logs?
### Confirm that this is a metadata correction

- [X] I want to file corrections to make the metadata match the PDF file hosted on the ACL Anthology.

### Anthology...
### Anthology ID

2024.sigturk-1.2

### Type of Change

Revision

### PDF of the Revision or Erratum

[allma_v3.pdf](https://github.com/user-attachments/files/16639794/allma_v3.pdf)

### Brief Description of Changes

This revision contains the final version that was...
### Feature request

Add an `overwrite` argument to the `push_to_hub` method.

### Motivation

I want to overwrite a repo without deleting it on Hugging Face. Is this possible? I couldn't...
Transformers.jl models require `NamedTuple` input, while ExplainableAI.jl analyzers require inputs that are subtypes of `AbstractArray`. We can resolve this by modifying XAIBase.jl and ExplainableAI.jl to support the Transformers.jl interface. I can start working...
Thanks for making this open-source! The following function checks for the `_pad_token` attribute:

```python
def _tokenize(self, text_sample):
    if self.tokenizer._pad_token is None:
        # Some tokenizers (e.g. GPT2 tokenizer) have no padding token...
```