
caching doesn't happen in home folder

stephantul opened this issue on Jul 24, 2023 · 0 comments

Environment info

  • adapter-transformers version: 3.2.1
  • Platform: macOS-12.6.7-arm64-arm-64bit
  • Python version: 3.10.12
  • Huggingface_hub version: 0.16.4
  • PyTorch version (GPU?): 2.0.1 (False)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: no
  • Using distributed or parallel set-up in script?: no

Information

Loading an adapter caches it in a folder literally named "~" inside the current working directory. In other words, the home directory path is not correctly expanded.

https://raw.githubusercontent.com/Adapter-Hub/Hub/master/dist/v2/index/bert-base-uncased.json 
not found in cache or force_download set to True, downloading to 
/Users/username/path/adapter_test/~/.cache/torch/adapters/tmp3odzx_lk
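
The path in the log suggests the default cache location is defined as a string containing "~" and then used without expansion. A minimal sketch of the apparent problem and of the fix, assuming the constant below mirrors the library's default (the name ADAPTER_CACHE is only illustrative):

import os

# Hypothetical default, mirroring the path shown in the log output above
ADAPTER_CACHE = "~/.cache/torch/adapters"

# Used as-is, any downloader writing to this path creates a literal "~"
# directory relative to the current working directory
broken = os.path.join(ADAPTER_CACHE, "tmpfile")

# Expanding the user prefix first resolves it to the real home directory
fixed = os.path.join(os.path.expanduser(ADAPTER_CACHE), "tmpfile")

print(broken)  # ~/.cache/torch/adapters/tmpfile
print(fixed)   # /Users/<username>/.cache/torch/adapters/tmpfile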

To reproduce

Run the example from the quickstart guide.

import os

import torch
from transformers import BertTokenizer
from transformers.adapters import BertAdapterModel, AutoAdapterModel

# Load pre-trained BERT tokenizer from HuggingFace
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')

# An input sentence
sentence = "It's also, clearly, great fun."

# Tokenize the input sentence and create a PyTorch input tensor
input_data = tokenizer(sentence, return_tensors="pt")

# Load pre-trained BERT model from HuggingFace Hub
# The `BertAdapterModel` class is specifically designed for working with adapters
# It can be used with different prediction heads
model = BertAdapterModel.from_pretrained('bert-base-uncased')

# Load pre-trained task adapter from Adapter Hub
# This method call will also load a pre-trained classification head for the adapter task
adapter_name = model.load_adapter("sentiment/sst-2@ukp", config='pfeiffer')

# Activate the adapter we just loaded, so that it is used in every forward pass
model.set_active_adapters(adapter_name)

# Predict output tensor
outputs = model(**input_data)

# Retrieve the predicted class label
predicted = torch.argmax(outputs[0]).item()
assert predicted == 1
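
After running the snippet, the stray folder can be confirmed from Python without touching the shell (a small check added here for illustration; the folder name comes from the log above):

import os

# With adapter-transformers 3.2.1 this prints True: a literal "~" entry
# shows up in the current working directory instead of the home cache
print("~" in os.listdir("."))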

Expected behavior

Use the .cache folder in my home directory. Having a folder literally named "~" is extremely dangerous: if a user tries to delete it with rm -rf ~, the shell expands ~ to their home directory and wipes that instead 😄
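
For anyone who already ended up with the stray folder, a cleanup sketch that avoids shell tilde expansion entirely (an illustrative workaround, not part of the library; double-check the resolved path before deleting):

import os
import shutil

# Refer to the folder by an explicit absolute path so nothing expands "~"
stray = os.path.join(os.getcwd(), "~")

# Sanity check: make sure this is the stray folder, not the real home directory
if os.path.isdir(stray) and stray != os.path.expanduser("~"):
    shutil.rmtree(stray)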

stephantul · Jul 24 '23 11:07