Deprecate `torch_dtype`
torch_dtype -> dtype
https://github.com/intel/auto-round/issues/776
This change cannot be applied directly, since users running Transformers versions below 4.57 would no longer be supported.
As said in the comment: do we need to upgrade the transformers pin in requirements (https://github.com/intel/auto-round/blob/3c1a678152579bac7ff51b5a6b64076bc792d728/requirements.txt#L12)? Will it bring other problems?
Not a good option for now.
There are two options:
1. Wait another 2 or 3 months and then upgrade the transformers requirement.
2. Try the approach GPT suggested:
```python
import functools
import warnings

def support_legacy_keys(legacy_map: dict):
    """
    legacy_map: dict, key = new argument name, value = legacy argument name
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for new_key, old_key in legacy_map.items():
                if old_key in kwargs:
                    if new_key in kwargs:
                        raise ValueError(f"Cannot use both '{new_key}' and '{old_key}'")
                    warnings.warn(f"'{old_key}' is deprecated, use '{new_key}' instead", DeprecationWarning)
                    kwargs[new_key] = kwargs.pop(old_key)
            return func(*args, **kwargs)
        return wrapper
    return decorator
```

Usage example:

```python
@support_legacy_keys({"dtype": "torch_dtype"})
def quan(*, dtype=None):
    print(f"dtype = {dtype}")

# Call styles
quan(dtype="torch.float16")        # new argument
quan(torch_dtype="torch.float16")  # legacy argument, mapped automatically with a deprecation warning
```
For option 2, we would also need to add a check on the transformers version. I prefer option 1. Do we still have examples depending on transformers v4.38-v4.57?
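For reference, the version check mentioned above could be sketched roughly like this (a minimal sketch; the helper names are hypothetical, and the 4.57 threshold follows the "below 4.57" note earlier in this thread):

```python
def _parse_version(v: str):
    # Crude parse sufficient for comparing "major.minor" release numbers.
    return tuple(int(p) for p in v.split(".")[:2])

def dtype_kwarg(dtype, transformers_version: str):
    """Return a kwargs dict keyed by `dtype` on transformers >= 4.57, else `torch_dtype`."""
    if _parse_version(transformers_version) >= (4, 57):
        return {"dtype": dtype}
    return {"torch_dtype": dtype}
```

In practice `transformers_version` would come from `transformers.__version__`; callers would then pass `**dtype_kwarg(...)` into `from_pretrained`.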
This should not depend on our examples or perspective; we need to make decisions from the users’ point of view