fairseq
How to use different kinds of optimizers in one task?
❓ Questions and Help
What is your question?
Hey! I want to use different optimizers for different groups of parameters in my model. However, when fairseq builds the optimizer, I cannot distinguish the parameters from one another, for example (trainer.py):
def _build_optimizer(self):
    params = list(
        filter(
            lambda p: p.requires_grad,
            chain(self.model.parameters(), self.criterion.parameters()),
        )
    )
This only gives me the parameters without their key names unless I rewrite the code, which makes it difficult to divide the parameters into different groups and assign each group its own optimizer. How can I solve this problem?
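One possible workaround, sketched below with a toy model (the module names and the `"1."` prefix are illustrative assumptions, not fairseq's actual naming): use `named_parameters()` instead of the anonymous `parameters()` chain, split the parameters by name prefix, and build a separate optimizer per group.

```python
import torch
import torch.nn as nn

# Toy stand-in for a fairseq model; in practice this would be self.model.
model = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))

# Split parameters by key name rather than iterating parameters() anonymously.
decoder_params, other_params = [], []
for name, p in model.named_parameters():
    if not p.requires_grad:
        continue
    # "1." matches the second Linear in this toy model; with a real model
    # you would match a meaningful prefix such as "decoder.".
    (decoder_params if name.startswith("1.") else other_params).append(p)

# One optimizer per parameter group, each with its own hyperparameters.
opt_a = torch.optim.Adam(other_params, lr=1e-3)
opt_b = torch.optim.SGD(decoder_params, lr=1e-1)
```

A lighter alternative, if a single optimizer class suffices, is to pass per-group hyperparameters via PyTorch's `param_groups` mechanism (a list of dicts with a `"params"` key) instead of instantiating two optimizers.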
Code
What have you tried?
What's your environment?
- fairseq Version (e.g., 1.0 or main): 1.0.0a0
- PyTorch Version (e.g., 1.0): 1.9.0
- OS (e.g., Linux): Linux
- How you installed fairseq (pip, source): source
- Build command you used (if compiling from source): pip install -e ./
- Python version: 3.8
- CUDA/cuDNN version: 11.3
- GPU models and configuration: A40
- Any other relevant information:
see https://github.com/facebookresearch/fairseq/issues/4563