Issue with Multi-GPU Distributed Computing Using UCE Model
Hi there,
I encountered a problem while calculating embeddings with the UCE model and setting args.multi_gpu=True. I received an error: AttributeError: 'TransformerModel' object has no attribute 'module'. Could you please help me understand what might be causing this and how to resolve it?
Thank you!
Hi, are you launching the command with the Hugging Face Accelerate launcher (accelerate launch)? That error usually means the model was never wrapped for distributed training: the .module attribute is only added by the DistributedDataParallel wrapper that a multi-process launch sets up, so running the script as a plain single-process python command with args.multi_gpu=True will fail exactly like this.
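For reference, here is a minimal illustrative sketch (not the actual UCE source) of why the attribute only appears in a properly launched multi-GPU run:

```python
# Illustrative sketch only, not the UCE code: why `.module` exists only on a
# model that has been wrapped for distributed (multi-GPU) training.
import torch.nn as nn

model = nn.Linear(8, 8)          # stand-in for UCE's TransformerModel
print(hasattr(model, "module"))  # False -> model.module raises AttributeError

# What a multi-process `accelerate launch script.py` run effectively does:
# it wraps the model in DistributedDataParallel, which exposes the original
# model as `.module`:
#
#   wrapped = torch.nn.parallel.DistributedDataParallel(model)
#   wrapped.module is model  # True, so code paths guarded by args.multi_gpu work
#
# DDP needs an initialized process group, which is why the wrap is only shown
# in comments here; launching through accelerate sets that up for you.
```

So instead of running the script with a plain python command, try starting it through accelerate launch (after running accelerate config for your GPUs), keeping args.multi_gpu=True; substitute your actual UCE entry script for the placeholder name above.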