
What resources are needed to train from scratch?

reynoldscem opened this issue 9 months ago · 1 comment

Hi,

Could you please share which GPUs were used, how many of them, and roughly how many hours were required to train these models?

reynoldscem (May 01 '24 16:05)

Hi,

Please refer to this issue: https://github.com/3DTopia/OpenLRM/issues/2#issuecomment-1882590904. There isn't much difference for the V1.1 models.

Please also note that you can use fewer resources, e.g. 8 A100 GPUs, by configuring the gradient accumulation steps in the config file.
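For context, gradient accumulation splits each optimizer update across several micro-batches, so the effective batch size (num_gpus × batch_per_gpu × accum_steps) stays constant when fewer GPUs are available, at the cost of longer wall-clock time. Below is a minimal PyTorch-style sketch of the idea; the function name and `accum_steps` value are illustrative, not OpenLRM's actual trainer or config keys.

```python
import torch

def train_one_epoch(model, loader, optimizer, accum_steps=4):
    """Illustrative gradient accumulation loop (not OpenLRM's trainer).

    With 4x fewer GPUs, setting accum_steps=4 keeps the effective batch
    size the same as the original multi-GPU setup.
    """
    model.train()
    optimizer.zero_grad()
    for step, (inputs, targets) in enumerate(loader):
        outputs = model(inputs)
        loss = torch.nn.functional.mse_loss(outputs, targets)
        # Scale the loss so the accumulated gradient averages over the micro-batches.
        (loss / accum_steps).backward()
        if (step + 1) % accum_steps == 0:
            # One optimizer update per accum_steps micro-batches.
            optimizer.step()
            optimizer.zero_grad()
```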

ZexinHe (May 06 '24 09:05)