olmocr
Any plans for releasing a Docker environment?
🚀 The feature, motivation and pitch
Thank you for this amazing project. Setting up an environment for inference takes a considerable amount of time on our SLURM clusters. Are there any plans to release a public Docker image for inference?
Alternatives
No response
Additional context
No response
check this out! https://github.com/allenai/olmocr/issues/75
this image works well: docker://nvidia/cuda:12.2.0-devel-ubuntu22.04
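In case it helps other SLURM users: below is a rough sketch of how that base image could be wrapped in an Apptainer/Singularity definition file. The apt package list, the `pip install olmocr` step, and the pipeline invocation are my assumptions, not an official recipe from this repo, so adjust them to your cluster.

```
# Rough sketch of an Apptainer definition file on top of the image above.
# Package list and olmocr install/run lines are assumptions, not official.
Bootstrap: docker
From: nvidia/cuda:12.2.0-devel-ubuntu22.04

%post
    # Base image ships without Python; add it plus poppler-utils for PDF rendering
    apt-get update && apt-get install -y python3 python3-pip poppler-utils
    pip3 install olmocr

%runscript
    # Forward all arguments to the olmocr pipeline entry point
    exec python3 -m olmocr.pipeline "$@"
```

You would then build once with `apptainer build olmocr.sif olmocr.def` and run inference inside an sbatch script with something like `apptainer run --nv olmocr.sif ./localworkspace --pdfs ./your_docs/*.pdf` (the `--nv` flag exposes the host GPUs to the container).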