
build without conda

Open · ShivaKothuru opened this issue 4 years ago · 6 comments

📚 Documentation

Can I have this build without conda? If yes, please specify the steps.

Thank you.

ShivaKothuru commented on Dec 12, 2020

Thank you for the question! We implemented Nimble on top of PyTorch (our code is inserted into the PyTorch codebase), so you need to build PyTorch from source to install Nimble. According to the official PyTorch guidelines, the PyTorch team highly recommends building PyTorch in an Anaconda environment, and there is no documentation on building PyTorch without Anaconda.

So I also recommend building PyTorch (and Nimble) in a conda environment. We haven't tried building PyTorch without conda so far.

gyeongin commented on Dec 15, 2020

Thank you @gyeongin

I would like to run Nimble on a Jetson device (e.g. Jetson Nano, TX2) for faster inference, so I would like to know whether it can be built without conda. Also, can we build a simplified version just for inference?

ShivaKothuru commented on Dec 15, 2020

For the first question about Jetson devices, there is a document on building PyTorch from source on Jetson devices. I think you can also build Nimble if you follow those instructions. (I cannot verify this myself because I don't have a Jetson :cry:)
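
If you do try it, a quick sanity check after the source build might look like the sketch below. These are standard PyTorch calls only, nothing Nimble-specific, and the tensor sizes are arbitrary.

```python
# Post-build sanity check on the Jetson (standard PyTorch calls only).
import torch

print(torch.__version__)                  # should report the source-built version
print(torch.cuda.is_available())          # True if the build picked up the Jetson GPU
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # e.g. the Tegra/Jetson GPU name
    x = torch.ones(2, 2, device="cuda")
    print((x + x).cpu())                  # tiny kernel launch to confirm execution
```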

For the second question, our team is currently discussing an inference-only version of Nimble that would not modify the PyTorch codebase and so could be installed without building PyTorch from source. For now, however, no inference-only version is available.

gyeongin commented on Dec 16, 2020

Hi @ShivaKothuru, were you able to run Nimble on Jetson devices? If so, could you please share the steps? Thank you!

eash3010 commented on Nov 1, 2021

> For the second question, our team is currently discussing an inference-only version of Nimble that would not modify the PyTorch codebase and so could be installed without building PyTorch from source. For now, however, no inference-only version is available.

Hi @gyeongin, could you say more about how the inference-only version would be implemented without modifying the PyTorch codebase? Does that mean it would be implemented in Python, running the tasks by calling existing PyTorch functions? Would you use the PyTorch CUDA graph API? Thanks!
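
To make the last question concrete, here is a minimal sketch of capturing and replaying one inference pass with PyTorch's CUDA graph API (available since torch 1.10). It only illustrates what I mean by "CUDA graph API"; it is not Nimble's implementation, and the toy model and tensor shapes are made up for the example.

```python
# Minimal CUDA graph capture/replay sketch (torch >= 1.10). Not Nimble's code.
import torch

model = torch.nn.Sequential(
    torch.nn.Linear(1024, 1024), torch.nn.ReLU(), torch.nn.Linear(1024, 10)
).cuda().eval()
static_input = torch.randn(8, 1024, device="cuda")

# Warm up on a side stream so lazy initialization is not captured.
s = torch.cuda.Stream()
s.wait_stream(torch.cuda.current_stream())
with torch.cuda.stream(s), torch.no_grad():
    for _ in range(3):
        model(static_input)
torch.cuda.current_stream().wait_stream(s)

# Capture one inference pass into a CUDA graph.
g = torch.cuda.CUDAGraph()
with torch.cuda.graph(g), torch.no_grad():
    static_output = model(static_input)

# Replay: copy new data into the captured input buffer and relaunch the whole
# graph with a single call, avoiding per-kernel launch overhead.
static_input.copy_(torch.randn(8, 1024, device="cuda"))
g.replay()
result = static_output.clone()
```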

WandererCU commented on Nov 5, 2021

Waiting for the answer to the above question.

umairjavaid commented on Dec 15, 2021