Install DeepSpeed with cpu-only

khangt1k25 opened this issue 4 years ago • 10 comments

Can we install DeepSpeed without CUDA? And if yes, how can I do that? I don't see it in the documentation. Thanks

khangt1k25 avatar Dec 18 '20 10:12 khangt1k25

I know this is not an issue per se, but it's also not clear to me yet whether DeepSpeed is CUDA-only or whether it can be used to scale a training task across multiple CPU-only hosts.

a0n avatar Apr 10 '21 03:04 a0n

any updates?

haruya-umemoto avatar Jun 01 '21 04:06 haruya-umemoto

Could this be done?

AbhayGoyal avatar Apr 29 '23 19:04 AbhayGoyal

Yes, it is possible to install DeepSpeed in a CPU-only environment. Currently, only the CPUAdam and CPUAdagrad features are available. However, more features will be enabled soon thanks to Intel contributions (https://github.com/microsoft/DeepSpeed/pull/3041).

tjruwase avatar May 01 '23 18:05 tjruwase
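As a quick illustration (not from this thread, and assuming a plain `pip install deepspeed` on a machine without a CUDA toolkit), one way to check that the CPU Adam extension can actually be built in such an environment:

```python
# Hedged sketch: verify that the C++ CPUAdam op can be JIT-built on a CUDA-free machine
# after a plain `pip install deepspeed`.
from deepspeed.ops.op_builder import CPUAdamBuilder

builder = CPUAdamBuilder()
print("CPUAdam compatible:", builder.is_compatible())  # True means the op should build here
```

Running `ds_report` from the command line gives a similar compatibility summary for all ops.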

Could you point me to the code for that? I could not find it.

AbhayGoyal avatar May 01 '23 19:05 AbhayGoyal

Earlier PRs: https://github.com/microsoft/DeepSpeed/pull/2507 and https://github.com/microsoft/DeepSpeed/pull/2775

CI: https://github.com/microsoft/DeepSpeed/blob/master/.github/workflows/nv-torch-latest-cpu.yml

tjruwase avatar May 01 '23 19:05 tjruwase

Hi, I meant an example of how to use it.

AbhayGoyal avatar May 01 '23 23:05 AbhayGoyal

It depends on the feature you want to use. As I said earlier, only CPUAdam and CPUAdagrad are currently available in CPU-only environments, and in both cases the usage is exactly the same as in GPU environments. For example, the following CPUAdam perf test works seamlessly in either a CPU-only or a GPU environment: https://github.com/microsoft/DeepSpeed/blob/master/tests/perf/adam_test.py

As more features are enabled, they will be accompanied by relevant documentation. However, our goal is to minimize user-visible differences.

tjruwase avatar May 02 '23 11:05 tjruwase
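To make the "same usage on CPU and GPU" point concrete, here is a minimal sketch in the spirit of that perf test (illustrative only; the model and sizes are made up):

```python
# Hedged sketch: DeepSpeedCPUAdam as a drop-in torch optimizer on a CPU-only host,
# mirroring the pattern in tests/perf/adam_test.py.
import torch
from deepspeed.ops.adam import DeepSpeedCPUAdam

model = torch.nn.Linear(1024, 1024)                # parameters stay on the CPU
optimizer = DeepSpeedCPUAdam(model.parameters(), lr=1e-3)

x = torch.randn(8, 1024)
loss = model(x).sum()
loss.backward()
optimizer.step()                                   # Adam update runs in the C++ CPU kernel
optimizer.zero_grad()
```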

Thank you very much for your reply. Also, would this work for inference? Say I want to use an LLM and only do inference with DeepSpeed? Is there an example code sample for that as well?

AbhayGoyal avatar May 02 '23 14:05 AbhayGoyal

@AbhayGoyal, yes, the Intel PRs would enable CPU inference. Please keep an eye on those. You can also ask questions directly on those PRs.

tjruwase avatar May 02 '23 17:05 tjruwase
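For readers looking for the inference side, a rough sketch of the usual `deepspeed.init_inference` pattern (illustrative only; whether this runs on a CPU-only host depends on the Intel PRs referenced above, and the GPT-2/transformers choice is just for the example):

```python
# Hedged sketch: standard DeepSpeed inference entry point.
# CPU-only support depends on the Intel CPU-inference work referenced in this thread.
import torch
import deepspeed
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

engine = deepspeed.init_inference(model, dtype=torch.float32)

inputs = tokenizer("DeepSpeed on CPU:", return_tensors="pt")
outputs = engine.module.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```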

Hi @AbhayGoyal - closing this issue for now, since we've recommended the right way to follow up and the relevant CI tests are linked above.

If you have other questions, please open a new issue and link this one, and we would be happy to answer.

loadams avatar Aug 18 '23 21:08 loadams