
Cannot import amp from apex

Open rrryan2016 opened this issue 5 months ago • 3 comments

I carefully installed apex following the instructions:

git clone https://github.com/NVIDIA/apex
cd apex
pip install -v --disable-pip-version-check --no-cache-dir --no-build-isolation --config-settings "--build-option=--cpp_ext" --config-settings "--build-option=--cuda_ext" ./

and the installation succeeds, as the terminal shows:

...
  adding 'apex-0.1.dist-info/top_level.txt'
  adding 'apex-0.1.dist-info/RECORD'
  removing build/bdist.linux-x86_64/wheel
  Building wheel for apex (pyproject.toml) ... done
  Created wheel for apex: filename=apex-0.1-cp39-cp39-linux_x86_64.whl size=4903869 sha256=0f241f2a7b54288cda80b2d10cb87133ccabba4cfd3b42972ee42e9d90fcbb31
  Stored in directory: /tmp/pip-ephem-wheel-cache-g9drhe5a/wheels/2e/1e/59/cdf980e888f7295fcfa718c8200ed2a83df4eda22ace6996e5
Successfully built apex
Installing collected packages: apex
Successfully installed apex-0.1

Then I can import apex, but when I run from apex import amp, it shows:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: cannot import name 'amp' from 'apex' (/home/yons/anaconda3/envs/py39/lib/python3.9/site-packages/apex/__init__.py)

Any suggestion to solve it?

Env: Ubuntu, CUDA 12.8, python 3.9, RTX 4090D, torch 2.7.1+cu128

rrryan2016 avatar Jun 18 '25 08:06 rrryan2016

It's expected on the master branch, as amp was removed. apex.amp has been deprecated for a few years; what would you want to use it for?

crcrpar avatar Jun 20 '25 03:06 crcrpar

> It's expected on the master branch, as amp was removed. apex.amp has been deprecated for a few years; what would you want to use it for?

I intend to reproduce a repo, https://github.com/ShikunLi/Sel-CL, which includes amp usage such as:

@amp.autocast()

model, optimizer = amp.initialize(model, optimizer, opt_level="O1", num_losses=2)

with amp.scale_loss(args.lambda_s*loss_simi, optimizer, loss_id=1) as scaled_loss:

Any suggestions for modifying the code? It's okay not to accelerate by means of apex.

rrryan2016 avatar Jun 23 '25 06:06 rrryan2016

As I said, I'd not recommend apex.amp at all, in favor of torch.autocast and torch.amp. If you just want to run the script, you may want to try https://github.com/NVIDIA/apex/releases/tag/25.04 instead; it's the tag before the apex.amp removal.

An alternative would be an NGC PyTorch container with a PyTorch version close to the one the repo mentions, e.g. https://docs.nvidia.com/deeplearning/frameworks/pytorch-release-notes/rel_21-03.html#rel_21-03

crcrpar avatar Jun 23 '25 07:06 crcrpar

Closing, as a workaround (WAR) has been suggested.

nWEIdia avatar Oct 03 '25 22:10 nWEIdia