multinerf
train Mip-NeRF 360 with my dataset
Thank you for your work. I am a beginner with NeRF, and I want to use my own dataset to reproduce the results of Mip-NeRF 360. As shown below, the environment is installed and the tests pass. I also generated the data via "bash scripts/local_colmap_and_resize.sh ${DATA_DIR}".
`(fy_multinerf) fangyi@dreamtech-3080TI:~/workspace/multinerf$ ./scripts/run_all_unit_tests.sh
Ran 1 test in 3.078s
OK ....
Ran 4 tests in 13.085s
OK ...............................Mean Error = 0.0803162083029747, Tolerance = 0.1 .Mean Error = 0.08638705313205719, Tolerance = 0.1 ........................
Ran 56 tests in 103.219s
OK ............PE of degree 5 has a maximum error of 2.5369226932525635e-06 .PE of degree 10 has a maximum error of 6.4849853515625e-05 .PE of degree 15 has a maximum error of 0.002378210425376892 .PE of degree 20 has a maximum error of 0.11622805148363113 .PE of degree 25 has a maximum error of 1.999955415725708 .PE of degree 30 has a maximum error of 1.9999704360961914 ....
Ran 21 tests in 30.296s
OK ......
Ran 6 tests in 6.087s
OK ..
Ran 2 tests in 5.443s
OK .
Ran 1 test in 1.036s
OK .
Ran 1 test in 2.281s
OK .......
Ran 7 tests in 5.478s
OK ..........................................
Ran 42 tests in 41.142s
OK`
When I run the command
`python -m train \
--gin_configs=configs/360.gin \
--gin_bindings="Config.data_dir = '${DATA_DIR}'" \
--gin_bindings="Config.checkpoint_dir = '${DATA_DIR}/checkpoints'" \
--logtostderr`
it reports an error and training cannot continue:
`2023-05-15 10:27:16.006656: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
I0515 10:27:16.990981 140038417268864 xla_bridge.py:440] Unable to initialize backend 'rocm': NOT_FOUND: Could not find registered platform with name: "rocm". Available platform names are: Interpreter Host CUDA
I0515 10:27:16.991207 140038417268864 xla_bridge.py:440] Unable to initialize backend 'tpu': module 'jaxlib.xla_extension' has no attribute 'get_tpu_client'
I0515 10:27:16.991323 140038417268864 xla_bridge.py:440] Unable to initialize backend 'plugin': xla_extension has no attributes named get_plugin_device_client. Compile TensorFlow with //tensorflow/compiler/xla/python:enable_plugin_device set to true (defaults to false) to enable this.
/home/fangyi/.conda/envs/fy_multinerf/lib/python3.9/site-packages/jax/_src/xla_bridge.py:643: UserWarning: jax.host_id has been renamed to jax.process_index. This alias will eventually be removed; please update your code.
  warnings.warn(
Traceback (most recent call last):
  File "/home/fangyi/.conda/envs/fy_multinerf/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/fangyi/.conda/envs/fy_multinerf/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/clusters/data_3090_0/USER_DATA/fangyi/multinerf/train.py", line 288, in <module>
    app.run(main)
  File "/home/fangyi/.conda/envs/fy_multinerf/lib/python3.9/site-packages/absl/app.py", line 308, in run
    _run_main(main, args)
  File "/home/fangyi/.conda/envs/fy_multinerf/lib/python3.9/site-packages/absl/app.py", line 254, in _run_main
    sys.exit(main(argv))
  File "/clusters/data_3090_0/USER_DATA/fangyi/multinerf/train.py", line 55, in main
    dataset = datasets.load_dataset('train', config.data_dir, config)
  File "/clusters/data_3090_0/USER_DATA/fangyi/multinerf/internal/datasets.py", line 52, in load_dataset
    return dataset_dict[config.dataset_loader](split, train_dir, config)
  File "/clusters/data_3090_0/USER_DATA/fangyi/multinerf/internal/datasets.py", line 295, in __init__
    self._load_renderings(config)
  File "/clusters/data_3090_0/USER_DATA/fangyi/multinerf/internal/datasets.py", line 627, in _load_renderings
    image_paths = [os.path.join(image_dir, colmap_to_image[f])
  File "/clusters/data_3090_0/USER_DATA/fangyi/multinerf/internal/datasets.py", line 627, in <listcomp>
    image_paths = [os.path.join(image_dir, colmap_to_image[f])
KeyError: '.ipynb_checkpoints/5316-checkpoint.jpg'`
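Judging from the final `KeyError`, a plausible cause (an assumption on my part, not confirmed in this thread) is that Jupyter created hidden `.ipynb_checkpoints/` folders inside the images directory, so the dataset loader finds image files that COLMAP never registered. A minimal sketch that strips those folders before re-running the COLMAP script; the `remove_checkpoint_dirs` helper and the example path are hypothetical, not part of multinerf:

```python
import os
import shutil


def remove_checkpoint_dirs(root):
    """Delete every Jupyter '.ipynb_checkpoints' folder under root.

    Returns the list of removed directory paths.
    """
    removed = []
    # Walk bottom-up so removing a directory never confuses the traversal.
    for dirpath, dirnames, _ in os.walk(root, topdown=False):
        for d in dirnames:
            if d == ".ipynb_checkpoints":
                full = os.path.join(dirpath, d)
                shutil.rmtree(full)
                removed.append(full)
    return removed


if __name__ == "__main__":
    # Hypothetical dataset location; replace with your own ${DATA_DIR}.
    for path in remove_checkpoint_dirs(os.path.expanduser("~/my_360_scene")):
        print("removed", path)
```

After cleaning, re-running `bash scripts/local_colmap_and_resize.sh ${DATA_DIR}` may be necessary so the COLMAP model and the image directory agree again.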
Looking forward to your answer, and thank you!
Hi, I've encountered the same problem.
Were you able to solve it?
What is going on here?