williamljb
Hi Ian786, Thanks for your interest! For installation, please check the guide [here](https://github.com/akanazawa/hmr). The prerequisites are the same. For training and test options, please see src_ortho/config.py.
Hi Ian786, 1. It is (b). The final visualized result will be on camera 0. 2. The faces in the list are the same as those of the SMPL model. This is just...
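In case it helps, here is a minimal sketch of using that faces list, assuming it is the `f` entry of the SMPL model file (13776 triangles over the 6890 SMPL vertices); `save_obj` and the argument names are just illustrative:

```python
import numpy as np

def save_obj(path, verts, faces):
    """Write a simple Wavefront OBJ.

    `verts` is a (6890, 3) array of SMPL vertices and `faces` the (13776, 3)
    triangle list (0-indexed), e.g. the 'f' entry of the SMPL model pickle.
    OBJ face indices are 1-based, hence the +1.
    """
    with open(path, "w") as f:
        for v in verts:
            f.write("v %f %f %f\n" % (v[0], v[1], v[2]))
        for tri in faces:
            f.write("f %d %d %d\n" % (tri[0] + 1, tri[1] + 1, tri[2] + 1))
```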
Hi Ian786, The problem is that there is an extra space between the two image paths. Try the following: python -m demo --img_paths data/image1.png,data/image2.png
Hi wine3603, Thanks for your interest! The generator provides the code that generates the dataset. Below are some rough steps. 1. Export pose and shape params from MoShed CMU...
Hi wine3603, It seems to be a bash failure due to incomplete downloads. Here is a relevant post: [https://github.com/Ultimaker/cura-build/issues/221#issuecomment-520835019](https://github.com/Ultimaker/cura-build/issues/221#issuecomment-520835019)
Hi wine3603, The command uses the following directories to find the libraries: -L/usr/lib64 -Ldependencies/lib -L/usr/lib64/atlas. Are your libraries located in /usr/lib64 or /usr/lib64/atlas? If not, you will need to...
Try replacing '-ltatlas -lsatlas' with '-latlas'. As I recall, ATLAS comes in many different versions with many different library names. Basically, '-l$name' matches libraries named 'lib$name.so'.
You will probably need to recompile the dependencies. Try 'make clean' and then 'make' in the dependency directory.
For the pose sampling, I simply used the CMU MoCap dataset. For the shape sampling, I used np.random.uniform(-3, 3, [20]).
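As a rough sketch of that sampling (the `cmu_poses` array below is a hypothetical placeholder for the axis-angle poses exported from the MoShed CMU MoCap data):

```python
import numpy as np

# Placeholder for the (N, 72) array of axis-angle SMPL poses exported from the
# MoShed CMU MoCap data; in practice this would be loaded from the exported files.
cmu_poses = np.zeros((100, 72))

pose = cmu_poses[np.random.randint(len(cmu_poses))]   # pick a random CMU MoCap pose
shape = np.random.uniform(-3, 3, [20])                 # sample shape params, as in the comment
```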
Hi sulgwyn, This is because the output global rotation is not perpendicular to the y-axis. You can visualize the results using the global rotations from the other views. Specifically, [here](https://github.com/williamljb/HumanMultiView/blob/master/demo.py#L241) it...
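As a rough sketch of what "using global rotations from the other views" could look like, assuming the HMR-style 85-dim output layout [cam(3), pose(72), shape(10)] where pose[:3] is the root orientation in axis-angle (the actual variable names and indices in demo.py may differ):

```python
import numpy as np

def swap_global_rotation(theta_src, theta_ref):
    """Return a copy of theta_src whose global (root) rotation is taken from theta_ref.

    Assumes the HMR-style 85-dim layout [cam(3), pose(72), shape(10)], so the
    root orientation (axis-angle) lives at indices 3:6.
    """
    theta = np.array(theta_src, copy=True)
    theta[3:6] = theta_ref[3:6]
    return theta

# e.g. visualize view 0's prediction under view 1's global rotation (hypothetical arrays):
# theta_vis = swap_global_rotation(thetas[0], thetas[1])
```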