
How can I apply smoothing to the video results?

Open · linxiang4200 opened this issue 1 year ago · 34 comments

Thank you for your great work! While running the video demo, I noticed some slight jitter in the movements of the model's limbs. Is this due to the lack of smoothing?

linxiang4200 avatar Aug 07 '23 12:08 linxiang4200

Hello @linxiang4200, I would suggest two things. If you imported the result into a 3D software, remove the Z data from the pelvis.

And if you are in Blender, there is a smooth option in the Graph Editor that could help you with this problem.

I noticed that, depending on the footage, you get much less jitter.

For example, with this video: https://github.com/shubham-goel/4D-Humans/assets/4061130/27c813a6-7dfd-40be-945a-ed9176fcfa64

I had much less jitter; take a look at the video below.

https://github.com/shubham-goel/4D-Humans/assets/4061130/5bae0c7f-e7e7-4855-9b25-261609390947

I'm not sure, but I feel that if there is good contrast between the character and the background in the footage, the result will be better.
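A minimal Blender-Python sketch of those two tweaks (untested; the bone name "pelvis" and the Z channel index are assumptions, and depending on the import orientation the vertical motion may live on a different channel):

import bpy

arm = bpy.context.object  # the imported 4D-Humans armature must be the active object
action = arm.animation_data.action

# 1) Drop the Z location curve of the pelvis bone to remove its vertical jitter.
for fc in list(action.fcurves):
    if fc.data_path == 'pose.bones["pelvis"].location' and fc.array_index == 2:
        action.fcurves.remove(fc)

# 2) For the remaining curves, the easiest route is interactive: select the keys in
#    the Graph Editor and run Key > Smooth Keys (bpy.ops.graph.smooth()), possibly
#    several times.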

carlosedubarreto avatar Aug 08 '23 10:08 carlosedubarreto


Thanks for your excellent advice, but I'm not good at Blender. In the end I found another way: use SmoothNet, like in this repo.

linxiang4200 avatar Aug 11 '23 06:08 linxiang4200

Hello @linxiang4200, you mean that you were able to use the code that starts here? https://github.com/haofanwang/CLIFF/blob/f38cad87cb9df34bf377129f9e5ebaaae54af51a/demo.py#L355C19-L355C20

It looks pretty interesting, I'll try it out. Thanks for pointing it out.

carlosedubarreto avatar Aug 11 '23 08:08 carlosedubarreto

@linxiang4200 I must say that your idea was great; with it, most of the results I got were much better.

Let me show a couple of tests I made.

The 0.05 is the default output from 4D-Humans, and the 0.06 is with SmoothNet.

https://github.com/shubham-goel/4D-Humans/assets/4061130/c61e0365-e198-42b0-8cba-d8545901c4f7

https://github.com/shubham-goel/4D-Humans/assets/4061130/c63aa0f1-3c9c-4001-babb-490b47f80480

https://github.com/shubham-goel/4D-Humans/assets/4061130/87937343-6864-4a15-8629-f7ba70a5d5e8

carlosedubarreto avatar Aug 12 '23 19:08 carlosedubarreto


Hi, how did you fix the "from mmhuman3d.utils.demo_utils import smooth_process" part? I combined it with track.py, but this error came out:

from mmhuman3d.utils.demo_utils import smooth_process
ModuleNotFoundError: No module named 'mmhuman3d'

k-a-s-o-u avatar Aug 14 '23 10:08 k-a-s-o-u


I used the tracking, and after the tracking I executed that smoothing code from CLIFF.

But to make it work I had to install mmcv 1.6 (I think) and download the source code of mmhuman3d, plus the source of pytorch3d. You don't need to build pytorch3d, but you do need the pytorch3d folder inside the directory where you are going to run the code, and the mmhuman3d source code there too.
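Roughly, that smoothing step looks like the sketch below (untested here; it mirrors the CLIFF demo.py call linked above, but the pkl layout, the expected array shape and the exact smooth_process arguments may differ between versions, and get_track_poses is just a placeholder for however you gather the poses from your pkl):

import joblib
import numpy as np
from mmhuman3d.utils.demo_utils import smooth_process

results = joblib.load('outputs/results/demo_video.pkl')  # pkl produced by track.py (path assumed)

# Gather the per-frame SMPL pose parameters of one tracked person into an array,
# e.g. shape (frames, 24, 3). How to do this depends on your pkl layout.
poses = np.asarray(get_track_poses(results))  # placeholder helper

# 'smoothnet_windowsize8' matches the SmoothNet config discussed later in this thread;
# mmhuman3d resolves that config file from a configs/.../post_processing folder,
# so that folder has to be reachable from the working directory.
smoothed = smooth_process(poses, smooth_type='smoothnet_windowsize8')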

carlosedubarreto avatar Aug 14 '23 10:08 carlosedubarreto


Thank you. I tried it, but I didn't understand how to place the sources of pytorch3d and mmhuman3d in the conda env without installing them. I will wait until someone integrates this function into 4D-Humans!

k-a-s-o-u avatar Aug 14 '23 14:08 k-a-s-o-u

You can place them in the same folder where you will run the smoothing script.

For example, on my installation you'll find it laid out as shown below. When you call from mmhuman3d.utils.demo_utils import smooth_process it works because there is a folder called mmhuman3d right there, and with that folder in place it doesn't need to be installed.

You can install it too, but I was having a hard time trying to install it.

image
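(The screenshot is not reproduced here; based on the description, the layout is roughly the following, where the exact file names are assumptions:)

4D-Humans/
    mmhuman3d/        <- source checkout of mmhuman3d (the package folder itself)
    pytorch3d/        <- source checkout of pytorch3d
    smooth.py         <- the smoothing script (from the smooth.zip below)
    outputs/results/  <- the pkl files produced by track.py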

And here is the code I made to run the smoothing: smooth.zip

carlosedubarreto avatar Aug 14 '23 15:08 carlosedubarreto

In that code I'm loading with pickle instead of joblib, because I made a Blender addon for it and didn't want to install joblib in Blender, as pickle comes with Python by default.

carlosedubarreto avatar Aug 14 '23 15:08 carlosedubarreto

Thank you! However I got an error in smooth.py:

File "D:\4D-Humans\smooth.py", line 13, in
    b = pickle.load(handle)
_pickle.UnpicklingError: invalid load key, 'x'.

I googled on Stack Overflow and GitHub but couldn't find a solution. I tried it on both Python 3.9 and 3.10 and got the same error. I installed mmcv 1.6.0 in the activated 4D-Humans conda env and placed the Pytorch3D and mmhuman3d source code folders in 4D-Humans as in the picture.

May I ask if you have any clue about this?

2023-08-15 18 45 03

(4D-Humans) D:\4D-Humans>pip list
Package    Version    Editable project location


absl-py 1.4.0 accelerate 0.21.0 addict 2.4.0 aiofiles 23.2.1 aiohttp 3.8.5 aiosignal 1.3.1 altair 5.0.1 annotated-types 0.5.0 antlr4-python3-runtime 4.9.3 anyio 3.7.1 asttokens 2.2.1 async-timeout 4.0.2 attrs 23.1.0 av 10.0.0 backcall 0.2.0 black 23.7.0 boto3 1.26.124 botocore 1.29.124 braceexpand 0.1.7 Brotli 1.0.9 cachetools 5.3.1 certifi 2023.7.22 charset-normalizer 3.2.0 chumpy 0.70 click 8.1.6 cloudpickle 2.2.1 colorama 0.4.6 colorlog 6.7.0 contourpy 1.1.0 cycler 0.11.0 Cython 3.0.0 decorator 5.1.1 detectron2 0.6 diffusers 0.19.3 dill 0.3.7 einops 0.6.1 encodec 0.1.1 exceptiongroup 1.1.2 executing 1.2.0 fastapi 0.101.0 ffmpy 0.3.1 filelock 3.9.0 fonttools 4.41.1 freetype-py 2.4.0 frozenlist 1.4.0 fsspec 2023.4.0 funcy 2.0 fvcore 0.1.5.post20221221 google-auth 2.22.0 google-auth-oauthlib 1.0.0 gradio 3.40.1 gradio_client 0.4.0 grpcio 1.56.2 h11 0.14.0 hmr2 0.0.0 d:\4d-humans httpcore 0.17.3 httpx 0.24.1 huggingface-hub 0.14.1 hydra-colorlog 1.2.0 hydra-core 1.3.2 hydra-submitit-launcher 1.2.0 idna 3.4 imageio 2.31.1 importlib-metadata 6.6.0 importlib-resources 6.0.1 iopath 0.1.9 ipython 8.14.0 jedi 0.18.2 Jinja2 3.1.2 jmespath 1.0.1 joblib 1.3.2 jsonschema 4.19.0 jsonschema-specifications 2023.7.1 kiwisolver 1.4.4 lazy_loader 0.3 lightning-utilities 0.9.0 linkify-it-py 2.0.2 Markdown 3.4.4 markdown-it-py 2.2.0 MarkupSafe 2.1.3 matplotlib 3.7.2 matplotlib-inline 0.1.6 mdit-py-plugins 0.3.3 mdurl 0.1.2 mmcv 1.6.0 mmhuman3d 0.11.0 mpmath 1.2.1 multidict 6.0.4 mypy-extensions 1.0.0 networkx 3.0 numpy 1.23.0 oauthlib 3.2.2 omegaconf 2.3.0 opencv-python 4.8.0.74 orjson 3.9.4 packaging 23.1 pandas 2.0.3 parso 0.8.3 pathspec 0.11.2 phalp 0.1.3 pickleshare 0.7.5 Pillow 9.3.0 pip 23.2.1 platformdirs 3.10.0 plyfile 1.0.1 portalocker 2.7.0 prompt-toolkit 3.0.38 protobuf 4.23.4 pure-eval 0.2.2 pyasn1 0.5.0 pyasn1-modules 0.3.0 pycocotools 2.0.6 pycparser 2.21 pydantic 2.1.1 pydantic_core 2.4.0 pydub 0.25.1 pyglet 2.0.9 Pygments 2.15.1 PyOpenGL 3.1.0 pyparsing 3.0.9 pyre-extensions 0.0.29 pyrender 0.1.45 pyrootutils 1.0.4 PySocks 1.7.1 PySoundFile 0.9.0.post1 python-dateutil 2.8.2 python-dotenv 1.0.0 python-multipart 0.0.6 pytorch-lightning 2.0.6 pytorch3d 0.7.4 pytube 15.0.0 pytz 2023.3 PyWavelets 1.4.1 pywin32 306 PyYAML 6.0 referencing 0.30.2 regex 2023.3.23 requests 2.31.0 requests-oauthlib 1.3.1 rich 13.5.0 rpds-py 0.9.2 rsa 4.9 Rtree 1.0.1 s3transfer 0.6.0 safetensors 0.3.2 scenedetect 0.6.2 scikit-image 0.21.0 scikit-learn 1.3.0 scipy 1.10.1 semantic-version 2.10.0 sentencepiece 0.1.99 setuptools 68.0.0 six 1.16.0 smplx 0.1.28 sniffio 1.3.0 stack-data 0.6.2 starlette 0.27.0 submitit 1.4.5 suno-bark 0.0.1a0 sympy 1.11.1 tabulate 0.9.0 tensorboard 2.13.0 tensorboard-data-server 0.7.1 termcolor 2.3.0 threadpoolctl 3.2.0 tifffile 2023.7.18 timm 0.9.2 tokenizers 0.13.3 tomli 2.0.1 toolz 0.12.0 torch 2.0.1 torchaudio 2.0.2+cu118 torchmetrics 1.0.1 torchvision 0.15.2 tqdm 4.65.0 traitlets 5.9.0 transformers 4.31.0 trimesh 3.22.5 typing_extensions 4.7.1 typing-inspect 0.8.0 tzdata 2023.3 uc-micro-py 1.0.2 urllib3 1.26.16 uvicorn 0.23.2 vedo 2023.4.6 wcwidth 0.2.6 webdataset 0.2.48 websockets 11.0.3 Werkzeug 2.3.6 wheel 0.41.0 win-inet-pton 1.1.0 xformers 0.0.20 yacs 0.1.8 yapf 0.40.1 yarl 1.9.2 zipp 3.15.0

k-a-s-o-u avatar Aug 15 '23 10:08 k-a-s-o-u

@k-a-s-o-u Your problem is probably that you are using my loading process, which uses pickle instead of joblib.

You can probably make it work by switching to a joblib load of the data.

For example, I wrote this code to convert the joblib pkl to pickle:

import joblib
import pickle
import os

# Locate the pkl produced by track.py and a destination for the converted copy.
path_addon = os.path.dirname(os.path.abspath(__file__))
base_file = os.path.join(path_addon, '4D-Humans-main', 'outputs', 'results')
file = os.path.join(base_file, 'demo_video.pkl')
file_converted = os.path.join(base_file, 'demo_video_converted.pkl')

# The pkl from track.py is saved with joblib, so load it with joblib...
results = joblib.load(file)

# ...and re-save it with plain pickle so it can be read without installing joblib.
with open(file_converted, 'wb') as handle:
    pickle.dump(results, handle, protocol=pickle.HIGHEST_PROTOCOL)

# Later, the converted file can be loaded with:
# with open(file_converted, 'rb') as handle:
#     b = pickle.load(handle)

You'll probably just need the results = joblib.load(file) part.

carlosedubarreto avatar Aug 15 '23 11:08 carlosedubarreto

@carlosedubarreto I just created the pkl file with 4D-Humans' track.py, and I used your import from blender.py when importing the pkl into Blender. I don't even have a demo_video_converted.pkl file, and I couldn't combine your suggested code with smooth.py. I struggled for a few hours but gave up; I'll wait until someone opens a pull request that integrates this in a future update! Thank you so much for your help anyway!


k-a-s-o-u avatar Aug 15 '23 16:08 k-a-s-o-u

@k-a-s-o-u Sorry about that. The code I sent is part of the addon I made, so it won't work just by executing it; you would need to analyze the output and adapt the code. The good part is that the most complicated part of the process is already in there.

I hope someone does a pull request adding SmoothNet to this repo, but maybe that won't happen.

If you allow me a suggestion, it would be great if you keep on trying. If you like the result you've seen and want it to be even better, you will only gain from that.

And you have one big advantage: me ☺️. I say that because I suffered a lot to make the code I did, and I'm willing to share the knowledge I gained from it. I can point out things you could do to make it work for you.

But one thing I won't do is make it "plug and play" like I do with the addons I make.

I say that because the addons I make are for people who don't want to understand the code and just want to use it, and because of that I charge them (not all the time, but most of the time lately ☺️; after all, it takes me weeks of coding so people don't have to think about the complexity of things). When someone wants to learn, I think that is a great thing, and I'm always trying to help people who want to learn.

So if you want to learn, if you want to go further, you can count on me, seriously. It may take a bit of time for me to answer, because I help a lot of people and have a job, freelance work and a business I'm building, but I will answer ☺️

I hope you understand and don't take what I said the wrong way; it is meant as a positive answer on my part 😀

carlosedubarreto avatar Aug 17 '23 07:08 carlosedubarreto

@carlosedubarreto I really appreciate your warm-hearted, productive suggestion! Eventually I got it working and obtained a smoothed pkl!

This is the result movie https://github.com/shubham-goel/4D-Humans/assets/29495485/635bc6cd-e41e-4c77-b18c-515810972bfd

By the way, I have one last question. I got the error below, and to fix it I simply created a new folder in that directory and copied smoothnet_windowsize8.py into it, following the error's indication, because I couldn't find which file controls the loading directory for smoothnet_windowsize8.py, even after searching in VSCode. I just want to know how you fixed that; did you edit some code?

File "D:\miniconda3\envs\4D-Humans\lib\site-packages\mmcv\utils\path.py", line 23, in check_file_exist
    raise FileNotFoundError(msg_tmpl.format(filename))
FileNotFoundError: file "D:\4D-Humans\configs_cliff_base_\post_processing\smoothnet_windowsize8.py" does not exist

Anyway, thank you so much for giving me the opportunity to seriously face the code and learn 😄, and for your big help! I solved the errors one by one, patiently; it took many hours, but I was impressed when it ran!!

k-a-s-o-u avatar Aug 17 '23 19:08 k-a-s-o-u

Hello @k-a-s-o-u,

I'm very happy that you were able to make it work. I thought you were on the right path and just wanted to give you a little push; that way you get two great things, the result and the happiness of making it yourself. And I must congratulate you, it's not an easy job 👏

Sorry for that error, I'm sure it was my fault.

Let me explain. As I had to integrate the CLIFF code with the 4D-Humans code, I changed the config folder name so I could tell where each part was coming from. 😅

You could rename it back to the original name, probably "configs" instead of "configs_cliff_base".

With that smoothing, things get better most of the time, but in some cases you will see that the translation becomes messy. If that happens, enable the smoothing only for the pose data and you'll be good to go.

In my addon I added an option to apply the smoothing to the pose, to the translation, or to both. By default it applies to both, but if the movement becomes strange, the user can apply it only to the pose and it will work.
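As a rough illustration, continuing the sketch from the earlier comment (how the pose and translation arrays are stored depends on your pkl):

# Smooth only the rotation/pose channels; leave the translation as it is,
# since smoothing it can make the root motion messy.
smoothed_pose = smooth_process(poses, smooth_type='smoothnet_windowsize8')
# then write smoothed_pose back into the results, keeping the original translation untouched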

Oh, another thing: I'm planning to try another code for smoothing the result, DarkPose, which was suggested to me by the Pose2Sim creator David Pagnon. If it works well, I'll post some info here.

carlosedubarreto avatar Aug 18 '23 06:08 carlosedubarreto

Hello @carlosedubarreto

I saw your add-on version of the pose/translation smoothing option. It's awesome work! I want to make something like it someday.

And the DarkPose smoothing you're planning to try sounds exciting! I look forward to it!

By the way, I noticed that the armature of 4D-Humans' motion-captured character is strange: the bones stand perpendicular to the body (left in the picture), whereas an armature is usually aligned with the anatomy, as in the model on the right. I looked into manually aligning armature bones in Blender without breaking the animation, but found no solution for this odd case. Do you know if this can be fixed?

2023-08-23 01 05 05

k-a-s-o-u avatar Aug 22 '23 16:08 k-a-s-o-u

This is "normal" behavior when Blender imports an FBX. There is an import option in Blender to align the bones automatically, but I think I tested it in the past and it messed with the result.

My honest opinion: don't bother with that. You can always retarget to another armature, or maybe use another armature as the target for the movements (if you have another character with the same bone structure and the same bone names, it will probably work when importing the motion in Blender; I didn't test it, but I don't see why it wouldn't). Maybe that is the best solution for what you want.

carlosedubarreto avatar Aug 22 '23 17:08 carlosedubarreto

Hi @carlosedubarreto, I'm trying to use your smoothing file, but applying it at the end affects how the result is visualized on the image; it messes up the keypoint projections. Is there a better way to get the smoothed output visualized on the image/video?

smandava98 avatar Oct 27 '23 06:10 smandava98

@smandava98, can you show some examples? I couldn't quite picture what you are describing.

carlosedubarreto avatar Oct 27 '23 06:10 carlosedubarreto

Basically, my issue was that I could not visualize the smoothed 4D-Humans results on my video.

I figured out the issue. I was using PHALP, not 4D-Humans directly (PHALP implemented 4D-Humans). They use the 'camera' parameter for the 2D camera; I had to use 'camera_bbox', as that is the 3D camera parameter they stored when they created the pkl file.
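For anyone hitting the same thing, the fix was roughly along these lines (a sketch only; the pkl layout is assumed from the description above and may differ between PHALP versions):

import joblib

results = joblib.load('outputs/results/demo_video.pkl')  # path assumed
for frame_name, frame_data in results.items():
    # per the comment above: 'camera' is the 2D camera, while 'camera_bbox'
    # holds the 3D camera parameters saved when the pkl was created
    cameras = frame_data['camera_bbox']   # instead of frame_data['camera']
    # ... project / render with these cameras ...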

smandava98 avatar Oct 27 '23 06:10 smandava98

@smandava98 Sorry, I didn't even remember that I had shared the code, LOL.

I think the best option would be to visualize the result in Blender, for example. Using SmoothNet I had some problems when there is too much movement in the original data: the character's pose goes elsewhere in parts of the animation.

I actually don't know what to tell you, since my work was basically about dealing with the result inside Blender. I don't use the rendered video at all (I don't even look at that result).

So in the end, what I suggest is not watching the video result but importing the animation into 3D space, in Blender for example.

And to do that you can even get the free addon I made, called CEB 4D Humans. With the free version you can create the data using the 4D-Humans code, and there is an option to import it into Blender. The catch is that this free version doesn't have SmoothNet, but if you create another pkl file there is an option to import it using the addon (I'm just not sure it will work with your pkl after smoothing).

Anyway, even with SmoothNet working, there is another smoothing method that uses Blender's native tools; I received the tip from a user called Giacomo Spaconi.

The idea is to bake the animation so the animation curves can be corrected in Blender, and then use Blender's smoothing tool for animation. This approach gave me the best results for smoothing the animation; I would suggest you try it.
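For reference, a minimal Blender-Python sketch of that bake-then-smooth approach (untested; the frame range is a placeholder, and the final smoothing is normally done interactively in the Graph Editor):

import bpy

arm = bpy.context.object  # the imported 4D-Humans armature, active and selected

# Bake the imported animation down to plain keyframes on the pose bones,
# so the F-Curves can be edited directly.
bpy.ops.nla.bake(frame_start=1, frame_end=250,   # placeholder frame range
                 only_selected=False,
                 visual_keying=True,
                 bake_types={'POSE'})

# Then, in the Graph Editor, select the noisy channels/keys and run
# Key > Smooth Keys (bpy.ops.graph.smooth()), repeating until the jitter is gone.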

carlosedubarreto avatar Oct 27 '23 07:10 carlosedubarreto

@carlosedubarreto Thanks Carlos. I ended up fixing my issue (edited my comment above). Your code works great!

smandava98 avatar Oct 27 '23 07:10 smandava98

@smandava98 That's great, thanks a lot for letting me know.

carlosedubarreto avatar Oct 27 '23 11:10 carlosedubarreto

@carlosedubarreto - sorry to ask you this; I don't have much of an idea, but is there any other software that could be easier (Blender seems to have a big learning curve) for quickly importing the pkl file and getting some animations going?

timtensor avatar Mar 20 '24 22:03 timtensor


Hello @timtensor, the main problem, I think, is "converting" the pkl data into something usable in 3D software.

Actually, Blender might be the easiest, because you can use an addon I made exactly for that.

You can get it for free here: https://carlosedubarreto.gumroad.com/l/ceb4dhumans?layout=profile

carlosedubarreto avatar Mar 21 '24 03:03 carlosedubarreto

@carlosedubarreto thank you for the feedback. My idea was to do a 360-degree camera pan around the SMPL results. I will see what can be done and how difficult it would be to get it running.

timtensor avatar Mar 21 '24 08:03 timtensor

@timtensor, a 360 around the animation result, right?

You might need to apply some smoothing, because the result of 4D-Humans is very good but has some jitter.

carlosedubarreto avatar Mar 21 '24 09:03 carlosedubarreto

@carlosedubarreto - thanks. For now I just want a quick demo, even with the jitters. I will look for the instructions and see if I can get it up and running. Is there also an Ubuntu version of it?

timtensor avatar Mar 21 '24 09:03 timtensor

@timtensor, the Blender addon I suggested is Windows-only, sorry.

carlosedubarreto avatar Mar 21 '24 15:03 carlosedubarreto


@carlosedubarreto I am having the same problem and am trying to reproduce your comparison above, but what parameter does the 0.05 stand for in PHALP?

jackgeng19 avatar Apr 08 '24 20:04 jackgeng19