pytorch_6dof-graspnet
Bumps [numpy](https://github.com/numpy/numpy) from 1.17.4 to 1.22.0. Release notes, sourced from numpy's releases: v1.22.0 (NumPy 1.22.0 Release Notes). NumPy 1.22.0 is a big release featuring the work of 153 contributors spread...
I'm trying to download the ShapeNet dataset for training: ``` python3 train.py --arch {vae,gan,evaluator} --dataset_root_folder $DATASET_ROOT_FOLDER ``` The README's instructions on how to obtain the dataset are unclear to me. Is there...
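For anyone stuck on the same step, below is a minimal sketch for sanity-checking whatever path is passed as `--dataset_root_folder` before training. The expected subfolder names (`grasps`, `meshes`) are an assumption, not something verified against this repo's data loader; adjust them to whatever layout the README actually describes.

```python
# Hypothetical sanity check for the --dataset_root_folder argument.
# The subfolder names below are assumptions; edit them to match the
# layout described in the README.
import os
import sys

root = sys.argv[1] if len(sys.argv) > 1 else os.environ.get("DATASET_ROOT_FOLDER", ".")
for sub in ("grasps", "meshes"):
    path = os.path.join(root, sub)
    status = "found" if os.path.isdir(path) else "MISSING"
    print(f"{path}: {status}")
```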
Hello, thank you very much for sharing this great work. I have trained a VAE model following the instructions, and it seems to work well on the NPY dataset you provided. But...
Hi, thank you very much for sharing your implementation! I followed the instructions in the README file to install the dependencies, but I got the following error when pip was...
Hi @jsll, thanks for the wonderful work. I have been trying to use train.py with continue_train set to true. For this, I set the options to match pretrained_evaluator/opt.yml: `python...
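As a debugging aid, here is a rough sketch, assuming PyYAML is installed and that opt.yml is a flat key/value mapping saved by the training script, for printing the stored options so they can be compared against the flags passed alongside `--continue_train`:

```python
# Sketch only: dump the options stored in the pretrained checkpoint's opt.yml
# so a --continue_train run can be launched with matching flags.
# Assumes PyYAML is available and the file is a flat mapping.
import yaml

with open("pretrained_evaluator/opt.yml") as f:
    opts = yaml.safe_load(f)

for key, value in sorted(opts.items()):
    print(f"--{key} {value}")
```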
Hi @jsll, I found that the VAE results look pretty weird compared with the GAN results. Here are the examples: ```shell python -m demo.main --generate_dense_grasps --num_grasp_samples 20 --grasp_sampler_folder checkpoints/vae_pretrained/ ```...
I am using your repository; I tried the demo and the results look very good. I see that the demo code takes test data in NPY format as input, and...
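For reference, a minimal sketch of how one might inspect the shipped NPY demo data before preparing a custom input; the placeholder path and the assumption that the file stores a pickled dict are not verified against the repo:

```python
# Hypothetical sketch: inspect a demo .npy file to learn what structure the
# demo expects, then mirror it with your own data. The path is a placeholder.
import numpy as np

data = np.load("demo/data/<some_demo_file>.npy", allow_pickle=True)
if data.shape == ():               # 0-d object array, i.e. a pickled object
    data = data.item()
    print("stored object:", type(data))
    if isinstance(data, dict):
        print("keys:", sorted(data.keys()))
else:
    print("array shape:", data.shape, "dtype:", data.dtype)
```

Once the expected keys are known, the same dict layout can be reproduced with np.save for your own point cloud.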
```
File "/aiLab/zzq/pytorch_6dof-graspnet-master/data/base_dataset.py", line 109, in change_object_and_render
    cad_path, cad_scale, in_camera_pose, thread_id)
File "/aiLab/zzq/pytorch_6dof-graspnet-master/renderer/online_object_renderer.py", line 116, in change_and_render
    color, depth, pc, transferred_pose = self.render(pose)
File "/aiLab/zzq/pytorch_6dof-graspnet-master/renderer/online_object_renderer.py", line 122, in render
    self.renderer =...
```
Can anyone here give me correct and clear steps to run the demo files? Please, I need this urgently.
Hi, the mid point is obtained at line 672 in utils.py. To compute this value, the first two elements of the transformed control points (grasp_cps) are used. According...
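For context, the computation described above can be sketched as follows, assuming grasp_cps is a batch of transformed control points of shape (batch, num_control_points, 3) and that the mid point is taken between the first two of them; this mirrors the description in the question, not necessarily the exact code at line 672:

```python
import torch

def grasp_mid_point(grasp_cps: torch.Tensor) -> torch.Tensor:
    """Mid point between the first two transformed control points of each grasp.

    Assumes grasp_cps has shape (batch, num_control_points, 3).
    """
    return 0.5 * (grasp_cps[:, 0, :] + grasp_cps[:, 1, :])
```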