pose-tensorflow
Train on a different dataset with different no. of joints
I have been trying to train the mpii model on my own data. I'm only interested in training for 4 joints, i.e. the two shoulder joints and the two hip joints. I've marked the joints in the training dataset, created a dataset.mat file as per the instructions, and changed num_joints to 4 in the pose_cfg.yaml file. When I try to run the train.py script, it gives me the following error:
ValueError: Cannot feed value of shape (1, 46, 62, 4) for Tensor 'Placeholder_1:0', which has shape '(1, ?, ?, 14)'
I've checked the nnet/pose_net.py file and it seems to take the number of units for the prediction layer from the num_joints property defined in pose_cfg.yaml. Can you please help me out with this issue? I need to know which changes are required to make it work.
Following is the complete stack trace for the error:
Exception in thread Thread-1:
Traceback (most recent call last):
File "/home/ahmed/anaconda3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/home/ahmed/anaconda3/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "../../../train.py", line 49, in load_and_enqueue
sess.run(enqueue_op, feed_dict=food)
File "/home/ahmed/anaconda3/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 789, in run
run_metadata_ptr)
File "/home/ahmed/anaconda3/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 975, in _run
% (np_val.shape, subfeed_t.name, str(subfeed_t.get_shape())))
ValueError: Cannot feed value of shape (1, 46, 62, 4) for Tensor 'Placeholder_1:0', which has shape '(1, ?, ?, 14)'
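The two shapes in the traceback line up with that observation: 4 is the num_joints value from my config, while 14 is the joint count the graph was apparently built for. A rough sketch of the relationship (illustrative only, not the repository's exact code):

```python
# Illustrative only -- not pose-tensorflow's actual code. The part-score
# placeholder gets num_joints channels, so a 4-channel score map cannot be
# fed into a graph constructed for 14 joints.
import tensorflow as tf  # TF 1.x API, as in the stack trace above

num_joints = 14  # value the graph was built with
part_score_targets = tf.placeholder(tf.float32,
                                    shape=[1, None, None, num_joints])
# Feeding an array of shape (1, 46, 62, 4) here raises exactly the
# ValueError shown in the traceback.
```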
Hey. My guess is that your "dataset_type" is set to "mpii", which overwrites the num_joints parameter in the code. If that is the case, remove this parameter from the config. Otherwise, can you please share your pose_cfg.yaml here?
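The override described here can be pictured roughly like this (a hypothetical illustration of the idea, not the repository's actual code; the constant name is made up):

```python
# Hypothetical illustration of the override described above -- not the
# repository's actual code. A dataset_type preset such as "mpii" pins the
# joint count, silently replacing whatever num_joints says in pose_cfg.yaml.
MPII_NUM_JOINTS = 14  # assumed preset value for the MPII skeleton

def apply_dataset_type(cfg):
    if cfg.get("dataset_type") == "mpii":
        cfg["num_joints"] = MPII_NUM_JOINTS  # clobbers the user's 4
    return cfg
```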
Hi Eldar, thanks for the reply. Following is my pose_cfg.yaml:
dataset: /home/ahmed/Workspace/pose_estimation/pose-tensorflow-master/dataset/dataset.mat
dataset_type: "mpii"
images_path: /home/ahmed/Workspace/pose_estimation/pose-tensorflow-master/dataset/Images
pos_dist_thresh: 17
global_scale: 0.8452830189
scale_jitter_lo: 0.85
scale_jitter_up: 1.15
net_type: resnet_101
init_weights: /home/ahmed/Workspace/pose_estimation/pose-tensorflow-master/models/pretrained/resnet_v1_101.ckpt
location_refinement: true
locref_huber_loss: true
locref_loss_weight: 0.05
locref_stdev: 7.2801
num_joints: 4
intermediate_supervision: true
intermediate_supervision_layer: 12
max_input_size: 850
multi_step:
- [0.005, 10000]
- [0.02, 430000]
- [0.002, 730000]
- [0.001, 1030000]
display_iters: 20
save_iters: 60000
mirror: false
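A quick way to double-check what the YAML itself contains (independent of any defaults the training code merges in) is to load it directly; a minimal sketch, assuming the file sits in the current working directory and PyYAML is installed:

```python
# Minimal sanity check of pose_cfg.yaml before launching a long training run.
import yaml

with open("pose_cfg.yaml") as f:
    cfg = yaml.safe_load(f)

print(cfg.get("num_joints"))    # expect 4
print(cfg.get("dataset_type"))  # expect None once the "mpii" line is removed
```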
OK, I got it working by removing the dataset_type: "mpii" parameter. Training is in progress, but after every 20 iterations (as defined by the display_iters parameter), the entire configuration is printed again. Any guesses on how to prevent this? (See the sketch after the config dump below.) Following is what gets printed:
Config:
{'batch_size': 1,
'crop': False,
'crop_pad': 0,
'dataset': '/home/ahmed/Workspace/pose_estimation/pose-tensorflow-master/dataset/dataset_2.mat',
'dataset_type': 'default',
'display_iters': 20,
'fg_fraction': 0.25,
'global_scale': 0.8452830189,
'images_path': '/home/ahmed/Workspace/pose_estimation/pose-tensorflow-master/dataset/Images',
'init_weights': '/home/ahmed/Workspace/pose_estimation/pose-tensorflow-master/models/pretrained/resnet_v1_101.ckpt',
'intermediate_supervision': True,
'intermediate_supervision_layer': 12,
'location_refinement': True,
'locref_huber_loss': True,
'locref_loss_weight': 0.05,
'locref_stdev': 7.2801,
'log_dir': 'log',
'max_input_size': 850,
'mean_pixel': [123.68, 116.779, 103.939],
'mirror': False,
'multi_step': [[0.005, 10000],
[0.02, 430000],
[0.002, 730000],
[0.001, 1030000]],
'net_type': 'resnet_101',
'num_joints': 4,
'optimizer': 'sgd',
'pairwise_huber_loss': True,
'pairwise_loss_weight': 1.0,
'pairwise_predict': False,
'pairwise_stats_collect': False,
'pairwise_stats_fn': 'pairwise_stats.mat',
'pos_dist_thresh': 17,
'regularize': False,
'save_iters': 10000,
'scale_jitter_lo': 0.85,
'scale_jitter_up': 1.15,
'scoremap_dir': 'test',
'shuffle': True,
'snapshot_prefix': 'snapshot',
'sparse_graph': [],
'stride': 8.0,
'tensorflow_pairwise_order': True,
'use_gt_segm': False,
'video': False,
'video_batch': False,
'weigh_negatives': False,
'weigh_only_present_joints': False,
'weigh_part_predictions': False,
'weight_decay': 0.0001}
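A possible workaround for the repeated dump (hypothetical; exactly where train.py or the batching code emits the "Config:" block is an assumption) is to guard the pretty-print so it only runs once:

```python
# Hypothetical workaround -- where the config gets pretty-printed is an
# assumption; adapt this guard to wherever the "Config:" block is emitted.
import pprint

_config_printed = False

def print_config_once(cfg):
    """Print the config dict the first time only, then stay quiet."""
    global _config_printed
    if not _config_printed:
        print("Config:")
        pprint.pprint(cfg)
        _config_printed = True
```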
Hi @ahmed-18, can you please tell how exactly you created the labels for the joints on your images and stored them using MATLAB?
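Not an authoritative answer to the question above, but for reference, one way to assemble such a dataset.mat from Python with SciPy instead of MATLAB. The field layout (image, size, joints) and the wrapping of the joints matrix are assumptions modelled on the demo dataset shipped with the repository, so verify against your copy before training:

```python
# Sketch only: field names/ordering and the object-array wrapping are
# assumptions based on the repo's demo .mat file -- double-check them.
import numpy as np
from scipy.io import savemat

def make_entry(im_path, height, width, joints_xy):
    """joints_xy: list of (joint_id, x, y); joint ids 0..3 for the 4 joints."""
    joints = np.array(joints_xy, dtype=np.float64)
    wrapped = np.empty((1, 1), dtype=object)   # mimic MATLAB cell nesting
    wrapped[0, 0] = joints
    return (
        np.array([im_path]),                   # image path relative to images_path
        np.array([[3, height, width]]),        # size; check channel/height/width order
        wrapped,
    )

entries = [
    make_entry("Images/img_0001.png", 480, 640,
               [(0, 123, 210), (1, 256, 214), (2, 130, 360), (3, 250, 362)]),
    # ... one entry per labelled image
]

dataset = np.array(entries,
                   dtype=[("image", object), ("size", object), ("joints", object)])
savemat("dataset.mat", {"dataset": dataset.reshape(1, -1)})
```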