TFFRCNN
How to load my pretrained model from a ckpt file in V2 format?
Hi, when I use the ckpt V1 format, the command line is as follows:
python ./faster_rcnn/test_net.py --gpu 0 --weights ./models/VGGnet_fast_rcnn_iter_150000.ckpt --imdb voc_2007_test --cfg ./experiments/cfgs/faster_rcnn_end2end.yml --network VGGnet_test
It works.
But when I train my own model on the VOC 2007 dataset, the output looks like this.
Then I use the command line:
python ./faster_rcnn/test_net.py --gpu 0 --weights ./output/faster_rcnn_end2end_pva_voc/voc_2007_trainval/PVAnet_iter_10000.ckpt --imdb voc_2007_test --cfg ./experiments/cfgs/faster_rcnn_end2end_pva.yml --network PVAnet_test
The output is:
'SNAPSHOT_INFIX': '',
'SNAPSHOT_ITERS': 5000,
'SNAPSHOT_PREFIX': 'PVAnet',
'SOLVER': 'Momentum',
'STEPSIZE': 60000,
'USE_FLIPPED': True,
'USE_PREFETCH': False,
'WEIGHT_DECAY': 1e-05},
'USE_GPU_NMS': True}
Waiting for ./output/faster_rcnn_end2end_pva_voc/voc_2007_trainval/PVAnet_iter_10000.ckpt to exist...
Waiting for ./output/faster_rcnn_end2end_pva_voc/voc_2007_trainval/PVAnet_iter_10000.ckpt to exist...
My TensorFlow version is 1.2. I have read some related issues but can't solve it, so I need some help. I want to know how to use the ckpt V2 format. What's the difference between the ckpt V2 format and the V1 format? How do I load my pretrained model from a ckpt file in V2 format? Do I need to modify my test_net.py?
thanks!
In train.py, add:
from tensorflow.core.protobuf import saver_pb2
and in the "def __init__" function change the saver to:
self.saver = tf.train.Saver(max_to_keep=100, write_version=saver_pb2.SaverDef.V1)
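For context, here is a minimal sketch of where those two lines go; I'm assuming a SolverWrapper-style trainer class in ./lib/fast_rcnn/train.py, and the other names in this snippet are only illustrative:

    import tensorflow as tf
    from tensorflow.core.protobuf import saver_pb2

    class SolverWrapper(object):
        def __init__(self, sess, network, output_dir):
            self.net = network
            self.output_dir = output_dir
            # Force the old single-file V1 checkpoint format so that the
            # --weights path passed to test_net.py points at a real file.
            self.saver = tf.train.Saver(max_to_keep=100,
                                        write_version=saver_pb2.SaverDef.V1)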
Then you will see the single ".ckpt" file.
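If you want to see the difference between the two formats on disk, here is a small standalone sketch (TF 1.x; the /tmp folders and the dummy variable are just for illustration):

    import os
    import tensorflow as tf
    from tensorflow.core.protobuf import saver_pb2

    v = tf.Variable(0, name='dummy')
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for version, folder in [(saver_pb2.SaverDef.V1, '/tmp/ckpt_v1'),
                                (saver_pb2.SaverDef.V2, '/tmp/ckpt_v2')]:
            if not os.path.isdir(folder):
                os.makedirs(folder)
            saver = tf.train.Saver(write_version=version)
            saver.save(sess, os.path.join(folder, 'model.ckpt'))

    # V1 writes a plain model.ckpt file (plus model.ckpt.meta and checkpoint),
    # so a check like os.path.exists('.../model.ckpt') succeeds.
    print(sorted(os.listdir('/tmp/ckpt_v1')))
    # V2 splits the weights into model.ckpt.index and model.ckpt.data-*;
    # no file is literally named model.ckpt, which is why test_net.py keeps
    # printing "Waiting for ... to exist...".
    print(sorted(os.listdir('/tmp/ckpt_v2')))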
Hi, I trained PVAnet successfully, but I want to test it with ./faster_rcnn/demo.py. What should I do? @sheirving @CharlesShang
Hi, I think I fixed the problems. @YgRen @gentlebreeze1
- Change your class number in the cfg file.
- If you want to use the ckpt V2 format, add some code before starting the session (see the fuller sketch below):
  checkpoint_dir = os.path.dirname(args.model)
  ckpt = tf.train.get_checkpoint_state(checkpoint_dir)
  saver.restore(sess, ckpt.model_checkpoint_path)

I am a newbie, so if I didn't explain it clearly, you can ask me again.
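To make that concrete, here is a minimal sketch of the restore logic; args.model is meant to be the --weights path from the command line (e.g. .../PVAnet_iter_10000.ckpt), and the helper name restore_weights is just something I made up:

    import os
    import tensorflow as tf

    def restore_weights(sess, saver, model_path):
        if os.path.isfile(model_path):
            # Old V1 format: the path is a real file, restore it directly.
            saver.restore(sess, model_path)
        else:
            # V2 format: only .index/.data files exist, so look up the latest
            # checkpoint recorded in the "checkpoint" file of that directory.
            checkpoint_dir = os.path.dirname(model_path)
            ckpt = tf.train.get_checkpoint_state(checkpoint_dir)
            assert ckpt is not None, 'no checkpoint found in %s' % checkpoint_dir
            saver.restore(sess, ckpt.model_checkpoint_path)

    # Usage inside test_net.py, after the network and saver are created:
    #   sess = tf.Session(config=tf.ConfigProto(allow_soft_placement=True))
    #   saver = tf.train.Saver()
    #   restore_weights(sess, saver, args.model)

Note that the "Waiting for ... to exist..." loop in your log checks for the literal .ckpt path, so with V2 checkpoints that check will never pass; you may also need to relax it (for example, wait for the .ckpt.index file instead).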