
KeyError: 'conv1_2_V'

Open ethanhe42 opened this issue 3 years ago • 0 comments

Hello, I'm learning from your channel-pruning-master and I have a problem running the code. Do you have any solutions? Here is my problem:

/home/linux/anaconda3/envs/py35/lib/python3.5/site-packages/sklearn/externals/joblib/__init__.py:15: FutureWarning: sklearn.externals.joblib is deprecated in 0.21 and will be removed in 0.23. Please import this functionality directly from joblib, which can be installed with: pip install joblib. If this warning is raised when loading pickled models, you may need to re-serialize those models with scikit-learn 0.21+.
warnings.warn(msg, category=FutureWarning)
/home/linux/anaconda3/envs/py35/lib/python3.5/site-packages/sklearn/utils/deprecation.py:144: FutureWarning: The sklearn.linear_model.base module is deprecated in version 0.22 and will be removed in version 0.24. The corresponding classes / functions should instead be imported from sklearn.linear_model. Anything that cannot be imported from sklearn.linear_model is now part of the private API.
warnings.warn(message, FutureWarning)
no lighting pack
using CPU caffe
[libprotobuf INFO google/protobuf/io/coded_stream.cc:610] Reading dangerously large protocol message. If the message turns out to be larger than 2147483647 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:81] The total number of bytes read was 553432081
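The FutureWarning at the top of the log is harmless here, and its own text names the remedy. A minimal sketch of that import change, assuming the code in question pulls joblib in via sklearn.externals rather than directly (this is an assumption about the local setup, not a repository patch):

```python
# Sketch only: prefer the standalone joblib package, as the FutureWarning
# suggests (pip install joblib), and fall back to the deprecated import path
# on older environments.
try:
    import joblib
except ImportError:
    from sklearn.externals import joblib
```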

stage0 freeze

temp/bn_vgg.prototxt
using CPU caffe
including last conv layer!
run for 100 batches
nFeatsPerBatch 100
Extracting conv1_1 (10000, 64)
Extracting conv1_2_V (10000, 22)
Extracting conv1_2_H (10000, 22)
Extracting conv1_2_P (10000, 59)
Extracting conv2_1_V (10000, 37)
Extracting conv2_1_H (10000, 37)
Extracting conv2_1_P (10000, 118)
Extracting conv2_2_V (10000, 47)
Extracting conv2_2_H (10000, 47)
Extracting conv2_2_P (10000, 119)
Extracting conv3_1_V (10000, 83)
Extracting conv3_1_H (10000, 83)
Extracting conv3_1_P (10000, 226)
Extracting conv3_2_V (10000, 89)
Extracting conv3_2_H (10000, 89)
Extracting conv3_2_P (10000, 243)
Extracting conv3_3_V (10000, 106)
Extracting conv3_3_H (10000, 106)
Extracting conv3_3_P (10000, 256)
Extracting conv4_1_V (10000, 175)
Extracting conv4_1_H (10000, 175)
Extracting conv4_1_P (10000, 482)
Extracting conv4_2_V (10000, 192)
Extracting conv4_2_H (10000, 192)
Extracting conv4_2_P (10000, 457)
Extracting conv4_3_V (10000, 227)
Extracting conv4_3_H (10000, 227)
Extracting conv4_3_P (10000, 512)
Extracting conv5_1_V (10000, 398)
Extracting conv5_1_H (10000, 512)
Extracting conv5_2_V (10000, 390)
Extracting conv5_2_H (10000, 512)
Extracting conv5_3_V (10000, 379)
Extracting conv5_3_H (10000, 512)
Acc 0.000
wrote memory data layer to temp/mem_bn_vgg.prototxt
freezing imgs to temp/frozen100.pickle

stage1 speed3.0

using CPU caffe
loading imgs from temp/frozen100.pickle
loaded
Process Process-3:
Traceback (most recent call last):
  File "/home/linux/anaconda3/envs/py35/lib/python3.5/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/home/linux/anaconda3/envs/py35/lib/python3.5/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/linux/channel-pruning-master/lib/worker.py", line 21, in job
    ret = target(**kwargs)
  File "train.py", line 75, in solve
    WPQ, new_pt = net.R3()
  File "/home/linux/channel-pruning-master/lib/net.py", line 1348, in R3
    rank = rankdic[conv]
KeyError: 'conv1_2_V'
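The crash itself is the KeyError at the end: rankdic in lib/net.py's R3() has no entry for the decomposed layer name 'conv1_2_V'. Below is a minimal sketch of a defensive lookup, not the repository's fix; the suffix-stripping rule (mapping '_V'/'_H'/'_P' back to the base convolution name) is an assumption based on the layer names visible in the stage0 log:

```python
# Sketch, not a patch for lib/net.py: fall back from a decomposed layer name
# (e.g. 'conv1_2_V') to its base name (e.g. 'conv1_2') when looking up ranks.
def lookup_rank(rankdic, conv):
    if conv in rankdic:
        return rankdic[conv]
    for suffix in ('_V', '_H', '_P'):
        if conv.endswith(suffix):
            base = conv[: -len(suffix)]
            if base in rankdic:
                return rankdic[base]
    raise KeyError(conv)
```

Whether stage1 should be seeing the decomposed prototxt at all, or should be pointed at the original VGG layer names, is a separate question for the maintainer.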

ethanhe42 · Jul 17 '20 16:07