
Question about the implementation of the inception score

Open youkaichao opened this issue 6 years ago • 7 comments

these lines puzzle me:

w = sess.graph.get_operation_by_name("softmax/logits/MatMul").inputs[1]

logits = tf.matmul(tf.squeeze(pool3, [1, 2]), w)

softmax = tf.nn.softmax(logits)

I'm wondering: why not just use sess.graph.get_tensor_by_name('softmax:0')? Why bother to do the matrix multiplication manually and apply softmax? Also, why is the bias term not added?

youkaichao avatar Jun 28 '18 07:06 youkaichao

OK, I now see that the inception model requires the batch size to be 1 if we just run sess.graph.get_tensor_by_name('softmax:0') (the restriction comes from tf.get_default_graph().get_tensor_by_name('pool_3/_reshape/shape_1:0'), which has the fixed value [1, 2048]). But I still can't understand why the bias term is not added.

youkaichao avatar Jun 28 '18 08:06 youkaichao
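
For reference, here is a minimal sketch of the bias-added variant being discussed, assuming the frozen classify_image_graph_def.pb graph that the repo loads. The op name 'softmax/logits' used to grab the bias is an assumption about that particular graph, not something confirmed by the repo; verify it against the graph's op list before relying on it.

import tensorflow as tf

# Sketch only: assumes classify_image_graph_def.pb is on disk and that the bias is
# exposed as the second input of an op named 'softmax/logits' (an assumption about
# this particular frozen graph; list the ops to confirm).
with tf.gfile.GFile('classify_image_graph_def.pb', 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())
tf.import_graph_def(graph_def, name='')

with tf.Session() as sess:
    pool3 = sess.graph.get_tensor_by_name('pool_3:0')
    w = sess.graph.get_operation_by_name('softmax/logits/MatMul').inputs[1]
    b = sess.graph.get_operation_by_name('softmax/logits').inputs[1]  # assumed bias-add op
    # Doing the matmul manually (instead of fetching 'softmax:0') avoids the
    # batch-size-1 restriction from the fixed [1, 2048] reshape constant.
    logits = tf.matmul(tf.squeeze(pool3, [1, 2]), w) + b
    softmax_with_bias = tf.nn.softmax(logits)

Running softmax_with_bias over batches of images and aggregating the predictions in the usual way would then give the with-bias numbers quoted below.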

Without the bias term (as in the repo), I get an inception score of 10.954855 +- 0.4320521 on CIFAR-10 (using test images).

With the bias term (unlike the repo), I get an inception score of 11.228305 +- 0.45700935 on CIFAR-10 (using test images).

So I think that the bias term should be added. It matters.

youkaichao avatar Jun 28 '18 08:06 youkaichao

I also found this issue. I actually tested the inception score with and without the bias term over a set of experiments: the basic finding is that the inception score with the bias term is consistently higher than the one without, with a relatively fixed gap.

It does matter a lot, but the official implementation does not take it into account, so most other reported inception scores probably don't either. I chose to report the score without the bias term to allow a fair comparison.

I think it is indeed important to know whether the original author deliberately considered keeping or dropping the bias term.

-- Zhiming Zhou


ZhimingZhou avatar Jun 28 '18 13:06 ZhimingZhou

@ZhimingZhou @TimSalimans maybe the author just forgot the bias term?

youkaichao avatar Jun 28 '18 14:06 youkaichao


Hello, I am currently working on a GAN-related topic that involves calculating the inception score. When I run the author's source code on the real data of CIFAR-10, the inception score is only (5.5425735, 0.059681736) on the train data and (5.5588408, 0.17018904) on the test data. I also evaluated with the PyTorch version of the code, and the result is above 9.5. This problem has been bothering me for a few days; I have checked a lot of material online, including the issues, and found no mention of the relevant details, so I am very confused. How did you get the result of 11.24 reported in the paper? Can you share your code or give me a hint? Thank you very much!

Adherer avatar Oct 04 '18 14:10 Adherer
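
For context on how a single number like 11.24 is produced from the model outputs, here is a minimal sketch of the standard splits-based aggregation of the softmax predictions. The function name and the assumption that preds already holds the (N, 1008) softmax outputs for all evaluated images are illustrative, not taken from the repo.

import numpy as np

# Sketch only: `preds` is assumed to be an (N, 1008) array of softmax outputs
# collected for all evaluated images.
def inception_score_from_preds(preds, splits=10):
    scores = []
    for part in np.array_split(preds, splits):
        # exp of the mean KL(p(y|x) || p(y)) within each split
        py = np.mean(part, axis=0, keepdims=True)
        kl = np.sum(part * (np.log(part) - np.log(py)), axis=1)
        scores.append(np.exp(np.mean(kl)))
    return np.mean(scores), np.std(scores)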

@Adherer I think you can take a look here. But it seems that the model file at that link has changed. Fortunately, I have saved a copy of that model file here. (It annoys me that the link still works but actually points to a different model file.)

youkaichao avatar Oct 05 '18 08:10 youkaichao


I used your method and got the code from that website, but it still did not work. I get this error:

Traceback (most recent call last):
  File "inception_score.py", line 156, in <module>
    score, std = get_inception_score(test_X, 'classify_image_graph_def.pb', batch_size=64, splits=10)
  File "inception_score.py", line 78, in get_inception_score
    w = sess.graph.get_tensor_by_name('softmax/weights_1:0')
  File "/home/liujian/.local/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3207, in get_tensor_by_name
    return self.as_graph_element(name, allow_tensor=True, allow_operation=False)
  File "/home/liujian/.local/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3035, in as_graph_element
    return self._as_graph_element_locked(obj, allow_tensor, allow_operation)
  File "/home/liujian/.local/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3077, in _as_graph_element_locked
    "graph." % (repr(name), repr(op_name)))
KeyError: "The name 'softmax/weights_1:0' refers to a Tensor which does not exist. The operation, 'softmax/weights_1', does not exist in the graph."

Adherer avatar Oct 05 '18 10:10 Adherer
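
One way to debug a KeyError like the one above is to list the op names that actually exist in the graph you loaded and pick the weight/bias tensors from that list. A minimal sketch, assuming the file being loaded is classify_image_graph_def.pb:

import tensorflow as tf

# Sketch only: prints every op whose name mentions 'softmax', which shows whether
# 'softmax/weights_1' exists in this copy of the graph or goes by another name.
with tf.gfile.GFile('classify_image_graph_def.pb', 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())
tf.import_graph_def(graph_def, name='')

for op in tf.get_default_graph().get_operations():
    if 'softmax' in op.name:
        print(op.name)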