3d_gan_tensorflow
AdamOptimizer and scope problem
Hello, I ran the code and got an error about AdamOptimizer: it cannot work under the "reuse=True" condition. I am a beginner with TensorFlow. Could you help me solve it? Thank you.
This code was implemented with TensorFlow 1.1; check whether your TF version matches. If not, minor adjustments may be needed to make it work.
If possible, please provide more details of the error.
Thank you for your reply. I have solved the problem by adding the following line: with tf.variable_scope(tf.get_variable_scope()):
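For anyone hitting the same error, a minimal sketch of where that line goes. This uses my own toy one-layer discriminator, written against tensorflow.compat.v1 so it also runs under TF 2; the variable and scope names are illustrative, not the repo's real ones:

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

def discriminator(x):
    # Illustrative one-layer "discriminator"; not the repo's actual network.
    with tf.variable_scope('d'):
        w = tf.get_variable('w', [8, 1])
        b = tf.get_variable('b', [1])
        return tf.matmul(x, w) + b

x_real = tf.placeholder(tf.float32, [None, 8])
x_fake = tf.placeholder(tf.float32, [None, 8])

# Open a scope on the *current* variable scope; reuse_variables() then
# only applies inside this block and is reset when the block exits.
with tf.variable_scope(tf.get_variable_scope()):
    d_real = discriminator(x_real)              # creates the d/* variables
    tf.get_variable_scope().reuse_variables()   # share weights from here on
    d_fake = discriminator(x_fake)              # reuses the same variables

# Outside the block the reuse flag is back to False, so AdamOptimizer is
# free to create its slot variables (d/w/Adam, d/w/Adam_1, ...).
loss = tf.reduce_mean(d_fake) - tf.reduce_mean(d_real)
train_op = tf.train.AdamOptimizer(1e-4).minimize(loss)
```

The key point is that the optimizer is built outside the scope block, where reuse is no longer set, so Adam can create its slot variables.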
Although I have trained the 3D GAN model successfully, the chair I generated is wrong; it looks like a cube. I noticed that the generator contains four layers: h0 (fc), h1 (deconv), h2 (deconv), h3 (deconv). Is that the full definition of the generator in this 3D GAN model, or just part of it? Hoping for your reply.
The generator structure should work fine.
How long have you trained the model? Usually the generated data looks randomly scattered in 3D space during the first few epochs.
About 9 hours. The result I got looks like a cube with dense points. I used the training epochs in your code: training epochs = 20001, then continued training for epoch in range(4000, 25001). Actually, I did not change the code; I just ran it. I am confused by the chair I generated. Thank you so much.
The training samples generated during training are [64,32,32,32] arrays, which confuses me. Shouldn't a chair be represented by a [64,64,64] array? Is the generator's output a [64,32,32,32] array?
@timzhang642 Dear Tim, I found that using generator1 works. I appreciate your suggestion. Thank you.
64 means there are 64 chairs in this training batch.
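To make that shape concrete, a quick NumPy sketch (numbers from this thread):

```python
import numpy as np

batch = np.zeros((64, 32, 32, 32))  # a batch of 64 chairs, each a 32^3 voxel grid
one_chair = batch[0]                # a single chair: shape (32, 32, 32)
```

So the first axis is the batch size, not the voxel resolution; this implementation works with 32^3 voxel grids rather than 64^3.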
Hello tim,
I tried to run the code but got this error:
----> 8 train_chairs=train_chairs.reshape([988,32,32,32,1]) # turn train_chairs into 5D tensor [batch, depth, height, width, channels]
ValueError: cannot reshape array of size 851968 into shape (988,32,32,32,1)
How do I fix it?
In my case, I had 988 instances in train_chairs; in your case, it looks like you only have 851968/32/32/32 = 26 instances. So change it to train_chairs = train_chairs.reshape([26,32,32,32,1]).
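In other words, the instance count can be recovered directly from the size in the traceback (a small NumPy sketch; 851968 is the size reported in the ValueError above):

```python
import numpy as np

flat = np.zeros(851968)          # total array size reported in the ValueError
n = flat.size // (32 * 32 * 32)  # number of 32^3 instances
train_chairs = flat.reshape([n, 32, 32, 32, 1])  # 5D: [batch, depth, height, width, channels]
```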
Thank you so much! Actually, I had already figured it out: I printed out the shape and saw the reason.
Hi Tim,
I encountered another issue when I run "train the GAN": it seems to enter an infinite loop after many iterations. I printed out the related info; the tensor is always empty. What is happening? Can you help? Thanks.
Tensor("Merge_1448/MergeSummary:0", shape=(), dtype=string) Tensor("Neg:0", shape=(), dtype=float32) Tensor("d_loss:0", shape=(), dtype=string) Tensor("d_prob_x:0", shape=(), dtype=string) Tensor("d_prob_z:0", shape=(), dtype=string)
Actually, there was an error message when I called the optimizer before I encountered the above issue: "ValueError: Variable d/h0/conv2d/W_conv3d/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=tf.AUTO_REUSE in VarScope?"
Shall I use GradientDescentOptimizer() instead of Adam?
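As the error message itself hints, another option is reuse=tf.AUTO_REUSE (available in TF >= 1.4), which creates a variable on the first call and reuses it on later calls, so there is no need to switch to GradientDescentOptimizer. A toy sketch, again with illustrative names rather than the repo's real ones, written against tensorflow.compat.v1:

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

def discriminator(x):
    # AUTO_REUSE: create the variable on the first call, reuse it afterwards.
    with tf.variable_scope('d', reuse=tf.AUTO_REUSE):
        w = tf.get_variable('w', [8, 1])
        return tf.matmul(x, w)

x = tf.placeholder(tf.float32, [None, 8])
out1 = discriminator(x)   # creates d/w
out2 = discriminator(x)   # reuses d/w; no ValueError

# Adam can now create its slot variables without a reuse conflict.
loss = tf.reduce_mean(out2)
train_op = tf.train.AdamOptimizer(1e-4).minimize(loss)
```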
Please ignore my past comments. I can run it now, but there is no output at all. Can your code only run on a GPU?
I have the same problem as you. Where did you add "with tf.variable_scope(tf.get_variable_scope())"? Could you explain?