
Results: 17 self-attention-gan issues

The evaluation script seems to be stuck at the line "Creating CudaSolver handles for stream 0x1f1ffea0" for more than a day. I trained a model on my own dataset and...

How can we train on an unconditional dataset? I have tried the CelebA dataset with the parameter `number_class` set to `1`, but training does not go well; I get an error after 60K steps....

Hello, which parameters do I need to change to train and evaluate on a single GPU? I am currently getting an OOM (Resource Exhausted) error when I try to...

It seems that you create the variable `reuse_vars` in the `build_model_single_gpu` function. However, I cannot find where this variable is used to share variables across your multiple GPUs. Could you...

Hi, I couldn't find the implementation of the attention layer inside the network models. The SAGAN paper mentions that the self-attention mechanism was added at different...

https://github.com/brain-research/self-attention-gan/blob/ad9612e60f6ba2b5ad3d3340ebae60f724636d75/non_local.py#L77 Is there a mistake in the reshape operation linked above? Shouldn't it be

```
attn_g = tf.reshape(attn_g, [batch_size, h // 2, w // 2, num_channels // 2])
attn_g =...
```
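To reason about which reshape is consistent, it helps to trace the tensor shapes through a non-local block. The sketch below is a pure-numpy walk-through under the common SAGAN layout (queries `theta` at full resolution; keys `phi` and values `g` max-pooled 2x and channel-reduced); the variable names mirror the issue snippet but the code is an illustration, not the repository's implementation. Under these assumptions the attention output has one row per *query* location, i.e. `h * w` rows, so the flat axis reshapes to the full `h` and `w`:

```python
import numpy as np

# Hypothetical shape walk-through of a SAGAN-style non-local block
# (names follow the issue; this is not the repo's actual code).
batch_size, h, w, num_channels = 2, 8, 8, 64

# theta: queries at full resolution, channels reduced to C/8
theta = np.zeros((batch_size, h * w, num_channels // 8))
# phi: keys, spatially downsampled 2x, channels reduced to C/8
phi = np.zeros((batch_size, (h // 2) * (w // 2), num_channels // 8))
# g: values, spatially downsampled 2x, channels reduced to C/2
g = np.zeros((batch_size, (h // 2) * (w // 2), num_channels // 2))

# attention map: one weight per (query location, key location) pair
attn = np.einsum('bik,bjk->bij', theta, phi)   # (B, h*w, h/2 * w/2)
# attended values: one output row per query location
attn_g = np.einsum('bij,bjc->bic', attn, g)    # (B, h*w, C/2)

# The flat spatial axis has h*w entries (one per query), so the
# consistent reshape uses the full h and w, not h//2 and w//2.
assert attn_g.shape == (batch_size, h * w, num_channels // 2)
attn_g = attn_g.reshape(batch_size, h, w, num_channels // 2)
```

If, by contrast, the queries were also downsampled before the matmul, the flat axis would only have `(h // 2) * (w // 2)` entries and the `h // 2, w // 2` reshape (followed by an upsample) would be the consistent one; which case applies depends on how `theta` is computed in the linked file.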