Non-local_pytorch
A question about Embedded Gaussian
I'd like to ask: in the implementation file non_local_embedded_gaussian.py I couldn't find a concrete expression for the Embedded Gaussian, only a matrix multiplication followed by a softmax. If this is the Embedded Gaussian implementation, which lines of code compute it, exactly?
https://github.com/AlexHex7/Non-local_pytorch/blob/39ad90c91538d34e88865c9fb0ce4a844751346c/lib/non_local_embedded_gaussian.py#L85
https://github.com/AlexHex7/Non-local_pytorch/blob/39ad90c91538d34e88865c9fb0ce4a844751346c/lib/non_local_embedded_gaussian.py#L86
@mymuli Hi, theta_x and phi_x are the embedded x.
I'd like to ask: if theta_x and phi_x are the x after the Embedded Gaussian embedding, which statements in non_local_embedded_gaussian.py compute them, specifically?
@mymuli See Section 3.2 of the paper, the Embedded Gaussian part:
A simple extension of the Gaussian function is to compute similarity in an embedding space.
Here θ(x_i) = W_θ x_i and φ(x_j) = W_φ x_j are two embeddings.
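To connect the quoted definition with the two linked lines, here is a minimal sketch of the 2D case. The names theta/phi and the shapes below are illustrative assumptions that roughly follow the repo's structure, not a drop-in copy of its class:

```python
import torch
import torch.nn.functional as F
from torch import nn

# Minimal sketch of the embedded Gaussian branch (2D case).
# Layer names and shapes here are assumptions for illustration only.
batch, in_channels, inter_channels, h, w = 2, 64, 32, 8, 8
x = torch.randn(batch, in_channels, h, w)

theta = nn.Conv2d(in_channels, inter_channels, kernel_size=1)  # theta(x_i) = W_theta x_i
phi = nn.Conv2d(in_channels, inter_channels, kernel_size=1)    # phi(x_j)   = W_phi   x_j

theta_x = theta(x).view(batch, inter_channels, -1).permute(0, 2, 1)  # (b, N, C')
phi_x = phi(x).view(batch, inter_channels, -1)                       # (b, C', N)

# f[b, i, j] = theta(x_i)^T phi(x_j): pairwise similarity in the embedding space.
f = torch.matmul(theta_x, phi_x)

# softmax over j = exp(theta^T phi) / sum_j exp(theta^T phi),
# i.e. the embedded Gaussian f(x_i, x_j) = exp(theta(x_i)^T phi(x_j))
# divided by the normalizer C(x) = sum_j f(x_i, x_j).
f_div_C = F.softmax(f, dim=-1)
```

So the Embedded Gaussian is not written out as an explicit exp() call: the 1x1 convolutions give the embeddings θ and φ, the matmul gives θ(x_i)^T φ(x_j), and the softmax supplies both the exponential and the normalization by C(x).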
Just adding an exponential function before f_div_C = F.softmax(f, dim=-1) should do it, right?
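As a quick numerical check (with a made-up f standing in for the matmul result), F.softmax already contains that exponential; it is exactly exp followed by normalization:

```python
import torch
import torch.nn.functional as F

f = torch.randn(2, 5, 5)  # made-up similarity matrix standing in for theta_x @ phi_x

explicit = torch.exp(f) / torch.exp(f).sum(dim=-1, keepdim=True)  # exp(theta^T phi) / C(x)
via_softmax = F.softmax(f, dim=-1)                                # what the linked lines compute

print(torch.allclose(explicit, via_softmax))  # True: the exp is already inside the softmax
```

So adding another exp() before the softmax would apply the exponential twice; the matmul plus softmax at the two linked lines already forms the complete Embedded Gaussian expression.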