Eirun_Xu

10 comments by Eirun_Xu

> @shicai
> Hello. I trained with a train_val I rewrote from the deploy file you provided, but the loss stays around 2.5 and does not decrease. The only changes I made to train_val were the data layer at the beginning and the Accuracy and SoftmaxWithLoss layers added at the end. Is there anything else that needs to be modified?

Could we exchange contact details? I am also preparing to finetune, and I likewise only changed the beginning and added the layers at the end. My QQ is 1443563995.
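The modification described above (swap the data layer at the top, append Accuracy and SoftmaxWithLoss at the bottom) can be sketched with pycaffe's NetSpec interface. This is only an illustrative sketch: the blob names `score` and `label` and the helper name are assumptions, not names from the prototxt being discussed.

```python
from caffe import layers as L

def append_train_val_head(n):
    # `n` is a caffe.NetSpec whose body already produces a score blob
    # `n.score` and a label blob `n.label` (hypothetical names).
    # SoftmaxWithLoss fuses the softmax with the multinomial logistic loss.
    n.loss = L.SoftmaxWithLoss(n.score, n.label)
    # Accuracy reports top-1 accuracy for monitoring.
    n.accuracy = L.Accuracy(n.score, n.label)
    return n
```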

> I also obtained a lower result: MIoU 74.58%.

Can I have your QQ? I get very poor performance on my own dataset, trained with 8 GPUs. Thank you...

> Hi guys, I will share some of my experiment settings.
>
> Dataset: Cityscapes without the coarse additional data
> Backbone: ResNet-101
> output_stride: 16
> initial_learning_rate: 0.005
> learning_decay: ...
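For reference, the quoted settings can be written down as a plain configuration dict; the keys below mirror the names in the quote, and the truncated decay policy is deliberately left unspecified.

```python
# Sketch of the quoted training configuration; values past the truncation
# point in the comment are not filled in.
train_config = {
    "dataset": "Cityscapes (no coarse additional data)",
    "backbone": "ResNet-101",
    "output_stride": 16,
    "initial_learning_rate": 0.005,
    # "learning_decay": truncated in the original comment, left unspecified
}
```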

> ![QQ截图20190320223533](https://user-images.githubusercontent.com/7559308/54692777-83b2fd80-4b60-11e9-9c24-0cc281aa2f02.png)
>
> Interesting. After running main.py with 4 GPUs, the mean IoU is only 75.75%, not 77.14%...

Can I have your QQ number? I really want...

```python
output = self.fc(output)
output = self.softmax(output)
return output
```

Is the softmax function needed here?
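For context on why this question comes up: if the model is trained with PyTorch's `nn.CrossEntropyLoss`, that loss already applies `log_softmax` to the raw logits, so an explicit `softmax` in `forward` is redundant for training and changes the loss if applied anyway. This is a general PyTorch point, not a statement about this particular repository; the sketch below just demonstrates it.

```python
import torch
import torch.nn as nn

logits = torch.randn(4, 10)            # raw scores, i.e. the output of self.fc
labels = torch.randint(0, 10, (4,))

criterion = nn.CrossEntropyLoss()      # applies log_softmax internally

loss_from_logits = criterion(logits, labels)

# Passing probabilities (softmax output) into CrossEntropyLoss applies the
# normalization twice and gives a different loss and different gradients.
probs = torch.softmax(logits, dim=1)
loss_from_probs = criterion(probs, labels)

print(loss_from_logits.item(), loss_from_probs.item())  # the two values differ
```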

> ```python
> s = self.global_pool(U)
> z = self.fc1(s)
> a_b = self.fc2(z)
> a_b = a_b.reshape(batch_size, self.M, self.out_channels, -1)
> a_b = self.softmax(a_b)
> ```
>
> You mean that here?

@XUYUNYUN666 No, I am pointing to line 86, the softmax...
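As background for the line-86 discussion: in an SK-style selection path, the softmax is taken across the M branches so that, for every channel, the branch attention weights sum to 1 before the branch outputs are fused. The sketch below is illustrative only; the shapes, names, and the choice of `dim=1` are assumptions about a typical implementation, not the repository's code.

```python
import torch

batch_size, M, out_channels, H, W = 2, 2, 64, 32, 32

# Per-branch feature maps stacked along dim=1 -> (B, M, C, H, W).
feats = torch.randn(batch_size, M, out_channels, H, W)

# Raw attention logits reshaped to (B, M, C, 1), as in the quoted snippet.
a_b = torch.randn(batch_size, M, out_channels, 1)

# Softmax over the branch dimension: per channel, the M weights sum to 1.
a_b = torch.softmax(a_b, dim=1)

# Fuse the branches with the normalized weights.
V = (feats * a_b.unsqueeze(-1)).sum(dim=1)       # (B, C, H, W)

print(V.shape)                                   # torch.Size([2, 64, 32, 32])
print(torch.allclose(a_b.sum(dim=1), torch.ones(batch_size, out_channels, 1)))  # True
```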

> The reason why I used conv2d to implement the fully connected layer is that the author of SKNet adopted conv2d. Because there is a bias in the...
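On the conv2d-as-fully-connected point: a 1×1 convolution applied to a (B, C, 1, 1) tensor computes the same affine map (weight plus bias) as `nn.Linear`, which is why either layer can play the role of the fully connected layer here. A quick self-contained check of the equivalence (layer sizes are arbitrary):

```python
import torch
import torch.nn as nn

in_ch, out_ch = 64, 32
x = torch.randn(8, in_ch)                          # pooled feature vectors

fc = nn.Linear(in_ch, out_ch, bias=True)
conv = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=True)

# Copy the same parameters into both layers so the comparison is exact.
with torch.no_grad():
    conv.weight.copy_(fc.weight.view(out_ch, in_ch, 1, 1))
    conv.bias.copy_(fc.bias)

y_fc = fc(x)                                       # (8, out_ch)
y_conv = conv(x.view(8, in_ch, 1, 1)).flatten(1)   # (8, out_ch)

print(torch.allclose(y_fc, y_conv, atol=1e-6))     # True: same map, bias included
```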

> @PHDPeter Hi, this folder contains the results. The videos are the results from the original implementation with the author's shared features. The GIF images are the results from this work: https://drive.google.com/open?id=1AjGkg8ZEaOphJZObIopYHhIf5fCVV2UG I have...

> Hi @XUYUNYUN666, I am not sure, but I think the difference between the two graphs comes from the different implementations in the visualization module and how the results are interpolated. Thank...
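To illustrate the interpolation point: upsampling the same low-resolution score map with different `torch.nn.functional.interpolate` modes (for example `nearest` versus `bilinear`) already produces visibly different label maps, which on its own can explain small discrepancies between two visualizations. The sizes below are arbitrary placeholders.

```python
import torch
import torch.nn.functional as F

# A small score map standing in for a network's raw output (B, classes, H, W).
scores = torch.randn(1, 19, 65, 129)

up_nearest = F.interpolate(scores, size=(513, 1025), mode="nearest")
up_bilinear = F.interpolate(scores, size=(513, 1025), mode="bilinear",
                            align_corners=True)

pred_nearest = up_nearest.argmax(dim=1)
pred_bilinear = up_bilinear.argmax(dim=1)

# Fraction of pixels whose predicted class changes with the resize mode.
diff = (pred_nearest != pred_bilinear).float().mean()
print(f"pixels that differ between modes: {diff.item():.2%}")
```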

> > > No dataloaders

Hello, I am also trying to reproduce DANet. Can I have your QQ number?