Rebeen Ali Hamad

Results: 21 comments by Rebeen Ali Hamad

Thank you very much. Regarding the second answer (A2), I think Giga means 10^9 based on your [code](https://github.com/sovrasov/flops-counter.pytorch/blob/master/ptflops/flops_counter.py#L54), so when we convert to MMac we should compute GMac * 1000 = ...
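For what it's worth, the conversion itself is just the SI prefix ratio (a minimal sketch; the 0.32 GMac input below is an illustrative number, not actual ptflops output):

```python
# Giga = 10^9 and Mega = 10^6, so 1 GMac = 1000 MMac.
def gmac_to_mmac(gmac: float) -> float:
    """Convert a multiply-accumulate count from GMac to MMac."""
    return gmac * 1000.0

print(gmac_to_mmac(0.32))  # 320.0 MMac
```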

This is the computation cost of MobileNetV2, which I think is not correct; what do you think?

```
import torch
import torch.nn as nn
import torch.nn.functional as F

class...
```
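When cross-checking a counter's output by hand, the usual convention is one MAC per multiply in the convolution. A sketch of that standard per-layer formula (not ptflops' exact hook; the layer sizes below are illustrative, not taken from MobileNetV2):

```python
def conv2d_macs(c_in: int, c_out: int, k: int, h_out: int, w_out: int,
                groups: int = 1) -> int:
    """MACs for a k x k Conv2d: each output element costs k*k*(c_in/groups) MACs."""
    return (c_in // groups) * c_out * k * k * h_out * w_out

# A regular 3x3 conv vs. its depthwise counterpart on a 32-channel 112x112 map:
print(conv2d_macs(32, 32, 3, 112, 112))             # 115605504
print(conv2d_macs(32, 32, 3, 112, 112, groups=32))  # 3612672
```

Depthwise layers like those in MobileNetV2 reduce the count by roughly a factor of c_in, so small per-layer differences in how a counter handles groups can change the total a lot.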

I compared them; that is why I am confused. If we look at page 5 of this [paper](https://arxiv.org/pdf/1801.06434.pdf), we can see that the highest FLOPs are 42.0 million for MobileNetV2...

Thank you. Yes, that is really a problem; I am confused about why these two FLOP counts are different.

@jarrettyeo, I just have a question: should X_test in both DE.shap_values and summary_plot have the same shape, or do they have to have different shapes?

```
import shap
tf.compat.v1.disable_eager_execution()
background = X_train[np.random.choice(X_train.shape[0],...
```

Thank you. Actually, I ran into this problem:

```
TypeError                                 Traceback (most recent call last)
 in ()
     14     feature_names=list_column,
     15     max_display=12,
---> 16     plot_type='bar')

/usr/local/lib/python3.6/dist-packages/shap/plots/summary.py in summary_plot(shap_values, features, feature_names, max_display, plot_type, color,...
```

@jarrettyeo Thank you very much. I saw your answer on Stack Overflow, and I also solved the problem, so now I want to run the code properly and let you know about...

@jarrettyeo Thank you very much. Actually, I could not post the screenshot of the results here, so I sent it to you on LinkedIn; could you please let me know your opinion?

@hugddygff Hi, sorry to bother you: have you found a sparse attention implementation in Keras or TensorFlow?

I changed the position of the SE block and put it before `out = self.bn3(self.conv3(out))`, but that causes an error.