Taehoon Lee
@christian-rauch, `keras-applications` is designed to work with two backends: `keras` and `tf.keras`. Thus, if you want to use the `keras`-based applications, you should import modules from `keras`, NOT from `keras-applications`...
@udrechsler, I will share my inference codes in the near future. Key recipes for ImageNet are the following: - Down-sampling should not squash the image into a square. Instead, resize the shorter side while keeping the aspect ratio...
@BenTaylor3115, Please just keep the ratio 7/8 (= 224/256). And as far as I know, there are no examples with image sizes smaller than 224 among the official ImageNet results.
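The shorter-side resize and 224/256 center crop discussed above can be sketched with plain arithmetic (the function name and default sizes are illustrative, not from the original code):

```python
def resize_then_crop_dims(width, height, shorter=256, crop=224):
    """Resize so the shorter side becomes `shorter` (keeping the aspect
    ratio), then return the resized size and the centered `crop` x `crop`
    box as (left, top, right, bottom)."""
    scale = shorter / min(width, height)
    new_w, new_h = round(width * scale), round(height * scale)
    left = (new_w - crop) // 2
    top = (new_h - crop) // 2
    return (new_w, new_h), (left, top, left + crop, top + crop)

# A 640x480 image: shorter side 480 -> 256, so 640 -> 341,
# then a centered 224x224 crop is taken from the 341x256 result.
print(resize_then_crop_dims(640, 480))  # ((341, 256), (58, 16, 282, 240))
```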
@Terizian, Could you describe details such as the statistics over the 100 runs? The analysis of CPU and GPU utilization needs to be more sophisticated.
@Waffleboy, the architecture of NASNet differs according to `input_shape`. Specifically, line 513 (`elif p_shape[img_dim] != ip_shape[img_dim]`) causes structural changes, because the zero-padding in 2x down-sampling (conv, pool) behaves differently...
@Waffleboy, 1. Training from scratch: `keras.applications.NASNetMobile(weights=None, input_shape=(128, 128, 3), classes=7)`, 2. Transfer learning: the section "Fine-tune InceptionV3 on a new set of classes" in [the official docs](https://keras.io/applications/).
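The transfer-learning option above can be sketched as follows. This is a toy illustration: the tiny convolutional base stands in for a pretrained network (in practice you would load e.g. `NASNetMobile(weights='imagenet', include_top=False)`), and it uses `tf.keras` for runnability, though standalone `keras` works the same way:

```python
from tensorflow import keras
from tensorflow.keras import layers, models

# Toy stand-in for a pretrained base such as NASNetMobile or InceptionV3.
base = models.Sequential([
    layers.Conv2D(8, 3, activation='relu', input_shape=(128, 128, 3)),
    layers.GlobalAveragePooling2D(),
])

# Freeze the base so only the new head is trained.
for layer in base.layers:
    layer.trainable = False

# Attach a new classification head for 7 classes.
outputs = layers.Dense(7, activation='softmax')(base.output)
transfer_model = models.Model(base.input, outputs)
transfer_model.compile(optimizer='sgd', loss='categorical_crossentropy')
```

After an initial training pass, the top layers of the base can be unfrozen and fine-tuned with a low learning rate, as in the official docs section referenced above.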
@Waffleboy, Could you share your codes?
@Waffleboy, You should change `parallel_model = multi_gpu_model(current, gpus=6)` to `parallel_model = multi_gpu_model(transfer_model, gpus=6)`.
@Waffleboy, The NASNet weights are fine.
@Nagaraj4896, The error is not reproducible. The test `_test_application_basic(app, module=module)` passes on Travis. Are you using `tests/data/elephant.jpg`?