
QKeras: a quantization deep learning library for TensorFlow Keras

53 qkeras issues

Hi there! I was interested in implementing the QKeras example for the MNIST CNN model given in the examples section - [Link](https://github.com/google/qkeras/blob/master/examples/example_mnist.py). This example involves quantizing the weights and activations...

I have a problem cloning a model. The code below shows an example.

```python
import tensorflow as tf
import numpy as np
import qkeras
import matplotlib.pyplot as plt
from tensorflow.keras.datasets...
```

I get the following stack trace when calling `get_best_model()` on the AutoQKeras object using hyperband mode:

```
Traceback (most recent call last):
  File "exp_nns.py", line 533, in
    auto_qkeras_model = t.auto_mnist_cnn(auto_qkeras=True,...
```

Hi everyone! Is it expected behavior that quantization-aware training in QKeras is much slower than normal training in Keras? And if so, out of interest, where does the overhead...
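On the overhead question: a plausible contributor is the extra fake-quantization ops that quantization-aware training inserts into every forward pass. A library-free sketch of what one such op does (symmetric uniform quantization is an assumption here, not necessarily QKeras's exact scheme, and `fake_quantize` is an illustrative helper, not a QKeras function):

```python
import numpy as np

def fake_quantize(w, bits=8):
    """Simulate quantization in the forward pass: snap weights onto a
    uniform grid while keeping float storage. The extra abs/max, divide,
    round, and clip ops run on every step, which is pure overhead
    relative to plain float training."""
    qmax = 2 ** (bits - 1) - 1
    max_abs = np.max(np.abs(w))
    scale = max_abs / qmax if max_abs > 0 else 1.0
    return np.clip(np.round(w / scale), -qmax - 1, qmax) * scale

w = np.array([0.5, -1.2, 0.03, 0.91])
wq = fake_quantize(w, bits=4)
```

In a real QAT setup these ops wrap every quantized kernel and activation, so the per-step cost scales with the number of quantized tensors.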

My question is about the Qm.n format when m
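The question above is truncated, but for context, one common reading of the Qm.n fixed-point format is m integer bits and n fractional bits plus a sign bit. A sketch under that assumption (`quantize_qmn` is illustrative, not a QKeras function, and QKeras's exact sign-bit convention may differ):

```python
def quantize_qmn(x, m, n):
    """Quantize x to signed fixed-point Qm.n: m integer bits,
    n fractional bits, one implicit sign bit."""
    step = 2.0 ** -n                        # resolution: one LSB
    lo, hi = -2.0 ** m, 2.0 ** m - step     # representable range
    return min(max(round(x / step) * step, lo), hi)

quantize_qmn(1.3, 2, 4)   # → 1.3125 (nearest multiple of 1/16)
```

Values outside the range saturate: for Q2.4 anything above 3.9375 clips to 3.9375 and anything below -4.0 clips to -4.0.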

https://openaccess.thecvf.com/content_ICCV_2019/papers/Gong_Differentiable_Soft_Quantization_Bridging_Full-Precision_and_Low-Bit_Neural_Networks_ICCV_2019_paper.pdf

This pull request fixes issue #72. The issue was that the normal kernel was used as the recurrent kernel when no recurrent quantizer was defined.

Add support for qdense_batchnorm by folding the qdense kernel with the batchnorm parameters, then computing the qdense_batchnorm output using the qdense inputs and the folded kernel.
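The folding described above follows the standard batchnorm-folding algebra. A minimal NumPy sketch of that algebra (`fold_dense_batchnorm` is a hypothetical helper, not the PR's actual code, and the real qdense_batchnorm layer additionally quantizes the folded kernel):

```python
import numpy as np

def fold_dense_batchnorm(w, b, gamma, beta, mean, var, eps=1e-3):
    """Fold batchnorm statistics into a dense layer's kernel and bias so
    that dense(x, w_f, b_f) equals batchnorm(dense(x, w, b)) at
    inference time."""
    scale = gamma / np.sqrt(var + eps)      # per-output-unit scale
    return w * scale, (b - mean) * scale + beta

# sanity check: folded dense matches dense followed by batchnorm
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 5))
w, b = rng.normal(size=(5, 3)), rng.normal(size=3)
gamma, beta = rng.normal(size=3), rng.normal(size=3)
mean, var = rng.normal(size=3), rng.uniform(0.5, 2.0, size=3)

wf, bf = fold_dense_batchnorm(w, b, gamma, beta, mean, var)
y_bn = gamma * ((x @ w + b) - mean) / np.sqrt(var + 1e-3) + beta
y_folded = x @ wf + bf
```

Because the two forms are algebraically identical, folding removes the separate batchnorm op at inference with no change in output.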

cla: yes

Building a simple GRU model using Keras:

```python
gru = Sequential(GRU(16, input_shape=(2,4)))
gru.compile(loss='mse', optimizer='adam')
gru.summary()
```

Produces output:

```
Model: "sequential_332"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
gru_3...
```
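The summary above is cut off before the parameter count, but for reference, a Keras GRU layer's parameter count can be derived by hand: three gates, each with an input kernel, a recurrent kernel, and bias. A sketch assuming TF2 Keras's default `reset_after=True`, which keeps separate input and recurrent biases (`gru_param_count` is an illustrative helper, not a Keras API):

```python
def gru_param_count(units, input_dim, reset_after=True):
    """Parameter count for a single Keras GRU layer: 3 gates, each with
    an input kernel (input_dim x units), a recurrent kernel
    (units x units), and bias. reset_after=True (the TF2 default)
    doubles the bias term."""
    bias_terms = 2 * units if reset_after else units
    return 3 * (units * input_dim + units * units + bias_terms)

gru_param_count(16, 4)   # → 1056, matching GRU(16, input_shape=(2, 4))
```

With `reset_after=False` (the TF1/CuDNN-incompatible variant) the same layer would report 1008 parameters instead.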