
Modular Natural Language Processing workflows with Keras

Results: 360 keras-nlp issues

Currently, we use `tf.py_function` to implement the BLEU score, because the graph-mode implementation we tried was not very efficient. This notebook compares the two approaches: https://colab.research.google.com/drive/1TZ8XnrmMcU8ZE2J-3amb44p-U-hxER53?usp=sharing. We...

enhancement
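A minimal sketch of the `tf.py_function` pattern described in the issue above, not the keras-nlp metric itself; it assumes `nltk` is available for the Python-side BLEU computation, and the function names here are illustrative.

```python
import tensorflow as tf
from nltk.translate.bleu_score import sentence_bleu  # assumed available

def _bleu_py(references, hypothesis):
    # Runs eagerly in Python, so any pure-Python BLEU implementation can be used.
    refs = [r.decode("utf-8").split() for r in references.numpy()]
    hyp = hypothesis.numpy().decode("utf-8").split()
    # Bigram BLEU keeps this toy example non-degenerate for very short sentences.
    return tf.constant(sentence_bleu(refs, hyp, weights=(0.5, 0.5)), dtype=tf.float32)

@tf.function
def bleu_in_graph(references, hypothesis):
    # tf.py_function bridges the eager Python implementation into graph mode.
    return tf.py_function(_bleu_py, inp=[references, hypothesis], Tout=tf.float32)

refs = tf.constant(["the cat sat on the mat"])
hyp = tf.constant("the cat is on the mat")
print(bleu_in_graph(refs, hyp).numpy())
```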

Made an attempt to do the above here: https://colab.research.google.com/drive/1PBMzeBd-HyFE0o4VXwk19-kqHIhOZM49?usp=sharing. Ran into an issue:

```
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-...> in <module>()
      1 inputs = keras.Input(shape=(), dtype="string")
----> 2 ...
```
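The layer that triggers the `TypeError` in the Colab is cut off above, so the following is only a minimal working sketch of the general pattern being attempted: a scalar string `keras.Input` fed through a preprocessing layer, with `TextVectorization` used here as a stand-in for whatever layer the Colab actually applies.

```python
import tensorflow as tf
from tensorflow import keras

# Scalar string input, as in the snippet above.
inputs = keras.Input(shape=(), dtype="string")

# Stand-in preprocessing layer; the Colab uses a different (unshown) layer.
vectorize = keras.layers.TextVectorization(max_tokens=1000, output_sequence_length=8)
vectorize.adapt(["a tiny corpus used only to build a vocabulary"])

x = vectorize(inputs)
outputs = keras.layers.Embedding(input_dim=1000, output_dim=16)(x)
model = keras.Model(inputs, outputs)
model.summary()
```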

**Describe feature** [Integrated Gradients](https://www.tensorflow.org/tutorials/interpretability/integrated_gradients) (IG) can be a great tool for understanding a neural network's predictions. **How will the API change?** It would be extended as follows: ```python from keras_nlp.utils.visualization import IntegratedGradients...
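The proposed `IntegratedGradients` utility is cut off above, so its exact signature is unknown; the sketch below only illustrates the underlying IG computation over a dense (embedded) input using `tf.GradientTape`, and all names and shapes are illustrative rather than the proposed keras_nlp API.

```python
import tensorflow as tf

def integrated_gradients(model, inputs, baseline, target_index, steps=32):
    """Approximate IG attributions for one embedded input of shape (seq_len, dim)."""
    # Interpolate between the baseline and the actual input along `steps` points.
    alphas = tf.reshape(tf.linspace(0.0, 1.0, steps + 1), (-1, 1, 1))
    interpolated = baseline[None, ...] + alphas * (inputs - baseline)[None, ...]

    with tf.GradientTape() as tape:
        tape.watch(interpolated)
        predictions = model(interpolated)
        target = predictions[:, target_index]
    grads = tape.gradient(target, interpolated)

    # Trapezoidal approximation of the path integral, scaled by (input - baseline).
    avg_grads = tf.reduce_mean((grads[:-1] + grads[1:]) / 2.0, axis=0)
    return (inputs - baseline) * avg_grads

# Toy demo with a random "embedded sentence" and a tiny classifier.
model = tf.keras.Sequential(
    [tf.keras.layers.GlobalAveragePooling1D(), tf.keras.layers.Dense(3)]
)
x = tf.random.uniform((10, 16))
print(integrated_gradients(model, x, tf.zeros_like(x), target_index=0).shape)
```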

Currently, the [`rouge`](https://github.com/google-research/google-research/tree/master/rouge) package does not support passing a custom tokeniser. This commit takes care of that: https://github.com/google-research/google-research/commit/61ce9f0ca76025dac5b671c0631e443a9975a8a3. However, it is not part of a release yet. Waiting...

type:feature
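A sketch of how a custom tokenizer could be passed once the linked commit lands in a release; this assumes the post-commit `rouge_score` API accepts a `tokenizer` object exposing a `tokenize()` method, which should be verified against the released version.

```python
from rouge_score import rouge_scorer

class WhitespaceTokenizer:
    """Toy custom tokenizer: lowercases and splits on whitespace."""

    def tokenize(self, text):
        return text.lower().split()

# The `tokenizer=` argument is only available after the linked commit is released.
scorer = rouge_scorer.RougeScorer(["rouge1", "rougeL"], tokenizer=WhitespaceTokenizer())
scores = scorer.score("the cat sat on the mat", "the cat lay on the mat")
print(scores["rouge1"].fmeasure)
```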

Closes [#156](https://github.com/keras-team/keras-nlp/issues/156)

We can add notebooks (or share Colab notebooks) for existing examples in the library (with instructive text and explanation).

Issue discussing the nuances of the Random Swaps layer. A rough Colab implementation can be found [here](https://colab.research.google.com/gist/aflah02/236ff822f8eeb29f95423f664afff73a/randomswaps.ipynb)
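A plain-Python sketch of the swap operation discussed above, in the spirit of EDA-style augmentation; it is not the keras-nlp layer, and the function name and arguments are illustrative.

```python
import random

def random_swap(tokens, num_swaps=1, seed=None):
    """Randomly swap `num_swaps` pairs of positions in a token list."""
    rng = random.Random(seed)
    tokens = list(tokens)
    for _ in range(num_swaps):
        if len(tokens) < 2:
            break
        i, j = rng.sample(range(len(tokens)), 2)
        tokens[i], tokens[j] = tokens[j], tokens[i]
    return tokens

print(random_swap(["keras", "nlp", "is", "modular"], num_swaps=2, seed=42))
```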

**Describe the bug** When a model containing a TransformerDecoder is loaded with the custom objects option, the layers within the TransformerDecoder are rebuilt and reinitialized. **To Reproduce** In keras_nlp/layers/transformer_decoder_test, change the test case...

bug
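A minimal sketch of the kind of reproduction described above (the actual repro edits a test case in `keras_nlp/layers/transformer_decoder_test`): save a small model containing a `TransformerDecoder`, reload it with `custom_objects`, and compare outputs. The save path and format here are illustrative.

```python
import numpy as np
from tensorflow import keras
import keras_nlp

# Tiny model around a decoder-only TransformerDecoder (no cross-attention input).
inputs = keras.Input(shape=(8, 16))
outputs = keras_nlp.layers.TransformerDecoder(intermediate_dim=32, num_heads=2)(inputs)
model = keras.Model(inputs, outputs)

data = np.random.uniform(size=(2, 8, 16)).astype("float32")
before = model(data)

model.save("decoder_model.keras")  # illustrative path/format
restored = keras.models.load_model(
    "decoder_model.keras",
    custom_objects={"TransformerDecoder": keras_nlp.layers.TransformerDecoder},
)
after = restored(data)

# If the reported bug is present, the inner layers were re-initialized
# and the restored outputs will not match the originals.
print(np.allclose(before, after, atol=1e-5))
```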

This is an extension of the [Random Deletion Layer](https://github.com/keras-team/keras-nlp/issues/152). I currently plan to use this issue to link the next PR, which will add Stop Word Deletion specifically to the...
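A plain-Python illustration of the stop word deletion discussed above; the stop word set and function name are illustrative and not the planned layer API.

```python
def delete_stop_words(tokens, stop_words):
    """Remove tokens that appear in a user-supplied stop word set (case-insensitive)."""
    stop_words = {w.lower() for w in stop_words}
    return [t for t in tokens if t.lower() not in stop_words]

print(delete_stop_words(["the", "model", "is", "a", "transformer"], {"the", "is", "a"}))
```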

I think we are at the point where we need some automated testing for this. This runs preprocessing, a few pretraining train steps, and a few finetuning train steps by...