
Modular Natural Language Processing workflows with Keras

Results: 360 keras-nlp issues

I'm interested in contributing scripts that let users apply data augmentation techniques directly, without external libraries. I can start with techniques like synonym replacement, random insertion, random swap,...

type:feature
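
A minimal sketch of the random-swap idea from the issue above, in plain Python. The function name and signature are illustrative only, not an existing or proposed keras-nlp API:

```python
import random

def random_swap(words, n_swaps=1, seed=None):
    """Randomly swap two words in the sequence, n_swaps times.

    Illustrative sketch of the random-swap augmentation idea;
    the eventual keras-nlp API may look quite different.
    """
    rng = random.Random(seed)
    words = list(words)
    for _ in range(n_swaps):
        if len(words) < 2:
            break
        i, j = rng.sample(range(len(words)), 2)
        words[i], words[j] = words[j], words[i]
    return words

print(random_swap("the quick brown fox".split(), n_swaps=1, seed=0))
```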

We should add an integration test that runs a small end-to-end training job for TransformerEncoder/TransformerDecoder, ideally also exercising a tokenizer and position embedding.
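
One possible shape for such a test, assuming `keras_nlp.layers.TransformerEncoder` and `keras_nlp.layers.TokenAndPositionEmbedding` with their usual constructor arguments; the tiny synthetic dataset exists only to verify that a full compile/fit cycle runs:

```python
import numpy as np
import tensorflow as tf
import keras_nlp

def test_transformer_encoder_fit():
    # Tiny synthetic classification task; the point is only that
    # embedding -> encoder -> head trains end to end without error.
    vocab_size, seq_len, batch = 100, 16, 8
    x = np.random.randint(0, vocab_size, size=(batch, seq_len))
    y = np.random.randint(0, 2, size=(batch,))

    inputs = tf.keras.Input(shape=(seq_len,), dtype="int32")
    h = keras_nlp.layers.TokenAndPositionEmbedding(
        vocabulary_size=vocab_size,
        sequence_length=seq_len,
        embedding_dim=32,
    )(inputs)
    h = keras_nlp.layers.TransformerEncoder(
        intermediate_dim=64, num_heads=2
    )(h)
    # Classify from the first token's representation.
    outputs = tf.keras.layers.Dense(2, activation="softmax")(h[:, 0, :])

    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    history = model.fit(x, y, epochs=1, verbose=0)
    assert "loss" in history.history
```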

**Is your feature request related to a problem? Please describe.** This idea comes from one of the NeurIPS 2021 best papers: [MAUVE: Measuring the Gap Between Neural Text and Human Text...

NLP papers often compare against baselines, and a prebuilt random encoder could help with that. A random encoder is similar to a simple encoder with a slight difference here...
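
A sketch of what such a baseline could look like: a standard encoder whose weights are left at their random initialization and frozen. `build_random_encoder` is a hypothetical helper for illustration, not a proposed API:

```python
import tensorflow as tf
import keras_nlp

def build_random_encoder(vocab_size=1000, seq_len=64, embed_dim=32):
    """Return a frozen, randomly initialized encoder to use as a baseline."""
    inputs = tf.keras.Input(shape=(seq_len,), dtype="int32")
    h = keras_nlp.layers.TokenAndPositionEmbedding(
        vocabulary_size=vocab_size,
        sequence_length=seq_len,
        embedding_dim=embed_dim,
    )(inputs)
    h = keras_nlp.layers.TransformerEncoder(
        intermediate_dim=64, num_heads=2
    )(h)
    encoder = tf.keras.Model(inputs, h)
    encoder.trainable = False  # weights stay at their random initialization
    return encoder
```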

@mattdangerw and the keras-nlp team: for standard classification metrics (AUC, F1, Precision, Recall, Accuracy, etc.), [keras.metrics](https://keras.io/api/metrics/) can be used. But there are several NLP-specific metrics that could be implemented here,...
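
As one illustration, an NLP-specific metric such as perplexity can be written as a `tf.keras.metrics.Metric` subclass; this is a sketch of the pattern, not the eventual keras-nlp implementation:

```python
import tensorflow as tf

class Perplexity(tf.keras.metrics.Metric):
    """Perplexity as exp(mean cross-entropy) over all seen batches."""

    def __init__(self, name="perplexity", **kwargs):
        super().__init__(name=name, **kwargs)
        self._ce = tf.keras.metrics.Mean(name="cross_entropy")

    def update_state(self, y_true, y_pred, sample_weight=None):
        # Per-token cross-entropy, averaged into a running mean.
        ce = tf.keras.losses.sparse_categorical_crossentropy(y_true, y_pred)
        self._ce.update_state(ce, sample_weight=sample_weight)

    def result(self):
        return tf.exp(self._ce.result())

    def reset_state(self):
        self._ce.reset_state()
```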

We can add a few examples:

- Token Classification with BERT
  **Dataset:** CoNLL 2003
  **What's different?** Here, we have to classify every word into its NER type. However, since BERT...

type:feature
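
A sketch of the head such an example would need: unlike sequence classification, token classification predicts one tag per position. The shapes and the assumption that per-token features come from a BERT-style encoder are illustrative only:

```python
import tensorflow as tf

# One NER tag prediction per token position. `token_features` stands in
# for the per-token outputs of a BERT-style encoder (shapes assumed).
num_tags, seq_len, hidden_dim = 9, 128, 768

token_features = tf.keras.Input(shape=(seq_len, hidden_dim))
logits = tf.keras.layers.Dense(num_tags)(token_features)  # per-token logits
tagger = tf.keras.Model(token_features, logits)
tagger.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
```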

We would like to use type annotations in KerasNLP. We should add them to the BERT example code in https://github.com/keras-team/keras-nlp/tree/master/examples/bert
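
For illustration, a hypothetical helper annotated in the style the issue asks for; the function itself is not taken from the BERT example code:

```python
from typing import Dict, List

def build_vocabulary(tokens: List[str], min_count: int = 1) -> Dict[str, int]:
    """Map each token seen at least `min_count` times to an integer id."""
    counts: Dict[str, int] = {}
    for token in tokens:
        counts[token] = counts.get(token, 0) + 1
    frequent = [t for t, c in counts.items() if c >= min_count]
    return {token: index for index, token in enumerate(frequent)}
```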

This PR is a rework of https://github.com/keras-team/keras-nlp/pull/303. The PR was recreated rather than edited directly, for cleaner remote-local tracking.

# Proposal

In #387 we allowed construction of a BERT model from a "preset" model architecture and weights; for example, `Bert.from_preset("bert_base_uncased_en")`. I propose doing the same for `BertPreprocessor`, automatically...
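
A sketch of how the proposed API might read; the exact class paths and the shape of the preprocessor's output are assumptions based on the issue text:

```python
import keras_nlp

# Hypothetical usage under the proposal: the preprocessor is looked up
# by the same preset name as the model, so the matching vocabulary and
# packing settings come along automatically.
preprocessor = keras_nlp.models.BertPreprocessor.from_preset(
    "bert_base_uncased_en"
)
model = keras_nlp.models.Bert.from_preset("bert_base_uncased_en")

features = preprocessor(["The quick brown fox."])
outputs = model(features)
```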