keras-nlp
                        Integrate PositionEmbedding into the BERT example
We should replace the entire embedding code in this file with keras_nlp and keras components. https://github.com/keras-team/keras-nlp/blob/master/examples/bert/bert_model.py
There are a few different ways we could do this. We can't use TokenAndPositionEmbedding alone, as the BERT model also needs a segment embedding.
The best approach for now might be to adapt the TokenAndPositionEmbedding code into a TokenPositionAndSegmentEmbedding that lives directly in bert_model.py. This would have the same signature as TokenAndPositionEmbedding with the addition of a max_segments parameter, and would compute a segment id embedding along with the position and token embeddings inside call.
We could discuss whether to add this as a core keras-nlp offering in the future, but adding it directly to our BERT example code first would be a good way to validate the design.
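To make the proposal concrete, here is a minimal sketch of what such a layer could look like. This is not the actual keras-nlp implementation; the class name, the `max_segments` parameter, and the constructor signature are assumptions taken from the description above, and the layer simply sums three standard Keras embeddings:

```python
import tensorflow as tf
from tensorflow import keras


class TokenPositionAndSegmentEmbedding(keras.layers.Layer):
    """Sketch of a combined token + position + segment embedding.

    Hypothetical adaptation of keras_nlp's TokenAndPositionEmbedding;
    `max_segments` is the extra parameter proposed in this issue.
    """

    def __init__(
        self,
        vocabulary_size,
        sequence_length,
        embedding_dim,
        max_segments=2,
        **kwargs,
    ):
        super().__init__(**kwargs)
        self.token_embedding = keras.layers.Embedding(vocabulary_size, embedding_dim)
        self.position_embedding = keras.layers.Embedding(sequence_length, embedding_dim)
        self.segment_embedding = keras.layers.Embedding(max_segments, embedding_dim)

    def call(self, token_ids, segment_ids):
        # Positions are just 0..seq_len-1, broadcast across the batch.
        positions = tf.range(tf.shape(token_ids)[-1])
        return (
            self.token_embedding(token_ids)
            + self.position_embedding(positions)
            + self.segment_embedding(segment_ids)
        )


# Example usage with dummy ids.
layer = TokenPositionAndSegmentEmbedding(
    vocabulary_size=100, sequence_length=16, embedding_dim=8, max_segments=2
)
token_ids = tf.zeros((2, 16), dtype=tf.int32)
segment_ids = tf.ones((2, 16), dtype=tf.int32)
out = layer(token_ids, segment_ids)
print(out.shape)  # (2, 16, 8)
```

Summing the three embeddings matches how the original BERT paper combines token, position, and segment information before the transformer encoder stack.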
Hi @mattdangerw, I would like to work on this. If I am not mistaken, you are referring to the class in keras-nlp/tree/master/examples/machine_translation/model.py, right? I ask because the path given above seems to be outdated. Please correct me if I am wrong.