cehr-bert
Switch from our own implementation of MultiHeadAttention to the one provided by TensorFlow
- Upgrade TensorFlow to 2.12 so we can use TensorFlow's version of MultiHeadAttention
- Replace our own implementation of MultiHeadAttention with TensorFlow's `MultiHeadAttention` layer
- Toggle `mask` at https://github.com/cumc-dbmi/cehr-bert/blob/8862fd6a1f026839d301625fb30a1ba7e19a4672/data_generators/learning_objective.py#L281, because in TensorFlow's convention `mask=1` indicates attention whereas `mask=0` represents no attention
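The mask toggle in the last step can be sketched as follows. This is a minimal illustration, assuming the existing pipeline emits `1` for padded (non-attended) positions; the `toggle_mask` helper name is hypothetical, and the flipped mask would then be passed as the `attention_mask` argument of `tf.keras.layers.MultiHeadAttention`:

```python
import numpy as np

def toggle_mask(mask):
    """Invert a 0/1 mask (hypothetical helper).

    tf.keras.layers.MultiHeadAttention expects attention_mask
    with 1 = attend and 0 = do not attend. If the existing data
    generator encodes the opposite convention (1 = padding),
    the mask must be flipped before being passed to the layer.
    """
    return 1 - np.asarray(mask)

# Example: positions 3 and 4 were marked as padding with 1
pipeline_mask = np.array([[0, 0, 1, 1]])
tf_attention_mask = toggle_mask(pipeline_mask)
print(tf_attention_mask)  # [[1 1 0 0]]
```

Equivalently, inside the TensorFlow graph the same flip can be written as `tf.ones_like(mask) - mask`, keeping the operation differentiable-graph friendly.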