Philippe Rémy
But first, let's try to understand why, before I merge. In my case, if I start the program with multiple processes and multiple threads at the same time, it...
@ooobsidian sure, go ahead with your implementation for now. I'll need to look at it in detail to understand why we see different behaviors :)
@MihaiBairac how about treating your file as a string?
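For instance, a trivial sketch (the filename `data.txt` is just a placeholder):

```python
# Read the whole file into memory and work with it as one string.
with open('data.txt', encoding='utf-8') as f:
    content = f.read()
print(len(content))  # `content` is now a regular Python string
```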
@Archirekh-Majumder you should probably ask this question directly on https://github.com/stanfordnlp/CoreNLP.
@jeromemassot I don't think you can run it on Colab (as it is). The library starts a big Java server (a CoreNLP server) locally and then sends requests to it. I don't know...
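To illustrate that client/server pattern, here is a minimal sketch using stanza's `CoreNLPClient` as an example client (an assumption: this may not be the exact library discussed here). It needs a local Java install and the `CORENLP_HOME` environment variable pointing at a CoreNLP download, which is why a plain Colab notebook is a poor fit:

```python
from stanza.server import CoreNLPClient

# The client boots the Java CoreNLP server locally, then talks to it
# over HTTP; annotations come back as protobuf objects.
with CoreNLPClient(annotators=['tokenize', 'ssplit', 'pos'],
                   timeout=30000, memory='4G') as client:
    ann = client.annotate("Stanford is a university in California.")
    for sentence in ann.sentence:
        print([token.word for token in sentence.token])
```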
@timwfburton thanks for the feedback, appreciated! Can you paste the full code so that I can run it?
https://github.com/philipperemy/keras-attention-mechanism/blob/482b0c937b3888da5967b47478701838a4222269/examples/add_two_numbers.py#L95 What you want is the attention_weights tensor. It may not work well with the Sequential API, though; try the functional API (see the sketch below).
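A minimal sketch of that idea with the functional API. The attention computation and the `attention_weights` layer name here are stand-ins, not the repo's exact code; the point is that naming the layer lets you build a second model that outputs its tensor:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Toy model with a named attention block over an LSTM's timesteps.
inputs = keras.Input(shape=(20, 1))
x = layers.LSTM(32, return_sequences=True)(inputs)           # (batch, 20, 32)
score = layers.Dense(1)(x)                                   # (batch, 20, 1)
attention = layers.Softmax(axis=1, name='attention_weights')(score)
context = layers.GlobalAveragePooling1D()(layers.Multiply()([x, attention]))
outputs = layers.Dense(1)(context)
model = keras.Model(inputs, outputs)

# A second model sharing the same layers, but outputting the attention
# weights instead of the final prediction.
probe = keras.Model(inputs, model.get_layer('attention_weights').output)
weights = probe.predict(np.random.rand(4, 20, 1))
print(weights.shape)  # (4, 20, 1): one weight per timestep
```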
I'll close this issue (cf. answer above). If it's not clear, feel free to comment on it!
@LuisFMCuriel hey!! Thanks for reporting! Can you provide a quick snippet that I can run to see why it's failing here?
@LuisFMCuriel So I've been experimenting extensively with the sub-models and I could not find a way to make it work reliably...