nidhikamath91

Results 10 comments of nidhikamath91

Changing tf.concat(0, [context_embedded, utterance_embedded]) to tf.concat([context_embedded, utterance_embedded], 0) resolved the issue.
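For context, TensorFlow 1.0 moved the axis argument of tf.concat from the first position to the last. A minimal sketch of the fix, with hypothetical stand-in tensors for context_embedded and utterance_embedded:

```python
import tensorflow as tf

# Hypothetical stand-ins for the two embedding tensors from the issue.
context_embedded = tf.zeros([2, 4])
utterance_embedded = tf.ones([3, 4])

# Pre-1.0 API: tf.concat(0, [a, b]) -- raises a TypeError on TF >= 1.0.
# Post-1.0 API: the list of tensors comes first, the axis last.
merged = tf.concat([context_embedded, utterance_embedded], 0)
print(tuple(merged.shape))  # (5, 4)
```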

I used the exact code in preprocessing.txt and it worked. I did not understand the explanation in the new workaround.

Thanks. But could you help me with how to create placeholders for the input data and use them in data.py, e.g. a placeholder for the input query?
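A minimal sketch of what a placeholder for an input query could look like in TF 1.x graph mode (written against tf.compat.v1 so it also runs under TF 2; the names "query" and "embeddings" and the vocabulary/embedding sizes are illustrative assumptions, not taken from data.py):

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

# Hypothetical placeholder: a batch of token-id sequences for the query.
query = tf1.placeholder(tf.int32, shape=[None, None], name="query")

# Hypothetical embedding table (vocab of 1000, dimension 8).
embeddings = tf1.get_variable("embeddings", shape=[1000, 8])
query_embedded = tf.nn.embedding_lookup(embeddings, query)

with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    # The actual token ids are fed in at run time via feed_dict.
    out = sess.run(query_embedded, feed_dict={query: [[1, 2, 3]]})

print(out.shape)  # (1, 3, 8)
```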

And what would x be? If it is the input query, how do I represent it? Regards, Nidhi

In the part below:

input_seq = input('Enter Query: ')
sentence = inference(input_seq)

Aren't you feeding the input query to the inference method? How can I take that input sequence as...

Interesting. Could you send me a link to where you found this? On Tue, Oct 16, 2018, 4:56 PM Prabhsimran Singh wrote: @nidhikamath91 apparently TF serving doesn't support...

Can I convert it to stateless in any way? On Tue, Oct 16, 2018, 5:13 PM Prabhsimran Singh wrote: https://stackoverflow.com/questions/49471395/adding-support-for-stateful-rnn-models-within-the-tf-serving-api

Hello, I was thinking of the solution below; tell me what you think about it. I will create an inference graph with issue and solution placeholders and then...
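A rough sketch of that idea, assuming the plan is to build a graph with two placeholders and export it as a SavedModel for TF Serving. Everything here is illustrative: the placeholder names, the mean-pooled embeddings, and the dot-product score are stand-ins for whatever the real model computes.

```python
import os
import tempfile
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

# Hypothetical two-placeholder inference graph: token ids in, one score out.
issue = tf1.placeholder(tf.int32, [None, None], name="issue")
solution = tf1.placeholder(tf.int32, [None, None], name="solution")

emb = tf1.get_variable("emb", shape=[100, 16])
issue_vec = tf.reduce_mean(tf.nn.embedding_lookup(emb, issue), axis=1)
solution_vec = tf.reduce_mean(tf.nn.embedding_lookup(emb, solution), axis=1)
score = tf.reduce_sum(issue_vec * solution_vec, axis=1, name="score")

# Export a SavedModel that TF Serving can load.
export_dir = os.path.join(tempfile.mkdtemp(), "model")
with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    tf1.saved_model.simple_save(sess, export_dir,
                                inputs={"issue": issue, "solution": solution},
                                outputs={"score": score})

print(os.path.exists(os.path.join(export_dir, "saved_model.pb")))  # True
```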

But converting them to NumPy arrays is only possible at runtime, right? With eval()
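Right: in TF 1.x graph mode a tensor only becomes a concrete NumPy array once it is evaluated inside a session. A minimal illustration (tensor and values are made up):

```python
import numpy as np
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

t = tf.constant([[1, 2], [3, 4]])

# Outside a session, t is only a symbolic graph node; eval() needs a
# live session to actually produce a NumPy array.
with tf1.Session() as sess:
    arr = t.eval(session=sess)

print(isinstance(arr, np.ndarray), arr.tolist())  # True [[1, 2], [3, 4]]
```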

Right now I am building an inference graph such that I pass the issue string to the issue placeholder, create a lookup tensor, and pass it to net rnn, but how...
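One way the "string in, lookup tensor out" step could look, using an in-graph vocabulary table so the mapping happens inside the graph rather than in Python (the vocabulary and the sample issue string are made up for illustration):

```python
import tensorflow as tf

# Hypothetical vocabulary; index 0 doubles as the out-of-vocabulary id.
vocab = ["<unk>", "how", "to", "fix", "error"]
init = tf.lookup.KeyValueTensorInitializer(
    keys=tf.constant(vocab),
    values=tf.range(len(vocab), dtype=tf.int64))
table = tf.lookup.StaticHashTable(init, default_value=0)

# Split the raw issue string into tokens, then map tokens to ids.
issue = tf.constant(["how to fix error"])
tokens = tf.strings.split(issue).to_tensor(default_value="<unk>")
ids = table.lookup(tokens)

print(ids.numpy().tolist())  # [[1, 2, 3, 4]]
```

The resulting id tensor is what would then be fed to the RNN's embedding lookup.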