Relation-Network
What is self.labels, and how do you define your context objects? Specifically, what does this placeholder represent:
self.label = tf.placeholder(
dtype=tf.float32,
shape=[self.batch_size, self.c_max_len, self.c_max_len],
name="label"
)
I'm a bit confused as to why the dimensions are context_length by context_length. For a bit more context, I also don't understand what you're doing in the following lines:
s_embedded = sentenceLSTM(sentences, real_lens, reuse = reuse)
c_embedded = tf.concat([s_embedded, labels], axis=1)
c_embedded = tf.reshape(c_embedded, shape = [self.batch_size, self.c_max_len, self.c_max_len + self.c_word_embed])
tagged_c_objects = tf.unstack(c_embedded, axis=1)
Could you explain this to me? Cheers
The label is the sentence position, represented as a length-20 one-hot vector. The sentence right above the question gets label 1, which is encoded as [1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0].
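The position labels described above can be sketched with NumPy (a hypothetical illustration of the encoding, not the repository's actual preprocessing code; the name c_max_len matches the placeholder shape quoted in the question):

```python
import numpy as np

# One context has up to c_max_len = 20 sentences. Row i of an identity
# matrix one-hot encodes position label i+1, so the sentence right
# above the question (label 1) is row 0.
c_max_len = 20
labels = np.eye(c_max_len, dtype=np.float32)

print(labels[0][:3])  # first three entries of label 1: [1. 0. 0.]
```

Stacking one such 20x20 matrix per context gives the [batch_size, c_max_len, c_max_len] shape of the self.label placeholder.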
s_embedded = sentenceLSTM(sentences, real_lens, reuse = reuse)
size: [batch_size*20, 32]
As you know, all 20 sentences in one context pass through the same sentenceLSTM. In TensorFlow, it is really inefficient to loop over the 20 sentences, so I fold the 20 sentences into the batch dimension and embed them all in one call.
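The folding trick can be sketched with NumPy shapes (the dimensions batch_size=3 and s_max_len=7 here are made-up examples; only c_max_len=20 and the embedding size 32 come from the sizes quoted in this thread):

```python
import numpy as np

batch_size, c_max_len, s_max_len = 3, 20, 7

# Word-id input: one row of ids per sentence, 20 sentences per context.
sentences = np.zeros((batch_size, c_max_len, s_max_len), dtype=np.int32)

# Fold the sentence axis into the batch axis so a single LSTM call
# processes every sentence of every context at once.
flat = sentences.reshape(batch_size * c_max_len, s_max_len)
print(flat.shape)  # (60, 7)

# After the LSTM, each sentence becomes a length-32 embedding,
# giving the [batch_size*20, 32] shape mentioned above.
s_embedded = np.zeros((batch_size * c_max_len, 32), dtype=np.float32)
print(s_embedded.shape)  # (60, 32)
```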
c_embedded = tf.concat([s_embedded, labels], axis=1)
size: [batch_size*20, 52]
This tags each sentence embedding with its position label.
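The concatenation and its resulting shape can be checked with a NumPy sketch (batch_size=2 is a made-up example; the dimensions 32 and 20, and their sum 52, are the sizes quoted above):

```python
import numpy as np

batch_size, c_max_len, c_word_embed = 2, 20, 32

# Sentence embeddings from the LSTM: [batch_size*20, 32].
s_embedded = np.zeros((batch_size * c_max_len, c_word_embed), dtype=np.float32)
# Flattened one-hot position labels: [batch_size*20, 20].
labels = np.zeros((batch_size * c_max_len, c_max_len), dtype=np.float32)

# Appending the label to each embedding gives [batch_size*20, 52].
c_embedded = np.concatenate([s_embedded, labels], axis=1)
print(c_embedded.shape)  # (40, 52)
```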
c_embedded = tf.reshape(c_embedded, shape = [self.batch_size, self.c_max_len, self.c_max_len + self.c_word_embed])
tagged_c_objects = tf.unstack(c_embedded, axis=1)
tagged_c_objects is a list of 20 embedded sentence tensors. There is no permutation function in TensorFlow, so I build this 20-element list and form all combinations of objects with itertools.
I hope this answer helps.