acelove
Now I use **Function(lambda x: tf.nn.embedding_lookup(word_embedding, x))** to replace **Embedding(len(vocab), dims)**, where word_embedding is defined as follows: `word_embedding = tf.Variable(tf.random_normal([len(vocab), dims]))`. Is this approach efficient enough?
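For reference, here is a minimal sketch of the two variants side by side (TF1-style TensorFlow Fold; the vocabulary and embedding size values are placeholders, not from the thread):

```python
import tensorflow as tf
import tensorflow_fold as td

vocab = ['<unk>', 'the', 'cat', 'sat']   # hypothetical vocabulary
dims = 5                                  # hypothetical embedding size

# Built-in layer: td.Embedding owns its embedding table internally.
embed_builtin = td.Scalar('int32') >> td.Function(td.Embedding(len(vocab), dims))

# Variant described above: look up rows of a variable you manage yourself.
word_embedding = tf.Variable(tf.random_normal([len(vocab), dims]))
embed_manual = (td.Scalar('int32') >>
                td.Function(lambda x: tf.nn.embedding_lookup(word_embedding, x)))
```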
I use **Broadcast** and **Zip** to achieve my goal. However, Broadcast copies the data many times, so it costs a lot of memory. Is there a more efficient way?
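The exact goal isn't spelled out above, but the usual Broadcast/Zip idiom looks roughly like the sketch below (all names and the vector size are illustrative): pair one fixed vector with every element of a variable-length sequence.

```python
import tensorflow_fold as td

# Input: {'vec': one 5-dim vector, 'seq': a list of 5-dim vectors}
pair_with_each = (
    td.Record({'vec': td.Vector(5) >> td.Broadcast(),   # repeat the single vector
               'seq': td.Map(td.Vector(5))}) >>         # the variable-length sequence
    td.Zip() >>                                          # zip stops at the shorter sequence
    td.Map(td.Concat()))                                 # e.g. concat each (vector, element) pair
```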
If N is fixed and known, you can use `NGrams(N) >> GetItem(0) >> Concat() >> Function(lambda x: tf.reshape(x, (-1, N, 5)))` to convert a sequence of vectors to a tensor of shape (N, 5).
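Wired into a complete block, that suggestion looks roughly like the sketch below; the `td.Map(td.Vector(5))` front end is only a stand-in for whatever block actually produces the 5-dim vectors, and `N = 4` is an arbitrary example value.

```python
import tensorflow as tf
import tensorflow_fold as td

N = 4  # the fixed, known sequence length (example value)

to_tensor = (
    td.Map(td.Vector(5)) >>       # stand-in: a length-N sequence of 5-dim vectors
    td.NGrams(N) >>               # a length-N sequence has exactly one N-gram
    td.GetItem(0) >>              # take that single tuple of N vectors
    td.Concat() >>                # concatenate into one vector of length N*5
    td.Function(lambda x: tf.reshape(x, (-1, N, 5))))  # back to (N, 5) per example
```

The `-1` in the reshape is the batch dimension, since the TF op inside `td.Function` sees batched tensors.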
Because after using td.Map I get a sequence of tensors. However, the model's output cannot be a SequenceType, so I think I have to convert it to a TupleType.