Deep-math-machine-learning.ai
Top 10 Similar terms
Firstly, thank you for the tutorial. I wanted to ask if it is possible to use the final embeddings to test out a word and return the top 10 similar terms.
e.g. top 10 similar words given an input word:

```python
word = "external"
word_vec = final_embeddings[dictionary[word]]
# Negate so argsort (ascending) puts the most similar words first;
# note the best match will be the query word itself.
sim = np.dot(word_vec, -final_embeddings.T).argsort()[0:10]
for idx in range(10):
    print(reverse_dictionary[sim[idx]])
```
An embedding vector represents the word, so similar words have similar embeddings (we can use a distance metric to find out how close they are).
This thread covers different ways to handle it; please check it out: https://stackoverflow.com/questions/40074412/word2vec-get-nearest-words
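As a sketch of the distance-metric idea above: normalizing the embedding matrix row-wise makes every dot product a cosine similarity, so the nearest neighbors of a word are just the rows with the highest dot product against its vector. The snippet below assumes the tutorial's `final_embeddings`, `dictionary`, and `reverse_dictionary` objects; the toy vocabulary and random embeddings are made up purely so the example runs on its own.

```python
import numpy as np

def most_similar(word, embeddings, dictionary, reverse_dictionary, top_k=10):
    # Normalize rows so dot products equal cosine similarity.
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    vec = normed[dictionary[word]]
    sims = normed @ vec                       # cosine similarity to every word
    # Sort descending by similarity; index 0 is the word itself, so skip it.
    nearest = (-sims).argsort()[1:top_k + 1]
    return [reverse_dictionary[i] for i in nearest]

# Toy stand-ins for the tutorial's variables, just for demonstration.
rng = np.random.default_rng(0)
vocab = ["external", "internal", "word", "vector", "model", "train",
         "test", "data", "graph", "node", "edge", "embed"]
dictionary = {w: i for i, w in enumerate(vocab)}
reverse_dictionary = {i: w for w, i in dictionary.items()}
final_embeddings = rng.standard_normal((len(vocab), 16))

print(most_similar("external", final_embeddings, dictionary, reverse_dictionary))
```

With real trained embeddings you would drop the toy block and call `most_similar` directly on `final_embeddings`.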
Hey @Madhu009, thanks for the reply. I understand embedding vectors; I was just wondering if there was a quick workaround for the code so that I can plug in a word and return the top 10 similar terms. I tried using the TensorBoard projector, but that wasn't successful either. Currently researching other methods too.