
Top 10 Similar terms

Mustyy opened this issue 6 years ago · 2 comments

Thank you firstly for the tutorial. I wanted to ask if it is possible to use the final embeddings to test out a word and return the top 10 similar terms.

e.g., top 10 similar words given an input word:

```python
word = "external"
word_vec = final_embeddings[dictionary[word]]
# Negating the embeddings makes the ascending argsort return the largest dot products first.
sim = np.dot(word_vec, -final_embeddings.T).argsort()[0:10]
for idx in range(10):
    print(reverse_dictionary[sim[idx]])
```

Mustyy · Nov 14 '18 21:11

An embedding vector represents the word, so similar words have similar embeddings (we can use a distance metric to find out how close they are).

This thread covers different ways to handle it; please check it out: https://stackoverflow.com/questions/40074412/word2vec-get-nearest-words
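As a reference, here is a minimal sketch of that idea using cosine similarity, assuming `final_embeddings` is the (vocab_size, embedding_dim) NumPy array from the tutorial and `dictionary` / `reverse_dictionary` are the word-to-id and id-to-word maps used in the snippet above (the `most_similar` helper name is just for illustration):

```python
import numpy as np

def most_similar(word, final_embeddings, dictionary, reverse_dictionary, k=10):
    """Return the k words whose embeddings are closest (by cosine similarity) to `word`."""
    # Normalize all embeddings so a plain dot product equals cosine similarity.
    norms = np.linalg.norm(final_embeddings, axis=1, keepdims=True)
    normalized = final_embeddings / norms
    word_vec = normalized[dictionary[word]]
    # Similarity of the query word to every word in the vocabulary.
    similarities = normalized @ word_vec
    # argsort is ascending, so take the top k+1, reverse to descending order,
    # and drop the first hit, which is the query word itself.
    nearest = similarities.argsort()[-(k + 1):][::-1][1:]
    return [reverse_dictionary[idx] for idx in nearest]
```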

Madhu009 · Nov 17 '18 07:11

Hey @Madhu009, thanks for the reply. I understand embedding vectors; I was just wondering if there was a quick workaround for the code so that I can plug in a word and return the top 10 similar terms. I tried using TensorBoard but that wasn't successful either. Currently researching other methods too.
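With a helper like the hypothetical `most_similar` sketched above, that lookup reduces to a single call, e.g.:

```python
print(most_similar("external", final_embeddings, dictionary, reverse_dictionary, k=10))
```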

Mustyy · Nov 17 '18 08:11