Getting started with Word2Vec

1. Source by Google

Project with Code:

Blog:

Paper:
[1] Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean. Efficient Estimation of Word Representations in Vector Space. In Proceedings of Workshop at ICLR, 2013.

Note: Introduces the new model architectures (CBOW and skip-gram)

[2] Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, and Jeffrey Dean. Distributed Representations of Words and Phrases and their Compositionality. In Proceedings of NIPS, 2013.

Note: The Skip-gram Model with Hierarchical Softmax and Negative Sampling

[3] Tomas Mikolov, Wen-tau Yih, and Geoffrey Zweig. Linguistic Regularities in Continuous Space Word Representations. In Proceedings of NAACL HLT, 2013.

Note: No further information seems to be available

[4] Tomas Mikolov, Quoc V. Le, and Ilya Sutskever. Exploiting Similarities among Languages for Machine Translation. arXiv preprint, 2013.

Note: Interesting word2vec application to SMT (statistical machine translation)

[5] By Tomas Mikolov et al.
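The skip-gram model with negative sampling introduced in [2] can be sketched in a few lines. The following is a minimal illustration, not the reference implementation: the toy vocabulary size, dimensions, learning rate, and word indices are all invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes: 10-word vocabulary, 8-dimensional embeddings.
vocab_size, dim = 10, 8
W_in = rng.normal(scale=0.1, size=(vocab_size, dim))   # center-word ("input") vectors
W_out = rng.normal(scale=0.1, size=(vocab_size, dim))  # context-word ("output") vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(center, context, negatives, lr=0.05):
    """One SGD step on the negative-sampling loss from [2]:
    -log sigma(u_o . v_c) - sum_k log sigma(-u_k . v_c)."""
    v = W_in[center]
    # Positive pair: pull the context vector toward the center vector.
    g = sigmoid(W_out[context] @ v) - 1.0   # dL/d(score) for the true pair
    grad_v = g * W_out[context]
    W_out[context] -= lr * g * v
    # Negative (noise) samples: push their vectors away from the center vector.
    for k in negatives:
        g = sigmoid(W_out[k] @ v)
        grad_v += g * W_out[k]
        W_out[k] -= lr * g * v
    W_in[center] -= lr * grad_v

# One update for a made-up (center=2, context=5) pair with three noise words.
sgns_step(center=2, context=5, negatives=[1, 7, 9])
```

Repeating the step above for a real (center, context) stream is, in essence, the whole training loop; everything else in the original tool is corpus handling and the noise-sampling distribution.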

2. Best explanation of the original models, optimization methods, and back-propagation background, plus the Word Embedding Visual Inspector (wevi)

Paper:

Slides:

YouTube Video: word2vec and wevi

Demo:
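The back-propagation background covered above boils down to one forward/backward pass through a full-softmax output layer. As a rough sketch only (toy sizes and the learning rate are invented; real word2vec replaces the full softmax with hierarchical softmax or negative sampling for speed):

```python
import numpy as np

rng = np.random.default_rng(1)
V, N = 6, 4                       # toy vocabulary size and embedding size
W_in = rng.normal(scale=0.1, size=(V, N))
W_out = rng.normal(scale=0.1, size=(N, V))

def softmax(z):
    e = np.exp(z - z.max())       # subtract max for numerical stability
    return e / e.sum()

def train_pair(x, target, lr=0.1):
    """Forward + backward pass for one (input word, target word) pair
    with a full softmax output layer; returns the cross-entropy loss."""
    h = W_in[x]                   # hidden layer is just the input word's row
    y = softmax(h @ W_out)        # predicted distribution over the vocabulary
    loss = -np.log(y[target])
    e = y.copy()
    e[target] -= 1.0              # dL/d(scores) = prediction - one_hot(target)
    grad_W_out = np.outer(h, e)   # backprop into the output matrix
    grad_h = W_out @ e            # backprop into the hidden layer
    W_out[...] -= lr * grad_W_out
    W_in[x] -= lr * grad_h        # only the input word's row is updated
    return loss
```

Running `train_pair` repeatedly on the same pair drives the loss toward zero, which is the behavior the wevi demo lets you watch neuron by neuron.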

3. Word2Vec Tutorials:

Word2Vec Tutorial by Chris McCormick:

a)
Note: Skips over the usual introductory and abstract insights about Word2Vec and gets into more of the details

b)

Alex Minnaar’s Tutorials

The original article URLs are down; the following PDF versions are provided by Chris McCormick:

a)

b)

4. Learning by Coding

Python Word2Vec by Gensim related articles:

a)

b)

c)

d)

e)

Note: A simple but very powerful tutorial for training word2vec models in gensim.

5. Other Word2Vec Resources:

Posted by TextProcessing

