Getting started with Word2Vec

1. Original Sources by Google:

Project with Code: Word2Vec

Blog: Learning the meaning behind words

[1] Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean. Efficient Estimation of Word Representations in Vector Space. In Proceedings of Workshop at ICLR, 2013.

Note: The paper that introduced the new model architectures (CBOW and Skip-gram).

[2] Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, and Jeffrey Dean. Distributed Representations of Words and Phrases and their Compositionality. In Proceedings of NIPS, 2013.

Note: The Skip-gram Model with Hierarchical Softmax and Negative Sampling

[3] Tomas Mikolov, Wen-tau Yih, and Geoffrey Zweig. Linguistic Regularities in Continuous Space Word Representations. In Proceedings of NAACL HLT, 2013.

Note: No further information appears to be available for this paper.

[4] Tomas Mikolov, Quoc V. Le, and Ilya Sutskever. Exploiting Similarities among Languages for Machine Translation. arXiv preprint, 2013.

Note: An interesting word2vec application to statistical machine translation (SMT).

[5] Tomas Mikolov et al. NN for Text. Slides from the NIPS Deep Learning Workshop.

2. Best Explanations: Original Models, Optimization Methods, Back-propagation Background, and a Word Embedding Visual Inspector:

Paper: word2vec Parameter Learning Explained

Slides: Word Embedding Explained and Visualized

Youtube Video: Word Embedding Explained and Visualized – word2vec and wevi

Demo: wevi: word embedding visual inspector

3. Word2Vec Tutorials:

Word2Vec Tutorial by Chris McCormick:

a) Word2Vec Tutorial – The Skip-Gram Model
Note: Skips the usual introductory and abstract discussion of Word2Vec and goes straight into the details.

b) Word2Vec Tutorial Part 2 – Negative Sampling

Alex Minnaar’s Tutorials

The original article URLs are down; the following PDF versions are provided by Chris McCormick:

a) Word2Vec Tutorial Part I: The Skip-Gram Model

b) Word2Vec Tutorial Part II: The Continuous Bag-of-Words Model
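As the tutorials above explain, the skip-gram model predicts the context words surrounding a center word. A minimal sketch of how (center, context) training pairs are generated from a tokenized sentence (the function name and window size here are illustrative, not taken from any tutorial's code):

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) pairs within a symmetric window around each token."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox jumps".split()
print(skipgram_pairs(sentence, window=1))
```

With `window=1`, each word pairs with only its immediate neighbors; CBOW (Part II above) inverts this setup, predicting the center word from the surrounding context.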

4. Learning by Coding

Distributed Representations of Sentences and Documents

Python Word2Vec by Gensim related articles:

a) Deep learning with word2vec and gensim, Part One

b) Word2vec in Python, Part Two: Optimizing

c) Parallelizing word2vec in Python, Part Three

d) Gensim word2vec document: models.word2vec – Deep learning with word2vec

e) Word2vec Tutorial by Radim Řehůřek

Note: A simple but very powerful tutorial on training word2vec models in gensim.

An Anatomy of Key Tricks in word2vec project with examples

5. Other Word2Vec Resources:

Word2Vec Resources by Chris McCormick

Posted by TextProcessing
