Eric Kim

Data Scientist, Petroleum Engineer

Natural Language Processing

Optimize Computational Efficiency of Skip-Gram with Negative Sampling

When training an NLP model with Skip-Gram, the very large vocabulary imposes a high computational cost on your machine. Since the original Skip-Gram model cannot handle this cost efficiently, we use an alternative called Negative Sampling.

2019-05-26
22 min read
Natural Language Processing

Demystifying Neural Network in Skip-Gram Language Modeling

Over the past couple of years, neural networks in Word2Vec have nearly taken over the field of NLP, thanks to their state-of-the-art performance. But how much do you understand about the algorithm behind them? This post will crack the secrets behind the neural net in Word2Vec.

2019-05-06
20 min read
Natural Language Processing

Understanding Multi-Dimensionality in Vector Space Modeling

How do word vectors in Natural Language Processing capture meaningful relationships among words? How can you quantify those relationships? Answering these questions starts with understanding the multi-dimensional nature of NLP applications.

2019-04-16
18 min read