Natural Language Processing with Deep Learning in Python - Complete guide on deriving and implementing word2vec, GloVe, word embeddings, and sentiment analysis with recursive nets
Created by Lazy Programmer Inc.
In this course we are going to look at NLP (natural language processing) with deep learning.
Previously, you learned some of the basics, such as how many NLP problems are just regular machine learning and data science problems in disguise, and about simple, practical methods like bag-of-words and term-document matrices.
These allowed us to do some pretty cool things, like detect spam emails, write poetry, spin articles, and group together similar words.
In this course, I'm going to show you how to do even more awesome things. We'll learn not just one, but four new architectures.
What you'll learn
- Understand and implement word2vec
- Understand the CBOW method in word2vec
- Understand the skip-gram method in word2vec
- Understand the negative sampling optimization in word2vec (a rough sketch appears after this list)
- Understand and implement GloVe using gradient descent and alternating least squares
- Use recurrent neural networks for part-of-speech tagging
- Use recurrent neural networks for named entity recognition
- Understand and implement recursive neural networks for sentiment analysis
- Understand and implement recursive neural tensor networks for sentiment analysis
- Use Gensim to obtain pretrained word vectors and compute similarities and analogies (see the example directly below this list)
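To give a feel for that last item, here is a minimal sketch of querying pretrained vectors with Gensim. The particular model name ("glove-wiki-gigaword-50") and the query words are illustrative choices only, not necessarily what the course uses.

```python
# Minimal sketch: load pretrained word vectors through Gensim's downloader
# and query similarities and analogies. The model name below is just one of
# the small sets available via gensim-data.
import gensim.downloader as api

# Downloads once, then loads 50-dimensional GloVe vectors as a KeyedVectors object
word_vectors = api.load("glove-wiki-gigaword-50")

# Cosine similarity between two words
print(word_vectors.similarity("king", "queen"))

# Nearest neighbors of a word
print(word_vectors.most_similar("france", topn=5))

# The classic analogy: king - man + woman ~= queen
print(word_vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
```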
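And for the negative sampling item above, here is a minimal NumPy sketch of skip-gram with negative sampling on a toy corpus. Everything in it (the corpus, window size, number of negatives, and uniform negative sampling instead of word2vec's unigram^0.75 distribution) is an illustrative assumption, not the course's actual implementation.

```python
# A minimal sketch of skip-gram with negative sampling in NumPy on a toy corpus.
# No subsampling, no learning-rate schedule -- just the core update rule.
import numpy as np

corpus = "the quick brown fox jumps over the lazy dog the dog sleeps".split()
vocab = sorted(set(corpus))
word2idx = {w: i for i, w in enumerate(vocab)}
tokens = np.array([word2idx[w] for w in corpus])

V, D = len(vocab), 10          # vocabulary size, embedding dimension
window, K, lr, epochs = 2, 5, 0.05, 200

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))    # "input" (center-word) vectors
W_out = rng.normal(scale=0.1, size=(V, D))   # "output" (context-word) vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(epochs):
    for pos, center in enumerate(tokens):
        lo, hi = max(0, pos - window), min(len(tokens), pos + window + 1)
        for ctx_pos in range(lo, hi):
            if ctx_pos == pos:
                continue
            context = tokens[ctx_pos]
            # One positive pair plus K negatives sampled uniformly here for
            # simplicity (word2vec samples from a unigram^0.75 distribution).
            negatives = rng.integers(0, V, size=K)

            v_c = W_in[center]
            grad_c = np.zeros(D)

            # Positive example: push sigmoid(u_o . v_c) toward 1
            g = sigmoid(W_out[context] @ v_c) - 1.0
            grad_c += g * W_out[context]
            W_out[context] -= lr * g * v_c

            # Negative examples: push sigmoid(u_k . v_c) toward 0
            for neg in negatives:
                g = sigmoid(W_out[neg] @ v_c)
                grad_c += g * W_out[neg]
                W_out[neg] -= lr * g * v_c

            W_in[center] -= lr * grad_c

# After training, the rows of W_in are the learned word embeddings
print(vocab)
print(W_in.shape)
```

The derivation of these gradient updates, along with CBOW, GloVe, and the recurrent and recursive architectures, is what the course walks through in detail.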