Text Processing Course: Stanford Deep Learning for Natural Language Processing

Name: Deep Learning for Natural Language Processing

Website:

Description

Natural language processing (NLP) is one of the most important technologies of the information age. Understanding complex language utterances is also a crucial part of artificial intelligence. Applications of NLP are everywhere because people communicate almost everything in language: web search, advertising, email, customer service, language translation, radiology reports, and more. A wide variety of underlying tasks and machine learning models power NLP applications. Recently, deep learning approaches have achieved very high performance across many different NLP tasks. These models can often be trained end to end and do not require traditional, task-specific feature engineering.

In this spring-quarter course, students will learn to implement, train, debug, visualize, and invent their own neural network models. The course provides a deep excursion into cutting-edge research in deep learning applied to NLP. The final project will involve training a complex recurrent neural network and applying it to a large-scale NLP problem. On the model side, we will cover word vector representations, window-based neural networks, recurrent neural networks, long short-term memory (LSTM) models, recursive neural networks, and convolutional neural networks, as well as some novel models involving a memory component. Through lectures and programming assignments, students will learn the engineering tricks necessary to make neural networks work on practical problems.
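To give a flavor of one model family listed above, here is a minimal sketch of a single recurrent neural network step in NumPy. All names and dimensions are illustrative, not taken from the course materials: the hidden state is updated as h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h) while reading a sequence of "word vectors".

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One vanilla RNN update: h_t = tanh(W_xh @ x_t + W_hh @ h_prev + b_h)."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

rng = np.random.default_rng(0)
input_dim, hidden_dim = 4, 3

# Randomly initialized parameters (in practice these are learned).
W_xh = rng.normal(size=(hidden_dim, input_dim))
W_hh = rng.normal(size=(hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

h = np.zeros(hidden_dim)  # initial hidden state
sequence = rng.normal(size=(5, input_dim))  # stand-in for 5 word vectors
for x_t in sequence:
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)

print(h.shape)  # the final hidden state summarizes the whole sequence
```

The same loop structure underlies LSTMs, which replace the single tanh update with gated updates to mitigate vanishing gradients.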

About the Instructors

Richard Socher

“I am the Founder and CEO at MetaMind. Our vision is to improve artificial intelligence and make it easily accessible. I enjoy research in machine learning, natural language processing and computer vision. In spring 2015, I taught a class on Deep Learning for Natural Language Processing at Stanford. I got my PhD in the CS Department at Stanford, advised by Chris Manning and Andrew Ng. This Wired article talks about some of the research work that we do at MetaMind. I’m on Twitter.”
