Break into NLP. Master cutting-edge NLP techniques through four hands-on courses! Updated with TensorFlow labs in December 2023.
Instructors: Eddy Shyu
143,283 already enrolled
(5,741 reviews)
Recommended experience
Intermediate level
Working knowledge of machine learning, intermediate Python experience including DL frameworks & proficiency in calculus, linear algebra, & statistics
Use logistic regression, naïve Bayes, and word vectors to implement sentiment analysis, complete analogies & translate words (see the toy analogy sketch after this list).
Use dynamic programming, hidden Markov models, and word embeddings to implement autocorrect, autocomplete & identify part-of-speech tags for words.
Use recurrent neural networks, LSTMs, GRUs & Siamese networks for sentiment analysis, text generation & named entity recognition.
Use encoder-decoder, causal, & self-attention to machine translate complete sentences, summarize text, and answer questions.
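To make the word-vector item above concrete, here is a toy sketch of analogy completion by vector arithmetic. The five-word vocabulary and its 3-dimensional vectors are invented for illustration; the courses work with real pretrained embeddings.

```python
import numpy as np

# Invented 3-dimensional "embeddings"; real models use vectors trained on
# large corpora (often 300 dimensions), but the arithmetic is the same.
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.5, 0.9, 0.1]),
    "woman": np.array([0.5, 0.1, 0.9]),
    "apple": np.array([0.1, 0.5, 0.5]),
}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Complete the analogy "king - man + woman = ?" by finding the vocabulary
# vector closest (by cosine similarity) to the offset vector.
target = emb["king"] - emb["man"] + emb["woman"]
best = max((w for w in emb if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(emb[w], target))
print(best)  # -> "queen" with these toy vectors
```

Real embeddings behave the same way, just in hundreds of dimensions, which is why the courses also cover locality-sensitive hashing to keep the nearest-neighbor search cheap.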
Add to your LinkedIn profile
Add this credential to your LinkedIn profile, resume, or CV
Share it on social media and in your performance review
Natural Language Processing (NLP) is a subfield of linguistics, computer science, and artificial intelligence that uses algorithms to interpret and manipulate human language.
This technology is one of the most broadly applied areas of machine learning and is critical in effectively analyzing massive quantities of unstructured, text-heavy data. As AI continues to expand, so will the demand for professionals skilled at building models that analyze speech and language, uncover contextual patterns, and produce insights from text and audio.
By the end of this Specialization, you will be ready to design NLP applications that perform question-answering and sentiment analysis, and to create tools that translate languages and summarize text. These and other NLP applications will be at the forefront of the coming transformation to an AI-powered future.
This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
Applied Learning Project
This Specialization will equip you with machine learning basics and state-of-the-art deep learning techniques needed to build cutting-edge NLP systems:
• Use logistic regression, naïve Bayes, and word vectors to implement sentiment analysis, complete analogies, translate words, and use locality-sensitive hashing to approximate nearest neighbors (see the classifier sketch after this list).
• Use dynamic programming, hidden Markov models, and word embeddings to autocorrect misspelled words, autocomplete partial sentences, and identify part-of-speech tags for words.
• Use dense and recurrent neural networks, LSTMs, GRUs, and Siamese networks in TensorFlow to perform advanced sentiment analysis, text generation, named entity recognition, and to identify duplicate questions.
• Use encoder-decoder, causal, and self-attention to perform advanced machine translation of complete sentences, text summarization, and question-answering. Learn models like T5, BERT, and more with Hugging Face Transformers!
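As a concrete illustration of the first bullet, here is a minimal sentiment-classifier sketch. It leans on scikit-learn as a stand-in (the courses have you build logistic regression and naïve Bayes from scratch), and the six training sentences are invented toy data.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented dataset for illustration only; the course works with real
# tweet data and implements the classifier itself.
texts = ["I loved this movie", "great film, great cast",
         "what a terrible plot", "I hated every minute",
         "boring and awful", "wonderful acting"]
labels = [1, 1, 0, 0, 0, 1]  # 1 = positive, 0 = negative

# Bag-of-words features feeding a logistic regression classifier.
clf = make_pipeline(CountVectorizer(), LogisticRegression())
clf.fit(texts, labels)
print(clf.predict(["what a wonderful film"]))  # -> [1] (positive) on this toy data
```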
DeepLearning.AI is an education technology company that develops a global community of AI talent. DeepLearning.AI's expert-led educational experiences provide AI practitioners and non-technical professionals with the necessary tools to go all the way from foundational basics to advanced application, empowering them to build an AI-powered future.
Natural language processing is a subfield of linguistics, computer science, and artificial intelligence that uses algorithms to interpret and manipulate human language.
In the Natural Language Processing (NLP) Specialization, you will learn how to design NLP applications that perform question-answering and sentiment analysis, create tools to translate languages, summarize text, and even build chatbots. These and other NLP applications will be at the forefront of the coming transformation to an AI-powered future.
NLP is one of the most broadly applied areas of machine learning and is critical in effectively analyzing massive quantities of unstructured, text-heavy data. As AI continues to expand, so will the demand for professionals skilled at building models that analyze speech and language, uncover contextual patterns, and produce insights from text and audio.
This Specialization will equip you with both the machine learning basics as well as the state-of-the-art deep learning techniques needed to build cutting-edge NLP systems:
• Use logistic regression, naïve Bayes, and word vectors to implement sentiment analysis, complete analogies, translate words, and use locality-sensitive hashing to approximate nearest neighbors.
• Use dynamic programming, hidden Markov models, and word embeddings to autocorrect misspelled words, autocomplete partial sentences, and identify part-of-speech tags for words.
• Use dense and recurrent neural networks, LSTMs, GRUs, and Siamese networks in TensorFlow and Trax to perform advanced sentiment analysis, text generation, named entity recognition, and to identify duplicate questions.
• Use encoder-decoder, causal, and self-attention to perform advanced machine translation of complete sentences, text summarization, and question-answering, and to build chatbots (see the attention sketch after this list). Models include T5, BERT, transformer, reformer, and more!
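All of the transformer models named in the last bullet are built from scaled dot-product attention. The NumPy sketch below shows that core computation together with a causal mask; the shapes and random inputs are arbitrary toy values, not course code.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)
    if mask is not None:
        # Causal (look-ahead) masking: block attention to future positions.
        scores = np.where(mask, scores, -1e9)
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Self-attention over a toy sequence of 4 tokens with d_k = 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))              # token representations
causal = np.tril(np.ones((4, 4), bool))  # lower-triangular causal mask
out = scaled_dot_product_attention(x, x, x, mask=causal)
print(out.shape)  # (4, 8)
```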
The Natural Language Processing Specialization is one-of-a-kind:
• It teaches cutting-edge techniques drawn from recent academic papers, some first published as recently as 2019.
• It covers practical methods for handling common NLP use cases (autocorrect, autocomplete), as well as advanced deep learning techniques for chatbots and question-answering.
• It starts with the foundations and takes you to a stage where you can build state-of-the-art attention models that allow for parallel computing.
• You will not only use packages but also learn how to build these models from scratch. We walk you through all the steps, from data processing to the finished products you can use in your own projects.
• You will complete one project every week, for a total of 16 programming assignments, to make sure you understand the concepts.
Working knowledge of machine learning, intermediate Python experience including DL frameworks & proficiency in calculus, linear algebra, & statistics.
This Specialization is for students of machine learning or artificial intelligence and software engineers looking for a deeper understanding of how NLP models work and how to apply them.
This Specialization consists of four courses. At a pace of 5 hours per week, each course typically takes 4 weeks to complete.
Younes Bensouda Mourri and Łukasz Kaiser created the Natural Language Processing Specialization.
Younes Bensouda Mourri completed his Bachelor's in Applied Math and CS and his Master's in Statistics at Stanford University. Younes helped create 3 AI courses at Stanford - Applied Machine Learning, Deep Learning, and Teaching AI - and taught two of them for a few years. Currently, he is an adjunct lecturer of computer science at Stanford University.
Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the “Attention Is All You Need” Transformer paper.
In Course 1: NLP with Classification and Vector Spaces,
All the programming assignments and ungraded labs have been refactored
All programming assignments have new automatic graders
In Course 2: NLP with Probabilistic Models,
All the programming assignments and ungraded labs have been refactored
All programming assignments have new automatic graders
In Course 3: NLP with Sequence Models,
A new section on visualizing embeddings from the trained model has been added to the Week 1 assignment
Corrections have been made to the evaluation section of the Week 2 assignment
The Week 3 assignment has been refactored
The following lectures have been updated to new versions:
Trax: Neural Networks
Trax: Layers (now “Classes, subclasses, and inheritance”)
Dense and ReLU layers
Serial Layer
Math in Simple RNNs
Gated Recurrent Units
RNNs and Vanishing Gradients
Introduction to LSTMs
LSTM Architecture
Triplets
The following ungraded labs have been updated to new versions:
Hidden State Activation
GRU
Perplexity
Vanishing and Exploding gradients
In Course 4: NLP with Attention Models,
Two new ungraded labs concerned with how attention is implemented in deep learning models have been added
The following lectures have been updated to new versions or modified (excluding Hugging Face content):
Seq2seq model for NMT
Seq2seq model with Attention
Queries, Keys, Values, and Attention
Setup for Machine Translation
Teacher Forcing
NMT with Attention
Evaluation: BLEU Score
Evaluation: ROUGE Score
Sampling and Decoding
Beam Search
Minimum Bayes Risk (MBR)
Transformers vs. RNNs
Transformers overview
Scaled dot-product Attention
Masked Self-Attention
Multi-Head Attention
Four new lectures on Hugging Face have been added:
Introduction from the Hugging Face team
Introduction to Hugging Face
Using Transformers
Fine-tuning a pre-trained model
Two new ungraded labs on Hugging Face have been added:
Use of pipelines for Question & Answering (see the sketch after this list)
Fine-Tuning a pre-trained model for Question & Answering
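The labs' notebooks aren't reproduced here, but the Hugging Face pipeline API they exercise looks roughly like this minimal sketch (default model; the question and context below are invented placeholders):

```python
# Requires: pip install transformers (downloads a pretrained model on first run).
from transformers import pipeline

# The "question-answering" pipeline wraps a pretrained extractive QA model;
# the lab's exact model and data aren't shown here, so this uses the default.
qa = pipeline("question-answering")

result = qa(
    question="What does the attention mechanism compute?",
    context="The attention mechanism computes a weighted sum of value vectors, "
            "with weights derived from the similarity of queries and keys.",
)
print(result["answer"], result["score"])  # extracted answer span and confidence
```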
If you would like to update to the new material, reset your deadlines. If you’re in the middle of a course, you will lose your progress when you reset your deadlines. Please save any notebook assignments you've submitted by downloading your existing notebooks before switching to the new version.
• Your certificates will carry over for any courses you’ve already completed.
• If your subscription is currently active, you can access the updated labs and submit assignments without paying for the month again.
• If you go to the Specialization, you will see the original version of the lecture videos and assignments. You can complete the original version if so desired (this is not recommended).
• If you do not see the option to reset deadlines, contact Coursera via the Learner Help Center.
• If your subscription is currently inactive, you will need to pay again to access the labs and submit assignments for those courses.
This course is completely online, so there’s no need to show up to a classroom in person. You can access your lectures, readings and assignments anytime and anywhere via the web or your mobile device.
If you subscribed, you get a 7-day free trial during which you can cancel at no penalty. After that, we don’t give refunds, but you can cancel your subscription at any time. See our full refund policy.
Yes! To get started, click the course card that interests you and enroll. You can enroll and complete the course to earn a shareable certificate, or you can audit it to view the course materials for free. When you subscribe to a course that is part of a Specialization, you’re automatically subscribed to the full Specialization. Visit your learner dashboard to track your progress.
Yes. In select learning programs, you can apply for financial aid or a scholarship if you can’t afford the enrollment fee. If financial aid or a scholarship is available for your learning program selection, you’ll find a link to apply on the description page.
When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. If you only want to read and view the course content, you can audit the course for free. If you cannot afford the fee, you can apply for financial aid.
This Specialization doesn't carry university credit, but some universities may choose to accept Specialization Certificates for credit. Check with your institution to learn more.