Natural Language Processing
Yuriy Guts – Jul 09, 2016
Who Is This Guy?
Data Science Team Lead
Sr. Data Scientist
Software Architect, R&D Engineer
I also teach Machine Learning:
What is NLP?
Study of interaction between computers and human languages
NLP = Computer Science + AI + Computational Linguistics
Common NLP Tasks
Easy:
• Chunking
• Part-of-Speech Tagging
• Named Entity Recognition
• Spam Detection
• Thesaurus
Medium:
• Syntactic Parsing
• Word Sense Disambiguation
• Sentiment Analysis
• Topic Modeling
• Information Retrieval
Hard:
• Machine Translation
• Text Generation
• Automatic Summarization
• Question Answering
• Conversational Interfaces
Interdisciplinary Tasks: Speech-to-Text
Interdisciplinary Tasks: Image Captioning
What Makes NLP so Hard?
Ambiguity
Non-Standard Language
Also: neologisms, complex entity names, phrasal verbs/idioms
More Complex Languages Than English
• German: Donaudampfschiffahrtsgesellschaftskapitän (5 “words”)
• Chinese: 50,000 different characters (2-3k to read a newspaper)
• Japanese: 3 writing systems
• Thai: Ambiguous word and sentence boundaries
• Slavic: Different word forms depending on gender, case, tense
Write Traditional “If-Then-Else” Rules?
BIG NOPE!
Leads to very large and complex codebases.
Still struggles to capture trivial cases (for a human).
Better Approach: Machine Learning
“A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.”
— Tom M. Mitchell
Part 1
Essential Machine Learning Background for NLP
Before We Begin: Disclaimer
• This will be a very quick description of ML. By no means exhaustive.
• Only the essential background for what we’ll have in Part 2.
• To fit everything into a small timeframe, I’ll simplify some aspects.
• I encourage you to read ML books or watch videos to dig deeper.
Common ML Tasks
1. Supervised Learning
• Regression
• Classification (Binary or Multi-Class)
2. Unsupervised Learning
• Clustering
• Anomaly Detection
• Latent Variable Models (Dimensionality Reduction, EM, …)
Regression
Predict a continuous dependent variable
based on independent predictors
Linear Regression
After adding polynomial features
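As an illustration, a minimal scikit-learn sketch of the same idea (a plain linear fit, then the same fit after adding polynomial features); the data is synthetic and chosen only for illustration:

    # Minimal sketch: linear regression, then the same regression after adding
    # polynomial features. Synthetic data, for illustration only.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.preprocessing import PolynomialFeatures

    X = np.linspace(0, 10, 50).reshape(-1, 1)
    y = 0.5 * X.ravel() ** 2 + np.random.randn(50)         # nonlinear target

    line = LinearRegression().fit(X, y)                    # straight line: underfits

    X_poly = PolynomialFeatures(degree=2).fit_transform(X)
    curve = LinearRegression().fit(X_poly, y)              # now captures the curvature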
Classification
Assign an observation to some category
from a known discrete list of categories
Logistic Regression
[Figure: decision boundary separating Class A from Class B]
(Multi-class extension = Softmax Regression)
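A minimal scikit-learn sketch of such a classifier; the Iris dataset is just a stand-in, and argument names vary across scikit-learn versions (newer releases use the multinomial formulation by default):

    # Logistic regression with the multinomial (softmax) extension for >2 classes.
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression

    X, y = load_iris(return_X_y=True)
    clf = LogisticRegression(multi_class='multinomial', solver='lbfgs', max_iter=200)
    clf.fit(X, y)
    print(clf.predict(X[:3]))        # predicted class labels
    print(clf.predict_proba(X[:3]))  # per-class probabilities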
Neural Networks
and Backpropagation Algorithm
http://playground.tensorflow.org/
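A toy NumPy sketch of what happens under the hood in the playground: one hidden layer trained with backpropagation on made-up data (the problem, layer sizes and learning rate are arbitrary):

    # Two-layer network with sigmoid activations, trained by backpropagation
    # on an XOR-like toy problem. Squared-error loss, plain gradient descent.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    X = np.random.randn(200, 2)
    y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)

    W1, b1 = 0.1 * np.random.randn(2, 8), np.zeros((1, 8))
    W2, b2 = 0.1 * np.random.randn(8, 1), np.zeros((1, 1))
    lr = 0.5

    for _ in range(2000):
        h = sigmoid(X @ W1 + b1)                 # forward pass
        out = sigmoid(h @ W2 + b2)
        d_out = (out - y) * out * (1 - out)      # backward pass (chain rule)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * (h.T @ d_out) / len(X)
        b2 -= lr * d_out.mean(axis=0, keepdims=True)
        W1 -= lr * (X.T @ d_h) / len(X)
        b1 -= lr * d_h.mean(axis=0, keepdims=True)

    print("train accuracy:", ((out > 0.5) == y).mean())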
Clustering
Group objects in such a way
that objects in the same group are similar,
and objects in different groups are not
K-Means Clustering
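A minimal scikit-learn sketch of k-means on synthetic 2-D points:

    # K-means: assign each point to the nearest of k centroids, move each centroid
    # to the mean of its assigned points, repeat. Toy data only.
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs

    X, _ = make_blobs(n_samples=300, centers=3, random_state=42)
    km = KMeans(n_clusters=3, random_state=42).fit(X)
    print(km.cluster_centers_)   # one centroid per discovered group
    print(km.labels_[:10])       # cluster index assigned to each point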
Evaluation
How do we know if an ML model is good?
What do we do if something goes wrong?
Underfitting & Overfitting
Development & Troubleshooting
• Picking the right metric: MAE, RMSE, AUC, Cross-Entropy, Log-Loss
• Training Set / Validation Set / Test Set split
• Picking hyperparameters against Validation Set
• Regularization to prevent overfitting
• Plotting learning curves to check for underfitting/overfitting (see the evaluation sketch below)
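A minimal sketch of this evaluation workflow; the dataset, model and metric are placeholders chosen for illustration:

    # Hold out validation/test data, pick a metric, and compare train vs. validation
    # scores: a large gap suggests overfitting, two low scores suggest underfitting.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
    X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, test_size=0.25, random_state=0)

    model = LogisticRegression(C=1.0, max_iter=200)   # C controls regularization strength
    model.fit(X_train, y_train)
    print("train:", accuracy_score(y_train, model.predict(X_train)))
    print("val:  ", accuracy_score(y_val, model.predict(X_val)))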
Deep Learning
• Core idea: instead of hand-crafting complex features, use increased computing
capacity and build a deep computation graph that will try to learn feature
representations on its own.
End-to-end learning rather than a cascade of apps.
• Works best with lots of homogeneous, spatially related features
(image pixels, character sequences, audio signal measurements).
Usually works poorly otherwise.
• State-of-the-art and/or superhuman performance on many tasks.
• Typically requires massive amounts of data and training resources.
• But: a very young field. Theories not strongly established, views change.
Example: Convolutional Neural Network
Part 2
NLP Challenges And Approaches
“Classical” NLP Pipeline
• Tokenization: Break text into sentences and words, lemmatize
• Morphology: Part-of-speech (POS) tagging, stemming, NER
• Syntax: Constituency/dependency parsing
• Semantics: Coreference resolution, word sense disambiguation
• Discourse: Task-dependent (sentiment, …)
Often Relies on Language Banks
• WordNet (ontology, semantic similarity tree)
• Penn Treebank (POS, grammar rules)
• PropBank (semantic propositions)
• …Dozens of them!
Tokenization & Stemming
POS/NER Tagging
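Again with NLTK, a rough sketch of POS tagging and NER chunking (assumes the tagger and chunker models are downloaded):

    # POS tagging and named-entity chunking with NLTK.
    import nltk

    tokens = nltk.word_tokenize("Yuriy Guts gave a talk in Lviv in July 2016.")
    tagged = nltk.pos_tag(tokens)    # e.g. [('Yuriy', 'NNP'), ('gave', 'VBD'), ...]
    tree = nltk.ne_chunk(tagged)     # groups tokens into PERSON / GPE / ... subtrees
    print(tree)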
Parsing (LPCFG)
“Classical” way: Training a NER Tagger
Task: Predict whether the word is a PERSON, LOCATION, DATE or OTHER.
Could be more than 3 NER tags (e.g. MUC-7 contains 7 tags).
Features (sketched as code below):
1. Current word.
2. Previous, next word (context).
3. POS tags of current word and nearby words.
4. NER label for previous word.
5. Word substrings (e.g. ends in “burg”, contains “oxa”, etc.)
6. Word shape (internal capitalization, numerals, dashes, etc.).
7. …on and on and on…
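A sketch of how such hand-crafted features could be assembled for one token; the feature names are illustrative (not from the talk), and the resulting dicts would typically feed a MaxEnt or CRF classifier:

    # Illustrative hand-crafted NER features for the i-th token of a sentence.
    def token_features(tokens, pos_tags, i, prev_ner):
        word = tokens[i]
        return {
            'word': word.lower(),
            'prev_word': tokens[i - 1].lower() if i > 0 else '<BOS>',
            'next_word': tokens[i + 1].lower() if i < len(tokens) - 1 else '<EOS>',
            'pos': pos_tags[i],
            'prev_ner': prev_ner,                 # NER label predicted for the previous word
            'suffix3': word[-3:],                 # e.g. ends in "urg"
            'is_capitalized': word[0].isupper(),  # word-shape clues
            'has_digit': any(c.isdigit() for c in word),
            'has_dash': '-' in word,
        }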
Feature Representation: Bag of Words
A single word is a one-hot encoding vector with the size of the dictionary :(
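A minimal sketch of the bag-of-words representation with scikit-learn; note how wide and sparse the vectors get even for a toy vocabulary:

    # Bag of words: every document becomes a count vector as wide as the vocabulary.
    from sklearn.feature_extraction.text import CountVectorizer

    docs = ["the cat sat on the mat", "the dog ate my homework"]
    vectorizer = CountVectorizer()
    X = vectorizer.fit_transform(docs)
    print(vectorizer.get_feature_names_out())  # vocabulary (get_feature_names() in older versions)
    print(X.toarray())                         # high-dimensional, sparse counts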
Problem
• Manually designed features are often over-specified, incomplete,
take a long time to design and validate.
• Often requires PhD-level knowledge of the domain.
• Researchers spend literally decades hand-crafting features.
• Bag of words model is very high-dimensional and sparse,
cannot capture semantics or morphology.
Maybe Deep Learning can help?
Deep Learning for NLP
• Core enabling idea: represent words as dense vectors
[0 1 0 0 0 0 0 0 0] → [0.315 0.136 0.831]
• Try to capture semantic and morphologic similarity so that the features
for “similar” words are “similar”
(e.g. closer in Euclidean space).
• Natural language is context dependent: use context for learning.
• Straightforward (but slow) way: build a co-occurrence matrix and SVD it.
Embedding Methods: Word2Vec
CBoW version: predict the center word from its context.
Skip-gram version: predict the context from the center word.
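A minimal Gensim sketch of training skip-gram vectors on a toy corpus (parameter names assume Gensim 4.x, e.g. vector_size was called size in older releases; a real corpus needs millions of tokens):

    # Word2Vec with Gensim: sg=1 selects skip-gram, sg=0 selects CBoW.
    from gensim.models import Word2Vec

    sentences = [["king", "rules", "the", "kingdom"],
                 ["queen", "rules", "the", "kingdom"]]
    model = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=1)
    print(model.wv.most_similar("king", topn=3))   # nearest words in the embedding space
    model.save("word2vec.model")                   # hypothetical path, reused below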
Benefits
• Learns features of each word on its own, given a text corpus.
• No heavy preprocessing is required, just a corpus.
• Word vectors can be used as features for lots of supervised
learning applications: POS, NER, chunking, semantic role labeling.
All with pretty much the same network architecture.
• Similarities and linear relationships between word vectors.
• A bit more modern representation: GloVe, but requires more RAM.
Linearities
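The linearities are analogies like king − man + woman ≈ queen; with a trained model (for instance the toy one saved above, or any pretrained vectors) this is a one-liner in Gensim:

    # Word-vector arithmetic; on a sufficiently large corpus the top answer is "queen".
    from gensim.models import Word2Vec

    model = Word2Vec.load("word2vec.model")   # hypothetical path from the sketch above
    print(model.wv.most_similar(positive=["king", "woman"], negative=["man"], topn=1))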
Training a NER Tagger: Deep Learning
Just replace this with NER tag
(or POS tag, chunk end, etc.)
Language Modeling
Assign high probabilities to well-formed sentences
(crucial for text generation, speech recognition, machine translation)
“Classical” Way: N-Grams
Problem: doesn’t scale well to bigger N. N = 5 is pretty much the limit.
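A minimal sketch of a bigram (N = 2) model with maximum-likelihood estimates; a real model needs smoothing and a much larger corpus:

    # Bigram language model: P(word | prev) = count(prev, word) / count(prev).
    from collections import Counter

    corpus = [["<s>", "the", "cat", "sat", "</s>"],
              ["<s>", "the", "dog", "sat", "</s>"]]

    unigrams, bigrams = Counter(), Counter()
    for sent in corpus:
        unigrams.update(sent)
        bigrams.update(zip(sent, sent[1:]))

    def prob(word, prev):
        return bigrams[(prev, word)] / unigrams[prev]

    print(prob("cat", "the"))   # 0.5 on this toy corpus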
Deep Learning Way: Recurrent NN (RNN)
Can use past information without restricting the size of the context.
But: in practice, can’t recall information that came in a long time ago.
Long Short Term Memory Network (LSTM)
Contains gates that control forgetting, adding, updating and outputting information.
Surprisingly amazing performance at language tasks compared to vanilla RNN.
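A minimal Keras sketch of an LSTM language model that predicts the next word from the previous ones; vocabulary size, sequence length and layer sizes are placeholders, and exact import paths and arguments differ between Keras/tf.keras versions:

    # LSTM language model: embed word ids, run them through a gated recurrent layer,
    # and output a softmax distribution over the next word.
    from keras.models import Sequential
    from keras.layers import Embedding, LSTM, Dense

    vocab_size = 10000
    model = Sequential([
        Embedding(vocab_size, 128),
        LSTM(256),
        Dense(vocab_size, activation='softmax'),
    ])
    model.compile(loss='sparse_categorical_crossentropy', optimizer='adam')
    # model.fit(X, y, batch_size=64, epochs=10)   # X: int sequences, y: next-word ids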
Tackling Hard Tasks
Deep Learning enables end-to-end learning for Machine Translation, Image Captioning, Text Generation, and Summarization: NLP tasks which are inherently very hard!
RNN for Machine Translation
Hottest Current Research
• Attention Networks
• Dynamic Memory Networks
(see ICML 2016 proceedings)
Tools I Used
• NLTK (Python)
• Gensim (Python)
• Stanford CoreNLP (Java with bindings)
• Apache OpenNLP (Java with bindings)
Deep Learning Frameworks with GPU Support:
• Torch (Torch-RNN) (Lua)
• TensorFlow, Theano, Keras (Python)
NLP Progress for Ukrainian
• Ukrainian lemma dictionary with POS tags
https://github.com/arysin/dict_uk
• Ukrainian lemmatizer plugin for ElasticSearch
https://github.com/mrgambal/elasticsearch-ukrainian-lemmatizer
• lang-uk project (1M corpus, NER, tokenization, etc.)
https://github.com/lang-uk
Demo 1: Exploring Semantic Properties of ASOIAF (“Game of Thrones”)
Demo 2: Topic Modeling for DOU.UA Comments
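For reference, the topic-modeling demo relies on Gensim; a generic LDA sketch on toy documents (not the actual DOU.UA comments) looks roughly like this:

    # LDA topic modeling with Gensim on toy tokenized documents.
    from gensim import corpora, models

    texts = [["machine", "learning", "model", "training"],
             ["football", "match", "team", "goal"],
             ["deep", "learning", "neural", "network"]]
    dictionary = corpora.Dictionary(texts)
    corpus = [dictionary.doc2bow(t) for t in texts]
    lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, passes=10)
    print(lda.print_topics())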
GitHub Repos with IPython Notebooks
• https://github.com/YuriyGuts/thrones2vec
• https://github.com/YuriyGuts/dou-topic-modeling
yuriy.guts@gmail.com
linkedin.com/in/yuriyguts
github.com/YuriyGuts
