Conceptual Dependency and Natural Language Processing
2. “Mary fell.”
In this case, the actor is missing. Mary didn’t really do the
falling; another force, gravity, acted on Mary and pushed her
in a direction: toward the ground.
Example CD Sentences
3. “John amazed Mary”
No action is specified: John did something (a generic DO) that
caused Mary to be in an amazed state.
Can use these to link together sequences of events. Note that there are
other modifiers, such as k for continuous events. The focus here is not
on the details, but understanding how to make basic constructions of
CD.
CD Examples
Kenrick went to New York.
        o       D  New York
Kenrick PTRANS Kenrick
CD Examples
• Mary fell.
        o      D  Ground
Gravity PROPEL Mary
                  X   (X > Ground, i.e. X is above the ground)
CD Examples
• John amazed Mary.
     o
John DO (unspecified act)
  r
Mary state(Amazed)
CD Examples
• John saw Mary.
     o       R  CP(John)
John MTRANS Mary
                Eyes(John)
  I
     o       D  Mary
John ATTEND Eyes

     o       D  Book
John ATTEND Eyes
     (the same instrumental ATTEND pattern, directed at a book)
CD Examples
• John drank milk.
     o       D  Mouth(John)
John INGEST Milk
             Glass
  I
     o       D  Mouth(John)
John PTRANS Milk
CD Examples
• John shot up heroin.
o D Vein(John)
John INGEST Heroin
Hypo
I
o D Vein(John)
John PROPEL Hypo
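These diagrams can be mirrored in a simple data structure. Below is a minimal sketch (the `cd` helper and its field names are illustrative, not from the slides) representing “John shot up heroin” as a primitive act with an instrumental act:

```python
# Minimal sketch of a CD conceptualization as a nested dict.
# Field names (act, actor, object, direction, instrument) are illustrative.

def cd(act, actor, obj, direction=None, instrument=None):
    """Build a CD conceptualization; the instrument is itself a CD form."""
    return {"act": act, "actor": actor, "object": obj,
            "direction": direction, "instrument": instrument}

# "John shot up heroin": INGEST with a PROPEL instrument.
shoot_up = cd("INGEST", "John", "Heroin",
              direction={"to": "Vein(John)"},
              instrument=cd("PROPEL", "John", "Hypo",
                            direction={"to": "Vein(John)"}))

print(shoot_up["act"])                # INGEST
print(shoot_up["instrument"]["act"])  # PROPEL
```

The nesting under `instrument` is exactly the I link in the diagram.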
CD Examples
• It is often difficult to keep inferences out of the
representation process.
• Consider the sentence “John beat Mary with a bat”
  k  o       D  Mary
John PROPEL Bat
  r
Mary state Phys_Contact(Bat)
  r
Mary state Phys_State(<X)   (down from Phys_State(X))
Bat Example
• The resulting condition that Mary was in a lower
physical state is actually an inference. The sentence
alone doesn’t say that Mary was hurt.
• However, it is something we have inferred.
Normally we would leave this out of the CD
representation, and use inference rules to figure
out that Mary was hurt.
– For example, a rule could say that Phys_Contact with a
person and a hard object results in a lower physical
state.
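An inference rule like this can be written as a small function. The following is a hypothetical sketch; the predicate names and the list of hard objects are mine, not from the slides:

```python
# Illustrative inference rule: Phys_Contact between a person and a
# hard object lets us infer a lowered physical state for the person.

HARD_OBJECTS = {"Bat", "Rock", "Hammer"}  # illustrative list

def infer_phys_state_drop(event):
    """Return an inferred state-change event, or None if the rule
    does not apply to this event."""
    if event["state"] == "Phys_Contact" and event["object"] in HARD_OBJECTS:
        return {"person": event["person"],
                "state": "Phys_State",
                "change": "lower"}
    return None

contact = {"person": "Mary", "state": "Phys_Contact", "object": "Bat"}
print(infer_phys_state_drop(contact))
```

Keeping the rule outside the CD form, as the slide suggests, means the representation stays literal and the inference can be retracted or refined separately.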
CD Example
• Mary gave John a bat.
p o R John
Mary ATRANS Bat
Mary
CD Example
• We can combine CD events as objects, e.g.
“Mary told Kenrick that she gave John a bat.”
p o R Kenrick
Mary MTRANS
Mary
p o R John
Mary ATRANS Bat
Mary
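Because a CD event can serve as the object of another event, nesting is just recursion in the data structure. A hedged sketch (the `cd` helper and its field names are my own, for illustration):

```python
# A CD event used as the object of another CD event.
# Field names are illustrative.

def cd(act, actor, obj, recipient=None):
    return {"act": act, "actor": actor, "object": obj,
            "recipient": recipient}

# "Mary gave John a bat."
gave = cd("ATRANS", "Mary", "Bat",
          recipient={"to": "John", "from": "Mary"})

# "Mary told Kenrick that she gave John a bat."
# The whole ATRANS event is the object of the MTRANS.
told = cd("MTRANS", "Mary", gave,
          recipient={"to": "Kenrick", "from": "Mary"})

print(told["object"]["act"])  # the embedded event's primitive
```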
CD Example
• Wile E. Coyote decided to kill the road
runner.
     o                          D  CP(Wile)
Wile MBUILD [conceptualization]
                                   LTM(Wile)
     (output object)               (input objects)

The conceptualization that Wile builds:
     o
Wile DO
  r
RR state Phy_State(0)   (down from Phy_State > 0)
There is a tendency to become ad hoc and make up our own definitions; this is ok as long as
we are consistent and can still access the essential primitive actions (same problem in
predicate logic)
CD a panacea?
• Can you think of something that would be
difficult, inadequate, or impossible to
represent in CD?
Scripts
• Given a knowledge base in CD, or parsed English
sentences into CD, what can we do with them?
• One application is to couple CD with the notion of
scripts.
– A script is a stereotypical sequence of events in a
particular context. These are sequences we have
learned over time. They are similar to scripts for a play
or movie, and contain actors, props, roles, scenes, and
tracks.
Stereotypical Script
• Consider the stereotypical script of • There are set props:
going to eat at McDonalds (do you – ketchup, mustard dispenser
remember the last time you ate fast – signs
food? How about the time before that? – cash register
Experience may be lost in the
– tables
stereotypicality unless something
unique happened) – drink machine
– Sometimes there are deviations to the • There is a stereotypical
script; e.g. going to the bathroom, or sequence of events:
modifications to the script like getting a – Wait in line
drink before receiving food. People use – Give order
existing scripts or cases to learn new
cases; eventually new cases may – Pay money
become new scripts. – Receive food
• There are set actors: – Sit down
– cleaning guy – Eat
– cashier – Bus own table
– Manager
Simple Shopping Script
• Actors: Shopper, Clerk
• Objects: Merchandise
• Location: Store
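A script like this can be stored as roles plus a stereotyped event sequence, and instantiated by binding the role variables. A minimal sketch; the structure and the event wording are my own, not from the slides:

```python
# Hypothetical shopping script: roles, props, and a stereotyped
# sequence of CD-style events with role variables.
SHOPPING_SCRIPT = {
    "roles": ["shopper", "clerk"],
    "props": ["merchandise"],
    "location": "Store",
    "events": [
        "PTRANS {shopper} to Store",
        "ATRANS {merchandise} from Store to {shopper}",
        "ATRANS money from {shopper} to {clerk}",
        "PTRANS {shopper} from Store",
    ],
}

def instantiate(script, **bindings):
    """Fill the script's role variables with concrete actors/objects."""
    return [e.format(**bindings) for e in script["events"]]

events = instantiate(SHOPPING_SCRIPT,
                     shopper="John", clerk="Sue", merchandise="book")
print(events[1])  # ATRANS book from Store to John
```

Once instantiated, unstated steps (John paid Sue) can be read off the sequence even if the input text never mentioned them.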
By linking these sentences together, the system can answer questions like “Why did John
drive to Humpy’s?” (to be in the proximity of the restaurant, in order to use the restaurant
plan). Additionally, we can use the rules to disambiguate variables, as with scripts.
Plans
• In short, plans allow:
– Inference rules to connect CD events
– Disambiguation through variable instantiations
Parsing into CD
• So far, we have ignored the problem of parsing
input text into CD. We’ve been assuming that we
are already working in the CD domain. However,
a more general system will have to parse input
English text into the CD format.
• One parsing technique is to assign a “packet” to
each word with all of the sense definitions it may
have. The packet watches for other words or
context that came before or after it, and uses this
context to determine the correct meaning of the
word.
CD Parsing Example - Knowledge
(Def-Word Jack
(Assign *cd-form* (Person (Name (Jack)))
*part-of-speech* Noun-Phrase))
(Def-Word Lobster
(Assign *cd-form* (Lobster)
*part-of-speech* Noun
*type* (Food)))
Definition for lobster is just a noun; we can include semantic information as well.
Ideally this information (e.g. food, lobster) would also be indexed into a semantic
hierarchy so that we have a better idea of what food and lobsters are.
CD Parsing - Knowledge
(Def-Word Hair
(Assign *cd-form* (hair)
*part-of-speech* Noun
*type* inanimate))
Just another sample definition, this time for Hair.

“Had” disambiguates by looking ahead to the next packet, which is put on
the stack and activated:

(Def-Word Had
  (Assign *part-of-speech* Verb
          *subject* *cd-form*)
  (Next-Packet
    (Test (And (Equal *part-of-speech* Noun)
               (Equal *type* Food))
      (Assign *cd-form* (INGEST (ACTOR *subject*)
                                (OBJECT *cd-form*))))
    (Test (And (Equal *part-of-speech* Noun)
               (Equal *type* Inanimate))
      (Assign *cd-form* (POSS (ACTOR *subject*)
                              (OBJECT *cd-form*))))))
CD Parsing Process
• Parse from left to right
• Retrieve the packet definition for each word
• Assign any variables applicable and put the next-
packet on the stack for examination of future
packets.
• We could also look backwards and see if previous
packets have been examined for disambiguation
purposes; do this by checking to see if we can
execute the top of the stack
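This left-to-right, lookahead-driven process can be sketched in a few lines of Python. This is a toy reimplementation, not the original Lisp; the lexicon mirrors the Def-Word entries shown earlier, and everything else is illustrative:

```python
# Toy packet-based parser: the packet for "had" tests the *next*
# word's properties to pick a sense (lookahead disambiguation).

LEXICON = {
    "jack":    {"pos": "Noun-Phrase", "cd": ("Person", "Jack")},
    "lobster": {"pos": "Noun", "type": "Food", "cd": ("Lobster",)},
    "hair":    {"pos": "Noun", "type": "Inanimate", "cd": ("Hair",)},
}

def parse(words):
    """Parse a three-word 'X had Y' sentence into a CD form."""
    subject = LEXICON[words[0]]["cd"]   # e.g. (Person Jack)
    nxt = LEXICON[words[2]]             # the packet looks ahead past "had"
    if words[1] == "had":
        if nxt.get("type") == "Food":
            return ("INGEST", ("ACTOR", subject), ("OBJECT", nxt["cd"]))
        if nxt.get("type") == "Inanimate":
            return ("POSS", ("ACTOR", subject), ("OBJECT", nxt["cd"]))
    return None

print(parse(["jack", "had", "lobster"])[0])  # INGEST
print(parse(["jack", "had", "hair"])[0])     # POSS
```

The same word, “had”, yields INGEST or POSS purely from the semantic type of the following noun, which is the point of the packet technique.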
CD Parsing Example
• “Jack had lobster”
Jack: *pos* = Noun-Phrase
*cd-form* = (Person (Name (Jack)))
[Figure: processing phases]

Syntactic analysis → parse tree:
  S → NP VP;  NP → N (John);  VP → V (had) NP (N → lobster)

Semantic analysis →
  (Ingest (Actor John)
          (Object Lobster))

Contextual analysis → Restaurant-Script:
  (Ptrans (Actor John) …)
  (Ingest (Actor John)
          (Object Lobster))
  …

→ Q/A, database query, translator, etc.
Distinct Phases?
• The best system can ideally go back and forth across
these boundaries in the process of parsing; for example,
performing a semantic analysis can help while doing
syntactic or even morphological processing.
– People also operate this way; we don’t wait for something to
finish parsing before working on semantic analysis. We can
see this by examining the mistakes that people make in
reading “garden path” sentences like:
• The old man the boats.
• The horse raced past the barn fell.
• The player kicked the ball kicked him.
• However, for ease of computing, computer programs
usually separate parsing into these distinct phases.
Syntactic Parsing
• Most of the work has been done in
Syntactic Parsing. We can use many of the
ideas used in compilers for parsing
computer programs. A common technique
is to define a grammar, and use that
grammar to parse the sentences.
Syntax Example
Here is a sample grammar for a subset of English:
Dictionary:
an : Determiner
arrow: Noun
flies: Noun, Verb
like : Preposition, Verb
time: Adj, Noun, Verb
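The dictionary alone already makes parsing ambiguous: several part-of-speech assignments are possible for a sentence like “Time flies like an arrow.” A quick sketch that enumerates them (the dictionary is taken from the slide; the function and sentence choice are illustrative):

```python
# Enumerate every part-of-speech assignment the dictionary allows,
# before any grammar rules prune them.
from itertools import product

DICTIONARY = {
    "an":    ["Determiner"],
    "arrow": ["Noun"],
    "flies": ["Noun", "Verb"],
    "like":  ["Preposition", "Verb"],
    "time":  ["Adj", "Noun", "Verb"],
}

def tag_sequences(sentence):
    """All tag sequences consistent with the dictionary."""
    options = [DICTIONARY[w.lower()] for w in sentence.split()]
    return list(product(*options))

seqs = tag_sequences("Time flies like an arrow")
print(len(seqs))  # 3 * 2 * 2 * 1 * 1 = 12 lexically possible taggings
```

The grammar then rules most of these out, and semantic analysis (next sections) prunes further.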
Recursive Transition Network
Sample Parse Trees
[Figure: alternative parse trees for “Time flies like an arrow”:
 1. S → NP (Time) VP (flies like an arrow): “time” as subject noun
 2. S → NP (Time flies) VP (like an arrow): “time flies” as a noun phrase
 3. S → VP (Time … like an arrow): “time” as a verb]

[Figure: semantic network fragment: Noun_Phrase / Verb_Phrase,
 Article / Noun, with concepts Animal, Object, Entity and
 Instrument: Teeth]
Pseudocode for Semantic Parser
Process_Sentence:
  Noun_Concept ← Noun_Phrase()
  VP_Concept ← Verb_Phrase()
  Bind Noun_Concept to agent in VP_Concept

Noun_Phrase procedure:
  N ← Representation of Noun
  If indefinite article and number singular, noun concept is generic
  If definite article and number singular, bind marker to noun concept
  If number plural, indicate that noun concept is plural

Verb_Phrase procedure:
  V ← Representation of Verb
  If verb has an object:
    Noun_Concept ← Noun_Phrase()
    Bind concept for Noun_Concept to object of V
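The pseudocode can be turned into a runnable sketch. Here is a hedged Python version for “The dog bit the man”; the verb lexicon entry and helper names are my own:

```python
# Toy semantic parser following the pseudocode: parse NP, then VP,
# then bind the NP concept to the verb's agent slot.

VERBS = {
    "bit": {"action": "bite",
            "frame": {"agent": None, "object": None,
                      "instrument": ("teeth", ("part", "agent"))}}
}

def noun_phrase(tokens):
    # "the dog" -> definite singular noun concept
    article, noun = tokens[0], tokens[1]
    return {"concept": noun, "definite": article == "the"}, tokens[2:]

def verb_phrase(tokens):
    verb = VERBS[tokens[0]]
    frame = dict(verb["frame"])
    rest = tokens[1:]
    if rest:  # verb has an object
        obj, rest = noun_phrase(rest)
        frame["object"] = obj["concept"]
    return {"action": verb["action"], **frame}, rest

def process_sentence(sentence):
    tokens = sentence.lower().split()
    np, rest = noun_phrase(tokens)
    vp, _ = verb_phrase(rest)
    vp["agent"] = np["concept"]  # bind NP concept to the agent slot
    return vp

s = process_sentence("The dog bit the man")
print(s["agent"], s["object"])  # dog man
```

The unfilled `?A` variables of the trace below correspond to the `None` agent slot, bound in the final step of `process_sentence`.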
Semantic Parse Example
Parsing “The dog bit the man” (steps in execution order):
 1. ()                      start at Sentence
 2. NP = ()
 3. N = dog, singular
 4. N = dog, singular       noun concept returned
 5. V = ?
 6. V = (bite (agent ?A) (object ?O) (instrument (teeth (part ?A))))
 7. NP = ()
 8. N = man, singular
 9. V = (bite (agent ?A) (object man) (instrument (teeth (part ?A))))
10. S = (bite (agent dog) (object man) (instrument (teeth (part dog))))

[Resulting network: Action = Bite, with agent dog, object man, and
 instrument teeth (part-of dog)]
Semantic / Discourse Analysis
• Syntactic: left with a number of parse trees
• Semantic analysis: can help us disambiguate which parse tree is
correct.
• Semantic and discourse analysis compose most of the things we
discussed in CD: a way to use the meaning of the words to further
disambiguate what is happening. Semantic analysis can rule out many
interpretations, such as the reading of “Time flies” as a type of fly,
where Time is an adjective.
• Scripts are one method of discourse analysis; they use the previous
context and previous sentences to interpret new sentences. All steps
together are required for a complete understanding of input text.
However, portions may be used alone to address many problems.
Additionally, often domains can be simplified to a point where a
grammar may be constructed for it and the appropriate understanding
tasks can be applied.
Demo to Try
• MIT Jupiter system applies analysis from the
phonological level up to discourse, but only in the
small domain of weather around the world.
• Via speech recognition you can ask Jupiter
questions, such as "What is the weather like in
Anchorage?" or "Where is it snowing now?" You
can try it by calling 1-888-573-TALK.
• Note that if you call, your voice will be recorded
and used for future speech recognition research.