Linguistic Structure: Dependency Parsing
Slides
-
Two views of linguistic structure
- Constituency = phrase structure grammar = context-free grammars (CFGs); phrase structure organizes words into nested constituents
- Dependency structure shows which words depend on (modify or are arguments of) which other words
Dependency paths identify semantic relations; Dependency Grammar and Structure
Dependency syntax postulates that syntactic structure consists of relations between lexical items, normally binary asymmetric relations ("arrows") called dependencies
- the dependencies usually form a tree (connected, acyclic, single-head)
- the arrows are commonly typed with the name of a grammatical relation
- usually add a fake ROOT, so every word is a dependent of precisely one other node (see the head-index sketch below)
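A minimal sketch of this representation, with the sentence and labels chosen here purely for illustration: a fake ROOT at index 0, and one head index per word.

```python
# Dependency tree for "I ate fish" as a head-index array.
# Index 0 is the fake ROOT; every real word has exactly one head.
sentence = ["ROOT", "I", "ate", "fish"]
heads = [None, 2, 0, 2]                   # head index of each word (None for ROOT)
labels = [None, "nsubj", "root", "obj"]   # typed grammatical relations on the arrows

for i in range(1, len(sentence)):
    print(f"{sentence[heads[i]]} --{labels[i]}--> {sentence[i]}")
# ate --nsubj--> I
# ROOT --root--> ate
# ate --obj--> fish
```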
-
The rise of annotated data (Universal Dependencies treebanks)
- seems a lot slower and less useful than building a grammar
- reusability of the labor; broad coverage, not just a few intuitions; frequencies and distributional information; a way to evaluate systems
-
What are the sources of information for dependency parsing? (a small feature sketch follows this list)
- Bilexical affinities: [discussion → issues] is plausible
- Dependency distance: mostly with nearby words
- Intervening material: dependencies rarely span intervening verbs or punctuation
- Valency of heads: how many dependents on which side are usual for a head?
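As a rough illustration of how these information sources could be encoded as features for a candidate head → dependent arc (the function and feature names below are hypothetical, not from the lecture):

```python
def candidate_arc_features(words, head, dep):
    """Toy features for a candidate arc head -> dep, mirroring the sources above."""
    lo, hi = sorted((head, dep))
    between = words[lo + 1:hi]
    return {
        "bilexical_pair": (words[head], words[dep]),   # bilexical affinity
        "distance": abs(head - dep),                   # dependency distance
        "has_intervening_punct": any(w in {",", ";", "."} for w in between),  # intervening material
        "dep_side": "left" if dep < head else "right", # feeds valency counts per head
    }
```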
-
Dependency Parsing constraints
- only one word is a dependent of ROOT
- don't want cycles like A → B, B → A
- this makes the dependencies a tree
- the final issue is whether arrows can cross (non-projective) or not (see the check sketch after this list)
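A rough sketch of checking these constraints on the head-index representation above (assuming heads[0] is None for ROOT):

```python
def is_valid_tree(heads):
    """Single-root and acyclicity check over a head-index array."""
    # exactly one word depends on ROOT (head index 0)
    if sum(1 for h in heads[1:] if h == 0) != 1:
        return False
    # following head pointers from every word must reach ROOT without revisiting a node
    for i in range(1, len(heads)):
        seen, j = set(), i
        while j != 0:
            if j in seen or heads[j] is None:
                return False
            seen.add(j)
            j = heads[j]
    return True

def is_projective(heads):
    """True if no two arcs cross (arrows can be drawn above the words without crossing)."""
    arcs = [(h, d) for d, h in enumerate(heads) if h is not None]
    for h1, d1 in arcs:
        for h2, d2 in arcs:
            a, b = sorted((h1, d1))
            c, d = sorted((h2, d2))
            if a < c < b < d:   # strictly interleaved spans => crossing arcs
                return False
    return True
```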
-
Methods of Dependency Parsing
- Dynamic programming
- Graph algorithms
- Constraint Satisfaction
- "Transiton-based parsing" or "deterministic dependency parsing"
-
Neural dependency parser
-
Handling non-projectivity
not necessary all the time
Notes
- Keyphrases: Dependency Parsing
- Two main types of parse trees: constituency structures; dependency structures
-
Dependency parsing problem (formally): {input: S = w_0 w_1 ... w_n (w_0 is the ROOT); output: dependency tree graph G}
-
Two subproblems in dependency parsing
- Learning: given a training set D of sentences annotated with dependency graphs, induce a parsing model M that can be used to parse new sentences
- Parsing: given a parsing model M and a sentence S, derive the optimal dependency graph D for S according to M
-
Transition-based dependency parsing
relies on a state machine that defines the possible transitions to map a sentence to its dependency tree (a minimal sketch of the state machine follows)
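A minimal sketch of such a state machine using the arc-standard transition system (SHIFT, LEFT-ARC, RIGHT-ARC); the class and its details are illustrative:

```python
class ParserState:
    """Arc-standard parser state: a stack, a buffer, and the arcs built so far."""
    def __init__(self, n_words):
        self.stack = [0]                            # ROOT (index 0) starts on the stack
        self.buffer = list(range(1, n_words + 1))   # indices of the real words, in order
        self.arcs = []                              # (head, dependent, label) triples

    def shift(self):
        # move the next word from the buffer onto the stack
        self.stack.append(self.buffer.pop(0))

    def left_arc(self, label):
        # second-from-top becomes a dependent of the top of the stack
        dep = self.stack.pop(-2)
        self.arcs.append((self.stack[-1], dep, label))

    def right_arc(self, label):
        # top of the stack becomes a dependent of the second-from-top
        dep = self.stack.pop()
        self.arcs.append((self.stack[-1], dep, label))

    def is_terminal(self):
        # done when the buffer is empty and only ROOT remains on the stack
        return not self.buffer and len(self.stack) == 1
```
-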
Greedy Deterministic Transition-based parsing
search strategy: greedily pick a single transition at each step and never backtrack (see the sketch below)
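Its answer to the search question is to not search at all. A sketch of the greedy loop, where score_transitions stands in for any trained classifier (a placeholder, not a real API), reusing the ParserState above:

```python
def legal(action, state):
    # which transitions are allowed in the current state
    if action == "shift":
        return bool(state.buffer)
    if action == "left_arc":
        return len(state.stack) > 2   # the second item on the stack must not be ROOT
    return len(state.stack) > 1       # right_arc

def greedy_parse(state, score_transitions):
    """Apply the single highest-scoring legal transition at each step; never backtrack.
    score_transitions(state) should return a list of (score, action, label) tuples."""
    while not state.is_terminal():
        for _, action, label in sorted(score_transitions(state), reverse=True):
            if legal(action, state):
                if action == "shift":
                    state.shift()
                elif action == "left_arc":
                    state.left_arc(label)
                else:
                    state.right_arc(label)
                break
    return state.arcs
```
-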
Neural Dependency Parsing
transition-based methods; graph-based methods (a sketch of a neural transition scorer follows)
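A hedged sketch of a transition-based neural scorer in the spirit of Chen & Manning (2014), the paper listed under Suggested Readings, including its cube activation; the layer sizes and feature counts here are placeholders:

```python
import torch
import torch.nn as nn

class NeuralTransitionScorer(nn.Module):
    """Feedforward scorer over embedded word/POS/label features of a parser state."""
    def __init__(self, vocab_size, embed_dim=50, n_features=48,
                 hidden_dim=200, n_transitions=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.hidden = nn.Linear(n_features * embed_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, n_transitions)

    def forward(self, feature_ids):              # feature_ids: (batch, n_features) long tensor
        x = self.embed(feature_ids).flatten(1)   # concatenate the feature embeddings
        h = self.hidden(x) ** 3                  # cube activation, f(x) = x^3
        return self.out(h)                       # one score per candidate transition
```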
Suggested Readings
-
Incrementality in Deterministic Dependency Parsing
discusses the effect of constraints on deterministic dependency parsing; only skimmed
-
A Fast and Accurate Dependency Parser using Neural Networks
the model introduced in this lecture; activation function f(x) = x^3
-
Dependency Parsing
a 127-page textbook; not read yet
-
Globally Normalized Transition-Based Neural Networks
global normalization (over whole transition sequences) handles label bias better than local normalization (normalizing each step independently) and recovers better from early decision mistakes; transition-based + feedforward NN vs. RNN-based
-
Universal Dependencies: A cross-linguistic typology
a taxonomy of grammatical relations applicable to a variety of languages; only skimmed
-
Universal Dependencies website
an introduction to Universal Dependencies; not read yet