Tree Recursive Neural Networks, Constituency Parsing, and Sentiment
Slides
Principle of compositionality
- The meaning (vector) of a sentence is determined by:
- the meanings of its words
- the rules that combine them (a small sketch of this follows the list)
- Models in this section can jointly learn parse trees and compositional vector representations
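
A minimal sketch of the principle of compositionality in PyTorch: word vectors are the leaves, and a single combination rule is applied recursively along a given parse tree. The dimension, random weights, and example sentence are assumptions for illustration, not the lecture's code.

```python
# Sketch only: sentence meaning = word vectors + one rule that combines them.
import torch

d = 4
torch.manual_seed(0)
W = torch.randn(d, 2 * d)                       # the "rule that combines them" (assumed)
word_vec = {w: torch.randn(d) for w in ["the", "cat", "sat"]}

def compose(tree):
    """tree is either a word (str) or a pair (left_subtree, right_subtree)."""
    if isinstance(tree, str):
        return word_vec[tree]                   # meaning of a word
    left, right = tree
    children = torch.cat([compose(left), compose(right)])
    return torch.tanh(W @ children)             # meaning of a phrase from its parts

# ((the cat) sat) -> one vector for the whole sentence
sentence_vec = compose((("the", "cat"), "sat"))
print(sentence_vec)
```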
Recursive vs. Recurrent
- Slide 17
- Recursive neural nets require a tree structure
- Recursive neural nets for structure prediction (a minimal sketch follows this list)
- Inputs: the representations of two candidate children
- Outputs:
- The semantic representation if the two nodes are merged
- Score of how plausible the new node would be
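
A minimal sketch of the structure-prediction unit just described, plus a greedy loop that repeatedly merges the highest-scoring adjacent pair. The module name, dimensions, and the greedy strategy shown here are assumptions for illustration.

```python
# Sketch only: given two candidate children, return the merged representation
# and a plausibility score for that merge.
import torch
import torch.nn as nn

class TreeRNNUnit(nn.Module):
    def __init__(self, d):
        super().__init__()
        self.W = nn.Linear(2 * d, d)               # combines the concatenated children
        self.score = nn.Linear(d, 1, bias=False)   # how plausible the new node is

    def forward(self, c1, c2):
        parent = torch.tanh(self.W(torch.cat([c1, c2], dim=-1)))
        return parent, self.score(parent)

# Greedy use: repeatedly merge the adjacent pair with the highest score.
d = 4
unit = TreeRNNUnit(d)
nodes = [torch.randn(d) for _ in range(4)]          # word vectors (assumed)
while len(nodes) > 1:
    cands = [unit(nodes[i], nodes[i + 1]) for i in range(len(nodes) - 1)]
    best = max(range(len(cands)), key=lambda j: cands[j][1].item())
    nodes[best:best + 2] = [cands[best][0]]         # replace the pair by its parent
print(nodes[0].shape)                               # the sentence representation
```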
More complex TreeRNN units
Notes
Recursive Neural Networks
Vector, Matrix, Tensor for the "W": plain TreeRNNs compose word vectors with a single shared matrix, MV-RNNs give each node a vector and a matrix, and the RNTN adds a tensor to the composition (see the sketch below)
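
A rough sketch of the three composition functions this progression refers to (plain TreeRNN, MV-RNN, RNTN). Shapes and names are assumptions, and bias terms are dropped for brevity; this is not the lecture's code.

```python
import torch

d = 4
torch.manual_seed(0)
a, b = torch.randn(d), torch.randn(d)          # two children (word/phrase vectors)
W = torch.randn(d, 2 * d)                      # shared composition matrix
A, B = torch.randn(d, d), torch.randn(d, d)    # per-node matrices (MV-RNN)
V = torch.randn(d, 2 * d, 2 * d)               # composition tensor (RNTN)

def tree_rnn(a, b):
    # Plain TreeRNN: one matrix applied to the concatenated children.
    return torch.tanh(W @ torch.cat([a, b]))

def mv_rnn(a, A, b, B):
    # MV-RNN: each child's matrix first modifies the other child's vector.
    return torch.tanh(W @ torch.cat([B @ a, A @ b]))

def rntn(a, b):
    # RNTN: adds a bilinear tensor term so the children interact multiplicatively.
    c = torch.cat([a, b])
    bilinear = torch.einsum("i,kij,j->k", c, V, c)
    return torch.tanh(bilinear + W @ c)

print(tree_rnn(a, b), mv_rnn(a, A, b, B), rntn(a, b), sep="\n")
```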
Constituency parsing (phrase structure parsing); an example parse is shown below
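
To make the term concrete, a small bracketed phrase-structure tree rendered with NLTK; the sentence and its tags are just an illustrative example, not from the lecture.

```python
from nltk import Tree

parse = Tree.fromstring(
    "(S (NP (DT The) (NN cat))"
    "   (VP (VBD sat) (PP (IN on) (NP (DT the) (NN mat)))))"
)
parse.pretty_print()   # draws the constituency tree as ASCII art
```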
Suggested Readings
Constituency Parsing with a Self-Attentive Encoder
factored self-attention: the encoder's attention is split into separate content and position components (a loose sketch follows the reading list)
Parsing with Compositional Vector Grammars
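
A loose single-head sketch of the factored self-attention idea: content and position are projected and attended separately, and their attention logits are summed. The class name, shapes, and simplifications are assumptions rather than the paper's exact formulation.

```python
import torch
import torch.nn as nn

class FactoredSelfAttention(nn.Module):
    """Single-head sketch: content and position attend separately; logits are summed."""
    def __init__(self, d_content, d_pos):
        super().__init__()
        self.qc, self.kc, self.vc = (nn.Linear(d_content, d_content, bias=False) for _ in range(3))
        self.qp, self.kp, self.vp = (nn.Linear(d_pos, d_pos, bias=False) for _ in range(3))

    def forward(self, content, pos):
        # content: (T, d_content) word embeddings; pos: (T, d_pos) position embeddings
        logits = self.qc(content) @ self.kc(content).T + self.qp(pos) @ self.kp(pos).T
        attn = torch.softmax(logits / (content.size(-1) + pos.size(-1)) ** 0.5, dim=-1)
        # keep the output factored as well: concatenate the two attended halves
        return torch.cat([attn @ self.vc(content), attn @ self.vp(pos)], dim=-1)

T, dc, dp = 5, 8, 8
layer = FactoredSelfAttention(dc, dp)
out = layer(torch.randn(T, dc), torch.randn(T, dp))   # shape (T, dc + dp)
print(out.shape)
```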
Comments | NOTHING