Representational correlates of hierarchical phrase structure in deep language models

Abstract

While contextual representations from Transformer-based architectures have set a new standard for many NLP tasks, there is not yet a complete account of their inner workings. In particular, it is not entirely clear which aspects of sentence-level syntax these representations capture, nor how (if at all) they are built up along the stacked layers of the network...

https://openreview.net/pdf?id=mhEd8uOyNTI
