Learning first-pass structural attachment preferences with dynamic grammars and recursive neural networks

Sturt, P., Costa, F., Lombardo, V. and Frasconi, P. (2003) Learning first-pass structural attachment preferences with dynamic grammars and recursive neural networks. Cognition, 88, pp. 133-169. (doi: 10.1016/S0010-0277(03)00026-X)

Full text not currently available from Enlighten.

Abstract

One of the central problems in the study of human language processing is ambiguity resolution: how do people resolve the extremely pervasive ambiguity of the language they encounter? One possible answer to this question is suggested by experience-based models, which claim that people typically resolve ambiguities in a way which has been successful in the past. In order to determine the course of action that has been “successful in the past” when faced with some ambiguity, it is necessary to generalize over past experience. In this paper, we will present a computational experience-based model, which learns to generalize over linguistic experience from exposure to syntactic structures in a corpus. The model is a hybrid system, which uses symbolic grammars to build and represent syntactic structures, and neural networks to rank these structures on the basis of its experience. We use a dynamic grammar, which provides a very tight correspondence between grammatical derivations and incremental processing, and recursive neural networks, which are able to deal with the complex hierarchical structures produced by the grammar. We demonstrate that the model reproduces a number of the structural preferences found in the experimental psycholinguistics literature, and also performs well on unrestricted text.
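As a rough illustration of the ranking idea described in the abstract (not the authors' implementation, which couples a dynamic grammar with recursive neural networks trained on treebank derivations), the sketch below shows how a recursive neural network can compose vectors bottom-up over a candidate syntactic tree and map the root vector to a scalar preference score. All names, dimensions, and the example trees are hypothetical, and the weights are random rather than learned.

```python
# Minimal sketch, assuming binary trees encoded as nested tuples with string
# leaves. In the actual model, parameters would be learned from corpus
# experience; here they are random, so the "preference" is meaningless and
# serves only to show the flow of computation.
import numpy as np

DIM = 20                                   # hidden vector size (illustrative)
rng = np.random.default_rng(0)

W = rng.standard_normal((DIM, 2 * DIM)) * 0.1   # binary composition weights
b = np.zeros(DIM)
w_out = rng.standard_normal(DIM) * 0.1          # root-to-score weights
leaf_vecs = {}                                   # label -> embedding cache

def embed(label):
    """Return (and cache) a random vector for a leaf label."""
    if label not in leaf_vecs:
        leaf_vecs[label] = rng.standard_normal(DIM) * 0.1
    return leaf_vecs[label]

def encode(tree):
    """Bottom-up composition: a tree is either a leaf label or a (left, right) pair."""
    if isinstance(tree, str):
        return embed(tree)
    left, right = tree
    children = np.concatenate([encode(left), encode(right)])
    return np.tanh(W @ children + b)

def score(tree):
    """Scalar preference score for one candidate structure."""
    return float(w_out @ encode(tree))

# Rank two hypothetical attachment alternatives for an ambiguous fragment.
candidates = {
    "VP attachment": (("read", ("the", "letter")), ("with", "glasses")),
    "NP attachment": ("read", (("the", "letter"), ("with", "glasses"))),
}
for name, tree in candidates.items():
    print(f"{name}: {score(tree):+.3f}")
print("preferred:", max(candidates, key=lambda n: score(candidates[n])))
```

The key property this sketch shares with recursive neural networks in general is that the same composition function is applied at every internal node, so trees of arbitrary depth and shape can be reduced to a fixed-size vector before scoring.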

Item Type: Articles
Status: Published
Refereed: Yes
Glasgow Author(s) Enlighten ID: UNSPECIFIED
Authors: Sturt, P., Costa, F., Lombardo, V., and Frasconi, P.
College/School: College of Science and Engineering > School of Psychology
Journal Name: Cognition
