Extracting finite structure from infinite language

Date

2005-08-01

Journal Title

Knowledge-Based Systems

ISSN

0950-7051

Publisher

Elsevier

Type

Article

Peer reviewed

Abstract

This paper presents a novel unsupervised neural network model for learning the finite-state properties of an input language from a set of positive examples. The model is demonstrated to learn the Reber grammar perfectly from a randomly generated training set and to generalize to sequences longer than any found in the training set. Crucially, it does not require negative examples. Thirty percent of the test runs yielded perfect grammar recognizers, compared with only 2% reported by other authors for simple recurrent networks. The paper was initially presented at the AI-2004 conference, where it won the Best Technical Paper award.
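For background, the Reber grammar mentioned above is a small finite-state machine commonly used as a grammar-induction benchmark. The following is a minimal sketch, using the standard transition table from the literature (not code from the paper itself), of how positive training examples of the kind described can be generated and checked:

```python
import random

# Standard (non-embedded) Reber grammar transition table:
# state -> list of (symbol, next_state); every string starts with 'B'
# and ends with 'E', which leads to the accepting state 6.
REBER = {
    0: [('T', 1), ('P', 2)],
    1: [('S', 1), ('X', 3)],
    2: [('T', 2), ('V', 4)],
    3: [('X', 2), ('S', 5)],
    4: [('P', 3), ('V', 5)],
    5: [('E', 6)],
}

def generate(rng=random):
    """Randomly walk the grammar to produce one positive example."""
    state, out = 0, ['B']
    while state != 6:
        symbol, state = rng.choice(REBER[state])
        out.append(symbol)
    return ''.join(out)

def accepts(string):
    """Return True iff the string is generated by the Reber grammar."""
    if not string.startswith('B'):
        return False
    state = 0
    for ch in string[1:]:
        moves = dict(REBER.get(state, []))
        if ch not in moves:
            return False
        state = moves[ch]
    return state == 6
```

Because the loops at states 1 and 2 can be taken any number of times, a randomly generated training set contains strings of varying length, which is what makes generalization to longer sequences a meaningful test.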

Keywords

RAE 2008, UoA 23 Computer Science and Informatics, artificial neural networks, grammar induction, natural language processing, self-organizing map, STORM (Spatio Temporal Self-Organizing Recurrent Map)

Citation

McQueen, T. et al. (2005) Extracting finite structure from infinite language. Knowledge-Based Systems, 18(4-5), pp. 135-141.
