Recurrent Artificial Neural Networks and Finite State Natural Language Processing.









It is argued that pessimistic assessments of the adequacy of artificial neural networks (ANNs) for natural language processing (NLP), made on the grounds that they have a finite state architecture, are unjustified, and that their adequacy in this regard is an empirical issue. First, the paper presents arguments that counter standard objections to finite state NLP, holding that these objections confuse the explanatory aims of linguistic theory with the essentially technological aims of NLP. A finite state NLP model that maps strings onto meaning representations is then proposed, and preliminary string processing test and cluster analysis results from a computer simulation are presented, briefly addressing some problems and possible further developments. It is concluded that it remains to be seen whether the model can be developed for general NLP, but that it exemplifies the kind of radical departure from linguistics-based NLP that becomes possible once the supposed theoretical obstacles to finite state NLP are removed. In particular, the model departs from linguistics-based NLP in making no use of any syntactic or compositional structure beyond the purely sequential, and amounts to table lookup mapping from strings to meaning representations. Contains 33 references. (MSE)
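
The abstract describes the model only in outline; no code is given. For orientation, the following is a minimal sketch of the kind of architecture described: an Elman-style simple recurrent network that reads a string one symbol at a time and maps it to a fixed-width meaning-representation vector, with the hidden states kept available for the sort of cluster analysis mentioned above. The toy vocabulary, layer sizes, random (untrained) weights, and the helper names one_hot and meaning are all illustrative assumptions, not Moisl's actual simulation.

# A minimal sketch (assumptions noted above), not the paper's actual model:
# an Elman-style simple recurrent network mapping a token string to a
# "meaning representation" vector.
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["the", "dog", "cat", "chases", "sees"]   # hypothetical toy lexicon
TOK = {w: i for i, w in enumerate(VOCAB)}
IN_DIM, HID_DIM, OUT_DIM = len(VOCAB), 8, 4        # assumed layer sizes

# Randomly initialised weights; a real model would be trained (e.g. with
# backpropagation through time) to produce the target meaning representations.
W_in  = rng.normal(0.0, 0.5, (HID_DIM, IN_DIM))
W_rec = rng.normal(0.0, 0.5, (HID_DIM, HID_DIM))
W_out = rng.normal(0.0, 0.5, (OUT_DIM, HID_DIM))

def one_hot(word):
    v = np.zeros(IN_DIM)
    v[TOK[word]] = 1.0
    return v

def meaning(sentence):
    """Map a token string to a meaning-representation vector.

    The hidden state is updated once per input symbol, so the network's
    next state depends only on the current symbol and the current state:
    a finite state device in the sense at issue in the paper.
    """
    h = np.zeros(HID_DIM)
    states = []
    for w in sentence.split():
        h = np.tanh(W_in @ one_hot(w) + W_rec @ h)
        states.append(h.copy())
    return W_out @ h, np.array(states)

vec, hidden_states = meaning("the dog chases the cat")
print("meaning vector:", np.round(vec, 3))

# In the spirit of the paper's cluster analysis: compare hidden states
# (here, pairwise distances within a single string).
dist = np.linalg.norm(hidden_states[:, None, :] - hidden_states[None, :, :], axis=-1)
print("hidden-state distance matrix:")
print(np.round(dist, 2))

Because such a network has no stack or unbounded memory, its string-to-meaning mapping is finite state in character, which is precisely the property the paper argues is not a fatal limitation for technological NLP.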

Descriptors: Computational Linguistics, Foreign Countries, Language Processing, Language Research, Linguistic Theory, Neurolinguistics, Simulation











Author: Moisl, Hermann

Source: https://eric.ed.gov/?q=a&ft=on&ff1=dtySince_1992&pg=11163&id=ED377709






