Abstract

Unlike most traditional approaches to grammar, today's formal and computational models treat the lexicon as an essential part of the whole system. Differences between current theoretical and computational models are increasingly insignificant, although the latter are perhaps still characterized by giving full priority to minimalist solutions over more abstract considerations such as exhaustivity or formal elegance. The paper introduces some strategies in linguistic computation for natural language processing (NLP) according to a typology based on functional complexity, describing in each case the scope and the role played by the dictionary: pattern matching, semantic grammars, syntactic parsers, augmented transition networks, unification formalisms, case frame grammars, etc. It ends with an exploration of lexicographically oriented procedures of conceptual dependency, which lie beyond NLP proper, deep within the domain of artificial intelligence.

Full text

Download PDF document

Document information

Published on 01/07/94
Accepted on 01/07/94
Submitted on 01/07/94

Volume 9, Issue 2, 1994
DOI: 10.7203/caplletra.17.7397
Licence: CC BY-NC-SA
