ERIC Number: EJ1270142
Record Type: Journal
Publication Date: 2020
Pages: 4
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-0142-7237
EISSN: N/A
Available Date: N/A
What Counts as an Exemplar Model, Anyway? A Commentary on Ambridge (2020)
Mahowald, Kyle; Kachergis, George; Frank, Michael C.
First Language, v40 n5-6 p608-611 Oct-Dec 2020
Ambridge calls for exemplar-based accounts of language acquisition. Do modern neural networks such as transformers or word2vec -- which have been extremely successful in modern natural language processing (NLP) applications -- count? Although these models often have ample parametric complexity to store exemplars from their training data, they also go far beyond simple storage by processing and compressing the input via their architectural constraints. The resulting representations have been shown to encode emergent abstractions. If these models are exemplar-based, then Ambridge's theory only weakly constrains future work. On the other hand, if these systems are not exemplar models, why is it that true exemplar models are not contenders in modern NLP? [For Ben Ambridge's "Against Stored Abstractions: A Radical Exemplar Model of Language Acquisition," see EJ1269951.]
Descriptors: Models, Language Processing, Computational Linguistics, Language Acquisition, Natural Language Processing, Linguistic Input, Abstract Reasoning, Linguistic Theory, Computer Software, Artificial Intelligence, Syntax, Figurative Language, Memory
SAGE Publications. 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 800-818-7243; Tel: 805-499-9774; Fax: 800-583-2665; e-mail: journals@sagepub.com; Web site: https://sagepub.com
Publication Type: Journal Articles; Reports - Evaluative
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A