Items Outperform Adjectives in a Computational Model of Binary Semantic Classification.
Cogn Sci; 47(9): e13336, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37695844
Semantic memory encompasses one's knowledge about the world. Distributional semantic models, which construct vector spaces with embedded words, are a proposed framework for understanding the representational structure of human semantic knowledge. Unlike some classic semantic models, distributional semantic models lack a mechanism for specifying the properties of concepts, which raises questions regarding their utility for a general theory of semantic knowledge. Here, we develop a computational model of a binary semantic classification task, in which participants judged target words for the referent's size or animacy. We created a family of models, evaluating multiple distributional semantic models and multiple mechanisms for performing the classification. The most successful model constructed two composite representations, one for each extreme of the decision axis (e.g., one averaging together representations of characteristically big things and another of characteristically small things). The target item was then compared to each composite representation, allowing the model to classify more than 1,500 words with human-range performance and to predict response times. We propose that when making a decision on a binary semantic classification task, humans use task prompts to retrieve instances representative of the extremes on that semantic dimension and compare the probe to those instances. This proposal is consistent with the principles of the instance theory of semantic memory.
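As a rough illustration of the classification mechanism described in the abstract, the sketch below averages the embeddings of words chosen as characteristic of each extreme of the size axis into two composite vectors and assigns a probe word to whichever composite it is more similar to. The toy four-dimensional vectors, the anchor words, and the function names are hypothetical stand-ins, not the authors' implementation; the paper's models use full distributional semantic embeddings and also predict response times.

import numpy as np

# Hypothetical toy embeddings standing in for a distributional semantic
# model's word vectors (the paper evaluates several such models);
# the values here are illustrative only.
EMBEDDINGS = {
    "elephant": np.array([0.9, 0.1, 0.8, 0.2]),
    "whale":    np.array([0.8, 0.2, 0.9, 0.1]),
    "mouse":    np.array([0.1, 0.9, 0.2, 0.8]),
    "ant":      np.array([0.2, 0.8, 0.1, 0.9]),
    "horse":    np.array([0.7, 0.3, 0.6, 0.3]),
}

def cosine(u, v):
    # Cosine similarity between two vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def composite(words):
    # Average the vectors of instances characteristic of one extreme
    # of the decision axis (e.g., characteristically big things).
    return np.mean([EMBEDDINGS[w] for w in words], axis=0)

def classify_size(probe, big_anchors=("elephant", "whale"),
                  small_anchors=("mouse", "ant")):
    # Compare the probe to each composite; the sign of the similarity
    # difference gives the classification, and its magnitude could serve
    # as a proxy for decision difficulty (and hence response time).
    sim_big = cosine(EMBEDDINGS[probe], composite(big_anchors))
    sim_small = cosine(EMBEDDINGS[probe], composite(small_anchors))
    return ("big" if sim_big > sim_small else "small"), sim_big - sim_small

print(classify_size("horse"))  # -> ('big', positive margin) with these toy vectors

The key design choice this sketch mirrors is that the task prompt selects which anchor instances are retrieved and averaged, so the same embedding space can support different binary judgments (size, animacy) without concept-level property lists.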
Database: MEDLINE
Main subject: Semantics / Knowledge
Study type: Prognostic_studies
Limit: Humans
Language: English
Journal: Cogn Sci
Year: 2023
Document type: Article
Country of publication: United States