ERIC Number: ED649797
Record Type: Non-Journal
Publication Date: 2022
Pages: 142
Abstractor: As Provided
ISBN: 979-8-3575-3206-0
ISSN: N/A
EISSN: N/A
An Embodiment Account of Event Concept Representation in the Brain
Jiaqing Tong
ProQuest LLC, Ph.D. Dissertation, The Medical College of Wisconsin
Despite centuries of effort, how concepts are represented in the brain remains elusive. The embodiment view holds that the sensory, motor, and other brain areas through which people acquire conceptual information during life experiences also represent this information during concept retrieval. Compelling neurobiological evidence supports this view, but it comes mainly from studies of object concepts; the representation of event concepts and its underlying neural mechanisms have received far less attention. Componential models of concept representation have been developed in psychology and computer science, but their application has given limited insight into how concepts are represented in the brain, since the information represented by each component is unknown. A high-dimensional componential model inspired by the embodiment view, developed by Binder et al., provides the opportunity to probe the characteristics of the neural signature of each component. We hypothesize that the preferential representation of the event category in the brain results from natural constraints on the representation of specific experiential features associated with event knowledge. In this work, we tested this hypothesis through the following experiments and analyses. First, we investigated the neural signature of event concept representation using a large dataset of 320 nouns spanning 8 subcategories and a relatively large number of participants, which provided greater statistical power. We contrasted activation in the event and object conditions to identify areas preferentially representing event concepts. This analysis identified several regions implicated in event representation by previous studies, for example, the left posterior superior temporal sulcus extending into the inferior parietal cortex and the left anterior superior temporal sulcus.
In addition, we found multiple regions not reported in previous studies, for example, the left inferior frontal gyrus, the right inferior frontal gyrus, and the right anterior superior temporal sulcus. We then leveraged the categorical structure of the stimuli in a classification study, testing how well classifiers could assign the concepts to a priori categories. This analysis identified overlapping regions for event and object representation across the brain surface, with the regions representing event concepts being more extensive than those representing object concepts. The contrast between classifier accuracies in the event and object conditions identified the left inferior frontal gyrus pars opercularis and the supramarginal gyrus.
Second, we hypothesized that the same set of experiential features is involved in the representation of both event and object concepts in the brain. We investigated whether an encoding model trained to predict the similarity structure of the neural activation patterns elicited by object concepts would also predict the similarity structure of event concepts, and vice versa. Both encoding models significantly predicted the neural similarity structures across categories in areas where event or object concepts are preferentially represented. Next, the difference in activation between event and object concepts on the brain surface should reflect a difference in the experiential information represented in the underlying region. We used ridge regression to find the linear combination of experiential features that best predicted the activation amplitude for event trials, then used the weighted model to predict the average object activation amplitude at each vertex. The predicted t-map from this vertex-wise encoding analysis closely resembled the pattern observed in the original univariate contrast between the event and object conditions.
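The logic of the vertex-wise encoding analysis can be sketched as follows. This is a minimal illustration using synthetic data, not the dissertation's stimuli or experiential ratings: the feature count, trial counts, and noise level are placeholder assumptions. A ridge model is fit on event trials (feature ratings predicting activation amplitude) and then used to predict activation for held-out object trials, mimicking the cross-category prediction described above.

```python
# Sketch of a vertex-wise encoding analysis with cross-category prediction.
# All data are synthetic placeholders; a shared linear code is simulated so
# that a model fit on event trials transfers to object trials.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_event, n_object, n_features = 160, 160, 65  # placeholder dimensions

# Synthetic feature ratings (concepts x features) and a hidden linear code
# shared by both concept categories.
true_w = rng.normal(size=n_features)
X_event = rng.normal(size=(n_event, n_features))
X_object = rng.normal(size=(n_object, n_features))
y_event = X_event @ true_w + 0.1 * rng.normal(size=n_event)
y_object = X_object @ true_w + 0.1 * rng.normal(size=n_object)

# Fit the feature weights on event trials at this "vertex" ...
model = Ridge(alpha=1.0).fit(X_event, y_event)
# ... then predict activation amplitudes for object trials.
pred_object = model.predict(X_object)

# If both categories share a representational code, the cross-category
# prediction should correlate with the observed amplitudes.
r = np.corrcoef(pred_object, y_object)[0, 1]
print(f"cross-category prediction r = {r:.2f}")
```

In the actual analysis this regression would be fit independently at every vertex on the brain surface, and the predicted amplitudes would be aggregated into the predicted t-map.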
These results indicate that event and object concepts rely on a shared representational code based on experiential information, suggesting that differences in neural activation between object and event concepts arise from quantitative differences in the experiential content of these concept categories.
Third, we hypothesized that event-salient experiential features give rise to the differences in activation between event and object concepts in the brain areas where event concepts are preferentially represented. We tested this hypothesis with encoding models composed of the similarity structures of event-salient or object-salient features, defined by feature importance analyses of the ratings, in the brain areas where event concepts (the event network ROI) or object concepts (the object network ROI) are preferentially represented. In the event network ROI, the encoding model built from the similarity structure of event-salient features performed best in predicting left-out neural similarity structures of event concepts. Conversely, in the object network ROI, the encoding model built from the similarity structures of object-salient features performed best in predicting left-out neural similarity structures of object concepts. These results suggest that the difference in activation across these networks may arise from differences in the experiential information represented. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by Telephone: 800-521-0600. Web page: http://bibliotheek.ehb.be:2222/en-US/products/dissertations/individuals.shtml.]
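The similarity-structure comparison in the third analysis can be sketched in representational-similarity terms. This is an illustration under stated assumptions, not the dissertation's pipeline: the feature partition, concept count, and "neural" data are synthetic, with the simulated region's responses driven only by the event-salient subset so that the corresponding model similarity structure should win.

```python
# Sketch of comparing feature-based similarity structures against a neural
# similarity structure. Data and the event/object feature split are synthetic.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_concepts, n_feat = 40, 20
ratings = rng.normal(size=(n_concepts, n_feat))
event_salient = slice(0, 10)    # hypothetical event-salient features
object_salient = slice(10, 20)  # hypothetical object-salient features

# Simulated "neural" patterns in a region coding event-salient information.
neural = ratings[:, event_salient] @ rng.normal(size=(10, 50))
neural += 0.1 * rng.normal(size=neural.shape)

# Pairwise dissimilarity structures (condensed upper triangles).
neural_rdm = pdist(neural, metric="correlation")
event_rdm = pdist(ratings[:, event_salient], metric="correlation")
object_rdm = pdist(ratings[:, object_salient], metric="correlation")

# The model built from event-salient features should predict this region's
# neural similarity structure better than the object-salient model.
r_event = spearmanr(event_rdm, neural_rdm)[0]
r_object = spearmanr(object_rdm, neural_rdm)[0]
print(f"event-salient model: {r_event:.2f}, object-salient model: {r_object:.2f}")
```

The same comparison run inside an object-network region (simulated from the object-salient subset) would reverse the ordering, which is the dissociation reported above.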
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://bibliotheek.ehb.be:2222/en-US/products/dissertations/individuals.shtml
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A