Taxonomy of digital curation activities that promote critical thinking
Smart Learning Environments volume 12, Article number: 17 (2025)
Abstract
Critical thinking (CT) consists of a deliberate and reflective process that can lead to informed decisions. It involves scrutinizing the trustworthiness and consistency of underlying assumptions, the sources of data, and the validity of other information. CT embodies deliberate, self-regulated judgment incorporating cognitive abilities such as analysis, evaluation, and inference. In this follow-up study to Gadot & Tsybulsky (2023), we confirm the role of Digital Curation (DC) as a catalyst for CT and present a structured taxonomy of DC practices that enhance CT. This taxonomy is designed to act as a systematic framework for categorizing DC tasks while capturing the personal and social facets of curation activities. The curation workflow is broken down into acquisition and assimilation stages and graded by level of cognitive complexity. This classification can thus be harnessed by educators and practitioners to define the aims and competencies that are critical to fostering CT skills through DC practices.
Introduction
Critical thinking (CT) consists of a pragmatic and introspective approach to assessing the reliability and credibility of information, assumptions, and data sources. It embodies rational and contemplative judgment, allowing individuals to reach conclusions grounded in dependable data (Ennis, 1989, 1993; Kennedy et al., 1991). In today’s digital epoch, the need for CT has increased significantly due to the proliferation of information and the rapid pace of technological advancements, which demand individuals to evaluate, analyze, and synthesize data effectively (Van Laar et al., 2017, 2020). The burgeoning internet landscape has broadened access to diverse types of information; simultaneously, it has amplified the difficulty of differentiating between authentic and spurious content. Individuals often encounter distorted, prejudiced, or inaccurate information within the internet's expansive data troves. CT equips individuals with the tools to appraise the trustworthiness of online sources, scrutinize the information, and recognize potential biases or agendas (Halpern, 1998). CT also underpins effective communication and judicious decision-making. Amidst the proliferation of social media and digital platforms, the susceptibility to influence and the dissemination of misinformation are prevalent concerns (Roozenbeek et al., 2022). CT empowers individuals to cognitively express their thoughts and navigate decision-making by evaluating evidence and weighing various viewpoints (Saadé et al., 2012).
CT is crucial not only in societal and professional contexts but also in academia. It bolsters students' capabilities to assimilate novel concepts, question preconceptions, and foster inventive and autonomous thought processes. CT is a prerequisite for college students to cultivate research acumen, achieve proficiency in data analysis, and generate evidence-based reasoning to substantiate their arguments (Pithers & Soden, 2000). These competencies are particularly critical in an educational landscape increasingly shaped by digital tools and platforms.
Digital curation (DC), defined as the systematic process of gathering, organizing, and evaluating digital resources, serves as both a beneficiary of and a vehicle for CT. DC tasks—such as filtering, synthesizing, and classifying information—demand higher-order thinking skills, mirroring the core components of CT (Gadot & Tsybulsky, 2023). At the same time, engaging in DC fosters the incremental development of CT by providing learners with structured opportunities to practice evaluation, analysis, and inference within a meaningful context.
Building on our prior work, this study explores the intricate relationship between DC and CT. In our previous study (Gadot & Tsybulsky, 2023), we proposed the DC2CT model, which outlines how the iterative processes of DC align with and promote CT. Specifically, we argued that DC facilitates the development of CT through a recursive interplay of cognitive activities, such as assessing the reliability of sources and synthesizing diverse perspectives into coherent digital collections. In this follow-up study, we introduce a stratified taxonomy similar to those found in biology, where organisms are arranged hierarchically according to their shared traits (Black, 2008), akin to Bloom’s Taxonomy, which organizes learning objectives into hierarchical levels of complexity and specificity (Bloom, 1994; Forehand, 2010). Below, we detail this taxonomy, which classifies the DC activities hierarchically according to their CT complexity.
Research gap and study objectives
While DC has been recognized as a valuable pedagogical practice (Dayan & Tsybulsky, 2024), existing studies have primarily focused on its outcomes, such as promoting personalized learning (Tsybulsky, 2020), fostering CT (Gadot & Tsybulsky, 2023), and enhancing digital literacy (e.g., Aguilar-Peña et al., 2022). However, these studies lack a systematic framework that links specific DC activities with the cognitive processes underlying CT. This gap limits our ability to understand how DC can be harnessed to scaffold the gradual development of CT skills, particularly in structured and measurable ways.
The current study addresses this lacuna by introducing a hierarchical taxonomy of DC activities, grounded in theoretical models of CT complexity (Davies, 2015) and informed by practical educational needs. This taxonomy provides a dual contribution: it enriches the theoretical foundation of DC by mapping its alignment with CT processes and offers a practical framework for educators to design and evaluate DC practices systematically. Through this structured approach, we aim to illuminate the pathways through which students progress from basic cognitive tasks, such as sorting and filtering information, to more complex activities, such as synthesizing and critiquing data within collaborative and social contexts.
Two research questions were addressed to construct the taxonomy:
RQ1: What is the optimal way to classify curators' actions as a function of the cognitive level of complexity required?
RQ2: Which taxonomic structure best captures the participants' curatorial activities?
Theoretical background
Digital curation
DC systematically employs digital tools to assemble digital compilations by identifying and assessing digital material's authenticity, dependability, and veracity. The curator undertakes content classification, preservation, and extraction, formulates novel conceptual frameworks, and disseminates these materials (Gadot, 2017; Forkosh Baruch & Gadot, 2021). Curators reflect their unique viewpoints and craft digital compilations that provide a comprehensive, content-rich perspective on a topic (Gadot, 2017; Forkosh Baruch & Gadot, 2021).
Curators orchestrate their DC endeavors as a recursive, iterative process. They first establish search parameters that resonate with their personal context (Tsybulsky, 2020). Then, they examine media resources, appraise the content for its merits and integrity, and assemble items into a coherent digital narrative targeted at specific audiences (Mihailidis, 2015; Mihailidis & Cohen, 2013). DC is both personal and social, since it involves the efforts of a collective rather than a solitary actor. This results in the co-creation of knowledge within digital compendia (Gadot & Levin, 2014).
Empirical investigations have consistently shown that DC is a vital facet of digital literacy and a transformative cultural agent (Mihailidis, 2015; Mihailidis & Cohen, 2013; Mihailidis & Fromm, 2014; Mills, 2013; Ungerer, 2016; Forkosh Baruch & Gadot, 2021). It involves users in higher-order cognitive processes, such as curating content, sifting through data, and synthesizing disparate information into a cohesive account (Antonio & Tuffley, 2015; Antonio et al., 2012; Minocha & Petre, 2012). DC is thus an instrumental approach for cultivating CT abilities in the contemporary knowledge era (Gadot & Tsybulsky, 2023).
DC in education encompasses the application of DC to learning, pedagogy, and the development of individual students’ comprehension of subject matter. As a didactic strategy, DC positions learners at the forefront by enhancing their digital competency, equipping them to handle the deluge of digital data, strengthening their modern-day skills, and integrating cognitive, metacognitive, and social abilities (Mihailidis & Fromm, 2014; Forkosh Baruch & Gadot, 2021). DC is a rich and dense educational activity, prompting significant, personalized, emotional, and intellectual learning by empowering the curator to forge a personal thematic structure (Tsybulsky, 2020).
Progress in technology is having a profound impact on the fabric of society (Levin & Kojukhov, 2009). As advances continue to unfold, integrating technology into the classroom has become pivotal, as is catering to students' proclivities, competencies, and necessities. Given the growing consumption of digital knowledge and nearly limitless access to information, educational paradigms are also transforming. Learning is transitioning towards more personalized, inventive, and informal methods; learners are increasingly influenced by in-person and online peers, not only by pedagogical authorities. This evolution suggests that integrating DC into educational agendas would be advantageous (Gadot & Levin, 2012).
One of the benefits of tapping social media in education relates to the interactive and communicative learning it facilitates among peers (Barak, 2017). Vygotsky stressed the criticality of social contexts in education and argued that the interplay of individual and communal elements enhances learning (Vygotsky, 2012). In the context of DC, the educator collaborates in the learning journey, not as the exclusive informant but as a facilitator, by encouraging students to construct and share knowledge collectively and independently. Thus, DC in education merges the advantages of digital tools that can process copious amounts of information for advanced communication with the human capacity to utilize information cognitively for educational ends (Gadot, 2017).
Curators impart new meaning to data by interpreting it through fresh perspectives (Antonio & Tuffley, 2015; Minocha & Petre, 2012). The Five C’s of DC, as suggested by Deschaine and Sharma (2015), constitute a set of progressive stages termed collection, categorization, critiquing, conceptualization, and circulation, where each phase leads seamlessly to the next. In these phases, content is meticulously refined and vetted for accuracy, culminating in a distinct digital compilation. Curators gather, evaluate, reassess, reorganize, and ultimately share data, thus reinforcing digital literacy and fostering critical inquiry, evaluative judgment, knowledge structuring, and credibility assessment, all of which correspond to the needs of the twenty-first-century learner.
Critical thinking
CT skills are pivotal to developing the competencies needed to observe, contemplate, investigate, interrogate, and methodically resolve scenarios within a scientific context (Demir, 2015). CT encapsulates efficacious, innovative, and autonomous reasoning, decision-making, and problem resolution. Efficacy pertains to considering all aspects of an issue and arriving at logical conclusions; innovation is engendered by finding solutions to intricate problems; autonomy emerges when learners seek divergent avenues beyond conventional pedagogy (Willingham, 2007). Willingham posited that robust CT is contingent on sufficient knowledge in the relevant domain and frequent practice, which suggests that CT is best cultivated within the framework of specific subject areas.
Osborne (2014) associated CT with higher-order cognitive functions such as evaluation, critique, and synthesis. Similarly, Pedrosa-de-Jesus et al. (2014) posited that CT represents a prime intellectual capability and is an indispensable competence within the ambit of higher education.
Our taxonomy below adopts the Delphi Project's definition of CT as "purposeful, self-regulatory judgment resulting in interpretation, analysis, evaluation, and inference…" (Facione, 1990), which emphasizes the engagement of self-regulation and metacognitive strategies for inquiry, dialogue, and knowledge acquisition. As delineated by Dwyer et al. (2014), we interpret CT as manifesting on dual planes. The first involves the application of reflective judgment in the context of complex cognitive activities and learning. The second evolves from the first and covers specific tasks such as analysis, evaluation, and inference within the problem-solving and educational processes. The learner's disposition, readiness, and motivation to apply reflective judgment, along with the capacity to carry out these processes during the educational experience, are crucial to CT.
DC, which involves selecting and scrutinizing potentially pertinent information, formulating judgments, integrating perspectives, and creating digital collections, taps specific skills foundational to CT in the modern information age. These DC activities of evaluation, analysis, and inference correspond to the core components of CT (Gadot & Tsybulsky, 2023). Hürsen (2020) demonstrated the positive effects of problem-based learning supported by Web 2.0 tools on the CT abilities of candidate teachers, and likewise suggested that CT manifests on dual planes, with reflective judgment playing a crucial role in complex cognitive activities and learning. This aligns with recent findings suggesting that digital competence, including CT skills, can predict academic success. This is crucial, especially in the contemporary era, when everyone must be able to appraise credibility and source quality critically and identify misinformation (Cabero-Almenara et al., 2023; Tseng et al., 2021).
Methods
This study implemented a mixed methods research design known as Concurrent Triangulation, in which quantitative and qualitative data are collected, coded, and analyzed simultaneously. Both methods are given equal weight; combining them leverages the strengths of each and increases the reliability of the findings through triangulation (Hanson et al., 2005; Johnson & Onwuegbuzie, 2004; Tashakkori & Creswell, 2007).
Setting and participants
The participants (n = 107) were divided into three groups. Group A (n = 94) consisted of undergraduate students (n = 45) and graduate students (n = 49) from three Israeli universities. Group B consisted of expert curators (n = 5) who curate for professional or academic purposes. Group C consisted of active curators in social networks in various fields (n = 8).
In Group A, three lecturers agreed to take part in the study. They supplied the researchers with non-identifiable, aggregated data on student demographics. The students could opt for pseudonyms for their digital data compilations, although some chose to disclose their names. The participants were heterogeneous and represented a variety of nationalities, faiths, age groups, and genders. All used the web-based curation tool Scoop.it and spent between two weeks and three months creating their digital collections. Throughout, the students were advised to judiciously select information, with a keen eye towards its veracity, currency, and source diversity, and to provide written evaluations. Each digital collection had to comprise at least 20 entries.
For a subset of 47 participants, the curation task was integral to their coursework on a topic of their choice. In addition, 30 participants enrolled in a seminar that gave them approximately four academic hours of theoretical grounding in DC before they embarked on the practical curation endeavor. These students received instruction on the underlying concepts and tenets of DC, supplemented by exemplars of digital collections and an exposition of the curation methodology.
The institutional review board approved this study. Confidentiality and anonymity were respected.
Data sources and collection
Data collection and analysis covered: (a) the DC activity, (b) the content of the digital collections, and (c) the participants' experience with DC. These data were derived from log files of the DC activity, the content of the digital collections, reflective reports, and interviews.
Data on the DC activities for the Group A participants (n = 94) were obtained from automated log files that recorded user engagement and contributions with their digital collections over 101 days, with observations conducted twice daily at 8:00 AM and 8:00 PM. Consequently, each log entry corresponded to an individual's activity within a specified timeframe. The digital collections were cataloged based on their thematic focus. The data points included the total count of published items, the curator's personal commentaries, the number of views received, the tally of items curated outside the established curation feed, the number of items shared by peers, and the frequency of items reposted by the curators into their own collections.
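As a rough illustration of how such twice-daily log entries can be aggregated per curator, the sketch below sums the activity measures described above. The field names (`published`, `views`, and so on) are hypothetical placeholders for illustration, not the actual Scoop.it export schema.

```python
from collections import defaultdict

# Hypothetical log-entry shape: one dict per twice-daily observation window.
# Field names are illustrative, not the platform's real export format.
log_entries = [
    {"curator": "c01", "published": 3, "comments": 1, "views": 40,
     "off_feed": 0, "peer_shares": 2, "reposts": 1},
    {"curator": "c01", "published": 1, "comments": 0, "views": 15,
     "off_feed": 1, "peer_shares": 0, "reposts": 0},
    {"curator": "c02", "published": 2, "comments": 2, "views": 22,
     "off_feed": 0, "peer_shares": 1, "reposts": 0},
]

def aggregate_by_curator(entries):
    """Sum each activity measure per curator across all observation windows."""
    totals = defaultdict(lambda: defaultdict(int))
    for entry in entries:
        for key, value in entry.items():
            if key != "curator":
                totals[entry["curator"]][key] += value
    return {curator: dict(measures) for curator, measures in totals.items()}

totals = aggregate_by_curator(log_entries)
print(totals["c01"]["published"])  # 4 (3 + 1 across the two windows)
```

Per-curator totals of this kind are what the correlation and regression analyses reported below would operate on.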
The digital collections created by all participants (n = 107) were examined in depth to evaluate the quality of the DC practices. This examination was anchored in a set of optimal DC activities defined in the literature. The collections of Group B, the expert Scoop.it curators, served as a benchmark for defining DC quality. Seven specific attributes were examined to appraise curation quality:
1. The suitability of the items for the task.
2. The currency of the information in the items.
3. The trustworthiness of the sources from which the information was derived.
4. The coherence and relevance of the items to one another.
5. The relevance and contribution of the items to the overarching theme.
6. The range of data sources, to ensure a broad spectrum of perspectives.
7. Commentary by the curators.
The reflective reports written by Group A participants at the end of the DC activity (n = 37) were also examined. These reports allowed the participants to describe their experiences and feelings throughout the DC process in their own narrative form; the instructions were simply to write down their reflections on the DC experience.
Semi-structured, in-depth interviews were carried out with the Group C participants (n = 8). The interviews focused on the goals of their curation activities and on how they conducted them. Each interview lasted about 45 min, at a place and time chosen by the interviewee. The interviews were recorded and transcribed to probe the insights and meanings that emerged during the curation process.
Data analysis
Log files of the DC activity and the content of the digital collections
We analyzed the data in two phases: based on preliminary research, we coded the content of the digital collections according to the criteria listed above; then a statistical analysis of the data was conducted. Data coding involved assigning scores ranging from 1 (lowest) to 5 (highest) for criteria 1 to 6, as assessed by a researcher and a university lecturer for all the digital collections. These assessments were then aggregated to derive the average score. To solidify the validity and reliability of the findings, the first 10 digital collections underwent a rigorous joint analysis by the evaluators until they reached full agreement. The interrater agreement was 0.67 (p < 0.01). For criterion 7, concerning personal insights, the percentage of personal insights was calculated along with their quality; the resulting score was the mean of both judges' scores. The criteria used for evaluating personal insights are detailed in Table 1.
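The scoring procedure described above — two raters assigning 1–5 scores per criterion, averaged first within each criterion and then across criteria — can be sketched as follows. The criterion keys are illustrative labels for criteria 1–6, not the study's actual coding sheet.

```python
def collection_score(rater_a, rater_b):
    """Average the two raters' 1-5 scores for each criterion,
    then average across criteria to yield one quality score
    per digital collection."""
    per_criterion = {k: (rater_a[k] + rater_b[k]) / 2 for k in rater_a}
    return sum(per_criterion.values()) / len(per_criterion)

# Illustrative scores from two hypothetical raters for one collection.
a = {"suitability": 5, "currency": 4, "credibility": 4,
     "coherence": 3, "theme_fit": 4, "source_range": 3}
b = {"suitability": 4, "currency": 4, "credibility": 5,
     "coherence": 3, "theme_fit": 4, "source_range": 2}

print(round(collection_score(a, b), 2))  # 3.75
```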
Statistical analysis
The analysis was performed using IBM SPSS statistical software. It included a factor analysis of the curators' activities, and a MANOVA was applied to test for significant differences between undergraduate and graduate students and between students who had taken the DC course before the DC activity and students who had received no instruction.
Reflective reports and interviews
Altogether, the 37 reflective reports and eight interviews resulted in 45 transcriptions. The transcriptions were analyzed according to the principles of Grounded Theory (Strauss & Corbin, 1990) using both a bottom-up and a top-down directed content analysis method (Hsieh & Shannon, 2005) to identify different aspects of the DC activity.
Results
The findings are presented in two subsections to correspond to each research question.
The classification of DC activity as a function of the cognitive levels of CT
To explore the first research question, we analyzed the DC activity data and the content of the digital collections. The curators' activities were personal and social. Therefore, they were divided into the curators’ personal activities and the curators’ social activities. The analysis yielded three categories of personal DC activities: 1. Sorting, filtering, and judging information items. This included the curators' assessment of digital item quality, verifying the credibility of the source, and the item's suitability for the topic of the digital collection and its contribution or innovation. 2. Combining new items and integrating them coherently and judiciously into the data collection. 3. Adding the curators' perspectives to the data items. The analysis yielded two types of social activities: active knowledge-sharing with curators in the knowledge community, and following peer curators; i.e., building a knowledge community about the DC topic, commenting on other curators' publications, and providing recommendations to other curators on items appropriate to their curatorial topic.
As shown in Table 2, the findings indicated that the quality of the Group A novice curators' work followed a hierarchy in the level of complexity of their DC activities. Most personal activities were rated as high quality. Social activities were rated lower, in that some students shared knowledge but did not actively sustain a knowledge community. The mean score for personal activity (3.53) was thus much higher than the mean score for social activity (1.82). Among the personal actions, the highest score was for matching the published items to the topic of the DC (4.41); the second highest score was for the quality of the combination of the data items (3.82), and the lowest score was for adding personal insights to the digital collections (2.35).
Table 3 compares the mean social activity of the expert curators (Group B) to that of the novice curators (Group A). The differences were notable: social activity was a significant component of the experts' DC activity, whereas only a few Group A curators engaged in social and community-oriented activities.
The results of a Pearson correlation to examine the relationship between personal and social activity and the quality of the novice curators' digital collections appear in Table 4. Personal activity was significantly related to the quality of the digital collection such that increases in personal activity were directly related to the quality of the digital collection. There was no correlation between amount of social activity and the digital collection quality.
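For readers who want to reproduce this kind of analysis outside SPSS, a Pearson correlation can be computed directly from its definition. The activity and quality values below are invented for illustration only; they are not the study's data.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative values only: personal-activity scores vs. collection quality.
personal = [2.1, 3.0, 3.5, 4.0, 4.4]
quality  = [2.0, 2.8, 3.6, 3.9, 4.5]

print(round(pearson_r(personal, quality), 3))  # close to +1: strong positive link
```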
A simple regression test was conducted to examine whether the levels of personal and social activity would predict the quality of the digital collection. The digital collection quality variable was included in the regression as the dependent variable; personal and social activity were included as independent variables. Table 5 presents the standardized and non-standardized coefficients of the regression variables. The findings indicated that personal activity and social activity explained 52% of the variance in the quality of the digital collection (F(2,91) = 49.26, p < 0.001). Of the predictive variables, only personal activity significantly contributed to the variability in the quality of the digital collection.
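A two-predictor ordinary least squares fit of the kind reported in Table 5 can be sketched in pure Python by solving the normal equations. The data below are synthetic (an exactly linear relationship, so R² comes out as 1.0) and bear no relation to the study's values.

```python
def ols_r2(y, x1, x2):
    """Fit y = b0 + b1*x1 + b2*x2 by solving the 3x3 normal equations
    with Gaussian elimination, and return R-squared."""
    n = len(y)
    X = [[1.0, a, b] for a, b in zip(x1, x2)]
    # Build X'X and X'y.
    xtx = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(3)]
           for r in range(3)]
    xty = [sum(X[i][r] * y[i] for i in range(n)) for r in range(3)]
    # Forward elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, 3):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, 3):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    # Back substitution.
    beta = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c]
                                for c in range(r + 1, 3))) / xtx[r][r]
    yhat = [beta[0] + beta[1] * a + beta[2] * b for a, b in zip(x1, x2)]
    ybar = sum(y) / n
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Synthetic, exactly linear data: quality = 1 + 2*personal + 0.5*social.
personal = [1, 2, 3, 4, 5]
social = [2, 1, 4, 3, 6]
quality = [1 + 2 * a + 0.5 * b for a, b in zip(personal, social)]
print(round(ols_r2(quality, personal, social), 6))  # 1.0 for exactly linear data
```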
Next, potential differences between the background variables and the quality of the digital collection, and between personal and social activity, were examined. A MANOVA was conducted to test for differences between the undergraduate and the graduate students in the quality of their digital collections, personal activity, and social activity. There were significant differences between the undergraduate and graduate students overall (F(3,90) = 8.96, p < 0.001). The two groups only differed significantly for social activity (F(1,92) = 24.62, p < 0.001). The graduate students' mean values for social activity were higher (M = 35.01, SD = 37.93) than those of the undergraduate students (M = 9.96, SD = 11.39). No differences were found between groups for personal activity (F(1,78) = 0.684, p = 0.411) or the quality of the digital collection (F(1,78) = 0.009, p = 0.924).
Another MANOVA was conducted to test for differences between students who enrolled in a DC training course prior to making their collections and students who did not, in terms of the quality of their digital collection, personal activity, and social activity. There were significant differences between the groups overall (F(3,90) = 10.58, p < 0.001). Specifically, the groups only differed in the amount of social activity (F(1,92) = 24.64, p < 0.001). Students who enrolled in the training course had higher mean values for social activity (M = 39.01, SD = 42.18) compared with students who did not (M = 11.03, SD = 12.71). No differences were found between groups for personal activity (F(1,78) = 0.187, p = 0.666) or the quality of the digital collection (F(1,78) = 1.971, p = 0.164).
Thus overall, the Group A novice curators had more proficiency in personal than in social activities, with notable disparities between their performance and that of the Group B expert curators in social-community activities.
The correlation and regression analyses revealed a significant relationship between personal activities and the quality of digital collections and indicated that personal engagement was a critical factor impacting quality. The statistical tests also highlighted differences between undergraduate and graduate students and between participants who enrolled in the DC training course and those who did not, particularly in terms of social activities. These findings thus point to the importance of instructional support in enhancing curators' engagement and the quality of DC.
Taxonomic structure of the participants' curatorial activities
Based on the RQ1 results, which showed a hierarchy in terms of the curators' activity, a qualitative analysis was conducted to increase reliability through triangulation. The qualitative data analysis served to better understand the quantitative findings. Analysis of the transcriptions identified four categories—data item properties, the curator, the target audience, and ‘other’—covering different reasons why curators published particular data items and not others. Table 6 lists the issues that motivated the curators to publish information and sample quotes that illustrate these considerations.
There were more personal statements and insights in the interviews (Group C) than in the reflective reports (Group A). The quantitative analysis showed that these activities were considered relatively complex, were not performed routinely, and were of lower quality than those of the expert curators. By contrast, many expert curators often added personal statements: "I almost always add comments, just a few words so that [they] understand what made me disseminate. [I] rarely share without adding my own remarks" (b); "I add a statement in every publication, and sometimes I publish the same item in different media and add different comments according to the medium or the target audience" (e). Some added personal statements only if they felt it was necessary: "I add comments if the published content is ambiguous" (a). The expert curators considered curation above all a social activity that involves active participation in the knowledge community. They often mentioned the social aspects of sharing, commenting, and following peer curators. Three of them noted the quality of the social side as the most crucial consideration when choosing the curation platform: "I choose the media according to the target audience. If I curate information in a specific field, I do it in a media where there is an active community on this topic" (a).
To develop the taxonomy of curatorial activities, the quantitative findings forming a hierarchy of curatorial actions were combined with the qualitative analysis to enhance understanding through triangulation. The qualitative analysis revealed four motivational categories influencing curators' publishing decisions: the properties of the data items, curator characteristics, audience considerations, and other factors, as presented in Table 6. Differences between novice and expert curators were observed, such that the experts frequently integrated personal statements into their publications and viewed curation as a social activity within their knowledge communities.
Discussion and implications
This study probed the hypothesis that DC significantly enhances learners' CT skills. The importance of CT in education, especially with the growing influence of AI and LLMs in our lives, highlights the need for pedagogical approaches that focus on teaching students not only to absorb information but to question and evaluate it actively (Wu, 2023). This approach is particularly critical given that scientific misinformation can have significant consequences. Thus, CT emerges as a foundational skill in both scientific education and broader societal contexts. Researchers have voiced the pressing need for educational practices that foster CT and epistemic vigilance to prepare the younger generations to scrutinize and reason about the wealth of content they encounter, which is crucial to developing informed citizens and scientists (Bielik & Krüger, 2024; Tseng et al., 2021).
The findings here contribute to the literature showing that DC is an essential component of digital literacy and a cultural game-changer that promotes 21st-century skills (Mihailidis, 2015; Mihailidis & Cohen, 2013; Mihailidis & Fromm, 2014; Mills, 2013; Ungerer, 2016; Forkosh Baruch & Gadot, 2021). In a previous study, we discussed the inherent potential of DC as a pedagogical approach to promoting CT (Gadot & Tsybulsky, 2023). In the current study, we added a taxonomy that will help use DC as a CT promoter in a structured and efficient way.
The results showed that new curators gradually internalize DC activities beginning with the first stage, which corresponds to the lowest level of complexity. The curators first assimilate the actions we defined as personal activity, namely sorting, filtering, and judging information items: assessing the quality of digital items, verifying the credibility of their sources, and judging each item's suitability for the topic of the digital collection and its contribution or innovation. The next level consists of combining new items and integrating them coherently and judiciously into the data collection. The third level consists of adding the curators' perspectives to the data items, such as their opinion of the item and its contribution to the DC topic. The expert curators incorporated more social activities, which tend to be more complex, including active knowledge sharing with peers in the knowledge area and following other curators. These activities involved building a knowledge community around the DC topic, commenting on other curators' publications, and providing recommendations on information items relevant to their curatorial focus.
The findings indicated that novice curators gain proficiency in DC activities commensurate with the complexity levels of CT in Davies's (2015) five-step model. The three stages of personal activity emerged at the lower levels of complexity, whereas the two stages of social activity were primarily evident in the work of the expert curators.
Overall, DC emerged as a process that requires CT in that it involves analysis, evaluation, expressing a critical opinion, and other skills (Gadot & Tsybulsky, 2023). These findings confirm the hypothesis that DC activities promote CT skills by engaging learners in a multilayered curation process that parallels the cognitive complexity of CT outlined by Davies (2015). Novice curators begin with personal activities such as evaluating, sorting, and filtering information, which are foundational to CT. As they progress, they engage in more complex social activities, which include knowledge sharing and community building within the domain of DC. This progression not only corroborates the inherent potential of DC as a pedagogical tool but also substantiates its structured application through the taxonomy by enabling a gradual enhancement of CT from basic understanding to advanced application in collaborative settings.
This five-stage DC taxonomy is shown in Fig. 1.
Theoretically, the findings enrich the discourse on DC and CT by proposing a taxonomy that delineates the stages of cognitive engagement in DC activities, thereby bridging a crucial gap in the literature. This taxonomy highlights the alignment between the core processes of CT—evaluation, analysis, and inference—and the iterative stages of DC. It provides a structured framework for understanding the cognitive and metacognitive demands of DC tasks.
Methodologically, the study advances the field by employing a hybrid approach combining quantitative CT measures with qualitative analyses of DC practices. This mixed-methods approach offers a more robust and triangulated perspective, enabling researchers to capture the multifaceted nature of DC as it relates to CT development. By integrating diverse data sources, this methodology provides a richer and more nuanced understanding of the interplay between curation tasks and CT processes, offering a replicable model for future studies.
Practically, the proposed taxonomy has significant implications for educators, serving as a valuable framework for designing and implementing educational activities aligned with the developmental stages of CT skills. For example, educators can use the taxonomy to scaffold students' progression from lower-order to higher-order cognitive skills within DC tasks, ensuring that the learning activities are appropriately challenging and effective. Additionally, the taxonomy underscores the pedagogical potential of DC in the digital age, positioning it as a powerful tool for fostering twenty-first-century skills such as information literacy, digital literacy, and CT.
The taxonomy also constitutes a pivotal tool for researchers and educators alike. For researchers, it offers a structured framework to evaluate participants' curatorial stages, assess their cognitive engagement in DC tasks, and measure educational progress pre- and post-intervention. Moreover, the taxonomy can inform the design of empirical studies to explore the long-term effects of DC on students' CT abilities and broader educational outcomes. For educators, it provides a practical scaffold to craft and implement learning interventions that leverage the cognitive and social dimensions of DC. The taxonomy makes several contributions to teaching, as listed below:
1. Formulating and acknowledging learners' skills to engage in a quality educational social curation task.
2. Planning a gradual learning activity to support the students' incorporation of CT goals.
3. Monitoring the students' achievements in acquiring and assimilating the required skills.
Ultimately, by highlighting the interconnections between DC and CT, this study contributes to the growing recognition of DC as a transformative pedagogical practice in the digital era.
Limitations and directions for future work
This study has several limitations. The sample consisted of university students from Israel, who are predominantly native speakers of Hebrew or Arabic. However, their digital collections were in English, which is not their first language. Future inquiries should explore the implementation of DC as an educational method with participants using material in their mother tongue. In addition, given that DC represents a viable instructional activity for a broader age spectrum, future research should assess its effectiveness as an educational strategy to bolster CT in middle and high school students. The role of the teacher in the DC process should also be examined.
Future research can build on the findings by exploring the application of the taxonomy in diverse educational settings and disciplines to determine its generalizability and adaptability. The long-term impact of DC on learners' CT skills could be explored through longitudinal studies. Experimental research could be designed to test the efficacy of specific pedagogical interventions based on the stages of the taxonomy to enhance students' CT skills. Another promising avenue would be the exploration of the interplay between DC activities and other twenty-first-century skills, such as collaborative problem-solving and digital communication. Researchers could also consider the role of emerging technologies, such as artificial intelligence, in supporting and advancing curatorial and CT practices. Finally, comparative studies between novice and expert curators could yield more profound insights into the progression of skill development and inform targeted instructional strategies for different levels of expertise.
Availability of data and materials
Due to the classroom nature of this research, the participants did not agree to authorize the public sharing of their data; hence, the supporting data are not available.
References
Aguilar-Peña, J. D., Rus-Casas, C., Eliche-Quesada, D., Muñoz-Rodríguez, F. J., & La Rubia, M. D. (2022). Content curation in e-learning: A case of study with Spanish engineering students. Applied Sciences, 12(6), 3188.
Antonio, A., Martin, N., & Stagg, A. (2012). Engaging higher education students via digital curation. In M. Brown, M. Hartnett, & T. Stewart (Eds.), Future challenges, sustainable futures: Proceedings of the 29th annual conference of the Australasian Society for Computers in Learning in Tertiary Education (ASCILITE 2012) (pp. 55–59). University of Southern Queensland.
Antonio, A. B., & Tuffley, D. (2015). Promoting information literacy in higher education through digital curation. M/C Journal, 18(4), 1–11. https://doi.org/10.5204/mcj.987
Barak, M. (2017). Cloud pedagogy: Utilizing web-based technologies for the promotion of social constructivist learning in science teacher preparation courses. Journal of Science Education and Technology, 26(5), 459–469. https://doi.org/10.1007/s10956-017-9686-5
Bielik, T., & Krüger, D. (2024). Perceived relevance of critical thinking aspects for biology graduate students. Journal of Biological Education, 58(1), 166–181. https://doi.org/10.1080/00219266.2022.2026806
Black, B. (2008, September). Critical thinking–a definition and taxonomy for Cambridge Assessment: supporting validity arguments about critical thinking assessments administered by Cambridge Assessment. Paper presented at the 34th International Association of Educational Assessment Annual Conference, Cambridge, UK.
Bloom, B. (1994). Reflections on the development and use of the taxonomy. In L. Anderson & L. Sosniak (Eds.), Bloom's taxonomy: A forty-year retrospective (pp. 1–8). The National Society for the Study of Education.
Cabero-Almenara, J., Gutiérrez-Castillo, J. J., Guillén-Gámez, F. D., et al. (2023). Digital competence of higher education students as a predictor of academic success. Technology, Knowledge and Learning, 28, 683–702. https://doi.org/10.1007/s10758-022-09624-8
Davies, M. (2015). A model of critical thinking in higher education. In M. Paulsen (Ed.), Higher education: Handbook of theory and research (pp. 41–92). Springer.
Dayan, E., & Tsybulsky, D. (2024). Designing and teaching socio-scientific issues online: Digital curation in the classroom. International Journal of Science Education. https://doi.org/10.1080/09500693.2024.2381133
Demir, S. (2015). Evaluation of critical thinking and reflective thinking skills among science teacher candidates. Journal of Education and Practice, 6(18), 17–21.
Deschaine, M. E., & Sharma, S. A. (2015). The five Cs of digital curation: Supporting twenty-first-century teaching and learning. InSight: A Journal of Scholarly Teaching, 10, 19–24.
Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills and Creativity, 12, 43–52. https://doi.org/10.1016/j.tsc.2013.12.004
Ennis, R. H. (1989). Critical thinking and subject specificity: Clarification and needed research. Educational Researcher, 18(3), 4–10. https://doi.org/10.3102/0013189X018003004
Ennis, R. H. (1993). Critical thinking assessment. Theory into Practice, 32(3), 179–186. https://doi.org/10.1080/00405849309543594
Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. The Delphi Report. California Academic Press.
Forkosh Baruch, A., & Gadot, R. (2021). Social curation experience: Towards authentic learning in preservice teacher training. Technology, Knowledge and Learning, 26(1), 105–122. https://doi.org/10.1007/s10758-020-09449-3
Forehand, M. (2010). Bloom’s taxonomy. Emerging Perspectives on Learning, Teaching, and Technology, 41(4), 47–56.
Gadot, R. (2017). Social curation as learning activity [Doctoral dissertation, Tel Aviv University]. Supervised by I. Levin. [In Hebrew with abstract in English]
Gadot, R., & Levin, I. (2012). Digital curation as learning activity. In Proceedings of EDULEARN12 (pp. 6038–6045). Barcelona, Spain.
Gadot, R., & Levin, I. (2014). Networked learning based on digital curation. In ECSM 2014 European Conference on Social Media (p. 635). Brighton, England.
Gadot, R., & Tsybulsky, D. (2023). Digital curation as a pedagogical approach to promote critical thinking. Journal of Science Education and Technology, 32(6), 814–823. https://doi.org/10.1007/s10956-022-10016-x
Halpern, D. F. (1998). Teaching critical thinking for transfer across domains: Disposition, skills, structure training, and metacognitive monitoring. American Psychologist, 53(4), 449–455. https://doi.org/10.1037/0003-066X.53.4.449
Hanson, W. E., Creswell, J. W., Clark, V. L. P., Petska, K. S., & Creswell, J. D. (2005). Mixed methods research designs in counseling psychology. Journal of Counseling Psychology, 52(2), 224. https://doi.org/10.1037/0022-0167.52.2.224
Hsieh, H. F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277–1288.
Hürsen, C. (2020). The effects of problem-based learning supported by Web 2.0 tools on the critical thinking abilities of teacher candidates. Technology, Knowledge and Learning, 25(3), 503–521. https://doi.org/10.1007/s10758-020-09458-2
Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14–26. https://doi.org/10.3102/0013189X033007014
Kennedy, M., Fisher, M. B., & Ennis, R. H. (1991). Critical thinking: Literature review and needed research. In L. Idol & B. F. Jones (Eds.), Educational values and cognitive instruction: Implications for reform (pp. 11–40). Lawrence Erlbaum & Associates.
Levin, I., & Kojukhov, A. (2009). Personalizing education in postindustrial society [conference presentation abstract]. Third International Conference on Digital Society, Cancun, Mexico. https://doi.org/10.1109/icds.2009.13
Mihailidis, P. (2015). Digital curation and digital literacy: Evaluating the role of curation in developing critical literacies for participation in digital culture. E-Learning and Digital Media, 12(5–6), 443–458. https://doi.org/10.1177/2042753016631868
Mihailidis, P., & Cohen, J. N. (2013). Exploring curation as a core competency in digital and media literacy education. Journal of Interactive Media in Education, 1, 2. https://doi.org/10.5334/2013-02
Mihailidis, P., & Fromm, M. E. (2014). Scaffolding curation: Developing digital competencies in media literacy education. In M. Stocchetti (Ed.), Media and education in the digital age: Concepts, assessments, subversions (pp. 91–104). Frankfurt am Main, Germany.
Mills, M. S. (2013). Facilitating multimodal literacy instruction through digital curation. In J. Whittingham, S. Huffman, W. Rickman, & C. Wiedmaier (Eds.), Technological tools for the literacy classroom (pp. 46–63). IGI Global.
Minocha, S., & Petre, M. (2012). Handbook of social media for researchers and supervisors. The Open University.
Osborne, J. (2014). Teaching critical thinking? New directions in science education. School Science Review, 95(352), 53–62.
Pedrosa-de-Jesus, H., Moreira, A., Lopes, B., & Watts, M. (2014). So much more than just a list: Exploring the nature of critical questioning in undergraduate sciences. Research in Science & Technological Education, 32(2), 115–134. https://doi.org/10.1080/02635143.2014.902811
Pithers, R. T., & Soden, R. (2000). Critical thinking in education: A review. Educational Research, 42(3), 237–249.
Roozenbeek, J., Van Der Linden, S., Goldberg, B., Rathje, S., & Lewandowsky, S. (2022). Psychological inoculation improves resilience against misinformation on social media. Science Advances, 8(34), eabo6254. https://doi.org/10.1126/sciadv.abo6254
Saadé, R. G., Morin, D., & Thomas, J. D. (2012). Critical thinking in E-learning environments. Computers in Human Behavior, 28(5), 1608–1617. https://doi.org/10.1016/j.chb.2012.03.025
Strauss, A., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Sage Publications.
Tashakkori, A., & Creswell, J. W. (2007). Editorial: The new era of mixed methods. Journal of Mixed Methods Research, 1(1), 3–7.
Tseng, A. S., Bonilla, S., & MacPherson, A. (2021). Fighting “bad science” in the information age: The effects of an intervention to stimulate evaluation and critique of false scientific claims. Journal of Research in Science Teaching, 58(8), 1152–1178. https://doi.org/10.1002/tea.21696
Tsybulsky, D. (2020). Digital curation for promoting personalized learning: A study of secondary-school science students’ learning experiences. Journal of Research on Technology in Education, 52(3), 429–440. https://doi.org/10.1080/15391523.2020.1728447
Ungerer, L. (2016). Digital curation as a core competency in current learning and literacy: A higher education perspective. International Review of Research in Open and Distributed Learning: IRRODL, 17(5), 1–27. https://doi.org/10.19173/irrodl.v17i5.2566
Van Laar, E., Van Deursen, A. J., Van Dijk, J. A., & De Haan, J. (2017). The relation between 21st-century skills and digital skills: A systematic literature review. Computers in Human Behavior, 72, 577–588. https://doi.org/10.1016/j.chb.2017.03.010
Van Laar, E., Van Deursen, A. J., Van Dijk, J. A., & De Haan, J. (2020). Determinants of 21st-century skills and 21st-century digital skills for workers: A systematic literature review. SAGE Open, 10(1), 2158244019900176.
Vygotsky, L. S. (2012). Thought and language. MIT Press (Originally published in 1934).
Willingham, D. T. (2007). Critical thinking: Why is it so hard to teach? American Educator, Summer 2007, 8–19.
Wu, Y. (2023). Integrating generative AI in education: How ChatGPT brings challenges for future learning and teaching. Journal of Advanced Research in Education, 2(4), 6–10.
Acknowledgements
We extend our gratitude to the students and lecturers who collaborated with us on this study. We also thank Prof. Ilya Levin for his valuable contributions.
Funding
The authors declare that no funds, grants, or other support were received during the preparation of this manuscript.
Author information
Authors and Affiliations
Contributions
RG: Conceptualization, Data Collection and Analysis, Investigation, Project Administration, Writing—Original Draft Preparation. DT: Conceptualization, Investigation, Project Administration, Writing—Review & Editing.
Corresponding author
Ethics declarations
Ethics approval and consent to participate
This study was approved by the authors’ IRB. The participants were informed of the research goals and procedure and indicated their willingness to participate by signing a written informed consent form.
Competing interests
The authors have no conflicts of interest to declare that are relevant to the content of this article.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Gadot, R., Tsybulsky, D. Taxonomy of digital curation activities that promote critical thinking. Smart Learn. Environ. 12, 17 (2025). https://doi.org/10.1186/s40561-025-00365-6