ERIC Number: ED644753
Record Type: Non-Journal
Publication Date: 2023
Pages: 312
Abstractor: As Provided
ISBN: 979-8-3814-0554-5
ISSN: N/A
EISSN: N/A
Construct Validation of a Learning Resources Rubric [LRR]: A Modified Delphi Study
Lei Wang
ProQuest LLC, Ph.D. Dissertation, Syracuse University
Central to the landscape of education is the understanding that instruction is fundamentally purposeful, aiming to facilitate effective learning. Grounded in this premise, instructional design emerges as a complex and multifaceted profession. In the digital era, Instructional Designers (IDs) harness technological advancements to shape robust learning resources. However, the burgeoning array of resources, each varying in efficacy, creates a pressing need for designs firmly anchored in evidence-based learning principles. The challenges IDs encounter, especially in crafting resources that accentuate deeper learning, cannot be overlooked. Existing rubrics, while extensive, often fall short in addressing the nuances imperative for designing interactive and engaging learning resources. To bridge this theory-to-practice chasm, this study proposes the Learning Resources Rubric (LRR), a tool tailored for IDs. Derived from the principles of three well-established deeper learning theories, Generative Learning Theory (GLT), Cognitive Flexibility Theory (CFT), and Reflection Theory (RT), the LRR offers a scaffold that elucidates the design, selection, and evaluation of learning resources, fostering deeper learning. The genesis of the LRR can be traced to the Research in Designing Learning Resources (RIDLR) working group, a confluence of scholars and practitioners who, through an integrative inquiry approach, navigate the creation of resources informed by GLT, CFT, RT, and the overarching theme of learner engagement. In this dissertation, an online three-round modified Delphi method was used to validate the LRR (Version 3), built upon a prior study (Wang & Koszalka, 2023). Of 576 potential expert IDs identified via social media, a university alumni listserv, and referrals, 351 completed Round 1. Given the consensus reached in Round 1, Round 2 was bypassed, and Round 3 proceeded with 6 focus groups comprising 22 experts. Round 1 featured a 74-item Qualtrics survey, rated on a 5-point Likert scale, that provided insights into the LRR's validity. Round 3's focus group interviews, guided by open-ended questions, gathered qualitative data on the rubric's integration into instructional practices. The mix of quantitative and qualitative data collected across the two rounds helped establish the rubric's validity, and multiple methods, including surveys, focus groups, and online document analysis, cross-validated the findings. The dissertation honed the LRR, establishing its construct validity and its potential to guide IDs in applying it in real-life scenarios. Construct validity, rubric scores, and insights from surveys and focus groups are highlighted. Data analyzed via Confirmatory Factor Analysis (CFA) and Structural Equation Modeling (SEM) supported the measurement model of the LRR indicators, with SEM underscoring their theoretical alignment with the GLT, CFT, and RT frameworks. The findings solidified the LRR's effectiveness in steering IDs to enhance learning resources. The updated rubric and suggested next steps hold the potential to advance instructional design practice and scholarship. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone: 1-800-521-0600. Web page: http://bibliotheek.ehb.be:2222/en-US/products/dissertations/individuals.shtml.]
Descriptors: Educational Resources, Scoring Rubrics, Construct Validity, Instructional Design, Instructional Effectiveness, Evidence Based Practice, Educational Practices
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site: http://bibliotheek.ehb.be:2222/en-US/products/dissertations/individuals.shtml
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A