ERIC Number: EJ1181941
Record Type: Journal
Publication Date: 2018
Pages: 17
Abstractor: As Provided
ISBN: N/A
ISSN: N/A
EISSN: EISSN-2196-7822
Available Date: N/A
Participation and Performance on Paper- and Computer-Based Low-Stakes Assessments
Nissen, Jayson M.; Jariwala, Manher; Close, Eleanor W.; Van Dusen, Ben
International Journal of STEM Education, v5, Article 21, 2018
Background: High-stakes assessments, such as the Graduate Record Examination, have transitioned from paper to computer administration. Low-stakes research-based assessments (RBAs), such as the Force Concept Inventory, have only recently begun this transition to computer administration with online services. These online services can simplify administering, scoring, and interpreting assessments, thereby reducing barriers to instructors' use of RBAs. By supporting instructors' objective assessment of the efficacy of their courses, these services can stimulate instructors to transform their courses to improve student outcomes. We investigate the extent to which RBAs administered outside of class with the online Learning About STEM Student Outcomes (LASSO) platform provide data equivalent to tests administered on paper in class, in terms of both student participation and performance. We use an experimental design to investigate the differences between these two assessment conditions with 1310 students in 25 sections of 3 college physics courses spanning 2 semesters. Results: Analysis conducted using hierarchical linear models indicates that student performance on low-stakes RBAs is equivalent for online (out-of-class) and paper-and-pencil (in-class) administrations. The models also show differences in participation rates across assessment conditions and student grades, but indicate that instructors can achieve participation rates for online assessments equivalent to those for paper assessments by offering students credit for participating and by providing multiple reminders to complete the assessment. Conclusions: We conclude that online, out-of-class administration of RBAs can save class and instructor time while providing participation rates and performance results equivalent to those of in-class paper-and-pencil tests.
Descriptors: Research Design, College Students, Science Instruction, Physics, Hierarchical Linear Modeling, Academic Achievement, Electronic Learning, Outcomes of Education, STEM Education, Student Participation, Comparative Testing, Computer Assisted Testing, Tests, Differences, Pretests Posttests, Statistical Analysis, Least Squares Statistics, Regression (Statistics)
Springer. Available from: Springer Nature. 233 Spring Street, New York, NY 10013. Tel: 800-777-4643; Tel: 212-460-1500; Fax: 212-348-4505; e-mail: customerservice@springernature.com; Web site: https://bibliotheek.ehb.be:2123/
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: Colorado
Grant or Contract Numbers: N/A
Author Affiliations: N/A