ERIC Number: EJ1344512
Record Type: Journal
Publication Date: 2022-Sep
Pages: 33
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-0023-8333
EISSN: EISSN-1467-9922
Crowdsourced Adaptive Comparative Judgment: A Community-Based Solution for Proficiency Rating
Language Learning, v72 n3 p853-885 Sep 2022
The main objective of this Methods Showcase Article is to show how the technique of adaptive comparative judgment, coupled with a crowdsourcing approach, can offer practical solutions to reliability issues as well as to the time and cost difficulties associated with a text-based approach to proficiency assessment in L2 research. We showcased this method by reporting on the methodological framework implemented in the Crowdsourcing Language Assessment Project and by presenting the results of a first study demonstrating that a crowd can assess learner texts with high reliability. We found no effect of language skills or language assessment experience on the assessment task, but judges who had received formal language assessment training seemed to differ in their decisions from judges who had not received such training. However, the scores generated by the crowdsourced task exhibited a strong positive correlation with the rubric-based scores provided with the learner corpus used.
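The record names adaptive comparative judgment but does not spell out how pairwise "which text is better?" decisions become proficiency scores. As a rough illustration only (not the authors' implementation), the scaling step is commonly handled with a Bradley-Terry model; the minimal Python sketch below fits such a model by gradient ascent. Function names, the toy judgement data, and the fitting method are assumptions for illustration, and the adaptive selection of the next pair to judge is not covered.

    import math
    from collections import defaultdict

    def acj_scores(judgements, iters=500, lr=0.05):
        """Turn pairwise judgements into a relative score per text.

        judgements: list of (winner_id, loser_id) pairs, one per judgement.
        Fits a Bradley-Terry model, P(i beats j) = exp(s_i) / (exp(s_i) + exp(s_j)),
        by gradient ascent on the log-likelihood. Returns {text_id: score}
        on a relative (logit) scale; this is only the scaling step of ACJ.
        """
        ids = {t for pair in judgements for t in pair}
        s = {t: 0.0 for t in ids}
        for _ in range(iters):
            grad = defaultdict(float)
            for w, l in judgements:
                p_w = 1.0 / (1.0 + math.exp(s[l] - s[w]))  # P(winner beats loser)
                grad[w] += 1.0 - p_w
                grad[l] -= 1.0 - p_w
            for t in ids:
                s[t] += lr * grad[t]
        mean = sum(s.values()) / len(s)
        return {t: v - mean for t, v in s.items()}  # centre scores for readability

    # Hypothetical example: five crowd judgements over three learner texts
    print(acj_scores([("A", "B"), ("A", "C"), ("B", "C"), ("A", "B"), ("C", "B")]))

In this toy run, text A (never judged worse) receives the highest score, illustrating how many small pairwise decisions from a crowd can be aggregated into a reliable rank-ordered scale.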
Descriptors: Comparative Analysis, Decision Making, Language Proficiency, Reliability, Computer Software, Cost Effectiveness, Language Skills, Language Tests, Scores, Scoring Rubrics, Computational Linguistics, Evaluation Methods, Evaluators, Training
Wiley. Available from: John Wiley & Sons, Inc. 111 River Street, Hoboken, NJ 07030. Tel: 800-835-6770; e-mail: cs-journals@wiley.com; Web site: https://bibliotheek.ehb.be:2191/en-us
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Data File: URL: https://osf.io/89vm4/