Publication Date
  In 2025: 0
  Since 2024: 1
  Since 2021 (last 5 years): 1
  Since 2016 (last 10 years): 5
  Since 2006 (last 20 years): 10
Source
  Language Testing: 13
Author
  Alderson, J. Charles: 1
  Brown, Annie: 1
  Carey, Michael D.: 1
  Davis, Larry: 1
  Dunn, Peter K.: 1
  Haerim Hwang: 1
  Harding, Luke: 1
  Hyunwoo Kim: 1
  Isbell, Dan: 1
  Kang, Okim: 1
  Kermad, Alyssa: 1
Publication Type
  Journal Articles: 13
  Reports - Research: 9
  Reports - Evaluative: 4
  Tests/Questionnaires: 1
Education Level
  Higher Education: 2
  Postsecondary Education: 1
  Secondary Education: 1
Location
  Australia: 1
  Canada: 1
  China: 1
  France: 1
  Germany: 1
  India: 1
  Italy: 1
  Netherlands: 1
  South Korea: 1
Assessments and Surveys
  Test of English as a Foreign…: 4
  ACTFL Oral Proficiency…: 1
Haerim Hwang; Hyunwoo Kim – Language Testing, 2024
Given the lack of computational tools for assessing second language (L2) production in Korean, this study introduces the Korean Syntactic Complexity Analyzer (KOSCA), a novel automated tool for measuring syntactic complexity in L2 Korean production. As an open-source graphical user interface (GUI) developed in Python, KOSCA provides…
Descriptors: Korean, Natural Language Processing, Syntax, Computer Graphics
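The Hwang and Kim (2024) entry above describes KOSCA, an open-source Python tool for measuring syntactic complexity in learner Korean. The abstract does not specify which indices KOSCA computes or how, so the sketch below is only a generic illustration of a length- and clause-based complexity profile computed from pre-annotated text; the function names, the Sejong-style POS tags, and the predicate-counting heuristic are assumptions for illustration, not KOSCA's actual implementation.

```python
# Illustrative only: a generic syntactic-complexity calculator for
# pre-annotated Korean text. It is NOT KOSCA's algorithm; KOSCA's indices
# and parsing pipeline are not described in the abstract above.

from dataclasses import dataclass
from typing import List, Tuple

# Each sentence is a list of (morpheme, POS-tag) pairs, assumed to come from
# an upstream Korean morphological analyzer.
Sentence = List[Tuple[str, str]]

# Hypothetical heuristic: count verbal/adjectival predicates (Sejong-style
# "VV"/"VA" tags) as a rough proxy for the number of clauses.
PREDICATE_TAGS = {"VV", "VA"}

@dataclass
class ComplexityProfile:
    mean_sentence_length: float   # morphemes per sentence
    clauses_per_sentence: float   # predicates per sentence
    mean_clause_length: float     # morphemes per predicate (clause)

def complexity_profile(sentences: List[Sentence]) -> ComplexityProfile:
    n_sent = len(sentences)
    n_morphemes = sum(len(s) for s in sentences)
    n_clauses = sum(
        1 for s in sentences for _, tag in s if tag in PREDICATE_TAGS
    )
    return ComplexityProfile(
        mean_sentence_length=n_morphemes / n_sent if n_sent else 0.0,
        clauses_per_sentence=n_clauses / n_sent if n_sent else 0.0,
        mean_clause_length=n_morphemes / n_clauses if n_clauses else 0.0,
    )

if __name__ == "__main__":
    # Toy annotated sample: "친구가 책을 읽었다" ("A friend read a book").
    sample = [[("친구", "NNG"), ("가", "JKS"), ("책", "NNG"),
               ("을", "JKO"), ("읽", "VV"), ("었다", "EP+EF")]]
    print(complexity_profile(sample))
```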
van Batenburg, Eline S. L.; Oostdam, Ron J.; van Gelderen, Amos J. S.; de Jong, Nivja H. – Language Testing, 2018
This article explores ways to assess interactional performance, and reports on the use of a test format that standardizes the interlocutor's linguistic and interactional contributions to the exchange. It describes the construction and administration of six scripted speech tasks (instruction, advice, and sales tasks) with pre-vocational learners (n…
Descriptors: Second Language Learning, Speech Tests, Interaction, Test Reliability
Isbell, Dan; Winke, Paula – Language Testing, 2019
The American Council on the Teaching of Foreign Languages (ACTFL) Oral Proficiency Interview - computer (OPIc) testing system represents an ambitious effort in language assessment: assessing oral proficiency in over a dozen languages, on the same scale, from virtually anywhere at any time. Especially for users in contexts where multiple foreign…
Descriptors: Oral Language, Language Tests, Language Proficiency, Second Language Learning
Kang, Okim; Rubin, Don; Kermad, Alyssa – Language Testing, 2019
Because judgments of non-native speech are closely tied to social biases, oral proficiency ratings are susceptible to error stemming from rater background and social attitudes. In the present study we seek first to estimate the variance attributable to rater background and attitudinal variables in novice raters' assessments of L2…
Descriptors: Evaluators, Second Language Learning, Language Tests, English (Second Language)
Davis, Larry – Language Testing, 2016
Two factors were investigated that are thought to contribute to consistency in rater scoring judgments: rater training and experience in scoring. Also considered were the relative effects of scoring rubrics and exemplars on rater performance. Experienced teachers of English (N = 20) scored recorded responses from the TOEFL iBT speaking test prior…
Descriptors: Evaluators, Oral Language, Scores, Language Tests
Yan, Xun – Language Testing, 2014
This paper reports on a mixed-methods approach to evaluate rater performance on a local oral English proficiency test. Three types of reliability estimates were reported to examine rater performance from different perspectives. Quantitative results were also triangulated with qualitative rater comments to arrive at a more representative picture of…
Descriptors: Mixed Methods Research, Language Tests, Oral Language, Language Proficiency
Carey, Michael D.; Mannell, Robert H.; Dunn, Peter K. – Language Testing, 2011
This study investigated factors that could affect inter-examiner reliability in the pronunciation assessment component of speaking tests. We hypothesized that the rating of pronunciation is susceptible to variation in assessment due to the amount of exposure examiners have to nonnative English accents. An inter-rater variability analysis was…
Descriptors: Oral Language, Pronunciation, Phonology, Interlanguage
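The Carey, Mannell, and Dunn (2011) entry above mentions an inter-rater variability analysis of pronunciation ratings. As a generic illustration of that kind of check (not a reproduction of their analysis or data), the sketch below computes pairwise score correlations and exact-agreement rates across raters using only the standard library; the rater names and scores are hypothetical.

```python
# Illustrative sketch of a basic inter-rater consistency check: pairwise
# Pearson correlations and exact-agreement rates across raters. Generic
# example only; not the specific statistics reported by Carey et al. (2011).

from itertools import combinations
from statistics import correlation  # requires Python 3.10+

# Hypothetical ratings: each rater scores the same six speakers on a 1-6 band.
ratings = {
    "rater_A": [4, 5, 3, 6, 2, 4],
    "rater_B": [4, 4, 3, 5, 2, 5],
    "rater_C": [5, 5, 2, 6, 3, 4],
}

def pairwise_consistency(scores: dict[str, list[int]]) -> None:
    for (r1, s1), (r2, s2) in combinations(scores.items(), 2):
        r = correlation(s1, s2)                              # score consistency
        agree = sum(a == b for a, b in zip(s1, s2)) / len(s1)  # exact agreement
        print(f"{r1} vs {r2}: r = {r:.2f}, exact agreement = {agree:.0%}")

if __name__ == "__main__":
    pairwise_consistency(ratings)
```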
Macqueen, Susy; Harding, Luke – Language Testing, 2009
In 2002 the University of Cambridge Local Examinations Syndicate (UCLES) implemented a revised version of the Certificate of Proficiency in English (CPE). CPE, which is the highest level of the Main Suite of Cambridge ESOL exams, comprises five modules, "Reading," "Writing," "Use of English," "Listening," and "Speaking," the last of which is the…
Descriptors: Speech Communication, Test Reviews, Examiners, English (Second Language)
Alderson, J. Charles – Language Testing, 2009
In this article, the author reviews the TOEFL iBT, the latest version of the TOEFL, whose history stretches back to 1961. The TOEFL iBT was introduced in the USA, Canada, France, Germany and Italy in late 2005. Currently the TOEFL test is offered in two testing formats: (1) Internet-based testing (iBT); and (2) paper-based testing (PBT).…
Descriptors: Oral Language, Writing Tests, Listening Comprehension Tests, Test Reviews
Lee, Yong-Won – Language Testing, 2006
A multitask speaking measure consisting of both integrated and independent tasks is expected to be an important component of a new version of the TOEFL test. This study considered two critical issues concerning score dependability of the new speaking measure: How much would the score dependability be impacted by (1) combining scores on different…
Descriptors: Language Tests, Second Language Learning, English (Second Language), Generalizability Theory
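The Lee (2006) entry above concerns score dependability under generalizability theory for a multitask speaking measure. As a hedged, self-contained illustration of the underlying arithmetic (not Lee's operational analysis or design), the sketch below estimates variance components for a simple persons x tasks crossed design from mean squares and computes a dependability (phi) coefficient for different numbers of tasks; the score matrix is made up.

```python
# Illustrative sketch of a one-facet generalizability analysis (persons x
# tasks, one observation per cell): estimate variance components and a
# dependability (phi) coefficient. Hypothetical data; not Lee's (2006) study.

from statistics import mean

# scores[p][t] = score of person p on task t (hypothetical 0-4 ratings)
scores = [
    [3.0, 2.5, 3.5, 3.0],
    [2.0, 2.0, 2.5, 1.5],
    [4.0, 3.5, 4.0, 3.5],
    [1.5, 2.0, 1.0, 2.0],
    [3.0, 3.5, 3.0, 2.5],
]

def phi_coefficient(scores: list[list[float]], n_tasks_prime: int) -> float:
    n_p, n_t = len(scores), len(scores[0])
    grand = mean(x for row in scores for x in row)
    p_means = [mean(row) for row in scores]
    t_means = [mean(row[t] for row in scores) for t in range(n_t)]

    # Mean squares for persons, tasks, and the residual (interaction
    # confounded with error, since there is one observation per cell).
    ms_p = n_t * sum((m - grand) ** 2 for m in p_means) / (n_p - 1)
    ms_t = n_p * sum((m - grand) ** 2 for m in t_means) / (n_t - 1)
    ms_res = sum(
        (scores[p][t] - p_means[p] - t_means[t] + grand) ** 2
        for p in range(n_p) for t in range(n_t)
    ) / ((n_p - 1) * (n_t - 1))

    # Expected-mean-square solutions for the variance components.
    var_res = ms_res
    var_p = max((ms_p - ms_res) / n_t, 0.0)
    var_t = max((ms_t - ms_res) / n_p, 0.0)

    # Dependability (phi): absolute error includes task and residual variance.
    return var_p / (var_p + (var_t + var_res) / n_tasks_prime)

if __name__ == "__main__":
    for k in (2, 4, 6):
        print(f"phi with {k} tasks: {phi_coefficient(scores, k):.2f}")
```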

Brown, Annie – Language Testing, 2003
Examines variation among interviewers in oral language proficiency interviews in the ways they elicit demonstrations of communicative ability, and the impact of this variation on candidate performance and on raters' perceptions of candidate ability. A discourse analysis of two interviews involving the same candidate with two…
Descriptors: Discourse Analysis, Interrater Reliability, Interviews, Language Proficiency

McNamara, T. F.; Lumley, Tom – Language Testing, 1997
Investigates potential problems with training native speaker interlocutors to carry out a series of oral interactions with a candidate for overseas employment with the Australian government. Findings reveal the effects of interlocutor variables and audiotape quality on ratings. Evaluates the overall feasibility of the procedure and examines…
Descriptors: Audiotape Recordings, Employment Interviews, English for Special Purposes, Foreign Countries

Wigglesworth, Gillian – Language Testing, 1997
In this study, planning time was manipulated as a variable in a trial administration of a semi-direct oral interaction test. Discourse analytic techniques were used to determine the nature and significance of differences in the elicited discourse across the two conditions in terms of complexity and accuracy. Findings suggest that planning time may…
Descriptors: Cognitive Development, Communicative Competence (Languages), Comparative Analysis, Discourse Analysis