Posts Tagged ‘TOEFL’

I recently ran across this very helpful article, which does for reading and listening comprehension skills what I have been trying to do for reading-writing subskills.

Song, M.-Y. (2008). Do divisible subskills exist in second language (L2) comprehension? A structural equation modeling approach. Language Testing, 25(4), 435–464.

In essence, the author asks: to what degree does student performance on a test suggest that reading, listening, and two to three identified subskills exist as separate constructs (as evidenced by structural equation modeling)? In comparison, I have been trying to identify the degree to which reading comprehension, writing ability, synthesis comprehension, and paraphrase writing ability are all subskills of reading-to-write tasks. So this article provided quite a bit of the theoretical framing and technical analysis that I can use to further refine my own study.
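One simple precursor to a full SEM analysis of construct separability is the classical correction for attenuation: if two subskill scores correlate near 1.0 once their unreliability is corrected for, there is little evidence that they are separate constructs. A minimal sketch, using invented numbers purely for illustration:

```python
# Spearman's correction for attenuation: estimate the correlation between
# the true scores underlying two observed subskill measures.
# All values below are invented for illustration, not real test data.
import math

r_observed = 0.68      # observed correlation between two subskill scores (assumed)
reliability_a = 0.85   # reliability estimate for subskill A (assumed)
reliability_b = 0.80   # reliability estimate for subskill B (assumed)

# Correct the observed correlation for measurement error in both scores.
r_true = r_observed / math.sqrt(reliability_a * reliability_b)
print(round(r_true, 2))
```

A disattenuated correlation well below 1.0 (as here) is at least consistent with the subskills being distinguishable, though only a full latent-variable model can test that properly.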



Given the University’s need to assess the English language proficiency of incoming transfer students from the community college system, I was able to continue assessing the effectiveness of our ESL placement test. Earlier this week, a group of transfer students took the new grammar/vocabulary test, as well as the existing oral interview and a revised writing test. Now that I have additional examinee data, I have been able to conduct an initial item analysis.
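For readers unfamiliar with classical item analysis, the core statistics are item difficulty (proportion correct) and item discrimination (how well an item separates stronger from weaker examinees). A minimal sketch on an invented 0/1 response matrix — the data here are made up for illustration, not our actual examinee responses:

```python
# Classical item analysis on dichotomous (0/1) responses.
# Rows are examinees, columns are items; the matrix is invented for illustration.
import numpy as np

responses = np.array([
    [1, 1, 0, 1, 0],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 0, 0],
    [1, 1, 1, 0, 1],
    [1, 0, 0, 1, 1],
])

# Item difficulty: proportion of examinees answering each item correctly.
difficulty = responses.mean(axis=0)

# Item discrimination: corrected item-total correlation, i.e. the correlation
# between each item and the total score on the *remaining* items.
total = responses.sum(axis=1)
discrimination = np.array([
    np.corrcoef(responses[:, i], total - responses[:, i])[0, 1]
    for i in range(responses.shape[1])
])

print("difficulty:", np.round(difficulty, 2))
print("discrimination:", np.round(discrimination, 2))
```

Items with very high or very low difficulty, or near-zero (or negative) discrimination, are the usual candidates for revision on a placement test like ours.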


I just picked this one up from the latest edition of the Language Testing journal. I’m always a little cautious of ETS articles written only by ETS researchers; they invariably tend to support ETS practices and promote ETS products. Still, they do a great deal of research and serve as a model for many interesting approaches to language testing, so I don’t begrudge them. I just take anything they publish with a grain of salt. After all, any researcher, regardless of institutional affiliation, has some agenda or other.

Here’s the reference, and the review is after the jump:

Sawaki, Y., Stricker, L. J., & Oranje, A. H. (2009). Factor structure of the TOEFL Internet-based test. Language Testing, 26(1), 5–30.



My current institution uses the SPEAK test (retired forms of the TSE – Test of Spoken English) in International Teaching Assistant (ITA) assessment. Earlier this semester, many departments sent their students to us for testing. We completed the assessments and sent the scores back to the departments and students, many of whom were not pleased with the results.

In reality, student scores on this semester’s test were no different from those in previous semesters. However, more departments are relying on new, incoming international students to work as ITAs without really understanding their English proficiency. What makes this all the more frustrating is that these departments thought they were being proactive by assessing students’ oral skills via telephone interviews before they arrived. As a result of these unexpectedly low scores, many departments were left with classes without TAs and scrambled to find alternate means of funding these international students.

In an effort to mitigate these problems in future semesters, my department set up a meeting with graduate directors. We prepared to show them how TOEFL iBT speaking section scores (which many students now submit) can be a good predictor of SPEAK test scores. Even though many of these departments had access to iBT scores, they did not know how to interpret them or how they related to SPEAK benchmarks. Although we were braced for some pushback in our meeting, we were pleased to learn that the directors were grateful for our help and explanations. I hope it helps guide them in their admissions and funding decisions.
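To show what a simple iBT-to-SPEAK mapping might look like, here is a minimal least-squares sketch. The paired scores below are invented for illustration only; a real concordance would need actual institutional data, and the scales assumed are iBT speaking (0–30) and SPEAK (20–60):

```python
# Fit a simple regression line predicting SPEAK scores from TOEFL iBT
# speaking scores. All score pairs are invented for illustration.
import numpy as np

ibt_speaking = np.array([18, 20, 22, 23, 24, 26, 27, 28])  # hypothetical iBT scores
speak        = np.array([40, 40, 45, 45, 50, 50, 55, 55])  # hypothetical SPEAK scores

# Least-squares line: speak ≈ slope * ibt + intercept.
slope, intercept = np.polyfit(ibt_speaking, speak, 1)

def predict_speak(ibt_score):
    """Predicted SPEAK score for a given iBT speaking score (hypothetical fit)."""
    return slope * ibt_score + intercept

print(round(predict_speak(24), 1))
```

Even a rough line like this gives graduate directors a concrete way to read an iBT speaking score against a SPEAK benchmark, with the usual caveat that predictions outside the observed score range are unreliable.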

This is just one of many ways in which this new position requires cooperation and good communication across schools and departments. I’m pleased to see that the ESL program here is so well connected throughout the university.
