I have begun reading Language Testing Reconsidered (2007), edited by Fox et al. (including Carolyn Turner, whom I met in person at LTRC 2008 this summer). I’m enjoying this volume of testing issues written by many leading language testers, several of whom I also met at LTRC. Their articles take on more meaning now that I have a face and a personality to pair with the research. Rather than review the book as a whole, I decided to post about each chapter individually, starting with Bernard Spolsky’s “On Second Thoughts.”

Spolsky, B. (2007). On second thoughts. In J. Fox, M. Wesche, D. Bayliss, L. Cheng, & C. E. Turner (Eds.), Language testing reconsidered (pp. 9–18). Ottawa, ON: University of Ottawa Press.

In this chapter, Spolsky summarizes what he feels are the most important language testing issues that emerged during his career. The overarching theme of this chapter (and the couple that follow it) is “What does it mean to know a language?” Spolsky offers no simple answer, but asserts that this is a question language assessment researchers need to be constantly asking. As I consider my dissertation research, I now ask myself, “What does it mean to complete an integrated reading-writing task?” In other words, what skills, knowledge, or abilities does that task access, and what inferences can we make from an examinee’s performance on it? How do integrated tasks relate to the other tasks that students do, and what do these tasks reveal about language acquisition?

My tentative answer to those questions is that reading-writing tasks appear to involve more than simply reading and writing. They require students to use advanced literacy skills, including synthesis and paraphrase. Through my research design, I hope to shed some light on what it means to do these tasks. And although my study will not directly assess this idea, it’s my hope that further research can show a relationship between ability on integrated tasks and the ability to complete advanced literacy assignments at university, such as writing a research paper or a case synthesis. I also hope that my research will help articulate a model of advanced literacy skill development, perhaps a hierarchy of reading and writing sub-skills.

One last note on Spolsky: he describes the challenge of revising an existing high-stakes exam as “steering a tanker” (p. 14). I have only ever revised much smaller, local exams. Even my current plans to revise the university’s ESL placement exam are peanuts compared to working on a government or commercially distributed test. If I feel trepidation at the thought of improving our local exams, I would certainly want plenty of evidence (and a whole lot of help) if I were to take on a larger project.


My current institution uses the SPEAK test (retired forms of the TSE, the Test of Spoken English) for International Teaching Assistant (ITA) assessment. Earlier this semester, many departments sent their students to us for testing. We completed the assessments and sent the scores back to the departments and students, who were not pleased with the results.

In reality, student scores on this semester’s test were no different from previous semesters’. However, more departments are relying on new, incoming international students to work as ITAs without really understanding their English proficiency. What makes this all the more frustrating is that these departments thought they were being proactive by using telephone interviews to assess students’ oral skills before they arrived. As a result of these unexpectedly low scores, many departments were left with classes without TAs and scrambled to find alternate means of funding these international students.

In an effort to mitigate these problems in future semesters, my department set up a meeting with graduate directors. We prepared to show them how TOEFL iBT speaking section scores (which many students now submit) can be a good predictor of SPEAK test scores. Even though many of these departments had access to iBT scores, they did not know how to interpret them, nor how they relate to SPEAK benchmarks. Although we were prepared for some criticism in our meeting, we were pleased to learn that the directors were grateful for our help and explanations. I hope it guides them in their admissions and funding decisions.

This is just one of many ways in which this new position requires cooperation and good communication across schools and departments. I’m pleased to see that the ESL program here is so well connected throughout the university.
