Posts Tagged ‘assessment’

I recently ran across a very helpful article that does with reading and listening comprehension skills what I have been doing with reading-writing subskills.

Song, M.-Y. (2008). Do divisible subskills exist in second language (L2) comprehension? A structural equation modeling approach. Language Testing, 25(4), 435-464.

In essence, the author asks: to what degree does student performance on a test suggest that reading, listening, and two or three identified subskills exist as separate constructs (as evidenced by structural equation modeling)? In comparison, I have been trying to identify the degree to which reading comprehension, writing ability, synthesis comprehension, and paraphrase writing ability are all subskills of reading-to-write tasks. So this article provided quite a bit of the theoretical framing and technical analysis that I need in order to further refine my own study.
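
Since structural equation modeling is central to both Song's study and my own, here is a minimal sketch of how such a divisibility question can be set up in Python with the semopy package. The item names, the two-factor structure, and the data file are my own invented illustration, not Song's actual model or data.

```python
# Minimal sketch of testing whether two comprehension subskills are
# statistically separable, using a confirmatory factor model.
# Everything here (item names, file name, factor structure) is a
# hypothetical illustration, not Song's actual specification.
import pandas as pd
import semopy

# Lavaan-style description: each latent subskill is measured by three items.
# A rival one-factor model would load all six items on a single factor.
two_factor = """
reading   =~ r1 + r2 + r3
listening =~ l1 + l2 + l3
reading ~~ listening
"""

data = pd.read_csv("test_scores.csv")  # hypothetical examinee-by-item matrix

model = semopy.Model(two_factor)
model.fit(data)
print(model.inspect())             # loadings and the reading-listening correlation
print(semopy.calc_stats(model).T)  # fit indices such as CFI and RMSEA
```

If the two-factor model fits substantially better than the one-factor rival and the estimated factor correlation is well below 1.0, that is evidence that the subskills are divisible constructs; if not, a single comprehension factor is the more defensible interpretation.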

Read Full Post »

I work with my institution’s international teaching assistant (ITA) program. The program, like those at many universities and colleges in the USA, is designed to provide language and pedagogical training to non-native English-speaking graduate students who serve as teaching assistants for their departments.

Last fall, after testing the oral proficiency of all potential ITAs, our program received many concerned calls from departments whose international students did not meet the minimum English language standards to be TAs. In fact, a large number of the students were well below the cut-off in the university’s policy regarding ITAs. Departments were concerned because these students needed funding but no longer qualified to work as TAs due to their oral proficiency. To put it mildly, it was a very frustrating experience for many stakeholders.

Read Full Post »

Given the University’s need to assess the English language proficiency of incoming transfer students from the community college system, I was able to continue assessing the effectiveness of our ESL placement test. Earlier this week, a group of transfer students took the new grammar/vocabulary test, the existing oral interview, and a revised writing test. Now that I have additional examinee data, I have been able to conduct an initial item analysis.
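
For readers unfamiliar with what an initial item analysis involves, here is a rough sketch in Python of the classical statistics I start with: item difficulty and discrimination. The file name and column layout are hypothetical, not our actual placement test data.

```python
# Sketch of a classical item analysis on a 0/1-scored response matrix:
# one row per examinee, one column per item. File and layout are hypothetical.
import pandas as pd

responses = pd.read_csv("placement_items.csv")

# Difficulty: proportion of examinees answering each item correctly.
difficulty = responses.mean()

# Discrimination: corrected point-biserial correlation between each item
# and the total score on the *other* items (excluding the item itself).
total = responses.sum(axis=1)
discrimination = pd.Series({
    item: responses[item].corr(total - responses[item])
    for item in responses.columns
})

report = pd.DataFrame({"difficulty": difficulty, "discrimination": discrimination})
print(report.round(2))

# Flag items that are too easy, too hard, or that fail to separate
# stronger from weaker examinees.
print(report[(report["difficulty"] > 0.90) |
             (report["difficulty"] < 0.20) |
             (report["discrimination"] < 0.20)])
```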

Read Full Post »

I just picked this one up from the latest issue of the journal Language Testing. I’m always a little cautious of ETS articles written only by ETS researchers. They invariably tend to support ETS practices and promote ETS products. Still, they do lots of research and serve as a model for lots of interesting approaches to language testing, so I don’t begrudge them; I just take anything they publish with a grain of salt. After all, any researcher, regardless of institutional association, has some agenda or other.

Here’s the reference, and the review is after the jump:

Sawaki, Y., Stricker, L. J., & Oranje, A. H. (2009). Factor structure of the TOEFL Internet-based test. Language Testing, 26(1), 5-30.

Read Full Post »

I finally took a moment this afternoon to read through this article, which I found at least a couple of months ago. It was one of the few cases where my Google Alert actually turned up something useful. In fact, in the year that I have had that Google Alert, this may have been the ONLY useful link that came through. Even so, I maintain that it was worth it because:

  1. This article came from a journal that I was previously unaware of and would never have thought to browse, and
  2. This article is critically tied to my research topic, and my planned study could be considered the “next step.”

Here’s the full reference citation:

Asención Delaney, Y. (2008). Investigating the reading-to-write construct. Journal of English for Academic Purposes, 7(3), 140-150.

Read Full Post »

Chapelle, C. A., Enright, M. K., & Jamieson, J. M. (Eds.). (2008). Building a validity argument for the Test of English as a Foreign Language. New York, NY: Routledge.

This edited volume guides readers through the story of the TOEFL 2000 project, now known as TOEFL iBT. The volume revolves around the framework of interpretive arguments. This structure, involving claims, grounds, and backing, is the most widely accepted approach to validity argumentation among educational assessment experts today. Chapelle et al. begin the volume by explaining this structure and then detailing how it informed not only the inception of the TOEFL 2000 project but all of its validation activities. Before delving into the development stages, the authors provide a history of the TOEFL and a rationale for revision. Further chapters describe the recursive design and prototyping studies that shaped what has become TOEFL iBT.
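
To make the claims-grounds-backing structure concrete, here is a toy sketch in Python of how I picture one link in such an argument chaining together; the example wording is my own invented illustration, not anything from the volume.

```python
# Toy model of one inferential link in a validity argument, as I understand
# the claims/grounds/backing structure. The example text is my own invention.
from dataclasses import dataclass, field

@dataclass
class Inference:
    grounds: str                # what we observe (e.g., test performance)
    claim: str                  # what we conclude from those grounds
    backing: list[str] = field(default_factory=list)  # evidence for the inference

    def summarize(self) -> str:
        support = "; ".join(self.backing) or "no backing yet"
        return f"Because {self.grounds}, we claim that {self.claim} (backing: {support})."

# A full validity argument chains such inferences: each claim becomes the
# grounds for the next step, from scoring up through extrapolation to
# real-world academic performance.
scoring = Inference(
    grounds="examinees' observed responses to integrated writing tasks",
    claim="the scores reflect the targeted academic writing ability",
    backing=["rater reliability studies", "rubric development research"],
)
print(scoring.summarize())
```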

Although the book is written for test developers, other TOEFL stakeholders may be interested in learning the history of the test and the arguments and evidence that have shaped its present form. The later chapters are also helpful in providing a discussion of concurrent validity related to test-takers’ academic performance in English-medium environments. These final pages could help inform discussions of English language admission standards.

My purpose in reading this book relates to its account of integrated language tasks. In my last job, I spent a year laying the framework for program-wide integrated writing tasks. This stemmed from my role as the writing skill area coordinator, and I could see that students struggled to communicate their summary and synthesis skills in writing. So first, I read this book wanting to see if ETS had come to the same conclusions that I had regarding the need to assess integrated writing.

In addition, I was also reading to inform my dissertation research. In testing and teaching students to perform integrated writing tasks, I began to ponder the demands of such tasks. They seemed to go beyond merely reading, listening, and writing; these tasks required numerous advanced skills such as summary, paraphrase, and synthesis. These academic skills can be difficult in one’s own language, let alone in an L2. Because students struggled with these tasks, I wanted to figure out where the frustration was coming from: the reading passages, comparing reading passages, basic writing skills, paraphrasing skills, or other related skills. So I read this volume to see whether ETS saw the same kinds of issues with integrated writing that I did, and whether they had begun to isolate the sources of task difficulty.

Lastly, in my position, I am involved in oral communication courses for ESL students, many of whom are preparing to become teaching assistants. Before these international students can be approved as TAs, they need to pass an oral exam. The current exam is outdated and not as academic as we would like. So my third reason for reading this volume was to learn what ETS has been researching regarding the assessment of oral academic English.

As I present my summary of this volume, I will focus on integrated tasks and attempt to show how the volume has helped answer my three questions.

Read Full Post »