Posts Tagged ‘diagnostic testing’

I ran across this very helpful article recently that does with reading and listening comprehension skills what I have been doing with reading-writing subskills.

Song, M.-Y. (2008). Do divisible subskills exist in second language (L2) comprehension? A structural equation modeling approach. Language Testing, 25(4), 435-464.

In essence, the author asks: to what degree does student performance on a test suggest that reading, listening, and two to three identified subskills exist as separate constructs (as evidenced by structural equation modeling)? In comparison, I have been trying to identify the degree to which reading comprehension, writing ability, synthesis comprehension, and paraphrase-writing ability are all subskills of reading-to-write tasks. So this article provided quite a bit of the theoretical framing and technical analysis that I need in order to further refine my own study. (more…)
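As an aside on the method: the core move in an SEM divisibility study like this is to specify competing factor models (say, a single undivided comprehension factor versus separate reading and listening factors) and compare how well each fits the test data. Here is a minimal sketch of that comparison using the Python semopy package; the item names, data file, and model specifications are my own illustrative assumptions, not Song’s actual models.

```python
# Minimal sketch of the SEM logic behind subskill-divisibility studies:
# fit competing factor models and compare their fit indices.
# Requires semopy (pip install semopy). Item names (r1..l3) and the
# data file are hypothetical placeholders.
import pandas as pd
import semopy

# Each row = one examinee; each column = the score on one comprehension item.
data = pd.read_csv("comprehension_scores.csv")

# Model A: one general, undivided comprehension factor.
one_factor = """
comprehension =~ r1 + r2 + r3 + l1 + l2 + l3
"""

# Model B: separate but correlated reading and listening factors.
two_factor = """
reading   =~ r1 + r2 + r3
listening =~ l1 + l2 + l3
reading ~~ listening
"""

for name, desc in [("one-factor", one_factor), ("two-factor", two_factor)]:
    model = semopy.Model(desc)
    model.fit(data)
    print(name)
    print(model.inspect())              # loadings and factor correlation
    print(semopy.calc_stats(model).T)   # fit indices (CFI, RMSEA, AIC, ...)
```

If the two-factor model fits substantially better than the one-factor alternative, that is evidence the subskills are empirically separable, which is essentially the question Song poses.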

Read Full Post »

I work with my institution’s international teaching assistant (ITA) program. The program, like those at many universities and colleges in the USA, is designed to provide language and pedagogical training to non-native-English-speaking graduate students who serve as teaching assistants for their departments.

Last fall, after testing all potential ITAs for oral proficiency, our program received many concerned calls from departments whose international students did not meet the minimum English language standards to be TAs. In fact, a large number of the students scored well below the cut-off set in the university’s policy regarding ITAs. Departments were concerned because these students needed funding, yet they no longer qualified to work as TAs due to their oral proficiency. To put it mildly, it was a very frustrating experience for many stakeholders.

(more…)

Read Full Post »

Given the University’s need to assess the English language proficiency of incoming transfer students from the community college system, I was able to continue assessing the effectiveness of our ESL placement test. Earlier this week, a group of transfer students took the new grammar/vocabulary test, as well as the existing oral interview and a revised writing test. Now that I have additional examinee data, I have been able to conduct an initial item analysis. (more…)
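For anyone unfamiliar with the mechanics, a classical item analysis boils down to two statistics per item: facility (the proportion of examinees who answered the item correctly) and discrimination (how well the item separates stronger examinees from weaker ones, commonly estimated as the point-biserial correlation between the item and the total score). Here is a minimal sketch in Python; the file name and column layout are hypothetical.

```python
# Minimal classical item analysis: facility and point-biserial discrimination.
# The data file and its layout are hypothetical placeholders.
import pandas as pd

# Each row = one examinee; each column = one item scored 0 (wrong) or 1 (right).
items = pd.read_csv("grammar_vocab_responses.csv")

total = items.sum(axis=1)

analysis = pd.DataFrame({
    # Facility: proportion correct. Values near 1.0 or 0.0 flag items
    # that are trivially easy or nearly impossible.
    "facility": items.mean(),
    # Corrected point-biserial: correlate each item with the total score
    # minus that item, so the item is not correlated with itself.
    "discrimination": items.apply(lambda col: col.corr(total - col)),
})

print(analysis.sort_values("discrimination"))
```

Items with extreme facility values or with discrimination near zero (or negative) are the usual first candidates for revision or removal.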

Read Full Post »

Last week, I spent half an hour in the depths of the university library searching through the P/PE stacks (the call-number range for linguistics and language teaching). I picked up a few books that I hadn’t read before, including this volume, which focuses on language learning contexts in India and the Middle East.

Singh, G. (2006). Summarisation skills: An analysis in text comprehension and production. In V. Narang (Ed.), Contemporary themes and issues in language pedagogy (pp. 17-32). Delhi, India: Nagri Printers.

This chapter summarizes research (from the 1980s) on summarization skills, and then applies those concepts and methodologies to an analysis of summaries written by graduate-level English language learners in India. The results suggest that even graduate students struggle to understand source texts and write effective summaries. (more…)

Read Full Post »

Not long after arriving at my new job, I was informed that I would be handling the administration and grading of the university’s English language placement exam. I was shocked by the experience for a couple of reasons:

  1. The exam is delivered on a Sunday. Having previously worked for a Christian university, I was surprised that any program would think it was necessary to hold an exam on a weekend, let alone a Sunday.
  2. A major portion of the exam consisted of a 100-item grammar test. The university does not even offer an ESL grammar class. There seemed little use in spending 45 minutes delivering a grammar exam that would have little consequential value.
  3. The exam was old. Really old. No one had an original version of the exam – only photocopies of photocopies. And the instructions to the exam contained the phrase, “Please do not smoke during the exam.” Really? How many decades ago would students have assumed it *would* be okay to smoke during an exam?

Despite these reservations and the wacky rating scales for the oral and writing components, we made it through the administration of the exam. I decided that I would work toward updating the exam, but my plans were diverted by the somewhat defensive reactions my suggestions for revision received. Besides, I was too busy to do a test analysis study. Then, four months later, I was asked to administer the exam again and was once again subjected to the embarrassment of asking students to complete an exam that I personally did not believe in. That shame was enough to finally convince me to do something about it.

(more…)

Read Full Post »

I finally took a moment this afternoon to read through this article, which I found at least a couple of months ago. It was one of the few cases where my Google Alert actually turned up something useful. In fact, in the year that I have had that Google Alert, this may have been the ONLY useful link that came through. Even so, I maintain that it was worth it because:

  1. This article came from a journal that I was previously unaware of and would never have thought to browse, and
  2. This article is critically tied to my research topic, and my planned study could be considered the “next step.”

Here’s the full reference citation:

Ascension Delaney, Y. (2008). Investigating the reading-to-write construct. Journal of English for Academic Purposes, 7, 140-150.

(more…)

Read Full Post »

Alderson, J. C. (2007). The challenge of (diagnostic) testing: Do we know what we are measuring? In J. Fox, M. Wesche, D. Bayliss, L. Cheng, & C. E. Turner (Eds.), Language testing reconsidered (pp. 21-39). Ottawa, ON: University of Ottawa Press.

In the second chapter of Language Testing Reconsidered, Alderson (who is a burly, wildly bearded British academic) questions the use of diagnostic tests. His thoughts focus primarily on the Common European Framework (CEF) for language ability. Alderson’s studies of CEF diagnostic testing suggested that either the diagnostic tests were inappropriate or the framework is not as reflective of true language acquisition as it is designed to be. His greatest concern focuses on what it is that diagnostic tests aim to measure and what it is that the CEF is designed to describe. Are task-based frameworks useful for diagnosing language proficiency? What does a diagnostic test measure, and how can that data be used?

Here are my thoughts on how Alderson’s work relates to my research.
(more…)

Read Full Post »