Posts Tagged ‘test scores’

These thoughts come from reading an article comparing summarization skills in L1 and L2:

Yu, G. (2008). Reading to summarize in English and Chinese: A tale of two languages? Language Testing, 25(4), 521-551.

  • Reading comprehension is accepted as a prerequisite skill to summarization (pp. 521-522)
  • Summary skills can be used to improve reading ability (p. 522)
  • Summarization ability is necessary for academic success (p. 522)
  • Is summarization a reading skill, a writing skill, or both? Perhaps neither; it may be a hybrid subskill (p. 522)
  • There has been a recent revival in integrated reading-writing tasks (p. 523)
  • Summarization skills are more complex than, and separate from, basic reading skills (p. 524)
  • Students with weaker overall proficiency were more likely to do verbatim copying (p. 525)
  • Students who copied claimed that it was easier since they did not have to understand the meaning of the words/phrases (pp. 542, 544-545)
  • Problem-solving strategy use was more common among better summarizers (p. 526)
  • L2 summaries tend to be of poorer quality than L1 summaries, have less important information, and have more false information (p. 527)
  • Although general reading comprehension scores correlate only weakly with summary writing scores, there is still a great deal of difference between the skills (pp. 536, 544); the sketch after this list shows the shared-variance arithmetic behind that kind of claim
  • General reading comprehension skills may be very different from the reading skills needed for summaries, and other skills or factors may be involved in summarization ability (p. 544)
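To put a number on how little a modest correlation constrains the relationship between two skills, here is a minimal Python sketch with invented scores (not data from Yu, 2008): squaring the correlation gives the proportion of shared variance, and even a respectable r leaves most of the variance in summary scores unexplained by reading comprehension.

```python
import numpy as np

# Invented scores for ten students, for illustration only
# (not data from Yu, 2008).
reading = np.array([55, 62, 70, 48, 81, 66, 59, 74, 52, 68], dtype=float)
summary = np.array([12, 15, 14, 10, 16, 11, 14, 13, 12, 17], dtype=float)

# Pearson correlation between reading comprehension and summary scores.
r = np.corrcoef(reading, summary)[0, 1]

# r squared = proportion of variance in summary scores shared with reading.
print(f"r = {r:.2f}, shared variance = {r**2:.0%}")
# With these made-up numbers r is roughly .6, so well under half of the
# summary-score variance is shared with reading: correlated, yet largely
# distinct skills.
```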

Read Full Post »

I recently ran across a very helpful article that does with reading and listening comprehension skills what I have been doing with reading-writing subskills.

Song, M.-Y. (2008). Do divisible subskills exist in second language (L2) comprehension? A structural equation modeling approach. Language Testing, 25(4), 435-464.

In essence, the author asks: to what degree does student performance on a test suggest that reading, listening, and two or three identified subskills exist as separate constructs (as evidenced by structural equation modeling)? In comparison, I have been trying to identify the degree to which reading comprehension, writing ability, synthesis comprehension, and paraphrase writing ability are all subskills of reading-to-write tasks. So this article provided quite a bit of the theoretical framing and technical analysis that I need in order to further refine my own study; a rough sketch of the kind of model comparison involved appears below.
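To make the “separate constructs” question concrete, here is a minimal sketch of the nested-model comparison that structural equation modeling enables, written in Python with the semopy package. The variable names (r1-r3 for reading items, l1-l3 for listening items) and the data file are invented for illustration; this shows the general technique, not Song’s actual models or data.

```python
import pandas as pd
from semopy import Model, calc_stats

# One column per observed item score: r1..r3 (reading), l1..l3 (listening).
# File name and variable names are hypothetical.
df = pd.read_csv("item_scores.csv")

# Model A: a single undifferentiated comprehension factor.
one_factor = Model("""
comprehension =~ r1 + r2 + r3 + l1 + l2 + l3
""")
one_factor.fit(df)

# Model B: reading and listening as separate but correlated factors.
two_factor = Model("""
reading   =~ r1 + r2 + r3
listening =~ l1 + l2 + l3
reading ~~ listening
""")
two_factor.fit(df)

# If the two-factor model fits substantially better (lower chi-square,
# better CFI/RMSEA), that is evidence that the subskills are divisible.
print(calc_stats(one_factor).T)
print(calc_stats(two_factor).T)
```

The difference in fit between such nested models is what licenses claims that two skills are, or are not, empirically separable.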

Read Full Post »

Given the University’s need to assess the English language proficiency of incoming transfer students from the community college system, I was able to continue assessing the effectiveness of our ESL placement test. Earlier this week, a group of transfer students took the new grammar/vocabulary test, as well as the existing oral interview and a revised writing test. Now that I have additional examinee data, I have been able to conduct an initial item analysis; a sketch of the basic statistics involved follows.
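For anyone unfamiliar with classical item analysis, here is a minimal Python sketch of the two statistics it usually starts with: item facility (proportion correct) and item discrimination (the point-biserial correlation between an item and the total score on the remaining items). The response matrix is invented; this illustrates the standard classical-test-theory calculations, not our actual examinee data.

```python
import numpy as np

# Rows = examinees, columns = items; 1 = correct, 0 = incorrect.
# A tiny invented response matrix for illustration.
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
])

n_items = responses.shape[1]

# Item facility: the proportion of examinees answering each item correctly.
# Values near 0 or 1 flag items that separate examinees poorly.
facility = responses.mean(axis=0)

# Corrected point-biserial discrimination: correlate each item with the
# total score on the *other* items to avoid inflating the estimate.
discrimination = np.empty(n_items)
for i in range(n_items):
    rest = responses.sum(axis=1) - responses[:, i]
    discrimination[i] = np.corrcoef(responses[:, i], rest)[0, 1]

for i in range(n_items):
    print(f"item {i + 1}: facility = {facility[i]:.2f}, "
          f"discrimination = {discrimination[i]:.2f}")
```

Items with near-zero or negative discrimination are the first candidates for revision before the next administration.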

Read Full Post »

I just picked this one up from the latest issue of the Language Testing journal. I’m always a little cautious of ETS articles written only by ETS researchers: they invariably tend to support ETS practices and promote ETS products. Still, they do lots of research and serve as a model for many interesting approaches to language testing, so I don’t begrudge them; I just take anything they publish with a grain of salt. After all, any researcher, regardless of institutional association, has some agenda or other.

Here’s the reference, and the review is after the jump:

Sawaki, Y., Stricker, L. J., and Oranje, A. H. (2009). Factor structure of the TOEFL Internet-based test. Language Testing, 26(1), 5-30.
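For context on what “factor structure” means here: studies like this fit competing confirmatory factor models to section and item scores. Below is a minimal sketch, in Python with the semopy package and invented indicator names, of the kind of higher-order model such studies examine; it is an illustration of the general technique, not the authors’ actual model or data.

```python
from semopy import Model

# Indicator names (r1..w2) are invented; this is not the TOEFL iBT data.
# Higher-order structure: four section factors, each loading on a single
# general proficiency factor.
model = Model("""
reading   =~ r1 + r2 + r3
listening =~ l1 + l2 + l3
speaking  =~ s1 + s2 + s3
writing   =~ w1 + w2

general =~ reading + listening + speaking + writing
""")
# model.fit(df) would estimate the loadings; strong loadings of the four
# section factors on `general` would support reporting one overall score
# alongside section scores, which is the practical question behind
# factor-structure studies of tests like this.
```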


Read Full Post »

I finally took a moment this afternoon to read through this article, which I found at least a couple of months ago. It was one of the few cases where my Google Alert actually turned up something useful. In fact, in the year that I have had that Google Alert, this may have been the ONLY useful link that came through. Even so, I maintain that it was worth it because:

  1. This article came from a journal that I was previously unaware of and would never have thought to browse, and
  2. This article is critically tied to my research topic, and my planned study could be considered the “next step.”

Here’s the full reference citation:

Asención Delaney, Y. (2008). Investigating the reading-to-write construct. Journal of English for Academic Purposes, 7, 140-150.


Read Full Post »

Of all the authors from this volume whom I met at LTRC this summer, Elana Shohamy was the most personable. I sat next to her during a couple of sessions, and not only was she very friendly, but she also seemed humble and genuinely inquisitive when others shared their research, even though she is very well-published. On another note, I have to say that she’s the closest thing to a language testing rock star; I’m not sure what that means, but somehow it seems fitting.

Shohamy, E. (2007). Tests as power tools: Looking back, looking forward. In J. Fox, M. Wesche, D. Bayliss, L. Cheng, and C. E. Turner (eds.), Language testing reconsidered (pp. 141-152). Ottawa, ON: University of Ottawa Press.

Shohamy starts out her essay by detailing her experience with language tests (much like Charles Alderson did during his address at LTRC this summer). Elana’s experiences connected with my own:

  • She saw herself as a “victim of tests” (p. 142) during her educational experience. I can relate: I never saw the purpose of tests, since I never felt that they were effective measures of what I had learned. They often seemed to measure test-taking ability or obscure facts related to the material rather than the key concepts I had been learning.
  • She also describes how tests replaced learning as she got closer to finishing high school and teachers prepared students for exit exams and university admissions. My experiences:
  1. Although high school exams in Canada are not so ridiculous (there are no national high school exit exams nor any university entrance tests), I do remember having a difficult time in Physics 12. I just couldn’t get into the course material, and as a result I never did most of my labs. However, in the three weeks before the final exam, our instructor let us review old provincial exams (the closest things to standardized high school exams in Canada). Instead of studying the material for the class, I learned what kinds of questions would be asked and strategies for completing them. As a result, I did well on my Physics 12 exam, not because I understood the principles of physics, but because I knew how to use basic algebra and apply a list of equations to test items that provided variables fitting those equations.
  2. I also saw this tendency towards testing rather than learning while working as an English teacher in China. When I began teaching, I taught only Grade 10 students, and we would do a variety of role-playing and communicative activities. But occasionally one of the upper-level instructors would ask me to visit their classes, and the atmosphere was very different. Students in these classes never spoke; instead, they listened as the instructor explained complex grammar rules, took long multiple-choice grammar quizzes, and then listened again while the instructor explained why the wrong answers were wrong. When they asked for my explanations, I often could not provide one, given that many of the subtleties in these grammar points were either British English usage or unimportant to real communication in English. China is only now starting to recover from this learning-less method of teaching English.
  • Shohamy explains that this aversion to tests continued in university where she was determined to become a testing expert in order to change the world. My experiences:
  1. I admit that my path to language testing was similar. I was extremely critical of traditional tests, and ended up taking a language testing class during my first semester of graduate school – not so much because I wanted to learn about test theory, but because I wanted to graduate faster and the testing course was another one that I could sandwich into my schedule. Through the course, I learned to enjoy the mathematical aspects of language testing, but I was also encouraged to learn that new testing practices extended beyond traditional True/False or multiple choice questions and included an array of performance and “alternative” modes of testing. The instructor of that course asked me to become her research assistant, and that’s what led me to do my MA thesis research into ESL writing rating. This has continued in my PhD research where I am investigating the use of integrated writing tasks as a means of assessment.
  2. Despite all this, I still rarely give tests in my classes. The only time I ever taught an exclusively reading class, I never gave a single test, relying instead on other forms of formative assessment to see what learning was happening and what learning needed to happen.
  3. I frequently criticized the program-wide speaking tests at my old institution because I could see how students could pass them by simply memorizing mini-speeches instead of communicating spontaneously. Before leaving, I pushed for alternative forms of speaking and writing assessment that would result in improved classroom washback and would encourage instructors to focus on skills that students would realistically use in academic situations.
  • Shohamy spends the majority of her article explaining how tests function as policy tools, both for the good and to the detriment of examinees and communities. In connection with Spolsky’s and McNamara’s chapters, Shohamy helps testers see that the impact of language tests extends beyond grades to significant societal influence. As test developers, we need to consider these issues and work with test score users to ensure responsible test use and policy application.
  1. I am experiencing this now more than ever before. At my old job, our test scores were an internal measure; students were not affected by good or poor scores outside of our program. We had been working towards greater continuity with the main campus, but only now are such issues being considered. In my current position, however, scores for the tests that we administer have a major impact on students and their sponsoring departments. I have already presented on this issue at a couple of graduate admissions meetings, and I have another one tomorrow. Our goal is to help score users understand what these scores mean and make responsible choices in admissions, assignments, and advancement that will benefit all the students involved. It’s a complex issue, and solutions are not immediate.

Read Full Post »

… it’s worth it.

I just got out of a meeting in which an ETS representative spoke to various admissions staff and ESL faculty at our institution. Having attended the language testing conference in China this summer, and then having spent my bus rides over the past four weeks reading through a book on the TOEFL validation study (the review will be posted on the robblog soon), I felt very prepared for this meeting. In fact, I was able to answer questions about TOEFL that the representative could not, given the extra research I’ve read and the conference sessions I’ve attended. There have been times over the past two months when I have questioned whether my interest and background in language testing would be useful to my current institution, but today’s meeting helped me realize that, to some degree, I am the resident expert on ESL testing issues here, and I can make a positive impact on the quality of our programs.

Read Full Post »