May 31, 2012
Can An Exam Really Evaluate Student Writing?
I am pretty sure that teachers have been complaining about poor student writing since the earliest days of anything we would call a school. And it is difficult to get teachers to agree on what constitutes good writing, and especially on how to evaluate student writing.
The portfolio is a popular approach that has gained some traction, but many schools, from elementary through college, still use (or are required to use) a timed essay exam.
Doug Hesse, director of the writing program at the University of Denver, prefers grading student portfolios assembled over several semesters. “Within the writing community,” he said, “there’s a lot of wariness especially of single-measure, single-sitting exams.”
A recent article on InsideHigherEd.com looked at two colleges that differ on the role of exams in evaluating student writing. But before I comment on those examples, let me talk about Passaic County Community College's own writing exam.
The exam came about decades ago, when today's employers and faculty members were themselves college students. In PCCC's case (so the story goes), an important local businessman complained to the College President that he was getting job applicants who had graduated from PCCC and "couldn't write."
And so, the PCCC College Writing Exam (CWE) was implemented. It was seen as a way to validate that a graduating student was an adequate writer. It is a graduation requirement for all PCCC students: failing it means you don't graduate, regardless of your grade point average. As far as we know, we are the only college in NJ (two- or four-year) with this requirement. Some see that as admirable, others as foolish.
For the CWE, students are asked to write that most basic of forms - the five-paragraph essay: about 450 words on a general topic, or a topic from their major, that they have not seen before. Students are given the general theme (such as "survival," "personal health," or "elections") but not the actual question, so that they can prepare some ideas for their essay in advance.
Students type their essays on a computer that does not allow spell check, grammar check, or Internet access. They are given two hours.
Each essay is read and holistically evaluated by at least two readers: faculty and staff members from PCCC who have been trained to score the exam using a standard rubric.
The article I read discusses Old Dominion University and Hampden-Sydney College, which also decided that the best way to make sure their graduates could write was to require them to pass a writing test before graduation.
Students at Hampden-Sydney have a 33% failure rate on their first attempt at the test. That's not very different from our failure rate. The article states that faculty work with those students to "make sure they improve their writing and graduate on time."
Old Dominion University (a much larger public university) had problems with its test results. More than 600 of 42,000 seniors in the past decade finished every other degree requirement but couldn't pass the test and graduate. Neither big universities nor smaller community colleges like to see students fail to graduate over a single requirement.
So this year Old Dominion announced it was phasing the test out and instead asking students to earn at least a C in three classes -- two English courses and a writing-intensive class in their major. Current students who prefer to take the assessment still can.
It might sound like an easy way out, but the university's provost, Carol Simpson, said the assessment did a poor job of gauging writing ability. She believes writing should be evaluated in the classroom, not on a test that students take on their way out the door.
Simpson's quote in the article has been echoed many times on the PCCC campus by faculty: “Many students waited until they had completed all other requirements and then go, ‘Oh gosh, I have to take this exam.’ They didn’t prepare for it. It was a timed, stressful exam. It just wasn’t really a good example of their ability to write.”
Students don’t seem to be mourning the exam’s death. “Does one test, like a five-paragraph essay, really determine your ability to write analytically?” asked Student Body President Luis Ferreira, who passed the exam last year. “It didn’t really serve a serious purpose for me. It was just kind of a formality.”
Both colleges give students a reading on a topic; students then need to take a position and respond with a well-reasoned paper of around 500 words. The essays are graded by multiple faculty members.
Smaller and all-male Hampden-Sydney has a different approach. Before the exam, students must pass two rhetoric classes (three if they test poorly as incoming freshmen), which are capped at 14 students and focus on grammar and essay composition.
If a student fails the test, generally taken late in his sophomore year, he has two more opportunities to pass it as a junior and can seek help from writing instructors. The rare student who hasn't passed by senior year is placed into a writing-intensive course in which he is tutored and then given an alternative assessment: three essays written without a time limit.
Can one essay taken under conditions that do not resemble the way students do the majority of their academic writing really be a fair evaluation of their ability to write?