Thanks for stopping by!
Let me begin today’s post with a disclaimer. I know that grading comment banks are used by many English teachers who are eager to save time and take back their lives. I totally get it.
And I used to use them myself. Then I realized they were more of a hack for me than a game-changer for student learning.
Advocates of grading codes suggest that they save teachers time (which they do). However, I’ve found that comment banks do little to address the elephant in the room: the teachers looking for a time-saving strategy are often focusing more on summative than formative assessment, which renders these codes and pre-made comments virtually useless for student revision and learning.
The real game changer comes, as discussed in a previous post, in shifting one’s ideas about feedback itself, including the difference between feedback and grading, when feedback should happen, and what it should look like.
What are Comment Banks, Anyway?
A comment bank is simply a set of pre-written comments, or codes that stand in for them, which a teacher reuses across student papers. The approach is sometimes called “minimalist” grading, and it’s a pretty popular recommendation. And it does help teachers save time when assessing student work.
Proponents of minimalist grading techniques and comment banks, dating all the way back to Borja and Spader’s article “AWK: Codes in Grading Essays Making Essays More ‘Objective’,” say that a system of codes for commonly used comments helps students take more responsibility for revising their own work. These writers also advocate withholding the grade until a student has viewed and responded to the comments, ensuring that students actually read them and reflect on their skills instead of just chasing the final grade.
A colleague of mine who happens to like comment banks falls in line with this mentality. She says that tallying the number of times she gives a particular comment yields informative data she can use to re-teach.
Absolutely! But if the focus is trend-gathering, I see the comment numbers as an arbitrary step and one that may do more harm than good for student learning.
I’m all for scanning a group of student papers for trends, grouping students for mini-lessons or re-teaching, or directing students to screencasts and other resources and exemplars. All of these are useful to students as they revise. I’m just not convinced that spending my time writing numbers or impersonal comments is necessary for student learning.
Comment Banks = “Rubber-Stamp” Feedback
Professor Kathy Pezdek argues, in her article “Grading Student Papers: Reducing Faculty Workload While Improving Feedback to Students,” that if teachers see the feedback process as less time-consuming and daunting, they are more likely to assign writing and to give feedback that is useful.
Pezdek suggests creating a list of the “top ten” comments you find yourself leaving on student writing, making sure that these represent feedback on how students can effectively convey their ideas (rather than smaller, more nitpicky items).
She then creates codes for her top ten comments and includes this “cheat sheet” in her syllabus, using it all semester long. In addition to using these codes, she makes additional comments that are more specific to the individual student’s writing.
I think Pezdek’s solution represents a nice compromise.
If we limit our feedback to a bunch of comment bank responses or codes that seem arbitrary and impersonal to students, for whom the writing is a personal act, we teach students that there is a “right and a wrong” way to write.
This perpetuates the idea that revision is simply a matter of “fixing” pieces of a larger whole instead of prompting students to consider their overall purpose and effectiveness as writers.
This black-and-white mentality is at the heart of the critique Nancy Sommers voiced way back in 1982: “most teachers’ comments are not text-specific and could be interchanged, rubber-stamped, from text to text. The comments are not anchored in the particulars of the students’ texts, but rather are a series of vague directives that are not text-specific.” I wonder whether a bank of comments can get at the complex nuances of organization, diction, and idea development in a way that is meaningful for student revision.
She discusses interviews with students who found pre-written directives such as “think more about your audience” or “choose precise language” puzzling and, ultimately, a guessing game of figuring out what they had done “wrong” and how to “fix” it.
As a solution, Sommers proposes that teachers offer strategies for improvement instead of generalized comments which are interpreted by students as a series of abstract and impersonal “rules for composing.” She suggests that “we need to reverse this approach. Instead of finding errors or showing students how to patch up parts of their texts, we need to sabotage our students’ conviction that the drafts they have written are complete and coherent. Our comments need to offer students revision tasks of a different order of complexity and sophistication from the ones that they themselves identify, by forcing students back into the chaos, back to the point where they are shaping and restructuring their meaning.”
Whew.
That’s a mouthful.
Do pre-made comments grapple with the nuances of student writing in a way that will prompt deep revision? Not in my experience, especially for my regular-level students who would ask “what does this number mean?”, reference the code sheet to see that the number meant something like “awkward” or “use ICE,” and move on…some of them didn’t even make it that far.
Personal Response is Better
In a nutshell, Sommers advocated for what Peter Elbow would later write about: an active and holistic response to student writing as a reader with questions, something that is hard to capture in a pre-written comment bank. Elbow would add that feedback should be personalized to how the text affects you as a reader. For example, instead of “deepen analysis,” say “I tripped here because I couldn’t see how your evidence developed the idea in your thesis statement.”
7 Takeaways for Best Writing Feedback from Peter Elbow
I find a memo written by Elbow, “About Responding to Student Writing,” particularly instructive for teacher feedback practices. I’ve highlighted seven of the most interesting thoughts below.
Evaluative Vs. Descriptive Writing Feedback
Elbow’s ideas support research showing that descriptive feedback (information that helps a student grow) is more effective than evaluative feedback (a judgment about student performance, such as a letter grade, written praise or criticism, or a command).
An easy first step to take while reading is what Elbow calls the squiggle and underline: draw a wavy line underneath sections of the paper that need work, and underline sections that are strong. After this, focus on one or more of Elbow’s helpful descriptive response options.
This, of course, assumes that you still want to bear full responsibility for in-depth response to student drafts. I would suggest doing this for only a section of the paper. There are also more creative ways to build student capacity for self- and peer assessment, along with purposeful teaching through modeling and exemplars, that I’ll dig into soon.
Looking Ahead to Practical Writing Feedback: “Feedforward” Tips
I think that teachers can do better than a bank of rubber-stamped comments, and by “better,” I don’t mean that we should simply “suck it up” and invest tons of personal time. There are other ways to work “smarter, not harder” that I think are in alignment with Elbow’s invitation to make feedback about the student, about personal response and discourse (not necessarily always given by the teacher), rather than about judgment or grade-justification.
Coming up in the next installment of our grading hacks series: The Ultimate Guide to Grading in High School! In the meantime, feel free to leave me a comment about today’s post!
Anderson, R., & Speck, B. (1997). Suggestions for Responding to the Dilemma of Grading Students’ Writing. The English Journal, 86(1), 21-27. doi:10.2307/820775
Borja, F., & Spader, P. (1985). AWK: Codes in Grading Essays Making Essays More ‘Objective’. College Teaching, 33(3), 113-116. Retrieved from http://www.jstor.org/stable/27558119
Butler, R., & Nisan, M. (1986). Effects of no feedback, task-related comments, and grades on intrinsic motivation and performance. Journal of Educational Psychology, 78(3), 210-216.
Elbow, P. (n.d.). About Responding to Student Writing. Retrieved March 6, 2018, from http://peterelbow.com/pdfs/Responding_to_Student_Writing.pdf
Pezdek, K. (2009, November). Grading Student Papers: Reducing Faculty Workload While Improving Feedback to Students. Observer, 22(9). Retrieved from www.psychologicalscience.org/observer/grading-student-papers-reducing-faculty-workload-while-improving-feedback-to-students
Robertson, M. (1986). “Is Anybody Listening?”: Responding to Student Writing. College Composition and Communication, 37(1), 87-91. doi:10.2307/357385
Schmoker, M. (2006). Results Now: How We Can Achieve Unprecedented Improvements in Teaching and Learning. Alexandria, VA: Association for Supervision and Curriculum Development.
Sommers, N. (1982). Responding to Student Writing. College Composition and Communication, 33(2), 148-156. doi:10.2307/357622
Hey, if you loved this post, I want to be sure you’ve had the chance to grab a FREE copy of my guide to streamlined grading. I know how hard it is to do all the things as an English teacher, so I’m over the moon to be able to share with you some of my best strategies for reducing the grading overwhelm.
Click on the link above or the image below to get started!