Project CRISS is based in Montana. It grew out of Kalispell School District 5 and expanded under the leadership of Carol Santa and Lynn Havens and their cadre of dedicated trainers who helped spread the word. Spreading the word is actually tough to do in Montana, because there is a lot of room to spread out here. It doesn't take much of a drive out of a town or city center to lose cell coverage. And when you head east over the Continental Divide, suddenly you see rolling hills only occasionally dotted with homes. On closer inspection, those dots may not be homes but cattle. You know that scene in Lord of the Rings when Gondor lights the signal flames to send a message to Rohan? That might be a more reliable communication technique in rural Montana than cell phones. (Note: Link to the fun science blog about the beacons good as of Aug 2017.) Why am I pointing this out? Mostly because I liked the link above. My real focus here is on this news:
If you want to read more, you can check out this Missoulian article. Essentially, though, it's straightforward: many applicants for Upward Bound funds made a variety of formatting mistakes, and the U.S. DOE used those mistakes as its reason to reject the applications. The department was within its rights; the application clearly said that not following formatting guidelines could get your application rejected. But using formatting as the first way to cut applicants is probably ridiculous, and as educators we know that. This incident brings up a few things to reflect on regarding grading before you go into the 2017-18 school year:
- What do we really want to assess in an assignment? Is it the ability to follow directions? Then maybe ensuring all formatting is 100% correct (Montana only had one section that was apparently single-spaced) might be right. But if you're assessing, for instance, how well a program helps first-generation college-goers from disadvantaged communities, then maybe you need to assess differently. Assign points to evidence of the program's effectiveness, or to the research behind a new program, and then either read only up to a certain point (one page single-spaced or two pages double-spaced) OR take off a fraction of a point for formatting issues. Then it's more justifiable when certain programs are picked over others.
- When to allow re-dos? I'm a Rick Wormeli fan, so here are some links (good as of Aug 2017): Video, Redos and Retakes Done Right article, and a blog entry. If your goal is for students to be successful, then it's really hard NOT to support redos. Now, the grant application isn't an attempt to teach states/organizations anything; it's an evaluation. But it does trickle down to the students, and shouldn't the DOE be trying to help those students, even if a state forgot to change one page of its application to double-spaced? Couldn't someone have sent a quick email or made a call saying, "You have 24 hours to get us this page double-spaced or we can't grade it"? I'd bet that even if the file's owner was out on one of those hilltops, lighting a fire beacon to save Gondor, someone else in the office would sit down, retype the page, and send it. Because it's about the students.
- What about alternative assessments? Now, I'm not saying this grant application, which has to be read and scored by professionals, should allow interpretive dance as a means of demonstrating the ability to help students find success in college... though I'm guessing that would make the scorers' day. But why not allow an infographic? Or a video? Or, um, a diorama? Anyway, if you're looking for evidence, look for EVIDENCE. Score based on evidence. We want our students to learn the best way to present information, and that involves balancing their own strengths and interests with the needs and norms of their audience. For a grant application, that's probably not a diorama or a comic strip, but in our classrooms there are often lots of ways to demonstrate understanding that will keep students engaged. Take, for example, notes based on a textbook reading. I get it; it's something that has to happen. But instead of outlining the chapter, can you provide a clear list of success standards and allow students to create a Concept Map? Or have them find quotes they consider interesting or confusing and then journal in response using the vocabulary from the reading? Or write Magnet Summaries? If they have evidence that they understand what you needed them to understand, does it matter to you how they showed it? (It might for some things, but I'm talking holistically. I'd guess in most cases there are places where no, it doesn't matter.)
I'd imagine that eventually the DOE will be pressured into reassessing these applications (Note: I was right. As of Aug 2017, the DOE accepted fixed applications. Read more here.), but in the meantime, use this as a chance to reflect on what matters when you grade your students. And consider using this entire episode and the related articles in your classroom. Have students take on different perspectives, such as an overworked application scorer, a well-funded and well-evidenced organization that made a minor error, a small new group that knows it will score low but turned everything in correctly, or a student who would be impacted. What is the PROBLEM from each perspective, and what could be the causes, effects, and solutions to the issue? You could also use the article to talk about "fairness" in grading and let it lead into a discussion of your own grading policies.
Want to discuss this? Contact us at firstname.lastname@example.org or tweet your comment to us @ProjectCRISS. We'll add comments here!