
ENGL665: Teaching Writing with Technology



Kim Reading & Thinking Notes 9/23

NCTE. (2013). Machine scoring fails the test. Retrieved from http://www.ncte.org/positions/statements/machine_scoring


This annotated bibliography of sources discussing research on and experience with machine scoring of writing confirmed what I would have guessed: machine scoring cannot match the type and quality of feedback offered by a human respondent. Three of the cited sources particularly caught my attention. Bridgeman, Trapani, and Attali (2012) had interesting findings about the differences between human and e-rater scores. The finding that "it appears that, for certain groups, essays that are well organized and developed," but are flawed in terms of "grammar, usage, and mechanics, tend to get higher scores from e-rater than human scorers" (39) really surprised me, because I would have expected it to be the other way around. I guess it goes to show that people can be distracted by surface errors when trying to assess an essay holistically. Scott (2011), while not peer-reviewed scholarship, confirmed my worst fears about machine scoring: that the machines can be tricked and can misinterpret what they are scoring based on overgeneralized assessment criteria. Wilson (2006) demonstrated how machine scoring could eliminate the possibility for creativity within writing by revising Cisneros’s piece “My Name” into a five-paragraph essay. If I had any lingering doubt about the downsides and problems of machine scoring for writing, a review of the research certainly dispelled it.


Vie, S. (2013). A pedagogy of resistance towards plagiarism detection technologies. Computers and Composition, 30, 3-15.


If I had a “preaching to the choir” reaction to the NCTE statement, I had a much more complicated reading experience with Vie’s article. I have never used Turnitin, but I have used SafeAssign, the plagiarism detection software within Blackboard. Before discussing SafeAssign and my experience with it, I want to think about Vie’s points about plagiarism detection technologies generally. I have to admit, I sympathize with Vie’s colleague questioning her about the availability of these types of technologies. When I was an adjunct, I was often teaching four or five composition and developmental writing courses a semester, and I didn’t feel like I had time to check on students’ sources. Yet I also agree with Vie that there are ways to scaffold assignments that can help prevent plagiarism (intended and unintended). For instance, I generally have an assignment prior to a research paper/documented argument that asks students to incorporate sources we have examined together in class, so I can offer guidance for those who struggle with effective and appropriate quoting and paraphrasing. Then I generally assign an exploratory essay prior to the research essay so students have to think critically about their sources before writing that essay. These methods have generally worked for me. In a recent semester, a student did not complete the scaffolding assignments, and when I read his final essay, I got a sense that something wasn’t right. The language seemed different from essays he had produced in the past. A quick Google search revealed that he had lifted his paper word for word (including the title) from a sample student paper posted on a course website for another institution. In that case the theft was so obvious that I hadn’t needed any software to find it. But I know most cases generally aren’t so obvious.


What struck me the most in Vie’s indictment of Turnitin was the discussion of copyright and fair use. I confess, I had never even thought of that, and it made me immediately want more information on SafeAssign. I have used SafeAssign only sporadically in my teaching, and I have been specifically asked to use it just once, when teaching a course for the theatre department at an institution that used it universally. I have had varying experiences with it. In the aforementioned theatre class, I found its use rather annoying: because I was asking students to close read various aspects of play scripts for their response papers, those papers used a high number of quotes, resulting in a high percentage of “unoriginal material.” I ended up having a lengthy discussion in class to assuage the fears of several students who got high percentages on the SafeAssign report and were afraid they would be accused of plagiarism. We talked about how we’re “smarter than the technology” because we know that quotation marks indicate that you are intentionally using exact phrasing, but the software doesn’t.


After this experience, any time I have used SafeAssign, I am sure to have this conversation with students upfront. I did have a positive experience using SafeAssign, however. I had used the same scaffolding techniques I described above, and a particular student had completed them relatively well. Yet when it came to the documented argument, which was submitted through SafeAssign, the report indicated that two paragraphs within the draft had been inserted almost word for word from two sources. When we talked about this in a conference, it became a great teachable moment as we worked together to paraphrase some of the ideas within those paragraphs that he had really wanted to maintain and include. In my experience, like machine scoring, plagiarism detection software is a crude technology that requires careful human use. After reading Vie’s article, however, I worried about what I had asked students to do in terms of fair use. Had I forced them to essentially give ownership of their work to a corporate entity? In doing a little research on SafeAssign, I was at least relieved to see that SafeAssign through Blackboard is institutionally specific and that Blackboard does not claim ownership of submitted papers. Even still, I find I am much more conflicted about my use of SafeAssign. Currently my composition syllabus asks students to submit their documented arguments through SafeAssign, but if I keep that requirement, I am thinking about having a frank discussion of exactly what it means and offering alternatives for those who aren’t comfortable.


Glassick, C. E., Huber, M. T., & Maeroff, G. I. (1997). Scholarship assessed: Evaluation of the professoriate. New York, NY: The Carnegie Foundation for the Advancement of Teaching.


In the last three chapters, the authors discuss documenting the four types of scholarship, offer suggestions for organizing the process of scholarship assessment, and name what they believe the qualities of a scholar should be. I appreciated that, in the chapter about the process, they applied the same six categories they had presented for good scholarship to the process of assessment itself. I also thought they made a good point about the competing values of privacy and transparency that need to be carefully balanced for a process to be successful. What struck me the most from these chapters, though, was their discussion of the qualities of a scholar, particularly courage. Most of the examples offered of courageous scholarship came from the sciences, leading me to wonder: what does courage look like in the humanities? Thinking about my own interests, I was reminded of a book I recently read, Peripheral Visions for Writing Centers. In it, McKinney (2013) examines the dominant narratives of writing centers and writing center studies, and how these narratives might obscure other visions. That decision to critically examine some of the most beloved and tightly held notions within writing centers seems to me like it could be considered courageous scholarship.

Discussion of "Kim Reading & Thinking Notes 9/23"

assessing your responses

I'm glad you are allowing yourself to respond to the pieces and then trying to assess how/why you do so. Great job being critical.

Posted on 24 September 2014, 10:47 am by Shelley Rodrigo

