Writing Rubrics

Error Map

Assignment
This assignment is based on the assumption that correctly identifying a problem is a large part of the solution to the problem. Therefore, here is an opportunity for you to examine your writing habits to identify existing writing problems and then to plan strategies designed to repair those problems.

You will examine the compositions you have written so far in this class in order to identify and then analyze patterns of errors. For this assignment, error covers three general categories:

  1. any instance in your work in which your teacher noted a grammatical/mechanical problem (fragment, run-on, comma splice, agreement, dangling modifier, etc.);
  2. any comment or marking from a peer reviewer or teacher during conferences; and
  3. your own assessment of patterns you are already aware of and that you make conscious efforts to address (clarity, word choice, intro/conclusion paragraph issues, etc.).

Steps:
First, prepare a list of errors. Include at least two different writing issues from each of any three of your documents, continuing until you have a total of 10 errors on the list. Each of the 10 items will contain the following information:

  • source text title (error in journal/fact sheet/vocabulary/etc.);
  • the handbook reference for the error;
  • a label for the error (e.g. comma splice, subject-verb agreement, faulty parallel construction, spelling error, tense shift, tone/language, fragment, etc.);
  • original source text that clearly shows the error; and
  • the corrected text

Second, prepare a memo for your teacher summarizing your list. In the memo, you will identify and analyze the patterns that you found; please use the language from the handbook. In the memo, you will provide three examples from your list that reveal a pattern of errors. In addition, reflect on the strategies you have already put into place or plan to implement in order to effectively address your writing problems in subsequent compositions.

This material will count as one essay grade, so treat it as such.

Format for the list:

A. Composition 1

  1. handbook reference number (e.g., Faigley, page number)
  2. a label for the error (e.g., spelling: there/their)
  3. text from the journal which indicates the error
  4. the correct text

B. Composition 2

  1. etc. until you reach 10 errors

Format for the memo:

Date: 8 November 2006
To: Your Teacher
English 104, Section 14
From: Your Name

Re: Error map assignment

While completing this assignment, I noticed that I make quite a lot of comma errors. Etc. (Then show examples.)

Example:
While searching through my past assignments, I realized that I have a large number of misspelled words. A lot of the spelling errors were homonyms that spellcheck didn't catch. I find it very difficult to proofread my own papers because I know what I wanted to type, so I don't always read what I actually typed. One error was the word "there" instead of the possessive form "their" that I wanted to use in the sentence "Many people that get charged high risk insurance just because they have had a few unlucky accidents that may or may not have been there fault." The sentence now correctly reads, "Many people that get charged high risk insurance just because they have had a few unlucky accidents that may or may not have been their fault." When I proofread quickly I don't always catch words spelled wrong that sound the same.

Another error that I found in my interview paper is a subject/verb agreement error. In the sentence "RoundUp is one of the safest chemicals that are used to kill weeds," I used the word "are" instead of the word "is." The word "is," which is a singular verb, needs to be used in this sentence because the subject is "RoundUp," which is also singular. I used the word "are" because I thought that "chemicals" was the subject of the sentence. The verb does not relate to the word "chemicals" because it is part of a prepositional phrase. The sentence now correctly reads, "RoundUp is one of the safest chemicals that is used to kill weeds." This error is very difficult to find if you don't spend a lot of time looking at the different parts of the sentence. The grammar check on my computer also told me that I should have used the word "are."

Through this assignment I have realized that I have to pay more attention while I proofread my assignments. I have also realized that it is better to proofread a day after writing the assignment.

Adapted from Alzire Messenger

Profile and Newsletter Rubric

The following criteria determine the essay grade. Not all criteria carry equal weight: numerous grammar errors, for example, or failure to participate in the interview process yourself will lower the grade even if the other criteria are met perfectly.

Criteria
Participation in assigned topic
Interview content and quotes
Observation content, details
Reader/writer/purpose
Organization
Paragraphing, transitions
Grammar/punctuation/spelling
Use of visuals in newsletter
Other
Excellent     Good     Average     Needs Work

Explanation of Criteria

Focus on assigned topic: conducted interview and observation as assigned
Interview content and quotes: give clear sense of person and place through quotes and facts
Observation content, details: give clear sense of person and place through observed details
Reader, writer, and purpose: demonstration of relationship between you as writer, intended reading audience, and purpose of essay
Organization: strong introduction, well planned essay body, solid conclusion
Paragraphing, transitions: well-structured paragraphs and smooth transitions between ideas
Grammar, punctuation, and spelling: lack of or frequency of errors, meets course standard
Use of visuals in newsletter: used elements of good design as covered in class
Other: anything unexpected, whether good or not so good

Grading Scale

90–100 points = A
80–89 points = B
70–79 points = C
60–69 points = D
  0–59 points = F
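For instructors keeping grades electronically, the scale above is a simple threshold mapping. A minimal sketch follows; the function name is illustrative, not part of the rubric:

```python
def letter_grade(points: int) -> str:
    """Map a 0-100 point total to a letter grade using the scale above."""
    if points >= 90:
        return "A"
    if points >= 80:
        return "B"
    if points >= 70:
        return "C"
    if points >= 60:
        return "D"
    return "F"

print(letter_grade(85))  # prints "B"
```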

Grade and Comments

Visual Presentation in Class

Criteria

Use of visuals—fonts, pictures

Explanation of how essay and newsletter met criteria

Long enough to cover content

Excellent     Good     Average     Needs Work

—Michelle Ramthun, Iowa Central Community College

Communication Assessment Rubrics

Resources for Assessing Communication Activities

Types of Rubrics

Analytic Rubrics

Perhaps the most common type of assessment is one that identifies key features of a given communication task either because such features are critical to general success in the activity or because they are the pedagogical focus for particular learning. Such rubrics provide descriptive feedback rather than specific advice for student improvement. The individual factors can be weighted. Overall, an analytic rubric can reinforce valuable communication principles, suggest specific areas of strength and weakness, and provide the basis for future improvements and goal setting. No list of descriptive features, however, no matter how detailed, equates precisely to the overall communicative effect and thus should not be confused with holistic assessment.
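The weighting idea can be sketched as a small calculation. The criteria names and weights below are hypothetical examples, not values prescribed by this discussion; each criterion is scored on a four-point scale (Needs Work = 1 through Excellent = 4) and combined by weight:

```python
# A minimal sketch of analytic (weighted) rubric scoring. The criteria
# and weights here are hypothetical examples, not from this document.
weights = {
    "organization": 0.30,  # weights sum to 1.0
    "content": 0.40,
    "grammar": 0.30,
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores (Needs Work=1 ... Excellent=4)
    into a single weighted total on the same 1-4 scale."""
    return sum(weights[c] * s for c, s in scores.items())

print(round(weighted_score({"organization": 3, "content": 4, "grammar": 2}), 2))  # 3.1
```

Because "content" carries the largest weight in this sketch, a strong content score pulls the total up more than an equally strong grammar score would.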

Holistic Rubrics

When it is not possible or desirable to assess communication work based on independent features or when such features significantly overlap or interact, holistic assessment is the appropriate choice. Holistic rubrics typically focus on areas for improvement and provide qualitative feedback on designated competency levels.

Rubric Specificity

While mixing analytic and holistic assessment approaches within a single rubric is possible, such a strategy lends itself all too easily to overpenalizing students for particular weaknesses or misleadingly suggesting that the designated features constitute an exhaustive list of communication concerns. In that sense, the more detailed and precisely weighted the rubric, the more it may distort any holistic assessment.

Sometimes a rubric needs to be quite specific because the learning objectives of the assignment or the subject of the student's work dictate a narrower focus. Whenever an assignment addresses objectives not covered by any other course assignment, the rubric needs to reflect that level of specificity. When several assignments share learning objectives (broad communication concerns about purpose, context, organization, etc.), then the rubrics likewise will feature these rhetorical principles. Here, too, rubrics can be hybrids that carry forward general communication concepts from other assignments while introducing new ones specific to the current assignment. General rubrics may extend beyond the classroom to express programmatic or even institutional assessment concerns.

Assessment Resources

The number of resources on assessment can be overwhelming. Moreover, many would argue that assessment is best developed locally, not only because it is naturally situated to the individual teacher, student body, and institution, but because the very process of creating rubrics helps build and refine learning objectives within a given community. Certainly the ERIC Clearinghouse on Assessment and Evaluation [ERIC/AE] can serve as a useful resource. Its Scoring Rubrics: Definitions & Construction is a helpful starting point for those new to assessment rubrics or those seeking general resources on creating and using scoring rubrics. Internet searches can be narrowed by communication activity, subject, type of rubric, and educational level.

Selected Resources

Brookhart, S. M. (1999). The Art and Science of Classroom Assessment: The Missing Part of Pedagogy. ASHE-ERIC Higher Education Report (Vol. 27, No.1). Washington, DC: The George Washington University, Graduate School of Education and Human Development.

Chicago Public Schools (1999). Rubric Bank.

Danielson, C. (1997a). A Collection of Performance Tasks and Rubrics: Middle School Mathematics. Larchmont, NY: Eye on Education Inc.

Danielson, C. (1997b). A Collection of Performance Tasks and Rubrics: Upper Elementary School Mathematics. Larchmont, NY: Eye on Education Inc.

Danielson, C., & Marquez, E. (1998). A Collection of Performance Tasks and Rubrics: High School Mathematics. Larchmont, NY: Eye on Education Inc.

Delandshere, G., & Petrosky, A. (1998). Assessment of complex performances: Limitations of key measurement assumptions. Educational Researcher, 27(2), 14-25.

ERIC/AE (2000a). Search ERIC/AE draft abstracts.

ERIC/AE (2000b). Scoring Rubrics: Definitions & Construction.

Gay, L. R. (1987). Selection of measurement instruments. In Educational Research: Competencies for Analysis and Application (3rd ed.). New York: Macmillan.

Glendale Community College Communication Faculty. Five-Objective Speaking Rubric and Overview.

Haswell, R., & Wyche-Smith, S. (1994). Adventuring into writing assessment. College Composition and Communication, 45, 220-236.

Knecht, R., Moskal, B., & Pavelich, M. (2000). The design report rubric: Measuring and tracking growth through success. Proceedings of the Annual Meeting of the American Society for Engineering Education, St. Louis, MO.

Leydens, J., & Thompson, D. (1997, August). Writing Rubrics Design (EPICS) I. Internal Communication, Design (EPICS) Program, Colorado School of Mines.

Moskal, B. M. (2000). Assessment Resource Page.

Moskal, B. M. (2000). Scoring rubrics: What, when and how? Practical Assessment, Research & Evaluation, 7(3).

Moskal, B. M., & Leydens, J. A. (2000). Scoring rubric development: Validity and reliability. Practical Assessment, Research & Evaluation, 7(10).

Rafilson, F. (1991). The case for validity generalization. Practical Assessment, Research & Evaluation, 2(13).

Schrock, K. (2000). Kathy Schrock's Guide for Educators.

State of Colorado (1998). The Rubric.

Yancey, K. B. (1999). Looking back as we look forward: Historicizing writing assessment. College Composition and Communication, 50, 483-503.