Writing Across the Curriculum

A Four-Year Study of Students Writing Across the Curriculum at Notre Dame

Principal Investigators:

  • Stuart Greene, Director, University Writing Program
  • Christine Venter, Adjunct Instructor, University Writing Program
  • Amy Orr, Visiting Instructor, Sociology

Contact: Stuart Greene, Stuart.Greene.19@nd.edu

The purpose of this study is to investigate the extent to which the kinds of writing students complete in the first-year writing course at Notre Dame are compatible with the types of writing they do in different courses of study across the curriculum.  The primary questions motivating this study are: (a) Are students in first-year writing composing essays that are similar to or different from the kinds of writing students do in other disciplines? (b) What types of papers are students writing? (c) Are students writing papers more in some major fields of study than others? (d) Are they writing certain types of papers more in one area of study than another? (e) Are students writing more at certain times during their four years here than others?

Participants were chosen at random from the incoming first-year class in the fall of 1999.  Students were asked to keep all of the writing they did for each class and to maintain a record of it.  Investigators interview each participant at the beginning and end of each semester; these interviews focus on the assignments professors gave and the students' interpretations of those assignments.  The investigators are also interviewing faculty about their assignments and their expectations of students.

Outcomes

  • Every semester, we set up Portfolio Assessment Groups of our current instructors. Each group consists of approximately six instructors with varying degrees of experience in our program and is led by one person (we began with program administrators in these roles, and this year began giving the positions to second-year graduate instructors, who gain some administrative and leadership experience as well as a small monetary reward for their work). The leader calls the group together three times a semester and organizes the meetings so that every instructor has at least one of their students’ papers workshopped by the group. The object of these Assessment Groups is to give all instructors more experience and support in commenting on student papers, and to make sure all instructors are using the course’s Memo on Goals for this assessment. Reviews of these groups have been very positive: they allow instructors to get to know one another better, to learn from one another’s strategies, and, most importantly, to help guarantee that our students are receiving an equitable experience in FYC in terms of instruction, written feedback, and evaluation.
  • We have developed more detailed assessment rubrics for our course: a Memo on Goals and a Memo on Grades. These rubrics are used by all instructors and in the Portfolio Assessment Groups.
  • At the end of Spring 2003, we randomly collected 40 final portfolios (with each student’s permission, though their names, grades, and instructors were not included in our evaluation in any way) in order to obtain a sample of student writing that would help us assess how well our students are able to accomplish the course goals by the end of the semester. We developed a rubric for assessing these goals and scoring them on a scale of 1-5. We then trained and paid four outside assessors to read and score the papers, looking for such skills as a clearly developed argument, inclusion of counter-arguments, and use of scholarly sources to develop and advance the argument. This assessment was performed over a three-day period in August 2003. The numbers are still being processed, but we can already see that we are falling short in some areas of our instruction, such as teaching students how to include counter-arguments effectively. It is also clear, interestingly, that the writing our students do in mid-semester is more developed than the writing they do for the final assignment. Given this, and our sense that students may need more time to work on these final projects, a subcommittee is currently examining ways to provide more support for students during the final project, so that they can produce their best work in this final research assignment.

We have appreciated the funding that made it possible for us to design and conduct this work and to compensate the instructors and assessors who carried it out. We will continue to analyze the data from our summer assessment project. The Portfolio Assessment Groups are now a standard part of our instructor support and will remain so for the foreseeable future.