Writing Competency 2014-2015

The Office of Institutional Assessment and Studies coordinated the 2014-2015 assessment of undergraduate competency in written communication. A faculty committee composed of representatives of the undergraduate schools provided oversight for the process, from establishment of the learning outcomes and standards for the assessment to the determination of findings and recommendations.

Individual schools and programs were invited to participate. At least one academic program in each of the eight schools with undergraduate programs, as well as programs in the three disciplinary divisions of the College, conducted assessments of student writing. Fourteen separate assessments constituted the overall assessment of third- and fourth-year students’ writing. All programs assessed the five learning outcomes by applying the same rubric, the AAC&U VALUE Rubric for Written Communication, with adjustments as needed to reflect disciplinary definitions. Each of the five outcomes was scored from 1 (minimally competent) to 4 (highly proficient), so total scores could range from 5 to 20.
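As a rough numerical illustration of this scoring scheme (not part of the report itself), the short Python sketch below sums five hypothetical outcome scores into a total and checks that each score falls in the 1-4 range.

def total_score(outcome_scores):
    """Sum the five outcome scores (each 1-4), giving a total between 5 and 20."""
    if len(outcome_scores) != 5:
        raise ValueError("expected one score per learning outcome (five in all)")
    if any(not 1 <= s <= 4 for s in outcome_scores):
        raise ValueError("each outcome score must be between 1 and 4")
    return sum(outcome_scores)

# Hypothetical paper scored 3, 3, 4, 2, 3 on the five outcomes
print(total_score([3, 3, 4, 2, 3]))  # 15, within the possible range of 5 to 20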

Student Learning Outcomes

The five learning outcomes were:

  1. Context of and purpose for writing; consideration of audience
  2. Content: development of ideas/argument, flow, appropriate use of content
  3. Genre and disciplinary conventions for writing in the academic field; organization
  4. Sources and evidence: use of credible, relevant sources to support ideas; argument
  5. Control/syntax: mechanics, tone, use of language

Standards

The following standards were established for third- and fourth-years:

  • 40% of undergraduates are expected to be highly competent;
  • 85% competent or above;
  • 100% minimally competent or above;
  • 0% not competent.

Methodology

Assessments were designed to provide information about student competency. In addition, one program (SCC) employed a pre-post design that compared the same students’ performance at the beginning of the academic year and at the end.

When samples were required, IAS employed a stratified random sampling technique to identify students. Students’ papers were de-identified to protect student confidentiality and to reduce the possibility of rater bias. To ensure the reliability of the scoring, each student paper was scored separately by two raters, and by a third rater if the scores from the first two differed substantially.
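As an illustration of this double-scoring procedure (not taken from the report itself), the Python sketch below resolves two raters' total scores for one paper and flags it for a third rating when they differ by more than a threshold. The threshold value and the rule of averaging agreeing scores are assumptions; the report says only that a third rater was used when the first two scores differed substantially.

def resolve_score(rater1, rater2, threshold=3):
    """Return (final_score, needs_third_rater) for one paper's two total scores.

    The threshold of 3 points and the averaging rule are hypothetical; the report
    does not specify how "substantially" was defined or how scores were combined.
    """
    if abs(rater1 - rater2) > threshold:
        return None, True                  # send the paper to a third rater
    return (rater1 + rater2) / 2, False    # assumed rule: average the two scores

# Hypothetical examples
print(resolve_score(14, 15))  # (14.5, False): close agreement, no third rater
print(resolve_score(9, 16))   # (None, True): large gap, adjudicated by a third rater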

Across the 14 programs, 425 students’ papers were assessed according to the rubric. Results were analyzed by learning outcome, by school, and, within the College, by discipline. In addition, each of the 14 participating programs received a customized report on its results. Faculty in each of those programs could then draw program-specific conclusions and plan improvements based on those results.
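A minimal sketch of the kind of breakdown described above, assuming a flat table of per-paper results; the field names and sample records below are hypothetical, not the report's actual data.

import statistics
from collections import defaultdict

# Hypothetical per-paper records; field names are assumptions, not the report's schema.
papers = [
    {"school": "College", "discipline": "English",   "content": 3, "sources": 2},
    {"school": "Nursing", "discipline": None,        "content": 4, "sources": 3},
    {"school": "College", "discipline": "Sociology", "content": 2, "sources": 3},
]

def mean_by(records, group_field, outcome_field):
    """Average one outcome score within each group (e.g., by school or by discipline)."""
    groups = defaultdict(list)
    for record in records:
        groups[record[group_field]].append(record[outcome_field])
    return {group: statistics.mean(scores) for group, scores in groups.items()}

print(mean_by(papers, "school", "content"))  # e.g. {'College': 2.5, 'Nursing': 4}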

List of 2014-15 Committee Members

James Seitz, co-chair, College-English
Lois Myers, co-chair, Assessment and Studies
Timothy Beatley, School of Architecture
Jon D’Errico, College-English
Charity Fowler, Batten School of Leadership and Public Policy
Lynn Hamilton, McIntire School of Commerce
Stephen Levine, School of Continuing and Professional Studies
Aaron Mills, College-Environmental Sciences
Kay Neeley, School of Engineering and Applied Science
Randall Robey, Curry School of Education
Josipa Roksa, College-Sociology
Adriana Streifer, College-English
Diane Szaflarski, School of Nursing