Assessment

[Figure: bar graph of the ES Program learning goals]

High-Quality Assessment

The Essential Studies Program assessment process uses “performance tasks” to collect consistent information from students across campus for most of the ES Program learning goals. This process has been featured in the article

Designing Effective Classroom Assignments: Intellectual Work Worth Sharing
By Pat Hutchings, Natasha A. Jankowski, and Kathryn E. Schultz
Change: The Magazine of Higher Learning, Vol. 48, Iss. 1, 2016

and an example of the type of performance task used is available at https://www.assignmentlibrary.org/assignments/559c3986afed17c65f000003

Data Summaries

A total of 86 faculty and staff scorers interacted with approximately 350 students at the Spring 2017 UNDergraduate Showcase poster session.

Data quality considerations and assessment methodology:

  • A pre-scoring “norming” process was used to achieve scorer consistency.
  • Scorers were randomly assigned to interact with students to avoid systematic scoring biases.
  • The scoring sample came from a subset of the total population of ES capstone students, and thus was not a true random sample.
  • Only students scored by at least two independent scorers were considered for the final analysis.
  • In the large majority of cases, the spread between scorers’ results for the same student was 2 points or less (out of a maximum of 6); this filtering is illustrated in the sketch below.
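
As a rough illustration of these data-quality filters, the sketch below (hypothetical Python, not the ES Program’s actual analysis code) groups raw scoring records by student, drops students scored by fewer than two independent scorers, and computes each student’s scorer spread on the 0–6 scale; the record format and field names are assumptions made for illustration.

    from collections import defaultdict

    def filter_by_scorer_agreement(records, max_spread=2):
        """Group raw scoring records by student, drop students scored by fewer
        than two independent scorers, and compute each student's scorer spread."""
        by_student = defaultdict(list)
        for student_id, scorer_id, score in records:
            by_student[student_id].append((scorer_id, score))

        results = {}
        for student_id, scored in by_student.items():
            scorers = {scorer_id for scorer_id, _ in scored}
            if len(scorers) < 2:  # require at least two independent scorers
                continue
            scores = [score for _, score in scored]
            spread = max(scores) - min(scores)  # disagreement between scorers
            results[student_id] = {
                "scores": scores,
                "spread": spread,
                "within_tolerance": spread <= max_spread,
            }
        return results

    # Hypothetical records: (student_id, scorer_id, score on a 0-6 scale).
    records = [("s1", "a", 5), ("s1", "b", 4),
               ("s2", "a", 2), ("s2", "c", 5),
               ("s3", "b", 6)]
    print(filter_by_scorer_agreement(records))
    # "s3" is dropped (only one scorer); "s2" has a spread of 3, outside the 2-point tolerance.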

[Figure: oral communication pie chart]

Summary of results

For results with the highest possible reliability and validity, we focus on the 836 scoring instances with a maximum scorer spread of 0 or 1:

[Figure: oral communication scores]

Interpreting the results:

  • These results suggest UND students’ oral communication abilities are rarely “unsatisfactory,” with more than 2/3 scoring as “accomplished.”
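
To make the summarization step concrete, here is a minimal, hypothetical sketch of how a category breakdown like the one above could be computed from the high-reliability subset (scorer spread of 0 or 1). The score-to-category cut points used here are illustrative assumptions, not the ES rubrics’ actual bands.

    from collections import Counter

    def category(score):
        # Hypothetical banding of the 0-6 scale into the three reported categories.
        if score <= 2:
            return "unsatisfactory"
        if score <= 4:
            return "developing"
        return "accomplished"

    def summarize(filtered, max_spread=1):
        """Tally the share of scores in each category, keeping only students
        whose scorer spread is 0 or 1 (the high-reliability subset)."""
        counts = Counter()
        for info in filtered.values():
            if info["spread"] > max_spread:
                continue
            counts.update(category(s) for s in info["scores"])
        total = sum(counts.values())
        return {name: round(100 * n / total, 1) for name, n in counts.items()}

    # `filtered` has the shape produced by filter_by_scorer_agreement() above.
    filtered = {"s1": {"scores": [5, 4], "spread": 1},
                "s2": {"scores": [2, 5], "spread": 3}}
    print(summarize(filtered))  # {'accomplished': 50.0, 'developing': 50.0}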

A total of 20 faculty and staff scored approximately 85 students’ written work from a performance task developed by UND faculty and focused on the ES Written Communication learning goal. Students completed their work during the February 2016 Assessment Week.

Data Quality Considerations and Assessment Methodology:

  • A pre-scoring “norming” process was used to achieve scorer consistency.
  • Scorers were randomly assigned to assess students’ work to avoid systematic scoring biases.
  • The scoring sample came from a subset of the total population of ES capstone students, and thus was not a true random sample.
  • Only students scored by at least two independent scorers were considered for the final analysis.
  • In the large majority of cases, the spread between scorers’ results for the same student was 2 points or less (out of a maximum of 6).

[Figure: written communication pie chart]

Summary of results

  • For results with the highest possible validity, we focus on the 183 scoring instances of ES capstone students with a scorer spread of 0 or 1:

[Figure: written communication chart]

Interpreting the results

  • There is clear room for improvement, as only approximately 30% of students produced work in the “accomplished” category.

A total of 22 faculty and staff scored approximately 100 students’ written work from a performance task developed by UND faculty and focused on the ES Diversity learning goal (the goal’s name and framework before it was revised to become Intercultural Knowledge & Skills). Students completed their work during the February 2016 Assessment Week.

Data Quality Considerations and Assessment Methodology:

  • A pre-scoring “norming” process was used to achieve scorer consistency.
  • Scorers were randomly assigned to assess students’ work to avoid systematic scoring biases.
  • The scoring sample came from a subset of the total population of ES capstone students, and thus was not a true random sample.
  • Only students scored by at least two independent scorers were considered for the final analysis.
  • In the large majority of cases, the spread between scorers’ results for the same student was 2 points or less (out of a maximum of 6).

[Figure: intercultural pie chart]

Summary of results:

  • For results with the highest possible validity, we focus on the 270 scoring instances of ES capstone students with a scorer spread of 0 or 1:

[Figure: intercultural scores]

Interpreting the results:

  • These results were discouraging, as only approximately 15% of students produced work in the “accomplished” category, and nearly half scored in the “unsatisfactory” category.
  • Based on the evidence provided by these results, the ES requirements in this area have been revised to strengthen course criteria and expectations.

A total of 15 faculty and staff scored approximately 120 students’ written work from a performance task developed by UND faculty and focused on the ES Quantitative Reasoning learning goal. Students completed their work during the February 2015 Assessment Week.

Data Quality Considerations and Assessment Methodology:

  • A pre-scoring “norming” process was used to achieve scorer consistency.
  • Scorers were randomly assigned to assess students’ work to avoid systematic scoring biases.
  • The scoring sample came from a subset of the total population of ES capstone students, and thus was not a true random sample.
  • Only students scored by at least two independent scorers were considered for the final analysis.
  • In the large majority of cases, the spread between scorers’ results for the same student was 2 points or less (out of a maximum of 6).

[Figure: quantitative reasoning pie chart]

Summary of results:

  • For results with the highest possible validity, we focus on the 363 scoring instances of ES capstone students with a scorer spread of 0 or 1:

[Figure: quantitative reasoning scores]

Interpreting the results:

  • Roughly 45% of students scored in each of the “accomplished” and “developing” categories, indicating clear room for improvement.
  • A previous quantitative reasoning assessment showed a greater percentage of students in both the “unsatisfactory” and “accomplished” categories; the change between the two assessments was therefore mixed, with fewer students at the bottom of the scale but also fewer at the top.

In February 2018, senior students (n=171) in ES Capstone courses volunteered to take a specially designed “performance task” that presented them with a scenario asking them to produce work focused on the ES Information Literacy (IL) learning goal. The task was designed by UND faculty members to determine the level of accomplishment of UND students relative to this aspect of the ES Program. The task was aligned with the Association of American Colleges & Universities’ VALUE rubric for IL. In May 2018, faculty and staff (n=28) participated in a “scoring session” in which they assessed the students’ work from February 2018. The results from the scoring session are summarized below.

[Figure: information literacy pie chart]

Data Summaries

In February 2017, senior students (n = 240) in ES Capstone courses volunteered to take a specially designed “performance task” that presented them with a scenario asking them to produce work focused on the ES Critical Inquiry & Analysis learning goal. The task was designed by UND faculty members to determine the level of accomplishment of UND students relative to the ES CI&A learning goal. The task was aligned with both UND’s ES CI&A criteria and UND’s CI&A Assessment Rubric. In December 2017, faculty and academic staff (n = 28) participated in a “scoring session” in which they assessed the students’ work from February 2017. The results from the scoring session are summarized below for the 195 student work products that were scored the requisite number of times (twice in most circumstances, and a third time when the first two scorings disagreed substantially).

[Figure: critical inquiry pie chart]
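
As an illustration of the “requisite number of scorings” rule described above, here is a minimal, hypothetical sketch (not UND’s actual workflow): each work product is scored twice, and a third scoring is requested only when the first two disagree substantially. The 2-point disagreement threshold and the averaging of final scores are assumptions made for illustration.

    def needs_third_scoring(first, second, substantial_disagreement=2):
        """Return True when a third, tie-breaking scoring is required.
        The 2-point threshold is an assumption, not a documented cutoff."""
        return abs(first - second) >= substantial_disagreement

    def final_score(scores):
        """Combine the available scorings once the requisite number is reached."""
        return sum(scores) / len(scores)

    # Example: scores of 2 and 5 disagree substantially, so a third scoring is added.
    scores = [2, 5]
    if needs_third_scoring(*scores):
        scores.append(4)  # hypothetical third scorer
    print(round(final_score(scores), 2))  # 3.67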

Rubrics

ES Written Communication Rubric

ES Oral Communication Rubric (updated 2015)

ES Quantitative Reasoning Rubric

ES Information Literacy Rubric

ES Intercultural Knowledge & Skills Rubric

ES Critical Inquiry & Analysis Rubric (November 2016)

Each rubric assesses a particular learning outcome. The rubrics can serve multiple uses:

    • Use as is, to assess student work.
    • Use in a modified form to better match the instructor’s or department’s intentions.
    • Use as a model for designing a different rubric.