Quantitative Reasoning Assessment Report
Overview and Audience
The Assessment Support Committee focused on assessing Quantitative Reasoning in AY 2018-19 using a modified rubric approved by Faculty Congress. The rubric targets three categories: (1) Analysis, (2) Calculations, and (3) Visual Representations of Data and Information (e.g., tables and graphs). The first category, Analysis, is also used to assess Critical Thinking simultaneously.
Evaluations of programs were planned for the Spring 2019 semester. As with the previous campus assessment of Quantitative Reasoning in 2014-15, a common quiz was administered in 300-level, 400-level, or capstone courses. A member of the committee from the Mathematics department redesigned the quiz with the understanding that it would be administered across all campus disciplines. Of the 30 undergraduate programs on campus, 24 completed the evaluation.
The quiz was anonymous but asked students to identify their class standing. Each question required using information provided in two graphs to determine the correct answer. The quiz's creator ranked the questions, from easiest to most difficult, as 1, 3, 2, and 4.
The full results are summarized below. Of those who participated, 93% (277 out of 295) were Juniors or Seniors.
Assessment results by Class Standing and in the Aggregate
Total Mean, Median, and Mode
Count and Percent of Students with each Possible Aggregate Score
| 0 | 1 | 2 | 3 | 4 | 3 or 4 |
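The aggregate statistics reported above (mean, median, and mode) can be derived directly from the count of students at each possible score. A minimal sketch of that calculation follows; the score counts shown here are hypothetical placeholders for illustration, not the actual assessment data.

```python
from statistics import mean, median, mode

# Hypothetical distribution of aggregate scores (0-4 correct answers).
# These counts are illustrative only; the real counts belong in the
# table above and are not reproduced here.
counts = {0: 10, 1: 25, 2: 60, 3: 110, 4: 90}

# Expand the distribution into one score entry per student.
scores = [score for score, n in counts.items() for _ in range(n)]

print(f"Students: {len(scores)}")
print(f"Mean:     {mean(scores):.2f}")
print(f"Median:   {median(scores)}")
print(f"Mode:     {mode(scores)}")
```

The same expansion also yields the "3 or 4" column as `counts[3] + counts[4]` divided by the total count.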
The results of the quiz indicate some improvement in basic skills since the last assessment. A larger share of Juniors and Seniors correctly answered a majority of the questions this year (61%) than in the 2014-15 assessment (45%). Overall, students also scored better, averaging 2.66 out of 4 questions compared to 1.31 out of 3 questions. However, there is still room for improvement: students could do the simple, one-step problems but had difficulty with the more complex ones.
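Because the two quizzes had different lengths (4 questions this year versus 3 in 2014-15), the raw averages are easiest to compare when converted to percent-correct. A small sketch of that arithmetic:

```python
# Convert a mean raw score to the percentage of questions answered
# correctly, so averages from quizzes of different lengths can be
# compared on the same scale.
def percent_correct(avg_score: float, num_questions: int) -> float:
    return 100.0 * avg_score / num_questions

this_year = percent_correct(2.66, 4)  # 2018-19 quiz: 4 questions
prior = percent_correct(1.31, 3)      # 2014-15 quiz: 3 questions

print(f"2018-19: {this_year:.1f}% correct")  # 66.5% correct
print(f"2014-15: {prior:.1f}% correct")      # 43.7% correct
```

On this common scale, the improvement is roughly 23 percentage points.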
The committee observed that weaknesses in basic quantitative reasoning may also reflect weaknesses in related skills such as reading comprehension: students appeared to have difficulty identifying the information in each question needed to answer it correctly. For example, on question two a high number of students chose b or c, which suggests a misunderstanding of the concept of ratio or proportion.
The committee recommends that departments further discuss the skills needed for student competency in this area, including understanding of visual representations of data and reading comprehension, across the curriculum.
Submitted by Lari-Anne Au
Chair of the Assessment Support Committee (AY 2017-2018)