
A tag-based framework for collecting, processing, and visualizing student learning outcomes


Conference

2023 ASEE Annual Conference & Exposition

Location

Baltimore, Maryland

Publication Date

June 25, 2023

Start Date

June 25, 2023

End Date

June 28, 2023

Conference Session

Framework Studies

Tagged Division

Educational Research and Methods Division (ERM)

Page Count

6

DOI

10.18260/1-2--42533

Permanent URL

https://peer.asee.org/42533


Paper Authors


Tonghui Xu, University of Massachusetts Lowell


Hsien-Yuan Hsu, University of Massachusetts Lowell (ORCID: orcid.org/0000-0003-2155-2093)


Dr. Hsien-Yuan Hsu is an Assistant Professor in Research and Evaluation in the College of Education at the University of Massachusetts Lowell. Dr. Hsu received his PhD in Educational Psychology from Texas A&M University and has a background in statistics.



Melissa Nemon, University of Massachusetts Lowell


Christopher Hansen, University of Massachusetts Lowell (ORCID: orcid.org/0000-0002-2958-3014)


John Hunter Mack, University of Massachusetts Lowell (ORCID: orcid.org/0000-0002-5455-8611)


David J. Willis, University of Massachusetts Lowell


David Willis is an Associate Professor of Mechanical Engineering at UMass Lowell. His interests are in aerodynamics and engineering education.



Abstract

The Mechanical Engineering faculty at a public, four-year comprehensive university in the Northeast region is developing and piloting a tag-based framework to systematically identify, collect, process, and visualize large volumes of student learning outcomes data for course- and program-level outcomes assessment. Student learning outcome identifier tags are applied by course instructors to link the questions on assignments, quizzes, projects, and exams to their course outcomes and the overall program outcomes. The goal of this pilot effort is to inform instruction improvement, course design, and program delivery. To support program-level outcomes assessment, the department has developed pilot rubrics aligned with ABET's Student Outcomes 1-7. Each rubric performance indicator has four performance levels, with each performance level assigned an identifier tag. For example, ABET Student Outcome 1 has six performance indicators ([ABET1a]-[ABET1f]), with each performance indicator divided into four performance levels which together form the associated tag set (e.g., [ABET1aL1], [ABET1aL2], [ABET1aL3], and [ABET1aL4], where L1 denotes the novice level and L4 the expert level). In total, there are over 50 program-level outcomes performance indicators with over 200 associated tag identifiers. The framework also allows instructors to introduce course-specific content and skills tags to identify course outcomes. Each course in the pilot has a defined tag syntax associated with a course content and skills mapping. In the initial pilot effort, Gradescope is used to apply program- and course-level tags as relevant. In the full paper, we will provide examples of the program-level outcomes tags as well as their implementation in the Gradescope tool.
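As an illustration of the tag syntax described above (a minimal sketch, not the authors' actual implementation), a tag such as [ABET1aL3] can be decomposed into its outcome, performance indicator, and performance level with a short regular expression:

```python
import re

# Hypothetical parser for tags of the form [ABET<outcome><indicator>L<level>],
# e.g. [ABET1aL3] -> Student Outcome 1, performance indicator 'a', level 3.
TAG_PATTERN = re.compile(r"\[ABET(?P<outcome>\d)(?P<indicator>[a-z])L(?P<level>[1-4])\]")

def parse_tag(tag: str) -> dict:
    """Split an ABET identifier tag into outcome, indicator, and level."""
    match = TAG_PATTERN.fullmatch(tag)
    if match is None:
        raise ValueError(f"Not a valid ABET tag: {tag!r}")
    return {
        "outcome": int(match.group("outcome")),
        "indicator": match.group("indicator"),
        "level": int(match.group("level")),  # L1 = novice ... L4 = expert
    }

print(parse_tag("[ABET1aL3]"))
# {'outcome': 1, 'indicator': 'a', 'level': 3}
```

A parser of this kind would let downstream tooling aggregate Gradescope tag exports by outcome or indicator without manual bookkeeping.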

The tag data collected from grading a given assessment is de-identified, cleaned, and entered into a SQL server database. This data is then processed in a Python-based visualization platform (V-TAG). V-TAG exploits Python plotting libraries to create course- and program-level interactive visualizations that inform instructors of students' formative and summative performance in specific skill areas. The V-TAG plots (e.g., heatmap, wind rose, sunburst, and polar bar charts) present data to facilitate data-driven decision-making. For example, instructors could interrogate the aggregate tag assessment data to tailor their teaching as well as to redesign class activities in future offerings to increase student learning. Similarly, the program-level student outcomes tags can be examined and interrogated to understand formative and summative performance within the curriculum and help identify curriculum redesign opportunities. Ultimately, the ability to collect, process, and visualize larger amounts of student learning outcomes data enables both courses and the program to use higher-volume assessment data to drive continuous improvement decisions.

Xu, T., Hsu, H., Nemon, M., Hansen, C., Mack, J. H., & Willis, D. J. (2023, June). A tag-based framework for collecting, processing, and visualizing student learning outcomes. Paper presented at the 2023 ASEE Annual Conference & Exposition, Baltimore, Maryland. 10.18260/1-2--42533

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2023 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.