Baltimore, Maryland
June 25, 2023
June 28, 2023
Educational Research and Methods Division (ERM)
10.18260/1-2--42406
https://peer.asee.org/42406
Dr. Sara Kraemer is a systems engineer with deep experience working in higher education and K-12 education systems. Dr. Kraemer's program evaluation expertise, technical expert practice, and writing have focused on the application of system design principles to the fields of education and STEM. Her research experience includes critical infrastructure protection, decision support systems in education, and systems to recruit and retain educators. Dr. Kraemer is the lead technical expert and owner of Blueprint for Education, a consultancy that focuses on system design, program evaluation, and technical expert services. Her Ph.D. is in Industrial and Systems Engineering from the University of Wisconsin-Madison.
This theory and methods paper presents a human factors and systems engineering evaluation framework to support holistic and comprehensive program and course evaluation in engineering education. Program evaluation often focuses on a single factor or a small set of factors to assess program impact, which may oversimplify the context in which the program operates, leading to weak or ambiguous evaluation results. A more robust, holistic framework for evaluating programs in engineering higher education is needed.
This paper articulates the integration of the fields of human factors and systems engineering into a program evaluation framework for engineering higher education. This framework draws from human factors and systems engineering [1] and from health care programs that use human factors and systems engineering approaches in their evaluation designs [2, 3]. The levels of analysis for program evaluation may be at the course, department, division, or college levels. Further, while the framework is informed by human factors and systems engineering, its application is for evaluation in all engineering education disciplines.
The human factors and systems engineering evaluation framework for engineering programs in higher education (referred to as "the framework" in this paper) is specified in three unique ways. First, the student experience is central: the framework describes how individual factors such as motivation, cognitive load, and physiological states (including stress) influence student learning and programmatic outcomes. Individual experience is multifaceted, shaped by personal and social identities, and dependent upon situational context. Student experiences anchor the evaluation analysis in order to capture the influence of programmatic interventions on learning outcomes.
Second, the framework articulates how student experiences interact with system elements of the program as well as the broader engineering education context. The system elements are adapted from Carayon's [1] conceptualization of a work system, which includes the individual (student) experience; learning and project tasks; course schedules and assignment timelines; organizational structures such as project teams, student study groups, or tutoring; technologies such as lab equipment or specific software; and the learning environment, such as classrooms, laboratories, or field work. These elements interact with one another to shape the individual experience of the student, which in turn influences learning and programmatic outcomes.
Third, the framework aligns with the programmatic outputs and outcomes associated with the program's goals and objectives. That is, the framework does not exist in the abstract but is tightly integrated with the specific objectives of the program, course, or initiative. The framework guides the selection of programmatic outcomes and bounds the articulation of system elements to ensure that the resulting program evaluation design captures the nuance, relevance, and interplay of the elements salient to the program's design.
This framework builds the capacity of faculty members, course instructors, and program evaluators to incorporate human factors and systems engineering principles into their course and program evaluation design and planning. This paper consists of a review of the relevant research base; the resulting framework, with figures and graphics to illustrate its components; and a description of how to use the framework to plan and design a program evaluation.
Kraemer, S. (2023, June). A Human Factors and Systems Engineering Evaluation Framework for Engineering Programs in Higher Education. Paper presented at 2023 ASEE Annual Conference & Exposition, Baltimore, Maryland. 10.18260/1-2--42406
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2023 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015