Salt Lake City, Utah
June 23, 2018
July 27, 2018
Systems engineering and technical leadership (SETL) is a multidisciplinary practice that is as much an art as a science. While a traditional model of education can teach the fundamental body of knowledge, it is not until this knowledge is put into practice in an integrated, real-world environment that a systems engineer can develop the insights and wisdom necessary to become proficient. Organizations and enterprises not only need to improve the existing workforce so that it can keep up with the demands of the workplace, but also require a better approach to assessing and evaluating the competencies and learning of prospective and practicing systems engineering practitioners. Learning assessment is a critical component of accelerated learning: it is imperative to understand both individual learning and the efficacy of the various learning experiences. This is essential both for determining the capabilities of the learner and for enabling the continual improvement of the learning experience itself.
This paper describes a set of Automated Learning Assessment Tools (ALATs) that measure a subject's proficiency in a set of systems engineering competencies, as well as the efficacy of simulated learning experiences, through analysis of the data recorded throughout a learner's participation in a simulation experience. The vehicle used is the Systems Engineering Experience Accelerator, a new approach to developing the systems engineering and technical leadership workforce that aims to accelerate experience assimilation through immersive, simulated learning situations in which learners solve realistic problems. A prototype technology infrastructure and experience content have been developed, piloted, and evaluated. Traditionally, learning assessment has been performed through examinations and through experts' reviews of and opinions on students' work, which requires substantial effort. In addition, most approaches emphasize comparing learners' performance against that of experts, with less attention to evaluating the actual learning performance of individuals. Though simulation has been widely adopted in systems engineering education, it has yet to be used to assess learner competencies and learning performance in systems engineering and technical leadership education. The ALATs described in this paper address these issues. This paper describes the evaluation of the capabilities of these tools through their performance in a number of pilot studies. Evidence of systems engineering competencies and learning trajectories is analyzed, compared, and contrasted from the perspectives of the learner's performance, behaviors, and self-evaluation, and finally from expert assessments. The limitations and strengths of the various approaches are discussed. Finally, areas of future research in pilot studies and learning assessment tool capabilities are described.
Zhang, P., & Wade, J. P. (2018, June). Automated Assessment of Systems Engineering Competencies. Paper presented at the 2018 ASEE Annual Conference & Exposition, Salt Lake City, Utah. https://doi.org/10.18260/1-2--29840