Location: New Orleans, Louisiana
Publication date: June 26, 2016
Conference dates: June 26–29, 2016
ISBN: 978-0-692-68565-5
ISSN: 2153-5965
Division: Systems Engineering
DOI: 10.18260/p.26162
Permanent URL: https://peer.asee.org/26162
Peizhu Zhang is currently a PhD student in Systems Engineering at Stevens Institute of Technology, having earned a master's degree in Computer Science there in July 2012. His research interests include systems engineering, competency assessment, software engineering, and serious games.
Douglas A. Bodner is a principal research engineer in the Tennenbaum Institute at the Georgia Institute of Technology. His research focuses on computational analysis and decision support for the design, operation, and transformation of enterprise systems. His work has spanned a number of industries, including aerospace and defense, automotive, electronics, energy, health care, paper and pulp, semiconductors, and telecommunications. Dr. Bodner is a senior member of the Institute of Electrical and Electronics Engineers (IEEE) and the Institute of Industrial Engineers (IIE), and a member of the American Society for Engineering Education (ASEE) and the Institute for Operations Research and the Management Sciences (INFORMS). He is a registered professional engineer in the State of Georgia.
Dr. Richard Turner has forty years of experience in systems, software, and acquisition engineering in both the private and public sectors. Currently a Distinguished Service Professor and a Principal Investigator for the Systems Engineering Research Center at the Stevens Institute of Technology in Hoboken, New Jersey, Dr. Turner is active in the agile, lean, and kanban communities and was a core team author of the IEEE Computer Society/PMI Software Extension to the Guide to the PMBOK. Dr. Turner’s current research includes using kanban and service concepts to transform systems engineering and applying lean and complexity concepts to critical system development and acquisition. He is a Golden Core awardee of the IEEE Computer Society, a fellow of the Lean Systems Society, a Senior Member of the IEEE, and co-author of four books: The Incremental Commitment Spiral Model: Principles and Practices for Successful Systems and Software, Balancing Agility and Discipline: A Guide for the Perplexed, CMMI Survival Guide: Just Enough Process Improvement, and CMMI Distilled.
Mr. Arnold has over 12 years of experience in software development, systems engineering, and technical leadership in the defense industry. He has independently developed and licensed a variety of software products and has published over 30 technical papers focused primarily on fire control and mission command systems. Mr. Arnold holds a B.S. in Computer Science from Rutgers University and an M.S. in Software Engineering from Stevens Institute of Technology. He is currently a PhD candidate in Systems Engineering at Stevens, where his research focuses on systems thinking and its assessment.
Jon Wade, Ph.D., is a professor of practice at the Jacobs School of Engineering at the University of California, San Diego, where he is the director of convergent systems engineering, designing transdisciplinary education and research programs oriented around the fundamental principles of contemporary closed-loop systems engineering design. Previously, Dr. Wade was a research professor in the School of Systems and Enterprises at the Stevens Institute of Technology, where he also served as the chief technology officer of the Systems Engineering Research Center (SERC) UARC. His industrial experience includes serving as executive vice president of Engineering at International Game Technology (IGT), senior director of Enterprise Server Development at Sun Microsystems, and director of Advanced System Development at Thinking Machines Corporation. His research interests include complex systems, future directions in systems engineering research, and the use of technology in systems engineering and STEM education. Dr. Wade received his S.B., S.M., E.E., and Ph.D. degrees in electrical engineering and computer science from the Massachusetts Institute of Technology.
The Systems Engineering Experience Accelerator (SEEA) project created a new approach to developing the systems engineering workforce, one that augments traditional in-class education with educational technologies designed to accelerate the acquisition of skills and experience through immersive, simulated learning situations that engage learners with problems to be solved. Although educational technology is used in a variety of domains to support learning, the SEEA is one of the few such technologies that supports development of the systems engineering workforce.
While the existing technology infrastructure and experience content are useful, they are limited in their ability to support a community of educators and developers. The SEEA was developed with the goal of transitioning to an open-source sustainment model that will provide long-term support for a community of educators and learners in creating learning exercises to address their specific needs. Currently, it is difficult to design and develop educational content without significant knowledge of the SEEA design.
This research task focuses on developing a set of tools specifically for educators and developers outside the SEEA research and development team, to support them in designing and developing learning modules for their own use. It concentrates on a subset of possible tools, prioritized by their likely impact on facilitating module development. The tool development efforts fall into four major categories: simulation tools for building and testing simulation models that mimic the behavior and results of programs focused on system design and development; experience-building tools that provide the structure for such systems engineering experiences and the events that occur in them; learning assessment tools to measure the efficacy of an experience; and Experience Accelerator (EA) infrastructure changes to support this work.
This paper describes the capabilities of these tools and provides an evaluation of those capabilities in the update of an existing experience and the development of a number of new educational experiences. In addition, their use in learning assessment is discussed. The paper concludes with a description of our future directions.
Zhang, P., Bodner, D. A., Turner, R. G., Arnold, R. D., & Wade, J. P. (2016, June). The Experience Accelerator: Tools for Development and Learning Assessment. Paper presented at the 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana. 10.18260/p.26162
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2016 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. Last updated April 1, 2015.