2021 Fall ASEE Middle Atlantic Section Meeting, Virtually Hosted by the Section
November 12-13, 2021
The Accreditation Board for Engineering and Technology (ABET) and the Middle States Commission on Higher Education (MSCHE) accredit engineering degree programs. Their accreditation efforts assure the public that programs successfully prepare graduates to enter critical STEM fields in the global workforce. Engineering degree programs in higher education institutions must be supported by an ongoing assessment and evaluation process. These assessments produce quantitative and qualitative data used in evaluation processes that integrate learning-attainment goals with all assessment data. Engineering degree programs present unique organizational and logistical challenges in meeting accreditation requirements, such as integrating qualitative and quantitative data and information in a way that facilitates continuous improvement and supports inferences about student achievement of learning objectives. Programmatic success is predicated on feeding the results of programmatic evaluation back into an ongoing program control process, commonly known as "continuous improvement." This control capability requires detecting deviations from the program baseline or learning outcomes and intervening promptly to correct the problem.

In 2020, the XXX, XXX school of engineering was preparing for an accreditation visit, part of which was preparing a self-study report. A survey of other program reports indicated that a common approach to integrating data and information was to combine the various data elements (quantitative and qualitative) using weights, then set a "target" value signifying an acceptable level of student attainment. A criticism of this approach is that it is not sufficiently granular to detect early problems or trends and, more importantly, does not adequately support corrective action by the program. The XXX BSCE program did not choose this integration method.
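The weighted-combination approach criticized above can be sketched in a few lines. This is a hypothetical illustration, not the paper's method: the scores, weights, and target value are invented for the example, and real programs would draw these from their own assessment instruments.

```python
def weighted_attainment(scores, weights):
    """Combine assessment scores (0-1 scale) using weights that sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(s * w for s, w in zip(scores, weights))

# Hypothetical data: e.g., exam, project, and survey measures for one outcome.
scores = [0.85, 0.70, 0.90]
weights = [0.5, 0.3, 0.2]
TARGET = 0.75  # hypothetical "acceptable attainment" threshold

combined = weighted_attainment(scores, weights)   # 0.815
meets_target = combined >= TARGET                 # True
```

Note that a single pass/fail comparison like `meets_target` discards the trend information that would reveal a gradual decline, which is exactly the granularity criticism raised above.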
The program instead chose a method often used in the Federal government, where data sources are highly diverse and vary in quality. The Federal government, like accreditation agencies, wants to integrate all assessment data and learning objectives to make valuable inferences about programmatic success. The methodological approach in those Federal contexts is to focus on broad themes and longitudinal studies. Programmatic decisions are then based on trends, with pre-established trigger points that signal the need to intervene; this approach is well adapted to an environment such as ABET/MSCHE accreditation efforts. A literature search indicates that this approach is known in other disciplines as mixed methods. While mixed methods offer rich potential, there has not been extensive research on specific applications to problems such as academic program evaluation. The significant finding of this paper is that XXX's approach to academic program evaluation is innovative and constitutes a new application of mixed methods relevant to the engineering education community. The paper also presents recommendations for applying the mixed-methods approach in an environment where engineering degree programs conduct assessments that must meet ABET and MSCHE requirements.
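The trend-with-trigger-point idea can also be sketched minimally. This is an illustrative sketch under assumed conventions, not the program's actual evaluation rule: the trigger threshold, the consecutive-term count, and the sample data are all hypothetical.

```python
def needs_intervention(series, trigger=0.70, consecutive=2):
    """Flag when `consecutive` recent observations fall below `trigger`.

    `series` is a longitudinal list of attainment measures, oldest first.
    Returns True once the measure stays below the pre-established trigger
    point for the required number of consecutive terms.
    """
    run = 0
    for value in series:
        run = run + 1 if value < trigger else 0
        if run >= consecutive:
            return True
    return False

# Hypothetical attainment per term: a gradual decline crosses the trigger.
history = [0.82, 0.78, 0.69, 0.66]
flag = needs_intervention(history)  # True: two consecutive terms below 0.70
```

Unlike a single weighted target, the longitudinal series preserves the direction of change, so a program can intervene while a decline is still small.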
O'Connor, M. B. (2021, November). The Use of Mixed Methods in Academic Program Evaluation. Paper presented at the 2021 Fall ASEE Middle Atlantic Section Meeting, Virtually Hosted by the Section. https://peer.asee.org/38449
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2021 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015