Conference Location: Seattle, Washington
Publication Date: June 14, 2015
Conference Start Date: June 14, 2015
Conference End Date: June 17, 2015
ISBN: 978-0-692-50180-1
ISSN: 2153-5965
Division: Educational Research and Methods
Page Count: 19
Page Numbers: 26.1176.1 - 26.1176.19
DOI: 10.18260/p.24513
Permanent URL: https://peer.asee.org/24513
Download Count: 592
Brian Frank is the DuPont Canada Chair in Engineering Education Research and Development, and the Director of Program Development in the Faculty of Engineering and Applied Science at Queen's University, where he works on engineering curriculum development, program assessment, and developing educational technology. He is also an associate professor in Electrical and Computer Engineering.
Jake Kaupp (MSc ’06, PhD ’12) is an Educational Researcher and Adjunct Professor at Queen's University, Kingston, Ontario, Canada, in the Faculty of Engineering and Applied Science. His research interests include engineering education development, cultural change in higher education, development and assessment of higher-order thinking, outcomes-based data-informed continuous improvement, educational data visualization and reporting, and authentic performance-based assessment.
Natalie Simper coordinates a Queen's University research project investigating the development and measurement of general learning outcomes. Natalie comes from an Australian senior-secondary/post-secondary teaching background, with state-level experience in curriculum development, large-scale assessment, and the evaluation and assessment of outcomes-based education.
Jill Scott is Vice-Provost (Teaching and Learning) and Professor in the Department of Languages, Literatures and Cultures. She is leading a number of assessment initiatives across the university, including assessment of transferable intellectual skills, assessment of active learning spaces, and program and curriculum assessment.
Multi-method longitudinal assessment of transferrable intellectual learning outcomes

[University X, (blinded)] is part of a six-institution consortium in Canada committed to developing assessment techniques for generic learning outcomes and cognitive skills. These skills, including critical thinking, problem solving, communication, and lifelong learning, are the subject of discussion in higher education generally (e.g., the Association of American Colleges and Universities (AAC&U) Essential Learning Outcomes) and in engineering in the form of accreditation requirements. The three-year project at [University X, (blinded)] is using multiple methods to assess the longitudinal development of these skills in the engineering, humanities, physical science, and social science sectors. Methods are aimed at sustainable assessment within standard course contexts and at developing internal processes for the implementation, management, and assessment of university-wide learning outcomes that recognize and enhance disciplinary expectations.

We have taken a three-pronged approach to the longitudinal assessment of general learning outcomes across targeted programs, including engineering:

A. Assessment using standardized quantitative instruments and qualitative processes
B. Working with course instructors to align teaching, learning, and assessment of complex cognitive skills, embedding course-based “authentic problem tasks” for course grading and assessment
C. Assessment of course artefacts using meta-rubrics scored independently of course grading

The project is using multiple instruments to measure the identified learning outcomes: two standardized instruments, the Collegiate Learning Assessment Plus (CLA+) and the Critical Thinking Assessment Test (CAT); think-aloud sessions; AAC&U VALUE rubrics; and a new triangulated qualitative/quantitative measure of Transferable Learning Orientations (TLO) based on the VALUE rubric for lifelong learning and the Motivated Strategies for Learning Questionnaire (MSLQ).

In the first year of the project, a double cross-sectional study of first- and fourth-year students was conducted to pilot the instruments and identify key themes. Over 2000 first- and fourth-year students from the Faculty of Arts and Science (Psychology, Drama, and Physics) and from the Faculty of Engineering and Applied Science (Chemical Engineering, Civil Engineering, Geological Engineering, and Mechanical Engineering) have consented to participate in the project. Preliminary analysis indicates significant correlations between student scores on specific CLA+ and VALUE rubric dimensions, and significant changes in learning outcomes from first year to fourth year in engineering. The TLO was piloted among approximately 1000 students in Engineering and Applied Science over two years. The instrument was refined through quantitative analysis and focus groups, and the pilot found significant improvement in learning orientations from first to fourth year.

This paper presents results from the first year of the study, including descriptive statistics and factor analysis for the first-year and fourth-year student samples, correlations between CLA+ scores and assessments of course artefacts using VALUE rubrics, the approach to calibration and scoring, and demographic analysis. It discusses observations about instructor buy-in and feedback, and broader impact. It also presents the development and dimensions of the TLO, its alignment with the VALUE rubrics, and results of the pilot study.
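To make the correlation analysis mentioned above concrete, the sketch below shows one plausible way to relate CLA+ scores to VALUE rubric dimension scores. This is not the authors' analysis code; the file name and column names are hypothetical placeholders, and Spearman rank correlation is assumed here because VALUE rubric scores are ordinal.

```python
import pandas as pd
from scipy import stats

# Hypothetical data file: one row per student, with a CLA+ total score
# and scores on several VALUE rubric dimensions. All names are placeholders.
df = pd.read_csv("student_scores.csv")

# VALUE rubrics use an ordinal scale, so a rank-based correlation
# (Spearman) is a defensible choice for pairing them with CLA+ scores.
for dim in ["critical_thinking", "problem_solving", "written_communication"]:
    paired = df[["cla_plus_total", dim]].dropna()
    rho, p = stats.spearmanr(paired["cla_plus_total"], paired[dim])
    print(f"CLA+ total vs. VALUE {dim}: rho = {rho:.2f}, p = {p:.4f}")
```

Dropping rows with missing values pairwise (per dimension, as above) rather than listwise keeps the usable sample as large as possible when rubric coverage varies across courses.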
Frank, B. M., & Kaupp, J. A., & Simper, N., & Scott, J. (2015, June), Multi-method Longitudinal Assessment of Transferrable Intellectual Learning Outcomes. Paper presented at 2015 ASEE Annual Conference & Exposition, Seattle, Washington. 10.18260/p.24513
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2015 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015