Location: Indianapolis, Indiana
Publication Date: June 15, 2014
Conference Start Date: June 15, 2014
Conference End Date: June 18, 2014
ISSN: 2153-5965
Division: Educational Research and Methods
Page Count: 14
Page Numbers: 24.1268.1 - 24.1268.14
DOI: 10.18260/1-2--23201
Permanent URL: https://peer.asee.org/23201
Download Count: 537
Claudia Elena Vergara is a research scientist in the Center for Engineering Education Research (CEER) at Michigan State University. She received her Ph.D. in plant biology from Purdue University. Her scholarly interests include the improvement of STEM teaching and learning processes in higher education, and institutional change strategies that address the problems and solutions of educational reforms while considering the situational context of the participants involved in the reforms. She is involved in several research projects focusing on competencies-based curriculum redesign and implementation aimed at integration across curricula; increasing the retention rate of early engineering students; and providing opportunities for STEM graduate students to have mentored teaching experiences.
Mark Urban-Lurain is an associate professor and associate director of the Center for Engineering Education Research at Michigan State University. He is responsible for teaching, research, and curriculum development, with emphasis on engineering education and, more broadly, STEM education.
His research interests are in theories of cognition, how these theories inform the design of instruction, how we might best design instructional technology within those frameworks, and how the research and development of instructional technologies can inform our theories of cognition. Dr. Urban-Lurain is also interested in preparing future STEM faculty for teaching, incorporating instructional technology as part of instructional design, and STEM education improvement and reform.
Jon Sticklen is the director of the Center for Engineering Education Research at Michigan State University. He also serves MSU as director of applied engineering sciences, an undergraduate bachelor of science degree program that is highly interdisciplinary, focusing on both engineering and business. He also is a faculty member in the department of computer science and engineering. In the 1990s, Dr. Sticklen founded and led a computer science laboratory in knowledge-based systems focused on task-specific approaches to problem solving, better known as expert systems. Over the last decade, Dr. Sticklen has pursued engineering education research focused on early engineering with an emphasis on hybrid course design and problem-based learning. His current research is supported by NSF/DUE and NSF/CISE.
Michael Cavanaugh is a third-year graduate student in the anthropology department at Michigan State University. He has a B.A. in American Indian studies from the University of Wisconsin-Eau Claire and an M.A. in American Indian studies from UCLA. His Ph.D. research at MSU is focused on American Indian experiences in postsecondary education and the retention of Native students in STEM disciplines. At the Center for Engineering Education Research (CEER), he has worked on the CPACE Project (Collaborative Process to Align Computing Education with Engineering Workforce Needs) and helped to build a framework for assessing computational competencies within engineering education.
Towards Improving Computational Competencies for Undergraduate Engineering Students

The overarching goal for the Collaborative Process to Align Computing Education with Engineering Workforce Needs (CPACE) team is to redesign the role of computing within engineering programs and to develop computational competencies informed by industry needs. In CPACE I we: a) identified the computational competencies needed in the engineering workplace; and b) aligned them with Computer Science (CS) concepts to be used in curricular implementation (the CPACE computational competencies). In CPACE II our goal is to infuse computational problem-solving competencies throughout the curricula, starting with courses in Chemical and Civil Engineering. Our strategy involves using problems derived from contemporary industrial engineering practice. We collected student surveys at the beginning and end of target courses, conducted student interviews and focus groups, and gathered sample course work, e.g., final project reports and relevant homework assignments. To compare students' self-reported data regarding their computational abilities with their actual performance on classroom assignments, we developed and continue to test rubrics based on the CPACE computational competencies.

Preliminary survey analyses indicated that students who engage in project or classroom activities that encourage the use of computational skills report statistically significant gains in the use and application of important computational competencies. Importantly, both Chemical and Civil Engineering students entering their disciplinary courses at either the sophomore or junior level reported a marked drop in confidence in their ability to use and apply important computational skills (e.g., MATLAB- and Excel-related skills). One of the most challenging aspects of the implementation phase has been the lack of assessment instruments to measure student computational competencies.
Initial analyses of student artifacts using the CPACE computational competencies rubric revealed that, to understand students' computational thinking processes as they solve engineering problems, we needed to complement the artifact analyses with student interviews. We conducted a total of fifteen one-hour semi-structured interviews with students enrolled in the target courses. Our objectives were to better understand the process by which students solve engineering problems using computational tools, specifically how, when, and why computational tools are employed during the problem-solving process. Students described their thought process as they solved the engineering problem, using their course projects as a guide for both interviewer and interviewee.

Interview analyses have enabled us to more clearly identify alignments between the CPACE computational competencies [using the rubric] and the student artifacts. Hence we have a better understanding of the ways in which students use computational skills as they solve engineering problems (Table 1). A major conclusion of our study is that students have difficulties making connections between different computational tools and skills and fail to integrate their knowledge in ways that would allow them to extend what they learn in one context to new and different contexts. This points to a limited understanding of underlying computational principles. In this paper we present the results of the CPACE implementation, emphasizing the impact on students.

Table 1. Sample interview data aligned to CPACE computational competencies

Algorithmic thinking and programming: students who describe the use/function of particular computational tools and as a result demonstrate an understanding (or lack thereof) of the tool; how/why a particular tool works the way it does; references to programming, coding, debugging, design, iteration, parameterization, refinement, and utility of particular applications.*
  Exemplar quotes: "It's such a multicomponent system that to try to do it by hand would be just tedious ... went to Polymath ... hit go and it'll, it spits out a table and a graph if you want it." / "MATLAB has more built-in functions in terms of math. Also, you know, Python is a powerful language, but it will require me as a programmer to code most of the functions and math myself, and that just leaves more room for error."

Digital representation of information: examples include references to conversion, copying and pasting, migrating information, and representing data across fields.*
  Exemplar quote: "[...] there's just so many things you need to input, to get Aspen to actually work, you know, the number of stages, the diameter of the column, the feed stage, the reflux ratio..."

Limitations of technology: examples include references to tool capabilities/purposes in relation to the problem; awareness of the assumptions made when simulating a 'real world' phenomenon; assessing what/when a tool can be applied.*
  Exemplar quote: "So instead of creating an Aspen document first and like getting our whole process flow in there and then trying to get it all to work, we figured it's probably just a better idea to, to tackle each individual process first and then try to like interconnect them in Aspen."

Modeling and abstraction: examples include references to methods for representing 'real world' phenomena through computer modeling; references to representing a problem as a model (system of equations, graphs, simulations) and implementing the model to obtain a solution or solve problems (debug).*
  Exemplar quote: "From my basic knowledge, just being able to type in line then you're able to create a line which has a simple command. [...] you make a rectangle so those simple commands kind of add -- gave me a sense of what I wanted to try."

* Descriptor from the rubric used to inform the interview coding process.
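The "modeling and abstraction" and "algorithmic thinking" competencies in Table 1 can be made concrete with a small hypothetical sketch (not drawn from the study's course materials): representing a two-stream mixing problem as a system of linear equations and solving it programmatically rather than by hand, much as the interviewed students described doing with Polymath or MATLAB.

```python
# Hypothetical illustration of the "modeling and abstraction" competency:
# a two-stream mixing problem is abstracted into a 2x2 linear system and
# solved in code. The stream values and compositions are invented for
# illustration only.

def solve_2x2(a11, a12, a21, a22, b1, b2):
    """Solve the linear system [a11 a12; a21 a22] x = [b1; b2]
    by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    if det == 0:
        raise ValueError("singular system: no unique solution")
    x1 = (b1 * a22 - b2 * a12) / det
    x2 = (a11 * b2 - a21 * b1) / det
    return x1, x2

# Model: two feed streams F1 and F2 mix into 100 kg/h of product.
#   Overall mass balance:      F1 + F2 = 100
#   Component A balance:  0.8*F1 + 0.3*F2 = 55   (kg/h of A)
f1, f2 = solve_2x2(1, 1, 0.8, 0.3, 100, 55)
print(f"F1 = {f1} kg/h, F2 = {f2} kg/h")  # F1 = 50.0 kg/h, F2 = 50.0 kg/h
```

The point of the sketch is the abstraction step the competency describes: the physical problem is first expressed as equations, and only then handed to a computational tool, so the same solver generalizes to any problem of the same form.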
Vergara, C. E., Urban-Lurain, M., Sticklen, J., Esfahanian, A., McQuade, H., League, A., Bush, C. J., & Cavanaugh, M. (2014, June). Towards Improving Computational Competencies for Undergraduate Engineering Students. Paper presented at the 2014 ASEE Annual Conference & Exposition, Indianapolis, Indiana. DOI: 10.18260/1-2--23201
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2014 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015