2006-906: THE EFFICACY OF ONGOING COURSE ASSESSMENT FOR ENHANCING STUDENT LEARNING IN STRUCTURAL DESIGN COURSES

Abi Aghayere, Rochester Institute of Technology

Dr. Abi Aghayere is a professor of civil engineering technology at RIT, and the 2004-05 recipient of RIT’s prestigious Eisenhart Award for Outstanding Teaching. He is also one of the recipients of the 2003 ASEE Best Paper Award. He received a B.S. in Civil Engineering from the University of Lagos, an S.M. in Structural Engineering from MIT, and a Ph.D. in Structural Engineering from the University of Alberta. Dr. Aghayere is a licensed professional engineer in Ontario, Canada.
students. Future work will include enhanced data sampling, a revision of interview questions, and assessment of participants’ understanding of concepts via quizzes.

I. Introduction

Over the past several decades, mentorship programs within industrial, collegiate, and K-12 professional and educational environments have been of intense interest. For example, [1-4] found that undergraduate students, and in particular women and underrepresented minority students, reported increased skills, confidence, and motivation to pursue science or engineering careers as a result of research experiences and positive relationships with mentors. In fact, women and underrepresented minorities are less likely to enter and remain in science and engineering when they do
the learning outcomes of students. Thus, there is a need for a new framework for evaluating a project-based design class that is taught using experiential learning methods. Direct measurement is almost impossible in this type of design class, as the experiential learning activity (the project) is the same as the assessment activity. This paper does not solve the problem of directly measuring learning outcomes from project-based design experiences, but presents an additional new method for evaluating this type of class. An implicit assumption of experiential learning environments is that a correlation exists between time spent on an activity and learning related to that activity. The evaluation method that we present here involves collecting data on the instructor’s
engagement by the students and, ideally, with local clients and the campus community on problems students could address. It was recommended that these design projects be tailored toward students with minimal training in calculus, chemistry, and physics. With respect to ABET student outcome criteria, the course would prioritize (d) an ability to function on multidisciplinary teams, (g) an ability to communicate effectively, (i) a recognition of the need for, and an ability to engage in, life-long learning, and (j) a knowledge of contemporary issues.

The initial offering of the ENG 3 course was to 39 students in fall 2015. During this time, the curricular materials, presentation assignments, rubrics for assessing communication skills, and a pneumatic-powered robotic arm
, International Journal of Plasticity, Materials Research Letters, and the ASME Journal of Electronic Packaging, among others.

Mr. Dan Cordon, University of Idaho, Moscow

Clinical faculty member at the University of Idaho with a teaching focus in design courses ranging from freshman introductory engineering design through the capstone experience. His technical research area is in the field of internal combustion engines and alternative fuels.

Dr. Steven W. Beyerlein, University of Idaho, Moscow

Dr. Beyerlein has taught at the University of Idaho for the last 27 years. He is coordinator of the College of Engineering inter-disciplinary capstone design course. He is also a co-PI on a DOE-sponsored Industrial Assessment Center program in
Paper ID #21325

Differences and Similarities in Student, Instructor, and Professional Perceptions of “Good Engineering Design” through Adaptive Comparative Judgment

Dr. Scott R. Bartholomew, Purdue Polytechnic Institute

Scott R. Bartholomew, Ph.D., is an assistant professor of Engineering/Technology Teacher Education at Purdue University. Previously he taught Technology and Engineering classes at the middle school and university level. Dr. Bartholomew’s current work revolves around Adaptive Comparative Judgment (ACJ) assessment techniques, student design portfolios, and Technology & Engineering teacher preparation.

Dr. Greg J
AC 2007-1259: COURSE LEVEL ASSESSMENT AND IMPROVEMENT: APPLYING EDUCATIONAL PEDAGOGY TO ABET ACCREDITATION

Kenneth Williamson, Oregon State University

Kenneth J. Williamson is presently Department Head of both Chemical Engineering and Civil, Construction, and Environmental Engineering at Oregon State University. He serves as Associate Director of the Western Region Hazardous Substance Research Center. Dr. Williamson’s research interests are in hazardous substance management and bioremediation.

Milo Koretsky, Oregon State University

Milo Koretsky is an Associate Professor of Chemical Engineering at Oregon State University. He currently has research activity in areas related to thin film materials processing and engineering
AC 2007-878: A METHODOLOGY FOR DIRECT ASSESSMENT OF STUDENT ATTAINMENT OF PROGRAM OUTCOMES

Scott Danielson, Arizona State University
Bradley Rogers, Arizona State University

© American Society for Engineering Education, 2007

A Methodology for Direct Assessment of Student Attainment of Program Outcomes

Abstract

While not directly required in Criterion 3 of the ABET accreditation criteria for engineering technology programs, some form of direct assessment of student attainment of program outcomes is generally expected. Unfortunately, direct assessment can be overlooked by program faculty, often leading to an over-reliance on indirect
AC 2007-1960: THE USE OF DIRECT AND INDIRECT EVIDENCE TO ASSESS UNIVERSITY, PROGRAM, AND COURSE LEVEL OBJECTIVES AND STUDENT COMPETENCIES IN CHEMICAL ENGINEERING

Ronald Terry, Brigham Young University

Ron Terry is a Professor of Chemical Engineering at Brigham Young University and an Associate in BYU’s Office of Planning and Assessment. His scholarship is centered on pedagogy, student learning, and engineering ethics, and he has presented/published numerous articles in engineering education. He is one of BYU’s co-investigators for the NSF-funded National Center for Engineering and Technology Education.

W. Vincent Wilding, Brigham Young University

Vincent Wilding is a Professor of Chemical Engineering at
AC 2007-2095: USING OUTCOMES-BASED ASSESSMENT AND CONTINUOUS QUALITY IMPROVEMENT PRACTICES FROM ABET PROGRAM ACCREDITATION IN INSTITUTIONAL ACCREDITATION

Susan Scachitti, Purdue University-Calumet

Susan is Associate Professor of Industrial Engineering Technology at Purdue University Calumet. She holds degrees in Industrial Engineering Technology from the University of Dayton and an MBA in Management from North Central College. She teaches and consults in TQM, Six Sigma, lean, and continuous improvement. Sue is past chair of the IE Division of ASEE and formerly served as division chair, program chair, newsletter editor, and treasurer. She has served as a TAC/ABET commissioner or alternate since
curriculum design, rather than be confined by rigid criteria. This paper offers preliminary evidence that the regular assessment of the ABET-designated outcomes has opened the eyes of our faculty to issues in student learning that may not have been considered before. While initial assessment was conducted at the disciplinary course level, improvement actions have been more far-reaching, including non-trivial course and program improvements, interdepartmental faculty collaboration, redesign of course content, and renewal of faculty interest in improved classroom pedagogy. This paper reports on the assessment-based approaches used to implement curricular change and the benefits that have resulted to date. In a broader sense, this paper proposes a model process
AC 2007-2126: USE OF QFD IN THE ASSESSMENT OF COURSE ACTIVITIES FOR LEARNING OUTCOMES

Zbigniew Prusak, Central Connecticut State University

Dr. Zbigniew Prusak is a Professor in the Engineering Department at Central Connecticut State University in New Britain, CT. He teaches courses in the Mechanical and Manufacturing Engineering Technology and Mechanical Engineering programs. He has over 10 years of international industrial and research experience in the fields of precision manufacturing, design of mechanical systems, and metrology. Dr. Prusak received his M.S. in Mechanical Engineering from the Technical University of Krakow and his Ph.D. in Mechanical Engineering from the University of Connecticut. E
2006-1374: INTERNALLY-DEVELOPED DEPARTMENTAL EXIT EXAMS V/S EXTERNALLY-NORMED ASSESSMENT TESTS: WHAT WE FOUND

Virendra Varma, Missouri Western State University

Virendra K. Varma, PhD, PE, F.ASCE, is Professor of Construction and Chairman of the Department of Engineering Technology at Missouri Western State University. He served as a Member of the TAC/ABET Commission from 1998-2003. He is a former President of ACI-Missouri, and a former President of the NW Chapter of MSPE (of NSPE). He has published and presented extensively. He is the Chair of the Construction Engineering Division of ASEE. He has held highly responsible roles in the design and construction industry ranging from a project
Session 3230

Assessment of Introduction to Engineering and Problem-Solving Course

Joni E. Spurlin, Jerome P. Lavelle, Mary Clare Robbins, and Sarah A. Rajala
Office of Academic Affairs, College of Engineering
North Carolina State University, Campus Box 7904, Raleigh, NC 27695-7904

Abstract

At North Carolina State University, the freshmen’s first course in engineering is E101
Session 3230

Standardizing Outcomes Assessment Allows Faculty to Focus on Student Learning

Gregory G. Kremer
Ohio University

I. Benefits of standard procedures and templates for assessment and continuous improvement

Outcomes assessment is best viewed as a means to an end, not a goal in itself. It is a tool meant to produce improved student learning (both in terms of what is being learned and how well it is being learned), so we must avoid the trap of spending all our time and energy on assessment and not having any left to make the
Session 1609

Student Internships: A Rich Source of Data for Assessment of Program Outcomes

Susan M. Blanchard, Peter L. Mente, and Lesley H. Hubbard
Joint Department of Biomedical Engineering at UNC Chapel Hill and NC State

I. Background

Students in the Biomedical Engineering (BME) program at NC State University have sought out summer internships, particularly the Research Experiences for Undergraduates that are sponsored by the National Science Foundation, since the program began in the mid-1990s. In addition, NC State University is fortunate to have been one of the
feedback on new ideas that arise in earlier round(s), and 3) to determine a level of proficiency expected of biomedical engineering students within each topic.

Overview of Survey

The survey comprises eighty questions divided among nineteen categories, including eleven biomedical engineering domains, four biology domains, physiology, engineering design, and mathematical/scientific pre-requisites. Within each category we ask the participant to assess his own level of expertise for that topic, after which he is asked to assess the importance/relevance of several concepts comprising that topic to a core curriculum that should be recommended for ALL undergraduate BME majors. In addition, participants have the opportunity to suggest concepts not included in
Session (Draft Doc. 2003-892)

Course Assessment Tools and Methods Utilizing Assignments, Tests, and Exams

John R. Hackworth, Richard L. Jones
Old Dominion University

I. Introduction

Until recently, course assessment methods have been relegated to simply having an instructor examine the results of assignments, tests, and exams, and make subjective determinations of how well the class is performing. This includes an “educated guess” as to whether or not students are grasping concepts being delivered in lecture classes (and supported in laboratory classes
Impact of Assessment on a BME Undergraduate Program

Thomas R. Harris, David Cordray
Vanderbilt University, Nashville, TN 37235

Introduction

Learning theory suggests that effective instruction should be “student centered, knowledge centered, assessment centered, and community centered” [1]. We have been engaged in a large study aimed at exploring and testing these concepts for biomedical engineering education: the NSF Vanderbilt-Northwestern-Texas-Harvard/MIT (VaNTH) Engineering Research Center on Bioengineering Educational Technologies. The set of concepts that have been applied to improve learning have been labeled the “How People Learn (HPL) Framework” [2]. This paper is an
Session 3230

Assessment Tracking Protocols and Design Documents as Monitoring Tools for Assessment and Evaluation of Teaching Innovations in Bioengineering

Reuben H. Fan, Betty Stricker, Sean Brophy, Ph.D.
Department of Biomedical Engineering / The Office of Innovation through Technology
Vanderbilt University, Nashville, TN 37235

Abstract

This project aims at developing methods to track the assessment and evaluation of educational practices that incorporate learning sciences and technology with
Session 3130

Outcomes Assessment: Developing an Electronic Assessment Database as a Model for Collection and Analysis of Data

Joni E. Spurlin, Sarah A. Rajala, Jerome P. Lavelle, O. Jerome Hoskins
Office of Academic Affairs, College of Engineering
North Carolina State University, Campus Box 7904, Raleigh, NC 27695-7904

Abstract

As the ABET process in each institution moves toward outcomes assessment, it pushes each program to develop and implement
Session 3230

Self-Reported Behaviors and Heuristic Beliefs About Learning and Preparing for Problem-Solving Exams

Charles F. Yokomoto, Maher E. Rizkalla, Roger Ware
Indiana University-Purdue University Indianapolis

1.0 Introduction

In this paper, we describe a study of the self-reported behaviors and heuristic beliefs of students as they relate to solving homework problems and preparing for problem-solving exams. The purpose of the study is to develop an understanding of our students and to determine if our faculty’s assessment of our students in this area is an accurate one. Often
Session 1788

Development and Initial Experience with a Laptop-based Student Assessment System to Enhance Classroom Instruction

Brophy, S. P., Norris, P., Nichols, M., and Jansen, E. D.
Department of Biomedical Engineering
Vanderbilt University

Abstract

New principles of learning and instruction highlight the need to engage students in thoughtful use of knowledge. However, engaging individual engineering students in large classrooms simultaneously can be challenging. Classroom communication systems (CCS) encourage students to apply conceptual ideas during class, by
Session 3130

A Model for the Evaluation of Innovative Engineering Courseware: Engineering an Assessment Program

Richard H. Hall, Timothy A. Philpot, David B. Oglesby, Ralph E. Flori, Nancy Hubing, Steve E. Watkins, and Vikas Yellamraju
University of Missouri – Rolla

Abstract

This paper describes a general model for assessment of instructional innovations used by the University of Missouri – Rolla’s Media Design and Assessment Laboratory and an example of the model’s application. This model is based on three themes: a) iterative assessment with on-going
Session 3130

Development of a Problem Test Bank for Linear Circuits and Its Implications for Improving Learning and the Assessment of Student Learning

Charles F. Yokomoto, Maher E. Rizkalla
Department of Electrical and Computer Engineering

I. Introduction

In this paper, we describe an on-going project that is taking place in our department whose goals are to establish some uniformity in the assessment of student learning across sections in our introductory linear circuit analysis course, to promote an understanding in the culture of our department of the different levels of cognitive
, participate at a much higher level than in live classes. Our experience is that students who are unlikely to speak out in class are quite willing to share their thoughts in the online threaded discussion. This also makes it simpler for the instructor to assess levels of participation for grading purposes. Tools such as web caucus and document posting permit students to share work and critique one another’s work. They can also view a variety of problem solutions in quantitative classes. Electronic coursepacks can augment texts and other materials. Our university library provides this useful web-based service free of charge to students. The online syllabus can contain embedded hyperlinks to Internet resources useful for student research and a real help in
fields, contains many studies that document quantitative statistical analyses assessing an instrument’s validity [3,4]; however, there is little guidance in the literature to help researchers establish whether or not the interpretation of an instrument is consistent with its intended design by identifying and understanding the thought processes that take place when participants respond to each item on the instrument. Other fields have used a variety of techniques falling under the scope of Verbal Report Methods (VRMs), in which subjects are asked to provide constant verbal feedback while performing a task [5,6]. VRMs have also been used to establish a case for the cognitive validity of various quantitative instruments, but there is little guidance in the
, equations, graphs) presented in the test items or the primary physics concepts (e.g., acceleration, force, motion) for which the items have been designed to assess achievement in engineering. In the future, we will plan studies so that more items are included on each scale. Further, we will attempt to replicate the initial findings reported in this manuscript. If correlation coefficients between scores are at best moderate, then this pattern of results has important implications for assessment, teaching, and research in engineering education. Simply, it implies that the skill set required to succeed in engineering may be multidimensional. As such, a set of various tasks or tests is needed to help students understand their profiles of strengths and
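The kind of between-scale check described above can be sketched in a few lines. This is a minimal illustration, not the authors' analysis: the scale names and all score values below are invented for demonstration, and the Pearson coefficient is computed directly from its definition.

```python
# Hypothetical illustration: correlating two sub-scale scores (e.g., a
# "representations" scale and a "physics concepts" scale) for a small
# group of students. All names and values are invented for demonstration.
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented sub-scale scores for eight students
representation_scores = [3, 5, 4, 6, 2, 7, 5, 4]
concept_scores = [4, 4, 6, 5, 3, 6, 7, 3]

r = pearson_r(representation_scores, concept_scores)
print(f"r = {r:.2f}")  # → r = 0.60
```

A moderate coefficient like this (well below 1.0) would suggest the two scales tap partly distinct skills, which is the multidimensionality argument made in the passage above.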
AC 2011-2582: SCALING THE REVISED PSVT-R: CHARACTERISTICS OF THE FIRST YEAR ENGINEERING STUDENTS’ SPATIAL ABILITY

Yukiko Maeda, Purdue University

Yukiko Maeda is an assistant professor in the College of Education at Purdue University. She received her PhD in quantitative methods in education from the University of Minnesota. Her research interests include survey and assessment design in educational research, and meta-analysis.

So Yoon Yoon, Purdue University, West Lafayette

So Yoon Yoon is a doctoral candidate in gifted education at Purdue University. She enjoys working with diverse students talented in STEM areas. Her current research interest is to scale an instrument to measure students’ spatial ability