AC 2011-907: ESTABLISHING INTER-RATER AGREEMENT FOR TIDEE’S TEAMWORK AND PROFESSIONAL DEVELOPMENT ASSESSMENTS
Robert Gerlick, Pittsburg State University
Dr. Robert Gerlick is Assistant Professor of Mechanical Engineering Technology at Pittsburg State University.
Denny C. Davis, Washington State University
Dr. Davis is Professor of Bioengineering and Director of the Engineering Education Research Center at Washington State University. He has led numerous multidisciplinary research projects to enhance engineering education. He currently leads projects creating and testing assessments and curriculum materials for engineering design and professional skills, especially for use in capstone engineering design courses.
AC 2011-2720: AN INSTRUMENT TO ASSESS STUDENTS’ ENGINEERING PROBLEM SOLVING ABILITY IN COOPERATIVE PROBLEM-BASED LEARNING (CPBL)
Syed Ahmad Helmi Syed Hassan, Universiti Teknologi Malaysia
Syed Helmi is a member of the academic staff in the Faculty of Mechanical Engineering and is currently a Ph.D. candidate in Engineering Education at Universiti Teknologi Malaysia.
Khairiyah Mohd-Yusof, Universiti Teknologi Malaysia
Khairiyah is an associate professor in the Department of Chemical Engineering, Universiti Teknologi Malaysia. She is presently the Deputy Director at the Centre for Teaching and Learning in UTM. Her main research areas are Process Modeling, Simulation and Control, and Engineering Education. She has been implementing
Session 1609
Rubrics Cubed: Tying Grades to Assessment to Reduce Faculty Workloads
Susan M. Blanchard, Marian G. McCord, Peter L. Mente, David S. Lalush, C. Frank Abrams, Elizabeth G. Loboa, and H. Troy Nagle
Joint Department of Biomedical Engineering at UNC Chapel Hill and NC State
I. Background
Assessment of program outcomes is an important, but time-consuming, part of the ABET accreditation process for faculty. Many faculty members argue, “I grade; therefore, I assess.” The problem with using grades as assessment tools is that grades often cover material that
Session 3130
Adaptive Model of Assessment Strategy for Information Technology and Engineering Programs
Leonid B. Preiser
Department of Technology and Information Systems
School of Business and Technology
National University
4141 Camino del Rio South, San Diego, CA 92108-4103
(619) 563-7165, fax (619) 563-7160
lpreiser@nu.edu
Introduction
This paper focuses on the methodologies and criteria leading to the development
assessment tools (tests, projects, and final examination) were prepared collaboratively. In 2001, the Switched Replications experiment was embedded in an environment where all students received hypermedia treatment throughout the semester, except for the two weeks of the experiment. During those two weeks, both instructors covered the same lecture material and the same application examples, but used different media. Thus, different instructional design could be rejected as a plausible explanation for the observed differences, both in the 1999-2000 semester-long studies and in the Switched Replications experiment in 2001.
Novelty Factor
Another alternative explanation for observed performance gains in a hypermedia-instructed group could be a novelty
student email informing them of their grade on the assignment and providing a list of errors with details referring to specific cell locations in the submitted workbook. The proposed method of automated grading requires the least set-up time of all traditional or manual grading systems and provides immediate and valuable feedback to students. The tool is free from manual errors and subjective grading, thereby providing an objective evaluation of student performance. The tool allows frequent formative assessment opportunities in addition to periodic and summative evaluations. In the current paper, a preliminary feasibility testing of the proposed method in an undergraduate MS Excel® training course with 70 students is described. Specific issues
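The per-cell checking and feedback generation described above can be sketched in a few lines. This is a hypothetical illustration, not the paper's actual tool: the workbook is abstracted here as a dictionary mapping cell references to values (a real implementation would read these from the submitted .xlsx file, e.g. with a library such as openpyxl), and the function name and tolerance parameter are assumptions.

```python
def grade_submission(submitted, answer_key, tolerance=1e-6):
    """Compare a student's cell values against an answer key.

    `submitted` and `answer_key` map cell references (e.g. "B4") to values.
    Returns a score out of 100 and a list of feedback lines naming each
    incorrect cell, mirroring the per-cell feedback the paper describes.
    Hypothetical sketch -- not the authors' published implementation.
    """
    errors = []
    for cell, expected in answer_key.items():
        actual = submitted.get(cell)
        if isinstance(expected, float) and isinstance(actual, (int, float)):
            # Numeric answers are compared within a tolerance so that
            # rounding in the student's worksheet is not penalized.
            correct = abs(actual - expected) <= tolerance
        else:
            correct = actual == expected
        if not correct:
            errors.append(f"Cell {cell}: expected {expected!r}, found {actual!r}")
    score = round(100 * (len(answer_key) - len(errors)) / len(answer_key))
    return score, errors
```

The returned `errors` list could then be pasted into the automated feedback email, giving students the specific cell locations mentioned above.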
provides the instructor with a tool to obtain a continuous and regular assessment of student performance and understanding of the material covered in class.
Purpose of the Study
The purpose of this study is to:
- Demonstrate that the proposed quiz technique results in improved student performance.
- Present a technique that can be used for the continuous evaluation of students' acquisition of knowledge in the classroom, one that reduces the occurrence of plagiarism found in the traditional homework-based method of continuous evaluation.
- Create a more interactive classroom experience as compared to the traditional lecture.
Awareness of the importance of the effect of an interactive classroom environment on
at Texas A&M University. He holds B.S., M.S., and Ph.D. degrees in aerospace engineering from Texas A&M University. His research interests include educational research, solid mechanics, experimental mechanics, microstructural evaluation of materials, and experiment and instrument design. He has been involved with various research projects sponsored by NSF, NASA, and AFOSR, ranging from education-related issues to traditional research topics in the areas of elevated-temperature constitutive modeling of monolithic superalloys and environmental effects on titanium-based metal matrix composites. His current research interests include epistemologies, assessment, and modeling of student learning, student
Division of Engineering Science at the University of Toronto. In this position, Lisa plays a central role in the evaluation, design and delivery of a dynamic and complex curriculum, while facilitating the development and implementation of various teaching and learning initiatives. Lisa is cross-appointed with the Department of Curriculum, Teaching and Learning at OISE/UT, and teaches undergraduate courses in engineering & society, and graduate courses in engineering education. Lisa completed an Undergraduate Degree in Environmental Science at the University of Guelph, and a Master’s Degree in Curriculum Studies at the University of Toronto. Research interests include teaching and assessment in engineering
ABET Evaluators Team site visit in 2013. EEET received excellent comments from ABET team chair Dr. Subal Sarkar on the display materials, which Wajid managed to completion. He is an expert in digitally integrated quality management systems for automated, student-outcomes-based academic assessment methodology. He has taught several courses on electronics, microprocessors, electric circuits, digital electronics, and instrumentation. He has conducted several workshops at the IU campus and elsewhere on outcomes assessment best practices, OBE, EvalTools® 6 for faculty, e-learning with EvalTools® 6 for students, and the ABET accreditation process. He is a member of the SAP Community and ISO 9001, a Senior Member of IEEE, IEEE
organizations such as the American Society of Engineering Education (ASEE) and the National Society of Black Engineers (NSBE). To contact Dr. Long, email: Leroy.Long@erau.edu.
© American Society for Engineering Education, 2016
Investigating First-Year Engineering Students’ Educational Technology Use and Academic Achievement: Development and Validation of an Assessment Tool
Abstract
Previous scholars have examined the use of educational technology as a strategy for improving student outcomes and skills. Generally, past studies of technology have focused on devices such as computers and cellphones or on word processing and web-based software. Students have reported positive perceptions of
AC 2007-772: WEBCT IN ASSESSMENT: USING ON-LINE E-TOOLS TO AUTOMATE THE ASSESSMENT PROCESS
Lynn Kelly, New Mexico State University
Lynn Kelly has been at NMSU since 1998 and is currently an Associate Professor in the Department of Engineering Technology in the College of Engineering. She received a Bachelor of Science in Engineering Technology from NMSU in 1988. She then went on to earn a Master of Science in Industrial Engineering from NMSU in 1994. She served three years on the Board of the Teaching Academy at NMSU. For the past three years she has been the coordinator of the distance education bachelor’s program (Information & Communications Technology, ICT) offered by the
AC 2008-1598: TC2K AND CLASSROOM ASSESSMENT: THE CASE FOR COMPREHENSIVE COURSE ASSESSMENT IN SUSTAINING CONTINUOUS PROGRAM IMPROVEMENT
David Cottrell, University of North Carolina at Charlotte
Dr. David S. Cottrell is an Assistant Professor in the Department of Engineering Technology, University of North Carolina at Charlotte. He graduated from the United States Military Academy in 1978 and retired in 2000 after more than 22 years of service with the US Army Corps of Engineers. Studies at Texas A&M University resulted in an MS degree in Civil Engineering in 1987 and a PhD in 1995. He is a registered Professional Engineer and has taught courses in statics, dynamics, mechanics of materials, graphic
AC 2009-1391: ASSESSING INFORMATION LITERACY IN ENGINEERING: INTEGRATING A COLLEGE-WIDE PROGRAM WITH ABET-DRIVEN ASSESSMENT
Donna Riley, Smith College
Donna Riley is Associate Professor of Engineering at Smith College.
Rocco Piccinino, Smith College
Rocco Piccinino is the Associate Director of Branch Libraries and Head of the Young Science Library at Smith College.
Mary Moriarty, Smith College
Linda Jones, Smith College
Linda E. Jones is the Hewlett Professor of Engineering and Director of the Picker Engineering Program at Smith College.
© American Society for Engineering Education, 2009
2006-1132: PROGRAM ASSESSMENT THE EASY WAY: USING EMBEDDED INDICATORS TO ASSESS PROGRAM OUTCOMES
Fred Meyer, U.S. Military Academy
Lieutenant Colonel Karl F. (Fred) Meyer is an Associate Professor and Civil Engineering Structures Group Director in the Department of Civil and Mechanical Engineering at the United States Military Academy (USMA), West Point, NY. He is a registered Professional Engineer in Virginia. LTC Meyer received a B.S. degree from USMA in 1984, and M.S. and Ph.D. degrees in Civil Engineering from the Georgia Institute of Technology in 1993 and 2002, respectively.
Allen Estes, U.S. Military Academy
Colonel Allen C. Estes is a Professor and Civil Engineering Program Director at the
2006-1180: A REVIEW OF LITERATURE ON ASSESSMENT PRACTICES IN CAPSTONE ENGINEERING DESIGN COURSES: IMPLICATIONS FOR FORMATIVE ASSESSMENT
Michael Trevisan, Washington State University
Dr. Mike Trevisan is professor and director of the Assessment and Evaluation Center at Washington State University. He has evaluated or provided assessment development work for numerous NSF, state agency, and school district projects.
Denny Davis, Washington State University
Dr. Denny Davis is professor of engineering at Washington State University and is co-director of the new Center for Engineering Education. He has received numerous NSF grants focused on the renewal of engineering education.
Steven Beyerlein
Paper ID #28644
Assessing an Assessment: A Case Study of the NSSE ’Experiences with Information Literacy’ Module
Ms. Debbie Morrow, Grand Valley State University
Debbie Morrow currently serves as Liaison Librarian to the School of Engineering and the other units within the Padnos College of Engineering & Computing at Grand Valley State University, to the Mathematics, Statistics, and Physics departments, and to the Honors College at GVSU. In that position her primary role is to support students in courses in her liaison areas, both in and outside of their classrooms. Helping students make connections between information
Paper ID #32132
Best 2019 Zone IV Paper: Assessing Student Assessment in a Flipped Classroom
Dr. Bryan Mealy, California Polytechnic State University, San Luis Obispo
© American Society for Engineering Education, 2020
Paper ID #24147
2018 ASEE Zone IV Conference: Boulder, Colorado, Mar 25
Assessing Student Assessment in a Flipped Classroom
Prof. Bryan James Mealy, Cal Poly State University
Bryan Mealy is an associate professor at Cal Poly State University in San Luis Obispo, California.
© American Society for
Paper ID #16346
We Assess What We Value: ”Evidence-based” Logic and the Abandonment of ”Non-Assessable” Learning Outcomes
Dr. Donna M. Riley, Virginia Tech
Donna Riley is Professor of Engineering Education at Virginia Tech.
© American Society for Engineering Education, 2016
We Assess What We Value: “Evidence-based” Logic and the Abandonment of “Non-assessable” Learning Outcomes
Abstract
This paper seeks to analyze the recent proposed changes to ABET’s baccalaureate-level program accreditation General Criteria 3 (Student Outcomes) and 5 (Curriculum) in light of
How To Assess or How Not to Assess … That is the Question
Christine Masters, Sarah Rzasa, Jill Lane, Richard Behr
The Pennsylvania State University
Abstract
Many innovations are taking place in engineering classrooms across the nation. But how do we decide if an innovation is achieving the desired outcomes? Most engineering faculty members are interested, even eager, to make improvements in the way engineering concepts are taught in their courses. But many, if not most, have little or no experience in formal educational assessment. Hopefully our experiences in assessing a new innovation incorporated into the large-enrollment statics course at Penn State during the Fall of 2004 can
AC 2010-1522: ASSESSING THE STANDARDS FOR ASSESSMENT: IS IT TIME TO UPDATE CRITERION 3?
Stephen Ressler, United States Military Academy
© American Society for Engineering Education, 2010
Assessing the Standards for Assessment: Is it Time to Update Criterion 3?
Purpose
The ABET engineering accreditation criteria specify that engineering programs must implement continuous quality improvement processes to ensure that they remain relevant and effective over time. But how does ABET ensure that its criteria remain relevant and effective over time? In 2009, the Criteria Committee of the ABET Engineering Accreditation Commission
Session 1392
Assessing Women in Engineering (AWE): Assessment Results on Women Engineering Students’ Beliefs
Rose M. Marra, Cherith Moore, Mieke Schuurman, Barbara Bogue
University of Missouri – Columbia / The Pennsylvania State University
Introduction
Women in Engineering (WIE) programs around the United States are a crucial part of our country's response to the need for more women in engineering professions1. For WIE programs to be maximally effective, they must have access to validated assessment instruments for measuring the effectiveness of their recruitment and retention activities for women in
Session 2793
Dynamic Multiple Assessment: An Instructional Method that Captures the Symbiosis of Assessment and Instruction
Tamara Balac, Daniel M. Gaines
Electrical Engineering and Computer Science Department
Vanderbilt University
Abstract
Standard instruction does not typically make effective use of assessment to improve instruction. Assessment is generally used only to assign grades to students, and no feedback is used to inform instruction. As a consequence, students may develop multiple misconceptions and fail to gain a deep understanding of the domain. Furthermore
Session 3530
Assessing New Ways of Teaching Dynamics: An Ongoing Program to Improve Teaching, Learning, and Assessment
Patricia M. Yaeger, Rose M. Marra, Gary L. Gray, Francesco Costanzo
The Pennsylvania State University
Abstract
In spring 1998, a traditional lecture- and problem-solving-based course in introductory dynamics was infused with interactive learning activities. The enhanced course, called “Interactive Dynamics,” was designed to engage students in a collaborative environment in which students have easy access to an array of technological tools (web-based simulations, spreadsheets, computa
Civil & Environmental Engineering. His work contains a unique blend of engineering education and civil engineering projects. Dr. Perry’s current work centers on understanding how students transfer their knowledge between engineering school and work. This is supplemented by his role in developing assessment techniques for two NSF-funded projects focused on the incorporation of virtual and mixed reality technology into civil engineering education. In addition, his past civil engineering research investigated the behavior of wood shear wall structures under seismic loading conditions. Dr. Perry’s expertise in both the engineering education and civil engineering domains provides him with a unique skillset that
Paper ID #36022
Student Self-Assessment Questionnaires Using Hierarchical Bloom’s Taxonomy
Prof. Ashanthi Shanika Maxworth, University of Southern Maine
Dr. Ashanthi Maxworth is currently an assistant professor in electrical engineering at the University of Southern Maine. She is originally from Sri Lanka, where she obtained her B.Sc. in Electronics and Telecommunication Engineering from the University of Moratuwa. In January 2013 she started her graduate studies at the University of Colorado Denver. She obtained her Master’s (2014) and Ph.D. (2017) in Electrical Engineering, specializing in electromagnetic wave propagation in
professional engineering experience.
©American Society for Engineering Education, 2023
Assessment Methods and Students’ Expectations: A Survey
Rajarajan Subramanian
Pennsylvania State University at Harrisburg
One of the most important objectives of classroom assessment is to benefit learning outcomes. Students should know what to expect from assessments as far as preparation for each class is concerned. Engineering courses need certain types of assessments that ascertain students’ capability for creative thinking, analytic knowledge, and out-of-the-box ideas. In this study, the students were given a questionnaire to give their opinion of what kind of assessment mode they like. The
Paper ID #37737
Establishing Metrics to Assess a Retraining Initiative
Joshua Dean
Josh Dean is an Assistant Professor in the Department of Civil and Mechanical Engineering at the United States Military Academy at West Point, NY. He is a graduate of West Point, earning a B.S. in Mechanical Engineering; he later earned an M.S. in Mechanical Engineering from Purdue University. His research interest areas include energetic materials, thermodynamics, and engineering education.
Gunnar Tamm
Dr. Gunnar Tamm has taught at West Point since 2004 within the Department of Civil and Mechanical Engineering, where he is a
Paper ID #37792
Environmentally and Socially Responsible Engineering: Assessing Student Empowerment
Natasha Andrade (Senior Lecturer)
Elisabeth Smela (Associate Dean for Faculty Affairs), Professor of Mechanical Engineering, University of Maryland
Vincent Nguyen (Senior Lecturer)
Vincent P. Nguyen is a Senior Lecturer at the University of Maryland, College Park (UMCP). He received his B.S., M.S., and Ph.D. in mechanical engineering from the University of Maryland, College Park. He is a founding member of the Environmentally and Socially Responsible Engineering (ESRE) group, who work to integrate and track
Paper ID #36897
Assessing Engineering Sketching Skills on Object Assembly Tasks
Hillary E. Merzdorf (Graduate Student)
Hillary Merzdorf is a Ph.D. candidate at Purdue University in the School of Engineering Education. Her research interests are in flexible assessment practices incorporating both traditional psychometrics and technology-based approaches, digital engineering education tools, and cognitive engineering methods for learning research.
Donna Jaison
Graduate Student at Texas A&M University.
Morgan Weaver (Graduate Research Assistant)
Kerrie A. Douglas (Assistant Professor of Engineering Education)
Dr