AC 2011-551: DESIGN OF A SUSTAINABLE PROCESS FOR UNDERGRADUATE CURRICULUM REFORM, DEVELOPMENT AND ASSESSMENT: A CHEMICAL ENGINEERING CASE STUDY
Larissa V. Pchenitchnaia, Texas A&M University
Dr. Larissa V. Pchenitchnaia is a Curriculum Renewal Specialist in the Artie McFerrin Department of Chemical Engineering at Texas A&M University. Dr. Pchenitchnaia has a Ph.D. in educational administration (higher education). Her professional interests include faculty professional development, curriculum development, and assessment of teaching practices and learning outcomes. She can be reached at larissap@tamu.edu.
Lale Yurttas, Texas A&M University
Biodata for Dr. Lale Yurttas: Lale Yurttas received her Ph.D
unintended consequence of built-in obsolescence. The ineffectiveness of many designs has been rooted in a static view of learning and teaching styles, personnel-dependence, an inability to manage changes in program size, and/or a lack of portability and adoption by the larger educational community. To avoid these specific pitfalls in our design for educational enhancement, we are: (1) employing a dynamic view of learning and teaching styles where the characteristics of students and faculty are periodically measured to establish an assessment process calibration, (2) using knowledge management systems to process voluminous data collection and analysis in an efficient and flexible manner, (3) using a modular design of an established assessment paradigm that
AC 2010-226: A HOLISTIC APPROACH FOR STUDENT ASSESSMENT IN PROJECT-BASED MULTIDISCIPLINARY ENGINEERING CAPSTONE DESIGN
Mark Steiner, Rensselaer Polytechnic Institute
Junichi Kanai, Rensselaer Polytechnic Institute
Richard Alben, Rensselaer Polytechnic Institute
Lester Gerhardt, Rensselaer Polytechnic Institute
Cheng Hsu, Rensselaer Polytechnic Institute
© American Society for Engineering Education, 2010
A Holistic Approach for Student Assessment in Project-based Multidisciplinary Engineering Capstone Design
Abstract
A capstone design course involves multiple variables and complexities which make its teaching conspicuously challenging1,2; e.g., sponsors
More specifically, this project is expected to provide useful insights into several key PBL problems, including: 1) how to select appropriate programming problems to ensure the breadth of content covered, 2) how to balance teaching and students' self-directed study in programming courses, and 3) how to enhance the guided PBL model based on both qualitative and quantitative evaluation to improve students' MTP programming skills.
• Assess the effectiveness of the developed PBL-based multicore programming course for students with diverse backgrounds. As one of the earliest courses
AC 2011-1179: A STREAMLINED APPROACH TO DEVELOPING AND ASSESSING PROGRAM EDUCATIONAL OBJECTIVES AND PROGRAM OUTCOMES
Christa Moll Weisbrook, University of Missouri
Dr. Christa M. Weisbrook, P.E., is a Faculty Fellow in the University of Missouri System Office of Academic Affairs, where she is involved in program review and assessment, course redesign, and collaborative programs initiatives. Prior to this appointment, she served as the special assistant to the provost and lecturer in engineering management at Missouri University of Science and Technology and the assistant dean for academic programs for the College of Engineering at the University of Missouri. Dr. Weisbrook earned BS and PhD degrees in mechanical and
AC 2010-1970: REFINEMENT AND INITIAL TESTING OF AN ENGINEERING STUDENT PRESENTATION SCORING SYSTEM
Tristan Utschig, Georgia Institute of Technology
Dr. Tristan T. Utschig is a Senior Academic Professional in the Center for the Enhancement of Teaching and Learning and is Assistant Director for the Scholarship and Assessment of Teaching and Learning at the Georgia Institute of Technology. Formerly, he was Associate Professor of Engineering Physics at Lewis-Clark State College. Dr. Utschig has regularly published and presented work on a variety of topics including assessment instruments and methodologies, using technology in the classroom, faculty development in instructional design, teaching
cases, implying that this metric doesn't identify which statics instructor is better at preparing students for subsequent courses. Although the correlations are weak, trends are discernible: students who succeed in passing statics taught by an instructor who has a reputation for being more rigorous do better in the follow-on courses. At best, the grade-based correlation metric explains up to 25% of the future grade success in follow-on engineering courses for the most effective statics instructors.
Introduction
There is much discussion of the need to continuously improve our programs, curriculum, and courses1. The improvement is driven by assessments, evaluations, and feedback from both inside and outside the college. Feedback from employers
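A brief numerical aside on the 25% figure above, assuming (the fragment does not state it explicitly) that "explains up to 25%" refers to the share of variance in follow-on course grades accounted for by a simple Pearson correlation with statics grades. Explained variance is the square of the correlation coefficient,

$R^2 = r^2 = 0.25 \;\Rightarrow\; r \approx 0.5$,

so even for the most effective statics instructors the grade-based metric corresponds to only a moderate correlation.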
strategy to give feedback to the students has been a long and error-prone process. Attending a workshop7 at the 2007 FIE conference helped to build a basic evaluation form, which was refined over the following years. The workshop did not give the author enough confidence to use the full assessment tool provided by the presenters; therefore, only a very small portion has been adapted to the course at the University of Kentucky. From this perspective the author probably fits very well within the description of workshop attendees given by Montfort et al.8 In the next sections the author will describe the developed outlines, self/peer evaluation, and grading rubrics as used in the one-semester senior design course at the University of Kentucky. In the paper itself
various activities including doing research, writing technical papers, making technical presentations, participating in field trips, writing field trip reports, and creating posters for technical sessions. Students were required to attend weekly tutorial sessions where they were taught research methodologies and how to write technical papers and make technical presentations. The students were also required to participate in a final poster competition. Throughout the program, we assessed students' progress and provided them with feedback. The students also participated in various types of surveys during the program. We used the survey results throughout the program to improve the tutorial sessions so that students could get better research and learning
Concept-based Instruction and Personal Response Systems (PRS) as an Assessment Method for Introductory Materials Science and Engineering
Maura Jenkins and Edward K. Goo
University of Southern California
Abstract
Personal response systems (PRS) are gaining in use as a method to engage students in large science and engineering lectures. Faculty pose questions to the class mid-lecture and receive immediate feedback via remote-control “clickers” as to whether students understand the underlying concepts necessary to solve problems on homework and exams. Thus, the pace of the lecture can be adjusted accordingly to focus on the most difficult concepts. This method has
Using a Materials Concept Inventory to Assess an Introductory Materials Class: Potential and Problems
William Jordan, Henry Cardenas, and Chad B. O’Neal
College of Engineering and Science
Louisiana Tech University
Ruston, LA 71272
ABSTRACT
In every engineering course there is a concern about how much the students are actually learning. The physics community has addressed this through the development of an assessment instrument called the Force Concept Inventory. More recently this has been expanded to the development of Engineering Concept Inventories. Universities affiliated with the N.S.F.-sponsored Foundation Coalition
Use of Web-based Portfolios to Assess the Technical Competencies of Engineering Technology Students: A Case Study
Sohail Anwar
The Pennsylvania State University, Altoona College
Jo-Ann Rolle and Altaf A. Memon
School of Business and Technology, Excelsior College
Abstract
On-line instruction is becoming a key component of numerous academic programs, largely as a result of the Internet and the proliferation of personal computers in offices and homes. Every day, more and more educational institutions are introducing new on-line courses. Computer and telecommunication technological advances have provided alternatives to the traditional
Session 3642
Evaluation and Outcomes Assessment During the Semester: Putting Course Learning Objectives to Work
David S. Cottrell
Pennsylvania State University at Harrisburg
I. Introduction
In recent years, much has been written about the requirement to perform outcomes and objective assessments to evaluate the strengths of ABET-accredited programs in all engineering disciplines including engineering management. In particular, the criteria for accrediting engineering technology programs stipulate that programs must demonstrate that graduates have a commitment to quality
evidence that the results are applied for ongoing program improvement. Plans for continuous improvement are a part of the current criteria, but the emphasis on continuous improvement is increasing. Current programs may be weak in outcomes assessment and the feedback element under the new criteria. The new TAC of ABET criteria are less specific and thus more flexible. This will allow more diversity among engineering technology programs. Controls must be in place to ensure that program changes are truly improvements and that academic programs are not continuously disrupted by many poorly planned changes. Changes developed with good intentions may yield unforeseen deleterious effects. Programs having identical or similar titles may serve different student
Education
The purpose of the present paper is to describe the process that was put into place to ensure the initial and on-going measurement of data to provide evidence that a-k outcomes were being achieved in each of the ten accredited programs in the Clark School. Now more than two years since the initial visit, we will also describe our efforts to keep programs involved in the assessment process, highlighting the Mechanical Engineering department's work.
II. The Assessment Process at Maryland
The A. James Clark School of Engineering of UMCP is a leader in undergraduate education. Highly ranked in the US NEWS and WORLD REPORT, the Clark School prides itself on innovation in the classroom, beginning with the first course. ENES 100 is a first
a shorter subset of the graded report, allowing students to practice summarization of crucial information for a busy audience in the workplace. This shorter report contains final recommendations to the company, and the student group's method of solution, reasoning, and supporting calculations that led to the recommendations. Students also present a one-hour seminar at the sponsoring company detailing their design solution, and a 30-minute oral presentation at a University-sponsored senior design colloquium. As a part of the evaluation and assessment activities required to meet ABET EC2000 criteria, a panel of practicing engineers from the sponsoring company will return to the faculty advisor written comments and suggestions about the strengths and
Session 2609
Exploring an Electronic Polling System for the Assessment of Student Progress in Two Biomedical Engineering Courses
Robert J. Roselli, Sean P. Brophy
Department of Biomedical Engineering / The Learning Technology Center
Vanderbilt University, Nashville TN 37235
Abstract
Monitoring students' understanding as part of course lectures has the potential to increase student engagement, facilitate modification of instruction so it targets learners' needs, and increase students' overall learning of the course materials. Classroom Communications Systems (CCS) provide
The new criteria specify that engineering programs should seek to continuously improve their degree offerings through an ongoing assessment process that includes constituent input. Our own department has specified undergraduate alumni as one of the prime or key constituents that will be queried for input into our own processes. For the first time in our history, we sent a detailed survey to all of our undergraduate alumni. We also solicited salary information that could be submitted anonymously. The results from the survey will be presented and discussed.
Introduction
The Department of Engineering Management at The University of Missouri-Rolla (UMR) has been in existence for over thirty years, and was among the first degrees of its kind in the United
teaching institutions and, because of their respective traditions, are an extremely good match for each other. The development and implementation of the exchange program required a strong commitment and flexibility from both institutions to make it work. The FHL, in particular, was first required by law to gain permission from the relevant German government academic accrediting agencies to implement the exchange program, and then the FHL converted the agreed-upon classes into English-taught courses. Significant issues had to be addressed regarding grade conversion between the German and US systems, transcript entries, mapping of courses taken between the institutions' curricula, and assessment processes.
German Educational System
Germany has an educational
ABET's Technological Education Initiative: Focus on Faculty
Maryanne Weiss, Peggie Weeks, Mark Pagano
ABET, Inc./ABET, Inc./Purdue University
Abstract
The Accreditation Board for Engineering and Technology, with support from the National Science Foundation's Advanced Technological Education program, is conducting twelve hands-on regional faculty workshops for engineering technology educators. The purpose of the Technological Education Initiative (TEI) workshops is to enhance faculty's knowledge of emerging technologies, explore ways in which these technologies may be incorporated into their programs, and provide faculty with experience in developing effective assessment strategies
traditional routine. Epistecybernetics, a term aptly coined by Hensley et al. (1) and simply defined as the governance and stewardship of knowledge, provides the framework for meeting the requirement of systematized documentation of program(s) activities. The CUES (Consortium for Upgrading Educational Standards) protocol, one of the core components of the epistecybernetic system, when successfully implemented, can be a useful assessment tool for program(s) activities and enhanced student learning.
1. Introduction
Institutions, programs, accreditation agencies such as ABET and NCATE, and governing bodies such as the KBR (Kansas Board of Regents) and others rely extensively on the themes of enhanced student learning, successful course delivery methods, continuous
Session 3560
A Work in Progress – Updating and Maintaining an Effective Assessment Program under ABET Engineering Criteria 2000
J. Shawn Addington, Robert A. Johnson, and David L. Livingston
Department of Electrical and Computer Engineering
Virginia Military Institute
This paper serves as a follow-up to previously published works1,2 regarding the assessment program developed and utilized by the Electrical and Computer Engineering Department at the Virginia Military Institute. In particular, the paper will: 1) outline the departmental assessment strategy, including the
With the Engaged Teaching Hub, Minju has designed TA training materials for oral exams and has conducted quantitative analysis on the value of oral exams as an early diagnostic tool (Kim et al., ASEE 2022). Minju is interested in designing assessments that can capture and motivate students' deep conceptual learning, such as oral exams and the use of visual representations (e.g., diagrams and manual gestures).
Marko V. Lubarda, University of California, San Diego
Marko V. Lubarda is an Assistant Teaching Professor in the Department of Mechanical and Aerospace Engineering at the University of California, San Diego. He teaches mechanics, materials science, design, computational analysis, and engineering mathematics courses
[2-3]. Educators and pedagogical researchers have studied many aspects of experiential learning, including assessment, delivery, and impact of such a model of student learning. Carlson and Sullivan [4] have presented a conceptual framework and case study of an integrated teaching and learning laboratory at a major university in the US. The laboratory served as a multi-disciplinary learning platform for hands-on engineering in first-year engineering design and other courses like computer simulation and capstone design studios for undergraduate students. The laboratory also supported outreach activities. This specific experiential learning model has been reported to be
Statistical Analyses: The primary data set for these analyses is the group of participants who answered both the pre- and post-surveys (n = 100), which allows a single-sample, paired comparison. Sample normality for all continuous variables was assessed through kurtosis and skewness (range of ±1.5), and the data were normally distributed. Survey item consistency was measured using Cronbach's Alpha (α), with α ≥ .60 considered reliable. Correlations were computed as Pearson Product Moment Correlations (r) with two-tailed significance. The level of statistical significance (t and F statistics) for all analyses is p < 0.05, and effect size (Cohen's d) is reported using conventional norms (.10 small, .30 medium, .50 large). All analyses
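For concreteness, the following is a minimal sketch of the analysis pipeline described above, written in Python with hypothetical file and column names (the original study's data layout is not shown); the normality screen, Cronbach's alpha, the paired comparison with Cohen's d, and the Pearson correlation each take only a few lines.

```python
# Sketch of the described analysis plan; file and column names are hypothetical.
import pandas as pd
from scipy import stats

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a scale whose items are the DataFrame columns."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def cohens_d_paired(pre: pd.Series, post: pd.Series) -> float:
    """Cohen's d for a paired design (one common convention: mean diff / SD of diffs)."""
    diff = post - pre
    return diff.mean() / diff.std(ddof=1)

# One row per participant who completed both surveys (n = 100).
df = pd.read_csv("matched_pre_post_responses.csv")   # hypothetical file
pre, post = df["pre_score"], df["post_score"]        # hypothetical columns

# Normality screen: skewness and kurtosis expected within +/- 1.5.
diff = post - pre
print("skew:", stats.skew(diff), "kurtosis:", stats.kurtosis(diff))

# Internal consistency of the post-survey items (alpha >= .60 treated as reliable).
item_cols = [c for c in df.columns if c.startswith("post_item_")]
print("alpha:", cronbach_alpha(df[item_cols]))

# Paired comparison and effect size.
t, p = stats.ttest_rel(post, pre)
print(f"t = {t:.2f}, p = {p:.3f}, d = {cohens_d_paired(pre, post):.2f}")

# Pearson product-moment correlation with two-tailed significance.
r, p_r = stats.pearsonr(pre, post)
print(f"r = {r:.2f}, p = {p_r:.3f}")
```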
psychological factors responsible for gender disparity in construction management programs at the undergraduate level. A survey instrument was developed to assess student personality characteristics linked to performance in, and completion of, educational programs. The survey response data from 178 construction management students at two universities were analyzed. Based on the survey results, strategies are discussed for increasing female representation in construction management programs.
Literature Review
Understanding and explaining human conduct has been an objective of the behavioral sciences for almost 100 years5. While many constructs exist that inform human behavior, the initial step in this study was identification of the most pertinent constructs related
© American Society for Engineering Education, 2014
Standards-Based Grading in a Fluid Mechanics Course
Abstract
Standards-based grading is a formal assessment mechanism that tests for student achievement of specified learning objectives, or standards. Standards-based grading has been gaining in popularity in K-12 education, and has also seen increased use in higher education. With increased pressure from ABET to measure achievement of student outcomes, standards-based grading provides a method to do that within the traditional course setting without having to generate a separate set of data outside the normal course grading. This paper describes how standards-based grading was
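As a minimal illustration of the bookkeeping such a scheme implies (a hypothetical sketch, not the paper's actual implementation), each scored attempt can be recorded against a named standard, with a student's current level of mastery taken as the most recent score for that standard, so the gradebook doubles as outcome-level evidence.

```python
# Hypothetical sketch: aggregate scored attempts by standard, keeping the
# most recent score as the student's current mastery level.
from collections import defaultdict
from typing import NamedTuple

class Attempt(NamedTuple):
    student: str
    standard: str   # e.g. "Apply Bernoulli's equation" (illustrative standard name)
    week: int
    score: float    # assumed 0-4 mastery scale

def current_mastery(attempts: list[Attempt]) -> dict[str, dict[str, float]]:
    """Return {student: {standard: latest score}}."""
    latest: dict[str, dict[str, tuple[int, float]]] = defaultdict(dict)
    for a in attempts:
        prev = latest[a.student].get(a.standard)
        if prev is None or a.week > prev[0]:
            latest[a.student][a.standard] = (a.week, a.score)
    return {s: {std: score for std, (_, score) in stds.items()}
            for s, stds in latest.items()}

attempts = [
    Attempt("alice", "Bernoulli", week=3, score=2.0),
    Attempt("alice", "Bernoulli", week=7, score=3.5),  # reassessment replaces the earlier score
    Attempt("alice", "Pipe flow", week=9, score=3.0),
]
print(current_mastery(attempts))
# {'alice': {'Bernoulli': 3.5, 'Pipe flow': 3.0}}
```

Keeping the most recent score (rather than an average) reflects the usual standards-based-grading premise that later evidence of mastery supersedes earlier attempts; a course could equally keep the maximum, which is a design choice the instructor makes per standard.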