
Using the Results of Certification Exam Data: A More Deliberate Approach to Improving Student Learning



2020 ASEE Virtual Annual Conference Content Access


Virtual Online

Publication Date

June 22, 2020

Start Date

June 22, 2020

End Date

June 26, 2020

Conference Session

Engineering Design Graphics Division Technical Session 5

Tagged Division

Engineering Design Graphics


Paper Authors


Robert A. Chin East Carolina University


Robert A. “Bob” Chin is a faculty member in the Department of Technology Systems, College of Engineering and Technology, East Carolina University. He is a past chair of the Engineering Design Graphics Division, and as of the 2020 annual conference he will be serving as the outgoing past chair of the Division. In 2015, he completed his second term as the director of publications for the Engineering Design Graphics Division and as the Engineering Design Graphics Journal editor. Chin has also served as the Engineering Design Graphics Division’s annual and mid-year conference program chair, and he has served as a review board member for several journals, including the EDGJ. He has been a program chair for the Southeastern Section and has served as its Engineering Design Graphics Division’s vice chair and chair and as its Instructional Unit’s secretary, vice chair, and chair. His ongoing involvement with ASEE has focused on annual conference paper presentation themes associated with the Engineering Design Graphics, Engineering Libraries, Engineering Technology, New Engineering Educators, and Two-Year College Divisions and their educational and instructional agendas.



Ranjeet Agarwala East Carolina University


Dr. Ranjeet Agarwala serves as an Assistant Professor in the Department of Technology Systems at East Carolina University. He holds a PhD in Mechanical Engineering from North Carolina State University. Since 2001 he has taught courses in engineering design, thermal and fluid systems, digital manufacturing, 3D printing, GD&T, electro-mechanical systems, statics, and dynamics. His research interests are in the areas of sustainability, such as renewable energy, and green manufacturing, such as additive manufacturing.




An accredited undergraduate architectural design technology program adopted an American Design Drafting Association certification exam to help assess student learning in architectural graphics, a key component of architectural design technology. The exam is administered in a junior-level architectural design technology course. All those enrolled in the course must pass the exam in order to earn credit for the course, and all who have earned credit for the course have passed the exam. Almost all who do not pass the exam the first time retake it before the end of the semester in which the course was offered and the exam was administered. There has been the rare exception in which a student takes an incomplete for the course, then retakes the exam and clears the incomplete the following semester. Passing the exam, however, is not the goal of this course requirement. The purpose of administering the exam is to ascertain knowledge and skills. So, rather than simply examining the data and trying to improve students’ knowledge and skills in the areas that appeared to need more attention based on individual student and class performance, a more deliberate approach was undertaken. Very recently, the program asked the exam-administering entity whether other certification exam data were available that the program could use to make comparisons. Because the data were available, the program undertook an initiative on behalf of its students to compare the students’ performance, and thus the performance of the program, to that of a comparator population: all those who have sat for the exam. Exam data were collected, compiled, and analyzed with the aid of descriptive statistics.
Because the assumptions were met and the data were available, the annual program exam session averages and the annual historical exam averages for the comparator population were used to calculate annual z-scores for the 20 competencies that comprise the major parts of the exam. The z-scores were used to determine whether there was a significant difference between the annual session averages and the historical averages of the comparator population. While all the students in the program pass the exam, their performance in and among select competencies was not consistent. Their knowledge and skills also varied when compared with those exhibited by the comparator population. The data will now be offered to the program’s advisory committee for its consideration and recommendations. The data will also be examined by the faculty who teach prerequisite courses. The focus of these reviews will be on student learning and the knowledge and skills students possess upon completion of the program.
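The z-score comparison described above can be sketched in a few lines. The paper does not give the exact formula, so this sketch assumes the familiar one-sample z-test of an annual session mean against the comparator population mean; the competency score, population mean, standard deviation, and class size below are hypothetical, not actual exam data.

```python
import math

def z_score(session_mean, pop_mean, pop_std, n):
    """z-score of a session mean of n examinees against the
    comparator population mean and standard deviation."""
    return (session_mean - pop_mean) / (pop_std / math.sqrt(n))

# Hypothetical numbers for one competency and one annual session:
z = z_score(session_mean=78.5, pop_mean=74.0, pop_std=9.0, n=12)
significant = abs(z) > 1.96  # two-tailed test at alpha = 0.05
```

In practice this calculation would be repeated for each of the 20 competencies in each annual session, flagging the competencies where program performance differs significantly from the comparator population.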

Chin, R. A., & Agarwala, R. (2020, June). Using the Results of Certification Exam Data: A More Deliberate Approach to Improving Student Learning. Paper presented at 2020 ASEE Virtual Annual Conference Content Access, Virtual Online. 10.18260/1-2--35475

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2020 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.