Location: San Antonio, Texas
Publication Date: June 10, 2012
Conference Start Date: June 10, 2012
Conference End Date: June 13, 2012
ISSN: 2153-5965
Division: Mechanical Engineering
Page Count: 47
Page Numbers: 25.755.1 - 25.755.47
DOI: 10.18260/1-2--21512
Permanent URL: https://peer.asee.org/21512
Download Count: 834
John K. Estell is a professor of computer engineering and computer science at Ohio Northern University. He received his doctorate from the University of Illinois, Urbana-Champaign. His areas of research include simplifying the outcomes assessment process, first-year engineering instruction, and the pedagogical aspects of writing computer games. Estell is an ABET Program Evaluator, a Senior Member of IEEE, and a member of ACM, ASEE, Tau Beta Pi, Eta Kappa Nu, and Upsilon Pi Epsilon.
John-David Yoder received all of his degrees (B.S., M.S., and Ph.D.) in mechanical engineering from the University of Notre Dame. He is Professor and Chair of the Mechanical Engineering Department at Ohio Northern University, Ada, Ohio. He previously served as Proposal Engineer and Proposal Engineering Supervisor at Grob System, Inc., and as Software Engineer at Shaum Manufacturing, Inc. He has held a number of leadership and advisory positions in various entrepreneurial ventures. He is currently a KEEN (Kern Entrepreneurial Education Network) Fellow, and has served as a Faculty Fellow at the Jet Propulsion Laboratory, Pasadena, Calif., and as an Invited Professor at INRIA Rhône-Alpes, Montbonnot, France. His research interests include computer vision, mobile robotics, intelligent vehicles, entrepreneurship, and education.
Briana Morrison is an Assistant Professor at Southern Polytechnic State University. She was recently the undergraduate coordinator for both the Computer Science and Software Engineering programs. Her research interests include computer science education, gender issues in computer science, and data structures.
Fong Mak, P.E., received his B.S.E.E. degree from West Virginia University in 1983, and his M.S.E.E. and Ph.D. degrees in electrical engineering from the University of Illinois in 1986 and 1990, respectively. He is currently Chair of the Electrical and Computer Engineering Department at Gannon University. He is also the Program Director for the professional-track Gannon/GE Transportation System Embedded Software graduate program.
Improving upon Best Practices: FCAR 2.0

The Faculty Course Assessment Report (FCAR) presents a streamlined methodology that allows instructors to write assessment reports in a concise, standardized format conducive for use in both course and student (program) outcomes assessment. The FCAR is a short (typically 1-2 pages) form, completed by the instructor who taught the class. The FCAR is structured as a sequence of standardized reporting categories that include what course modifications were made, the outcomes assessment information obtained, reflection on the part of the instructor, and suggestions for curricular improvement. Through this approach, the instructor is guided through a systematic review of the course, with the additional benefit of clearly and succinctly documenting critical portions of the "closing the loop" process. At the center of this approach is the concept of performance vectors, a 4-tuple vector that categorizes aggregate student performance (in terms of the number of students who performed at excellent, adequate, minimal, or unsatisfactory levels) on a directly measured assessment artifact. For each performance criterion to be reported on, an entry is placed into the FCAR documenting the criterion, the outcome being supported, the assignment(s) used for acquiring the data, the assessment tool used for evaluating the data, and the resultant performance vector. Additionally, as this assessment information is processed by the instructor who is closest to the data, any observed difficulties or extenuating circumstances affecting performance can be readily documented as part of the FCAR. For the department chair and/or assessment coordinator, FCARs provide a valuable resource, as all assessment information regarding a particular course is included in one place in a common format, providing ease of use. This information can then be extracted and summarized in a way that allows all courses that cover a given Student Outcome to be easily evaluated.

For the past several years, presentations featuring the FCAR methodology have been given at the ABET Symposium. Based partly on this dissemination stream, the use of this instrument has spread far beyond its origins. As other institutions have adopted the instrument, they have made modifications as well, and in turn reported at a variety of venues on how their version of the FCAR has played a highly useful role in streamlining their continuous quality improvement processes, yielding both qualitative and quantitative information, facilitating greater consistency in the reporting and processing of that information, and keeping faculty actively engaged in an on-going assessment process. In brief, this paper will present a survey of the literature regarding the use of FCARs, provide an in-depth description of the current implementation of this instrument in at least 5 departments at 3 universities, and present recommendations for a model "FCAR 2.0" document. The paper will include example copies of FCARs from various institutions, details on their use, and faculty responses on the positive and negative aspects of implementing this approach to assessment.
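The abstract's description of performance vectors and per-criterion FCAR entries amounts to a small data model: each entry records a criterion, the outcome it supports, the assignments and assessment tool used, and a 4-tuple of student counts, which a chair or assessment coordinator can then roll up by outcome across courses. The sketch below is one possible illustration of that model, not an implementation from the paper; the names (FCAREntry, summarize_by_outcome) and the example data are hypothetical.

```python
from dataclasses import dataclass

# Illustrative sketch of the "performance vector" idea from the abstract:
# a 4-tuple counting students at the excellent, adequate, minimal, and
# unsatisfactory levels on a directly measured assessment artifact.

@dataclass
class FCAREntry:
    criterion: str                      # performance criterion being reported on
    outcome: str                        # student/program outcome the criterion supports
    assignments: list[str]              # assignment(s) used for acquiring the data
    assessment_tool: str                # tool used for evaluating the data (e.g., a rubric)
    vector: tuple[int, int, int, int]   # (excellent, adequate, minimal, unsatisfactory)

def summarize_by_outcome(entries: list[FCAREntry]) -> dict[str, tuple[int, ...]]:
    """Aggregate performance vectors across courses for each outcome,
    mirroring the chair/coordinator roll-up described in the abstract."""
    totals: dict[str, list[int]] = {}
    for entry in entries:
        acc = totals.setdefault(entry.outcome, [0, 0, 0, 0])
        for i, count in enumerate(entry.vector):
            acc[i] += count
    return {outcome: tuple(acc) for outcome, acc in totals.items()}

# Hypothetical example: two course entries supporting the same outcome.
entries = [
    FCAREntry("Design a system to meet stated needs", "Outcome c",
              ["Final project"], "Project rubric", (12, 8, 3, 1)),
    FCAREntry("Apply engineering analysis", "Outcome c",
              ["Exam 2, Problem 4"], "Scoring key", (9, 10, 4, 2)),
]
print(summarize_by_outcome(entries))  # {'Outcome c': (21, 18, 7, 3)}
```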
Estell, J. K., & Yoder, J. S., & Morrison, B. B., & Mak, F. K. (2012, June), Improving Upon Best Practices: FCAR 2.0 Paper presented at 2012 ASEE Annual Conference & Exposition, San Antonio, Texas. 10.18260/1-2--21512
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2012 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015