Portland, Oregon
June 23, 2024
June 26, 2024
Engineering Design Graphics Division (EDGD) Technical Session 2
10.18260/1-2--47195
https://peer.asee.org/47195
Dr. Yip-Hoi is currently a professor of Manufacturing Engineering in the Department of Engineering and Design at Western Washington University. Previously, he served on the faculties of the University of the West Indies - St. Augustine, the University of Michigan - Ann Arbor, and the University of British Columbia. His research interests lie in the areas of CAD, geometric and solid modeling, machining and CNC, engineering design and ethics, and machine design.
Jack Wilson is a Ph.D. student in the Edward P. Fitts Department of Industrial and Systems Engineering at NC State, focusing on Advanced Manufacturing and Systems Analytics and Optimization.
Automating the assessment of CAD models has been the focus of significant research effort. One focus has been its application to grading in support of training engineering students in 3D parametric modeling skills and practices. However, significant challenges remain in producing broadly accepted tools of practice, owing to the complexities involved in creating a CAD model and in identifying formal evaluation criteria that robustly capture whether skills have been acquired. Of interest is whether tools can be developed that provide more robust formative assessment of a modeling activity. This contrasts with summative assessment approaches, which largely benefit the assessor by reducing grading time through evaluation of the result, but which can miss important tendencies in a student designer that might need to be corrected. For formative assessment to be feasible, better metrics are needed that reflect how a modeling activity is progressing, not just with respect to realizing a final shape goal, but also in capturing design intent and meeting best practices. In this paper some of the challenges of evaluating 3D CAD modeling efficacy are explored. These challenges increase with the level of complexity desired in the result, which can range from simply creating a final 3D shape, to capturing design intent, and finally to skill at incorporating best practices. A case study of a capstone modeling project given to students in an introductory CAD class is used to illustrate these challenges. This example also highlights the difficulties encountered in assessing more open-ended modeling experiences, where students are given less guidance and have many more options available for satisfying the modeling requirements. A simple case study is also presented to demonstrate the viability of collecting a more complete set of assessment metrics during a modeling activity. A discussion of how access to a richer set of metrics might lead to a better understanding of modeling tendencies is presented.
Yip-Hoi, D. M., & Wilson, J. P. (2024, June). Directions in Automating CAD Modeling Assessment. Paper presented at 2024 ASEE Annual Conference & Exposition, Portland, Oregon. 10.18260/1-2--47195
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2024 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015