June 23, 2013
June 26, 2013
Engineering Management, Engineering Economy, and Industrial Engineering
23.572.1 - 23.572.18
Experimental Assessment of Higher-Level Data Analysis Skills

We report the results of a pilot experiment, part of a larger effort to assess and improve students' competence in data analysis. The premise of the research is that, while most engineering statistics courses emphasize technique, students also need to develop judgment about how to apply techniques, and further need to develop "analytic imagination" so that they can see and exploit the potential in datasets. Ultimately, these experiments should lead to methods and materials that can transform classical technique-oriented statistics courses into courses that will better prepare engineers to be effective data analysts.

This first experiment tested the abilities of twenty-seven student volunteers enrolled in two statistics classes to analyze an open-ended problem. One of the classes was a required introductory class; the other was an elective follow-on class. Students received a dataset and variable definitions and were given 30 minutes to "Examine these data and report what you have found that would be interesting to the company." The Web Visitors exercise was chosen because of its relative simplicity, open-endedness, and compatibility with the software's data size limitation. Several variables were recorded to describe the students, including gender, number of previous statistics courses, and performance on the final exam. Experimental outputs included the students' written responses and electronic records of all their menu choices when using Minitab software to analyze the data. The written responses were subjected to several types of quantitative textual analysis. Software use was summarized in terms of the volume and variety of techniques used, with variety expressed as the normalized entropy across eight categories of techniques.
Subjective assessments of the utility of the students' responses were elicited from four independent raters.

Analysis of the pilot experiment concluded that there were major differences among individual students' behaviors and the quality of their responses. There were also marginally significant differences between students in the two classes in the relationship between volume and variety of steps. However, volume and variety themselves were not predictable from class or from any of the students' personal attributes, and inter-rater reliability in judgments of the quality of students' responses was not high. Nevertheless, valuable lessons were learned regarding methods for ensuring data quality, quantifying student behavior, and assessing the quality of performance. These lessons are being applied to additional experiments describing and contrasting performance on open-ended problems with larger numbers of students in classes featuring greater differences in statistical background.
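The abstract does not give the formula behind the "variety" measure, but normalized entropy across K categories is conventionally Shannon entropy divided by its maximum, log(K). A minimal sketch under that assumption (the function name and the example counts are illustrative, not from the paper):

```python
import math

def normalized_entropy(category_counts, num_categories=8):
    """Shannon entropy of technique usage, scaled to [0, 1] by log(K).

    category_counts: counts of software steps falling in each of the
    K technique categories (here K = 8, as in the study).
    """
    total = sum(category_counts)
    if total == 0 or num_categories < 2:
        return 0.0
    probs = [c / total for c in category_counts if c > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(num_categories)

# Steps spread evenly over all eight categories -> maximal variety:
print(normalized_entropy([5] * 8))                     # 1.0
# All steps in a single category -> zero variety:
print(normalized_entropy([40, 0, 0, 0, 0, 0, 0, 0]))   # 0.0
```

With this scaling, variety is comparable across students regardless of how many total steps (volume) each one took.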
Layton, J. A., & Willemain, T. R. (2013, June). Experimental Assessment of Higher-Level Data Analysis Skills. Paper presented at the 2013 ASEE Annual Conference & Exposition, Atlanta, Georgia. 10.18260/1-2--19586
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2013 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.