How Accurate Is Students’ Self Assessment Of Computer Skills?

Conference

2008 Annual Conference & Exposition

Location

Pittsburgh, Pennsylvania

Publication Date

June 22, 2008

Start Date

June 22, 2008

End Date

June 25, 2008

ISSN

2153-5965

Conference Session

Assessment

Tagged Division

Educational Research and Methods

Page Count

16

Page Numbers

13.671.1 - 13.671.16

Permanent URL

https://peer.asee.org/4324

Paper Authors

Michael Collura University of New Haven

Samuel Daniels University of New Haven

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

How Accurate is Students’ Self-Assessment of Computer Skills?

Abstract Self-evaluation by students is commonly used as a key element in program and course assessment plans. Such instruments are intended to provide crucial feedback for program improvement and thus play a significant role in closing our assessment loop. For many program outcomes, self-assessment by current students and graduates augments other, more objective measures. For some outcomes, however, there is no practical means of obtaining objective assessment and we must rely on self-assessment alone. This heavy reliance raises the question: how accurate is student self-assessment? This paper presents data from a second-semester engineering course in which students develop proficiency using computer tools to solve typical engineering problems. Students’ self-assessments in several areas are compared with the instructor’s assessment of those same students.

Some work reported in the literature addresses the accuracy of student self-assessment in specific academic areas. In the medical field, several studies examine medical students’ self-assessment of specific clinical skills, and other published comparisons contrast students’ expected grades with their actual results. Little was found, however, that is relevant to engineering students, and in particular to their self-assessment of professional skills.

The work reported here relates to the assessment of ABET program outcome k: “an ability to use the techniques, skills and modern engineering tools necessary for engineering practice.” Methods of Engineering Analysis is a course taken by all engineering majors during their second semester at the University of New Haven. In this course, students are introduced to a range of engineering problems and a variety of numerical methods for solving them. The current platform is a spreadsheet with Visual Basic for Applications programming. On the first day of class, students complete a 30-question survey in which they rate their expertise in three broad categories: basic spreadsheet usage, advanced spreadsheet usage, and programming. The same survey is completed at the end of the course, providing pre- and post-course views from the students’ perspective. Quizzes given throughout the course and the final exam were structured so that instructors could assess student performance in these same areas with composite measures. Data are presented comparing the instructor’s assessment of performance with students’ self-assessments at the individual level.
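The individual-level comparison described above amounts to pairing each student’s self-rating with the instructor’s composite score and measuring agreement. A minimal sketch of one such agreement measure (Pearson correlation) is shown below; the function name, rating scales, and sample data are illustrative assumptions, not taken from the paper.

```python
def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # Covariance numerator and the two standard-deviation factors
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Illustrative (invented) data: post-course self-ratings on a 1-5
# scale vs. instructor composite scores (0-100) for the same students.
self_rating = [4, 3, 5, 2, 4, 3]
instructor = [78, 65, 90, 50, 70, 72]

r = pearson_r(self_rating, instructor)
print(f"agreement (Pearson r) = {r:.2f}")
```

A high positive `r` would indicate that students who rate themselves more skilled also tend to score higher on the instructor’s composite measures; the paper’s actual analysis may use a different statistic.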

Collura, M., & Daniels, S. (2008, June), How Accurate Is Students’ Self Assessment Of Computer Skills? Paper presented at 2008 Annual Conference & Exposition, Pittsburgh, Pennsylvania. https://peer.asee.org/4324

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2008 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015