A Successful Student Initiated Assessment Method For An Environmental Engineering Graduate Program

Conference: 2006 Annual Conference & Exposition

Location: Chicago, Illinois

Publication Date: June 18, 2006

Start Date: June 18, 2006

End Date: June 21, 2006

ISSN: 2153-5965

Conference Session: Graduate Student Experiences

Tagged Division: Graduate Studies

Page Count: 23

Page Numbers: 11.130.1 - 11.130.23

DOI: 10.18260/1-2--533

Permanent URL: https://peer.asee.org/533

Download Count: 509

Paper Authors

Scott Rogers, Georgia Institute of Technology

Mr. Rogers is a Ph.D. candidate in environmental engineering at the Georgia Institute of Technology in Atlanta, Georgia. He served as chair of the student-survey subcommittee of the Georgia Tech Association of Environmental Engineers and Scientists Dialogue for Academic Excellence Committee (DAEC) from August 2004 to June 2005 and has served as chair of DAEC since June 2005.

Jeremy Noonan, Purdue University

Mr. Noonan is a Ph.D. student in engineering education at Purdue University in West Lafayette, Indiana. At the time of this study, he was in the M.S. EnvE degree program in environmental engineering at the Georgia Institute of Technology in Atlanta, Georgia. He served as chair of DAEC from August 2004 to May 2005.

Jaemeen Baek, Georgia Institute of Technology

Ms. Baek is a Ph.D. candidate in environmental engineering at the Georgia Institute of Technology in Atlanta, Georgia. She served on DAEC from the formation of the committee in August 2004 to September 2005.

Sangil Lee, Georgia Institute of Technology

Mr. Lee is a Ph.D. candidate in environmental engineering at the Georgia Institute of Technology in Atlanta, Georgia. He has served on DAEC since the formation of the committee in August 2004.

Ulas Tezel, Georgia Institute of Technology

Mr. Tezel is a Ph.D. candidate in environmental engineering at the Georgia Institute of Technology in Atlanta, Georgia. He has served on DAEC since the formation of the committee in August 2004.

Grant Michalski, Georgia Institute of Technology

Mr. Michalski is in the M.S. EnvE degree program in environmental engineering at the Georgia Institute of Technology in Atlanta, Georgia, with graduation expected in May 2006. He has served on DAEC since the formation of the committee in August 2004 and served as secretary of DAEC from August 2004 to June 2005.

Chia-Hung Hou, Georgia Institute of Technology

Mr. Hou is a Ph.D. candidate in environmental engineering at the Georgia Institute of Technology in Atlanta, Georgia. He served on DAEC from the formation of the committee in August 2004 to June 2005.

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

A Successful Student-Initiated Assessment Method for an Environmental Engineering Graduate Program

Obstacles to assessing academic conditions include generating enough interest in assessment efforts to achieve high response rates, transcending communication barriers, preserving confidentiality, minimizing biases from numerous sources, and conducting meaningful statistical analyses. A graduate environmental engineering program needed to overcome these obstacles to create a valid assessment tool. Previous program surveys had not adequately addressed specific student concerns: their questions and answer formats were poorly designed, their distribution relied on students to retrieve and return the forms themselves, and their data analysis consisted only of computing mean values and compiling comments. As a result, those surveys suffered from low response rates, biases, and demographic underrepresentation.

A graduate-student committee designed a survey with the aforementioned problems in mind. "The improvement of research quality" was the overall survey theme, and four subtopics (research resources, research preparation, research views and attitudes, and research-group support) were created to generate specific question ideas from the student population at large. Questions were included in the survey based on their importance, the actionable nature of the knowledge obtained, and other criteria. Background and control questions were included for categorizing respondents. The sensitive nature of some questions was addressed to reduce biases, and the format of the survey was tailored to make respondents comfortable and interested in participating. Question quality was examined through a pilot study and reviews by professionals. Answer formats were mainly closed-ended, with most open-ended questions providing supplemental information. Hand distribution and hand collection were intended to make the survey tangible, appreciable, and accessible for respondents. Univariate analysis produced meaningful findings about individual variables, while bivariate and multivariate analyses determined correlations among multiple variables. A sensitivity analysis was also conducted to uncover potential biases in the answering behavior of students who were both involved in survey design and responded to the survey.
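The univariate and bivariate steps described above can be sketched in a few lines of Python. This is an illustrative example only, not the authors' actual analysis: the variable names and Likert-type (1-5) responses are hypothetical, and Spearman rank correlation is used because it suits ordinal survey data better than Pearson correlation.

```python
# Illustrative sketch of survey analysis; data and column names are
# hypothetical, not taken from the paper.
import pandas as pd
from scipy import stats

# Each row is one respondent; answers are on a 1-5 Likert scale.
responses = pd.DataFrame({
    "research_preparation": [2, 3, 1, 4, 2, 3, 2, 5, 3, 2],
    "group_support":        [3, 4, 2, 5, 2, 3, 3, 5, 4, 2],
})

# Univariate analysis: per-question distributions (mean, spread, quartiles).
summary = responses.describe()
print(summary)

# Bivariate analysis: Spearman rank correlation between two ordinal variables.
rho, p_value = stats.spearmanr(responses["research_preparation"],
                               responses["group_support"])
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
```

A real analysis would repeat the bivariate step across many question pairs and adjust for multiple comparisons; the sketch shows only the core computation.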

We submit that our survey effort was successful overall, based on its high response rate, accurate demographic representation, positive student feedback, reduced biases, and the significance of its findings. Fifty students (more than 75% of the population) responded to the survey. Salient findings indicate deficiencies in communication and statistics education, deficiencies in overall research preparation for first-year and master's students, an overall failure of a laboratory course to provide research-skill education, and a lack of guidance from research-group members for some students.

Our improved survey has led students and faculty members in the program to appreciate internal assessment and to encourage the student committee to continue its efforts. The committee is beginning to address the problems uncovered in this study and will continue to use this method in further assessment. We believe the method is applicable to other engineering programs facing these common obstacles to building a sound assessment tool.

Rogers, S., Noonan, J., Baek, J., Lee, S., Tezel, U., Michalski, G., & Hou, C. (2006, June). A Successful Student Initiated Assessment Method For An Environmental Engineering Graduate Program. Paper presented at the 2006 Annual Conference & Exposition, Chicago, Illinois. https://doi.org/10.18260/1-2--533

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2006 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015