Displaying all 7 results
Collection: AEE Journal
Authors: Katherine Fu; Robert Kirkman; Bumsoo Lee
curricula of engineering programs, we developed and delivered a free-standing, semester-length course in design ethics, in which students worked in groups on a design project for a client, with frequent, structured opportunities to reflect on the ethical values at stake in their design decisions. We also conducted a pilot test of a novel assessment method using Latent Semantic Analysis (LSA) (Foltz 1998, Landauer 1998) to detect changes in the cognitive schemas students bring to bear on ethical questions. BACKGROUND: Ethics and Design. A course in design ethics is conditioned on the long-recognized parallel between ethical problem-solving and the design process (Whitbeck 2011, Bero and Kuhlman 2011, Feister et al
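To make the LSA step concrete, here is a minimal sketch of how pre- and post-course written responses could be compared in a reduced semantic space. It assumes scikit-learn is available; the reference corpus, the sample responses, and the tiny number of latent dimensions are placeholders for illustration, not the authors' instrument.

```python
# Minimal LSA sketch (illustrative only): compare a student's pre- and
# post-course responses in a latent semantic space built from a reference
# corpus. The corpus and responses below are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

reference_corpus = [
    "Designers must weigh safety, cost, and client needs.",
    "Ethical problem solving resembles iterative design.",
    "Stakeholder values shape acceptable design trade-offs.",
    # in practice, a much larger corpus is needed for stable latent factors
]

pre_response = "I would pick whatever design has the lowest cost."
post_response = "I would weigh stakeholder safety against cost before choosing a design."

# Build a term-document matrix and reduce it to k latent dimensions.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(reference_corpus)
svd = TruncatedSVD(n_components=2, random_state=0)  # real LSA uses far more dimensions
svd.fit(X)

# Project both responses into the latent space and compare them.
pre_vec = svd.transform(vectorizer.transform([pre_response]))
post_vec = svd.transform(vectorizer.transform([post_response]))
print("cosine similarity (pre vs. post):",
      cosine_similarity(pre_vec, post_vec)[0, 0])
```

A low similarity between pre- and post-course responses would, in a setup like this, suggest a shift in the language (and, by inference, the schemas) a student brings to the question.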
Collection: AEE Journal
Authors: Nicola Brown
selected was a website which could be developed during the course as a series of assessments. A range of online assessment tools has been reported in the literature, including websites, blogs, wikis, and e-Portfolios (Bishop et al., 2014; Carroll et al., 2006; Chao, 2007; Chen et al., 2005; Judd et al., 2010; Miyazoe and Anderson, 2010; Reijenga and Roeling, 2009). These tools are generally used to facilitate collaboration between students, enable self-reflection, and in some cases enhance communication. However, one study reported that when student input was tracked, contributions tended to be individuals entering their own information, with very little editing by team members (Judd et al., 2010). Chao (2007) reported very positive feedback
Collection: AEE Journal
Authors: Gail Goldberg
: Documentation and analysis of prior solution attempts
•  Element C: Presentation and justification of solution design requirements
Component II: Generating and Defending an Original Solution
•  Element D: Design concept generation, analysis, and selection
•  Element E: Application of STEM principles and practices
•  Element F: Consideration of design viability
Component III: Constructing and Testing a Prototype
•  Element G: Construction of a testable prototype
•  Element H: Prototype testing and data collection plan
•  Element I: Testing, data collection and analysis
Component IV: Evaluation, Reflection, and Recommendations
•  Element J: Documentation of external evaluation
•  Element K: Reflection on
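Purely as an illustration, a component/element hierarchy of this kind can be encoded as a simple structure for tallying element-level scores into component subtotals. The scoring scale, example scores, and helper function below are hypothetical and are not part of the published rubric.

```python
# Hypothetical encoding of the component/element hierarchy above, used to
# roll element scores up into component subtotals. Scale and data are made up.
RUBRIC = {
    "Component II: Generating and Defending an Original Solution":
        ["Element D", "Element E", "Element F"],
    "Component III: Constructing and Testing a Prototype":
        ["Element G", "Element H", "Element I"],
    "Component IV: Evaluation, Reflection, and Recommendations":
        ["Element J", "Element K"],
}

def component_subtotals(element_scores: dict) -> dict:
    """Sum per-element scores (e.g., 0-3 each) into per-component subtotals."""
    return {
        component: sum(element_scores.get(e, 0) for e in elements)
        for component, elements in RUBRIC.items()
    }

# Example: one student's hypothetical element scores.
scores = {"Element D": 3, "Element E": 2, "Element F": 2,
          "Element G": 3, "Element H": 1, "Element I": 2,
          "Element J": 2, "Element K": 3}
print(component_subtotals(scores))
```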
Collection: AEE Journal
Authors: Diana Bairaktarova; Michele Eodice
outlet for showing what they know about the challenge. It can serve as a baseline or pre-assessment.
3. Multiple Perspectives - provide insights on the challenge. These statements or comments from experts do not provide a solution but should help the students see the many dimensions to the challenge.
4. Research and Revise - engages students in learning activities linked to the challenge. These can be readings, homework problems, simulations, or other activities.
5. Test Your Mettle - application of what students have learned and evaluation of what they need to know more about. This step helps students reflect on and synthesize what they know.
6. Go Public - provides students an outlet to demonstrate what they know at the
Collection: AEE Journal
Authors: Ryan Solnosky P.E.; Joshua Fairchild
through word selection to more closely align with the project. In adapting Team Cohesion from Carless and De Paola's (2000) Group Environment Questionnaire, similar procedures were followed. Carless and De Paola's (2000) original survey items were rooted in research from Wech et al. (1998), Campion et al. (1993), Anastasi and Urbina (1997), and Cohen and Bailey (1997). This allowed the items to be phrased as statements about an organizational, multi-disciplinary team environment that respondents agree or disagree with, rather than as questions. Examples of adaptation can be seen in Table 1. The rationale for this modification was to make the survey easier for students to complete, but more importantly to tie the terminology of the original questions to our study. Table 1
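For illustration, adapted agree/disagree statement items of this kind are commonly scored by averaging responses into a scale score and checking internal consistency. The sketch below assumes a 1-5 Likert response format with made-up data and a generic Cronbach's alpha helper; it is not the authors' analysis.

```python
# Illustrative scoring of adapted Likert statement items (1 = strongly
# disagree ... 5 = strongly agree) into a team-cohesion scale score, with a
# Cronbach's alpha consistency check. Data and helper are hypothetical.
import numpy as np

# rows = respondents, columns = adapted cohesion items
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
])

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency estimate for a set of scale items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

scale_scores = responses.mean(axis=1)  # one cohesion score per student
print("scale scores:", scale_scores)
print("Cronbach's alpha:", round(cronbach_alpha(responses), 3))
```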
Collection: AEE Journal
Authors: Claire Dancz; Kevin Ketchman; Rebekah Burke P.E.; Troy Hottle; Kristen Parrish; Melissa Bilec; Amy Landis
intellectual behavior within the student homework assignments (“knowledge,” “comprehension,” “application,” “analysis,” “synthesis,” or “evaluation”) (Anderson, Krathwohl, and Bloom 2001; Bloom et al. 1956). McCormick et al. (2014) utilized Sustainability Links to evaluate the linkages between the three pillars of sustainability, including “concepts” (societal, economic, environmental), “crosslinks” (societal-economic, environmental-economic, societal-environmental), and “interdependency” (societal-economic-environmental) (McCormick et al. 2014b). McCormick et al. (2014) did not include a “no evidence” response option; the authors added this option. Table 1 reflects these three approaches to assess Dimensions of Sustainability, Bloom’s Taxonomy, and
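As a small illustration of how such coding schemes are tallied, the sketch below counts one code per homework assignment across the Bloom's categories plus the added "no evidence" option; the sample codes and counting approach are hypothetical, not the authors' dataset.

```python
# Illustrative tally of assignment codes into a frequency table. The category
# labels come from the excerpt above; the sample codes are made up.
from collections import Counter

BLOOM_LEVELS = ["knowledge", "comprehension", "application",
                "analysis", "synthesis", "evaluation", "no evidence"]

# One code per homework assignment (hypothetical data).
assignment_codes = ["application", "analysis", "knowledge", "no evidence",
                    "application", "evaluation", "comprehension", "analysis"]

counts = Counter(assignment_codes)
for level in BLOOM_LEVELS:
    print(f"{level:>13}: {counts.get(level, 0)}")
```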
Collection: AEE Journal
Authors: Cheryl Bodnar; Matthew Markovetz; Renee Clark; Zachari Swiecki; Golnaz Irgens; Naomi Chesler; David Shaffer
-parametric statistical analyses in this work, and their results were in general agreement. Another limitation to this work relates to the pre-constructed questions and responses built into the focus group design. The space from which students could draw questions relevant to their design was constrained in a manner that may not be reflective of what they might ask in a true industrial setting. This could be remedied by an open question format; however, this is difficult to regulate within an epistemic game environment. It would be possible to further determine student valuation of the design metrics through qualitative analysis of the notebook logs students maintained during these activities. This work is currently underway and should serve as useful feedback
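As a hedged illustration of checking agreement between a parametric test and a non-parametric counterpart on the same data, the sketch below runs both with SciPy on made-up samples; it is not the study's actual analysis or data.

```python
# Minimal sketch: run a parametric test (Welch's t-test) and a non-parametric
# counterpart (Mann-Whitney U) on the same two groups and compare p-values.
# The sample data are made up for illustration.
from scipy import stats

group_a = [3.1, 2.8, 3.5, 3.0, 2.9, 3.3]
group_b = [2.4, 2.6, 2.2, 2.7, 2.5, 2.3]

t_stat, t_p = stats.ttest_ind(group_a, group_b, equal_var=False)
u_stat, u_p = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

print(f"Welch's t-test: p = {t_p:.4f}")
print(f"Mann-Whitney U: p = {u_p:.4f}")
# Broadly similar p-values suggest the conclusion is robust to the choice of test.
```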