six sections participated in a common three-hour weekly lab and completed the same design projects and writing assignments. The Let Me Learn® process was implemented in two of the six sections by the writing instructor; the other four sections did not use LML. At the end of the semester, students were given a survey that asked them to rate their agreement with the following four statements on a scale where 1 = strongly agree and 4 = strongly disagree:
1. My team worked together to DEFINE its project goal(s).
2. My team worked together to REACH its project goal(s).
3. My team RECOGNIZED my skills, knowledge, and abilities.
4. My team
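Neither the raw ratings nor the statistical procedure appear in this excerpt, so the following is only a minimal sketch of how one survey item might be compared between the LML and non-LML sections. The ratings, group sizes, and the choice of a Mann-Whitney U test for the ordinal 4-point scale are all assumptions for illustration, not the authors' actual analysis.

```python
# Hypothetical sketch: compare one survey item between LML and non-LML sections.
# The ratings below are invented; 1 = strongly agree, 4 = strongly disagree.
import numpy as np
from scipy import stats

lml_ratings = np.array([1, 2, 1, 2, 2, 1, 3, 2])      # students in the two LML sections
non_lml_ratings = np.array([2, 3, 2, 2, 3, 1, 3, 3])  # students in the four comparison sections

# Report medians because the scale is ordinal.
print("LML median:", np.median(lml_ratings))
print("non-LML median:", np.median(non_lml_ratings))

# Mann-Whitney U test: a common nonparametric choice for 4-point Likert items.
u_stat, p_value = stats.mannwhitneyu(lml_ratings, non_lml_ratings, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
```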
Adaptive Expertise and Transfer of Design Process Knowledge, ASME Journal of Mechanical Design, 129(7), July 2007, pp. 730-734.
21. Pandy, M.G., Petrosino, A.J., Austin, B. and Barr, R. (2004). Assessing Adaptive Expertise in Undergraduate Biomechanics, Journal of Engineering Education, 93, 211-222.
22. Schwartz, D. L., Lin, X., Brophy, S., & Bransford, J. D. (1999). Toward the development of flexibly adaptive instructional designs. In Charles M. Reigeluth (Ed.), Instructional Design Theories and Models. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
23. Walker, Joan M.T., Cordray, David S., King, Paul H., and Brophy, Sean P. (2006). Design scenarios as an assessment of adaptive expertise, International Journal of Engineering
observational data that educational researchers routinely encounter and can be used in a variety of settings to gain deeper insight into the factors affecting educational outcomes.

Acknowledgement
This material is based upon work supported by the National Science Foundation under award 0757020 (DUE). Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation (NSF).

References
1. National Science Board, Science and Engineering Indicators 2002; NSB-02-1; National Science Foundation: Arlington, VA, April 2002.
2. Bernold, L. E.; Spurlin, J. E.; Anson, C. M., Understanding our students: A longitudinal
.” Proceedings of the 2008 American Society for Engineering Education Annual Conference.
7. Jacoby, Barbara. 1996. Service-Learning in Higher Education: Concepts and Practices. San Francisco, CA: Jossey-Bass.
8. King, Patricia M., & Kitchener, Karen S. 1994. Developing Reflective Judgement. San Francisco, CA: Jossey-Bass Inc.
9. Lima, Marybeth, and Oakes, William. 2006. Service-Learning: Engineering in Your Community. Ann Arbor, MI: Great Lakes Press, Inc.
10. Lynch, C.L. and Wolcott, S. K. 2001. “Helping your students develop critical thinking skills.” IDEA Paper #37
1   11.29**   .19   .91
S within-group error   48   (3976)
Note: Values enclosed in parentheses represent mean square errors. S = subjects. **p < .01
The simple effect for the condition group at the low level of familiarity with the chemistry field proved not to be significant, F(1, 24) = .01, p = .94.
[Figure: Estimated marginal means of performance for the low- and high-familiarity-with-chemistry groups.]
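As a companion to the simple-effect result reported above, the sketch below shows one way such a test can be computed in Python: subset the sample to the low-familiarity students and run a one-way F test between the two condition groups. The scores are invented placeholders, and this version estimates the error term from the subset alone rather than using the pooled within-group mean square from the full ANOVA, so it approximates the form of the reported analysis rather than reproducing it.

```python
# Hypothetical sketch of a simple-effect test at the low-familiarity level.
# Scores are invented placeholders, not the study's data.
import numpy as np
from scipy import stats

# Performance scores for the two condition groups among low-familiarity students.
control_low = np.array([62.0, 55.0, 70.0, 58.0, 64.0, 61.0])
treatment_low = np.array([63.0, 57.0, 68.0, 60.0, 62.0, 59.0])

# One-way ANOVA on the subset; with two groups this is equivalent to a t test (F = t**2).
f_stat, p_value = stats.f_oneway(control_low, treatment_low)
print(f"Simple effect at low familiarity: F = {f_stat:.2f}, p = {p_value:.2f}")
```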
Educational Research Association, American Psychological Association, and the National Council on Measurement in Education, Standards for educational and psychological testing. 1999, Washington, DC.
5. Carmines, E.G. and R.A. Zeller, Reliability and validity assessment. 1979, Thousand Oaks, CA: SAGE Publications.
6. Messick, S., Validity, in Educational Measurement, R.L. Linn, Editor. 1989, The American Council on Education and the National Council on Measurement in Education: Washington, D.C. p. 13-103.
7. Wilson, M., Constructing measures: An item response modeling approach. 2005, Mahwah, NJ: Lawrence Erlbaum Associates.
8. Baker, D., S. Krause, and S.Y. Purzer, Developing an instrument to measure
International Planning/Advisory Committee for the 2009 Research in Engineering Education Symposium, and is guest co-editor for a special issue of the International Journal of Engineering Education on applications of engineering education research.

Trevor Harding, California Polytechnic State University
Dr. Trevor S. Harding is Associate Professor of Materials Engineering at California Polytechnic State University–San Luis Obispo, where he teaches courses in service learning, introductory materials engineering, biomedical materials design, and tribology. His research interests include both ethical development in engineering students and in vivo degradation of
expectations. The Telegraph. 3/6/14. http://www.telegraph.co.uk/education/educationopinion/10872594/Meeting-students-high-expectations.html Accessed 7/2/17.
7. Paton, G. & Carter, C. (2014). Universities lowering entry grades to fill places this year. The Telegraph. 14/8/14. http://www.telegraph.co.uk/education/educationnews/11035385/Universities-lowering-entry-grades-to-fill-places-this-year.html Accessed 10/2/17.
8. Coughlan, S. (2016). University lowers entrance grades for disadvantage. BBC. 15/12/16. http://www.bbc.co.uk/news/education-38301844 Accessed 10/2/17.
9. PAPER AUTHORS - ANONYMISED
10. PAPER AUTHORS - ANONYMISED
11. PTES (2016) Postgraduate Taught Experience
valence or affect [8]. The commonality of affective assessments underscores the importance of emotion in the learning process, especially in the context of game-based learning, where play is an element of motivation. They write that the body of research on game-based learning in engineering "nearly unanimously agree[s] that students enjoy game-based learning," but that there is a significant lack of studies demonstrating the impact on learning outcomes. This is attributed either to a lack of validated measures (e.g., student self-assessment on individually developed surveys or questionnaires) or to small sample sizes and/or missing statistical analysis [8].

While games may inspire thoughts of play, the two are overlapping but distinct topics in the context of education
. The bias reduction achieved through purposeful sampling could approach the objectivity obtained by probabilistic sampling; whether it does so is a subject for future studies.

References
[1] L. A. Palinkas, S. M. Horwitz, C. A. Green, J. P. Wisdom, N. Duan, and K. Hoagwood, "Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research," Administration and Policy in Mental Health and Mental Health Services Research, vol. 42, pp. 533-544, 2015.
[2] C. L. Livneh, "Characteristics of lifelong learners in the human service professions," Adult Education Quarterly, vol. 38, pp. 149-159, 1988.
[3] M. Q. Patton, "Two Decades of Developments in Qualitative Inquiry: A Personal, Experiential Perspective
. Netemeyer, W. O. Bearden, & S. Sharma, Scaling procedures: Issues and applications. Thousand Oaks, CA: Sage Publications, 2003.
[3] J. C. Nunnally and I. H. Bernstein, Psychometric theory. New York: McGraw-Hill, 1994.
[4] G. A. Churchill, A paradigm for developing better measures of marketing constructs. Journal of Marketing Research, 16(1), 64-73, 1979.
[5] L. A. Clark & D. Watson, Constructing validity: Basic issues in objective scale development. Psychological Assessment, 7(3), 309-19, 1995.
[6] P. C. Kendall, J. N. Butcher, and G. N. Holmbeck, Handbook of research methods in clinical psychology. New York: Wiley, 1999.
[7] P. E. Spector, Summated rating scale construction: An introduction. Thousand Oaks, CA: Sage Publications
Conference on Computational Linguistics, 1, 127–132. http://doi.org/10.3115/991813.991833
11. Jarvis, S. (2002). Short texts, best-fitting curves and new measures of lexical diversity. Language Testing, 19(1), 57–84. http://doi.org/10.1191/0265532202lt220oa
12. Lu, X. (2012). The Relationship of Lexical Richness to the Quality of ESL Learners' Oral Narratives. Modern Language Journal, 96(2), 190–208. http://doi.org/10.1111/j.1540-4781.2011.01232_1.x
13. Miller, G. A. (1995). WordNet: a lexical database for English. Communications of the ACM, 38(11), 39–41. http://doi.org/10.1145/219717.219748
14. Montfort, D., Brown, S., & Pollock, D. (2009). An Investigation of Students' Conceptual Understanding in Related Sophomore to
pre-post anxiety treatment to improve academic performance for engineering students.” Procedia-Social and Behavioral Sciences, 15, pp. 3826-3830.
7. Ferguson, C.W., Yanik, P.M., Chang, A. and Kaul, S. (2015). “Scholarship Program Initiative via Recruitment, Innovation, and Transformation.” Proc. 122nd ASEE Annual Conference and Exposition, Seattle, WA.
8. Kaul, S., Chang, A., Yanik, P.M. and Ferguson, C.W. (2015). “Development of a Mentorship Program in Engineering and Technology.” Proc. 122nd ASEE Annual Conference and Exposition, Seattle, WA.
Education. In D. Grasso & M. B. Burkins (Eds.), Holistic engineering education: Beyond technology (pp. 17-35). New York: Springer.
3. Council on Competitiveness. (2005). Innovate America: Thriving in a world of challenge and change. Washington, DC: Council on Competitiveness.
4. Jamieson, L. H., & Lohmann, J. R. (2012). Innovation with impact: Creating a culture for scholarly and systematic innovation in engineering education. Washington, DC, USA: ASEE.
5. Borrego, M., Froyd, J. E., & Hall, T. S. (2010). Diffusion of engineering education innovations: A survey of awareness and adoption rates in U.S. engineering departments. Journal of Engineering Education, 99(3), 185-207.
6. Charyton, C
Institute of Technology." In Elements of Quality Online Education: Practice and Direction, edited by J. Bourne and J. C. Moore, 261-78. Needham, MA: Sloan Consortium, 2002.
7. Collis, B., "Course Redesign for Blended Learning: Modern Optics for Technical Professionals," International Journal of Continuing Engineering Education and Lifelong Learning, 13 (2003): 22-38.
8. Kaleta, R., Skibba, K. and Joosten, T., "Discovering, Designing, and Delivering Hybrid Courses." In Blended Learning: Research Perspectives, edited by A. G. Picciano and C. D. Dziuban, 111-43. Needham, MA: The Sloan Consortium, 2007.
9. Peercy, P. S. and Cramer, S. M., "Redefining Quality in Engineering Education Through Hybrid Instruction," Journal of Engineering
university for six courses which were part of two tracks: a common introductory sequence and a sequence for honors students.3 Professors and teaching assistants of these courses classified their respective section(s) of “Introduction to Engineering” and generally had agreement in most areas within each of the eight main outcomes; however, discrepancies were discovered in the topics covered under each outcome across sections.
In the self-study, the results were organized by main outcome, where a three-color coding system was used to show the level of agreement between instructors.3 An outcome marked as green denoted that the outcome was covered in each section of one or more courses. An outcome
. Hopkins, K.D. and A.R. Gullickson, Response Rates in Survey Research: A Meta-Analysis of the Effects of Monetary Gratuities. The Journal of Experimental Education, 1992. 61(1): p. 52-62.
8. Scollon, C.N., C. Kim-Prieto, and E. Diener, Experience Sampling: Promises and Pitfalls, Strengths and Weaknesses, in Assessing Well-Being, E. Diener, Editor. 2009, Springer Netherlands. p. 157-180.
9. Csikszentmihalyi, M., S. Abuhamdeh, and J. Nakamura, Flow, in Handbook of competence and motivation, A.J. Elliot and C.S. Dweck, Editors. 2005, Guilford Press: New York.
10. Schunk, D.H., P.R. Pintrich, and J.L. Meece, Motivation in Education: Theory, Research, and Applications. 3 ed. 2008, Upper Saddle River, NJ: Pearson.
11
the performance data to model student thinking (e.g., through factor analysis, item response theory, or diagnostic classification modeling).

Applying the Evidentiary Validity Framework to Concept Inventories
Rigorous development of a validity argument and pursuit of validity evidence in support of that argument are particularly important for assessments such as concept inventories that are administered across multiple institutions and, in some cases, are used to evaluate educational interventions.9,17 To investigate the validity properties of an inventory, one must first identify what claim(s) the developers or users are making about their concept inventory. Claims can be about student learning gains, student misunderstandings, and overall mastery of
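To make the modeling step mentioned above concrete, here is a minimal Python sketch of one of the named techniques (factor analysis) applied to scored concept-inventory responses as a source of internal-structure validity evidence. The response matrix is random placeholder data, the two-factor choice is arbitrary, and applying ordinary factor analysis directly to 0/1 item scores is a simplification (item response theory or tetrachoric-correlation approaches are often preferred); it illustrates the shape of such an analysis, not the developers' procedure.

```python
# Hypothetical sketch: exploratory factor analysis of scored (0/1) inventory items.
# Data are random placeholders, not real student responses.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_students, n_items = 200, 20
responses = rng.integers(0, 2, size=(n_students, n_items)).astype(float)

fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(responses)

# Loadings suggest which items cluster together; compare the clusters against
# the concepts each item was written to target.
loadings = fa.components_.T  # shape: (n_items, n_factors)
for item, row in enumerate(loadings, start=1):
    print(f"item {item:2d}: " + "  ".join(f"{v:+.2f}" for v in row))
```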
, M. (2000). Concept analysis of self-efficacy. Graduate Research in Nursing, available at: http://graduateresearch.com/Kear.htm
[14] Ausubel, D. (1960). The use of advance organizers in the learning and retention of meaningful verbal material. Journal of Educational Psychology, 51(5), 267-272.
[15] Davis, D. C., Beyerlein, S. W., & Davis, I. T. (2005, June). Development and use of an engineer profile. In Proceedings of the 2005 American Society for Engineering Education Annual Conference & Exposition.
[16] Kaewpet, C., & Sukamolson, S. (2011). A sociolinguistic approach to oral and written communication for
much explicit, named attention has reflection received in engineering education scholarship, and how do we interpret these results? To answer this question, we conducted a systematic literature review of all conference publications from the American Society for Engineering Education (ASEE). Our exploration sought to answer this question by assessing the number of papers that explicitly mention reflection.

Background
Reflection can be described as “...an intentional and dialectical thinking process where an individual revisits features of an experience with which he/she is aware and uses one or more lenses in order to assign meaning(s) to the experience that can guide future action (and thus future experience).”4 This intentional process can be used as
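The core of the review described above is a keyword count over the ASEE conference corpus. The excerpt does not show the search tooling used, so the Python below is only a sketch of the idea: it assumes the full texts have already been extracted to plain-text files in a hypothetical asee_papers/ directory and counts how many contain the word "reflection".

```python
# Hypothetical sketch: count papers that explicitly mention "reflection".
# Assumes full texts were extracted beforehand to .txt files in asee_papers/.
import re
from pathlib import Path

corpus_dir = Path("asee_papers")  # placeholder location, not from the paper
pattern = re.compile(r"\breflection\b", re.IGNORECASE)

total = 0
mentioning = 0
for path in sorted(corpus_dir.glob("*.txt")):
    text = path.read_text(encoding="utf-8", errors="ignore")
    total += 1
    if pattern.search(text):
        mentioning += 1

print(f"{mentioning} of {total} papers explicitly mention 'reflection'")
```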
. Mislevy, R. J., & Braun, H. I. (2003). Intuitive test theory. In Annual Dinner Meeting of the Princeton Association for Computing Machinery (ACM) and Institute of Electrical and Electronics Engineers (IEEE) Computer Society Chapters, Kingston, NJ, May (Vol. 22).
2. Embretson, S. E. (1996b). The new rules of measurement. Psychological Assessment, 8(4), 341-349.
3. Mislevy, R.J., & Bock, R. (1982). Adaptive EAP estimation of ability in a microcomputer environment. Applied Psychological Measurement, 6, 431-444.
4. Baker, F.B., & Kim, S. (2004). Item Response Theory: Parameter Estimation Techniques, 2nd Edition. New York: Marcel Dekker, Inc.
5. Spiegelhalter, D.J., Thomas, A., Best, N.G., Lunn, D. (2004). WinBUGS Version 2.0 Users Manual
reasoning in science. In: Carruthers P, Stich S, Siegal M, eds. The Cognitive Basis of Science. Port Chester, NY: Cambridge University Press; 2002:133-153.
12. Nersessian NJ. Model-based reasoning in conceptual change. In: Magnani L, Nersessian NJ, Thagard P, eds. Model-Based Reasoning in Scientific Discovery. New York: Kluwer Academic/Plenum Press; 1999:5-22.
13. Lesh R, Doerr HM. Foundations of a models and modeling perspective on mathematics teaching, learning, and problem solving. In: Lesh R, Doerr HM, eds. Beyond Constructivism: Models and Modeling Perspectives on Mathematics Problem Solving, Learning, and Teaching. Mahwah, NJ: Lawrence Erlbaum Associates; 2003:3-33.
14. Kozma R, Russell J. Modelling students
. Oklahoma City, OK. October 23-26. (2013)
6 Rogalski, J., & Samurçay, R. 1990. Acquisition of programming knowledge and skills. In J.M. Hoc, T.R.G. Green, R. Samurçay, & D.J. Gillmore (Eds.), Psychology of programming (pp. 157–174). London: Academic Press.
7 du Boulay, B. 1989. Some difficulties of learning to program. In E. Soloway & J.C. Spohrer (Eds.), Studying the Novice Programmer (pp. 283–299). Hillsdale, NJ: Lawrence Erlbaum.
8 Sweller, J., Ayres, P., and Kalyuga, S. 2011. Cognitive Load Theory, Explorations in the Learning Sciences, Instructional Systems and Performance Technologies, doi: 10.1007/978-1-4419-8126-4_5, Springer Science+Business Media, LLC.
9 Maloney, J., Peppler, K., Kafai, Y. B., Resnick, M., & Rusk, N
concepts in engineering science and helping engineering faculty conduct rigorous research in engineering education.

Dr. Robin Adams, Purdue University, West Lafayette
Robin S. Adams is an Associate Professor in the School of Engineering Education at Purdue University and holds a PhD in Education, an MS in Materials Science and Engineering, and a BS in Mechanical Engineering. She researches cross-disciplinary ways of thinking, acting and being; design learning; and engineering education transformation.

Voicing the indescribable: Using photo elicitation as a method to uncover belonging and
Paper ID #30699
Student Perceptions of and Learning in Makerspaces Embedded in their Undergraduate Engineering Preparation Programs
Dr. Louis S. Nadelson, University of Central Arkansas
Louis S. Nadelson has a BS from Colorado State University, a BA from the Evergreen State College, an MEd from Western Washington University, and a PhD in educational psychology from UNLV. His scholarly interests include all areas of STEM teaching and learning, inservice and preservice teacher professional development, program evaluation, multidisciplinary research, and conceptual change. Nadelson uses his over 20 years of high school and
results from the statistical analyses suggest that coupling peer discussion with PRS use can enhance students' ability to actively construct knowledge in class.

References
1. National Research Council. (1996). National science education standards. Washington, DC: National Academy Press.
2. Wulf, W. A., & Fisher, G. M. C. (2002). A makeover for engineering education. Issues in Science and Technology. Online, http://www.nap.edu/issues/18.3/p_wulf.html.
3. Ebert-May, D., Brewer, C., Allred, S. (1997). Innovation in Large Lectures: Teaching for Active Learning. BioScience, 47(9), pp. 601-607.
4. Kennedy, G. E., & Cutts, Q. I. (2005). The association between students' use of an electronic voting system and their
gratefully acknowledge the contributions of the following people: William Stephen Anderson, Mary Anderson-Rowland, Angela Beauchamp, James Borgford-Parnell, David Bugg, Wen-Yu Chao, Rosa Cintron, Tyler S. Combrink, Jeanette Davidson, Tiffany Davis-Blackwood, Randall W. Evans, Bach Do, M. Jayne Fleener, Francey Freeman, Van Ha, Betty J. Harris, Rebecca L. Heeney, Quintin Hughes, Elizabeth Kvach, Stephen M. Lancaster, Tony Lee, Ben Lopez, Anna Wong Lowe, Gabriel Matney, Lindsey S. McClure, Reinheld E. Meissler, Sandra Kay Moore-Furneaux, Ruth Moaning, Teri J. Murphy, Brittany Shanel Norwood, Mayra Olivares, Sedelta Oosahwee, Teri Reed Rhoads, Tracy Revis, Anne Reynolds, Lauren Rieken, Paul Rocha, Johanna Rojas, Kimberly Rutland, Lisa Schmidt, Larry Schuman
-related majors in college women and men: A path analysis. Journal of Counseling Psychology, 32, 47-56.
5. Hackett, G. & Betz, N. (1989). An exploration of the mathematics self-efficacy/mathematics performance correspondence. Journal for Research in Mathematics Education, 20, 261-273.
6. Lapan, R., Boggs, K., & Morrill, W. (1989). Self-efficacy as a mediator of investigative and realistic general occupational themes on the Strong-Campbell Interest Inventory. Journal of Counseling Psychology, 36, 176-182.
7. Lent, R., Lopez, F., Bieschke, K. & Socall, D. (1991). Mathematics self-efficacy: Sources and relation to science-based career choice. Journal of Counseling Psychology, 38, 424-431.
8. Lent, R., Brown, S. & Larkin, K. (1987). Comparison
, T., Jaspers, M., & Chapman, M. (2007). Integrating web-delivered problem-based learning scenarios to the curriculum. Active Learning in Higher Education.
4. Bordelon, T. D. & Phillips, I. (2006). Service learning: What students have to say. Active Learning in Higher Education, 7(1), 143-153.
5. Guertin, L. A., Zappe, S. E., & Kim, H. (2007). Just-in-Time Teaching (JiTT) exercises to engage students in an introductory-level dinosaur course. Journal of Science Education and Technology, 6, 507-514.
6. Cimbala, J. M., Pauley, L. L., Zappe, S. E., & Hsieh, M. (June, 2006). Experiential learning in fluid flow class. Paper presented at the annual meeting of the American Society of Engineering
lead to the overwhelming success of the project came about due to the desire for the presentations to be memorable to the entire class. It has been found that when students are asked to present a creative and fun project, they are highly enthusiastic and often exceed the expectations of the assignment.

References
1. Jensen, D., Wood, J., Dennis, S., Wood, K., Campbell, M., Design Implementation and Assessment of a Suite of Multimedia and Hands-on Active Learning Enhancements for Machine Design, Proceedings of ASME International Mechanical Engineering Congress and Exposition, Orlando, FL, 2005.
2. Criteria for Accrediting Engineering Technology Programs, ABET Board of Directors, Technology Accreditation Commission, 2007.
3. Umble, E.J