introducing GenAI into student assessments. This study is standalone and preliminary; it is not planned to be part of a broader research program.

Background

Over the past few years, text-based chatbots based on Generative Artificial Intelligence (GenAI) large language models (LLMs) have soared in both public availability and usage [1], [2]. The most popular of these, ChatGPT, has grown at rates never before seen in the internet age [3]. This growth has driven advances in usability and functionality, while also raising questions of legality, ethics, and morality [2]. The consistent viewpoint is that GenAI is here to stay and humans need to adapt to this new reality. Educators have shifted from awe of GenAI's capabilities to fear of academic integrity
this specific course. Note that in this paper, we will sometimes use the term "AI" to refer specifically to more advanced generative chatbot interactions (e.g., ChatGPT), whereas "LLM" denotes more direct interactions with the underlying large language model, such as through an OpenAI Assistant or OpenWebUI model.

Concept inventories (CIs) have been widely used as diagnostic tools to assess student comprehension of key topics [1] and evaluate the effectiveness of instructional interventions [2]. These standardized tests consist of carefully designed multiple-choice questions that target specific misconceptions, and they have been administered across high schools, colleges, and universities [3]. One issue with multiple-choice-question-based CIs is the
the latest OpenAI and Anthropic models: ChatGPT 4o [9] and Claude 3.5 Sonnet [10]. These two models were selected based on previous benchmark results for reasoning and mathematics [11], [12]. In some of our experiments, the AI LLMs are used 'as is' or 'off the shelf' - no training and no instructions. In other experiments, the AI LLMs are trained, i.e., instructions are fed into the AI program along with one or more chapters of the course textbook. In evaluating the ability of LLM chatbots to act like a very good TA, we sought to investigate how the AI TA performs for different amounts of instruction/training prior to asking questions. In our tests, all chatbots are configured with a temperature of 0.3 and a maximum token size of 2048.
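The excerpt does not include the configuration code itself; the following is a minimal sketch, assuming the OpenAI Python SDK, of how the reported settings (temperature 0.3, 2048-token response limit) and the "trained" condition (instructions plus a textbook chapter supplied as context) might be expressed. The model identifier, prompt text, file name, and example question are illustrative assumptions, not the paper's actual setup; the Anthropic SDK exposes analogous parameters for Claude 3.5 Sonnet.

```python
# Minimal sketch of the chatbot configuration described above (temperature 0.3,
# 2048-token response limit), using the OpenAI Python SDK. Names and prompts are
# illustrative assumptions only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# "Trained" condition: instructions plus one or more textbook chapters supplied as context.
with open("chapter_3.txt") as f:          # hypothetical textbook chapter file
    chapter_text = f.read()

messages = [
    {"role": "system",
     "content": "You are a teaching assistant for this course. "
                "Base your answers on the following textbook chapter:\n\n" + chapter_text},
    {"role": "user", "content": "Explain the difference between heat and work."},  # example query
]

response = client.chat.completions.create(
    model="gpt-4o",          # assumed identifier for the ChatGPT 4o model
    messages=messages,
    temperature=0.3,         # matches the configuration reported above
    max_tokens=2048,         # maximum token size reported above
)
print(response.choices[0].message.content)
```

The "off the shelf" condition corresponds to the same call with no system message and no textbook context.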
Meslem, Bergische Universität Wuppertal

©American Society for Engineering Education, 2025

WIP: AI in Online Laboratory Teaching - A Systematic Literature Review

Introduction

The presence of ChatGPT has recently, and in a short period of time, become increasingly prevalent in day-to-day life. Education, being both a part and a reflection of day-to-day life, has also been affected by this change. The rapid spread of this technology in this context has, however, come with challenges. These include the lack of an adequate understanding of the technology, of how to use it, and of how to integrate it efficiently into daily life (Gill & Kaur, 2023). Many students
[4]. In 2024, more research is available on AI in education and industry, including as a virtual assistant using AI as a prompting tool [14], and as a development bot to enhance software design [18]. With the popularity of ChatGPT increasing from one million users in the first week after its launch in 2022 to more than 200 million weekly users in late 2024 [16], the use of ChatGPT in college engineering courses is expected to increase significantly soon.

Using AI in the engineering classroom has been seen to offer both advantages and disadvantages. Students reported increased confidence in decision-making, critical thinking, and problem-solving skills when using AI tools, which may help develop professional skills [14]; however, the
it within five days, and within two months, it had 100 million active daily users. Since that time, many generative AI tools have been developed and applied to fields ranging from art to medicine, and from poetry to finance [1]. Religious groups have even developed AI tools to provide church-related information and advice, such as Cathy, the Episcopal Church's "virtual guide." Cathy (Churchy Answers That Help You) was trained on the Episcopal Church's website, the Book of Common Prayer and Forward Movement publications, and the ChatGPT knowledge base. Suggestions for how this bot is to be used include inquiries about official positions of the Episcopal Church, suggestions for liturgy, and general questions about the church and its practices [4]. With
focus of the literature. Within the first months of its launch, it was found that ChatGPT could pass law school exams, though it only managed a C+ [20]. This is just one example of the deluge of papers describing how large language models can perform reasonably well on traditional examinations (e.g., [21], [22], [23], [24], [25]). These models are trained on large and diverse sets of writing and employ statistical procedures to predict a response to a statement or question, which can lead to surprising coherence and the appearance of analytical reasoning.

In STEM fields, where communication relies less on short written responses and more often on a combination of diagrams and equations, generative AI tools have seen uneven success in problem-solving. For
estimating methods [9]. Additionally, Ghasemi and Dai [10] investigated the use of GPT-4 in construction estimating, mainly for cost analysis and bid pricing in a bridge rehabilitation project. The study found that GPT-4 "holds the potential for construction estimating with reasonable accuracy." Despite showing potential, the authors noted that issues related to consistency and reliability could limit GPT-4's use in complex or novel estimating scenarios [10].

AI in Engineering Education

Artificial intelligence (AI) is increasingly integrated into engineering education, reshaping how students learn and educators teach. ChatGPT and other generative AI tools are also gaining attention in this context. Qadir [11] discussed the potential of generative AI
University in the City of New York

Sakul Ratanalert is a Senior Lecturer in Discipline in the Department of Chemical Engineering at Columbia University. He received his BS in Chemical and Biomolecular Engineering from Cornell University, and his MS in Chemical Engineering Practice and his PhD in Chemical Engineering from MIT. His current research interests include developing engaging learning activities and building students' intuition and conceptual understanding.

©American Society for Engineering Education, 2025

Development of an MEB Novice Chatbot to Improve Chemical Engineering Critical Thinking

Abstract

The rise of ChatGPT and other generative AI tools has led
discussions in higher education, including its potential uses in and beyond the classroom. Initially, the focus was primarily on preventing students from using generative AI tools, but attention is now shifting toward integrating these tools into teaching and learning [1]. Many educators are exploring ways to incorporate generative AI into instruction [2].

Students are often assumed to be tech-savvy [3]. With the widespread use of tools like ChatGPT, they may also be perceived as competent users of generative AI. However, effectively using AI for learning requires more than basic digital literacy, and this gap can affect both the learning experience and its benefits. Therefore, studying students' interactions with AI is important, as the findings will shape how
Teaching and Educational Research in Engineering

Abstract

The use of generative Artificial Intelligence (genAI) in teaching and education has received attention and grown rapidly in university engineering programs since OpenAI released ChatGPT in November 2022. In this paper, the authors explore the use of genAI in teaching and educational research in engineering disciplines and examine potential benefits and challenges during the transition to genAI in engineering education. This study A) Analyzes how educators and learners understand and identify the usage of genAI and ChatGPT in engineering education; B) Explores the potential benefits, challenges, and limitations of using these technologies; and C) Identifies educators' perceptions of using
their writing in sustained or long-term writing projects [13, 14]. As a result of this module, the majority of students were optimistic about using AI in future writing assignments. However, students who use ChatGPT to write tend to run into common pitfalls such as ambiguous writing, bias reinforcement, and "hallucinations" [15]. This shift reflects the need to provide clear guidance on appropriate AI usage in educational settings. This work highlights the growing recognition that fostering AI literacy is a crucial educational practice in modern classrooms.

To investigate how students respond to AI literacy efforts and how they may change their use of genAI in these situations, we introduce structured usage of AI in one lecture to increase AI literacy
with sports. These findings suggest the need for alternative analogies that better resonate with diverse student backgrounds.

For the solar charging station analogy, 72% of students matched all terms correctly, although some confusion persisted. For example, 10% mistook 'DC Source' for the interface controller, and 15% confused 'computer controller' with the image of a cell phone. These findings suggest areas for refining analogies, particularly in distinguishing components with similar terminology.

A survey conducted at the end of the semester confirmed that students preferred real-world analogies over AI tools like ChatGPT, highlighting their value in establishing a strong conceptual foundation and boosting confidence. Table 1 presents key survey results
literature review encompasses academic databases, focusing on search terms such as "GenAI," "ChatGPT," and "Generative AI in engineering education." Relevant papers are analyzed to identify common themes, which are then synthesized to provide a thematic overview of GenAI's role in engineering education. Initial findings suggest themes around ethical considerations, pedagogical shifts, and the potential of GenAI to enhance student learning. Ethical concerns, such as algorithmic bias, privacy, and academic integrity, are highlighted, alongside the need for continuous upskilling of both students and educators.

This study aims to offer a comprehensive understanding of GenAI's implications in engineering education, serving as a valuable resource for educators
Learning: Insights from Liberal Education Courses in Lebanon

Reine Azzi
Lebanese American University

A Framework for Hybrid Human-AI Learning: Insights from Liberal Education Courses in Lebanon

Abstract

The global debate over Generative Artificial Intelligence (GenAI) has continued in academic institutions, resulting in discussions on academic integrity and educational standards in a world where 'ChatGPT' use continues to permeate educational, professional, and social contexts. While some academic institutions initially called for banning GenAI tools, many have emphasized the need to introduce these tools within controlled
Engineering Education

Randall D. Manteufel
Mechanical Aerospace and Industrial Engineering Department
University of Texas at San Antonio

Abstract

Since the introduction of ChatGPT in November 2022, Artificial Intelligence (AI) has been poised to significantly impact engineering education by enabling real-time problem-solving assistance, personalized learning experiences, and automated grading systems. The potential uses of AI are extensive, particularly in generating detailed responses to specific queries based on its training data. Ongoing investments and rapid advancements in AI are anticipated to drive breakthroughs
variety of complex technical topics, students face challenges in understanding and applying theoretical knowledge. AI technologies such as AI-assisted tutoring systems, performance prediction models, and generative AI tools are effective in enhancing student interactions with the engineering curriculum, improving student understanding and engagement [1], [2]. By enabling real-time feedback, personalized learning experiences, and interactive problem-solving environments, AI tools are creating new opportunities for engineering education [3], [4].

The advancement of AI technology, particularly generative AI systems such as ChatGPT, fosters critical thinking and collaboration among students. In a study by Abril, students used AI tools such as ChatGPT to obtain and
. The Chronicle of Higher Education has observed:

"One year after its release, ChatGPT has pushed higher education into a liminal place. Colleges are still hammering out large-scale plans and policies governing how generative AI will be dealt with in operations, research, and academic programming. But professors have been forced more immediately to adapt their classrooms to its presence. Those adaptations vary significantly, depending on whether they see the technology as a tool that can aid learning or as a threat that inhibits it" [11].

Faculty perspectives and responses are particularly critical in professional programs such as engineering, medicine, and teacher preparation, where the rapid integration
tracks learners' progress, i.e., it adjusts future responses based on conversation history and accounts for the user's existing knowledge. The Adviser also incorporates user-level personalization, dynamically adjusting language and the depth of information to align with different user levels. Additionally, Retrieval-Augmented Generation (RAG) [8] integrates knowledge retrieval from manufacturing documents with a large language model's generation capabilities (ChatGPT in this case) to provide contextually relevant responses. Manufacturing documents are divided into smaller chunks of 500 words. Each chunk is transformed into a numerical representation (embedding), capturing semantic information for similarity-based retrieval. Figure 1 shows the
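The excerpt reports only the design of this pipeline (500-word chunks, embeddings, similarity-based retrieval feeding ChatGPT); the sketch below is one possible rendering of it, assuming the OpenAI Python SDK for both embeddings and generation. The model names, cosine-similarity retrieval, and top-k value are assumptions rather than details from the paper.

```python
# Illustrative RAG sketch: 500-word chunks, embeddings, cosine-similarity retrieval,
# and a chat-model answer grounded in the retrieved chunks. Model names are assumed.
from openai import OpenAI
import numpy as np

client = OpenAI()

def chunk_document(text: str, words_per_chunk: int = 500) -> list[str]:
    """Split a manufacturing document into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + words_per_chunk])
            for i in range(0, len(words), words_per_chunk)]

def embed(texts: list[str]) -> np.ndarray:
    """Turn each chunk into a numerical embedding vector."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

def retrieve(question: str, chunks: list[str], chunk_vecs: np.ndarray, k: int = 3) -> list[str]:
    """Return the k chunks most similar to the question (cosine similarity)."""
    q = embed([question])[0]
    sims = chunk_vecs @ q / (np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q))
    return [chunks[i] for i in np.argsort(sims)[::-1][:k]]

def answer(question: str, chunks: list[str], chunk_vecs: np.ndarray) -> str:
    """Ask the chat model to answer using only the retrieved manufacturing context."""
    context = "\n\n".join(retrieve(question, chunks, chunk_vecs))
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "Answer using only the provided manufacturing context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content
```

In practice, the chunk embeddings would be computed once and stored, so that each user query requires only a single embedding call plus the similarity lookup.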
weights as a percentage of the total final grade and were graded complete/incomplete based on meeting the assignment requirements. The use of GAI tools to complete assignments was permitted, since the authors believe that such tools could be important equity measures in a reading-heavy course [16], with the requirement that students attribute their usage of such tools wherever used, for example by signing assignments with "proofread by ChatGPT." Students were also encouraged, in line with some assignment requirements mentioned below, to experiment with various GAI assistants in writing and completing assignments, thus being able to determine which tool could best support which action. The course was offered in a synchronous HyFlex format [8], where
GPTZero and TurnItIn claim to identify whether a student's writing was produced by generative AI, but they are highly inaccurate. They tend to flag simple or predictable writing as AI-generated. Studies show that such false positives occur more frequently among certain groups, including

One key aspect of this paper is the distinction between proprietary and open-source large language models. Proprietary models, such as OpenAI's ChatGPT, are often considered less secure and privacy-invasive compared to open-source alternatives like Meta's Llama. Educating students on the
equitable and effective implementation. Ultimately, AI has the potential to revolutionize higher education by making learning more efficient, inclusive, and adaptable to the needs of the learner.

Keywords: Artificial Intelligence, higher education, personalized learning, adaptive teaching, student outcomes, data-driven education.

Introduction

In recent years, the integration of artificial intelligence (AI) into educational settings has captured significant public interest, eliciting both fascination and concern. According to Educause, while administrators and educators worry about AI undermining instructional quality, students have embraced AI tools like ChatGPT, appreciating their utility while remaining cautious about risks. This dichotomy underscores the
studies representative of student experiences from each category that expand on the model and its implications in higher education learning environments. The findings emphasize that learning is not a static process; students' interactions with AI tools evolve over time, influenced by their initial attitudes and skills. The implications of this paper extend to curriculum design, pedagogical approaches, and the broader integration of generative AI tools in higher education.

Introduction

The rapid advancement of generative artificial intelligence has revolutionized various industries, including education. As generative AI tools such as ChatGPT, Claude, and Gemini become increasingly accessible, educators are exploring their potential to transform teaching
identify trends in mental health apps since 2009. Privacy policy documents of mental health apps are also collected for analysis. ChatGPT is utilized to extract privacy-related metrics, such as the percentage of apps that reference privacy regulations like the General Data Protection Regulation (GDPR), the Health Insurance Portability and Accountability Act (HIPAA), and the Children's Online Privacy Protection Act (COPPA). LLMs and RAG are employed to answer critical privacy- and security-related questions from the dataset of privacy policy documents. These questions cover multiple categories, such as the types of user information collected, details on third-party data sharing, and whether users are given options to opt out of data collection.

Results:

Our
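The excerpt does not show how the regulation-reference extraction is performed; the following is a minimal sketch of one way it might be done, assuming the OpenAI Python SDK with JSON-mode output. The prompt wording, model name, and aggregation step are illustrative assumptions, not the study's implementation.

```python
# Illustrative sketch: asking an LLM which privacy regulations a policy references,
# then aggregating to a percentage across apps. Prompt and model name are assumed.
import json
from openai import OpenAI

client = OpenAI()

def referenced_regulations(policy_text: str) -> dict:
    """Return flags for GDPR, HIPAA, and COPPA mentions in one privacy policy."""
    resp = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},
        messages=[
            {"role": "system",
             "content": "Report whether the policy references GDPR, HIPAA, or COPPA. "
                        'Answer as JSON: {"GDPR": bool, "HIPAA": bool, "COPPA": bool}.'},
            {"role": "user", "content": policy_text},
        ],
    )
    return json.loads(resp.choices[0].message.content)

def regulation_percentages(policies: list[str]) -> dict:
    """Percentage of apps whose policies reference each regulation."""
    flags = [referenced_regulations(p) for p in policies]
    return {reg: 100 * sum(f[reg] for f in flags) / len(flags)
            for reg in ("GDPR", "HIPAA", "COPPA")}
```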
OpenAI's ChatGPT and DALL·E, have been utilized to support diverse educational needs, including content creation, question generation, and adaptive learning environments. For instance, generative AI can create dynamic learning modules tailored to individual student needs, enabling differentiated instruction and addressing varying levels of prior knowledge [5, 6]. This personalization is particularly valuable in engineering education, where students often face challenges in grasping complex concepts. In addition to personalized instruction, generative AI aids instructors by automating administrative tasks, such as grading and feedback provision. Automated grading tools powered by AI can evaluate assignments and exams efficiently while providing detailed
learning: Knowledge-building and knowledge-telling in peer tutors' explanations and questions. Review of Educational Research, 77(4), 534–574.
[2] Cohen, P. A., Kulik, J. A., & Kulik, C. C. (1982). Educational outcomes of tutoring: A meta-analysis of findings. American Educational Research Journal, 19(2), 237–248.
[3] Chen, A., Wei, Y., Le, H., & Zhang, Y. (2024). Learning-by-teaching with ChatGPT: The effect of teachable ChatGPT agent on programming education. arXiv preprint arXiv:2412.15226.
[4] Topping, K. J. (2005). Trends in peer learning. Educational Psychology, 25(6), 631–645.
[5] Goodlad, S., & Hirst, B. (1989). Peer Tutoring: A Guide to Learning by Teaching. London: Kogan Page.
[6] Biswas, G., Leelawong, K., Schwartz, D., & Vye, N
and scipy (version 1.7.1) for statistical computations. Tools such as large language models like ChatGPT are being implemented in educational institutions to provide personalized learning. The sentiment analysis component utilized multiple indicators. A moderate positive correlation was observed between self-reported AI
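As an illustration of the kind of statistical computation mentioned (scipy used for correlations involving self-reported AI measures), the sketch below uses scipy.stats. The choice of Pearson's r and the placeholder arrays are assumptions; the study's actual variables, measure, and values are not given here.

```python
# Illustrative sketch of a correlation computed with scipy (1.7.1). Pearson's r is
# assumed; the arrays below are placeholder data, not results from the study.
from scipy import stats

self_reported_ai_use = [2, 3, 5, 4, 1, 5, 3, 4]   # e.g., Likert-scale survey responses
outcome_scores       = [60, 68, 82, 75, 55, 88, 70, 78]

r, p_value = stats.pearsonr(self_reported_ai_use, outcome_scores)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")  # r around 0.3-0.6 would read as moderate positive
```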
has been heralded as an enticing opportunity to improve education, there are concerns with the ethical use of AI in education [6]. What are the academic implications of using an LLM to assess student assignments? Students are potentially penalized or punished if they use one in their assignments; how would the instructor's use of an LLM be different? Kumar [6] discusses this dilemma in terms of what is right and what is good: quick, high-quality feedback provided at reasonable cost and convenience is listed as a good, but these benefits are predicated on whether the AI's use is right or wrong.

There are also questions about how student information is processed in an LLM. Though services such as ChatGPT have a large number of users, if an assignment is loaded for
shown a wide range of interests in the AI-in-education domain, with the majority focusing on the applications, impacts, and potential of GenAI in education [2]. Studies explore the effects GenAI may have on academic practices and how it could shape the way individuals participate in academic activities and achieve educational outcomes. For example, Oguz et al. and Kasneci et al. examined the effectiveness of tools like ChatGPT as educational aids in personalizing learning [14], [15]. Abedi et al. investigated the integration of Large Language Models (LLMs) and chatbots in graduate engineering education, highlighting their potential to enhance self-paced learning, provide instant feedback, and reduce instructor workload [16]. Alasadi and Carlos, as well
disrupt inequities. Many widely used AI tools, such as ChatGPT, are trained on massive proprietary datasets controlled by private corporations, raising questions about data security, bias, and accessibility. These concerns are particularly pressing in education, where AI's role in student and faculty interactions must be critically examined. Without transparent and equitable governance, AI risks reinforcing existing power imbalances rather than dismantling them.

By centering only on the technical aspects of AI, we risk unintended consequences that reinforce systemic inequities, creating outcomes that disproportionately harm marginalized groups. Some researchers are exploring ethics, bias, and social responsibility regarding AI [5]. In this practice paper