]. The platform’s Python-based code editor, combined with ROS2 and Gazebo for simulation, enables students to apply programming concepts directly in a robotics context, bridging the gap between abstract coding exercises and real-world applications. One of the key motivations behind the development of the FORE platform is the need for a flexible and scalable educational tool that can adapt to the varying needs of students. For beginners, the platform provides structured lessons that gradually introduce core robotics concepts such as motion control, sensor integration, and path planning. For more advanced students, the platform offers opportunities to explore more complex robotic algorithms and systems
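Path planning of the sort these lessons introduce can be illustrated with a short, self-contained exercise. The sketch below is not taken from the FORE platform; the function name, grid, and obstacle layout are illustrative assumptions showing the kind of beginner exercise a Python-based robotics curriculum might assign: breadth-first search for a shortest collision-free route on an occupancy grid.

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (0 = free, 1 = obstacle).

    Returns a list of (row, col) cells from start to goal, or None if the
    goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # visited set doubling as a back-pointer map
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # reconstruct the path by walking back-pointers
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# A small illustrative map: the robot must route around two walls.
grid = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]
path = bfs_path(grid, (0, 0), (3, 3))
```

Because BFS expands cells in order of distance, the returned path is guaranteed shortest; a natural follow-on lesson would swap in A* with a Manhattan-distance heuristic.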
how these strategies impact success. In this study, the term “best practices” refers to guidelines that have been established for optimizing AI interactions during problem-solving tasks, for example in (OpenAI, 2024; Google, 2024). An IRB-approved plan guided data collection from the competition, where teams of three undergraduate students were encouraged to use generative AI to solve programming problems. Over 100 students participated. After the competition, students voluntarily submitted transcripts documenting their interactions with AI tools. These transcripts were examined using a directed content analysis (Hsieh and Shannon, 2005) to assess how well students followed prompt engineering best practices. The study findings reveal significant variability
information was highlighted in observations that AI eliminates “scrounging on Google to find a good explanation for a question...you get a direct response to your niche question.” Beyond student applications, participants recognized potential faculty benefits, noting that AI can assist in developing “assignment description[s] or...lesson plan[s]” and “help teachers generate homework and streamline the process of grading,” allowing educators to “spend more of their time focused on the students.”

3.1.5 Creative Support

AI’s utility as an ideation tool and starting point generator was identified in 42 responses (10.50% of the total). Participants valued how AI chatbots help students “get an idea as to how to start a certain problem...when they have no idea
plan to test the two classifier models on various types of student learning data, for example, live student engagement and performance data obtained from a learning management system, as this would help course instructors better assess student academic needs. Using our own data instead of an online dataset might also help address severe class imbalance issues. Additionally, it may be worthwhile to build hybrid models that combine RFC’s feature selection with MOC’s multi-output prediction, as this would allow for more precise predictions and a deeper understanding of how different aspects of student engagement and performance are interrelated. Furthermore, it could be valuable to incorporate qualitative data, such as teacher assessments
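One way such a hybrid could be wired together is sketched below. This is only a minimal illustration of the idea, not our implementation: the synthetic data, feature count, and top-5 cutoff are assumptions, with a random forest's impurity-based importances standing in for RFC feature selection and scikit-learn's `MultiOutputClassifier` standing in for MOC.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.multioutput import MultiOutputClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for engagement/performance data: 200 students x 10
# features, with two target labels (e.g., a performance outcome and an
# at-risk flag). Real data would come from a learning management system.
X = rng.normal(size=(200, 10))
Y = np.column_stack([
    (X[:, 0] + X[:, 3] > 0).astype(int),   # outcome 1 driven by features 0 and 3
    (X[:, 3] - X[:, 7] > 0).astype(int),   # outcome 2 driven by features 3 and 7
])

# Step 1: feature selection via a random forest's impurity-based importances
# (RandomForestClassifier accepts a 2-D Y, so importances reflect both outputs).
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, Y)
top = np.argsort(rf.feature_importances_)[-5:]   # keep the 5 strongest features

# Step 2: multi-output prediction trained on the reduced feature set.
moc = MultiOutputClassifier(
    RandomForestClassifier(n_estimators=100, random_state=0)
)
moc.fit(X[:, top], Y)
pred = moc.predict(X[:, top])   # one prediction per student per outcome
```

The appeal of this arrangement is that the first stage yields an interpretable ranking of which engagement features matter, while the second stage predicts all outcomes jointly from that reduced set.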
disabilities should have equal privileges to use an application like everyone else. To do better, we need to incorporate them in our plans during development. This inclusion ensures that their specific needs and challenges are addressed, leading to a more inclusive and user-friendly experience for all.”

5.2.3 Envisioned Accessible Designs and Practices

Students described accessibility features they would consider when designing mobile apps, such as voice commands for people with physical impairments and for blind people, built-in screen readers for blind people, and customizable versions for different disabilities. One student talked about how customizing text size could enhance older adults’ digital experience, applying knowledge learned to a new accessibility
time. Multiple participants shared their experience with the goal tracker feature of ClearMind:

I had a lot to focus on each day, but the goal tracker kept me on track with checking in with ClearMind. The daily progress was color-coded, which motivated me to fill it in every day. [This refers to the goal tracker feature, where if a user misses a day, the color for that day’s progress grays out.] As a visual learner, seeing my progress was helpful. It not only helped me with procrastination but also with career planning. I would keep using ClearMind because of that one feature I just talked about—the score. I want to see how high it goes [my score changes over time].

Many participants appreciated
(not replace) teachers.” These recommendations highlight a strong desire to instill ethical awareness and critical thinking as part of AI use in education, ensuring that AI tools are pedagogically supportive and not misused.

2. Customization and Personalization

Participants suggested that AI tools should be better tailored to meet the diverse learning needs of students and the instructional preferences of educators. Recommendations in this theme included:

“Offer personalized learning based on student pace and style.”
“Let educators customize AI outputs, lesson plans, and prompts.”
“Provide multimodal support (e.g., visuals, audio, interactive tools).”

These responses reflect a call for
model GC-induced write amplification in SSDLab. Real SSDs may have other sources of write amplification, such as wear leveling [3], [46].

shown in Table 1 provide none, or only a few lines, of information on storage. The first week is for HDD/SSD physical internals, which is related to the latency calculation. The second week is for SSD and FTL details. Our planned student workload per programming assignment is 50 to 60 lines of code over a duration of two weeks. Thus, we also provide two weeks of lab sessions to discuss the programming assignment. In the labs, we review the overall structure of the topics (i.e., how the FTL handles requests) and make sure students understand, at a high level, how to implement the assignments by asking and answering questions about
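The GC-induced write amplification mentioned above can be conveyed with a toy simulation of the kind such an assignment might build. The sketch below is our own illustration, not SSDLab's model: the geometry (4 pages per block, 8 blocks, 25% over-provisioning), the greedy victim policy, and all names are illustrative assumptions. Each page a victim block must migrate during garbage collection is one extra program beyond what the host requested, and the write amplification factor (WAF) is (host writes + GC writes) / host writes.

```python
import random

PAGES_PER_BLOCK = 4   # toy geometry, far smaller than a real SSD
NUM_BLOCKS = 8
LOGICAL_PAGES = 24    # 24 of 32 physical pages -> 25% over-provisioning

def simulate(num_host_writes, seed=0):
    """Return the WAF of a uniform random write workload under greedy GC."""
    rng = random.Random(seed)
    free = list(range(NUM_BLOCKS))              # erased blocks
    valid = [set() for _ in range(NUM_BLOCKS)]  # valid logical pages per block
    used = [0] * NUM_BLOCKS                     # programmed pages per block
    loc = {}                                    # logical page -> block
    host_writes = gc_writes = 0
    active = free.pop()                         # block currently being filled

    def reclaim_block():
        # Greedy GC: erase the fully programmed block with the fewest valid
        # pages, re-programming the survivors into it. Each migrated page is
        # one extra program -- the source of GC-induced write amplification.
        nonlocal gc_writes
        victim = min(
            (b for b in range(NUM_BLOCKS) if used[b] == PAGES_PER_BLOCK),
            key=lambda b: len(valid[b]),
        )
        used[victim] = len(valid[victim])       # erase, then re-program survivors
        gc_writes += used[victim]
        return victim

    for _ in range(num_host_writes):
        if used[active] == PAGES_PER_BLOCK:
            active = free.pop() if free else reclaim_block()
        lpn = rng.randrange(LOGICAL_PAGES)      # uniform random host write
        if lpn in loc:
            valid[loc[lpn]].discard(lpn)        # invalidate the stale copy
        valid[active].add(lpn)
        used[active] += 1
        loc[lpn] = active
        host_writes += 1

    return (host_writes + gc_writes) / host_writes

waf = simulate(2000)  # strictly above 1.0 once GC starts migrating pages
```

Shrinking `LOGICAL_PAGES` (more over-provisioning) drives the WAF toward 1.0, which makes the over-provisioning/amplification trade-off concrete for students in a few dozen lines.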
who completed the previous version of the course. We will recruit students from eight additional engineering courses to pilot the updated CS1 assessment in Spring 2025, anticipating at least 500 participants. Over the next two years, we will continue data collection as part of our longitudinal study to measure the long-term effects of the CS1 redesign on engineering students’ ability to apply computational tools in their respective fields.

We plan to revise the existing labs and incorporate group activities using the Process-Oriented Guided Inquiry Learning (POGIL) framework [50, 51]. POGIL is an instructional approach in which students work in structured groups with assigned roles, actively exploring concepts and constructing their own understanding