Virtual Online
June 22, 2020
June 26, 2021
Diversity and NSF Grantees Poster Session
A central feature of step-based tutoring systems, which are known to be more effective than conventional answer-based tutoring, is that they accept and evaluate each step of a student’s work and provide immediate feedback. Applying this approach to linear circuit analysis for topics such as superposition, source transformations, and Thévenin/Norton equivalent circuits requires allowing students to draw or re-draw circuits after killing sources, combining elements in series or parallel, making source transformations, or deriving equivalent circuits, and then providing appropriate feedback (this work is funded by the NSF IUSE program). Here, we describe the approaches used for this problem in our tutoring software and a systematic assessment of the resulting interfaces. Necessary features include the ability to “split” or shift a circuit (plotted on a square grid) to make room for new elements; to drag elements to new positions and reconnect them; and to change the type or value of elements, or delete them, as required in randomly generated circuit topologies. The “sought variables” (unknown currents, voltages, and powers) may also need to be transformed into different quantities to permit series and parallel simplifications. We have also implemented advanced simplification methods that are frequently useful for such problems, including removal of circuit sections that are removably hinged, voltage-splittable, or current-splittable, as discussed in our previous work. We describe the approaches and algorithms that implement this functionality and show how they extend step-based tutoring to situations that require it. We further describe automated generation of “transcripts” of student work, which addresses a key deficiency of computer-based instruction: students otherwise have no record of their work from which to study or review.
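To make the element-level rewrites concrete, the following is a minimal illustrative sketch (not the authors’ actual implementation; all function and field names here are hypothetical) of two of the simplification steps the tutor must recognize: a source transformation and series/parallel resistor combination.

```python
# Hypothetical sketch of two circuit-simplification steps a step-based
# tutor must accept; dictionaries stand in for real circuit elements.

def source_transform(v=None, i=None, r=None):
    """Convert a voltage source V in series with R into the equivalent
    current source I = V/R in parallel with R, or vice versa (I*R = V)."""
    if v is not None:
        return {"type": "current_source", "i": v / r, "r": r}  # R now in parallel
    return {"type": "voltage_source", "v": i * r, "r": r}      # R now in series

def combine_series(r1, r2):
    """Two resistors in series add directly."""
    return r1 + r2

def combine_parallel(r1, r2):
    """Two resistors in parallel combine as the product over the sum."""
    return r1 * r2 / (r1 + r2)

# A 12 V source in series with 4 ohms becomes a 3 A source in parallel
# with the same 4 ohms; 6 ohms parallel 3 ohms reduces to 2 ohms.
```

A real tutor would apply such rewrites to a grid-based circuit graph and compare the student’s redrawn circuit against the set of legal simplifications.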
We have also implemented video tutorials for each exercise to help overcome the learning curve associated with the user interface.
Independent assessment of these features was carried out by professional evaluators at several participating institutions, including some that heavily serve underrepresented minority populations. Evaluations used both quantitative and qualitative methods, including focus groups, surveys, and interviews with student and faculty participants. Controlled, randomized experiments were also carried out in a total of three course sections in Spring and Fall 2019 to compare the tutoring software to a commercial answer-based system (in Sp’19) and to conventional textbook-based paper homework (in F’19). A total of three advanced tutorials were assessed. In Fall 2019, students rated our software a mean of 4.14/5 for being helpful for learning the material vs. 3.05/5 for the paper homework (HW) (p < 0.001, effect size d = 1.11 σ; N = 43 and 41 in the two groups); rated software difficulty 2.91 (1 = extremely easy, 5 = extremely difficult) vs. 3.83 for the paper HW (p < 0.001, d = 1.10 σ); and rated preference for a different HW type 1.63 among software users vs. 3.90 among paper HW users (1 = somewhat disagree, 5 = strongly agree; p < 0.001, d = 2.07 σ). Assessment of learning via a post-test quiz and exam is in progress.
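The effect sizes above are reported in standard-deviation units (σ). A pooled-standard-deviation Cohen’s d is the conventional way to compute such values; the sketch below shows that calculation (a hypothetical helper — the authors’ exact computation is not shown in this abstract).

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d with a pooled sample standard deviation:
    d = (mean_a - mean_b) / s_pooled."""
    na, nb = len(group_a), len(group_b)
    sa, sb = stdev(group_a), stdev(group_b)  # sample (n-1) standard deviations
    s_pooled = (((na - 1) * sa**2 + (nb - 1) * sb**2) / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / s_pooled
```

An effect size of d ≈ 1.1 σ, as reported for the helpfulness and difficulty ratings, indicates the group means differ by roughly one pooled standard deviation — a large effect by conventional benchmarks.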
Skromme, B. J., Redshaw, C., Gupta, A., Gupta, S., Andrei, P., Erives, H., Bailey, D., Thompson, W. L., Bansal, S. K., & Barnard, W. M. (2020, June). Interactive Editing of Circuits in a Step-based Tutoring System. Paper presented at the 2020 ASEE Virtual Annual Conference Content Access, Virtual Online. https://doi.org/10.18260/1-2--34859
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2020 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015