Baltimore, Maryland
June 25, 2023
June 28, 2023
Engineering Technology Division (ETD)
10.18260/1-2--42749
https://peer.asee.org/42749
Brian Ngac is Deputy to the Vice President of Digital Engineering Research & Development Programs at Parsons Corporation’s Defense & Intelligence Unit, and a PhD Candidate (ABD) at George Mason University’s College of Engineering & Computing. He holds 12 internationally recognized cyber security and management certifications, including the C|CISO, CISSP, ISSMP, CISM, and PMP. His areas of expertise are cyber security, digital engineering (RDT&E), and business process improvement (solving business challenges with technology solutions). His research focuses on cyber executive management, expert crowdsourcing, and decision analytics.
Mihai Boicu, Ph.D., is Assistant Professor of Information Technology at George Mason University. He has published over 120 peer-reviewed publications, including 4 books. He performs theoretical and applied research in Artificial Intelligence, Machine Learning, Probabilistic Reasoning, Crowdsourcing, and Engineering Education. He has received more than $3M in funding from NSF, DARPA, IARPA, AFOSR, the IC, and other government agencies.
State-of-the-art curriculum development is typically carried out by a committee of two to four faculty members, but is most commonly undertaken by the assigned course instructor alone. However, the small number of faculty participants in the curriculum development effort can yield an out-of-date and insufficient curriculum for students entering the industry workforce (Thompson & Purdy, 2009; Nakayma, 2012; Gupta, 2016). Crowdsourcing has been used to gather more input from domain experts, consisting of faculty and industry professionals (Woodward et al., 2013; Satterfield et al., 2010; Nakayma, 2012). However, these efforts can yield large numbers of inputs from various crowd workers, requiring additional time for the curriculum owner or committee to sort through all inputs, organize them into categories, identify similarities, and determine which topics to include in the curriculum based on the crowd’s consensus [citation anonymized]. In a previous experiment, we found that 1) crowdsourcing efforts can be effective at gathering inputs that both confirm or reject existing topics and yield additional topics to be included; 2) continuing and dynamic expert crowdsourcing is helpful in building consensus on newly suggested topics for the curriculum; and 3) manual aggregation of crowdsourced inputs is an inefficient process [citation anonymized]. This paper proposes the integration of a consensus-building function into a crowdsourcing platform to dynamically and automatically validate and integrate both structured inputs (topics with confirmed representation) and semi-structured inputs (newly suggested curriculum topics) from the expert crowd when developing a course curriculum. The topics are organized in an ontology with a hierarchical “subtopic of” relationship, and each topic has an associated consensus value representing the agreement among crowd participants about the inclusion of the topic (and its representation).
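The topic ontology described above can be sketched as a simple hierarchical data structure. This is a minimal illustration, not the paper's implementation: the class name, field names, and initial values are assumptions introduced for clarity.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the topic ontology: each topic carries a
# consensus value (agreement among crowd participants) and a
# "subtopic of" hierarchy expressed through a list of subtopics.
# All names and defaults are illustrative, not from the paper.

@dataclass
class Topic:
    name: str
    consensus: float = 0.0              # agreement about inclusion, in [0, 1]
    votes: int = 0                      # number of crowd inputs received so far
    subtopics: list["Topic"] = field(default_factory=list)

    def add_subtopic(self, child: "Topic") -> None:
        """Record that `child` is a subtopic of this topic."""
        self.subtopics.append(child)

# Example: a fragment of a network-and-data-security ontology.
root = Topic("Managing Network and Data Security")
root.add_subtopic(Topic("Encryption at Rest"))
root.add_subtopic(Topic("Network Segmentation"))
```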
When a worker from the expert crowd provides feedback on an existing topic, the consensus value for that topic is updated (increasing or decreasing the agreement). To ensure confidence in the consensus value, a minimum number of inputs, called the confidence threshold, is required; it is established based on the number of available workers and the expected working time. A topic may be in one of three states: under review (after it is proposed and before consensus is reached), accepted (if the consensus value is above the acceptance threshold with the required confidence), or rejected (if the consensus value is below the rejection threshold with the required confidence). While the crowd process focuses on the topics under review, the consensus value continues to be updated, so further refinement of both accepted and rejected topics remains possible. The process continues until consensus is reached on all topics. In this paper, we describe an experiment performed using a dynamic crowdsourcing platform with a consensus-building function for a new cybersecurity curriculum, and we compare it with a previous experiment in which we used a typical crowdsourcing platform for another cybersecurity curriculum. The current cybersecurity curriculum focuses on managing network and data security, while the previous curriculum focused on managing information security with vendors and partners.
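The state logic above can be sketched as follows. This is a plausible reading under stated assumptions, not the paper's actual algorithm: the threshold values, the running-average update rule, and all function names are illustrative.

```python
# Illustrative sketch of the three topic states (under review, accepted,
# rejected) and the consensus update. Threshold values and the averaging
# rule are assumptions chosen for the example, not the paper's method.

def topic_state(consensus: float, votes: int,
                confidence_threshold: int = 10,
                accept_threshold: float = 0.7,
                reject_threshold: float = 0.3) -> str:
    """Classify a topic given its consensus value and input count."""
    if votes < confidence_threshold:
        return "under review"           # not enough inputs for confidence yet
    if consensus >= accept_threshold:
        return "accepted"
    if consensus <= reject_threshold:
        return "rejected"
    return "under review"               # enough inputs, but no clear consensus

def update_consensus(consensus: float, votes: int,
                     agree: bool) -> tuple[float, int]:
    """Fold one worker's input into the consensus as a running average,
    moving the value toward 1.0 (agree) or 0.0 (disagree)."""
    new_votes = votes + 1
    new_consensus = (consensus * votes + (1.0 if agree else 0.0)) / new_votes
    return new_consensus, new_votes
```

Under this sketch, a topic with strong agreement but few inputs stays under review until the confidence threshold is met, matching the paper's requirement that confidence precede acceptance or rejection.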
Ngac, B. K., & Boicu, M. (2023, June). Consensus Building Method for Expert Crowdsourcing of Curriculum Topics. Paper presented at the 2023 ASEE Annual Conference & Exposition, Baltimore, Maryland. 10.18260/1-2--42749
ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2023 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015