make adaptations to suit students with less flexible schedules, especially engineering students, reflected a commitment by faculty and administrators to be entrepreneurial in seizing opportunities to develop the program.

Engineering Changes

As entrepreneurship activities proceeded in the Business Department, a first-year introductory course in the Engineering division, intended to familiarize students with computer applications for engineers, was modified in 2013 to follow a new paradigm wherein course content was presented in parallel with a real-world engineering consulting project. Topic-specific lectures focused on requisite computer application, analysis, and writing skills were paced with periodic "business meetings." Those meetings were related to a
twenty competency clusters. We have developed our framework based on the competencies proposed by Waychal et al.,8 who propose a smaller, reasonable subset of Vloke's clusters. That, we posit, is a good starting point.

We developed the framework on the axiom that the throughput of a learning process increases significantly with active participation, intense reflection, and collaborative work on case studies and real-life projects, i.e., student-centered learning. We have synergistically combined these elements to ensure the targeted outcomes of the workshop: the ability to explain creativity and innovation and their underlying dynamics, and the ability to apply that understanding to provide innovative solutions to real-life problems. We do not
) teach with examples and cases, 5) prime student motivation and use formative assessment.8 Learning blocks were created, refined, and utilized in our two most recent Tech-E camps to see if they could maintain the same level of engagement with learners while incorporating deeper learning and entrepreneurship concepts. Learning blocks were designed to take advantage of key strategies found in project-based learning, such as tackling realistic problems using the learner's knowledge, increasing learners' control over their learning, involving instructors who serve as coaches/facilitators of inquiry and reflection, and utilizing either pairs or groups in the process.9,10 The challenge portions of the blocks introduce some key entrepreneurship components
most useful for addressing challenges that are complex, require many people, and in which there is a high degree of uncertainty about the best approach.1 This set of conditions holds true far beyond product development.

One such scenario is the planning and implementation of organizational interventions: an environment in which "strategic planning" is often the tool of choice but one that is ineffective in a networked (rather than hierarchical) context. An alternative approach, described in this paper, is "strategic doing." As in agile product development, the approach uses iterative cycles of implementation, learning and reflection, and improvement, with a focus on rapid experimentation and gradual scaling up of solutions. While not designed for
actionable commercial outcomes, whereas Entrepreneurship refers to executing on those opportunities in fulfillment of commercial outcomes. There is extensive overlap across the continuum from creation/discovery to execution and outcome. Center programs and lead personnel assignments are being designed to focus on the areas of the continuum most appropriate for the stage of the opportunity, the individuals involved, the technology being pursued, and other relevant factors. This I&E continuum, as we envision it, is reflected in Figure 1.

Key initiatives of the Center include creation of an Innovators & Entrepreneurs guest speaker series, execution of a coordinated pitch competition strategy, development of a mentor network, enhanced curricular programming
was administered to determine skills gains and team accomplishments and to reflect on the participants' experience in the program. An alumni survey was administered six months after the conclusion of the summer program to check in on Catalyze teams.

Survey questions were rated on a one-to-five Likert-type scale. Surveys also included open-ended questions to gather verbal responses that supported the numerical ratings. Numerical results were evaluated against a 70% cutoff (a rating of 3.50/5.00), with activities rated above 3.50 considered evidence of program success. Results were presented by the assessment specialist and evaluated at a programmatic
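The cutoff rule described above reduces to comparing each survey item's mean rating against 3.50 on the 5-point scale. A minimal sketch of that comparison follows; the function name and the sample ratings are illustrative assumptions, not program data.

```python
def meets_cutoff(ratings, cutoff=3.5):
    """Return (mean, passed) for a list of 1-to-5 Likert ratings.

    An item counts as evidence of program success when its mean
    rating exceeds the 70% cutoff (3.50 on a 5-point scale).
    Note: 'ratings' here are hypothetical example values.
    """
    mean = sum(ratings) / len(ratings)
    return round(mean, 2), mean > cutoff

# Two hypothetical survey items:
print(meets_cutoff([4, 5, 3, 4]))  # (4.0, True)  -> counts as success
print(meets_cutoff([3, 3, 4, 3]))  # (3.25, False) -> below cutoff
```

In practice the same comparison would be applied per question, with the open-ended responses used to interpret items near the cutoff.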