CROWDSOURCING: AN INSTRUCTIONAL METHOD AT AN EMERGENCY MEDICINE CONTINUING EDUCATION COURSE
It struck me during one of the Emergency Medicine Update Europe conferences that I organize that there is an incredible amount of expertise and experience in the room during the presentations. While an "expert" stands at the front of the room giving a presentation, the richest interaction happens during the discussion and debate. Many in the audience have also given continuing education presentations and are leaders in the field of Emergency Medicine. I wondered: how could I harness the wisdom of the crowd through a facilitated conversation?
At our Emergency Medicine Update Europe Conference in Spain in 2013, I tried a new teaching method for 15 minutes on each day of the 5-day conference. A description and evaluation of this "crowdsourcing" technique was published online in 2015 in the Canadian Journal of Emergency Medicine.
Introduction
Crowdsourcing is the practice of obtaining needed services, ideas, or content by soliciting contributions from a large group of people, especially from an online community. Crowdsourcing has often been used to solicit opinions and solve problems. The purpose of this study was to describe and evaluate a novel teaching method, a "crowdsourcing technique," at a traditional continuing education (CE) event.
Methods
Emergency Medicine (EM) Update Europe was a 5-day conference consisting of 15 forty-five-minute traditional CE presentations. Sixty-three physicians registered for the conference, most (n=57) from Canada. Prior to the start of the conference, registrants were contacted by email and invited to submit up to 3 problems, controversies, or questions related to EM that they would like to have discussed during the conference. On each day of the conference, a 15-minute period was devoted to an open crowdsourcing discussion facilitated by the course director (PI) using the list of rank-ordered topics submitted. Participants were asked to complete an anonymous paper survey after the last crowdsourcing activity. The survey consisted of a 15-item questionnaire exploring satisfaction and attitudes towards the activity (Kirkpatrick evaluation levels 1 & 2). In addition, quantitative observational data were collected on the frequency of participation during each topic.
Results
Twelve topics were discussed over 5 days. An average of 45 registrants was present for each topic (range 42–48; SD 1.83), with an average of 9 (SD 2.84) participants contributing to each conversation. Thirty-two unique individuals (67%) contributed to at least one of the conversations. Thirty-nine of 48 participants completed the survey (response rate 81%). Most (79%) believed that they had knowledge and/or expertise to share with their colleagues. Of the 39 respondents, 80% reported that they had contributed at least once during the crowdsourcing conversations. Most (77%) enjoyed the activity, and 74% found the crowdsourcing conversation valuable. Most (64%) reported that they often or very often trusted the opinions of those speaking during the activity. Over half (56%) of the respondents reported that they learned something during the crowdsourcing activity.
Conclusions
Deliberate crowdsourcing was perceived as a worthwhile teaching method at an EM CE conference.