Maternal and Child Health Journal. doi:10.1007/s10995-014-1585-x

Evaluation of the 2012 18th Maternal and Child Health (MCH) Epidemiology and 22nd CityMatCH MCH Urban Leadership Conference: Six Month Impact on Science, Program, and Policy

Danielle E. Arellano, David A. Goodman, Travis Howlette, Charlan D. Kroelinger, Mark Law, Donna Phillips, Jessica Jones, Mary D. Brantley, and Maureen Fitzgerald

Author affiliations: Division of Reproductive Health, National Center for Chronic Disease Prevention and Health Promotion, Centers for Disease Control and Prevention, Atlanta, GA, USA (Arellano, Goodman, Howlette, Kroelinger, Phillips, Brantley); Oak Ridge Institute for Science and Education, Oak Ridge, TN, USA (Arellano, Howlette); CityMatCH, University of Nebraska Medical Center, Omaha, NE, USA (Law, Fitzgerald); Maternal and Child Health Bureau, Health Resources and Services Administration, Washington, DC, USA (Jones)

Corresponding author: Danielle E. Arellano, Division of Reproductive Health, National Center for Chronic Disease Prevention and Health Promotion, Centers for Disease Control and Prevention, 4770 Buford Hwy NE, MS F-74, Chamblee, GA 30341, USA; hiz6@cdc.gov

© Springer Science+Business Media New York (outside the USA) 2014

Abstract

The 18th Maternal and Child Health (MCH) Epidemiology and 22nd CityMatCH MCH Urban Leadership Conference took place in December 2012, covering MCH science, program, and policy issues. Assessing the impact of the Conference on attendees’ work 6 months post-Conference provides information critical to understanding the impact and the use of new partnerships, knowledge, and skills gained during the Conference. Evaluation assessments, which included collection of quantitative and qualitative data, were administered at two time points: at Conference registration and 6 months post-Conference. The evaluation files were merged using computer IP address, linking responses from each assessment. Percentages of attendees reporting Conference impacts were calculated from quantitative data, and common themes and supporting examples were identified from qualitative data. Online registration was completed by 650 individuals. Of registrants, 30 % responded to the 6 month post-Conference assessment. Between registration and 6 month post-Conference evaluation, the distribution of respondents did not significantly differ by organizational affiliation. In the 6 months following the Conference, 65 % of respondents reported pursuing a networking interaction; 96 % shared knowledge from the Conference with coworkers and others in their agency; and 74 % utilized knowledge from the Conference to translate data into public health action. The Conference produced far-reaching impacts among Conference attendees. The Conference served as a platform for networking, knowledge sharing, and attaining skills that advance the work of attendees, with the potential of impacting organizational and workforce capacity. Increasing capacity could improve MCH programs, policies, and services, ultimately impacting the health of women, infants, and children.

Keywords: MCH; Capacity building; Impact assessment; Conference evaluation
Introduction

For the first time, from December 12–14, 2012, the Maternal and Child Health Epidemiology Program/Division of Reproductive Health (DRH)/Centers for Disease Control and Prevention (CDC), CityMatCH, and the Maternal and Child Health Bureau (MCHB)/Health Resources and Services Administration (HRSA) co-hosted the 18th Maternal and Child Health (MCH) Epidemiology and 22nd CityMatCH MCH Urban Leadership Conference (referred to as ‘the Conference’ [1]). The goal of the Conference was to advance partnerships in MCH data, practice, and policy; ultimately, the Conference offered a platform for sharing information, enhancing knowledge, and generating new ideas for improving the health and well-being of mothers and children. By developing a collaborative co-hosted Conference, partnering agencies were able to integrate scientific and programmatic/policy areas in the field of MCH into one venue. Because this was a combined Conference, the Planning Committee was able to implement a policy and program track of seminars, symposia, and breakout sessions which complemented the epidemiology track traditionally offered to attendees. Additional knowledge and skill-based development opportunities for MCH professionals included pre-Conference trainings devoted to scientific writing, spatial analysis, quality improvement, and leadership. These trainings were followed by the multi-day Conference.

Evaluation has become critical to understanding the usefulness, impact, and influence of conference content for professionals applying concepts in real-world settings. Long-term follow-up on knowledge and skills obtained during conferences is essential for determining conference impacts [2, 3]. Additionally, Neves et al. [2] have identified five elements essential to conference evaluation: (1) pre-determining conference objectives, (2) defining the purpose of the evaluation, (3) developing a methodology for evaluation, (4) outlining indicators of success, and (5) selecting a theory or model. To determine the impact of the Conference on the practice of federal, state, and local MCH professionals, the Conference Planning Committee implemented a structured, prospective evaluation. The Committee chose to focus on elements 1–4, with an emphasis on the following indicators of success:

Usefulness of knowledge acquired at the Conference in professional settings.

Follow-up on networking interactions to develop collaborative activities.

Program/policy change based on Conference outputs.

Using a 6 month post-Conference assessment, this article provides evaluation results summarizing the impacts of the Conference on attendees' professional networking interactions and their application of new knowledge and skills attained during the Conference.

Methods

Assessments were administered at two time points. The first assessment was administered at the time of online Conference registration (September–December 2012). Data from this assessment represented all fully registered and paid attendees, and documented professional role, organizational affiliation, and expectations for the Conference. The follow-up assessment was administered 6 months after the Conference (June–July 2013). Questions on this assessment measured the impact and use of the Conference content and initiation of new networking interactions. Both assessments were administered online and included closed and open-ended questions. No incentive was offered for completion of any assessment. The 6 month post-Conference follow-up assessment is the focus of this report, with supporting information from the registration data. The assessments were considered program evaluation, categorized as non-research, and exempted from human subjects review by the CDC.

Two variables from the registration data were used in the analysis: the primary professional role and the organizational affiliation of Conference attendees. These two variables were also used to determine whether the distributions of professional roles and organizational affiliations differed between Conference registrants and 6 month post-Conference assessment respondents. An alpha level of 0.05 was the threshold for statistical significance.
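A minimal sketch of this comparison is shown below. The article names STATA as the analysis software but does not publish code, so the dataset name, variable names (prof_role, org_affil, assessment), and test choice (Pearson chi-square) are assumptions for illustration only.

```
* Hypothetical sketch: one record per person per assessment, where
* assessment = 0 for registration and 1 for the 6 month follow-up.
use "conference_evaluation.dta", clear

* Compare the distribution of professional role and of organizational
* affiliation across the two assessments (alpha = 0.05).
tabulate prof_role assessment, column chi2
tabulate org_affil assessment, column chi2
```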

Variables used in the analysis of the 6 month post-Conference assessment focused on the following five impact areas. Open- and closed-ended questions pertaining to these variables are shown in Table 1.

The development of networking interactions and/or new relationships with Conference attendees, and products or impacts resulting from networking opportunities.

The transfer of knowledge gained at the Conference among co-workers and colleagues.

The application of MCH epidemiologic knowledge learned at the Conference, as well as resulting products or impacts.

The application of MCH program and policy knowledge learned at the Conference, as well as resulting products or impacts.

The use of knowledge gained at the Conference to translate data into public health action.

Data from each of the Conference evaluation files were imported into Microsoft Excel. The evaluation files were linked by a unique identifier, created for each respondent based on IP address. All identifiable information was removed from the files prior to analysis. The 6 month evaluation file and the registration file were then merged using STATA v11. Linking registration information allowed the researchers to describe the five impact areas by attendee demographic characteristics. Quantitative data were cleaned and descriptive analyses were developed using STATA v13 and SAS 9.3. Responses to open-ended questions were reviewed to identify common themes and specific impact examples related to the quantitative findings.
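A minimal sketch of the linkage and descriptive steps described above follows. The file names, variable names (respondent_id standing in for the IP-derived identifier, networking_followup for the yes/no networking item), and export format are assumptions for illustration, not the authors' actual code.

```
* Hypothetical sketch of the merge and descriptive analysis.
* De-identified 6 month follow-up responses, exported from Excel.
import excel using "followup_6month.xlsx", firstrow clear

* Merge with the registration data on the respondent-level identifier.
merge 1:1 respondent_id using "registration.dta", keep(match) nogenerate

* Percentage reporting each impact, overall and by registration characteristics.
tabulate networking_followup
tabulate networking_followup prof_role, column
tabulate networking_followup org_affil, column
```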

Results

Online registration was completed by 650 individuals, 197 (30 %) of whom responded to the 6 month post-Conference assessment and were linked by IP address. The distribution of respondents did not significantly differ between the registration and 6 month post-Conference assessments by organizational affiliation, as shown in Table 2. Additionally, the distribution of respondents by professional role did not significantly differ between assessments, although we observed an increase in the percentage of those self-reporting as epidemiologists at the 6 month follow-up (from 27 to 37 %) and a decrease in administrators/managers (from 31 to 25 %).

Pursuing Networking Interactions

In the 6 months following the Conference, 65 % of respondents reported pursuing a networking interaction, with variation by professional role (range 45–100 %; Table 3) and organizational affiliation (range 53–94 %; Table 3). MCH professionals described how networking interactions resulted in multiple impacts. Common themes included: developing job opportunities, sharing technical expertise, contributing to public presentations, producing peer-reviewed publications, building relationships, and collaborating on projects.

Two specific examples provided by respondents were:

An epidemiologist said “As a result of a meeting at the Conference for Region V epidemiologists and MCH Directors, I obtained feedback to improve data sheets describing sources of excess infant mortality in Region V. I have continued ongoing work with the epidemiologists to assess the potential impact of various factors (such as smoking, breastfeeding, family planning) that may inform the selection of programmatic strategies for their participation in [the Collaborative Innovation and Improvement Network (CoIIN) to Reduce Infant Mortality].”, and

A program administrator said “After networking with some of the March of Dimes attendees and finding out how they had initiated work regarding the early elective delivery issue, I [returned home] and am in the process of talking with some of the folks that were recommended; we now have a monthly meeting of several groups within the state, discussing and acting on this issue.”

Sharing of Knowledge Gained at the Conference

In the 6 months following the Conference, 96 % of respondents stated they shared knowledge from the Conference with their co-workers and others in their agency, with little variation by professional role (range 90–100 %) or organizational affiliation (range 89–100 %). Knowledge transfer occurred through a variety of methods, techniques, and approaches. Common themes included: use of framing for communication, use of data to inform program development, and use of information for teaching and training.

A specific example provided by one respondent who self-identified as a medical professional captures the diverse ways that knowledge was shared by attendees:

“[I am] now incorporating [communication and] framing messages [learned in a Conference session] in the community meetings I facilitate (i.e., reframing how we think about teens). Also, I incorporated slides from the framing session into a professional staff workshop that I conducted. [Additionally,] I am using more data and talking about data to staff and the community [on] how to use data to develop programs.”

Applying New Knowledge in Epidemiology, Program, and/or Policy, and Translating Data into Public Health Action

In the 6 months following the Conference, 73 % of respondents stated they had applied MCH epidemiology skills, methods, or practices learned at the Conference in their work, with substantial variation by professional role (range 45–92 %) and organizational affiliation (range 47–88 %). A similar overall percentage of respondents (78 %) reported applying MCH program and policy skills, methods, or practices learned at the Conference in their work in the 6 months after the Conference. While there was variation by professional role (range 75–86 %) and organizational affiliation (range 65–86 %), the ranges were narrower than for epidemiology skills, methods, or practices. In the 6 months after the Conference, 74 % of respondents said that they had utilized knowledge from the Conference for translating data into public health action, with variation by professional role (range 50–83 %) and organizational affiliation (range 56–85 %).

The application of new knowledge from the Conference impacted the practice of MCH. Common themes included: using new tools (software programs, Life Course metrics) and research methods, publishing scientific work, integrating information into public health decision-making and presentations (programmatic, public, and scientific), and improving data skills.

Some specific examples provided by program or organizational managers were:

“Because I’m a nurse manager I come back with a better understanding of the importance of accurate data and I look for ways to improve it within my own health system,” and

“[From applying knowledge learned at the Conference], we have completed a Community Health Needs Assessment in conjunction with local partners and are now in the process of evaluating that data so that we can begin a Community Health Improvement Plan.”

One attendee was “made aware of uses of different data systems and [is] planning to use data sources to assess and monitor [public health programs].” Others also worked within their network to strategically integrate epidemiologic knowledge to impact programmatic work: “I worked with our state Privacy Office to access real-time birth data to identify elective deliveries prior to 39 weeks to inform our Perinatal Quality Collaborative effort to reduce elective deliveries.”

Discussion

Through networking and the sharing and application of new knowledge, attendees of the 2012 Conference achieved potentially far-reaching methodological, programmatic, and policy-related impacts. The Conference served as a platform for networking, with more than half of respondents following up on a networking interaction. The impacts of networking interactions included promoting capacity building through internships and jobs, and increasing sustainability within organizations through work groups and strategic decision-making. Almost all respondents shared new knowledge from the Conference with colleagues and partners, including both technical program and epidemiological knowledge. Continued attendee sharing of knowledge and skills has the potential to increase and broaden the impact of the Conference. Additionally, respondents went beyond sharing new knowledge to applying new knowledge and skills gained at the Conference. Approximately three-quarters of respondents applied epidemiological, program and policy, and translation knowledge from the Conference, further improving the efficiency and effectiveness of their work.

Many conference evaluations measure immediate post-conference indicators of satisfaction, conference quality, and intention to act on learned knowledge [4–6]. The current evaluation is distinct in that the Conference Evaluation Committee evaluated the impact of the Conference following an extended time period post-Conference [7, 8]. Responses to the areas of impact in the 6 month follow-up assessment varied by organizational affiliation, with community-based organizations, non-governmental organizations, and local health department staff indicating lower percentages applying new epidemiologic knowledge and translating data into public health action. A guiding principle of the Conference is to have an equally positive impact on all attendees, regardless of professional role or organizational affiliation. To ensure equitable impact, Travers et al. [9] recommend implementing Community-Based Participatory Research workshops and/or sessions highlighting rigorous and higher quality science to better engage attendees at the local and community levels. Further, to support a cohesive, consistent, and broad impact, Wiessner et al. [10] recommend selecting a conference-specific theory, ‘New Learning,’ as part of conference planning and evaluation. To engage attendees at all organizational levels at this Conference and better integrate methods and practice, a new policy and program track was planned to complement the traditional epidemiology track. In response, the majority of attendees indicated they applied MCH program and policy skills, methods, or practices learned at the Conference in their work, reinforcing the value of adding the program and policy track to the Conference. Together with the epidemiology track, attendees received information that fully engaged them in scientific, developmental, and policy discussions, providing them with a more well-rounded experience.

The evaluation of the 2012 co-hosted 18th MCH Epidemiology and 22nd CityMatCH MCH Urban Leadership Conference provides a baseline methodology for future Conference evaluations, in which the five impact areas will continue to be monitored. Variations in impacts by professional role and organizational affiliation suggest potential opportunities for improvement and are under consideration by the Conference Planning Committee. Conducting focus groups within professional role type and organizational affiliation may help to identify specific strategies for increasing impact among those groups with a lower prevalence of interaction follow-up and knowledge application.

Limitations of the evaluation include an inability to examine variation in impact by demographic differences of respondents, such as age or length of career in public health, which may have helped to identify focused opportunities for Conference improvement. The data are self-reported, which may affect the findings. Respondents who had a positive experience at the Conference may be more inclined to share impact activities than those whose experiences were less positive; therefore, response bias may have affected the evaluation results. While a 30 % response rate at 6 months post-Conference is comparable to or better than response rates documented in other published conference evaluations [2, 9], there is still opportunity to increase response rates. These are areas to address in future evaluations. Although the authors recognize these limitations, they also note that the distribution of professionals who responded was similar at registration and at the 6 month post-Conference follow-up.

This evaluation provides evidence that conferences can expand information sharing and networking, promote the application of new knowledge, and support the translation of data to propel public health action. Considered together, the impacts documented in this evaluation indicate that the Conference positively influenced the practice of MCH at the local, state, and federal levels, as well as across professional roles, creating potential for downstream impact on organizational and workforce capacity that leads to changes in the implementation of programs and policies in the field of MCH. The subsequent longer-term outcome of these impacts is improved health among women, infants, and children.

Acknowledgments

The authors would like to thank the Maternal and Child Health Epidemiology Program at the Centers for Disease Control and Prevention (CDC), the Maternal and Child Health Bureau at the Health Resources and Services Administration (HRSA), the Association of Maternal and Child Health Programs (AMCHP), and CityMatCH for their scientific, administrative, and programmatic contributions to this article. Additionally, the authors would like to thank the 2012 Ad Hoc Conference Planning Committee for their participation in collecting data for the evaluation and the 2012 Conference Evaluation Committee, a subcommittee of the Ad Hoc Conference Planning Committee (denoted in bold), for their design of the conference evaluation tools: Chad Abresch, MEd, Director, CityMatCH; Folorunso Akintan, MD, MPH, Montana-Wyoming Tribal Leaders Council; Danielle Arellano, MPH, CDC; Mary Balluff, MS, Chief, Community Health and Nutrition Services, Douglas County Health Department; Wanda Barfield, MD, MPH, CDC; Mary Brantley, MPH, CDC; Elizabeth Conrey, RD, PhD, CDC Assignee to Ohio Department of Health; Deborah Dee, PhD, CDC; Maureen Fitzgerald, MPA, CityMatCH; David A. Goodman, PhD, CDC; Violanda Grigorescu, MD, MSPH, CDC; Cynthia Harding, MPH, Director, Los Angeles County Department of Public Health; Travis Howlette, MPH, CDC; Jessica R. Jones, MPH, HRSA; Russell S. Kirby, PhD, MS, FACE, University of South Florida; Laurin Kasehagen, PhD, CDC Assignee to CityMatCH; Michael Kogan, PhD, HRSA; Charlan D. Kroelinger, PhD, CDC; Mark Law, PhD, CityMatCH; Leslie O'Leary, PhD, CDC; Joyce Martin, MPH, CDC; Patricia O'Campo, PhD, University of Toronto; Donna Phillips, MPH, CDC; Ellen Pliska, MHS, Association of State and Territorial Health Officials; Italia Rolle, PhD, RD, CDC; Kristin Rankin, PhD, School of Public Health, University of Illinois at Chicago; William Sappenfield, MD, MPH, Chair, Department of Community and Family Health, University of South Florida; Laura S. Snebold, MPH, National Association of City and County Health Officials; Caroline Stampfel, MPH, AMCHP; Gina Thornton-Evans, DDS, MPH, CDC; Calondra Tibbs, MPH, Memphis and Shelby County Health Department; Keila Torres, JD, BSN, RN, Drexel University, College of Nursing and Health Professions; Myra Tucker, RN, CDC; and Lee Warner, PhD, MPH, CDC.

Disclaimer: The findings and conclusions in this report are those of the author(s) and do not necessarily represent the official position of the Centers for Disease Control and Prevention, Health Resources and Services Administration, or the Department of Health and Human Services.

References

1. Archived maternal and child health epidemiology conferences. Accessed May 29, 2014, from http://www.cdc.gov/reproductivehealth/MCHEpi/ArchivedMCHEPiConf.htm
2. Neves, J., Lavis, J. N., & Ranson, M. K. (2012). A scoping review about conference objectives and evaluative practices: How do we get more out of them? Health Research Policy and Systems, 10, 26.
3. Foster, J., Guisinger, V., & Graham, A. (2010). Global government health partners' forum 2006: Eighteen months later. International Nursing Review, 57, 173–179.
4. Curwick, C. C., Reeb-Whitaker, C. K., & Connon, C. L. (2003). Reaching managers at an industry association conference: Evaluation of ergonomics training. American Association of Occupational Health Nurses Journal, 51(11), 464–469.
5. Mahoney, M. C., Michalek, A. M., & Wiggins, C. L. (1996). Native American cancer conference III: Cognitive correlates and impressions of attendees. Cancer, 78(7), 1533–1538.
6. Rootman, I., & Edwards, P. (2006). As the ship sails forth. Canadian Journal of Public Health, 97(Suppl 2), S43–S48.
7. Lalonde, B., Wolvaardt, J. E., & Webb, E. M. (2007). A process and outcomes evaluation of the international AIDS conference: Who attends? Who benefits most? Medscape General Medicine, 9(1), 6.
8. Farnum, K., McCarthy, M. L., & Beauchesne, M. A. (2005). The primary care for the underserved conference as a building block to social capital: Impact on practice, research, and education. Journal of Cultural Diversity, 12(4), 126–135.
9. Travers, R., Wilson, M., & McKay, C. (2008). Increasing accessibility for community participants at academic conferences. Progress in Community Health Partnerships: Research, Education, and Action, 2(3), 257–264.
10. Wiessner, C. A., Hatcher, T., & Chapman, D. (2008). Creating new learning at professional conferences: An innovative approach to conference learning, knowledge construction, and programme evaluation. Human Resource Development International, 11(4), 367–383.

Table 1 Six month post-Conference assessment open- and closed-ended questions with corresponding impact areas

Question series ("…in the last 6 months") | Question type | Impact area

Networking assessment
I have followed up on a networking interaction and/or new relationship with a Conference attendee | Closed-ended: yes/no | The development of networking interactions and/or new relationships with Conference attendees, and products or impacts resulting from networking opportunities
Please describe any products or impacts that have resulted from networking opportunities at the Conference | Open-ended | (impact area as above)

Application of Conference content
I have shared knowledge gained at the Conference with my co-workers/colleagues and others in my agency | Closed-ended: yes/no | The transfer of knowledge gained at the Conference among co-workers and colleagues
I have applied MCH epidemiology knowledge (i.e., skills, methods, or practices) that I learned from the Conference in my work | Closed-ended: yes/no | The application of MCH epidemiologic knowledge learned at the Conference, as well as resulting products or impacts
I have applied MCH program/policy related knowledge (i.e., skills, methods, or practices) that I learned from the Conference in my work | Closed-ended: yes/no | The application of MCH program and policy knowledge learned at the Conference, as well as resulting products or impacts
I have utilized my knowledge gained from the Conference about translating data into public health action in my work | Closed-ended: yes/no | The use of knowledge gained at the Conference to translate data into public health action
Please describe any products or impacts that have resulted from your application of information learned at the Conference | Open-ended | (applies to the application impact areas above)

Table 2 Percentage of respondents indicating primary professional role and organizational affiliation, at registration and at 6 month post-Conference follow-up

Characteristic | Conference registration (%) (N = 650) | Six months post-Conference (%) (N = 197)

Primary professional role of Conference attendee
Epidemiologist | 27 | 37
Statistician | 8 | 7
Teacher/faculty | 6 | 4
Student/fellow | 8 | 7
Administrator/manager | 31 | 25
Medical health professional | 15 | 15
Other | 5 | 6
Overall | 100 | 100

Organizational affiliation of Conference attendee
State health department | 23 | 29
University | 20 | 14
Community based organization | 6 | 5
Nongovernmental organization | 8 | 9
Federal government | 11 | 8
Local health department | 17 | 21
Multiple | 8 | 9
Other | 7 | 6
Overall | 100 | 100

Table 3 Percentage of respondents who indicated an impact within the five impact areas at 6 months post-Conference, by primary role and organizational affiliation of respondent

Respondent group (n) | Followed up on a networking interaction (%) | Transferred knowledge to coworkers (%) | Applied new epidemiology knowledge (%) | Applied new program/policy knowledge (%) | Translated data into public health action (%)

Primary professional role of Conference attendee (N = 197)
Epidemiologist (n = 72) | 67 | 99 | 86 | 77 | 74
Statistician (n = 14) | 57 | 93 | 79 | 86 | 62
Teacher/faculty (n = 8) | 100 | 100 | 75 | 75 | 50
Student/fellow (n = 13) | 46 | 100 | 92 | 83 | 83
Administrator/manager (n = 50) | 66 | 96 | 56 | 76 | 76
Medical health professional (n = 29) | 72 | 90 | 69 | 79 | 82
Other (n = 11) | 45 | 91 | 45 | 82 | 73

Organizational affiliation of Conference attendee (N = 197)
State health department (n = 57) | 53 | 97 | 86 | 77 | 73
University (n = 28) | 86 | 96 | 68 | 85 | 64
Community based organization (n = 9) | 56 | 89 | 67 | 78 | 56
Nongovernmental organization (n = 17) | 71 | 100 | 47 | 65 | 71
Federal government (n = 16) | 94 | 100 | 88 | 86 | 75
Local health department (n = 41) | 59 | 90 | 68 | 80 | 85
Multiple (n = 18) | 67 | 100 | 65 | 78 | 82
Other (n = 11) | 64 | 100 | 82 | 73 | 64

Overall | 65 | 96 | 73 | 78 | 74