Maternal and Child Health Journal (Matern Child Health J)
ISSN 1092-7875 (print), 1573-6628 (online). DOI: 10.1007/s10995-012-1190-9

Building Analytic Capacity, Facilitating Partnerships, and Promoting Data Use in State Health Agencies: A Distance-Based Workforce Development Initiative Applied to Maternal and Child Health Epidemiology

Kristin M. Rankin, Division of Epidemiology and Biostatistics, University of Illinois at Chicago School of Public Health, Chicago, IL, USA (krankin@uic.edu)
Charlan D. Kroelinger, Division of Reproductive Health, National Center for Chronic Disease Prevention and Health Promotion, Centers for Disease Control and Prevention, Atlanta, GA, USA
Deborah Rosenberg, Division of Epidemiology and Biostatistics, University of Illinois at Chicago School of Public Health, Chicago, IL, USA
Wanda D. Barfield, Division of Reproductive Health, National Center for Chronic Disease Prevention and Health Promotion, Centers for Disease Control and Prevention, Atlanta, GA, USA

Abstract

The purpose of this article is to summarize the methodology, partnerships, and products developed as a result of a distance-based workforce development initiative to improve analytic capacity among maternal and child health (MCH) epidemiologists in state health agencies. This effort was initiated by the Centers for Disease Control and Prevention's MCH Epidemiology Program and faculty at the University of Illinois at Chicago to encourage and support the use of surveillance data by MCH epidemiologists and program staff in state agencies. Beginning in 2005, distance-based training in advanced analytic skills was provided to MCH epidemiologists. To support participants, this model of workforce development included lectures on the practical application of innovative epidemiologic methods, development of multidisciplinary teams within and across agencies, and systematic, tailored technical assistance. The goal of this initiative evolved to emphasize the direct application of advanced methods to the development of state data products using complex sample surveys, resulting in the articles published in this supplement to the Maternal and Child Health Journal. Participating MCH epidemiologists applied innovative methods, including regional analyses across geographies and datasets, multilevel analyses of state policies, and new indicator development. Support was provided for developing cross-state and regional partnerships and for developing and publishing the results of analytic projects. This collaboration was successful in building analytic capacity, facilitating partnerships, and promoting surveillance data use to address state MCH priorities, and may have broader application beyond MCH epidemiology. In an era of decreasing resources, such partnership efforts among state and federal agencies and academia are essential for promoting effective data use.

Keywords: Workforce development, Analytic capacity, Partnerships, Professional development, MCH leadership
Purpose

The Maternal and Child Health Epidemiology Program (MCHEP) within the Division of Reproductive Health at the Centers for Disease Control and Prevention (CDC) has been sponsoring capacity-building activities to enhance analytic efforts and increase infrastructure in state and local agencies for over 25 years [1]. The primary goal of the MCHEP has been to assign senior maternal and child health (MCH) epidemiologists to state agencies that have demonstrated need for enhanced analytic capacity in MCH. The MCHEP has committed to supporting the work of the host agencies and MCH epidemiology assignees by providing continuing education, workforce development, and opportunities to collaborate with state staff.

To best address the epidemiologic and analytic needs of assignees and state staff, MCHEP began partnering with faculty in the Division of Epidemiology and Biostatistics at the University of Illinois at Chicago (UIC) School of Public Health in 2005 to provide distance-based workforce development for their MCH Epidemiology assignees, other senior epidemiologists working in state health agencies, and select junior epidemiologists, fellows, program staff and administrators. Reflecting the values of the MCHEP and UIC faculty, as well as efforts of MCH epidemiologists working in different settings across the country, the initiative focused on the practical application of methodological concepts and data use to address state priorities. The objective during the first year of the workforce development initiative was to provide training in specialized regression techniques that participants may not have been exposed to in their academic programs or other work. The objectives evolved in subsequent years to include the application of advanced methods to answer research questions of state agency priority and MCH focus, using state-based surveillance data such as the Pregnancy Risk Assessment Monitoring System (www.cdc.gov/prams), and in recent years, other national-level surveillance systems. This ongoing, now seven-year partnership has culminated in many of the articles that are compiled in this special supplement of the Maternal and Child Health Journal.

The purpose of this article is to chronicle the development, evolution, and impact of this distance-based model for improving analytic capacity among MCH epidemiologists, from its inception to the present. We will describe how the objectives, audience, and activities changed over time and document the data products that resulted from this initiative. We also will highlight partnerships that were established and/or enhanced through these efforts and discuss challenges to successfully implementing this distance-based model of workforce development, especially in state agencies. We will conclude by discussing the potential future impact of these efforts.

Description

Four iterations of the workforce development initiative were undertaken between 2005 and 2012, each with a different set of objectives and outputs. Table 1 lists the years, analytic focus, required data products, and participating state/agency teams for each.

2005–2006: Training in Advanced Regression Methods

The objective during the first year of this initiative was to provide advanced training in specialized regression techniques. The syllabus was organized into three modules covered during one calendar year: (1) log binomial and Poisson regression with count data; (2) Poisson regression with person-time data and survival analysis/proportional hazards modeling; and (3) cumulative and generalized logit modeling for ordinal and nominal outcome variables. Each module consisted of four alternating lecture and discussion sessions. Lectures were applied in nature, focusing less on statistical details and more on performing the analysis and interpreting the results. Data management and analysis issues that staff encounter in the field, such as variable selection, recoding, and iterative model-building, were covered in detail. SAS (SAS Institute, Cary, NC) coding techniques were shared, and all didactic material was accompanied by practical examples using MCH data.
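
To make the applied flavor of these modules concrete, the sketch below shows the kind of SAS code such a module might cover for estimating risk ratios directly: a log binomial model, with a modified Poisson model as a fallback. The dataset and variable names (births, low_bwt, smoker, mat_age35, id) are hypothetical placeholders, not the actual course materials.

```sas
/* Minimal sketch of a log binomial model estimating a risk ratio directly.
   Dataset and variable names are hypothetical placeholders. */
proc genmod data=births descending;
  model low_bwt = smoker mat_age35 / dist=binomial link=log;
  estimate 'RR: smoker vs nonsmoker' smoker 1 / exp;
run;

/* Modified Poisson alternative with a robust (sandwich) variance,
   a common fallback when the log binomial model fails to converge.
   Exponentiate the smoker coefficient to obtain the risk ratio. */
proc genmod data=births;
  class id;
  model low_bwt = smoker mat_age35 / dist=poisson link=log;
  repeated subject=id / type=ind;
run;
```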

Discussion sessions allowed participants to share their results from analysis assignments, which focused on applying each new regression technique to an analysis of the 2002 public use perinatal mortality file from the National Center for Health Statistics (NCHS) (www.cdc.gov/nchs). Conversation focused on ways that different individuals and teams approached the problem. Participants’ skills were enhanced by practicing new methods and receiving detailed feedback about their work from colleagues and faculty members.

2007–2008: Analysis of Data from the Pregnancy Risk Assessment Monitoring System (PRAMS)

To better integrate the activities of this initiative into the daily work of participants, the MCHEP partnered with states during the second iteration to support the development of usable products. The objectives for 2007–2008 therefore focused on developing a fact sheet, report, or peer-reviewed manuscript to address a state priority for MCH populations. Topics were limited to those for which relevant PRAMS data were available, and state teams chose postpartum depression, obesity, cesarean sections, infant sleep position, preterm delivery, routine health care for women, and other issues of importance to their state's MCH population.

By 2007, 28 states and New York City had PRAMS programs and it had become a priority of CDC and the states to utilize PRAMS data for program planning and evaluation. However, many MCH epidemiologists had not previously analyzed data from a complex sample survey. Therefore, much of the didactic material provided by UIC faculty focused on analysis methods, software packages, and procedure syntax required to appropriately analyze PRAMS data, accounting for the weighting and sampling design. Practical issues were highlighted, such as creating composite variables, model-building, appropriately handling stratified and subpopulation analyses in the context of a complex sampling design, and considering techniques for presenting results that would most effectively translate the findings of the study. Focusing on PRAMS alone allowed for the joint exploration by participants and faculty of specific content areas, and teams shared alternative approaches to coding indices for variables such as the thirteen-question scale for stressful life events and the multiple questions about smoking during different times in the pregnancy. At the end, members of each state team presented their final products to all participants and received feedback from colleagues and faculty before finalizing projects. This process promoted idea-sharing, including how other teams might perform similar analyses in their own states. The results of many of these projects were also disseminated more widely. With the assistance of faculty, twelve teams submitted abstracts to the 2008 Maternal and Child Health Epidemiology Conference (Abstracts listed at: http://webcast.hrsa.gov/conferences/mchb/mchepi2008/index.htm), at which participants presented their work in three special sessions devoted to PRAMS analyses. In addition, one team went on to publish their manuscript examining associations between having a partner who was controlling or threatening and perinatal depression [2].
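
As an illustration of the design-based approach emphasized in these sessions, the sketch below shows how PRAMS-style weighted and stratified analyses might be set up in SAS survey procedures. The dataset, weight, stratum, and variable names are hypothetical placeholders; actual PRAMS design variables differ by state and year.

```sas
/* Minimal sketch of a design-based analysis of a stratified survey
   such as PRAMS; all names here are hypothetical placeholders. */
proc surveyfreq data=prams;
  strata stratumvar;
  weight analysis_wt;
  tables smoked3tri * pp_depress / row chisq cl;
run;

proc surveylogistic data=prams;
  strata stratumvar;
  weight analysis_wt;
  class insurance (ref='Private') / param=ref;
  model pp_depress(event='1') = smoked3tri insurance mat_age;
  domain first_birth;  /* subpopulation analysis without subsetting the file */
run;
```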

2008–2009: Using PRAMS Data to Inform the State Title V Needs Assessment

To build on what had been learned about utilizing the PRAMS dataset, and to support the states in meeting the federal requirement to produce a data-supported needs assessment in 2010 for their Title V MCH Block Grant, MCHEP and UIC faculty redefined the objectives of the initiative for 2008–2009. The emphasis changed from hypothesis testing to analytic approaches used to inform priority setting. Data products were still required, but were focused on monitoring progress toward achieving objectives and performance measurement. Specifically, each team was asked to identify three indicators available in PRAMS that were related to national, state, or local priorities. Next, each team developed and implemented analytic approaches to most effectively use the indicators to inform the needs assessment and support Title V priorities identified through the assessment process.

Didactic material was still delivered in the form of mini-lectures, the topics of which were prioritized by participants. UIC faculty and invited guest lecturers partnered to present conceptual and technical material about predicted prevalence estimates from multivariable regression models, population attributable fractions, small area analysis, and presentation of results in a needs assessment framework. As in the previous iteration, final products were shared and discussed among the teams before being finalized and used to support the states’ needs assessments and the Title V MCH Block Grant applications.
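
As a small worked example of one of these topics, the sketch below computes a population attributable fraction from an exposure prevalence and a risk/prevalence ratio using Levin's formula. The numbers are hypothetical, and the crude formula is shown only for illustration; adjusted estimates require different formulations.

```sas
/* Illustrative calculation of a population attributable fraction (PAF)
   using Levin's formula with hypothetical numbers: p_e is the prevalence
   of exposure in the population and rr is the estimated ratio. */
data paf;
  p_e = 0.20;    /* e.g., 20% of mothers exposed */
  rr  = 1.8;     /* estimated risk/prevalence ratio */
  paf = (p_e * (rr - 1)) / (1 + p_e * (rr - 1));
  put 'Estimated PAF = ' paf percent8.1;
run;
```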

2009–2012: Analysis of Data from Complex Sample Surveys

In 2009, the MCHEP and UIC faculty developed a more ambitious plan to support state teams in efforts to more widely disseminate the important work being carried out in their agencies. A proposal to publish this special supplement of peer-reviewed articles authored by participants in the initiative was submitted to and approved by the Maternal and Child Health Journal. State teams chose to perform in-depth analyses of a complex sample survey, such as the National Survey of Children's Health (NSCH, http://www.cdc.gov/nchs/slaits/nsch.htm) or the Behavioral Risk Factor Surveillance System (BRFSS, http://www.cdc.gov/brfss/), to answer a research question important to MCH populations.

Again, systematic, tailored technical assistance (TA) from faculty and peer-to-peer exchange were prioritized over lecture material, but experts were invited to provide lectures on innovative methods of interest to the participants, such as propensity scores, multiple imputation, sub-state analysis of state-based surveys, and multilevel modeling. These lectures were interspersed with the analytic work and led some teams to adopt new methods to most appropriately address their research questions.
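
To give a flavor of how one of these lecture topics might translate into practice, the sketch below outlines a simplified multiple-imputation workflow in SAS (impute, analyze each completed dataset, then combine estimates). All dataset and variable names are hypothetical placeholders, and a real analysis would need to choose imputation models appropriate to the variable types and survey design.

```sas
/* Step 1: create 5 imputed datasets. The default MCMC method assumes
   multivariate normality; other imputation models may be preferable
   for categorical variables. Names are hypothetical placeholders. */
proc mi data=brfss_sub out=brfss_mi nimpute=5 seed=20120101;
  var mat_age educ_yrs income screened;
run;

/* Step 2: fit the design-based model within each imputed dataset. */
proc surveylogistic data=brfss_mi;
  by _imputation_;
  strata ststr;
  cluster psu;
  weight finalwt;
  model screened(event='1') = mat_age educ_yrs income;
  ods output ParameterEstimates=parms;
run;

/* Step 3: combine estimates across imputations with Rubin's rules. */
proc mianalyze parms=parms;
  modeleffects Intercept mat_age educ_yrs income;
run;
```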

Assessment

While the objectives of the workforce development initiative changed over time, there were commonalities in technology, structure, and expectations across the four iterations. To convene and connect participants working in agencies from Massachusetts to Hawaii, Microsoft LiveMeeting® with synced audio conference calls was used for sharing lecture slides, state team presentations, and real-time demonstrations of SAS coding and procedures. All sessions were recorded and archived so participants could view or review them at their convenience. Syllabi, datasets, assignments, and lecture materials were shared via an online learning portal, Blackboard®, made available through UIC. Associated discussion boards allowed for the exchange of ideas and programming code across teams. All participants were provided with a UIC account to access Blackboard and, as a result, were also able to utilize the library's electronic journal subscriptions to review the scientific literature, which were otherwise unavailable to some at their respective agencies.

A core requirement from 2007 forward was the submission of an analysis plan for each state team’s data product. The analysis plan was due during the first few months and required a statement of the team’s research question(s); background about the topic; selection of variables (independent, dependent, covariates) from the survey data set; a step-by-step statistical analysis plan that included initial descriptive, bivariate, and stratified analyses using contingency tables as well as proposed modeling procedures; and, finally, table shells laying out how results might be presented. This prompted teams to carry out a thoughtful review of the literature and conceptualization of the project before even beginning to work with data. UIC faculty gave detailed feedback about the structure of the teams’ research questions and hypotheses and provided suggestions for improvement of the analysis plan.

Feedback also was solicited from peers through small group conference calls, during which teams shared their progress on analyses with three to four other teams working on similar projects. Individual TA calls with faculty were required for each team at several points throughout the process. These calls not only gave teams intermediate deadlines to encourage progress, but also provided opportunities for teams to discuss analytic or logistical issues, brainstorm possible solutions, and share technical discoveries or lessons learned. Targeted benchmarks were set for each call to encourage incremental progress in carrying out the analysis plan.

Innovative Methods

For many teams, the opportunity to publish results in a peer-reviewed journal provided the incentive to address their research questions by applying some of the new and innovative methods to which they were exposed by UIC and other faculty. Common to several teams were projects examining geographic disparities in new ways (Bish C. et al.; Short V. et al.; Kasehagen L. et al.; Robl J. et al.; Herrera D. et al.). As a result of comparing geographic areas and constructing more meaningful subpopulations and regions, special analytic issues were encountered. For example, the US-Mexico Border team used two related surveys, the BRFSS, which was implemented on the US side of the border, and the Mexican National Survey of Health and Nutrition (ENSANut), a similarly structured survey implemented on the Mexico side of the border, to describe cross-border access to cervical cancer screening (Herrera D. et al.). Because the sampling design differed for each of these surveys, expert statistical advice was sought from statisticians on each side of the border to allow for combined analysis of the two datasets. In addition, language barriers and software capacity issues were overcome as a result of this partnership. The flexible structure of this workforce development initiative supported this and other efforts to generate new and important information about specific regions within and across states and countries.

Several teams used the new regression techniques they learned, such as log binomial regression (Lu E. et al.; Hernandez L. et al.), or used specialized methods with logistic regression (Bish C. et al.; Herrera D. et al.), to appropriately estimate prevalence ratios rather than odds ratios, especially with common outcomes. Another team created a thoughtful new composite measure to simultaneously assess access to and quality of health care provided to children, and used generalized logit modeling to examine this multi-category measure ([3], Ogbuanu C. et al.).
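
The sketch below illustrates the generalized logit approach for a multi-category outcome in a survey setting, in the spirit of the composite access/quality measure described above. The dataset, design, and variable names are hypothetical placeholders rather than the actual analysis.

```sas
/* Minimal sketch of a generalized logit (multinomial) model for a
   multi-category outcome fit to complex survey data; all names are
   hypothetical placeholders. */
proc surveylogistic data=nsch_state;
  strata stratum;
  cluster psu;
  weight nschwt;
  class povcat (ref='400% FPL or more') / param=ref;
  model care_composite(ref='Adequate access and quality') = povcat child_age
        / link=glogit;
run;
```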

Other teams explored the utility of applying cutting-edge statistical techniques to MCH issues. Prompted by a guest lecture from an expert in multilevel modeling of complex survey data [4], a team with members from Missouri and Florida performed multilevel modeling and estimated median and interval odds ratios [5] to describe the variance in preventive dental care across states and the contribution of state-level characteristics and policies to this variance (Lin M. et al.). Another team applied a new epidemiologic approach to mediation analysis [6, 7] and extended the method to health services research and health disparities (Bennett A. et al.). Overall, the emphasis on performing higher-level analyses, accompanied by the training materials and TA that were provided, has resulted in innovative projects that improve the knowledge base in MCH.
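
As a simplified illustration of the multilevel approach, the sketch below fits a two-level random-intercept logistic model with children nested in states and then converts the estimated state-level variance into a median odds ratio using the formula in Merlo et al. [5]. Survey weights are omitted for brevity (Carle [4] discusses incorporating design weights), and all names and numbers are hypothetical placeholders.

```sas
/* Simplified two-level random-intercept logistic model with children
   nested in states; hypothetical dataset and variable names. */
proc glimmix data=nsch method=laplace;
  class state;
  model prev_dental(event='1') = child_age insured state_policy
        / dist=binary link=logit solution;
  random intercept / subject=state;
run;

/* Median odds ratio (Merlo et al. [5]) computed from the estimated
   state-level intercept variance (illustrative value shown). */
data mor;
  var_state = 0.15;   /* replace with the GLIMMIX covariance estimate */
  mor = exp(sqrt(2 * var_state) * probit(0.75));
  put 'Median odds ratio = ' mor 6.2;
run;
```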

Collaboration and Partnerships

Collaboration and partnerships were emphasized throughout the process to ensure that this applied and innovative work by MCH epidemiologists generated evidence that was directly applicable to program planning and policy making to improve the lives of women, children, and families. Beyond the partnerships among MCHEP, state agencies, and UIC that provided the foundation for this initiative, the activities prompted new partnerships within and across state agencies, as well as with experts at the federal level and in academia, who collaborated on some of the state projects.

Beginning in 2007, the MCHEP assignee or other senior MCH epidemiologist within each state agency was named as the team leader and was encouraged to build a team to jointly contribute to the conceptualization, analysis, and dissemination of the state data product, and participate in other activities of the initiative. Depending on the availability of staff, which varied widely across states, the teams included junior epidemiologists, fellows or interns, program staff, administrators, and others. When PRAMS data were the focus, team leaders built stronger relationships with the PRAMS staff in their states. For the articles in this supplement, many teams reached out to relevant program staff to contribute to the conceptualization of the project and to comment on the policy and public health significance of the findings. For example, the team focusing on state-level factors influencing preventive dental care reached out to content-area experts from the Florida Department of Public Health Oral Health Program to help with the conceptualization of their analysis (Lin M. et al.).

Other collaborations across states and regions were developed or enhanced as a result of initiative activities. For example, if assignees were working in states that did not have PRAMS data available by 2007, a partnership was formed with an existing PRAMS state, resulting in enhanced capacity and partnership for both state agencies. The MCHEP assignee to Iowa partnered with participants in Nebraska to analyze Nebraska PRAMS data. This experience using another state’s PRAMS data strengthened Iowa’s subsequent successful application for PRAMS funding. For participants from Virginia, a state in which the PRAMS effort did not start until mid-2007, a partnership with Arkansas’s existing PRAMS program provided enhanced analytic capacity for Arkansas while helping prepare epidemiologists from Virginia to analyze their own data when it became available. For their article in the current supplement, the MCH Epidemiologists in Ohio and Pennsylvania partnered to examine preconception health indicators in the Appalachian region of the United States, a region that spans both states’ borders (Short V. et al.).

Successes and Challenges

As this workforce development initiative evolved over time, participants were challenged to disseminate their work more broadly, making it easier for them to justify their participation to their supervisors in state agencies and to demonstrate the benefits of supporting high-level analytic work. Additionally, because participants directly applied newly acquired skills in their work, the didactic content was reinforced, building analytic capacity within state agencies. The structure of the initiative provided the foundation for participants to perform above and beyond normal expectations in a state agency setting. The accountability and the motivation to increase the complexity of their work that this structure provided resulted in more detailed and nuanced evidence to inform program planning and policy in their agencies.

These activities were not without challenges, however. This initiative has spanned a difficult period (2005–2012) for state agencies across the country, given the budget crises faced by many state governments. Low staff morale, turnover, hiring freezes, and vacancies often plagued the teams and, unfortunately, a few teams did not continue their participation in the initiative from 2007 to 2009, especially those without MCHEP assignees to sustain momentum in the face of these challenges. The inclusion of fellows and interns, especially fellows from the Council of State and Territorial Epidemiologists (CSTE), as integral members of teams helped tremendously in moving projects forward. However, the two-year time limit on fellowships and the inability of many agencies to hire fellows as permanent employees after their fellowships were complete did not allow for the continuity and consistency needed for optimal team functioning.

Estimating and providing the time needed to produce fact sheets, reports, and manuscripts for peer review were often difficult given the other daily responsibilities of the staff to their respective agencies. While participation in the initiative was required for MCHEP assignees, it was an extra activity for the assignees and other state staff. It also was difficult to pace the activities and deadlines given the heterogeneity of teams and situations across state agencies. Some teams were able to meet all deadlines or even felt delayed by the pace of the feedback and next steps, while others struggled to keep up. Many participants had never before submitted manuscripts for peer review and needed more TA than was anticipated regarding how to respond to peer reviewers’ comments and how to format materials according to journal specifications. For future efforts, faculty could consider setting individualized targets and deadlines for each team, taking into account the skill level of team members and resources in each agency.

Data access often became an issue, especially as teams explored more complex research questions. During 2007–2008, one team pursued a linkage between their state's PRAMS data, birth certificates, Medicaid and WIC participation indicators, and Healthy Start program data for an evaluation of their state's Healthy Start prenatal services. They faced layers of legal and bureaucratic hurdles at the state level, driven by concerns about privacy and confidentiality, that took almost a year to resolve. Teams pursuing geographic analyses faced similar access issues given the restriction of geographic codes in many datasets. For example, codes identifying counties with fewer than 50 respondents in a year were suppressed in the public-use BRFSS data files [8]. For one project examining 1997–2005 BRFSS data, a partnership with CDC allowed indicators of Appalachian or non-Appalachian county residence to be linked to observations for women living in counties with suppressed codes (Short V. et al.); this is not possible for data from 2006 forward, given more restrictive confidentiality policies that require the suppression of certain county codes. These challenges affected the timeliness of the Appalachian regional analysis and may have implications for future geographic analyses using BRFSS. This highlights the importance of finding new ways to facilitate informative small-area geographic analyses while preserving the confidentiality of respondents.

For analyses using data from the NSCH, CDC staff partnered with the NCHS research data center (RDC) to access data that are suppressed from the public use dataset, which facilitated the use of detailed Rural–Urban Continuum Area codes for an analysis of factors related to physical activity in children (Kasehagen L. et al.). Since this team was not in geographic proximity to Atlanta or Washington, D.C., where RDCs are located, a remote web-based system was used that allowed for one approved team member to submit data requests, one at a time, to analyze the restricted data. This process was cumbersome for a complex multivariable analysis requiring many preliminary analyses and iterative model-building. In addition, many useful options in the statistical software procedures were unavailable, requiring more manual work by the analyst. Finally, this process hampered teamwork given that the system sent results and output to only one email address. Nonetheless, the team learned from the experience and provided feedback about these challenges that may be useful as the RDC works to facilitate future use of restricted data by researchers.

Despite the challenges, the goal of promoting the use of MCH data to inform public health action was met. Though the objectives, data sources, and data products were refined over time, the common elements of this initiative reflected the values of the MCHEP and UIC faculty members to support high quality, applied analytic work to best meet the evolving needs of participants and their agencies over time.

Conclusion

This partnership between the MCHEP and UIC faculty to enhance the analytic capacity of MCH epidemiologists has promoted more rigorous analysis of MCH surveillance data to address state priorities and has resulted in the wider dissemination of the findings of this important and innovative work. This and future efforts to increase the quality and rigor of state data projects promote high level epidemiologic analysis as the norm in state agencies rather than the exception. The model of workforce development presented here may have broader application beyond MCH epidemiology, in other areas of state government, or in other organizations. In an era of decreasing resources, such partnership efforts among state and federal agencies and academia are essential for promoting effective data use.

Acknowledgments

We want to thank the staff of the Maternal and Child Health Epidemiology Program at CDC for their dedication and interest in this workforce development initiative over the years. Their assistance with coordination and their suggestions for improvement made this endeavor a success.

Disclaimer: The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention.

References

1. Kroelinger, C. (2012). Collaboration at the federal, state, and local levels to build capacity in maternal and child health: The impact of the Maternal and Child Health Epidemiology Program. Journal of Women's Health, 21(5), 471–475.
2. Blabey, M., Locke, E., Goldsmith, Y., & Perham-Hester, K. (2009). Experience of a controlling or threatening partner among mothers with persistent symptoms of depression. American Journal of Obstetrics and Gynecology, 201(2), e1–e9.
3. Ogbuanu, C., Goodman, D., Kahn, K., Noggle, B., Long, C., & Bagchi, S. (2012). Factors associated with parent report of access to care and the quality of care received by children 4 to 17 years of age in Georgia. Maternal and Child Health Journal, Suppl 1, S129–S142.
4. Carle, A. C. (2009). Fitting multilevel models in complex survey data with design weights: Recommendations. BMC Medical Research Methodology, 9, 49.
5. Merlo, J., Chaix, B., Ohlsson, H., Beckman, A., Johnell, K., & Hjerpe, P. (2006). A brief conceptual tutorial of multilevel analysis in social epidemiology: Using measures of clustering in multilevel logistic regression to investigate contextual phenomena. Journal of Epidemiology and Community Health, 60(4), 290–297.
6. Valeri, L., & VanderWeele, T. J. Mediation analysis allowing for exposure-mediator interactions and causal interpretation: Theoretical assumptions and implementation with SAS and SPSS macros. Psychological Methods, in press (obtained via personal correspondence).
7. VanderWeele, T. J., & Vansteelandt, S. (2010). Odds ratios for mediation analysis for a dichotomous outcome. American Journal of Epidemiology, 172(12), 1339–1348.
8. Centers for Disease Control and Prevention. (2011). Behavioral Risk Factor Surveillance System annual survey data. [cited 2012 August 28]. Available from: http://www.cdc.gov/brfss/technical_infodata/surveydata.htm

Table 1 Maternal and child health (MCH) Epidemiology Program/University of Illinois at Chicago distance-based workforce development initiative for MCH epidemiology analytic capacity building: analytic focus, data products, participating states/agencies, and participants by time period

Time period | Analytic focus | State data products | Participating states/agencies | Number of participants
2005–2006 | Training in advanced regression methods | None required | DE, FL, GA, HI, IA, LA, MA, ME, MI, NM, OH, OR, CDC, NPAIHB | 40
2007–2008 | Analysis of data from the Pregnancy Risk Assessment Monitoring System (PRAMS) | Fact sheet, report, or manuscript, plus abstracts to the MCH Epidemiology Conference | AK, AR/VA(a), DE, FL, HI, IA/NE(b), LA, MA, MI, MN, MS, OH, OR, VA, WA, US-MX Border MACH, CDC | 117
2008–2009 | Using PRAMS data to inform the state Title V needs assessment | Data fact sheets on three indicators identified as important for the state | AK, AR/VA(a), FL, GA, HI, IL, LA, MA, MI, MN/WI, MO, MS, NE/IA/WY(b), OH, WA, CDC | 115
2009–2012 | Analysis of data from complex sample surveys | Peer-reviewed manuscript for the MCHJ supplement | FL, GA, HI, IA/NE/WI/WY(c), IL, KY, LA, MA, MO, MS, OH, PA, US-MX Border MACH, AATCHB, CDC | 75

Abbreviations: NPAIHB, Northwest Portland Area Indian Health Board; MCHJ, Maternal and Child Health Journal; AATCHB, Aberdeen Area Tribal Chairmen's Health Board; US-MX Border MACH, United States-Mexico Maternal and Child Health initiative

(a) Arkansas PRAMS data were used, but the team also included members from states that had not yet implemented PRAMS surveys at that time

(b) Nebraska PRAMS data were used, but the team also included members from states that had not yet implemented PRAMS surveys at that time

(c) Representatives from all four states worked together on an analysis of national data