Cities Find New Tools For Action At Implementation Science Training 2014

This article is cross-posted from the National Forum on Youth Violence Prevention's March 2014 newsletter. See the original post here (PDF, 7 pages).

Despite weather conditions that shut down the federal government on the first scheduled day, the Implementation Science Training Institute, held March 3–5 in Washington, D.C., went smoothly and to the satisfaction of all who participated. The Office of Juvenile Justice and Delinquency Prevention (OJJDP) event included grantees from the National Forum on Youth Violence Prevention, the Defending Childhood Initiative, and the Community-Based Violence Prevention Initiative. Weather-related flight cancellations couldn't keep away the more than 40 participants, who attended in teams from Baltimore, Md.; Boston, Mass.; Camden, N.J.; Chicago, Ill.; Cleveland, Ohio; Detroit, Mich.; Grand Forks, N.D.; Kansas City, Mo.; Los Angeles, Salinas, and San Jose, Calif.; Memphis, Tenn.; Minneapolis, Minn.; Philadelphia, Pa.; and Portland, Maine. Federal staff also attended the training.

Allison Metz and Melissa Van Dyke, codirectors of the National Implementation Research Network, led the training. Over the 3 days of the event, Drs. Metz and Van Dyke presented, with great clarity (and humor), why implementation science is important, what can be learned from the evidence base about key dimensions of implementation, and how this information can be used to maximize the probability of reliably achieving desired outcomes when implementing particular plans, strategies, practices, and programs.

Content, Components, Connections

The first day of the training (relocated, because of the government closure, to the hotel where out-of-town participants lodged) concentrated on the content, components, and connections needed to implement a system change initiative. Metz and Van Dyke stressed that it can take 2 to 4 years to implement an evidence-based program and up to 10 years to successfully implement system change. Communities must also improve the political environment surrounding the system so that it produces the policy and funding changes needed to create and sustain the change. Even so, the opportunity for "quick, tangible wins" should not be overlooked: communities should look for components of their plans that are not yet well developed but are doable within 3 months, and communicate these "wins" to all partners. In addition to implementing evidence-based programs, communities should enhance their "practice-based evidence" models.

Saul Green, Detroit's Ceasefire project director, reported that his city used this approach in implementing its Safe Routes intervention (see "Safe Routes Send Kids to School With Confidence" in the February issue [PDF, 8 pages]) to continually figure out what is imperfect and to make improvements. The day ended with sites examining their plans to ensure that all components are clearly described and operationally defined.

Stages and Drivers of Implementation

The second day of training, at OJJDP headquarters, focused on the stages and drivers of implementation, as well as on implementation teams. The stages capture the developmental (but not necessarily linear) nature of the implementation process. An example used to illustrate this point was the importance of buy-in across the stages. Gaining buy-in is critical during exploration (the first of the four stages), but it is equally important to reaffirm it throughout the remaining stages (installation, initial implementation, and full implementation), such as by reminding those involved why they committed to the effort when the inevitable awkwardness of initial implementation arises.

The discussion of "drivers" offered insight into aspects of the infrastructure that create a "hospitable environment" for successful implementation. One example of a driver is coaching, which has been shown to dramatically increase the percentage of trainees who actually use, rather than merely comprehend, new skills: from 5 percent after training alone to 95 percent after coaching. Participants also learned about the importance of structuring linked teaming arrangements, with clear graphic depictions of how teams are linked, what the "terms of reference" (operating parameters and procedures) are, and how teams are accountable to one another.

Collective Impact, Quality Improvement

The third day considered collective impact and quality improvement cycles. Time was given at the end of the morning for teams to identify action items they would address upon their return home.

One component of the training that the sites embraced was the "table time" built into the agenda, when individuals from a single site could work together to discuss how the information being presented could help reinvigorate their efforts. In their evaluations, many participants cited "hearing from other cities about their efforts" as the most useful part of the training. The in-person meeting allowed site representatives to discover issues they had in common, as well as to recognize issues unique to their cities. Participants also identified numerous topics they would like to explore with other sites, and they discussed opportunities for peer-to-peer calls and mentoring.

Using the Framework as a Diagnostic Tool

Participants expressed enthusiasm about the training, with many noting they wished they could have had it "a year or 2 ago," when they were getting their projects off the ground. At the same time, it became clear that many felt overwhelmed trying to find the time and resources to "do it all."

Dr. Metz agreed that it would be impossible to do it all. Instead, she characterized the framework as a gift because it clarifies "functions that need to be served"; that is, it makes visible the functions that matter for successfully implementing programs and strategies. She encouraged participants to use the framework as a diagnostic tool: look at the "things keeping them awake at night," then use the framework to identify which function is not being served.

She pointed to the different cities' examples to illustrate this point: Whereas the Memphis team discovered they could benefit from more clarity about the components or the "what" of their efforts, the challenges described by the Chicago team pointed to the need to catalyze adaptive leadership—one of the drivers described on Day 2 of the training.

Leveraging the Lessons

Asked how they plan to use what they learned once back home, many attendees said they will share the lessons with people who have more "clout": those with the power to integrate these lessons into their violence prevention work. Annie Ellington, director of Detroit's Youth Violence Prevention Initiative, said she will "take action on citizens' items of concern." She intends to focus on the "fit, feasibility, and continuous quality improvement" of the work back home. Participants also said they plan to work more closely with their implementation teams, using what they learned to assess strengths, laud successes, and identify challenges and methods of quality improvement.

Shelby County (Memphis) Government Administrator Keisha Walker summed it up: "We plan to take a deeper dive into site-based services and determine what's supposed to be the end result."

Learn more about the Implementation Science Training Institute and the materials used.