Accountability and Going to Scale with Quality

Watch and listen as Abraham Wandersman, Professor in the Department of Psychology at the University of South Carolina, outlines how to create a quality large-scale program, emphasizing the need for accountability and the development of evidence-based processes.



Abe Wandersman: Thank you, thank you very much. We had a panel meeting at lunch, and I thought I was going to be the least and last presenter today, and Bob assured me I was the dessert. So, if a person is judged by the company he or she keeps, I will tell you that I am honored to have been invited here today. This is amazing stuff and I'm not tired. This has been great stuff today. I want to start off with some comments that were not in my PowerPoint, but that I thought of as we went today. My comments, I think, will resonate with a lot of things that were said today and also perhaps be a bit different. My three or four initial points: one, there's a lot of mediocrity out there in the world of practice. I'll let that sink in for a moment. Two, the world of practice is not solely responsible for this state of affairs, and the solution will require a partnership of researchers, practitioners, funders, and consumers. Three, it's not a matter of evidence-based programs or homegrown programs, or fidelity or adaptation; it's a matter of outcomes and accountability. And four, building on what Larry Green has said, and I certainly reinforce it: what we need more of are best practices, or evidence-based processes or ways of doing things, and not just evidence-based programs. What's the technique? Oh, okay, ah! I'm from the South, so my Southern drawl makes it hard to be bare bones in fifteen minutes, and as a professor I'm also used to just talking for as long as I want to talk. So I'm going to try to… [Audience] So do you have a plane to catch? [Abe Wandersman] Unfortunately I don't. No, I'm here for the duration. I will do my best to keep to the time. There were initial questions that were raised for this panel. Can you hear me okay? Okay. I don't have time to go through them, but at the end I will tell you how I've addressed them.

Alright, so an overview: where do babies come from? Where do EBPs come from and where do they go? Research to practice models; something called the interactive systems framework for dissemination and implementation, which is a framework that I've helped to co-develop and that tries to bridge research and practice by integrating research to practice models with community-centered and practice-centered models. Next is Getting to Outcomes and results-based accountability; I'll talk about that. Then, toward an evidence-based system for innovation support: there's been a lot of talk about support today, tools, training, TA, etc. We're going to talk about the need for an evidence-based system for support, and I'm also going to talk about an example that tries to bring this together: the Office of Adolescent Health CDC Community Mobilization project. Is my fifteen minutes up? Alright, this is where babies come from, where EBPs come from; I'm sure you're familiar with this. I'll be very quick. This is an IOM approach, an Institute of Medicine approach, coming out of the 1994 report on preventing mental disorders, but it's a very typical approach. The first box is epidemiology: what's the prevalence and incidence of the problem, where is it occurring, with what kinds of populations, is there an uneven distribution, are there health disparities, etc.? Number two is risk and protective factors that help explain why there are differences and disparities. Using those risk and protective factors, there are efficacy trials in box three, generally randomized controlled trials. If they're effective, some of them presumably go on to effectiveness trials, box four. And then there's a version of what Karen has called and I've called "just do it." Box five is "we've done it, we've done the curriculum, it's here, unwrap it, and just do it." The arrow between box four and box five is the gap between science and practice, and I think realistically it's more of a chasm in general.

So I've helped to co-develop, with colleagues at CDC, something called the interactive systems framework for dissemination and implementation, and there's a reference to it in the references. Basically this is a way of demystifying what goes on between box four and box five, a way to potentially enhance how we can integrate research and practice better. The bottom box is distilling the information. There are lots and lots of evidence-based programs, R01s, etc. How can that information be synthesized, put into one place? And often it needs to be translated. Syntheses, including meta-analyses, are excellent, but they're not very actionable for people in the field, and so sometimes there's translation. The top box is the delivery system. We're talking about prevention, but it could be treatment, it could be education. And there are two sub-boxes. This is a capacity-building framework, and initially it talks about general capacity: for example, what's the general capacity of a school or community? Is it a good host organization or a good host community? Is it ready, so to speak? And then there's the innovation-specific capacity, and the innovation is basically the evidence-based programs we've been talking about. The stuff that's coming out of the bottom box, the synthesis and translation system, is the innovation that we're trying to promote. The middle box is very, very under-researched, and it's the support system. Where it exists and when it exists, what does it look like? What quality is it? And support systems also have differential amounts of capacity. Bob apparently has a great support system; they've been developing that infrastructure, and they're also, it sounds like, located regionally. And then what's the capacity of that support system to promote the different interventions? OAH has twenty-eight evidence-based programs in their teen pregnancy prevention portfolio.
How many organizations are capable of providing training and TA on twenty-eight evidence-based programs? So these are things to consider as we think about how to bridge research and practice and get information to the delivery system in a way that they can use it.

The support system is where we've been placing a lot of emphasis lately, and this is our logic model when we do support. Just the way we'd like organizations and people in the field to have logic models, we want to have a logic model and clearly be accountable for it. Basically it says we want to achieve a desired outcome. Now, we realize that the organizations or communities we deal with will have different initial levels of capacity, which means we're not going to work with them all in the same exact way. And then we're going to provide tools, like books. And we have found, for example, with the Getting to Outcomes books that we've done, they're several hundred pages, and one of them has been downloaded for free over seventy-five thousand times. I have no idea what most of those people are doing with it. I'm very concerned about that. So we realize that training would be helpful, and then we realize that when people come to a training, one size doesn't fit all. People come in with very different levels of experience, and they're also going back to very different organizations and settings. So that gets followed by TA: technical assistance or coaching. Tools, training, TA, that's done a lot, and the Feds very nicely have put lots of money into that. What they haven't done as much of is what we call quality improvement, quality assurance. Think about teaching somebody to drive. When we care about something in the real world, like teaching somebody to drive, we give them a tool, like the book from the Department of Motor Vehicles. We might give them a group training. They get individualized coaching by the person sitting next to them. But we don't give them a license depending on how many hours of coaching they got from their mom or dad, and we don't just take their word for it and say you're ready to go.
We have a quality assurance approach, and I'm convinced that we need to have much more quality assurance and quality improvement in health and human services and education.

So what I've talked about so far is mostly from the researcher and funder perspective of getting the stuff out there. If you are a practitioner, if you are a school principal or a school superintendent, and let's say you want to do something about teen pregnancy prevention, and you know there are twenty-eight evidence-based programs, that's good, but there are many other things you've been told are important. You need to know about needs and resources. You need to know about goals. In addition to the best practices and programs, you need to know about fit, and there's been a lot of discussion of fit here today. You need to have the capacity to do that program. These programs are often developed with highly trained, relatively well-paid personnel, and then we expect the real world, with a fraction of the budget, to implement them in the same exact way they were created. There's a capacity issue there. Then the plan of who does what, when, where, and how. Implementation is number seven. Process evaluation: the superintendent's been hearing about evaluation, evaluation. What about the process evaluation? What about the outcome evaluation? Number nine is continuous quality improvement. I keep hearing about continuous quality improvement. What does that mean? How do I do it? And last, but not least, is sustainability. I know I'm only going to have this money initially for a few years; what's going to happen after? So most of the science is aimed at number three, which is necessary, but not sufficient, to achieve sustainable outcomes. So my colleagues and I have developed an approach that we think demystifies accountability, that takes some of the mystery and some of the fear out of accountability by helping people in the field understand what it is and how to achieve it. It's a ten-step approach called Getting to Outcomes. These are the ten steps in an artist's palette.
They involve the ten things we just talked about: needs and resources, goals and objectives, best practices, fit, capacities, plan, implementation and process evaluation, outcome evaluation, continuous quality improvement, and sustainability. And what you see here is that the eye on the prize is in the middle. It's not all about evidence-based programs. It's about results.

All of this is geared toward results. And GTO, by the way: the University of South Carolina, where I'm at, and my colleagues at RAND liked it, our university liked it, so they registered the trademark. That's what that little R is there. We put these as accountability questions, and what we believe is that in order to achieve outcomes, in order to be accountable, an organization or a school or a community or whatever should be able to ask and answer each of these questions. These questions are not new, and there are literatures on how to address them, but this puts it all in one place. So you answer these ten questions: what do you need to be doing, and how do you know you need to be doing it, given your needs and resources? What are your goals? Given your goals, how are you going to get there? That's where evidence-based approaches come in, and also thinking about fit, so you're picking the right evidence-based approach. Making sure you have the capacity to do it well, or it's not worth doing. Having a good quality plan. Implementing it with quality is number seven. Number eight is seeing, after all this work, whether you achieved the outcomes you set out for. Number nine is continuous quality improvement. It's rare, and it was mentioned today, Karen mentioned it too: you rarely on your first try reach that excellence level. And even when you do, things change over time, so you need to continuously quality improve what you do. And last but not least: if this was worth doing, if you got results, how do you keep it going? One of the issues with accountability is how you have a language that goes across multiple levels. And we have applied the ten questions across different levels: the national level, state level, county level, agency level, and we've even brought it down to the provider level. If you're doing treatment, you can use them for a case plan.
So, because of this strong belief in the importance of having a support system, which several people have brought up today, we have recently proposed something that we call an evidence-based system for innovation support. If we think that evidence-based programs are important, why shouldn't we think about evidence-based tools, evidence-based training, evidence-based technical assistance? Not all tools, training, and technical assistance are equal in quality, and yet we have treated those areas with very little, quote, science, with very little rigor. It's basically been, and I know there are exceptions to my generalizations, but I'm going quick, it's basically been, you know, experiential. And we need to have evidence bases on how to do these support procedures efficiently and effectively. And we've applied the accountability questions to each of these approaches: how do you have a tool that's results-based accountable, how do you do training that's results-based accountable, technical assistance, etc.?

OK, putting it together. There is an example that puts not all of this together, but quite a bit of it, and it's the Office of Adolescent Health CDC Teen Pregnancy Prevention project. Several people who've been involved in this project are here, including Allison and Duane and Amy, and there may be other OAH people here I haven't seen. It's a major program, and among the things it uses are the interactive systems framework that I mentioned before, and what I mostly want to mention is quality improvement, quality assurance. This is relatively unique and extraordinarily important. It's going on in nine different communities in nine different states, so in terms of going to scale this is a pretty sizeable operation. And in terms of quality improvement and quality assurance, one of the things that has happened is there is an interactive web-based application of this Getting to Outcomes approach that's being used. There's also a contracting approach that one of the sites is using, and I'll mention that in a moment. So this is a way of operationalizing and bringing to life this interactive systems framework. The bottom is synthesis and translation. In order to synthesize and translate information on how you can do teen pregnancy prevention in an accountable way, this project is using a version of Getting to Outcomes, and they put in one place how you answer those ten questions. It's a how-to workbook. It's several hundred pages. It's wonderful. You can read it in a night. So that's a version of the synthesis and translation. It's user-friendly, and it puts in one place the ten things that these practitioners should be doing in order to reach outcomes. There's the support system that we've talked about that has tools, training, and TA. That's the middle box. And then there's the delivery system.
Now, for those people who are familiar with the ISF from before: in a new article that's coming out soon, we hope, we've treated implementation more carefully in two places. One is implementation quality between the support system and the delivery system: how well is the support system delivering the tools, training, and TA? What had been missing before in this framework, though, was an explicit link to outcomes by the delivery system. Before, it was primarily talking about how we build capacity to do the intervention better. Now the red arrow going horizontally to outcomes is very explicit. And this slide is missing two things above here, and they are examples of quality improvement, quality assurance. This is a priority. One is that there's an interactive web-based application: as the communities are doing their work, going through the ten steps, they are using this guided interactive web-based application to help them do that, and it helps them keep track of what they're doing for each of the ten steps. In addition, the support system will be able to know where each of these projects is in real time. Not waiting six months or a year for an annual report, but in real time they know exactly where they're at, so that they can more efficiently target training and technical assistance. The last part of that is that one of the sites is using contracting, so they're not just giving money to their partners, but they're setting up a contract that's outcome-based and not service-unit-based. Last, so… [Audience] They just showed up, your quality assurance. [Abe] Oh, did they? They fooled me. They animated it without me knowing. So in sum: evidence-based programs come from the model we are all familiar with. These research to practice models are terrific. They are necessary, but not sufficient.
We have suggested a framework that integrates research to practice models with community-centered models and doesn't just expect the community or the organization to be a passive recipient of the science. We talked about a results-based accountability approach, and then the idea that evidence-based systems for support should be there: when we do our support and spend millions and millions of dollars on it, we should do it in an evidence-based way, and I've provided you with an example. And so if you want to remember anything from this, this is Jeopardy, OK. Here's the answer to all those questions that were presented at the beginning that I didn't have time to read. The answer is Getting to Outcomes and results-based accountability and/or evidence-based support. So if you look at all these questions, that's the answer, and that's my summary.