Discussant Remarks, "Strategies for Supporting the Selection, Implementation, and Alignment of EBPs in Community Settings"
Watch and see as Caryn Blitz, Senior Program Analyst at the Administration on Children, Youth, and Families, Department of Health and Human Services, reflects on the second panel of the Forum and advocates for the development of methods to test the combinations and permutations of interventions; the identification of a small number of State-, community-, and organization-level factors that facilitate or impede the implementation of evidence-based programs; and the creation of enabling contexts and prevention infrastructures to do this type of work.
So, good morning, everyone. For those of you who don’t know me, I’m Caryn Blitz. I’m a policy advisor to Commissioner Bryan Samuels at the Administration on Children, Youth, and Families. We’re an administration within the Administration for Children and Families. You can tell from that how easy it is to keep our two offices distinct.
So, actually, we have two bureaus, and in our Children’s Bureau, we administer the Child Welfare System, the Child Maltreatment Services, and the Foster Care System. And in the Family and Youth Services Bureau, we administer programs for runaway and homeless youth, teen pregnancy prevention, and domestic violence.
And a common thread across our populations is their exposure to trauma and toxic stress and its impact. So, like Arthur’s system – and there were so many parallels, I was jumping around in my seat – we’re focused on healing and recovery, but we’re also focused on thriving. And we have a well-being framework. I mention this because I’m going to refer back to it a little bit later on.
So, I just want to thank our panelists for some really excellent and thought-provoking presentations. You know, I was listening to the conversations yesterday and the discussants, and they had big metaphors and big ideas, and I just – I’ve got nothing around that.
But instead, I’m going to talk a little bit about my thoughts. I’m gonna steal a little bit from Karen’s organization of her presentation to talk about training and technical assistance and some of the challenges from the federal perspective, from ACYF in particular. But I think my other federal colleagues and I talk about these issues all the time. And so, I’m in good company there.
So, in Karen’s talk, she spoke about core components and core ingredients and essential elements. And we know that program developers have not done that work. They have not disassembled the black box.
So, she and her colleagues have identified criteria that can be used to better define what the “it” is. I know the folks at the EPISCenter have developed logic models for this set of interventions. We have our grantees do that. They must submit logic models and theories of change. The logic models aren’t always logical, so that’s an issue.
But on the flip side, in thinking about what we need from researchers, what we need from other folks is really to think about using methods to test the combinations and permutations from the outset. Right?
So, Selene talked a little bit about how they’re putting together different components of interventions and testing them from the beginning. I know a lot of us in the federal government, and some of you out there, are very familiar with Linda Collins’ Multiphase Optimization Strategy, a framework rooted in engineering for building efficacious and effective interventions.
I’m not gonna go into the details, but it’s really a nice way of thinking, before we start, about what’s really important in terms of these ingredients and different interventions, so that we’re not doing the expensive unpacking at the other end.
But you know, going back to logic models, as I think about this panel and training and technical assistance, it’s all about capacity and infrastructure. So, even with the development of logic models, it’s not about grantees just spitting out a logic model. It’s really understanding what your activities are and how they connect to your outputs and your outcomes in a very, very meaningful way. So, that’s one piece that I’ve been thinking about.
And in terms of implementation science, Karen discussed the factors that either facilitate or impede implementation. And whether you use NIRN’s model, Laura Damschroder’s model, Greg Aarons’, or Enola Proctor’s, we know implementation is multilevel.
And from the participant up to the larger ecology and the state, there are a ton of variables to measure. And the last time I checked, with the exception of fidelity, people haven’t really come up with the 5, 10, or 15 key variables. Believe me, we tried.
There is a teen pregnancy prevention evaluation workgroup, with my colleagues from OAH, CDC, ASPE, OPRE, and ACYF, and we looked at constructs and at the literature for a year trying to come up with that list. We even brought in Brian and some other people to help us figure out what to measure and how to measure it.
So, you know, the field could really use some help with what those factors are. When we have very limited evaluation funding, particularly for implementation and process evaluation, what can we learn out there that would help winnow down those variables? We do know, though, in terms of training and technical assistance, that coaching, supervision, and monitoring are exceedingly important.
In terms of capacity and infrastructure, I think this notion of enabling context and systems change is really crucially important. Because otherwise, like in Karen’s slide, you’re doing these different pieces, and they aren’t coordinated.
So, with the culture change that Arthur was talking about in his agency, they have an EBP philosophy. It’s embedded in practice in the context of recovery. And we’re doing something that’s very similar. But I think embedded in that is this idea of cultural change and how it’s absolutely crucial.
So, when I think about Communities that Care, and when I think about PROSPER, and I think about some of the other systems, where they have really built a prevention infrastructure, what can we learn from them? The cultural change is really important and crucial in terms of getting people to buy in.
So, you know, I heard a really great example when I attended the trauma grantees’ meeting last week. In our trauma grants, we’re trying to get folks to do some systems change in their states, their communities, and their counties, to adopt a service array that’s trauma-informed and everything that goes with that.
And they have these learning collaboratives. I know with the EPISCenter and the work that’s done there, they have learning collaboratives around the state as well. And what this one did was bring together child protective service workers and mental health folks who hadn’t been talking to each other. And in terms of scaling up these interventions, what was really important were the pieces about relationships and cultural exchange.
We’ve tried to model this at the federal level with our partners at SAMHSA, with CMS around trauma. We’ve been talking to other partners at DOJ and Education as well. But how do you do – and we even had someone embedded at CMS because it wasn’t until we had this cultural exchange that we could really understand, what are the points of leverage? How do you understand each other’s perspective and then learn and grow? So, that, I think, is a crucial piece.
In terms of readiness, you know, we grapple with this with our grantees, and we get lots of proposals that are absolutely beautiful, but when push comes to shove, they really aren’t ready to implement. I also agree that a planning year is great, and sometimes that’s all we can get. But one year is often not enough. And so, how can we do more of what Abe was talking about in terms of planning and different types of staged implementation and readiness?
So, getting people ready – an example that we have is a youth-supported housing FOA that just came out last week. Well, we have a two-year planning grant. We want people to come up with evaluation plans. We want them to come up with implementation plans so that they can develop that level of readiness over the two-year period. We’ve got about 18 grantees that we’re going to fund, and at the end of that period, we’re gonna do a competitive re-compete so that folks who’ve developed that readiness and capacity can go on to do the implementation.
So, I think we need more of that. It’s not always valued in government, where we’ve got to demonstrate these distal outcomes in a short period of time. But I know SAMHSA has done that in the past with their prevention needs assessment contracts, building to collaboration pilots, and then development of service arrays. And I think it’s something that, you know, you need to think about and grapple with.
And then, you know, what we’ve done at ACYF, in terms of systems change, has really been to push the policy around evidence-based interventions and what that means. And it’s really been built around trauma and recovery and well-being.
And I think a big piece of that is buy-in and, again, readiness for the field to understand the motivation and why this work is so important. That has been crucial, as well as providing the policy and funding for folks to then value evidence-based interventions and everything that goes with them, from the needs assessment to the strategic planning, the different capacity-building pieces, the implementation, the evaluation, all of it.
So, you know, we think that that’s a really beneficial way to go, and that the system is really helpful in pulling these different pieces together. Thank you.
[End of Audio]