P3 Round 1 NIA FAQs
January 16, 2015; Updated February 24, 2015
A. Public Input
A-1. How did the Administration seek input from the field to help develop P3?
Throughout development of P3, the Administration relied on extensive consultation with diverse stakeholders. For example, stakeholder input was solicited through the White House Council on Community Solutions; implementation of Executive Order 13563, Improving Regulation and Regulatory Review; the President’s February 28, 2011 Memorandum on Administrative Flexibility, Lower Costs, and Better Results for State, Local, and Tribal Governments; and, most recently, the Request for Information on Strategies for Improving Outcomes for Disconnected Youth (RFI) that was published in the Federal Register on June 4, 2012 (77 FR 32959).1
The consultations yielded valuable insights from practitioners, youth advocates, and others on the front lines of service delivery. These diverse stakeholders pointed to significant challenges to effective service delivery for young people who are struggling to make successful transitions to adulthood. Among other factors, these challenges include: limited evidence and knowledge of what works; poor coordination and alignment across the systems that serve youth; policies that make it difficult to target the neediest youth and overcome gaps in services; fragmented data systems that inhibit the flow of information to improve results; and administrative requirements that impede holistic approaches to serving youth who are disconnected.
Stakeholders agreed that addressing many of these challenges often requires services and expertise from multiple systems, including schools, health and mental health, workforce development, job training, housing, social services, and criminal justice. Disconnected youth may achieve better outcomes when programs are coordinated and resources are well-targeted.
B. Eligibility
B-1. May Territories (American Samoa, Guam, the Commonwealth of the Northern Mariana Islands, and the Virgin Islands) apply for P3 along with States, Tribes, and localities?
Yes, if the applicant, whether directly or through one of its agencies or entities: (1) is wholly or partly administering a Federal program; (2) is classified as a State or local government for purposes of that Federal program; and (3) proposes to include that Federal program in the pilot.
B-2. What role can nonprofits play in applying for or implementing a P3 pilot?
A nonprofit organization may not serve as the pilot applicant or the fiscal agent for pilot implementation, but it still may play a significant role in the design and governance of a performance partnership pilot. For example, a nonprofit may:
- Facilitate the development of the pilot and prepare the application;
- Deliver services and coordinate service delivery under the pilot;
- Oversee broader implementation of the pilot, including providing progress updates and recommended course corrections to activities administered by government partners;
- Represent the State, local, or tribal partnership in meetings, communications, and negotiations with the Federal government on matters when all the partners, including the Federal government, agree that this is an appropriate role for the nonprofit; and
- Secure commitments from philanthropy, other nonprofit organizations, academic and research organizations, employers, or other private sector organizations.
When a performance partnership proposal envisions a role for nonprofits in the pilot, the applicant should clearly explain the proposed responsibilities of the nonprofit organizations, their role(s) in the governance structure, and their prior experience in successful collaboration with the participating State, local, and/or tribal governments. Nonprofits may be signatories to a performance agreement along with — but not instead of — participating State, local, and/or tribal government representatives. In these cases, the State, local, and/or tribal governments will continue to be the parties primarily responsible for meeting the terms of the partnership agreement. More information about the circumstances under which participating nonprofits may be signatories will be made available in the solicitation and during the application review process and may depend in part on the specifics of individual pilot proposals.
B-3. Must an Indian tribe be federally recognized to be eligible to apply for P3?
No. State-recognized Indian tribes, as well as tribes recognized by the Federal government, are eligible to apply.
B-4. Is my organization eligible to apply for P3, and, if not, how can we still be involved?
To be eligible to apply for P3, the lead applicant organization must be a recognized entity of either a State (as defined at 2 CFR 200.90), local government (as defined at 2 CFR 200.64), or tribal government (2 CFR 200.54 provides a definition of “federally recognized Indian tribe,” and, as stated in FAQ B-3, State-recognized tribes are eligible to apply for P3 as well), represented by a chief executive, such as a governor, mayor, or other elected leader, or the head of a State, local, or tribal agency. The lead applicant must be represented by an individual who is the authorized representative of the entity, able to bind the entity to legal commitments.
Lead applicants — including departments, divisions, or other agencies — must be recognized as government entities by the State, locality, or tribe that they represent. Any department or division of a State, local, or tribal government that is so recognized is an eligible lead applicant. For example, a city, county, municipality, town, township, parish, or governor’s office could be a lead applicant. In addition, a State, local, or tribal education, health, human services, housing, or labor agency, division of such agency, or State or local workforce investment board could be a lead applicant. Because recognition as a government entity may differ across jurisdictions, the specific State, local, or tribal statute or charter that authorizes an entity may need to be consulted in order to determine whether the entity is so recognized.
In some cases, the lead applicant for P3 will represent a partnership that includes multiple organizations or entities. As a result, it is important to understand that a nongovernmental organization may still participate as a partner in a pilot even if that organization is not eligible to serve as the lead applicant. Nongovernmental entities or other partner organizations can serve as critical partners in developing and designing a pilot application and in executing a pilot, such as by coordinating across partners, delivering services, or helping to manage data. The P3 partnership may involve any public or private organizations, including nonprofit, for-profit business, industry, and labor organizations, as well as other State, local, or tribal government entities that are not the lead applicant.
B-5. How can I find potential P3 partners in my area?
There are several information resources that may help potential P3 applicants identify potential partners, including:
- Map My Community: An interactive mapping tool designed to locate federally supported youth programs in a community. This tool is available at youth.gov: http://youth.gov/map-my-community.
- America’s Service Locator: A search tool designed to locate Workforce Investment Boards, libraries, community colleges, or local employers. This tool is available at Careeronestop.org: http://www.servicelocator.org/contactspartners.asp.
B-6. NEW 2/24/2015 May an applicant submit more than one application?
Applicants are not prohibited from submitting multiple proposals, nor are partner organizations prohibited from participating in more than one proposal. However, in order to be considered separate applications, each proposal must be distinct. We encourage applicants to indicate as clearly as possible in each application (e.g., in the title and abstract of their proposal) that they are submitting multiple applications, and that these applications are different from one another. Additionally, applicants and partner organizations must be capable of fully implementing each proposal if selected. Applicants should ensure that multiple distinct proposals do not propose to blend the same funds, including funds received by any of the partner organizations included in the proposal.
B-7. NEW 2/24/2015 Must the applicant already receive Federal funding?
No. The applicant must be a government entity but is not required to already be receiving Federal funds to be eligible to apply (see FAQ C-7 for further information). In order to qualify for a pilot, the proposal must include at least two Federal programs that have policy goals related to P3, at least one of which is administered (in whole or in part) by a State, local, or tribal government.
Federal funds proposed for inclusion in a pilot may be contributed by organizations and agencies, other than the applicant, that would also be partners under the pilot. Per page 76 of the application package: “Partnerships are critical to pilots’ ability to provide innovative and effective service-delivery and systems-change strategies that meet the education, employment, and other needs of disconnected youth. We encourage applicants to build on strong, existing partnerships that have experience in working together to improve outcomes for disconnected youth. Partnerships will vary depending on the nature and focus of individual projects, but may cut across: State, local, and tribal levels of government; education, employment, and other agencies or programs operating within the same level of government; and governmental, non-profit, and other private-sector organizations.”
Information on programs that may be eligible for inclusion in a P3 pilot can be found in Appendix B of the notice inviting applications (see http://www.federalregister.gov/articles/2014/11/24/2014-27775/applications-for-new-awards-performance-partnership-pilots#h-43).
C. Program Flexibility
C-1. What kinds of waivers will Federal agencies consider? Are there specific waivers that will not be considered?
The Consolidated Appropriations Act, 2014 (the Act; P.L. 113-76, Div. H, §526, Jan. 17, 2014; 128 Stat. 413) provides broad waiver authority for P3 projects. The Act allows the heads of affected Federal agencies to waive statutory, regulatory, and other requirements that they are otherwise authorized to waive, as well as those that they might not otherwise be authorized to waive. The affected agencies are the Departments of Education (ED), Labor (DOL), and Health and Human Services (HHS), along with the Corporation for National and Community Service (CNCS) and the Institute of Museum and Library Services (IMLS) (collectively, the Agencies). With respect to requirements that the Agencies might not otherwise be authorized to waive, the Act includes important safeguards that applicants and the Agencies must meet (see sections 526(d) and (f)). Specifically, those waiver requests must be: consistent with the statutory purposes of the program; necessary to achieve the pilot’s outcomes and no broader in scope than necessary; and able to result in efficiencies or an increased ability of individuals to obtain access to services provided by those Federal program funds. Requirements related to nondiscrimination, wage and labor standards, and allocations of funds to State and sub-State levels cannot be waived. In addition, the heads of the Agencies must determine that their agency’s participation and the use of proposed program funds: (1) will not result in denying or restricting individual eligibility for services funded by those programs, and (2) will not adversely affect vulnerable populations that are the recipients of those services (see FAQs C-2 and C-3 for more information).
The Agencies will consider waiver requests on a case-by-case basis in the context of the applicant’s full pilot proposal and these statutory protections. During the technical review, applicants will be scored, in part, based on the extent to which they: (1) demonstrate that the requirements for which they are seeking waivers are hindering successful achievement of outcomes for the target population of disconnected youth who are identified for the proposed pilot; and (2) provide a justification of how the waivers, individually or together, will reduce barriers, increase efficiency, support implementation of the pilot, and produce significantly better outcomes for the target population. Applicants should focus waiver requests on changes to major program requirements that would otherwise inhibit implementation. Examples of waivers include changes to eligibility requirements, allowable uses of funds, or performance reporting.
Following the technical review, the top-scoring applications will undergo a flexibility review of the applicant’s proposed waivers by interagency teams. Representatives of the Agency from which program flexibility is sought will evaluate whether the waivers requested by top-scoring applicants, in addition to the proposed blending of program funds, meet the statutory requirements for Performance Partnership Pilots, and are otherwise appropriate. For example, if an applicant is seeking flexibility under programs administered by HHS and DOL, its requests for flexibility will be reviewed by HHS and DOL officials; and these DOL and HHS officials will determine the appropriateness of the flexibility request. During the flexibility review process, applicants may also be asked to participate in an interview in order to clarify requests for waivers and other flexibility, and potentially other aspects of their proposals.
C-2. Section 526(d) of the Act states that funds must not be blended in a pilot if doing so would result in “denying or restricting the eligibility of any individual for any of the services” of a program whose funding is proposed to be blended in the pilot. What should an applicant consider in ensuring that it complies with this requirement?
Under standard practice in many Federal programs and as permissible under the authorizing statute, the entire eligible population does not actually receive services each year or grant cycle. Eligibility requirements set an outer boundary for the target population, but program design, funding, or other limitations may result in services being delivered to only a subset of the eligible population. If allowed under the relevant statute or regulations, grantees may focus activities on a limited subset of the eligible population. This means that, even in programs in which funds are allocated to grantees based on the identification of specific types of individuals, there is no guarantee that each identified individual will receive services.
Similarly, under P3, applicants may propose to focus their activities either on a limited subset of the eligible population of a particular program or on a broader eligible population without “denying or restricting the eligibility” of individuals to receive services. An applicant may propose to blend a portion of funding from a program to serve a target population that differs from the program’s exact statutory eligibility requirements. An applicant may do so either by proposing to waive eligibility requirements to broaden the target population or by proposing to work with a targeted subpopulation of a particular program.
One important factor for applicants to consider is how much a proposal will affect the proportion of eligible individuals who actually receive services under a particular program (including existing program-funded services as well as any new or comparable services provided under the pilot). For example, an applicant could violate this provision if it proposes to blend all or most of the funds of a particular program, but would serve only a very limited subset of that program’s eligible population, as defined in the program’s authorizing statute, through its P3 activities. Such a proposal could deny or restrict the eligibility of individuals for service of a program because its implementation could directly result in the vast majority of a program’s eligible population not receiving services.
C-3. Section 526(d) of the Act states that funds must not be blended in a pilot if doing so would result in “adversely affect[ing] vulnerable populations that are the recipients of such services” of a program whose funding is proposed to be blended in the pilot. What should applicants consider in ensuring that they comply with this requirement?
The Agencies have determined that there are at least two situations in which a proposed blending of funds would result in an adverse effect on the recipients of services under a particular program. The first situation involves a program that creates a universal entitlement that enables all eligible individuals to receive services or benefits. Funding from such programs may not be blended under a pilot if the pilot would serve only a subset of the eligible participants, thereby adversely affecting the remaining participants. The second situation involves programs that provide individuals with direct benefits (such as vouchers, credits, scholarships, or other payments). Funding from these programs may not be blended under a pilot under any circumstances because such a pilot would adversely affect the recipients of the direct benefit.
For all programs for which a pilot applicant proposes to blend funds or seek waivers, the applicant must describe how it will ensure in its pilot proposal that the recipients of services under the original program will receive a level of services or maintain a level of outcomes comparable to what would occur in the absence of the P3 activities. In considering whether blending funds would adversely affect the recipients of services funded by the original program, the applicant should also consider whether there are other non-mandatory Federal funds or non-Federal funds that will be used to continue to serve the recipients.
C-4. What factors must an applicant consider in justifying its waiver requests?
An applicant must provide strong justification that the new approach that would result from any waivers or other flexibility is necessary to achieve the outcomes of the pilot, is no broader in scope than is necessary to achieve those outcomes, and will result in either (1) realizing efficiencies by simplifying reporting burdens or reducing administrative barriers with respect to such discretionary funds; or (2) increasing the ability of individuals to obtain access to services that are provided by such discretionary funds. (See Section 526(f)(2)(B) of the Act.) Applicants must provide this description in response to Selection Criterion (B) in the P3 notice inviting applications (NIA).
C-5. If my State already has a title IV-E child welfare waiver from the Administration for Children and Families in the Department of Health and Human Services, can I include my title IV-E funds in a P3 pilot proposal?
P3 pilots may offer valuable flexibility for agencies and communities seeking to improve outcomes for disadvantaged youth, including agencies that are already participating in a title IV-E demonstration project authorized by section 1130 of the Social Security Act. However, the flexibility authorized for P3 under the Act cannot be applied to allow blending of costs used to draw title IV-E foster care matching funds, because these funds are considered mandatory funds, regardless of whether a State has a title IV-E waiver. This means that these funds cannot be blended with other Federal funds or be subject to the additional waivers available under P3.
Nonetheless, in order to improve youth outcomes, applicants could still propose to coordinate a P3 pilot with a title IV-E demonstration project, including by braiding together funding streams so that IV-E funds retain their original identity and requirements. In using title IV-E funds, agencies must ensure that they continue to follow all applicable requirements of the title IV-E program and their waiver terms and conditions.
C-6. Can I propose to use some of the cross-program blended or P3 start-up funds to construct or renovate a facility to serve disconnected youth?
Some of the Federal funds that could potentially be blended in a P3 project have restrictions related to the use of funds for construction. For example, most funds awarded by the Department of Education cannot be used for construction. See 34 CFR 75.533. Consistent with P3 Application Requirement (b)(1) and Selection Criterion (B), an applicant that would like to use for construction purposes any P3 funds that normally cannot be used for construction must include a waiver request in its application that identifies the programs and provisions for which it is requesting a waiver and explains why the relevant agencies should waive the provisions prohibiting the use of funds for construction.
We note that construction does not include minor remodeling, which means minor alterations in a previously completed building. Minor remodeling also includes the extension of utility lines, such as water and electricity, from points beyond the confines of the space in which the minor remodeling is undertaken but within the confines of the previously completed building.
C-7. NEW 2/24/2015 May an applicant propose to blend and braid funds administered or received by a separate organization?
Yes. Applicants may propose to blend and braid funds that they or any of the partner organizations under the proposed pilot receive and administer. The applicant itself need not be a direct grantee or sub-grantee for these funds. However, in order for funds to be blended or braided under a pilot, the organization that receives and administers these funds must be represented in the proposed governance structure and ultimately must be a party to the pilot’s performance agreement.
C-8. NEW 2/24/2015 What may an applicant propose with regard to blending FY 2015 funds?
The Consolidated and Further Continuing Appropriations Act, 2015, provides authority for certain discretionary funds appropriated in the Labor-HHS-Education bill to be included in any P3 pilots awarded under this competition.
Applicants may include in their application the FY 2015 funds they propose to incorporate into their pilots. The statutory requirements for programs eligible to be included in a pilot remain the same for FY 2015 programs as for FY 2014: they must target disconnected youth, or be designed to prevent youth from disconnecting from school or work, and provide education, training, employment, and other related social services. Similarly, for a program to be blended as part of a pilot, under the statute, the Federal agency must determine that doing so will not: (1) deny or restrict an individual’s eligibility to receive services; or (2) adversely affect vulnerable populations that receive services from that program. As with FY 2014 funds, the statute does not permit pilots to blend mandatory funds, meaning funds from entitlement programs such as Medicaid, Social Security, most Foster Care IV-E programs, and Temporary Assistance for Needy Families.
When proposing to incorporate FY 2015 funds, the amounts and sources of funds proposed do not have to be the same as FY 2014. Applicants may propose to expand the number of Federal programs supporting pilot activities in FY 2015 or use other Federal funding that may be awarded in future years within the proposed duration of the pilot.
As indicated in Application Requirement (g), applicants must provide a detailed budget and a budget narrative describing the FY 2014 and FY 2015 Federal program funds that the applicant proposes to blend. The budget must cover all years during which FY 2014 and FY 2015 Federal funds would be used to support the pilot and must include at least the first full year of the pilot.
For FY 2015 competitive grants, applicants may indicate if they plan to apply for, have applied for, or have received an award that they would like to blend with other Federal funds under their pilot. Competitive grant applications and awards will still be managed through the existing processes of the agency that administers the respective competitive grant programs. Including a competitive grant in a P3 application will in no way alter an applicant’s chances of receiving that competitive grant.
In some cases, FY 2015 funding may include different requirements than FY 2014 funds appropriated by Congress for similar purposes. For example, the Workforce Innovation and Opportunity Act will become effective on July 1, 2015, for certain funding streams under the Departments of Labor and Education. As a result, Federal agencies and pilots may negotiate performance agreements in stages, meaning that agreements may first be finalized for FY 2014 funds and amended, updated, or otherwise modified to account for the appropriate use, accountability, and oversight for FY 2015 funds.
Additionally, the Consolidated and Further Continuing Appropriations Act, 2015, authorized FY 2015 discretionary funds appropriated to the Department of Justice Office of Justice Programs (OJP) to be used to participate in P3 pilots, and the Agencies plan to issue additional guidance on the inclusion of OJP funds.
D. Competitive Grants
D-1. Can FY 2014 competitive grants be included in a P3 Pilot?
The Agencies will consider the inclusion of FY 2014 competitive grant funds that have already been awarded on a case-by-case basis. The Agencies will determine whether the scope, objectives, and target population(s) of the grant appropriately and sufficiently align with the scope, objectives, and target population(s) of the proposed pilot. Situations in which it may be appropriate to include an already-awarded competitive grant or grants in a pilot include cases in which there are similarities between the competitive grant and the proposed pilot, such as the project plan, performance goals and metrics, proposed participants, leveraging of diverse funding, and partnership approaches, and an increased potential to amplify an existing program model and improve outcomes for disconnected youth.
Situations in which it may not be appropriate to include already-awarded competitive grant funds in a P3 pilot include cases in which, for example, the competitive grant is undergoing a rigorous evaluation that could be negatively affected or interrupted by the inclusion in the P3 pilot. Additional situations in which it might not be appropriate include if the competitive grant had been awarded based on a proposal to serve a specific population that would not align with the pilot’s proposed target population, or if the proposed pilot approach could, in any way, adversely affect that targeted population or the overall goals of the competitive grant.
The Agencies will consider the strength of the applicant’s justification for including the already-awarded competitive grant funds in its proposed P3 pilot. This requires the applicant to clearly demonstrate that the scope, objectives, and target population(s) of the competitive grant appropriately and sufficiently align with the proposed pilot’s scope, objectives, and target population(s). The applicant must also justify any potential changes in terms and conditions of the existing competitive grant that may be required for the purposes of the pilot (such as allowable costs and activities).
E. Needs Assessment
E-1. What is a comprehensive needs assessment, and how will it inform my P3 pilot?
In general, a comprehensive needs assessment is a systematic process to develop an informed understanding of the gaps or needs that exist, as well as the factors or root causes that contribute to the existence of those needs. A needs assessment first defines the scope of the assessment and may outline key questions to be answered by the assessment. Next, it gathers data to analyze and document findings, which may include strengths, gaps, opportunities, and challenges. Using this information, and other applicable evidence-based research, the assessment then establishes priorities and strategies for addressing the identified issues.
The Agencies acknowledge the diversity in definitions and processes for conducting a needs assessment. One example of a specific type and process for conducting a needs assessment is the community needs assessments described by HHS, Administration for Children & Families (ACF), at http://www.acf.hhs.gov/programs/ocs/resource/conducting-a-community-assessment-1#Overview.
While a specific model or process is not required, the P3 review process includes consideration of the extent to which the applicant used a comprehensive needs assessment that was conducted or updated (either by the applicant or by other partners or organizations) within the past three years. The needs assessment should use representative data on disconnected youth in the jurisdiction(s) to be served by the pilot that are disaggregated according to relevant demographic factors (such as race, ethnicity, gender, age, and disability status) to: (a) show disparities in outcomes among key sub-populations; and (b) identify an appropriate target population of disconnected youth with a high level of need. For example, a comprehensive needs assessment that an applicant conducts or uses for purposes of this application may analyze workforce, education, and well-being data for disconnected youth in defined areas of service and identify a target population with significant outcome disparities in comparison to other peer groups. Applicants, especially those that are conducting a needs assessment for purposes of the P3 application, are encouraged to align priorities and next steps identified through the needs assessment to the pilot logic model in order to inform the overall project design (See FAQ F-1, Logic Models).
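The disaggregated disparity analysis described above can be sketched in a few lines of code. The following is an illustrative example only, not part of the P3 requirements; all group names, population counts, and the `largest_disparity` helper are hypothetical, and a real needs assessment would draw on actual jurisdiction data disaggregated by the relevant demographic factors.

```python
# Hypothetical sketch of a needs-assessment disparity calculation.
# All group names and figures below are illustrative, not real data.
from collections import namedtuple

Group = namedtuple("Group", ["name", "youth_population", "disconnected"])

def disconnection_rates(groups):
    """Compute each subgroup's disconnection rate (disconnected / population)."""
    return {g.name: g.disconnected / g.youth_population for g in groups}

def largest_disparity(groups):
    """Identify the subgroup whose rate most exceeds the overall rate."""
    rates = disconnection_rates(groups)
    total_pop = sum(g.youth_population for g in groups)
    total_disc = sum(g.disconnected for g in groups)
    overall = total_disc / total_pop
    name = max(rates, key=lambda n: rates[n] - overall)
    return name, rates[name], overall

# Illustrative disaggregated data for three age bands.
groups = [
    Group("ages 16-18", 5000, 600),
    Group("ages 19-21", 4000, 900),
    Group("ages 22-24", 3000, 450),
]

name, rate, overall = largest_disparity(groups)
print(f"Highest-need subgroup: {name} ({rate:.1%} vs. overall {overall:.1%})")
```

In practice the same comparison would be repeated across each demographic factor (race, ethnicity, gender, age, disability status) to surface the outcome disparities that justify the chosen target population.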
F. Logic Models
F-1. What is a logic model, and what information should it include?
A logic model (also referred to as theory of action) is a well-specified conceptual framework that identifies key components of the proposed process, product, strategy, or practice (i.e., the "active ingredients" that are hypothesized to be critical to achieving the relevant outcomes) and describes the relationships among the key components and outcomes, theoretically and operationally (34 CFR 77.1). In other words, a logic model clarifies what the applicant is seeking to change or produce through the pilot — the expected results, and the intermediate and long-term outcomes — and identifies how the project’s activities will contribute to achieving that result.
As described in the NIA, P3 applicants are required to submit a graphic (no longer than one page) that depicts the pilot’s logic model grounded in a specific theory of change for how the pilot’s strategy will produce intended outcomes. The first step in developing a theory of change, after identifying the issue(s) to be addressed, is to identify the theoretical solution(s) based on available data. The next step is to describe the desired outcomes and impacts in addressing the issue and develop a plan for attaining those goals. Using this information, a logic model communicates how the program would operate when implemented. A variety of frameworks are used to describe the parts of a logic model, and P3 does not require a specific model. However, applicants are encouraged to include the following elements in their logic model.
- Inputs include the resources that are needed to carry out the program plans. Examples of inputs are personnel, facilities, funding streams, supplies, and equipment.
- Activities are the services and interventions that are proposed as part of the program design. It is helpful to consult evidence from the field regarding the effectiveness of the activities in achieving the desired outcomes and goals. It should be clear from your logic model how the key components are related to, or expected to produce, the outputs that ultimately lead to the intervention’s intermediate and longer-term outcomes.
- Outputs are the immediate results or products of the project activities, which are often (but not always) described in numerical terms. For example, outputs might include the number of youth who complete a certification program.
- Interim indicators are intermediate goals that the intervention is expected to help achieve on the way to its long-term outcomes. It may be helpful to include indicators at different levels (such as participant-level, organizational-level, or system-level outcomes) and over different time horizons (such as short-term and long-term).
- Long-term outcomes are the expected changes in behavior, attitudes, aptitude/skill, knowledge, etc. for the target population. In particular, because these pilots are intended to improve outcomes for disconnected youth, long-term outcomes are related to reconnection of youth or successful prevention of disconnection, including by ensuring youth are enrolled in school or gainfully employed.
Logic models may also show assumptions made by the applicant, as well as any external factors that may bear on the intermediate and long-term outcomes. These elements provide context for the proposed interventions.
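P3 does not prescribe a format for these elements, but their relationships can be sketched as a simple data structure. The following Python sketch is purely illustrative; every entry (the inputs, activities, and so on) is a hypothetical example, not a required or recommended component.

```python
# Illustrative only: the logic-model elements described above, organized as a
# mapping from component to hypothetical examples. Applicants would depict
# this as a one-page graphic, not as code.
logic_model = {
    "inputs": ["grant funding", "case managers", "partner data systems"],
    "activities": ["mentoring", "occupational skills training"],
    "outputs": ["number of youth completing a certification program"],
    "interim_indicators": ["program attendance", "credits earned"],
    "long_term_outcomes": ["high school diploma attainment",
                           "unsubsidized employment"],
    "assumptions": ["youth can be reached through partner referrals"],
    "external_factors": ["local labor-market conditions"],
}

def describe(model):
    """Print each component and its entries, in logic-model order."""
    for component, entries in model.items():
        print(f"{component}: {', '.join(entries)}")

describe(logic_model)
```

The ordering mirrors the flow of a typical logic model: inputs support activities, which produce outputs, which lead through interim indicators to long-term outcomes, with assumptions and external factors providing context.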
For additional information on how to develop a logic model, see “Evaluation Toolkit for Prospective WIF Grantees,” starting on page 10: http://www.doleta.gov/workforce_innovation/pdf/grantees/FINAL_WIF_EvaluationToolkit_5-12-2014.pdf (PDF, 74 pages). The Regional Educational Laboratory Pacific, one of the 10 Regional Educational Laboratories established and funded by ED’s Institute of Education Sciences, also has produced an Education Logic Model Application that can be used to build logic models. The Education Logic Model Application is available at: http://relpacific.mcrel.org/ELM.html.
G. Outcomes and Interim Indicators
G-1. Do applicants have to propose the education- and employment-related outcomes and interim indicators that are listed as examples in the NIA?
No. While Application Requirement (f)(2) requires applicants to propose at least one outcome measure in the domain of education and at least one in the domain of employment, the education- and employment-related outcomes and interim indicators listed in the NIA are examples for applicants to consider while developing their proposals. Applicants are not required to use these specific outcomes and interim indicators. The specific outcomes and interim indicators that applicants select should be grounded in their logic model/theory of action, and informed by program results or research, as appropriate. Following are additional examples of measures that may be appropriate for use in the pilot. As with the examples in the NIA, use of these measures is not required.
Employment and Training
Examples of measures for adult employment and training programs are entered employment rate (such as the percentage of participants who are employed in the quarter after leaving a program), employment retention (such as the percentage of participants who are employed six months later), and earnings (such as quarterly earnings in the two quarters after entering employment).
For example, the common performance measures for the Workforce Investment Act Youth Program are placement in employment or education (such as the percentage of youth who entered employment, including the military, or enrolled in postsecondary education and/or advanced training/occupational skills training in the quarter after leaving the program), attainment of a degree or certificate (such as the percentage of youth participants who attain a diploma, GED, or certificate by the end of the third quarter after leaving the program), and literacy and numeracy gains (such as the percentage of youth participants who advance one or more educational functioning levels).2
Note: President Obama signed the Workforce Innovation and Opportunity Act (WIOA) (Public Law 113-128) into law on July 22, 2014. WIOA replaces the Workforce Investment Act of 1998 and changes, among other things, the performance measures for employment and training programs. Under WIOA, the primary indicators of performance for youth will include:
- the percentage of program participants who are in education or training activities, or in unsubsidized employment, during the second quarter after leaving the program;
- the percentage of program participants who are in education or training activities, or in unsubsidized employment, during the fourth quarter after leaving the program;
- the median earnings of program participants who are in unsubsidized employment during the second quarter after leaving the program;
- the percentage of program participants who obtain a recognized postsecondary credential, or a secondary school diploma or its recognized equivalent, during participation in, or within one year after leaving, the program;
- the percentage of program participants who, during a program year, are in an education or training program that leads to a recognized postsecondary credential or employment and who are achieving measurable skill gains toward such a credential or employment; and
- the indicators of effectiveness in serving employers.
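Measures like those above reduce to straightforward arithmetic on participant exit records. The sketch below shows, with invented data, how one such indicator (the share of exiters in education, training, or unsubsidized employment in the second quarter after exit) might be computed; the record layout is an assumption for illustration, not a WIOA data specification.

```python
# Hypothetical sketch of computing one WIOA-style youth indicator from
# program exit records. Field names and data are invented for illustration.
from dataclasses import dataclass

@dataclass
class ExitRecord:
    participant_id: str
    engaged_q2_after_exit: bool  # in education/training or unsubsidized
                                 # employment in the 2nd quarter after exit

def q2_engagement_rate(records):
    """Percentage of exiters engaged in the second quarter after exit."""
    if not records:
        return 0.0
    engaged = sum(1 for r in records if r.engaged_q2_after_exit)
    return 100.0 * engaged / len(records)

records = [
    ExitRecord("y01", True),
    ExitRecord("y02", False),
    ExitRecord("y03", True),
    ExitRecord("y04", True),
]
print(f"{q2_engagement_rate(records):.1f}%")  # 75.0%
```

The other percentage-based indicators follow the same pattern with a different qualifying condition; the earnings indicator would instead take a median over quarterly wage records.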
The recent White House report for the President’s job-driven training initiatives, “Ready to Work,” also discusses these key measures as part of a job-driven checklist for employment and training programs.3
Education
Examples of educational outcomes for all youth, including those who are disconnected from school, include: (1) attainment of a high school diploma; (2) attainment of a credential that is equivalent to a high school diploma; and (3) enrollment in, and completion of, at least some kind of postsecondary educational program leading to a degree or certificate. Each of these has a variety of possible interim indicators (such as school/program attendance or credits earned) as well as non-cognitive measures (such as persistence or time management) that are potential indicators of progress for achieving the key outcomes.
Housing Stability and Other Well-Being Outcomes
Outcomes related to well-being of disconnected youth include stable housing as well as those related to personal, cognitive, and developmental status, such as self-regulation, coping skills, conflict-resolution skills, personal efficacy, ability to plan, and pro-social behavior.4
Housing Stability and Homelessness Reduction
The United States Interagency Council on Homelessness (USICH) published a report, "Framework to End Youth Homelessness," in February 2013 that identified four core outcomes for youth who are experiencing homelessness: stable housing, permanent connections, education or employment, and social-emotional well-being. The report did not provide interim indicators for these outcomes or the pathways to improving services, but it did include a logic model.5
Personal and Developmental Well-Being
Research indicates that major predictors of future youth disconnection include poor grades, poor health (including mental health and severe disability), problem peers, and early parenthood.6 Tracking interim outcomes related to these risk factors, like pregnancy prevention and improved mental health, could improve an applicant’s ability to serve this population. For information on the role of risk assessment in service-planning and achieving short-term and long-term outcomes, see http://www.acf.hhs.gov/programs/opre/resource/framework-for-advancing-the-well-being-and-self-sufficiency-of-at-risk-youth.
H. Evidence-Based Interventions
H-1. Do applicants have to use studies from Federal registries of evidence-based interventions, such as the What Works Clearinghouse (WWC), to inform their pilot design?
Applicants are not required to use studies from Federal evaluation clearinghouses, but they are encouraged to use (and cite) relevant research, such as studies that appear in those clearinghouses, to inform their pilot design. Although each Federal clearinghouse on evidence-based interventions uses somewhat different procedures and criteria, all of them summarize the research studies they include and assess the strength and rigor of the findings according to specific guidelines.
Clearinghouses with evidence that is related to potential P3 pilots include:
- ED’s What Works Clearinghouse (WWC): Evidence on programs, products, practices, and policies in education (http://ies.ed.gov/ncee/wwc/).
- DOL’s Clearinghouse for Labor Evaluation and Research (CLEAR): Evidence on labor-related issues (http://clear.dol.gov/).
- HHS’ Teen Pregnancy Prevention Evidence Review: Evidence on programs with impacts on teen pregnancies or births, sexually transmitted infections (STIs), or sexual activity (http://www.hhs.gov/ash/oah/oah-initiatives/teen_pregnancy/db/tpp-searchable.html).
- SAMHSA's National Registry of Evidence-Based Programs and Practices (NREPP): Evidence on mental health and substance abuse interventions (http://nrepp.samhsa.gov/).
- HHS’ Home Visiting Evidence of Effectiveness: Evidence on home visiting program models that target families with pregnant women and children from birth to age five (http://homvee.acf.hhs.gov/).
- DOJ’s CrimeSolutions: Evidence on criminal justice, juvenile justice, and crime victim services, programs, and practices (http://www.crimesolutions.gov).
Other useful Federal Clearinghouses that include literature summaries, program resources, and promising practices, although they do not rate the quality of the analysis or findings, include:
- Self-Sufficiency Research Clearinghouse: Research on low-income and TANF families (http://www.opressrc.org).
- Workforce Strategies Solutions: Research on education and training; employment, retention, and advancement; and management and operations (http://strategies.workforce3one.org).
I-1. What are the key components of a rigorous evaluation?
Although there are many different types of evaluation, only an impact evaluation can establish whether a program or intervention caused an observed outcome. A randomized controlled trial (RCT), when appropriate, can provide the most rigorous causal evidence.
An RCT research design measures the “impacts” of the intervention or program on individuals or systems. An impact is an estimate of the direction (positive or negative) and magnitude (by how much) of the change in outcomes that can be directly attributed to the intervention.
The key to this design is random assignment. Eligible individuals are randomly assigned, such as by lottery, either to a treatment group that receives the services provided by the intervention or to a control group that does not. This approach ensures that, on average, the two groups are equivalent in all respects except that one participates in the intervention (program services) and the other does not. Therefore, any differences in outcomes between the groups (such as different rates of employment) can be attributed to the intervention.
RCTs are considered the “gold standard” (i.e., the most reliable form) of evaluation because they allow programs to claim, with a certain degree of confidence, that participants have improved their outcomes because of that program. Although RCT studies can require more effort to design and implement, if random assignment is conducted correctly, the results provide clear, rigorous evidence of program effectiveness. Additionally, the results from an RCT evaluation will provide important contributions to the evidence base for the intervention. Results from this evaluation approach are also valuable to stakeholders and scholars in determining whether the expected impacts were realized, and in developing approaches that build on this evidence to refine and expand programs.
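The core computation behind an RCT impact estimate is simple: randomly split the eligible population, then compare mean outcomes between the two groups. The sketch below illustrates this with entirely hypothetical data; the assumed 15-point "true effect" and the employment outcome are invented for demonstration, and a real evaluation would also report statistical significance.

```python
# Minimal sketch of the RCT logic described above, using invented data:
# randomly assign eligible individuals, then estimate the impact as the
# difference in mean outcomes between treatment and control.
import random

random.seed(0)

eligible = list(range(100))          # hypothetical eligible applicants
random.shuffle(eligible)             # random assignment, as by lottery
treatment, control = eligible[:50], eligible[50:]

def outcome(treated):
    """Hypothetical binary outcome: employed (1) or not (0) after the
    program period; the 15-point treatment effect is assumed."""
    base = 0.4 + (0.15 if treated else 0.0)
    return 1 if random.random() < base else 0

treat_outcomes = [outcome(True) for _ in treatment]
control_outcomes = [outcome(False) for _ in control]

impact = (sum(treat_outcomes) / len(treat_outcomes)
          - sum(control_outcomes) / len(control_outcomes))
print(f"Estimated impact: {impact:+.2f} (employment-rate difference)")
```

Because assignment is random, the estimate converges on the true effect as the sample grows; with only 50 per group, any single run can deviate noticeably from it, which is why real RCTs attach confidence intervals to their estimates.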
Quasi-experimental design (QED) studies are similar in most respects to RCTs except, importantly, that they determine the members of the treatment and comparison groups by methods other than random assignment. Typical methods include "matching" a treatment group of service participants to a group of similar individuals who did not participate, using characteristics of the individuals measured before they chose to participate. When RCTs cannot be used, QED studies can sometimes provide good estimates of impact, though they cannot fully distinguish the effects of the intervention (program services) from pre-existing differences between the two groups that: (1) cannot be easily measured or used in matching (e.g., persistence, motivation, grit), and (2) could be related to important outcomes. For this reason, impacts estimated from QED studies must be treated with some caution.
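The "matching" idea can be illustrated with a toy example: pair each participant with the non-participant whose pre-program characteristic is closest, then compare outcomes across the pairs. The data and the single matching variable below are invented; real QED studies match on many characteristics at once (often via propensity scores) and, as noted above, still cannot account for unmeasured traits such as motivation.

```python
# Toy sketch of nearest-neighbor matching on one observed characteristic.
# All scores and outcomes are hypothetical.
participants = [      # (pre-program test score, outcome: 1 = employed)
    (52, 1), (61, 1), (70, 0),
]
nonparticipants = [
    (50, 0), (60, 1), (72, 0), (80, 1),
]

def nearest_match(score, pool):
    """Return the comparison record with the closest pre-program score."""
    return min(pool, key=lambda rec: abs(rec[0] - score))

diffs = []
for score, part_outcome in participants:
    _, matched_outcome = nearest_match(score, nonparticipants)
    diffs.append(part_outcome - matched_outcome)

estimated_impact = sum(diffs) / len(diffs)
print(f"Matched-comparison impact estimate: {estimated_impact:+.2f}")
```

Here the estimate is the average participant-minus-match outcome difference; its credibility rests entirely on the assumption that matched pairs would have had similar outcomes absent the program.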
Implementation studies are important components of rigorous evaluations, no matter the design used. An implementation study illuminates and explains “what is happening and why” in the design, implementation, administration, operation, services, and outcomes of social programs. This type of study can provide context and information that makes evaluation results more useful for improving program implementation. In addition, findings from implementation research can be used to inform future program development or replication.
For further details and guidance on the key components of evaluation, please review "The Evaluation Toolkit for Prospective Workforce Innovation Fund Grantees" available at http://www.doleta.gov/workforce_innovation/pdf/grantees/FINAL_WIF_EvaluationToolkit_5-12-2014.pdf (PDF, 74 pages).
J-1. How should applicants budget for the annual "community-of-practice" meetings that are required of pilot sites?
Applicants’ budgets must include funds for their participation in two meetings during the project period. The meetings will take place in Washington, DC, and are expected to last for three days. Applicants should plan to bring at least two project staff and may send up to four. An applicant that proposes to conduct an independent evaluation of its pilot may bring an additional fifth person from the evaluation team. All participant travel, accommodations, and meals should be budgeted using start-up funds.
J-2. NEW 2/24/2015 By when must pilots expend start-up funds?
Pilots may propose to spend start-up funds at any point over the course of the project period (which may not exceed September 30, 2018). Start-up funds do not need to be expended in the first year of the program. As stated in Application Requirement (g), applicants should request a specific start-up grant amount that is between $400,000 and $700,000 and describe how the pilot will use these start-up funds to support effective implementation, such as planning, governance, technical assistance, site-specific evaluation, capacity-building, and coordination activities. Examples of other uses include supporting the measurement of pilot performance and results, such as modifications to information systems.
K. Technical Assistance
K-1. What kind of technical assistance (TA) resources will the Federal government make available to pilot sites?
Based on input from the field about the importance of TA, the Agencies are helping sites meet their needs in a number of ways:
- Start-up grant funding. Applicants may propose to use the start-up grant funding for a variety of purposes, including TA that is specific to the needs of the proposed pilot.
- Other Federal funds blended under P3. Pilot sites may also secure TA using a portion of other Federal funds, beyond the start-up grants, that are blended for P3 purposes.
- Community of practice. All P3 grantees must commit to participating in a community of practice that includes an annual meeting of pilot sites (paid with grant funding that must be reflected in the pilot budget that is submitted as part of the application) and peer-to-peer learning activities. A community of practice is a group of grantees that agrees to interact regularly to solve a persistent problem or improve practice in an area that is important to them and the success of their projects. Establishment of communities of practice under P3 will enable grantees to meet, discuss, and collaborate with each other regarding grantee projects.
- Evaluation. Federal agencies are working to identify resources to support P3 pilots (and their independent evaluators) in conducting rigorous impact evaluations. Such support could include workshops on common approaches to conducting such studies or on common problems encountered, or site-specific assistance on particular issues that arise.
- youth.gov. The Agencies will support the community of practice in part by using http://www.youth.gov to organize and disseminate TA tools and resources that have been created and/or identified by the Agencies that would have broad applicability across the P3 pilots. These resources might include links to grantee webinars or transcripts and recordings from calls with project directors, written guidance to assist pilots in understanding program requirements and relevant laws and regulations, and program announcements and other news. The Agencies also intend to provide guidance on Federal information and privacy laws (such as the Paperwork Reduction Act, the Family Educational Rights and Privacy Act, and the Health Insurance Portability and Accountability Act) and assist grantees in developing model consent forms.
L. NEW 2/24/2015 Other Issues
L-1. NEW 2/24/2015 Would a Pay for Success (PFS) model be appropriate for P3?
Pay for Success, also referred to as Social Impact Bonds, is an innovative financing model that leverages philanthropic and private dollars to fund services up front, with the Government reimbursing investors after the services generate results. Both P3 and PFS are funding models that focus jurisdictions on defining specific outcome goals for a well-defined target population, using reliable data to measure progress, and generating evidence about cost-effective interventions.
Both P3 and PFS are complex, emerging models that the Federal government is testing, and each one requires some experience and expertise to implement successfully. As noted in the notice inviting applications, P3 applications will be scored, in part, based on partnership capacity. This capacity takes into account the extent to which partners have the necessary authority, resources, expertise, and incentives to achieve the pilot’s goals, resolve unforeseen issues, and sustain efforts to the extent possible after the project period ends, including by demonstrating the extent to which, and how, participating partners have successfully collaborated to improve outcomes for disconnected youth in the past. PFS brings together new partners, such as outside investors, to implement a new model of service delivery that, in many cases, the partners will be implementing for the first time. As a result, applicants may be challenged to demonstrate partnership capacity, including how partners have successfully collaborated to improve outcomes for disconnected youth, if they propose a PFS model that partners have not previously implemented.
Although applicants may be challenged to demonstrate partnership capacity if they propose blending PFS funds or supporting a PFS model with P3 funds, a P3 project run separately from a PFS project could still complement or prepare for one. For example, a jurisdiction that is considering pursuing PFS but lacks needed information about current outcomes for youth might first propose a P3 project to gather that information through the careful outcome tracking required under P3. Or a jurisdiction seeking to sustain a successful intervention first tested through PFS might propose to continue supporting the intervention through a P3 project.
Other Federal resources that are helping jurisdictions to build capacity to implement Pay for Success include grants recently awarded by the Social Innovation Fund. More information on the Pay for Success model is available at http://www.payforsuccess.org.
L-2. NEW 2/24/2015 Is there a minimum or maximum number of youth to be served in the pilots?
No. There is no minimum or maximum number of youth to be served by the pilots. However, applicants must define the target population to be served by the pilot, based on data and analysis demonstrating the need for services within the relevant geographic area. Please see Application Requirement (a), Statement of Need for a Defined Target Population.
1 The RFI is available at http://www.federalregister.gov/articles/2012/06/04/2012-13473/request-for-information-on-strategies-for-improving-outcomes-for-disconnected-youth.
2 See ETA’s performance measures Web site at http://www.doleta.gov/performance/guidance/tools_commonmeasures.cfm.
3 See Chapter 2 of this report, available at http://www.whitehouse.gov/ready-to-work.
4 See “Community Programs to Promote Youth Development” at http://mnliteracy.org/sites/default/files/youth_development_brief.pdf (PDF, 8 pages).
5 See http://usich.gov/resources/uploads/asset_library/USICH_Youth_Framework__FINAL_02_13_131.pdf (PDF, 19 pages).
6 Fernandez, Adrienne, and Thomas Gabe. 2009. Disconnected Youth: A Look at 16- to 24-Year-Olds Who Are Not Working or in School. Washington, D.C.: Congressional Research Service.
Hair, E.; Moore, K.; Ling, T.; McPhee-Baker, C.; and Brown, B. 2009. Youth Who Are Disconnected and Those Who Then Reconnect: Assessing the Influence of Family, Programs, Peers and Communities. Washington, D.C.: Child Trends.