By Celeste Richie
The Pennsylvania Workforce Development Board (PA WDB) voted last week to define evidence of effectiveness for Pennsylvania’s public workforce system for the first time in its history. RFA worked closely with James Martini, RFA Workforce Fellow and PA WDB Executive Director, and with the PA WDB Youth and Continuous Improvement Committees on this groundbreaking effort, which will help prioritize evidence across Pennsylvania’s workforce system and encourage other state workforce agencies to do the same.
PA WDB has been on a journey to build a data- and evidence-based workforce system for some time, including adopting a measure last fall that commits the state to increased transparency in performance data and in discretionary grant spending and outcomes. Defining evidence of effectiveness for the system was the next step, and PA WDB drew on definitions of strong, moderate, preliminary, and pre-preliminary levels of evidence from the Corporation for National and Community Service and on the evidence tiers defined in Re-employment Services and Eligibility Assessment (RESEA) program regulations (see below for the full definitions).
Reflecting on the definitions, Chekemma J. Fulmore-Townsend, PA WDB’s Youth Committee Chair, said: “I’m really excited about taking this important step in moving Pennsylvania’s Workforce Development system in a direction that strongly prioritizes evidence and evaluation in developing our programs, policies, and practices.” James Martini, the Executive Director of the PA WDB, agreed, saying, “a clear definition of evidence will allow us to evaluate our programs in an effective and equitable manner. This will not only ensure that we are allocating resources in a fair way, but that we can identify the most successful programs and scale those across the commonwealth.”
As part of RFA’s State and Local Workforce Fellowship, Allison Jones, the Deputy Secretary of Policy and Planning for PA Governor Tom Wolf; James Martini, the Executive Director of the PA WDB; Erica Mulberger, the Executive Director of the Central PA Workforce Development Corporation; and Dillon Moore, the Director of Policy for Partner4Work, have been working together to identify and implement priority strategies from RFA’s 7 Ways to Improve Workforce Outcomes Using Evidence: 2019 Policy Roadmap for State & Local Officials. Along with seven other state teams in the RFA fellowship cohort, the Pennsylvania Team members are honing their data and evidence skills and implementing policies and practices that will strengthen PA’s public workforce system and improve the lives of the people they serve.
RFA’s Vice President of Workforce Development, Celeste Richie, joined the PA WDB Board meeting to speak in support of adopting these definitions, and the Board voted unanimously to adopt them. The endorsements from the Youth and Continuous Improvement Committees detailed the rationale for recommending their adoption (page 49 of the PA WDB 5/5/2020 Briefing Book):
- Evidence guidelines provide qualitative and quantitative data in support of program structure and offerings;
- Having specific guidelines in place benefits individuals and program providers by setting clear expectations for what may need to be included when developing new programs or improving existing ones;
- The definitions provide a basis for understanding what constitutes an evidence-based program;
- It is beneficial to have a “gold standard” in place to “raise the bar” on the overall quality of programs;
- Potential grant applicants can understand and plan for the types of data they will gather to demonstrate the effectiveness of their proposed programs and/or services;
- Newly established entities and veteran organizations have an equal opportunity to compete for grant offerings;
- The definitions offer a clearer picture of program-delivery needs and help shape standardized capacity building and training that can be deployed easily and consistently at all levels, particularly on the ground; and
- State and local agency leaders gain a solid oversight framework for evaluating program effectiveness and cost-efficiency.
Please contact Celeste at email@example.com if you’d like to learn more about how your state or local workforce board/agency can adopt similar evidence definitions.
PA WDB Evidence of Effectiveness Definitions
On May 5th, the PA WDB voted unanimously to adopt the following four-tiered definitions of evidence for use throughout the PA workforce system:
Strong evidence: meaning at least two evaluation reports have demonstrated that an intervention or strategy has been tested nationally, regionally, at the state level, or with different populations or locations in the same local area using a well-designed and well-implemented experimental design evaluation (i.e., Randomized Controlled Trial (RCT)) or a quasi-experimental design evaluation (QED) with statistically matched comparison (i.e., counterfactual) and treatment groups. See CLEAR.dol.gov for full definitions of strong or moderate study design. The overall pattern of evaluation findings must be consistently positive on one or more key workforce outcomes. The evaluations should be conducted by an independent entity external to the organization implementing the intervention.
Moderate evidence: meaning at least one evaluation report has demonstrated that an intervention or strategy has been tested using a well-designed and well-implemented experimental or quasi-experimental design showing evidence of effectiveness on one or more key workforce outcomes. The evaluations should be conducted by an independent entity external to the organization implementing the intervention.
Preliminary evidence: meaning at least one evaluation report has demonstrated that an intervention or strategy has been tested using a well-designed and well-implemented pre/post-assessment without a comparison group or a post-assessment comparison between intervention and comparison groups showing evidence of effectiveness on one or more key workforce outcomes. The evaluation may be conducted either internally or externally.
Pre-preliminary evidence: meaning there is program performance data for the intervention showing improvements for one or more key workforce outputs or outcomes.
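Read together, the four tiers above amount to a decision rule over an intervention's evaluation history: how many evaluations exist, how rigorous their designs were, whether findings were positive, and whether the evaluators were independent. The sketch below is purely illustrative and is not part of the PA WDB policy; the `Evaluation` type, its field names, and the simplified tier logic (which omits the "tested in multiple contexts" and "consistently positive pattern" conditions of the strong tier) are all assumptions made for the example.

```python
# Hypothetical sketch only -- not part of the PA WDB policy. Models the four
# evidence tiers as a simplified decision rule over an intervention's
# evaluation history.
from dataclasses import dataclass

@dataclass
class Evaluation:
    design: str            # "rct", "qed", "pre_post", or "performance_data"
    positive_outcome: bool  # showed effectiveness on >= 1 key workforce outcome
    independent: bool       # conducted by an entity external to the implementer

def evidence_tier(evals: list[Evaluation]) -> str:
    """Map a list of evaluations to the tier it best fits (simplified)."""
    # Strong/moderate tiers require rigorous designs (RCT or QED), positive
    # findings, and an independent external evaluator.
    rigorous = [e for e in evals
                if e.design in ("rct", "qed") and e.positive_outcome and e.independent]
    if len(rigorous) >= 2:
        return "strong"        # at least two rigorous, positive, independent reports
    if len(rigorous) >= 1:
        return "moderate"      # at least one such report
    # Preliminary evidence may be internal or external; pre/post designs qualify.
    if any(e.design == "pre_post" and e.positive_outcome for e in evals):
        return "preliminary"
    # Pre-preliminary: only program performance data showing improvement.
    if any(e.design == "performance_data" and e.positive_outcome for e in evals):
        return "pre-preliminary"
    return "no evidence"

# Example: two independent RCTs with positive findings -> "strong"
print(evidence_tier([Evaluation("rct", True, True), Evaluation("rct", True, True)]))
```

A rule expressed this way makes the ordering of the tiers explicit: an intervention is always assigned the highest tier its evaluation record supports, which mirrors how a grant reviewer would apply the definitions.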