Module 9: Section 3: Routine Assessment and Self-Evaluation

This section will help you understand the types of routine assessments and evaluations your agency should conduct to support your ongoing TJC initiative. You will also learn the steps needed to plan your evaluations.

What Is Routine Assessment?

Routine assessment is the process of regularly gathering, analyzing, and interpreting your data to help you and your partnering agencies improve and revise the TJC initiative and its components. An important aim is to use your data to answer key questions about your jail transition processes – e.g., whom are you serving, what are the criminogenic risks and needs of those individuals, what programs and interventions are these individuals receiving, and do these programs meet their identified criminogenic needs? – and to modify and strengthen the application of the TJC model in your community based on the answers to those questions. A quality assurance process uses similar data but goes beyond data analysis to include assessment of how services and programming are delivered; it may also include client satisfaction measures.

Here is how to begin:

  • Convene your TJC initiative's coordinating reentry council to determine the key outcomes of interest to partners and potential funders in demonstrating the TJC initiative's progress.
  • Form a specific data or evaluation working group.
  • Jointly produce a theory of change model. This process will highlight the overall model outcomes, including immediate, intermediate, and long-term outcomes.
  • Develop a data collection procedure based on your consensus outcomes, ideally with different agencies helping with the data collection and analysis.
  • Analyze the data to answer agreed-upon research questions designed to evaluate important outcomes identified by the coordinating reentry council.
  • Have reentry council members interpret analyses and results.
  • Disseminate the findings to stakeholders on a regular basis.

Feedback

We encourage you to establish mechanisms—such as forums, focus groups, routine reports from partner agencies, and client satisfaction surveys—to obtain early and frequent feedback from partners and constituents.

Think of feedback as having the following components:1

  • Data are used to objectively examine the TJC initiative, focusing on the model at the system and individual levels.
  • Structured meetings are held to review the data and increase collaboration among the partners.
  • Analyses are used to make decisions and to ensure that the TJC initiative is being implemented as expected and is improving.
  • The process is repeated on a routine basis.

Assessment and Evaluation Capacity

The TJC initiative recommends that at least one partnering agency have the capacity to plan and implement routine assessments and evaluations of the initiative. This agency will use that capacity to advance the overarching goals of TJC and will feed its results back regularly to stakeholder decision makers to inform decision making, organizational reorientation, and resource allocation. Building internal capacity to make evaluation part of your agency, instead of relying on outside consultants or evaluators to analyze your TJC initiative, is important because it

  • Increases responsibility for competent data management and collection by the partnering agencies.
  • Decreases the likelihood that the TJC initiative and its outcomes will be opposed.
  • Influences organizational culture to accept data findings and resultant changes in policy or practice.
  • Teaches agencies to improve their TJC initiative without relying on outside help.
  • Allows for a system to make educated and targeted decisions on where they would like to allocate their resources.
  • Builds the necessary resources to sustain long-term assessment and evaluation.

In Jacksonville, Florida, the Sheriff's Office analyzes risk screen scores to identify candidates for pretrial risk assessment and for risk and criminogenic needs assessment. After reviewing the data, Jacksonville changed its procedures and began releasing inmates with risk screen scores below certain cutoffs to pretrial services.
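
A minimal sketch of this kind of score-based triage is shown below. The cutoff value, score scale, and field names are illustrative assumptions, not Jacksonville's actual thresholds or data layout:

```python
# Hypothetical sketch: flag bookings whose risk screen score falls below
# an assumed cutoff as candidates for release to pretrial services.
PRETRIAL_CUTOFF = 4  # illustrative threshold on an assumed 0-10 screen


def pretrial_candidates(bookings):
    """Return booking IDs with risk screen scores below the cutoff."""
    return [b["id"] for b in bookings if b["risk_score"] < PRETRIAL_CUTOFF]


bookings = [
    {"id": "A101", "risk_score": 2},
    {"id": "A102", "risk_score": 7},
    {"id": "A103", "risk_score": 3},
]
candidates = pretrial_candidates(bookings)  # ["A101", "A103"]
```

In practice an agency would run this kind of filter against its jail management system's booking data, then review the flagged cases before release.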

However, if you don't have in-house research staff, you may want to partner with local research organizations or academic institutions to help you with your evaluations.

Field notes from Allegheny County, Pennsylvania

Urban Institute researchers recently evaluated two Allegheny County, Pennsylvania, Second Chance Act-funded reentry programs. Both programs use core correctional practices such as risk/needs assessment, coordinated reentry planning, and evidence-based programs and practices to reduce recidivism; one connects clients to a reentry case manager pre- and post-release (Reentry Program 1), the other to a reentry probation officer (Reentry Program 2). The evaluation found that both reentry programs reduced rearrest among participants and prolonged time to rearrest. These findings are supported by ample evidence of implementation fidelity. For example, both programs consistently targeted moderate- to high-risk inmates, conducted assessments, used coordinated case planning, and linked clients to EBPs, including cognitive behavioral interventions.2 Read the full report at http://www.urban.org/sites/default/files/alfresco/publication-pdfs/413252-Evaluation-of-the-Allegheny-County-Jail-Collaborative-Reentry-Programs.PDF.

 


1 Miles, Matthew, Harvey Hornstein, Daniel Callahan, Paula Calder, and R. Steven Schiavo. 1969. “The Consequences of Survey Feedback: Theory and Evaluation.” In The Planning of Change, edited by Warren Bennis, Kenneth D. Benne, and Robert Chin (457–68). New York: Holt, Rinehart and Winston.

2 Buck Willison, Janeen, Sam G. Bieler, and Kideuk Kim. 2014. “Evaluation of the Allegheny County Jail Collaborative Reentry Programs.” Washington, DC: Urban Institute.

Evaluation Techniques

The type of assessment and self-evaluation you decide on depends on the data you have and the outcomes you wish to evaluate. Though we often use the term self-evaluation in the general sense, there are many types of evaluations. The five most common you might use for the TJC initiative are

1. Process Evaluation: Documents all aspects of program planning, development, and implementation and how they add value to services for those transitioning from the jail to the community.

Data sources that support process evaluations usually include program materials, direct observation of the intervention, and semi-structured in-person interviews with staff and other stakeholders that focus on the intervention.

2. Outcome Evaluation: Assesses the extent to which an intervention produces the intended results for the targeted population; outcome evaluations typically use some kind of comparison group (e.g., participants who are similar to the target population but don't get the intervention being evaluated). This technique is more formal than performance measurement.

Note: Outcome evaluations are in-depth studies that include comparison groups; these evaluations take many months to produce results and are often expensive. An independent evaluator may be needed. The benefit of an outcome evaluation is that it answers specific questions and attributes outcomes directly to the program or initiative studied.

3. Performance Measurement: Based on regular and systematic collection of data to empirically demonstrate results of activities.

Note: Performance measurement only tracks outcomes. Unlike an outcome evaluation, it cannot attribute those outcomes or changes to specific program activities. However, performance measurement is relatively easy to design and implement, and it is less resource intensive than outcome evaluations.

4. Cost-Benefit Evaluation: Measures how much an initiative, its programs, and its partnerships cost and what, if any, long- or short-term savings the initiative generated.

5. Quality Assurance (QA) Assessment: Involves systematic monitoring of the various aspects of a program, service, or process to ensure that standards of quality are being met; under TJC, this would include your screening, assessment, programming, and case planning services. For example, data collection that supports QA practices could include a short questionnaire administered to participants as a pre-test before a class starts and as a post-test at its end, or a brief client satisfaction survey asking about the quality of services they received.
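
For the cost-benefit technique described above, the core arithmetic is a simple ratio of savings to costs. The sketch below uses hypothetical dollar figures, not data from any TJC site:

```python
def benefit_cost_ratio(total_savings: float, total_cost: float) -> float:
    """Savings generated per dollar spent; a ratio above 1.0 indicates
    the initiative saved more than it cost."""
    if total_cost <= 0:
        raise ValueError("total_cost must be positive")
    return total_savings / total_cost


# Hypothetical figures: $450,000 in avoided jail-bed and recidivism costs
# against $300,000 in program and partnership costs.
ratio = benefit_cost_ratio(450_000, 300_000)  # 1.5, i.e., $1.50 saved per $1 spent
```

The hard part of a cost-benefit evaluation is not this division but credibly estimating the savings term, which is why such studies often require an independent evaluator.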

Below we explore two evaluation techniques in more depth:

Process Evaluation

A process evaluation will help you determine whether the TJC initiative and its programs are being implemented in the intended way, and what types of clients typically participate in the initiative.

The process evaluation focuses on capturing the basic elements of the TJC initiative as it presently functions in your community.

These data would be captured through structured observations of the TJC stakeholders, interviews with program staff, and a review of all available documentation.

Basic system-level questions you would seek to answer include

  • What is the overall TJC initiative strategy?
  • How is it different from business as usual?
  • Who is involved? Who are the stakeholders?
  • What does each stakeholder contribute?
  • What are the core elements of the approach?
  • What are the mechanisms for collecting data on clients—prior history, current experiences, and follow-up?

Additional questions include

  • How many agencies, partners, and clients participate in the TJC initiative?
  • What is the pool of potential participants?
  • What are the eligibility criteria to participate?
  • How many participate in each program?
  • How long do they remain engaged with each service provider before and after release?
  • How do potential participants learn about the TJC initiative?
  • How do TJC participants differ from other incarcerated individuals?
  • What types of services or referrals does each participant receive?
  • What are the background and demographic characteristics of participants for each service?
  • Why do participants show up at community providers after release?

Process evaluations also assess penetration rates and program fidelity. These terms are defined below:

Penetration Rate: The TJC initiative's reach into the target population. In other words, the number of inmates engaged in the program divided by the number of eligible inmates in the target population.
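
The penetration rate defined above reduces to a single division. A minimal sketch, with hypothetical counts:

```python
def penetration_rate(engaged: int, eligible: int) -> float:
    """Number of inmates engaged in the program divided by the number
    of eligible inmates in the target population."""
    if eligible <= 0:
        raise ValueError("eligible population must be positive")
    return engaged / eligible


# Hypothetical counts: 75 inmates enrolled out of 300 eligible -> 0.25 (25%)
rate = penetration_rate(75, 300)
```

Tracked over time, this single figure shows whether the initiative's reach into its target population is growing or shrinking.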

Program Fidelity: How closely the implementation of a program or component corresponds to the original model.

This is particularly important in the TJC initiative because, with limited time and resources, it is imperative that all program elements adhere to the originally designed program model for the intervention to be as successful as possible.

Quality Assurance: A robust QA process supports the improvement of transition work over time (and makes deterioration in quality less likely). A QA plan allows all providers to participate in a process of self-improvement. It also pushes the development of clear, shared standards for how key elements of the transition process should be carried out, fostering consistency of approach throughout the system.

The following programmatic quality assurance strategies and activities are critical in monitoring how effectively your programs are performing.

First, identify the key components that make this a quality, evidence-based process:

  • Is it an evidence-based or a best-practice program?
  • What types of offenders are best suited to benefit from the program?
  • Are risk-to-reoffend screening data used to inform placement and/or system action?
  • How are offenders identified for placement in the program (e.g., based on what criteria, and by whom)?
  • What are the minimum resources required to implement the program effectively (e.g., qualified staffing, adequate space, appropriate technology, sufficient time, participant criteria)?
  • Does the program come with a comprehensive curriculum and training documents provided by the program developer?
  • Is there an understanding of how the program was intended to be implemented? For example, the program's duration, class size, frequency of sessions or activities, and  materials to be used or discussed in delivery of the program.         
  • Is there an agreement on what system and individual level outcomes would indicate program success (i.e., the program is achieving the desired outcomes)? Is there a clear target population for the program?
  • Does the program target and reduce specific criminogenic needs?

Second, work with staff on site. What were the criteria for program staff selection?

  • Are staff familiar with the participants' needs?
  • Do staff have a background in delivering group programming?
  • Are staff experienced in delivering these curricula to an offender population within a correctional environment?
  • Were staff provided comprehensive training before program implementation?
  • Do staff understand and support screening, assessment, and identification of offender groups for programming?
  • Do staff have characteristics that facilitate communication with participants?
  • Is a thorough implementation plan developed prior to the start of the program?
  • Are appropriate resources made available to staff and participants?
  • Do staff have access to a staff training manual?
  • Is there ongoing training and supervision for program staff?
  • Have staff been tested to ensure they understand the program curriculum, requirements, and goals?

Third, monitor the program's operations and measure the program's performance.

  • Are screening and assessment procedures and processes followed as designed – e.g., are the right people being screened and assessed?
  • Are program eligibility criteria adhered to?
  • Are programs being facilitated/delivered by trained (certified) staff?
  • Are case plans being developed in a timely manner according to established benchmarks determined by the initiative's partners?
  • Do case plans incorporate assessment data and address the individual's criminogenic needs?
  • Is the program held in an adequate space?
  • Is there an agreement on what aspects of the program will be measured?
  • Do sufficient data exist in electronic format to enhance performance evaluation?
  • Is a system in place, and are evaluation tools developed, to gather performance and outcome feedback from program participants and staff (e.g., observations, surveys, administrative data, audits, assessment instruments, and file reviews)?
  • Is there adequate record keeping?
  • Can you measure short-, intermediate-, and long-term outcomes?

Fourth, improve the program through:

  • Collaborating as a quality team
  • Using a strength-based, supportive approach
  • Being results-oriented based upon objective, transparent measures
  • Using measures that are individual- and system-focused
  • Embracing a learning organization orientation
  • Enhancing long-term sustainability through policy adjustments that are informed by objective evaluation
  • Celebrating success and improvement

Sample System Questions for Maintaining Program Philosophy and Integrity

  • What staff will be allocated to oversee the quality assurance (QA) process?
  • How will QA outcomes be reported, to whom, and for what purpose? 
  • How will observations and feedback be structured?
  • How will system and individual audits be structured?  How often will they be conducted? By whom? How will outcomes be utilized? 
  • How will this quality assurance process guide the adjustment of curriculum and programming to better meet the needs of the clients being served?
  • How will gaps between the current and expected levels of quality be addressed?
  • What process will be enacted to utilize QA outcomes to revise policy, procedure, and/or practice?
  • How will revisions be reported to TJC, system, or organizational stakeholders?

The final report, Process & Systems Change Evaluation Findings from the TJC Initiative, is a detailed account examining how implementation worked in the TJC Phase 1 learning sites.

Resources

  1. Bureau of Justice Assistance (BJA). Center for Program Evaluation and Performance Measurement. State and local agencies will find useful resources for planning and implementing program evaluations and for developing and collecting program performance measures at this site.  
  2. Council of State Governments. 2005. Report of the Re-entry Policy Council: Charting the Safe and Successful Return of Prisoners to the Community.
  3. Denver’s Crime Prevention and Control Commission. Quarterly progress report of TJC system service providers including basic performance measures.
  4. Denver. Community Reentry Project (CRP) Evaluation. A two page self-report survey from Denver’s Community Reentry Project designed to determine how clients perceive the programmatic staff and programs. 
  5. Fresno County, CA. Fresno County presentation on the TJC Unit to the CCP.
  6. Domurad, Frank and Mark Carey (2010). Coaching Packet: Implementing Evidence-Based Practices.  The Center for Effective Public Policy.
  7. Klekamp, Jane (2012). An Investment in the Future: La Crosse County Charts a Course for Transition from Jail to the Community. National Jail Exchange. This report examines La Crosse County, Wisconsin's implementation of the Transition from Jail to Community Initiative.
  8. Moore, Byron. 2008. Programs Unit Report: Jail Re-Entry. Multnomah County Sheriff's Office.
  9. The Omni Institute (OMNI). Online survey and cover letter developed by OMNI institute to learn more about Denver’s reentry process.
  10. Orange County, CA. Transitional Reentry Center Exit Interview on Program Responsivity.
  11. Orlando, FL. Inmate Re-Entry Program Review Score Sheet. The process evaluation was designed by the Orange County, Florida, Corrections Department as an easy-to-use program review score sheet for its reentry program to determine if the program was in compliance with the department's standards.
  12. Process and Systems Change Evaluation Findings from the Transition from Jail to Community Initiative (2012). This report examines the six Phase 1 sites' TJC implementation experiences and presents findings from the cross-site systems change evaluation.
  13. Urban Institute. A brief discussion of descriptive, bivariate, and multivariate statistics; outcome evaluations require some level of statistical knowledge.
  14. Urban Institute. Excel formula tutorial to help agencies evaluate the number of unique bookings.
  15. Urban Institute and Douglas County, KS. Transition from Jail to the Community Stakeholder Survey.
  16. Urban Institute. TJC Core Leaders Phone Interviews. 2009. A guide to evaluate TJC progress.

Let's Review

Let's revisit what we have learned so far in the Self-Evaluation and Sustainability module. Please answer the following question.

A process evaluation documents

How much the TJC initiative, its programs, and partnerships cost.

The impact of the TJC initiative and its programs.

All aspects of the TJC's program planning, development, and implementation.

All of the above.

Summary

Self-evaluation through appropriate data collection and analysis is no simple task; however, given the complexity of a full jail transition effort, constant evaluation is essential to ensure that your resources are being spent wisely and key outcomes are achieved. Established processes of self-evaluation also influence organizational and system culture by examining and monitoring costs, processes, and outcomes and by generating data-driven policies and procedures. Local or departmental capacity should be assessed and developed to accomplish proper evaluation of the TJC implementation; however, if time or resources prohibit this, research-oriented agencies or universities often are willing to offer assistance.