
Section 5: Terms Used in the Field

This section defines basic terms used in this module. These terms are highlighted in purple throughout the module; roll over a highlighted term to see its definition.

Activities:
Actions taken in order to meet objectives.
Evaluation:
“Evaluation has several distinguishing characteristics relating to focus, methodology, and function. Evaluation (1) assesses the effectiveness of an ongoing program in achieving its objectives, (2) relies on the standards of project design to distinguish a program's effects from those of other forces, and (3) aims at program improvement through a modification of current operations.”1
Survey feedback:
A process in which outside staff and organizational members collaboratively gather, analyze, and interpret data and then use their findings to alter aspects of the organizational structure and work relationships.
Goals:
What an initiative is designed to achieve; typically general in nature, describing long-term outcomes.2
Outcomes:
The changes at the individual, organizational, or system level intended as the result of an initiative.
Outputs:
Completed activities internal to the initiative or organization as specified strategies are implemented.3
Performance measurement:
“Involves ongoing data collection to determine if a program is implementing activities and achieving objectives. It measures inputs, outputs, and outcomes over time. In general, pre-post comparisons are used to assess change.”4
Performance measures:
“Ways to objectively measure the degree of success a program has had in achieving its stated objectives, goals, and planned program activities. For example, number of clients served, attitude change, and rates of rearrest may all be performance measures.”5
Self-evaluation:
“The evaluation of a program by those conducting the program.”6
Logic Model/Theory of change model:
“A diagram and text that describes/illustrates the logical (causal) relationships among program elements and the problem to be solved, thus defining measurements of success.”7
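The pre-post comparison described under performance measurement can be sketched in a few lines. This is a hypothetical illustration using the rearrest-rate measure named above; all counts and numbers are invented, not TJC data:

```python
# Hypothetical pre-post comparison of one performance measure
# (rearrest rate). All counts below are invented for illustration.

def rate(events, population):
    """Events per person in the measured population."""
    return events / population

pre_rearrests, pre_clients = 40, 200    # baseline period (hypothetical)
post_rearrests, post_clients = 28, 200  # follow-up period (hypothetical)

pre_rate = rate(pre_rearrests, pre_clients)
post_rate = rate(post_rearrests, post_clients)
change = post_rate - pre_rate  # negative change = fewer rearrests

print(f"pre: {pre_rate:.2f}, post: {post_rate:.2f}, change: {change:+.2f}")
```

The same pre-post pattern applies to any measure tracked over time, such as number of clients served or attitude change.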


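A logic model like the one defined above can be captured as a simple data structure. This is a minimal sketch; the element names follow the definitions in this section, and the entries are illustrative placeholders, not TJC content:

```python
# Hedged sketch: a logic model as a plain data structure, linking the
# problem to activities, outputs, and outcomes. Entries are placeholders.

logic_model = {
    "problem": "High rearrest rates among people returning from jail",
    "activities": ["Pre-release case planning", "Community referrals"],
    "outputs": ["Number of transition plans completed",
                "Number of referrals made"],
    "outcomes": ["Reduced rearrest rate", "Increased service engagement"],
}

# Each output and outcome doubles as a candidate performance measure,
# which is how the logic model defines measurements of success.
for element, items in logic_model.items():
    print(f"{element}: {items}")
```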
Self-evaluation and sustainability are key components of the TJC initiative. Self-evaluation helps you understand how well the initiative is working and what changes are needed to achieve better outcomes. As you have seen, the process is not complicated. First, draft an evaluation roadmap that outlines how you plan to evaluate the TJC initiative, including developing your TJC performance measurements. Next, form a data/evaluation working group, formalize your data collection procedures, analyze the data, and disseminate the findings. Ensure the TJC initiative's sustainability by clarifying the roles and responsibilities of the initiative's participants, developing a culture of data sharing, conducting outreach, and leveraging your community's resources to support the initiative.
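The self-evaluation steps in the paragraph above can be sketched as an ordered checklist. This is a minimal illustration; the step names are paraphrased from the paragraph, not an official TJC artifact:

```python
# Hedged sketch: the self-evaluation steps above as an ordered checklist.
roadmap = [
    "Draft an evaluation roadmap and performance measurements",
    "Form a data/evaluation working group",
    "Formalize data collection procedures",
    "Analyze the data",
    "Disseminate the findings",
]
completed = set()

def complete(step):
    """Mark a roadmap step as done."""
    assert step in roadmap, f"unknown step: {step}"
    completed.add(step)

# Work through the first two steps, then report what comes next.
for step in roadmap[:2]:
    complete(step)

remaining = [s for s in roadmap if s not in completed]
print(f"{len(completed)} done, next: {remaining[0]}")
```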

1 U.S. Environmental Protection Agency. n.d. “Program Evaluation Glossary.”

2 Miles, Mathew, Harvey Hornstein, Daniel Callahan, Paula Calder, and R. Steven Schiavo. 1969. “The Consequences of Survey Feedback: Theory and Evaluation.” In The Planning of Change, edited by Warren Bennis, Kenneth D. Benne, and Robert Chin (457–68). New York: Holt, Rinehart and Winston.

3 Ibid.

4 Bureau of Justice Assistance. n.d. “Center for Program Evaluation and Performance Measurement.”

5 Ibid.

6 U.S. Environmental Protection Agency. n.d. “Online Program Evaluation Glossary.”

7 U.S. Environmental Protection Agency. n.d. “Introduction to Logic Modeling, Performance Measurement and Program Evaluation: A Primer for Managers.”