Leadership Program: Making a difference?
Evaluation explores how the program affects rural communities
The Ford Institute for Community Building has been conducting leadership classes in rural communities throughout Oregon and northern California for the last nine years. The anecdotal evidence that we’ve heard strongly suggests it works — that the program develops community leaders, increases civic engagement and helps build strong organizations.
But thinking we know what works is different from knowing, and that’s why we continually evaluate the program. This issue of Community Vitality is devoted to the results of a comprehensive evaluation recently completed by the Oregon State University Extension Family and Community Health program.
The evaluation was conducted in segments over the past several years and included participant surveys, focus groups and interviews with past participants and community residents.
We wanted to know not just if the Leadership Program made a difference, but how. We explore the answers in the pages that follow.
There are a lot of good reasons to conduct program evaluations. The most obvious, of course, is to determine the results of the large investment made in the Leadership Program and to look for indications that it is benefiting the communities it aims to serve.
We also use the results of the evaluation to improve program delivery. The feedback we get from the people on the ground is invaluable in helping us make mid-course corrections and improvements. We are now, for example, in the sixth major revision of the Leadership Program curriculum, changes driven largely by comments and suggestions from participants.
The evaluation also allows us to share the lessons we learn with other audiences that can benefit—other foundations, for example, as well as the public and other interested people. A nonprofit foundation in British Columbia even adapted the Leadership Program to the needs in its province.
What does success look like?
Although it’s easy to do an evaluation without talking to people, the numbers can tell us only part of the story. When we designed the evaluation, we intentionally set out to collect the soft data—anecdotes and stories—that really tell us what is going on. The focus groups and interviews allow us to collect information that may not be covered in the survey. They give people a way to share what the impact of the program has been, and to tell us what has worked and what has not. The stories bring the data alive for everyone.
The evaluation set out to answer several questions, including these: Does the program develop effective community leaders? Does it contribute to increased civic engagement? Does the program build strong organizations?
The evaluation’s conclusion came as no surprise: the answers were all yes. The program’s overarching approach works: Community leaders who attend the classes learn skills, maintain them and use them.
And now that we know that the program is meeting its goals, we can look to the future. The hope is that these leaders with increased capacity will use their skills and their commitment to make things happen in their communities.
Next, we’d like to measure the effect the program has had on the economic health of our communities. But for now, the evaluation findings confirm something else we already knew—this is long-term work.