How Emergent Learning Hones Grantmaking and Deepens Partnerships

Haley Sammen, Evaluation & Learning Partner

Peak Grantmaking

March 17, 2023

At the core of everything we do is the belief that we can do better—both with how we support people with mental health and substance misuse challenges in our community, and how philanthropy operates. Since we opened our doors in 2019, we have sought to live that belief through a learning-centered approach. Over the last four years, we have embedded learning into almost every aspect of our work while stewarding more than $100 million to over 200 unique organizations in the city. We do this because we know we can’t be truly effective without centering community insights to better understand what works and what doesn’t. Creating a learning partnership with our community is critical to being responsible stewards of these public funds. We need to learn with our grantees about their successes and challenges to gain insight into how we can all adapt and drive toward better results.

To put community learning and insight at the center, we spent our first year engaging with over 1,500 people in Denver uncovering their thoughts on what needs to look different in our community. They helped us interpret available data from community needs assessments and other sources, and they shared their own experiences so we could better understand what it will take to address mental health and substance misuse in our community. Their insights drove the creation of a shared impact plan (SIP), which is the basis of our grantmaking as well as our evaluation and learning approach. The SIP names three specific systemic shifts the community identified for addressing mental health and substance misuse: inclusive access, attention to fit, and care over time. It also identifies 12 signals of progress the community believes will help us know if the funded work is on the right path toward the intended impacts. In partnership with communities, we are using the SIP to explore, experiment, and refine work across the city, and our learning approach allows us to be in dialogue in real time about those experiences.

We are aiming to push boundaries for how learning can drive insights, adaptations, and accountability in philanthropy. One way we directly do this is through our evaluation and learning approach with grantees. We designed an approach rooted in curiosity, transparency, collaboration, and honoring community members as experts. We’ve spent the past four years building, piloting, and adapting this approach, which includes the following essential components, detailed further in our Evaluation Touchpoints guide for grantees.

1. A living project framework

At the beginning of each grant, the program officer and evaluation team member work collaboratively with staff from each grantee organization to create a living project framework. The framework outlines grant strategies, intended or ideal outcomes, and the types of supporting information a grantee already has or plans to collect to articulate the outcomes. This creates an intentional space where both parties are encouraged to clearly describe the priorities and purpose of the work, laying the groundwork for future honest conversations. This framework sets the stage for learning and adaptation by identifying the ideal results of the work, so that it can be compared to the “actual” of the work over time.

The project framework also includes space for program officers and grantees to collaboratively define how the grant aligns with the foundation’s SIP and what measures of the work will best illustrate impact. The assigned evaluation partner then works with the grantee to document a reporting plan specific to the data they currently have or plan to collect and what they consider to be meaningful measures of their work. The upfront work of cocreating this document allows grantees to inform the foundation on what they hope to see change, how they will know it when they see it, and how they will report on it, while keeping the data collection and reporting burden low.

2. Collaborative learning calls

We host collaborative learning calls with grantees at the midpoint and the end of each grant year. These conversations are loosely structured around the emergent learning practice of after-action reviews, a practice regularly used and promoted by the EL Community Project. Program officers create a nonjudgmental space when they check in on the shared expectations of the grant-funded work and support grantees in sharing about their progress through oral reporting. The structure and flow of the questions regularly prompt honest insights from both grantees and the foundation, creating opportunities to adapt and pivot the work in real time. There is also an opportunity to refine evaluation measures to ensure the data is telling the most meaningful story of the work from the grantee’s perspective.

On the end-of-grant-year call, we collaboratively review quantitative outcome metrics with grantees to ensure their experiences with and the larger context of the work are not lost behind the metrics they’ve reported. Through this dialogue, the grantees share with us what the data tells them about their work and progress as well as the experiences of the people they are serving. This process generates understanding, ideas, and hypotheses about the work that wouldn’t be possible by reviewing the data on our own.

By making space for meaningful numerical assessments as well as insightful qualitative reflections, both types of data are valued and used to inform actions within the partnership. Often, this also uncovers insights on broader drivers and barriers within the systems influencing mental health and substance misuse. 

This process is also part of Caring for Denver’s internal learning strategy, and we use the information to help plan future funding opportunities and refine organizational operations. Additionally, we share key insights back with the community through rapid cycle learning briefs to ensure that we are being transparent in how learning is shaping our work and to help inform broader thinking and change beyond our doors.

3. Tailored evaluation support

We recognize that our grantees have varying degrees of resources, capacity, and skills for evaluation and learning, so each grant has a designated evaluation partner. Grantees can connect with that person at any point during the year for help with their evaluation and learning processes and goals. This can take the form of survey design or refinement, innovative data collection methods, or simply talking through what meaningful measures of their work could be. Through working directly with grantees on their evaluations, we are uncovering important insights about grantees’ strengths, concerns, assumptions, and needs related to evaluation. This real-time learning and feedback drives frequent adaptations and allows us to offer customized capacity-building opportunities for grantees with additional partners.

Our model puts community partnership and collaborative learning at the center of evaluation, and we find it drives higher quality, more accurate, and more meaningful data. With that in mind, learning partnership compensation dollars are also added to each grant as an acknowledgment of the time and expertise organizations invest in working with us. We believe this is an important way to honor the intellectual and emotional labor their participation in our approach to funder-grantee relationships requires and the immense value it brings to the foundation and community.

Each year, our small team of three evaluation staff, in collaboration with five program officers, collectively engages in over one thousand learning or evaluation support calls. To be honest, there are days when we question our time-intensive, sometimes messy process. Is this really a sustainable approach as the number of grantees grows? Is it worth the intense diligence required of us to ensure customization for each grant? But we are lucky enough to have regular and consistent wins that reinforce that this model is the right path for us.

4. Measuring learning success

We see signs of the trust and transparency our approach builds in several ways. Partner organizations are generally willing to share about the delays, mishaps, and lessons learned when grant accountability is centered on learning and adapting rather than perfectly executing on a funded proposal. Knowing the true state of the work rather than a well-intentioned but rose-tinted picture helps us better plan and support our grantees. 

For instance, we currently fund several unique workforce-care grant pilot projects directly stemming from grantees’ honest feedback on struggles with hiring, training, and retaining behavioral health staff. Without their honest feedback about what isn’t working and why, we wouldn’t have been able to help address their challenges.

Another success is in the evolution of our own understanding of signals of progress. A great example is in measuring what matters related to substance misuse. We initially assumed demonstrating sobriety or reduced substance use was the most important goal. Through learning and evaluation support calls with numerous community care leaders and behavioral health workers, we were able to expand our thinking. Their experience in the community led them to articulate to us that, because substance misuse is often a lifelong issue, a more meaningful measure of progress is when people reengage with supports after having a recurrence of misuse. This indicates that people who are struggling are aware of and connecting to trusted means of support. Measuring program stability and participant engagement and reengagement with a program was more important to the community, and it’s now important to us.

We are also seeing grantees change their mindsets about the value of participating in evaluation and learning. In the beginning, this approach was so different that grantees were confused and even frustrated with our need to collaborate. Many times, they would say, “Just tell us what you want us to measure, and we’ll do it.” We are steadily seeing more grantees excited to tell us what measures are important to them and motivated to lean into this approach. They willingly engage in conversations to identify their program-specific measures, and they are growing in how they think about data. On a recent end-of-grant-year learning call, a grantee showed impressive metrics, reaching almost 100% in all their measures. While they celebrated their success with us, they also decided to change their metrics for the next grant year. As a team, they had reassessed how to define success. They didn’t just want to affirm what they already knew, even though it looked great on paper. They wanted to pick metrics that could help them learn more about their program.

We love that grantees are beginning to see themselves as expert learning partners, and many are using the tools to help bolster their ability to sustain their work. Several grantees have told us about using their project framework to help them successfully apply for local and national funding beyond Caring for Denver. A few have even used the tools and their learning reflections to restructure their internal evaluation practices within their organizations. Hearing how grantees are integrating learning practices into their work outside of our grants is one of the greatest rewards coming from our learning investments.

5. Learning from mistakes

One of the places we failed forward was in our project framework template. This document has gone through four variations, from much too complex to excessively simplified; we are hopeful we are approaching something closer to just right. Because earlier versions of the template did not reach the right level of shared clarity about the work, they did not effectively set up the learning partnership to foster transparency, curiosity, and ultimately trust. This need to adapt means some grantee partners have not yet had a clear and consistent experience with the evaluation aspect of their grant. It also means that our internal teammates feel some adaptation fatigue when they are longing for consistency. We recognize the psychological impacts adaptation has on all parties. We are striving to stay humble with our asks and patient with our progress, acknowledging with our partners that we too are growing and learning.

Despite some growing pains with this approach, one of our biggest wins is for our team. It is clear that our approach is naturally leading us to curiosity about the hows and whys of impactful community work. Learning what makes a program or project successful (or not) from the grantee’s perspective, rather than just checking a box that a project was done, makes our jobs a lot more interesting. We are all building our listening and learning muscles every day and doing better work because of it.

About Caring for Denver Foundation

Caring for Denver Foundation is a taxpayer-funded, nonprofit foundation addressing Denver’s mental health and substance misuse needs.