The organization identifies and tracks key process metrics for engaging PWLE.
- To what extent do we assess our process of engaging PWLE?
- What metrics do we track or could we track to assess our engagement process?
- Collaborate with PWLE to identify metrics. Work together to determine what to measure and how, ensuring that the metrics reflect what “meaningful engagement” means to them.
- Use a mix of quantitative and qualitative measures. Quantitative metrics (e.g., attendance, representation, survey responses) reveal patterns and scale, while qualitative measures (e.g., reflections, post-meeting check-ins) provide depth and insight into experience. Together, they give a fuller picture of engagement.
- Keep tracking simple and integrated. Start small by identifying engagement process data you may already be collecting and take time to review it to observe trends and changes. Incorporate engagement tracking fields into meeting and program management documents (e.g., meeting minutes), and set aside time during meetings to discuss and document feedback from PWLE as well as staff about their experience working together.
- Consider formal evaluation tools. For a more formal approach, use evaluation tools such as those cataloged by the National Academy of Medicine: Assessment Instruments for Measuring Community Engagement.
- Review and adapt regularly. Schedule quarterly or annual debriefs with PWLE and staff to review process data, celebrate successes, and identify opportunities for improvement.
Process metrics you can track include:
- Participation
  - Number of PWLE involved in your engagement programs
  - Number of new PWLE engaged/recruited
  - Attendance at meetings and retention over time
- Accessibility and inclusion
  - Diversity of PWLE involved in your engagement programs
    - Consider collecting demographic information from PWLE who are engaged and comparing this to demographics of the overall community and/or your organization’s service population
  - Availability and/or improvements made in ensuring accessibility of engagement opportunities
    - Such as providing translation and interpretation, transportation, digital access, etc.
- Capacity and support
  - Availability and/or improvements made in orientation/training for PWLE
  - Availability and/or improvements made in ongoing supports (e.g., peer networks, mentorship, learning opportunities)
  - Availability and/or improvements made in compensation and reimbursement practices
- Authenticity and power-sharing
  - Degree of engagement and participation at meetings
    - Such as what percentage of PWLE involved actively participate or speak up, and/or self-reported feedback about the degree to which people feel comfortable and able to participate
  - Influence of PWLE on decisions (i.e., tokenism vs. co-decision-making)
    - Examples of specific changes made as a result of input from PWLE, and/or self-reported measures of how PWLE feel their input makes a difference
  - Feedback and transparency to PWLE and staff about how input is shared and used
    - Such as the existence of and/or improvements made in a feedback loop process
- Satisfaction and trust
  - PWLE perception of respect and value
    - Such as self-reported measures or open-ended feedback from PWLE about whether they feel respected and valued
- Sustainability
  - Resource availability for engagement
    - Such as budget allocations, funding, staff roles, and other resources available
  - Structures for engagement (e.g., advisory councils, standing committees, governance seats for PWLE)
    - Such as number and type of permanent or semi-permanent avenues for PWLE to partner with the organization
Community Health Plan of Washington’s approach to process measurement makes space for measures from both the organization’s perspective and the perspectives of PWLE.
“There’s the operational successes that are kind of guided by contract requirements and our strategic planning target goals and all of that. And then there’s this sense of trust and engagement within the actual spaces themselves that are a little bit harder to quantify or measure, but that are just as important. We’ve got annual goals related to the processes around the advisory councils. We have contract language about how many members are participating; the distribution of those members across demographic factors; the number of meetings that we want to have every year; how many folks sign up and then attend – those kinds of things that certainly show us that we’re moving in the right direction. But I think when I talk to the team and they talk about the depth and level of the feedback, it’s clear that members trust us and are willing to open up and share about things that are really tough. That tells me a lot more about how impactful the space is from the people with lived experience perspective, not the plan perspective and operational perspective. So yeah, I kind of think about it in those two ways.”
“One of our measures of success is how our members feel after the meeting. We have that post member advisory committee survey that we’re doing and we ask ‘how confident are you that your feedback is going to be used to drive change in our programs and our policies?’ I think we’ll have a better understanding of how we’re doing probably once we’ve gotten a few more surveys under our belt, but we’re hoping to do that each quarter. And then we’re also giving members the chance to share open-ended feedback by asking ‘how would you want this feedback process to work to make you feel more confident?’”