Intuitively, we know that it makes sense to involve people with lived experience of the challenge we want to address. But how do we know whether we have run a consultative process well?

Recently, I have been struggling with the lack of depth of the standard participatory processes that we use in social and behaviour change (SBC) programmes. Creating solutions that align with individuals’ needs, experiences, challenges, and aspirations is not merely a moral obligation; it is also essential for discovering methods of ‘doing development’ that are effective. One of the key challenges I face is that there is no clear, evidence-based process outlining how, when and under which circumstances participation is sufficient.

That was, until I came across a spectacular publication from Project Evident titled Experts by Experience: How Engaging People with Lived Experience Can Improve Social Services. One of the things that caught my attention about this highly engaging read is how it points out the lack of evidence for the effects and efficacy of leveraging “lived experience” in policymaking and programme design. Comparative studies on the impact of lived experience are scarce, and little literature has been compiled in this specific area. While there is a growing movement to involve affected communities and people with lived experience (PLEs) in programme and policy development, more research is needed to understand how and why engaging lived experience works and what effects it has on outcomes of interest.

The paper goes on to highlight that, although ‘rigorous’ quantitative evidence is lacking, there is plenty of high-quality qualitative evidence, along with intuitive, logical and ethical considerations, supporting the value of engaging PLEs.

Similarly, the Quality and Standards Framework, which focuses on human-centred design (HCD) in adolescent sexual and reproductive health (ASRH), states that HCD should be used in ASRH programmes to ensure that interventions are user-centred, evidence-based, and responsive to the needs and aspirations of young people. This promotes collaboration, customisation, and continuous improvement, ultimately leading to more effective and impactful interventions in ASRH.

What are the key requirements for good engagement?

Both documents propose a set of principles to follow towards meaningfully engaging communities (see the table below for a summary).

Experts by Experience focuses on the importance of treating community knowledge as a form of expertise, which requires an intentional investment of resources and a mindset shift. It states that organisational leadership is essential for the success of such initiatives, and that PLEs involved in policy and programme development should reflect the community’s diversity. I think this point is particularly overlooked in many SBC programmes, and we often end up with a homogenous representation of the community. It also calls for creating infrastructure and roles that enable meaningful power-sharing.

The Quality and Standards Framework, which was designed for ASRH programmes, recommends the engagement of youth as design partners and the equitable inclusion of diverse subsets of young people. It calls for the development and implementation of safeguarding plans for young people and supports an iterative approach to programme design and implementation. It also suggests integrating primary and secondary learnings from evidence, engaging the ecosystem of influencers, and integrating disciplines crucial for adolescent wellbeing. Lastly, it advises documenting methods and key design decisions for transparency and accountability.

Adapting Arnstein’s Ladder of Citizen Participation for SBC

I was introduced to Arnstein’s Ladder of Citizen Participation for the first time through the Experts by Experience paper. I find that it is a useful quick framework to help us gauge the degree to which we are meaningfully engaging with communities in the SBC design process. I’ve proposed an adaptation of it for SBC programmes in the table below.

Could this guide us towards a structured approach for assessing the level of community involvement in SBC programmes? At the highest level, “Citizen Control”, communities independently lead programmes with full decision-making authority. “Delegated Power” and “Partnership” denote significant community influence on programme decisions, either through majority control or collaborative governance. In contrast, “Placation”, “Consultation”, and “Informing” indicate lower degrees of participation, where community input may be sought but is not necessarily instrumental in shaping outcomes. Here, I’m afraid, is where many SBC programmes find themselves. The lowest rungs, “Therapy” and “Manipulation”, represent nonparticipation, where community engagement serves to validate the programme without substantial input or power. I still come across too many programmes that fall into this category. Therefore, if we want SBC programmes that strive for authentic engagement, empowering communities to drive meaningful change, then we need to re-evaluate how we are engaging communities.

To conclude, we have tools that can help us understand whether we are engaging communities properly in programme design, implementation and evaluation. The question is: to what extent are we following the existing guidelines? We can reflect on our SBC processes using the adapted Ladder of Citizen Participation and improve our practices where they are lacking.

Where do you think SBC programmes lie on the participation ladder? And how can we improve the quality of our community participation processes in SBC? Let me know in the comments!