Blog 3: Monitoring, evaluation and learning from a Transformative Innovation Policy test project

Estimated Time
30 minutes

Top Takeaways on Conducting a TIP Experiment:

For Policy Teams:

  • Ease into the theory with more activities and time to build trust and rapport with the TIPC project team, so that the theory lands more easily and people can ‘locate’ themselves within it. At the beginning it can feel like ‘a roller-coaster in a thunderstorm!’ Prepare for the change in perspective and for not doing ‘business-as-usual’.
  • Trust and good collaboration emerged midway through the process, but at the beginning, as with many new approaches, the work felt daunting and uncertain. Be open to this, and build reflexive sessions for the team into the project.
  • The prototyping experience and learning can be carried into other key projects. For example, South Africa is applying it to its National Strategy for Natural Capital Accounting (NCA).

For TIPC research teams:

  • For stakeholder management, hold committed, deeper conversations and workshops with partners to discuss their needs, mutual expectations of the collaboration, their perceptions of TIPC, and their experiences with MEL throughout the process.
  • Clarify the role of the TIPC team (partners, not consultants) and the purpose of MEL as co-creation towards transformative outcomes.
  • Dedicate more time to translating concepts, brokering activities, and overall accessibility, making repeated connections to project partners’ work and contexts.
  • Translate theoretical concepts into something meaningful to partners, and emphasise a two-way learning process.
  • Add an internal ‘team learning’ strand alongside the project’s learning strand.
  • Involve the same people across various policy experiments to support learning and facilitate the transfer of knowledge and understandings.
  • Use and understand theoretical concepts in similar ways across projects; generalisation helps identify common challenges in different contexts.
  • Develop generalisable tools that can be used beyond the TIPC lifetime.

“It was like a roller-coaster in a thunderstorm! Once… familiarity grew, it was easier to talk about concepts. I found the practitioner-academic dynamic so fascinating. I’m so grateful for this experience.”

Evaluation comes at the end, right? Along with the learning and the outcomes of the project? Not with the TIPC methodology and approach, which rejects this linear conceptualisation of MEL. The TIPC approach looks at doing things differently – not ‘business-as-usual’. Across the Consortium’s policy and programme experiments, we think again about how to monitor, evaluate and learn from projects, overlaying the ‘TIPC methodology’.

Together, researchers and policymakers put policy practice under the microscope to examine and co-create transformative experiments that help tackle the Sustainable Development Goals (SDGs). The newly-devised ‘TIPC Methodology’ is a multi-step learning journey that utilises Formative Evaluation as a springboard towards Transformative Outcomes. Here’s how the South Africa water project team found the new approach.

With the TIPC methodology, Monitoring, Evaluation & Learning (MEL) happens continuously and in situ, not at the final stages. The MEL strands interweave throughout, making up the strength of the methodology. MEL is not completed by a separate team hovering over the project, dissecting it at the end with a tick-box checklist, but by those running, doing, living, adapting and evolving the project, programme or policy. And so, the findings have immediate application and action.



Bloomfield, G; Daniels, C (2021) Blog 3: Monitoring, Evaluation & Learning (MEL) from a Transformative Innovation Policy test project: Reflections from South Africa’s Water and Biodiversity Programme
