A Fully-Fledged Evaluation

Evaluating Foresight (Popper et al., 2010)

Reflecting on the journey of advancing foresight evaluation, particularly through the development and application of a set of 20 criteria for a comprehensive foresight evaluation methodology, brings to mind the experiences and insights gained from the evaluation of the Technology Foresight Programme in Colombia back in 2010. That endeavour was not just about assessing a programme; it was an exploration of the depth and breadth of foresight’s impact, particularly on science, technology and innovation (STI) policy.

Foresight often escapes rigorous evaluation precisely because of its forward-looking nature and the complexities involved in measuring its outcomes and impacts. The challenge has been not only to observe foresight’s immediate outputs but to understand its deeper effects on policy, innovation, and strategic direction. In Colombia, we ventured beyond traditional metrics, recognising that foresight’s value lies not merely in its predictive accuracy but in its ability to engage stakeholders, inform decision-making, and shape the trajectory of science, technology, and innovation.

Our approach to evaluating the Colombian Technology Foresight Programme was guided by a triad of evaluation concerns: accountability, justification, and learning. These concerns provided the foundation on which the 20 criteria were developed, covering the efficiency of implementation, the impact and effectiveness, and the appropriateness of the foresight activities. Through these lenses, we sought not only to account for the resources invested and the activities conducted, but also to examine the programme’s rationale, its alignment with the broader policy environment, and its capacity to catalyse learning and improvement in foresight practice.

The process of developing these criteria was an exercise in humility and reflection. It was about recognising the inherent uncertainties and complexities of forecasting the future and the consequent need for a methodological approach that is both rigorous and adaptive. It was a journey that required us to engage deeply with stakeholders, to listen and learn from their experiences and insights, and to continuously refine our approach to better capture the multifaceted impacts of foresight activities.

Applying the 20 criteria to the evaluation of the Technology Foresight Programme in Colombia revealed the programme’s nuanced impacts, from fostering networks and collaborations to influencing policy directions and innovation trajectories. This application was not just an academic exercise; it was a practical initiative that provided valuable insights into how foresight can be effectively conducted, evaluated, and, most importantly, utilised to inform and shape future-oriented policies and strategies.

In retrospect, the development and application of these criteria represents a significant milestone in my journey in foresight evaluation. It underscored the importance of a systematic, comprehensive approach to evaluation that goes beyond measuring outputs to understanding outcomes and impacts. It also highlighted the critical role of engagement and alignment with the policy and innovation ecosystems in which foresight operates.

This experience, while specific to the Colombian context, offers broader lessons for the foresight community. It demonstrates the value of a methodological approach to evaluation that is adaptable, reflective, and grounded in the realities of foresight practice. As we continue to navigate the complexities of the future, such approaches to evaluation will be indispensable in ensuring that foresight remains a relevant, effective tool for guiding innovation and policy in an ever-changing world.

We considered the following 20 criteria to evaluate the programme:

  • Criterion 01: Appropriateness and level of achievement of objectives
  • Criterion 02: Performance of the management and funding mechanisms
  • Criterion 03: Justification of the programme in terms of value for money
  • Criterion 04: Effectiveness and efficiency of the organisational structure
  • Criterion 05: Effectiveness and efficiency of the approaches and methods
  • Criterion 06: Effectiveness and efficiency of implementation and aftercare
  • Criterion 07: Level of capacities and foresight culture achieved
  • Criterion 08: Level of national, sub-national and international presence
  • Criterion 09: Level of commitment of participants
  • Criterion 10: Level of novelty and impact of projects
  • Criterion 11: Impact on public and private policies and strategies
  • Criterion 12: Impact on agendas of STI programmes and institutions
  • Criterion 13: Impact on the consolidation of research groups
  • Criterion 14: Impact on the consolidation of S&T capacities
  • Criterion 15: Impact on other (inter)national projects
  • Criterion 16: New products and services (publications, courses, etc.)
  • Criterion 17: New policy recommendations and research agendas
  • Criterion 18: New processes and skills (management, research)
  • Criterion 19: New paradigms or scientific/technological developments
  • Criterion 20: New players (e.g. sponsors, collaborators, networks)
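To make the structure of the framework concrete, the sketch below shows one way the 20 criteria could be organised in code and rolled up into scores for the three dimensions discussed above (efficiency of implementation, impact and effectiveness, appropriateness). It is a minimal illustration only: the mapping of criteria to dimensions, the 1-5 rating scale, and all names (`Assessment`, `dimension_scores`) are assumptions made here for clarity, not the instrument used in the actual evaluation.

```python
from dataclasses import dataclass
from statistics import mean

# Illustrative (assumed) grouping of the 20 criteria under the three
# evaluation dimensions; the actual mapping used in the Colombian
# evaluation may differ.
DIMENSIONS = {
    "efficiency_of_implementation": [2, 3, 4, 5, 6],
    "impact_and_effectiveness": list(range(7, 21)),
    "appropriateness": [1],
}

@dataclass
class Assessment:
    """A rating and supporting evidence for one criterion (assumed 1-5 scale)."""
    criterion: int
    score: float
    evidence: str = ""

def dimension_scores(assessments: list[Assessment]) -> dict[str, float]:
    """Average the criterion ratings within each dimension that has data."""
    by_criterion = {a.criterion: a.score for a in assessments}
    return {
        dim: mean(by_criterion[c] for c in criteria if c in by_criterion)
        for dim, criteria in DIMENSIONS.items()
        if any(c in by_criterion for c in criteria)
    }

if __name__ == "__main__":
    sample = [
        Assessment(1, 4.0, "objectives broadly appropriate and achieved"),
        Assessment(2, 3.5, "management and funding mechanisms workable"),
        Assessment(11, 4.5, "clear influence on STI policies and strategies"),
    ]
    print(dimension_scores(sample))
```

The point of the sketch is simply that the criteria are not a flat checklist: grouping them by dimension lets an evaluation team aggregate evidence at the level where accountability, justification, and learning questions are actually answered.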