Foresight Diamond
Important Note: If you wish to use the information provided on this page, please reference it as follows: Popper, R. (2008) Foresight Methodology, in Georghiou, L., Cassingena, J., Keenan, M., Miles, I. and Popper, R., The Handbook of Technology Foresight: Concepts and Practice, Edward Elgar, Cheltenham, pp. 44-88.
The Foresight Diamond (Popper, 2008) is a framework that positions methods according to their main type of knowledge source: creativity, expertise, interaction or evidence. It is important to emphasise, however, that these domains are not fully independent of one another. The four sources of knowledge are as follows:
Creativity-based methods normally require a mixture of original and imaginative thinking, often provided by technology “gurus”, via genius forecasting, backcasting or essays. These methods rely heavily on (a) the inventiveness and ingenuity of very skilled individuals, such as science fiction writers, or (b) the inspiration which emerges from groups of people involved in brainstorming or wild card sessions. As Albert Einstein once stated: “The only real valuable thing is intuition … Imagination is more important than knowledge. Knowledge is limited. Imagination encircles the world” (Einstein, as quoted by Viereck, 1929).
Expertise-based methods rely on the skill and knowledge of individuals in a particular area or subject. These methods are frequently used to support top-down decisions, provide advice and make recommendations. Common examples are expert panels and Delphi, but methods like roadmapping, relevance trees, logic charts, morphological analysis, key technologies and SMIC are essentially based on expertise. A warning note about expertise is sounded by Arthur C. Clarke (1962, p. 14): “If an elderly but distinguished scientist says that something is possible, he is almost certainly right, but if he says that it is impossible, he is very probably wrong”.
Interaction-based methods feature in foresight for at least two reasons – one is that expertise often gains considerably from being brought together and challenged to articulate with other expertise (and indeed with the views of non-expert stakeholders); the other is that foresight activities take place in societies where democratic ideals are widespread, and legitimacy involves “bottom-up”, participatory and inclusive activities, not just reliance on evidence and experts (which are liable to be used selectively!). Scenario workshops, voting and polling are among the most widely used methods here; of course, these often require some sort of expertise to apply the method and inform the interactions. Other methods, like citizen panels and stakeholder analysis, are becoming popular because of their potential contribution to further networking activities. But it is not always easy to encourage participation; as the anonymous saying goes, “the world is ruled by those who show up”.
Evidence-based methods attempt to explain and/or forecast a particular phenomenon with the support of reliable documentation and means of analysis. These activities are particularly helpful for understanding the actual state of development of the research issue. For this reason, quantitative methods (e.g. benchmarking, bibliometrics, data mining and indicators work) have become popular given that they are supported by statistical data or other types of indicator. They are fundamental tools for technology and impact assessment and scanning activities (see Porter et al., 1980). These methods can also be employed to stimulate creativity (sometimes by challenging received wisdom). When used to support workshops, evidence-based information is also quite useful for encouraging interaction and obtaining feedback from participants. A word of warning here, for both practitioners and users, may be the well-known quote attributed to Benjamin Disraeli by Mark Twain (1924): “There are three kinds of lies: lies, damned lies, and statistics” – which basically points out that statistics are sometimes used to mislead the public.
Information technology (IT) tools (see also Futures Diamond) are being applied to most of these approaches, especially interaction- and evidence-based activities. Many applications are available now to support modelling, data mining, scanning, participatory processes, and visualisation – there are even tools designed to facilitate creativity. Use of IT does not always mean more effective application of foresight techniques, however. Salo and Gustafson (2004) identified five factors which need to be met in order to make good use of IT here: a clear mandate from the sponsoring organisation; high-quality process and technical facilitation; presence of senior representatives; presentation of unequivocal information inputs; and sufficient time for informal debate.
The important role of evidence-based methods in foresight was revealed in the first examinations of foresight practices in several hundred cases (mostly from Europe) by Popper et al. (2005, 2007) and Keenan et al. (2006). A more comprehensive and detailed analysis of foresight practices in Europe and other world regions is available in the Mapping Foresight report.
The methodological framework used in a foresight project should be tailored to meet the specific objectives of the project and the resources and capabilities that are available. While the previous section provides advice with respect to the use and selection of methods, this section draws attention to, inter alia, the articulation and combination of methods. Many of the methods described above can be used at different stages in a foresight process, and practitioners should take into account (a) the contribution of each in the context of the study as a whole, and (b) the ways in which individual methods can be combined and synthesised to positive effect.
There is no “ideal” methodological framework providing the “best” combination of methods. In fact, there is no “ideal” number of methods to be used in a project. Popper et al. (2005) took a sample of 130 cases from 15 countries (Austria, Belgium, Czech Republic, Denmark, Estonia, Finland, France, Germany, Italy, Netherlands, Spain, Sweden, Turkey, UK and the USA) and found an average of five to six methods per exercise. Countries such as Turkey and the UK demonstrated a high propensity to mix several methods, whilst others (e.g. Denmark and the USA) tended to exhibit greater conservatism in terms of methodological scope. Whilst this is interesting, the reliability of the results should not be taken at face value given the relatively small number of exercises considered. However, if we assume for a moment that, on average, foresight projects will combine six methods, then, with all 33 methods above as eligible options, a question we might wish to address is: how many possible permutations (i.e. selections of methods in which the order of the methods matters) exist? In other words, in how many ways can six methods from a set of 33 be combined to generate a methodological framework? The answer is simple: using the permutation formula, there are nearly 800 million ways of combining six methods from a set of 33 to build a methodological framework. Having said this, it would be remarkable to find practitioners selecting methods from the vast range available in a random fashion: it is of course always the case that expertise and accumulated know-how in the use of certain methods will provide a rational justification for the selection of a particular combination (see also Popper et al., 2007, p. 25).
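For readers who wish to check the “nearly 800 million” figure, the standard permutation formula for ordered selections of six methods from a set of 33 gives:

\[
P(33,6) \;=\; \frac{33!}{(33-6)!} \;=\; 33 \times 32 \times 31 \times 30 \times 29 \times 28 \;=\; 797{,}448{,}960
\]

that is, just under 800 million distinct orderings of six methods drawn from the 33 listed above.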
To illustrate this, the following outlines present two idealised frameworks showing how different techniques may be combined within an overall methodological framework (using six methods only). The methodology is described in two ways:
- Forward (combining methods in one sequence); and
- Backward (combining methods in reverse order of sequencing).
The role of methods in Methodology X (forward)
The following bullets indicate the sorts of application that may be scheduled for each of the techniques, illustrating how these may be used if Methodology X is carried out in a forward sequence:
- Scanning: detailed analysis of main issues around a particular sector/theme (sub-contracted);
- Delphi: large-scale exploratory study assessing the likelihood of occurrence and possible impacts of issues highlighted by the scanning activity;
- Wild Cards: workshop-type activity aimed at identifying events which may challenge the occurrence of “highly probable” situations in the future;
- Citizen Panels: conference-type activity aimed at identifying major public concerns on critical issues;
- Expert Panels: small group of key stakeholders looking at the future implications of the findings;
- SWOT: internal activity for synthesising outcomes in terms of current strengths/weaknesses and future opportunities/threats.
The role of methods in Methodology X (backward)
The following bullets indicate the sorts of application that may be scheduled for each of the techniques, illustrating how these may be used if Methodology X is carried out in a backward sequence:
- SWOT: large-scale workshop aimed at identifying strengths, weaknesses, opportunities and threats related to a specific sector or industry;
- Expert Panels: groups of experts looking at future implications of SWOT findings and clustering main issues into broader dimensions, e.g. social, technological, etc.;
- Citizen Panels: regional task forces contextualising key issues and evaluating public acceptance;
- Wild Cards: internal activity aimed at identifying disruptive trends and events;
- Delphi: large-scale normative study aimed at formulating policy recommendations;
- Scanning: internal activity aimed at identifying similar policies being implemented in comparable contexts.
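To make the relationship between the two variants explicit, the same framework can be sketched as an ordered list of methods whose reversal yields the backward sequencing. The snippet below is only a hypothetical illustration (the list representation and variable names are not part of the original text); the method names and their roles are taken from the bullets above.

```python
# Hypothetical sketch: Methodology X represented as an ordered sequence of six
# methods. The "backward" variant applies the same methods in reverse order,
# with each method taking on a different role (e.g. SWOT opens the process as a
# large-scale workshop instead of closing it as an internal synthesis activity).

methodology_x_forward = [
    "Scanning",        # detailed analysis of main issues (sub-contracted)
    "Delphi",          # large-scale exploratory study of likelihood and impacts
    "Wild Cards",      # workshop challenging "highly probable" future situations
    "Citizen Panels",  # conference-type activity on major public concerns
    "Expert Panels",   # small group of key stakeholders on future implications
    "SWOT",            # internal synthesis of strengths/weaknesses/opportunities/threats
]

# Reversing the list yields the backward sequencing described above.
methodology_x_backward = list(reversed(methodology_x_forward))

print(" -> ".join(methodology_x_forward))
print(" -> ".join(methodology_x_backward))
```

The point of the sketch is simply that the same six methods can be articulated in opposite directions, with the role of each method (exploratory, normative, synthesising) shifting according to its position in the sequence.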