Mistra’s evaluations

Mistra’s research programmes are evaluated in the application phase, at the end of the first phase and, where needed, at any point during the course of a programme.

    According to the Statutes, Mistra should ensure ‘that the research funded by the Foundation is evaluated regularly’ (Article 19).

    Mistra’s research programmes are evaluated in different ways and on different occasions. Evaluation is always carried out in the application phase and at the end of the first phase (the midterm evaluation). Evaluations can also take place at any time in the course of a programme, sometimes as part of a thematic evaluation covering several programmes.

    The purpose of the evaluations is to safeguard the quality of research programmes and other activities that Mistra funds. The evaluations give the Board and management a vital basis for their decisions.

    The evaluations, which usually take the form of peer review, focus on scientific quality and how the research benefits society. Other types of evaluation, covering such aspects as leadership and organisation, are also carried out. Mistra also continuously evaluates, for example, its own asset management and various communication efforts.

    Evaluation criteria

    Mistra usually applies the following criteria:

    • Approach, i.e. the extent to which the programme is based on a solid, coherent concept with an innovative focus, how well the goals are formulated and how well the anticipated effects (including indicators) are described.
    • Scientific quality, i.e. how well the programme meets the requirements of advanced scientific competence, theoretical sophistication and methodological quality.
    • Benefits, i.e. how well developed collaboration with users of research results is (and is expected to be), and what supportive communication processes and methods are used to achieve effective implementation.
    • Management and organisation, i.e. how the programme will be integrated into the host organisation, managed and organised, and how efficiently resources will be used.

    In addition, an evaluation may include one or more ad hoc criteria, such as ‘contributions to Sweden’s competitiveness’ or ‘establishing strong research environments’. For special evaluations, other criteria may apply.

    Impact of research on society

    It is difficult to predict exactly what results a research programme will generate or what impact the research will have. From an evaluation perspective, it is nonetheless important to try to formulate assumptions about how the results will be used. All applications should therefore contain a section on the anticipated effects on society. In the language of evaluation this is known as ‘programme theory’ or ‘impact logic’, and it goes beyond simply measuring the degree of target fulfilment. Applicants submitting programme proposals to Mistra may choose their own way of presenting the intended impact logic, but the following is one example of a possible breakdown:

    • Results, such as scientific articles, reports, policy briefs
    • Outcomes, such as shared understanding, systems awareness and consensus on the nature of a problem
    • Effects on society, such as a change of course, transformational change and institutionalisation
    • Impact, i.e. long-term changes in society.

    Terms of reference for evaluations

    For each evaluation, terms of reference are drawn up. As well as defining the background and purpose of the evaluation, they list the criteria to be applied. The terms of reference also describe the review process as a whole, including the time schedule. Terms of reference are produced by Mistra’s secretariat and adopted by the CEO, who also decides on the composition of the evaluation panel.

    For questions about Mistra’s evaluations, contact:

    Johan Edman, Programmes Director, Mistra

    johan.edman@mistra.org