Drawing from interviews with representatives from the Research Council of Finland, the Research Council of the Faroe Islands, Formas and Forte from Sweden, Innovation Fund Denmark, the Independent Research Fund Denmark, the Icelandic Research Fund Rannis, the Norwegian Research Council, and Vetenskapsrådet from Sweden, this chapter presents a collective analysis of their methodologies, philosophies, workflows, and definitions. It also explores the specific challenges they face and the future directions they envision. Through this examination, the chapter aims to provide a further understanding of the strategies employed across the Nordic region and the common themes that emerge.
3.1 Research impact
Defining research impact itself presents multiple challenges, reflective of the broader issues in the global academia–society interface, some of which are philosophical in nature, others epistemological and methodological.
One of the foremost challenges in defining research impact is its inherently complex nature. Research impact is seldom limited to academic advancement; it extends to societal, economic, and policy influences. This multifaceted character makes it difficult to create a universally applicable definition. For instance, while academic impact can be, and often is, assessed quantitatively through publications and citations, societal impact might involve capturing changes in public policy, improvements in health outcomes, or technological advancements. The diverse nature of these “impacts” requires a broad lens for evaluation, which complicates the task of establishing a clear and concise definition. Further, the difficulty of quantifying certain types of impact, particularly those that are intangible or qualitative, adds to the challenge.
While quantitative metrics like citation indices, patent counts, or economic returns are commonly used to measure certain aspects of impact, these may not adequately capture the broader societal, cultural, or environmental influences of research. Qualitative assessments, in turn, though valuable in capturing these aspects, often lack the objectivity and standardisation of quantitative methods, leading to challenges in consistent and comparable impact evaluation. Another challenge is accounting for the long-term effects of research, caused by the inherently stochastic processes in science and the time-lag this may introduce before any observable change, influence, or impact. Many research outcomes, particularly those related to societal and policy changes, unfold over extended periods. This delay poses a problem for immediate assessment and reporting, as the full extent of the impact may not be apparent for years or even decades. For example, research in climate change or public health might take a long time to manifest tangible societal benefits. Consequently, assessing and reporting on the immediate impact of such research may be misleading, or incomplete at best.
Adding to the challenge, attributing specific outcomes directly to particular research activities is a complex task. This is especially true in collaborative and interdisciplinary research efforts where multiple entities contribute. Determining the exact impact of one research project or researcher within a larger collaborative framework is next to unachievable.
Furthermore, in fields where progress is incremental and built on a wide array of preceding work, pinpointing the specific impact of a single research initiative becomes even more problematic. Thus, some choose to forgo demonstrating causal relationships from research to impact, instead opting for plausible “contributions”, “influences”, or “indicators”. The definition of impact is also subject to varying interpretations by different stakeholders. For some parts of academia, impact might be measured primarily in terms of scholarly contributions, while industry stakeholders might prioritise commercial viability or technological innovation. Policymakers, on the other hand, may focus on societal and policy changes. This variation in expectations and values among stakeholders leads to subjective interpretations of what constitutes a significant impact. Further, the dynamic nature of research fields and societal needs means that the definition of impact is constantly evolving. What is considered impactful today might change in the future, as new societal challenges emerge and research priorities shift. This evolving landscape necessitates a flexible and adaptive yet robust approach to defining and measuring impact, capable of accommodating new forms of research outputs and their influences while simultaneously providing clear metrics that are comparable across fields and across time. These challenges call for a multifaceted, sufficiently flexible approach that accommodates different dimensions of impact, recognises the long-term effects of research, navigates the attribution complexity, respects stakeholder diversity, facilitates trust and transparency, and evolves accordingly. Addressing these challenges effectively is important for developing a comprehensive and accurate understanding of, and ways to demonstrate, the value of investments in science and research.
3.2 Motivations
The motivations and goals of Nordic research councils and institutions in measuring research impact are rooted in several key objectives, similar across all respondent organisations.
Foremost is the aim, and sometimes the requirement, to ensure accountability and transparency in the use of public funds by demonstrating the value and effectiveness of research investments. Some report increasing pressure on this front as economic restraints become ever more prominent. Impact measurement data are also perceived to have the potential to guide policy development and strategic decision-making within the research councils and governmental bodies.
By evaluating impact, these entities aim to encourage high-quality research and sometimes direct funding towards projects promising societal or academic benefits and/or facilitate impact potential among projects. The hope is that this process aids in fostering collaboration between academia, industry, and other sectors, while showcasing the benefits of research to policymakers and the public.
3.3 Methodologies
The Nordic countries employ varied models for measuring and reporting on research impact, reflecting their specific priorities but also, and perhaps mainly, their resources. However, the workflows also share many similarities across the countries.
3.3.1 General observations
Most of the respondent organisations fund research through thematically directed calls and/or evaluate research efforts funded by other entities in their respective countries. In the call applications, data on projects’ expected impact are gathered and sometimes accounted for in the evaluations. This information is mostly descriptive and partially hypothetical, written from the researchers’ expert perspective. Further, some form of final report is generally expected from all funded projects. These reports typically include a description or narrative of the project and a bibliography of its results, activities, and outcomes. In some cases, projects are further expected to report activities and outcomes in a quantitative format, sometimes throughout the funding period: scientometrics, bibliometrics, and sometimes different aspects of altmetrics. How these data are then utilised, if at all, varies greatly, from systematic reporting to stakeholders and use in process iteration to simple archiving.
While some approaches to measuring research impact, especially in field-wide reports, take a top-down approach, where data are collected and collated by the research councils themselves or by outsourced third parties, many formats rely on researchers self-reporting their activities and outcomes. Apart from the biases and omissions typically encountered in self-reported formats, a challenge here lies in the researchers’ reporting burden: documenting and communicating the impact of one’s research is a complex and time-consuming process. Researchers are often required to provide detailed reports and evidence of their research’s impact, which adds workload and takes valuable time away from their primary research activities. The challenge is to balance the need for thorough impact assessment with the practical workload limitations faced by researchers, underlining the need for meaningful and valid operationalisations of research impact to report on.
Regardless of methodology, ensuring the accuracy, relevance, and comprehensiveness of the data collected is another notable challenge. In self-reporting there is an inherent risk of subjective bias, misinterpretation, or overlooking important impact aspects, particularly those that are less tangible or longer-term. Additionally, the variability in data collection methods across different fields and institutions may lead to inconsistencies, making comparative analysis challenging. These factors collectively contribute to the complexity in ensuring high-quality data in collated research impact assessments.
In contrast, qualitative assessments and case study approaches are emphasised by some respondents for their effectiveness in capturing a more nuanced view of research impact, allowing a more in-depth understanding, particularly in areas that are less quantifiable. Nonetheless, they too present challenges in terms of data quality: the potential for subjective bias in narrative descriptions, difficulties in standardising and comparing qualitative data, and the time-intensive nature of conducting detailed case studies. Even so, these qualitative approaches are valuable for providing a more comprehensive view of research impact.
None of the respondents reported a systematic or broad awareness of how other Nordic countries work with impact assessments. This knowledge was deemed to rely on specific contact points on the individual level.
3.3.2 Denmark
The Innovation Fund Denmark (IFD) focuses on fostering growth and employment alongside addressing key societal challenges through its investments. IFD targets sectors such as green technology and innovation, life science, health and welfare technology, and digitalisation, technology, and innovation, aiming to stimulate innovation and technological advances, interdisciplinary alliances, thriving entrepreneurship, and research excellence. The IFD’s approach to impact measurement is currently evolving rapidly. Since its inception in 2014, the IFD has undergone various changes in leadership and location, which have influenced its focus on impact analysis: an initially significant emphasis on elaborate impact analyses has shifted along with these leadership changes and structural relocations.
Currently, their work involves building an impact model across all programs of the fund, focusing on integrating theories of change, monitoring models, and evaluations to influence future investment choices. This approach indicates a shift towards a more structured and systematic method of impact assessment, aiming for a balance between operational feasibility and the fund’s long-term goals, such as growth, employment, and addressing other societal challenges.
The IFD recognises the complexity of measuring societal impacts and the challenge of operationalising these measurements, especially when the societal challenges addressed by research are not precisely defined and may emerge bottom-up from applicants. This makes program evaluation and the aggregation of individual project impacts challenging. Another significant consideration is the alignment of long-term goals with short-term measurements, a common challenge in impact assessment.
The IFD has an ongoing effort to find meaningful short-term measurements that can indicate progress towards long-term impacts, balancing the need for immediate results with the recognition that significant impacts, particularly in research and innovation, often manifest over extended periods. Further, the IFD is in the process of developing more structured and systematic approaches, drawing from Theories of Change, acknowledging the need for both quantitative and qualitative measures, and is exploring ways to operationalise these assessments in a manner that aligns with their strategic goals and the realities of the funding and political environment.
At present, the IFD lacks the infrastructure, in terms of resources and integrated systems, for collecting data and reporting. Nonetheless, the IFD now has the opportunity to develop bottom-up, first producing a robust and valid theoretical framework, followed by the development of the necessary infrastructure for data management.
The Independent Research Fund Denmark (DFF), on the other hand, focuses mainly on basic research, which is reflected in their comprehensive approach to research impact as well.
DFF considers research impact to be of significant importance and has been collecting data on research impact for almost three years, with the process now in the analytical phase. The DFF has published their views on how to define and measure research impact in their report titled “Five Ways to Research Impact”, which outlines impact across cultural and societal, industry and business, policy and legislative, as well as scientific and educational domains.
The impact measurement process at DFF begins after a project ends, with researchers required to submit a final report that includes their impact metrics, consisting of a broad range of questions. Reports are submitted through DFF’s own reporting portal. This report is followed up three years later to assess any new impacts that have emerged. In the report, DFF asks researchers for both qualitative and quantitative data regarding the impact of their projects. The metrics are defined by the fund’s strategy and cover a range of areas including career advancement, additional funding received, academic impacts (e.g., presentations at meetings), and impacts in the public sector (e.g., analysis used in public sector decision-making). These data are collated into a simple database that feeds a dashboard built in Power BI, providing at-a-glance analytics for the user.
DFF acknowledges challenges such as ensuring researchers are aware of what impact they contribute to, dealing with missing reporting, and varying individual definitions of impact. They emphasise the importance of their approach of collecting data both immediately after project completion and three years later to capture longer-term impacts and are currently focusing on capturing the non-linearity and stochasticity of the impact process. Further, DFF plans to start publishing “impact-reports” on their findings and continue refining their approach to impact measurement.
3.3.3 Finland
The Research Council of Finland’s approach to research impact assessment is notably comprehensive. The Council has developed a strategy that blends both qualitative and quantitative methods to evaluate research impact, emphasising the importance of capturing a wide range of factors, from academic advancements to societal influences. The standing definition of what the Council calls societal impact is “the ways in which research contributes to developments in society and to dealing with social issues”; impact is a complex phenomenon that arises in interaction between research data and other factors, often over a long period of time.
For funding applications, like the Flagship Programme for example, the Council requires a detailed research and impact plan that encompasses aims, objectives, implementation strategies, and expected societal effects beyond academia. Applicants are encouraged to self-assess the expected societal impact of their research, acknowledging the potential for science to contribute to prosperity, policymaking, skill development, and the broader development of society.
Furthermore, the Strategic Research Council (SRC) within the Research Council of Finland provides funding for long-term and program-based research, aimed at addressing major challenges facing Finnish society. The SRC’s projects are selected based on scientific quality, societal relevance, and impact, emphasising the Council’s commitment to research that seeks concrete solutions to societal challenges. Monitoring funded projects and evaluating their impact are indeed part of the statutory duties of the SRC (Act on the Research Council of Finland, section 5 b). The SRC operates under a governance model that includes strategic decision-making and impact assessment, reflecting a structured approach to fostering impactful research.
While the project reports are the Council’s main source of scientific impact measures, their methodology includes employing a variety of metrics to assess outputs, collaboration, and researcher mobility, alongside a self-evaluation component for projects to gauge their societal impact. This multidimensional approach allows for nuanced and thorough analysis of the consequences of research activities, even though their theoretical framework regarding impact faces the same definitional and epistemological challenges as their peers’. Moreover, the Council engages in periodic interim evaluations of specific funding instruments, like the Flagship Programme, sometimes involving external reviewers for more specialised analysis. This practice may facilitate objectivity and diversity in evaluation.
In addition to these practices, the Council produces reports on the state of scientific research in the country. These reports incorporate both quantitative and qualitative analyses, providing a comprehensive view of research trends, outputs, and impacts. This approach of regular reporting serves a proactive and continuous effort to monitor and evaluate the research landscape in Finland.
The Council’s approach to impact assessment is not without its challenges. The respondent highlights concern regarding the quality of data and reporting, signalling an ongoing effort to refine and improve the processes and tools used for impact assessment. To address these challenges, the Council aims to enhance its reporting system, targeting a more structured and efficient process.
The Council’s approach to research impact assessment is thus characterised by a blend of diverse methodologies, combining quantitative and qualitative assessments, and a continuous effort to improve and refine impact measurement practices. This strategy reflects a drive to understand the nature of research impact and a commitment to capturing a broad spectrum of research outcomes, from scientific contributions to societal benefits.
To summarise, the Council measures research impact through project reports, including basic measures for outputs, collaborations, mobility, and self-evaluation of societal impact. They use bibliometric studies for scientific impact and qualitative measures for societal impact. The Council is developing a structured reporting system to better report societal impact alongside scientific impact. Challenges include improving the quality of reporting data and developing tools for analysing impact. They aim to improve data quality and dissemination of research impact to validate public investment in research.
3.3.4 Iceland
In Iceland, the Icelandic Centre for Research, Rannís, is responsible for administering domestic grants in research and innovation, and as such manages the only comprehensive bottom-up research fund in Iceland. It is hence primarily a research funding organisation. Rannís’ approach to research impact assessment is marked by a pragmatic and resource-conscious stance. While there is an acknowledged legal obligation to collect and report on data on research outputs and impact, Rannís faces challenges in systematically capturing and utilising this data, largely due to resource constraints and a stated lack of coherent digital infrastructure.
The current practice involves collecting progress and final reports from funded projects, which provide some degree of insight into outputs, outcomes and impact. However, these reports are not always systematically processed or analysed thoroughly, mainly due to the lack of resources. This has led to a situation where the data may reside in files without much further utilisation, systematisation, or analysis.
Rannís supports occasional external third-party impact assessments, especially for large-scale reviews, indicating a willingness to focus on impact measurement but also a reliance on external expertise for more comprehensive analysis.
Despite these challenges, there is a clear recognition of the need for valid impact measurement and reporting. Rannís is exploring digitalisation and online systems for better data management, indicating a move towards more structured and systematic impact assessment methods in the future. Rannís’ approach highlights the challenges smaller research councils and organisations face in effectively measuring and reporting research impact. It underscores the need for resource optimisation and the potential benefits of leveraging digital tools and cooperation to enhance impact assessment capabilities.
In summary, Rannís collects progress and final reports, which include typical impact parameters, but lacks a systematic approach to utilising this information effectively. Rannís supports external impact assessments by contracting third parties, which is considered an efficient method. However, due to limited resources, Rannís’ ability to capture and act on impact data is described as underdeveloped and under-resourced, highlighting the need for more structured approaches and more resources for impact measurement.
3.3.5 The Faroe Islands
The Faroe Islands’ Research Council’s approach reflects a pragmatic perspective similar to the Icelandic one.
In their methodology, funding applicants are required to include expected impacts within their project proposals. These are evaluated by external experts, emphasising a forward-looking assessment of potential outcomes. Projects are expected to provide annual and final reports, which include information on impacts. However, the approach is not systematic or standardised across all programs, highlighting the council’s resource constraints and the challenges in implementing a comprehensive impact assessment strategy.
Furthermore, the Council occasionally conducts evaluations of specific programs or initiatives to assess general societal impact, relying on qualitative rather than quantitative assessments. The Council also engages in outreach activities to communicate research impacts, showcasing success stories to illustrate the practical outcomes of funded research. This approach, while less formalised, underscores a priority to identify and communicate the benefits of research, particularly to the wider community and stakeholders.
3.3.6 Sweden
The approach to research impact measurement in Sweden, as represented by Formas, Forte, and Vetenskapsrådet, showcases a mix of traditional and progressive understandings of research impact. It is notable, however, that Formas, Forte, and Vetenskapsrådet all have different mandates which naturally are reflected in their aims and methods.
Formas, focusing on sustainable development, has evolved to include funding for both the private and public sectors. They emphasise the impact of their funded research and innovation on societal progression towards sustainable development. Their strategy includes a mix of qualitative and quantitative methods, with recent advancements in data-driven approaches, such as successfully integrating Power BI into their workflow, facilitating the aggregation and analysis of impact data. However, while Formas is dedicated to fostering conditions that enable projects to have an impact, they consider measuring impact challenging.
Formas has a broad and evolving approach to measuring research impact, focusing on both quantitative and qualitative assessments. They use Power BI to aggregate and analyse data from project reports, allowing them to visualise the impact in various dimensions, such as policy influence, scientific publications, and societal engagement. Formas also stress the importance of maintaining a balance between data-driven approaches and recognising the nuanced, qualitative aspects of impact, acknowledging the challenges of directly linking specific research projects to societal changes. This approach reflects an adaptive strategy towards understanding and enhancing the impact of research on sustainable development.
Forte considers impact to mean “that research results are used by decision makers and professionals in different organisations, in practices and policy to ultimately improve the lives of people and our society”. In emphasising societal impact, they face challenges in systematically measuring and reporting it. To counter these challenges, they are exploring the potential of AI and digital tools for future impact evaluation.
However, and quite similarly to Formas, Forte focuses mainly on facilitating research impact rather than directly measuring and reporting on it. Their approach emphasises creating good conditions for impact by requiring researchers to ensure their work is relevant to society and has a clear plan for implementation.
As Forte sees itself mainly as a facilitator, it encourages collaboration between research and society and the integration of impact considerations from the start of projects. The complex task of measuring impact is acknowledged; however, current efforts focus more on setting the right conditions for impactful research than on quantitatively measuring outcomes and analysing their impact.
Vetenskapsrådet, balancing academic excellence with societal relevance, employs a diverse set of methodologies, including a model inspired by the Research Excellence Framework in England as well as case studies written in collaboration with researchers, which have so far received positive responses from the researchers. Vetenskapsrådet’s organisational statutes stipulate that it ought to “evaluate the quality and significance of research”, where “the significance of research” is interpreted as the societal impact or societal significance of research.
In all, Sweden seems to be moving towards structured, data-driven methodologies, balancing the need for quantitative data, as at Forte and Formas, with the richness of qualitative insights through Vetenskapsrådet’s incorporation of case studies.
3.3.7 Norway
The Research Council of Norway plays a central role as a tool for the Norwegian government in shaping research and innovation to address societal challenges and enhance sustainability. It functions as the government’s key advisor on research policy, managing substantial annual funding for research and innovation projects. The Research Council encourages applications from research organisations, companies, and public sector entities, supporting a wide range of fields through its project portfolios.
The approach of the Research Council of Norway to research impact measurement is both comprehensive and systematic. The Council conducts subject-specific evaluations of Norwegian research every 10 years, which include qualitative assessments alongside traditional scientometrics. Since 2015, societal impact has been integrated into these evaluations, indicating a growing emphasis on broader research outcomes.
The Norwegian approach underscores the importance of both immediate and long-term research impacts, balancing the need for more immediate, measurable results with the recognition that significant impacts, especially in societal and policy domains, often materialise over extended periods. This method of assessment demonstrates Norway’s aim for a thorough and evolving understanding of research impact, aligning with the country’s broader research and innovation goals.
The Council conducts comprehensive evaluations of all research fields every ten years. Additionally, impact assessment has been integrated into funding proposal evaluations since 2019, covering both research and societal impact. The Council uses a logic model of its own for planning and describing expected outputs, outcomes, and impacts, emphasising the contribution of research to societal benefits.
3.4 Challenges
The challenges in research impact measurement, as explored through these interviews with Nordic research institutions, encompass several key areas.
Firstly, there is a profound complexity in measuring societal impact; quantifying these impacts and attributing them directly to specific research activities pose significant challenges. Secondly, resource constraints are a notable issue, especially for smaller organisations like those in Iceland and the Faroe Islands. Thirdly, these feed into the difficulty of establishing and committing to any digital infrastructure and workflow for extended periods, even though such an infrastructure would facilitate the reporting, collection, and presentation of different forms of data.
These constraints impede the ability to conduct thorough and systematic impact assessments. Another major challenge is the time-lag before impact materialises, particularly in societal and policy areas, where significant effects often unfold over extended periods, complicating immediate assessment and reporting. Additionally, councils struggle with balancing quantitative data and qualitative insights to provide a comprehensive view of research impact. Ensuring high-quality, systematic data collection and analysis remains an ongoing challenge, further complicated by the evolving nature of research fields and societal needs.