Beyond impact factor, h-Index and university rankings: Evaluate science in more meaningful ways

Conference report

Renowned experts highlighted the limitations of metrics in capturing scientific quality, the resulting pressure on the quality of scientific output, and approaches that challenge conventional metrics. The long-term implications for the Swiss science landscape were the subject of a stakeholder roundtable and a discussion with the audience.

[Image: Conference report "Beyond impact factor, h-Index and university rankings". Image: SCNAT]

The Swiss Academy of Sciences (SCNAT) organised this international conference on behalf of the Swiss Academies of Arts and Sciences, as part of its "We Scientists Shape Science" initiative.

The following points emerged from the presentations and discussion:

  • Current metrics do not measure scientific quality and therefore do not reflect it
  • The h-Index and Journal Impact Factor are pseudo-objective and unhealthy for research
  • Broader “portfolio” approaches are needed that also take into account criteria such as outreach, managerial, mentoring or teaching skills
  • Defining meaningful and discipline-specific indicators requires co-design by producers and users of evaluation results
  • Enhancing science assessments needs changes in research culture
  • Renewing the social contract between science and society needs an agreement on the goals to be achieved by science and on the indicators to measure its contribution
  • Changing metrics requires concerted global action by science stakeholders, particularly in relation to the private-sector providers of metric and ranking systems

Author: Dr Roger Pfister

Categories

  • Evaluation
  • Indicators
  • Quality