Foundations

As recipients of public funding, universities are under an obligation to report on and evaluate their activities. In addition to teaching, conducting research is a key component of an academic institution, and research findings are documented, among other things, in scholarly publications such as journals or books. Such publications promote dialogue between researchers and represent an important channel of communication in academia. Nevertheless, publications give only a partial picture of the significance and impact of research. Conferences, too, are valuable platforms for researchers to connect and exchange ideas, and research is both imparted and conducted in the framework of lectures and other courses.

At the same time, academics are increasingly confronted with the demand to document and demonstrate the social relevance and impact of their research. Such demands are placed particularly on the humanities and social sciences, where scholars are expected to explain the contribution they make both to individuals and to society at large. Providing such evidence calls for methods that are suited to the discipline in question and that make these fields accessible to non-experts.

Decision-makers at universities in turn place value on procedures that make the quality of research conducted at their institution visible and contribute to evaluations. To create these procedures, research quality criteria must be defined – and researchers must deem the criteria to be relevant and meaningful in their discipline.

Innovative methods to enhance academic visibility

Until just a few years ago, bibliometrics was the preferred method of measuring research output – that is, using quantitative analyses to gauge the effect and resonance of scholarly publications. Today, however, the validity of such methods is debated, and in most fields within the humanities and social sciences, bibliometrics plays only a minor role, largely because the language-related and discipline-specific characteristics of these fields, as well as their research and publication cultures, diminish the value of a one-sided weighting of publishing activity.

As part of the effort to address the entire spectrum of research in the humanities and social sciences, the former Rectors’ Conference of the Swiss Universities (CRUS) developed the program Performances de la recherche en sciences humaines et sociales. The program subscribes to the principle that evaluating the quality and output of research should not be an end in itself; rather, such measurements should always serve a concrete epistemological interest that has been defined clearly and transparently by the persons responsible. Here – and in accordance with the strategic plans of the Swiss universities for the period 2013–2016 – not only the importance of research for the academic community, but also subject-specific and cultural characteristics as well as the practical value of research, are to be documented, with due consideration of the strategic goals set by each individual university. As such, this approach differs from that taken by several other countries, which aim to rate an entire higher education system on the basis of only a few indicators. A variety of guidelines for developing suitable methods and instruments can be derived from this approach.

  • The fact that an indicator can be measured does not justify its use in a rating.
  • No single instrument and no single method is capable of presenting a comprehensive overview of the diversity of research.
  • When developing methods and instruments to enhance academic visibility, it is important to identify unintended effects in advance, and to avoid negative consequences.
  • Bottom-up approach: the researchers affected are to be involved in developing the instruments.
  • Research in the humanities and social sciences should also be made increasingly visible to non-specialists.
  • Developed instruments must take into account the various profiles of Swiss institutions of higher education.

Developing effective instruments

Taking these guidelines as a basis, ten individual initiatives and eight implementation projects have been, or are being, carried out jointly by various Swiss universities. Through these projects, stakeholders both inside and outside a university are able to adequately represent the quality of research and place it in an overall context. Although the individual projects are independent, the network of experts affiliated with the program provides opportunities for regular exchange between them. At the same time, the program develops principles that serve as guidelines for research evaluation in the Swiss higher education system.

Through the program, the Swiss universities can render visible the value of research conducted in the humanities and social sciences, while still accounting for various dimensions and differing, discipline-specific understandings of quality. Particular value is attached to reflecting on the diversity inherent in the federalist structure of the Swiss higher education system, and to providing the universities with the necessary means to further develop their strengths.

Contact: Dr. Alexander Hasgall, Scientific Coordinator. alexander.hasgall@unige.ch

Web: www.performances-recherche.ch