
      What can bibliometrics tell us about changes in the mode of knowledge production?

      Research article published in Prometheus (Pluto Journals)

            Abstract

            One of the most influential contributions to the fields of science policy research and science and technology studies during the last 20 years was The New Production of Knowledge by Gibbons et al. (Gibbons, M., Limoges, C., Nowotny, H., Schwartzman, S., Scott, P. and Trow, M. (1994) The New Production of Knowledge: The Dynamics of Science and Research in Contemporary Societies, Sage, London). The authors argued that over recent decades a different form of knowledge production has emerged, one which they termed ‘Mode 2’. In this, knowledge is produced in the context of application, generally on the basis of transdisciplinary research efforts, by a heterogeneous range of institutional actors, who are subject to wider social accountability and more diverse forms of quality control than in the traditional ‘Mode 1’ knowledge production. Although there have been a number of attempts to examine the claims of The New Production of Knowledge empirically, the evidence is, at best, rather ambiguous. The study reported here analyses highly cited publications in the field of bibliometric research to establish whether the themes of those publications and, more specifically, the changes in these themes over the last 20 years, provide any evidence of a growing incidence of Mode 2 knowledge production. The paper concludes that there is some evidence that bibliometrics, as a field of research, has exhibited a shift towards Mode 2 knowledge production over the last two decades. In addition, it would seem to have played a part in a similar shift across science more generally, offering policy-relevant tools and analyses, helping scientific research to respond to increased demands for accountability, and contributing to changes in the approach to the quality assessment of research. At the same time, and perhaps inadvertently, it may have contributed to bringing about changes in publication and citation practices as more and more authors seek to maximise their ‘score’ on one or more bibliometric indicators.


            Introduction

            The aim of the workshop for which this paper was originally prepared was to explore what changes have occurred in science and its institutions with regard to the production of knowledge since Gibbons and his colleagues published The New Production of Knowledge in 1994.1 The main organiser of the workshop, Michael Gibbons, invited the author to investigate what light might be cast on this by analysing studies in the area of bibliometrics (the use of quantitative indicators of science, such as those derived from publication and citation analysis). The specific focus of this paper is on what we might learn from an analysis of high-impact bibliometric publications over the last 20 years and, in particular, what changes, if any, can be detected compared with the previous 20 years with regard to the mode of knowledge production.

            The research questions on which this study has focussed relate directly to the characteristics of Mode 2 knowledge production as described by Gibbons et al. (1994), namely whether the research is carried out in the context of application, whether it is transdisciplinary in nature, whether it exhibits institutional heterogeneity with regard to the actors involved in the research, and whether it is subject to external accountability and quality assessment. The approach adopted here involves first identifying highly cited publications in the bibliometric area for 1990–2009 along with those for the earlier period of 1970–1989. The themes of these highly cited publications are analysed, in particular to identify how these themes have evolved over the last 20 years. The paper then examines whether these identified trends are related to Mode 2 characteristics.

            In what follows, we look first at the argument set out in The New Production of Knowledge about changes in the mode of knowledge production, and then review various studies that have attempted to find evidence as to whether Mode 2 knowledge production is on the increase. The following section outlines the methodology adopted in the study reported here. Next, we identify the main themes of highly cited bibliometric publications in the 20 years before and after 1990. We examine whether changes in these themes are related to a growing incidence of Mode 2 knowledge production. The final section summarises the main conclusions to emerge from the study.

            Background

            The starting point for the thesis set out in The New Production of Knowledge is the traditional form of knowledge production (which the authors label ‘Mode 1’), a mode in which research is driven mainly by academic challenges, such as solving problems suggested by theory, experiment or the disciplinary paradigm, is carried out primarily within individual disciplines (it is mostly mono-disciplinary in nature), is conducted mainly by researchers in universities and other academic research institutes (it is institutionally homogeneous), and in which researchers enjoy considerable academic autonomy, being subject merely to internal accountability and quality control (primarily through peer review).

            In recent decades, according to Gibbons et al. (1994), we have witnessed the emergence of a different (and apparently new) form of knowledge production (labelled ‘Mode 2’), in which knowledge is produced in the context of application (as opposed to ‘basic’ research, the results of which subsequently need to be ‘applied’). Mode 2 knowledge production generally requires transdisciplinary research efforts (the mobilisation of a range of theoretical frameworks, empirical approaches and so on from different disciplines and, through a process of dynamic interaction among these, the development of new theories, concepts, research methods or whatever). Mode 2 knowledge is socially distributed across a more heterogeneous set of organisational actors (e.g. company R&D laboratories, government research laboratories, hospitals, consultancies, think tanks), and it is subject to wider social accountability and more diverse forms of quality control.

            Partly in response to criticisms of The New Production of Knowledge,2 these ideas were subsequently extended in Nowotny et al. (2001, 2003). In particular, the authors argued that ‘society as a whole has been permeated by science’, giving rise to a ‘Mode 2 society’. They placed greater emphasis on ‘reflexivity’ as a distinguishing characteristic of Mode 2. Whereas Mode 1 involves the ‘objective’ investigation of the natural or social world, Mode 2 is based on a process of dialogue between researchers and their research subjects, or between ‘science’ and ‘society’, with researchers becoming more aware of the wider consequences of their work. Linked to this is the drive towards wider economic and social accountability. Nowotny et al. (2001) also expanded the notion of knowledge produced in the context of application, identifying different levels of contextualisation (weak, middle range and strong). They put forward the notion of the agora as ‘the problem-generating and problem-solving environment in which the contextualization of knowledge production takes place’ (Nowotny et al., 2003, p.192), arguing that Mode 2 hence yields more ‘socially robust knowledge’ than Mode 1.

            Several previous studies explore or test the thesis of The New Production of Knowledge. One of the first was by Hicks and Katz (1996), who carried out an analysis of bibliometric data to identify the nature and extent of changes towards Mode 2. They found that there was some evidence that research was becoming more interdisciplinary and was being carried out increasingly within networks. However, the evidence was less clear as to whether there was also a shift towards research in the context of application. While Hicks and Katz were able to offer some evidence with regard to interdisciplinarity, they found it harder to address the somewhat different concept of ‘transdisciplinarity’ with their particular methodology.3 Here, a short digression on concepts and terminology is necessary. The notion of ‘transdisciplinarity’ as set out in Gibbons et al. (1994) was somewhat ill-defined, and the difference between this and the related concepts of ‘multidisciplinarity’ and ‘interdisciplinarity’ has since been the source of much confusion. In recent years, other authors have attempted to come up with more rigorous definitions of these three concepts and how precisely they differ, as well as how they are related. One of the most convincing expositions is that by Klein (2010, see especially Table 2.1).4 According to her, multidisciplinarity involves merely drawing upon and juxtaposing knowledge, methods, perspectives or whatever from two or more disciplines, while interdisciplinarity entails an element of linking, blending and integrating these various inputs. Transdisciplinarity, in contrast, involves a higher level and more fundamental transformation of these inputs, so that the research then transcends the original disciplinary boundaries. We shall return to these distinctions later when we examine the changing nature of bibliometric research over time.

            Table 1. Bibliometric HCPs published in 1970–1979
            | # | Title | Authors | Source | Date | Cit’ns | Theme |
            |---|-------|---------|--------|------|--------|-------|
            | 1 | Citation analysis as a tool in journal evaluation – journals can be ranked by frequency and impact of citations for science policy studies | Garfield, E. | Science, 178, 4060, pp.471–79 | 1972 | 768 | JIF |
            | 2 | Co-citation in scientific literature – new measure of relationship between 2 documents | Small, H. | Journal of the American Society for Information Science, 24, 4, pp.265–69 | 1973 | 452 | Co-cit’n |
            | 3 | General theory of bibliometric and other cumulative advantage processes | Price, D.J.D. | Journal of the American Society for Information Science, 27, 5–6, pp.292–306 | 1976 | 307 | Gen bib’s |
            | 4 | Structure of scientific literatures. 1. Identifying and graphing specialties | Small, H., Griffith, B.C. | Science Studies, 4, 1, pp.17–40 | 1974 | 299 | Co-cit’n |
            | 5 | Evaluative Bibliometrics | Narin, F. | [Computer Horizons Inc., New Jersey] | 1976 | 222 | Res eval |
            | 6 | Some results on function and quality of citations | Moravcsik, M.J., Murugesan, P. | Social Studies of Science, 5, 1, pp.86–92 | 1975 | 207 | Gen bib’s |
            | 7 | Citation indexing for studying science | Garfield, E. | Nature, 227, 5259, pp.669–71 | 1970 | 156 | Gen bib’s |
            | 8 | Introducing citation classics – human side of scientific reports | Garfield, E. | Current Contents, 1, pp.5–7 | 1977 | 142 | HCPs |
            | 9 | Expansion of citation classics – 250 unique commentaries per year | Garfield, E. | Current Contents, 1, pp.5–12 | 1979 | 136 | HCPs |
            | 10 | Citation influence for journal aggregates of scientific publications – theory with application to literature of physics | Pinski, G., Narin, F. | Information Processing & Management, 12, 5, pp.297–312 | 1976 | 133 | JIF |
            | 11 | Structure of scientific literatures. 2. Toward a macrostructure and microstructure for science | Griffith, B.C., Small, H.G., Stonehill, J.A., Dey, S. | Science Studies, 4, 4, pp.339–65 | 1974 | 129 | Co-cit’n |
            | 12 | Coherent social groups in scientific change | Griffith, B.C., Mullins, N.C. | Science, 177, 4053, pp.959–64 | 1972 | 121 | Coll’n |
            | 13 | Journal citation studies. 18. Highly cited botany journals | Garfield, E. | Current Contents, 2, pp.5–9 | 1975 | 113 | HCPs |
            | 14 | Significant journals of science | Garfield, E. | Nature, 264, 5587, pp.609–15 | 1976 | 103 | JIF |
            | 15 | Gatekeepers in science | Garfield, E. | Current Contents, 2, pp.5–7 | 1976 | 103 | Gen bib’s |
            | 16 | Is citation analysis a legitimate evaluation tool | Garfield, E. | Scientometrics, 1, 4, pp.359–75 | 1979 | 102 | Gen bib’s |
            | 17 | Scientific communication – its role in the conduct of research and creation of knowledge | Garvey, W.D., Griffith, B.C. | American Psychologist, 26, 4, pp.349–62 | 1971 | 99 | Gen bib’s |
            Source: Number of citations as recorded in the Web of Science (as of September 2009). Further explanation of the classification by theme can be found in the main text, while the abbreviations used here are spelt out in more detail in the summary Table 7 below.

            In another study to assess the claims in The New Production of Knowledge, Godin and Gingras (2000) showed that, although there has been some diversification in the institutional location of knowledge production (e.g. to firms, government laboratories, hospitals etc.), universities still remain central. Moreover, where there has been growth in knowledge production in other sectors, the research involved is often performed in collaboration with academic institutions (not least reflecting government policies encouraging university–industry links and other forms of collaboration and networking), implying that universities are arguably now more central in knowledge production than previously, not less.

            A third study was by Gulbrandsen and Langfeldt (2004), who interviewed senior scientists in Norway drawn from 10 disciplines in universities and colleges, research institutes and industry. They found little support for an increase in Mode 2 as reflected in certain of its dimensions (e.g. new criteria for research assessment, or increasing convergence of universities, industry and research institutes). A more specific study was that by Hemlin and Rasmussen (2006), who focused on changes in quality control. They reported evidence of a shift from traditional quality control towards quality monitoring, with a wider range of people involved, including those from industry and other users or stakeholders. They also noted a shift in the focus of quality assessment from individuals to organisations, and from retrospective assessment to ongoing monitoring. However, they concluded that their study offered only limited empirical support for the claims of The New Production of Knowledge.

            More recently, Porter and Rafols (2009) employed a new indicator of interdisciplinarity to assess changes in six research fields. They found that, although there had been an increase of approximately 50% between 1975 and 2005 in the number of disciplines cited in papers (along with a similar increase in the average number of references per article), their index of interdisciplinarity exhibited only a modest increase of 5% over this 30-year period. They concluded that, although science is becoming more interdisciplinary, it is apparently doing so in relatively small steps, drawing mainly on neighbouring fields while only slightly increasing the number of connections to cognitively more distant fields.
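
            The indicator used by Porter and Rafols is generally described as a Rao–Stirling-type diversity measure; as a rough sketch (the precise weighting and normalisation in their paper may differ), such an ‘integration’ score for a single paper can be written as

            $$ I \;=\; 1 - \sum_{i,j} s_{ij}\, p_i\, p_j $$

            where $p_i$ is the share of the paper’s references falling in subject category $i$ and $s_{ij}$ is a similarity measure (for instance, the cosine similarity of the categories’ citation patterns) between categories $i$ and $j$. A paper citing only closely related categories scores low on such an index, while one drawing on cognitively distant categories scores high.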

            In addition, there have been studies focussing on specific research fields. For example, Harvey et al. (2002) found that the most effective research groups in medical research tended to operate in Mode 2 as a response to their increasingly complicated external environment. A somewhat different result was obtained by Ferlie and Wood (2003), who found that health services researchers operate in both Modes 1 and 2, with Mode 1 significantly constraining the extent to which Mode 2 might develop. With regard to the social sciences, Pettigrew (1997) suggested that a new and more social mode of knowledge production was emerging in management science, and Huff (2000) argued that Mode 1 knowledge production in business schools was declining and was likely to continue to do so. However, Robson and Shove (1999) found no evidence of a shift from Mode 1 to Mode 2 in the social sciences. Albert (2003) carried out an analysis of the research activities of university professors in two social sciences, economics and sociology. The study provided no evidence of a shift towards Mode 2, and indeed suggested a shift (at least in these two fields) in the opposite direction – towards Mode 1 – over the previous decade or so.

            In an interview-based survey of Swedish academics regarding changes in knowledge production in the social sciences and humanities, Morton (2005) found that many researchers disagreed with the need for Mode 2 knowledge production and exhibited a marked reluctance to become subject to Mode 2 evaluation. Overall, she was unable to find any evidence of a shift from Mode 1 to Mode 2 research, and concluded that the Mode 2 model was inadequate for understanding the complexities of change in the Swedish system. A comprehensive review of the notion of Mode 2 and the reactions to it, including the various empirical investigations, can be found in Hessels and van Lente (2008). Their overall conclusion is that ‘empirical evidence to show the rise of reflexivity, transdisciplinarity, and new modes of quality control is lacking’ (p.754).

            In the light of all this, it was decided to carry out a study of the most influential publications from the field of bibliometrics in order to establish whether the themes of these, and any changes in these themes over time, provide support for a shift towards Mode 2 knowledge production. The specific research questions addressed here are as follows:

            • 1. Is there any evidence of a shift to research in the context of application?

            • 2. Is there any evidence of a shift to more transdisciplinarity?

            • 3. Is there any evidence of a shift to more institutional heterogeneity?

            • 4. Is there any evidence of a shift to external accountability or of changes in the form of quality assessment?

            Methodology

            Let us begin by setting out the scope of the field being studied. Bibliometrics can be defined as the quantitative study of research with the aim of analysing, mapping, measuring or assessing research activities and their impact, whether at the level of countries, fields, institutions, research groups or individuals. There is some overlap with the fields of library science and information science, but these have not been included here. The time covered in this study, the 20 years since 1990, has been divided into four five-year periods. (Although The New Production of Knowledge was published only in 1994, it is assumed that the changes it describes had begun some years earlier.) This 20-year period is contrasted with the previous two decades.

            The methodology involved first searching for highly cited publications (HCPs) from the field of bibliometric studies. It is assumed that these HCPs have had the greatest impact on the academic community, although not necessarily on research policy or practice (there are, unfortunately, no simple indicators for such external impact). The identification of these HCPs involved three main starting points. The first was a search of the main journals used by bibliometric researchers to publish their results. These include Scientometrics, Journal of the American Society for Information Science and Technology (JASIST, formerly JASIS), Research Evaluation, Journal of Information Science, Journal of Informetrics and Information Processing and Management. (Some of these journals also publish non-bibliometric articles; for example, relating to library science or information science. As noted earlier, these were excluded from our analysis.)

            Secondly, a keyword search was conducted, first on Google Scholar (which is more flexible than other search engines such as that in the Web of Science, and which covers books as well as journal articles) in order to identify potential candidate HCPs. Among the search strings used, for example, were ‘publication OR citation OR bibliometric’, and ‘publication indicator’ OR ‘publication count’ OR ‘publication analysis’ OR ‘citation indicator’ OR ‘citation count’ OR ‘citation analysis’ OR ‘bibliometric indicator’ OR ‘bibliometric analysis’ OR ‘scientometric analysis’. Thirdly, we searched the publications of prominent bibliometric authors including winners of the de Solla Price Medal (the main international prize in the bibliometric research community) and various others. (The latter list may not be very comprehensive, but it should nevertheless be good enough for the purpose of identifying trends in the overall themes of bibliometric HCPs.)
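
            Purely for illustration (this is not the author’s actual code), the boolean query strings quoted above and the merging of candidates from the three starting points might be assembled along the following lines; the record fields and helper names are assumptions.

```python
# Illustrative only: compose the boolean query strings quoted in the text and
# merge candidate HCPs found via the journal scan, the keyword search and the
# author scan. Field names ('first_author', 'year', 'title') are assumptions.
from itertools import chain

BROAD_QUERY = "publication OR citation OR bibliometric"
PHRASE_QUERY = " OR ".join([
    '"publication indicator"', '"publication count"', '"publication analysis"',
    '"citation indicator"', '"citation count"', '"citation analysis"',
    '"bibliometric indicator"', '"bibliometric analysis"', '"scientometric analysis"',
])

def merge_candidates(*result_sets):
    """Deduplicate candidate publications on a crude (author, year, title) key."""
    seen, merged = set(), []
    for record in chain(*result_sets):
        key = (record["first_author"].lower(), record["year"], record["title"].lower())
        if key not in seen:
            seen.add(key)
            merged.append(record)
    return merged
```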

            The candidate HCPs thus identified from these three sources were then searched on the Web of Science (WoS) in order to establish how frequently they had been cited. The ‘general search’ facility in the WoS was used in the case of articles in journals scanned by the WoS, while the ‘cited reference’ facility was used for the remaining publications (e.g. books, book chapters and articles in journals not scanned by WoS).

            The period from 1990 to 2009 was split into four five-year periods, and varying citation thresholds were used to identify the top 15–20 HCPs in each. (Using a fixed citation threshold would mean that only older HCPs were identified; in order to give more recent publications an equal chance, one must therefore use a lower threshold for later periods.) The previous 20 years, from 1970 to 1989, were divided into two 10-year periods, and again varying thresholds were used to identify the top 15–20 HCPs in each. For each of the periods analysed, the top 15–20 HCPs were classified according to the main theme of the paper. Next, these themes were linked to various Mode 2 characteristics. Finally, the evolution of the themes over time was analysed.
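
            As a minimal sketch of this selection step (the record fields are illustrative, and the paper states only that lower citation thresholds were used for later periods, not the specific values), the procedure can be expressed as follows:

```python
# Sketch of the period-by-period selection of top HCPs. The threshold is taken
# as the citation count of the 15th most cited publication in each period, so
# it is lower for more recent periods; field names are assumptions.
PERIODS = [(1970, 1979), (1980, 1989),
           (1990, 1994), (1995, 1999), (2000, 2004), (2005, 2009)]

def top_hcps(candidates, period, n_min=15, n_max=20):
    """Return roughly the 15-20 most cited publications from one period."""
    start, end = period
    pool = sorted((c for c in candidates if start <= c["year"] <= end),
                  key=lambda c: c["citations"], reverse=True)
    if len(pool) <= n_max:
        return pool
    threshold = pool[n_min - 1]["citations"]  # period-specific citation threshold
    return [c for c in pool if c["citations"] >= threshold][:n_max]
```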

            Results

            1970–1979

            Of the 17 HCPs published in this period (see Table 1), six can be classified as ‘general bibliometrics’ (#3: 307 citations, #6: 207, #7: 156, #15: 103, #16: 102, #17: 99). Three relate to the topic of highly cited papers (HCPs) (#8: 142, #9: 136, #13: 113), three to co-citation mapping (#2: 452, #4: 299, #11: 129), and three to the evaluation of journal impact (#1: 768, #10: 133, #14: 103), one of which was the single most highly cited bibliometric publication of the decade identified here (the 1972 paper by Garfield putting forward the notion of journal impact factor or JIF). Of the remaining two papers, one focuses on research evaluation (#5: 222) and the other on research collaboration (#12: 121).
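
            For reference, the journal impact factor as it is conventionally computed today (a later standardisation of the idea Garfield put forward) is the ratio

            $$ \mathrm{JIF}_{J,Y} \;=\; \frac{C_Y\!\left(J;\, Y\!-\!1,\, Y\!-\!2\right)}{N_{Y-1}(J) + N_{Y-2}(J)} $$

            where $C_Y(J;\, Y\!-\!1,\, Y\!-\!2)$ is the number of citations received in year $Y$ by items that journal $J$ published in the two preceding years, and $N_{Y-1}(J)$ and $N_{Y-2}(J)$ are the numbers of citable items $J$ published in those two years.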

            Table 2. Bibliometric HCPs published in 1980–1989
            | # | Title | Authors | Source | Date | Cit’ns | Theme |
            |---|-------|---------|--------|------|--------|-------|
            | 1 | Author co-citation – a literature measure of intellectual structure | White, H.D., Griffith, B.C. | Journal of the American Society for Information Science, 32, 3, pp.163–71 | 1981 | 213 | Co-cit’n |
            | 2 | Assessing basic research – some partial indicators of scientific progress in radio astronomy | Martin, B.R., Irvine, J. | Research Policy, 12, 2, pp.61–90 | 1983 | 180 | Res eval |
            | 3 | Problems of citation analysis – a critical review | MacRoberts, M.H., MacRoberts, B.R. | Journal of the American Society for Information Science, 40, 5, pp.342–49 | 1989 | 180 | Gen bib’s |
            | 4 | Bibliometrics | White, H.D., McCain, K.W. | Annual Review of Information Science and Technology, 24, pp.119–86 | 1989 | 152 | Gen bib’s |
            | 5 | Scientometric datafiles – a comprehensive set of indicators on 2649 journals and 96 countries in all major science fields and subfields 1981–1985 | Schubert, A., Glanzel, W., Braun, T. | Scientometrics, 16, 1–6, pp.3–478 | 1989 | 146 | Res eval |
            | 6 | Patents as indicators of corporate technological strength | Narin, F., Noma, E., Perry, R. | Research Policy, 16, 2–4, pp.143–55 | 1987 | 143 | Patents |
            | 7 | The use of bibliometric data for the measurement of university-research performance | Moed, H.F., Burger, W.J.M., Frankfort, J.G., Van Raan, A.F.J. | Research Policy, 14, 3, pp.131–49 | 1985 | 129 | Res eval |
            | 8 | The foundations of information science. 1. Philosophical aspects | Brookes, B.C. | Journal of Information Science, 2, 3–4, pp.125–33 | 1980 | 112 | Gen bib’s |
            | 9 | Journal citation studies. 46. Physical-chemistry and chemical physics journals. 3. The evolution of physical-chemistry to chemical physics | Garfield, E. | Current Contents, 3, pp.3–12 | 1986 | 112 | Gen bib’s |
            | 10 | Relative indicators and relational charts for comparative assessment of publication output and citation impact | Schubert, A., Braun, T. | Scientometrics, 9, 5–6, pp.281–91 | 1986 | 111 | Res eval |
            | 11 | The intellectual development of management information systems, 1972–1982 – a co-citation analysis | Culnan, M.J. | Management Science, 32, 2, pp.156–72 | 1986 | 109 | Co-cit’n |
            | 12 | From translations to problematic networks – an introduction to co-word analysis | Callon, M., Courtial, J.P., Turner, W.A., Bauin, S. | Social Science Information sur les Sciences Sociales, 22, 2, pp.191–235 | 1983 | 107 | Co-word |
            | 13 | Patent statistics as indicators of innovative activities – possibilities and problems | Pavitt, K. | Scientometrics, 7, 1–2, pp.77–99 | 1985 | 105 | Patents |
            Source: See Table 1.
            1980–1989

            Of the 13 HCPs identified for this period (see Table 2), general bibliometrics continued to feature prominently, accounting for four of these (#3: 180 citations, #4: 152, #8: 112, #9: 112). However, during this decade, research evaluation evidently grew in importance, now accounting for four HCPs (#2: 180, #5: 146, #7: 129, #10: 111) compared with just one in the previous decade. Co-citation or co-word mapping continued to feature prominently (#1: 213, #11: 109, #12: 107). The 1980s also saw the emergence of interest in patent indicators, which accounted for two HCPs (#6: 143, #13: 105).

            1990–1994

            As can be seen from Table 3, during the first half of the 1990s, interest in co-citation and co-word mapping was at its height, accounting for six of the 19 HCPs identified for this period (#1: 155 citations, #7: 85, #13: 63, #14: 60, #15: 60, #18: 53). A second theme that became much more prominent during this time was research collaboration, with five HCPs (#5: 108, #6: 88, #10: 73, #11: 65, #12: 63). Another theme of growing interest was patent citations and related work on science–technology links, with three HCPs (#4: 116, #8: 81, #19: 50). In contrast, interest in general bibliometrics seems to have waned somewhat, with three HCPs (#2: 143, #16: 58, #17: 55), along with interest in the evaluation of journal impact, with two HCPs (#3: 130, #9: 81).

            Table 3. Bibliometric HCPs published in 1990–1994
            | # | Title | Authors | Source | Date | Cit’ns | Theme |
            |---|-------|---------|--------|------|--------|-------|
            | 1 | Mapping authors in intellectual space – a technical overview | McCain, K.W. | Journal of the American Society for Information Science, 41, 6, pp.433–43 | 1990 | 155 | Co-cit’n |
            | 2 | The skewness of science | Seglen, P.O. | Journal of the American Society for Information Science, 43, 9, pp.628–38 | 1992 | 143 | Gen bib’s |
            | 3 | The relative impacts of economics journals – 1970–1990 | Laband, D.N., Piette, M.J. | Journal of Economic Literature, 32, 2, pp.640–66 | 1994 | 130 | JIF |
            | 4 | Direct validation of citation counts as indicators of industrially important patents | Albert, M.B., Avery, D., Narin, F., McAllister, P. | Research Policy, 20, 3, pp.251–59 | 1991 | 116 | Pat cit’s |
            | 5 | Understanding patterns of international scientific collaboration | Luukkonen, T., Persson, O., Sivertsen, G. | Science Technology & Human Values, 17, 1, pp.101–26 | 1992 | 108 | Coll’n |
            | 6 | Scientific cooperation in Europe and the citation of multinationally authored papers | Narin, F., Stevens, K., Whitlow, E.S. | Scientometrics, 21, 3, pp.313–23 | 1991 | 88 | Coll’n |
            | 7 | Mapping of science by combined co-citation and word analysis. 1. Structural aspects | Braam, R.R., Moed, H.F., Van Raan, A.F.J. | Journal of the American Society for Information Science, 42, 4, pp.233–51 | 1991 | 85 | Co-cit’n |
            | 8 | Status report – linkage between technology and science | Narin, F., Olivastro, D. | Research Policy, 21, 3, pp.237–49 | 1992 | 81 | Pat cit’s |
            | 9 | Causal relationship between article citedness and journal impact | Seglen, P.O. | Journal of the American Society for Information Science, 45, 1, pp.1–11 | 1994 | 80 | JIF |
            | 10 | International collaboration in the sciences, 1981–1985 | Schubert, A., Braun, T. | Scientometrics, 19, 1–2, pp.3–10 | 1990 | 73 | Coll’n |
            | 11 | The measurement of international scientific collaboration | Luukkonen, T., Tijssen, R.J.W., Persson, O., Sivertsen, G. | Scientometrics, 28, 1, pp.15–36 | 1993 | 65 | Coll’n |
            | 12 | Geographical proximity and scientific collaboration | Katz, J.S. | Scientometrics, 31, 1, pp.31–43 | 1994 | 63 | Coll’n |
            | 13 | Co-citation analysis: overview and defense | White, H.D. | in Scholarly Communication and Bibliometrics | 1990 | 63 | Co-cit’n |
            | 14 | Co-word analysis as a tool for describing the network of interactions between basic and technological research – the case of polymer chemistry | Callon, M., Courtial, J.P., Laville, F. | Scientometrics, 22, 1, pp.155–205 | 1991 | 60 | Co-word |
            | 15 | Mapping of science by combined co-citation and word analysis. 2. Dynamic aspects | Braam, R.R., Moed, H.F., Van Raan, A.F.J. | Journal of the American Society for Information Science, 42, 4, pp.252–66 | 1991 | 60 | Co-cit’n |
            | 16 | The duality of informetric systems with applications to the empirical laws | Egghe, L. | Journal of Information Science, 16, 1, pp.17–27 | 1990 | 58 | Gen bib’s |
            | 17 | Do citations matter? | Baird, L.M., Oppenheim, C. | Journal of Information Science, 20, 1, pp.2–15 | 1994 | 55 | Gen bib’s |
            | 18 | The intellectual base and research fronts of JASIS 1986–1990 | Persson, O. | Journal of the American Society for Information Science, 45, 1, pp.31–38 | 1994 | 53 | Co-cit’n |
            | 19 | Patent bibliometrics | Narin, F. | Scientometrics, 30, 1, pp.147–55 | 1994 | 50 | Pat cit’s |
            Source: See Table 1.
            1995–1999

            The 19 HCPs identified in Table 4 show a fairly even spread of interest among the evaluation of journal impact, with four HCPs (#4: 195 citations, #8: 160, #14: 89, #16: 81), co-citation mapping with three (#2: 222, #10: 133, #17: 73), patent citations and science–technology links with three (#3: 201, #9: 134, #13: 93), and the new theme of international comparisons along with other research evaluation studies, also with three (#5: 192, #12: 96, #15: 83). Interest in general bibliometrics continued to decline, with just two HCPs (#1: 329, #19: 73), as did that in research collaboration, which also accounted for two HCPs (#6: 178, #18: 73). In addition, this period witnessed the emergence of the new theme of webometrics, which accounted for two HCPs (#7: 169, #11: 131).

            Table 4. Bibliometric HCPs published in 1995–1999
            | # | Title | Authors | Source | Date | Cit’ns | Theme |
            |---|-------|---------|--------|------|--------|-------|
            | 1 | How popular is your paper? An empirical study of the citation distribution | Redner, S. | European Physical Journal B, 4, 2, pp.131–34 | 1998 | 329 | Gen bib’s |
            | 2 | Visualizing a discipline: an author co-citation analysis of information science, 1972–1995 | White, H.D., McCain, K.W. | Journal of the American Society for Information Science, 49, 4, pp.327–55 | 1998 | 222 | Co-cit’n |
            | 3 | The increasing linkage between US technology and public science | Narin, F., Hamilton, K.S., Olivastro, D. | Research Policy, 26, 3, pp.317–30 | 1997 | 201 | Pat cit’s |
            | 4 | How can impact factors be improved? | Garfield, E. | British Medical Journal, 313, 7054, pp.411–13 | 1996 | 195 | JIF |
            | 5 | The scientific wealth of nations | May, R.M. | Science, 275, 5301, pp.793–96 | 1997 | 192 | Int comp’n |
            | 6 | What is research collaboration? | Katz, J.S., Martin, B.R. | Research Policy, 26, 1, pp.1–18 | 1997 | 178 | Coll’n |
            | 7 | The calculation of Web impact factors | Ingwersen, P. | Journal of Documentation, 54, 2, pp.236–43 | 1998 | 169 | Web’s |
            | 8 | Journal impact factor: a brief review | Garfield, E. | Canadian Medical Association Journal, 161, 8, pp.979–80 | 1999 | 160 | JIF |
            | 9 | Knowledge sourcing by foreign multinationals: patent citation analysis in the US semiconductor industry | Almeida, P. | Strategic Management Journal, 17, special issue, pp.155–65 | 1996 | 134 | Pat cit’s |
            | 10 | Visualizing science by citation mapping | Small, H. | Journal of the American Society for Information Science, 50, 9, pp.799–813 | 1999 | 133 | Co-cit’n |
            | 11 | Informetric analyses on the World Wide Web: methodological approaches to ‘webometrics’ | Almind, T.C., Ingwersen, P. | Journal of Documentation, 53, 4, pp.404–26 | 1997 | 131 | Web’s |
            | 12 | New bibliometric tools for the assessment of national research performance – database description, overview of indicators and first applications | Moed, H.F., Debruin, R.E., Van Leeuwen, T.N. | Scientometrics, 33, 3, pp.381–422 | 1995 | 96 | Int comp’n |
            | 13 | Citation frequency and the value of patented inventions | Harhoff, D., Narin, F., Scherer, F.M., Vopel, K. | Review of Economics and Statistics, 81, 3, pp.511–15 | 1999 | 93 | Pat cit’s |
            | 14 | Improving the accuracy of Institute for Scientific Information’s journal impact factors | Moed, H.F., Van Leeuwen, T.N. | Journal of the American Society for Information Science, 46, 6, pp.461–67 | 1995 | 89 | JIF |
            | 15 | Advanced bibliometric methods as quantitative core of peer review based evaluation and foresight exercises | van Raan, A.F.J. | Scientometrics, 36, 3, pp.397–420 | 1996 | 83 | Res eval |
            | 16 | Impact factors can mislead | Moed, H.F., van Leeuwen, T.N. | Nature, 381, 6579, p.186 | 1996 | 81 | JIF |
            | 17 | Visualization of literatures | White, H.D., McCain, K.W. | Annual Review of Information Science and Technology, 32, pp.99–168 | 1997 | 73 | Co-cit’n |
            | 18 | Studying research collaboration using co-authorships | Melin, G., Persson, O. | Scientometrics, 36, 3, pp.363–77 | 1996 | 73 | Coll’n |
            | 19 | Problems of citation analysis | MacRoberts, M.H., MacRoberts, B.R. | Scientometrics, 36, 3, pp.435–44 | 1996 | 73 | Gen bib’s |
            Source: See Table 1.
            2000–2004

            During this period, it would appear from Table 5 that there was a dramatic resurgence of interest in research collaboration and co-authorship, this theme accounting for the top four most highly cited publications and another in the top 10 (#1: 611 citations, #2: 304, #3: 300, #4: 291, #9: 81). Also very prominent was webometrics, with five of the top 21 HCPs (#10: 80, #12: 78, #16: 56, #17: 54, #20: 49). Of the remaining HCPs, interest was fairly evenly divided among general bibliometrics (#6: 122, #14: 59, #21: 48), patent citations and science–technology links (#7: 89, #13: 70, #19: 52), co-citation mapping (#11: 79, #15: 59), and international comparisons (#5: 167, #18: 53), with journal impact (#8: 81) accounting for the remaining HCP.

            Table 5. Bibliometric HCPs published in 2000–2004
            | # | Title | Authors | Source | Date | Cit’ns | Theme |
            |---|-------|---------|--------|------|--------|-------|
            | 1 | The structure of scientific collaboration networks | Newman, M.E.J. | Proceedings of the National Academy of Sciences, 98, 2, pp.404–9 | 2001a | 611 | Coll’n |
            | 2 | Scientific collaboration networks. I. Network construction and fundamental results | Newman, M.E.J. | Physical Review E, 64, 1, 016131 | 2001b | 304 | Coll’n |
            | 3 | Scientific collaboration networks. II. Shortest paths, weighted networks, and centrality | Newman, M.E.J. | Physical Review E, 64, 1, 016132 | 2001c | 300 | Coll’n |
            | 4 | Evolution of the social network of scientific collaborations | Barabasi, A.L., Jeong, H., Neda, Z., Ravasz, E., Schubert, A., Vicsek, T. | Physica A, 311, 3–4, pp.590–614 | 2002 | 291 | Coll’n |
            | 5 | The scientific impact of nations | King, D.A. | Nature, 430, 6997, pp.311–16 | 2004 | 167 | Int comp’n |
            | 6 | Citation review of Lagergren kinetic rate equation on adsorption reactions | Ho, Y.S. | Scientometrics, 59, 1, pp.171–77 | 2004 | 122 | Gen bib’s |
            | 7 | An analysis of the critical role of public science in innovation: the case of biotechnology | McMillan, G.S., Narin, F., Deeds, D.L. | Research Policy, 29, 1, pp.1–8 | 2000 | 89 | Pat cit’s |
            | 8 | Journal impact measures in bibliometric research | Glanzel, W., Moed, H.F. | Scientometrics, 53, 2, pp.171–93 | 2002 | 81 | JIF |
            | 9 | National characteristics in international scientific co-authorship relations | Glanzel, W. | Scientometrics, 51, 1, pp.69–115 | 2001 | 81 | Coll’n |
            | 10 | Perspectives of webometrics | Bjorneborn, L., Ingwersen, P. | Scientometrics, 50, 1, pp.65–82 | 2001 | 80 | Web’s |
            | 11 | Requirements for a co-citation similarity measure, with special reference to Pearson’s correlation coefficient | Ahlgren, P., Jarneving, B., Rousseau, R. | Journal of the American Society for Information Science and Technology, 54, 6, pp.550–60 | 2003 | 79 | Co-cit’n |
            | 12 | Bibliometrics and beyond: some thoughts on web-based citation analysis | Cronin, B. | Journal of Information Science, 27, 1, pp.1–7 | 2001 | 78 | Web’s |
            | 13 | Does science push technology? Patents citing scientific literature | Meyer, M. | Research Policy, 29, 3, pp.409–34 | 2000 | 70 | Pat cit’s |
            | 14 | Scholarly publishing in the Internet age: a citation analysis of computer science literature | Goodrum, A.A., McCain, K.W., Lawrence, S., Giles, C.L. | Information Processing & Management, 37, 5, pp.661–75 | 2001 | 59 | Gen bib’s |
            | 15 | Pathfinder networks and author co-citation analysis: a remapping of paradigmatic information scientists | White, H.D. | Journal of the American Society for Information Science and Technology, 54, 5, pp.423–34 | 2003 | 59 | Co-cit’n |
            | 16 | Data collection methods on the Web for informetric purposes – a review and analysis | Bar-Ilan, J. | Scientometrics, 50, 1, pp.7–32 | 2001 | 56 | Web’s |
            | 17 | New informetric aspects of the Internet: some reflections – many problems | Egghe, L. | Journal of Information Science, 26, 5, pp.329–35 | 2000 | 54 | Web’s |
            | 18 | Language biases in the coverage of the Science Citation Index and its consequences for international comparisons of national research performance | van Leeuwen, T.N., Moed, H.F., Tijssen, R.J.W., Visser, M.S., Van Raan, A.F.J. | Scientometrics, 51, 1, pp.335–46 | 2001 | 53 | Int comp’n |
            | 19 | The changing composition of innovative activity in the US – a portrait based on patent analysis | Hicks, D., Breitzman, T., Olivastro, D., Hamilton, K. | Research Policy, 30, 4, pp.681–703 | 2001 | 52 | Pat cit’s |
            | 20 | A web crawler design for data mining | Thelwall, M. | Journal of Information Science, 27, 5, pp.319–25 | 2001 | 49 | Web’s |
            | 21 | Authors as citers over time | White, H.D. | Journal of the American Society for Information Science and Technology, 52, 2, pp.87–108 | 2001 | 48 | Gen bib’s |
            Source: See Table 1.
            2005–2009

            In the final five-year period examined here (see Table 6), of the 17 HCPs, no fewer than 10 focussed on the newly created Hirsch-index (or h-index) and related indices or analyses (#1: 355 citations, #4: 83, #5: 77, #6: 71, #7: 63, #9: 61, #11: 52, #14: 47, #15: 46, #17: 45). The only other theme to feature prominently during this time was research evaluation and international comparisons, with four HCPs (#3: 116, #8: 62, #12: 50, #13: 50). Mapping (#10: 53, #16: 46) and the evaluation of journal impact (#2: 147) accounted for the remaining HCPs. A summary of the number of times each of the above main themes appeared in the HCPs for the six periods considered here is given in Table 7, which reveals how the relative prominence of these themes has changed over time.

            Table 6. Bibliometric HCPs published in 2005–2009
            | # | Title | Authors | Source | Date | Cit’ns | Theme |
            |---|-------|---------|--------|------|--------|-------|
            | 1 | An index to quantify an individual’s scientific research output | Hirsch, J.E. | Proceedings of the National Academy of Sciences, 102, 46, pp.16569–72 | 2005 | 355 | h-index |
            | 2 | The history and meaning of the journal impact factor | Garfield, E. | JAMA – Journal of the American Medical Association, 295, 1, pp.90–93 | 2006 | 147 | JIF |
            | 3 | Citation Analysis in Research Evaluation | Moed, H.F. | [Springer, Dordrecht] | 2005 | 116 | Res eval |
            | 4 | Theory and practise of the g-index | Egghe, L. | Scientometrics, 69, 1, pp.131–52 | 2006 | 83 | h-index |
            | 5 | Comparison of the Hirsch-index with standard bibliometric indicators and with peer judgment for 147 chemistry research groups | Van Raan, A.F.J. | Scientometrics, 67, 3, pp.491–502 | 2006 | 77 | h-index |
            | 6 | Does the h-index for ranking of scientists really work? | Bornmann, L., Daniel, H.D. | Scientometrics, 65, 3, pp.391–92 | 2005 | 71 | h-index |
            | 7 | Using the h-index to rank influential information scientists | Cronin, B., Meho, L. | Journal of the American Society for Information Science and Technology, 57, 9, pp.1275–78 | 2006 | 63 | h-index |
            | 8 | Fatal attraction: conceptual and methodological problems in the ranking of universities by bibliometric methods | Van Raan, A.F.J. | Scientometrics, 62, 1, pp.133–43 | 2005 | 62 | Int comp’n |
            | 9 | On the h-index – a mathematical approach to a new measure of publication activity and citation impact | Glanzel, W. | Scientometrics, 67, 2, pp.315–21 | 2006 | 61 | h-index |
            | 10 | Mapping the backbone of science | Boyack, K.W., Klavans, R., Borner, K. | Scientometrics, 64, 3, pp.351–74 | 2005 | 53 | Map’g |
            | 11 | A Hirsch-type index for journals | Braun, T., Glanzel, W., Schubert, A. | Scientist, 19, 22, p.8 | 2005 | 52 | h-index |
            | 12 | Is it possible to compare researchers with different scientific interests? | Batista, P.D., Campiteli, M.G., Kinouchi, O., Martinez, A.S. | Scientometrics, 68, 1, pp.179–89 | 2006 | 50 | Res eval |
            | 13 | The emergence of China as a leading nation in science | Zhou, P., Leydesdorff, L. | Research Policy, 35, 1, pp.83–104 | 2006 | 50 | Int comp’n |
            | 14 | An informetric model for the Hirsch-index | Egghe, L., Rousseau, R. | Scientometrics, 69, 1, pp.121–29 | 2006 | 47 | h-index |
            | 15 | A Hirsch-type index for journals | Braun, T., Glanzel, W., Schubert, A. | Scientometrics, 69, 1, pp.169–73 | 2006 | 46 | h-index |
            | 16 | CiteSpace II: detecting and visualizing emerging trends and transient patterns in scientific literature | Chen, C.M. | Journal of the American Society for Information Science and Technology, 57, 3, pp.359–77 | 2006 | 46 | Map’g |
            | 17 | Does the h index have predictive power? | Hirsch, J.E. | Proceedings of the National Academy of Sciences, 104, 49, pp.19193–98 | 2007 | 45 | h-index |
            Source: See Table 1.
            Table 7. Trends in the main themes of bibliometric HCPs
            | Theme | 1970–1979 | 1980–1989 | 1990–1994 | 1995–1999 | 2000–2004 | 2005–2009 |
            |-------|-----------|-----------|-----------|-----------|-----------|-----------|
            | General bibliometrics | 6 | 4 | 3 | 2 | 3 | |
            | Highly cited papers (HCPs) | 3 | | | | | |
            | Co-citation (& co-word) mapping | 3 | 3 | 6 | 3 | 2 | 2 |
            | Journal impact (JIF) | 3 | | 2 | 4 | 1 | 1 |
            | Research evaluation (incl. international comparisons) | 1 | 4 | | 3 | 2 | 4 |
            | Research collaboration (incl. co-authorship & networks) | 1 | | 5 | 2 | 5 | |
            | Patents (incl. patent citations & S&T links) | | 2 | 3 | 3 | 3 | |
            | Webometrics | | | | 2 | 5 | |
            | h-index etc. | | | | | | 10 |
            Source: Summary of material from final column of Tables 1–6.
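
            The counts in Table 7 can be reproduced mechanically from the theme labels in Tables 1–6; a minimal sketch of that tally, assuming each HCP record carries a ‘period’ and a ‘theme’ field, is:

```python
from collections import Counter

def theme_trends(hcps):
    """Count how often each theme appears among the HCPs of each period."""
    counts = {}
    for record in hcps:
        counts.setdefault(record["period"], Counter())[record["theme"]] += 1
    return counts

# e.g. theme_trends(hcps)["2005-2009"]["h-index"] would return 10 (cf. Table 7)
```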

            Do bibliometric HCPs provide evidence of a shift in the mode of knowledge production?

            A shift to research in the context of application?

            The analysis reported above indicates that during the 1970s the main focus of bibliometrics seems to have been on developing basic bibliometric tools, such as citation analysis (Garfield, 1970),5 the concepts and theory of bibliometric processes (Price, 1976), the identification of ‘citation classics’ (Garfield, 1977, 1979),6 and the use of co-citations to produce ‘maps’ of science (Small, 1973; Small and Griffith, 1974). Only limited attention was given to potential practical or policy applications of bibliometrics, such as the use of citations to evaluate the impact of journals for science policy purposes (Garfield, 1972) and Narin’s (1976) proposal for the development of bibliometrics for research evaluation and policy purposes. Over the 1980s, however, as can be seen from Table 7, interest started to grow in the policy applications of bibliometrics, and in particular its use in research evaluation, either in the form of institutional assessments (Martin and Irvine, 1983; Moed et al., 1985) or international comparisons (Schubert and Braun, 1986; Schubert et al., 1989).
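
            As an illustration of the technique underlying such ‘maps’ (a generic sketch, not code from any of the studies cited), co-citation analysis amounts to counting, for every pair of documents, how many later papers cite both:

```python
from collections import Counter
from itertools import combinations

def co_citation_counts(reference_lists):
    """Count, for each pair of cited documents, how many citing papers cite both.
    'reference_lists' is an iterable of reference lists, one per citing paper."""
    counts = Counter()
    for refs in reference_lists:
        for pair in combinations(sorted(set(refs)), 2):
            counts[pair] += 1
    return counts

# Highly co-cited pairs can then be clustered to produce 'maps' of specialties.
```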

            The 1990s witnessed emerging interest in other policy issues, such as research collaboration (e.g. Narin et al., 1991; Luukkonen et al., 1992; Katz and Martin, 1997),7 innovation and the associated interest in patent citations and science–technology links (e.g. Albert et al., 1991; Narin and Olivastro, 1992; Almeida, 1996; Narin et al., 1997; Harhoff et al., 1999), and international comparisons (e.g. Moed et al., 1995; May, 1997). Over the most recent decade, there was continuing interest in policy-related topics, such as research collaboration and networks (e.g. Newman, 2001a, 2001b, 2001c; Glanzel, 2001; Barabasi et al., 2002), innovation, including patent citations and science–technology links (e.g. McMillan et al., 2000; Meyer, 2000; Hicks et al., 2001), and international comparisons, including university rankings (e.g. van Leeuwen et al., 2001; King, 2004; Van Raan, 2005; Zhou and Leydesdorff, 2006). In short, the growing interest in such uses of bibliometrics for policy purposes would seem to suggest a shift from the primarily Mode 1 form of bibliometric research that dominated in the 1970s to bibliometric research being conducted more in the context of application in the last two decades.

            Increasing transdisciplinarity?

            What can we learn from the disciplinary backgrounds of HCP authors? Does this reveal anything about whether the degree of transdisciplinarity has been changing over time? During the 1970s (and indeed the previous decade, although we have not considered data for that period here),8 the pioneers migrated into the new area of bibliometrics from a variety of fields, such as library or information science (e.g. Garfield), science studies (e.g. Price, Small, Griffith, Mullins) and natural sciences (e.g. Moravcsik, Narin). In other words, as with any new or emerging field, knowledge, ideas, methods and perspectives were being drawn from various fields and the level of multidisciplinarity was relatively high in this early phase. Over the 1980s, in contrast, the authors of HCPs came mainly from within the developing bibliometric (or information science) community (e.g. Griffith, White, McCain, Schubert, Glanzel, Braun, Narin, Brookes, Garfield), and, as researchers began to integrate the inputs from different disciplines, the research shifted from being multidisciplinary to interdisciplinary. Nevertheless, there continued to be a number of new entrants (e.g. Martin, Irvine and Pavitt from science policy research, Van Raan from physics, MacRoberts and MacRoberts from biology, Culnan from management, and Callon from science studies).

            The situation changed somewhat during the 1990s, when there were a number of prominent immigrants from other fields, including Seglen (from biomedical research), Laband and Piette (economics), Redner (physics, econophysics), and May (mathematics, theoretical biology and complexity). This trend was accentuated during the first decade of the twenty-first century, when the most influential HCPs came from immigrants. These included Newman and Barabasi et al. (from physics and complexity) – with Newman’s analysis of collaboration networks as small worlds being extremely highly cited – King and Ho (both from chemistry), and Hirsch (again from physics). Indeed, the development by Hirsch of the h-index is arguably the most significant advance in bibliometrics in recent years. The h-index, like most of these other external contributions, although the subject of a certain amount of criticism, was nevertheless rapidly assimilated within the bibliometric community, sparking the development of related measures, such as the g-index (Egghe, 2006). This ability to take inputs from other disciplines, and not only integrate them but also then subject them to further transformation, would suggest that bibliometrics was undergoing a change in character from interdisciplinary to transdisciplinary research.
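
            As a concrete illustration of the indicators involved (again, a sketch rather than code from the study), the h-index and Egghe’s g-index can be computed from an author’s citation counts as follows:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each (Hirsch, 2005)."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """Largest g such that the g most cited papers together have at least
    g*g citations (Egghe, 2006); here g is capped at the number of papers."""
    total, g = 0, 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

# Example: h_index([10, 8, 5, 4, 3]) == 4 and g_index([10, 8, 5, 4, 3]) == 5
```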

            In addition, the fact that the most highly cited advances in the field over recent years have tended to come from outsiders might suggest that many in the established bibliometric community are now engaged in more incremental ‘normal science’ and hence are less likely to initiate radical advances in bibliometrics.9 However, another contributing factor may have been that the ISI Web of Science database was one of the few data sets available that was large enough and ‘clean’ enough to study complex social networks and small world phenomena. Those in the physics and computer science communities interested in such matters were thus attracted by the Web of Science database, bringing with them new insights and techniques (for example, from statistical mechanics) to investigate these topics. As in other such instances in the history of science, this influx of new ideas, perspectives and methods from other fields has subsequently resulted in a number of major advances in the field of bibliometrics.10

            In summary, the evidence from the backgrounds of HCP authors suggests that bibliometric research, like most new fields, was initially highly multidisciplinary in nature. During the 1980s, the various inputs from different disciplines began to be more integrated, marking a shift from multidisciplinarity to interdisciplinarity (see the earlier discussion of definitions and terminology). Then, over the last 15–20 years, as the ability and indeed the self-confidence of the bibliometric community to assimilate and then transform inputs from other disciplines has grown, a degree of transdisciplinarity has become apparent, thus providing some measure of support for the Mode 2 thesis of Gibbons et al. (1994).

            Growing institutional heterogeneity?

            What can we learn from the institutional affiliations of HCP authors? From the above analysis, it is apparent that the field has always been quite heterogeneous. Over the 1970s and 1980s, private firms, such as ISI (founded by Garfield and home to Small) and CHI (founded by Narin), were very prominent, as were universities and research institutes (e.g. ISSRU, the Information Science and Scientometrics Research Unit at the Library of the Hungarian Academy of Sciences). During the 1990s and the first decade of the twenty-first century, important HCPs came from university science departments and from government (e.g. successive UK chief scientists). However, from this analysis it is not clear whether bibliometrics is now exhibiting more institutional heterogeneity than it did in its early years.

            Indeed, the above analysis points to a weakness in the New Production of Knowledge argument. As Godin (1998) and others have observed, there have always been new fields emerging, initially multidisciplinary or interdisciplinary in character, as well as quite institutionally heterogeneous. Only later does such a field begin to settle down and mature, becoming more Mode 1-like.11 The efforts of those in the bibliometric community to move the field in the direction of a discipline (and more Mode 1 in nature) may thus be masking any tendency for the knowledge production to become more Mode 2-like.12

            Growing external accountability and quality control?

            As noted above in the sub-section on research in the context of application, the 1980s and 1990s saw growing interest in research evaluation as a theme of bibliometric publications. This suggests that, at least among the authors of HCPs, there was an increasing recognition that scientific research is now subject to external accountability, and that bibliometric indicators can play a significant role in providing such accountability.

            What do bibliometric HCPs reveal about scientific research more generally?

            The foregoing analysis of the themes of bibliometric HCPs, besides telling us something about the changing nature of bibliometric research, also gives us some clues about the changing nature of knowledge production in science more generally. In particular, the growing prevalence of highly cited bibliometric work on evaluation, patent citations and science–technology links suggests that scientific research itself is increasingly conducted in the context of application. Likewise, the bibliometric studies of collaboration provide evidence that scientific research in areas closely linked to technology is becoming more multi-, inter- or transdisciplinary, and more institutionally heterogeneous.

            Lastly, the shift towards policy-related issues exhibited by the authors of bibliometric HCPs almost certainly reflects the growing government emphasis on external accountability for all scientific research, with associated changes in the approach to quality assessment away from the former exclusive reliance on peer review. Moreover, by making available a growing range of tools for evaluating research, bibliometric researchers may well have encouraged the growing efforts by funding agencies and others to subject scientific research to more systematic assessment (as in the UK Research Assessment Exercises, for example). The increasing application of bibliometric and other performance indicators may, in turn, have resulted in changes in publication and citation practices, and perhaps also in what research is carried out and how, as authors seek to maximise their performance in terms of a particular indicator, a point to which we return below.

            Conclusions

            Following pioneering developments by such authors as Garfield and Price in the 1960s, bibliometrics started to emerge as a distinct research area during the 1970s. This was the decade when the term ‘bibliometric’ and related ones, such as ‘citation analysis’, first began to be used on a significant scale.13 By the 1980s, bibliometrics had become an established research field, with its own journal (Scientometrics, founded in 1978), conferences (e.g. the International Conferences on Scientometrics and Informetrics, the first of which was held in 1987, and the Leiden Conferences on Science and Technology Indicators, first held in 1988) and handbooks (e.g. van Raan, 1988). Reflecting this, most of the authors of HCPs appearing during the 1980s came from within the bibliometric community, and their interests were more internally motivated.14 Some were also driven, in part, by a desire to ensure that bibliometrics acquired some of the characteristics of a discipline (such as dedicated journals and conferences) – in other words, to make it more Mode 1-like. However, during this time, there were signs of a growing policy need (a demand pull) for evaluation, and the 1990s witnessed a significant shift to bibliometric research conducted in the context of application. During this decade, leading bibliometric researchers played a significant role in helping scientific research to respond to increased demands for external accountability, and in contributing to changes in the approach to the quality assessment of research (for example, in evaluations such as the Research Assessment Exercises and, more recently, the Research Excellence Framework in the UK, and parallel initiatives in Australia, Norway and elsewhere). In these respects, bibliometric HCPs provide some evidence of a change in the balance between Mode 1 and Mode 2 knowledge production, with a shift towards the latter.

            Another characteristic of Mode 2 knowledge production is greater transdisciplinarity. Here, the evidence from bibliometric HCPs is more complicated. Reflecting its status as a newly emerging field, bibliometric research was intrinsically multidisciplinary in its early years up to the 1970s, drawing upon inputs from a wide range of fields, but it seems to have become more self-contained and arguably more ‘interdisciplinary’ in the 1980s as the field matured and developed the ability not just to draw upon, but also to synthesise inputs from different disciplines. By the first decade of the twenty-first century, many of the top HCPs came from other disciplines, but these external inputs were quickly assimilated and then further transformed by the bibliometrics community, suggesting that there may have been a growing element of transdisciplinarity since the publication of The New Production of Knowledge. However, another possible interpretation is that, as the bibliometric research community has matured, it has also become more conservative and incremental in approach, leaving the field open to outsiders, such as Barabasi, Hirsch and Newman (perhaps attracted, in part, by the size and possibilities of the Web of Science database), to make the most influential advances of the last decade or so.

            The remaining characteristic of Mode 2 knowledge production is greater institutional heterogeneity than in Mode 1. We have seen how the field of bibliometrics was very heterogeneous at the start (as in any new research activity, researchers must inevitably come from outside as the field has yet to form), but it became markedly less so in the 1980s. More recently, however, new actors have again become involved. These include not only certain physicists who have chosen to study research performance (e.g. Hirsch) or networks (e.g. Barabasi, Newman), but also a number of computer and information scientists studying bibliometrics using new clustering algorithms and visualisation techniques (e.g. Borner, Boyack, Chen). While the former have tended to remain largely outside the discussions of the bibliometric community, the latter have become more actively integrated, participating in bibliometric conferences and publishing in bibliometric journals, and thereby linking bibliometrics more closely with information science.15 Overall, although there is little evidence that the field is now more institutionally heterogeneous than in the earliest period of its history, there has evidently been a shift in this direction from the 1990s onwards.
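
            To make more concrete the sort of individual performance indicator introduced by these newcomers, the h-index proposed by Hirsch is simply the largest number h such that a researcher has h publications each cited at least h times. The following minimal sketch (purely illustrative, and not drawn from any of the HCPs discussed in this paper) shows how such an indicator can be computed from a list of citation counts:

```python
def h_index(citation_counts):
    """Return the h-index: the largest h such that there are
    h publications each cited at least h times (Hirsch's indicator)."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical publication record, used only for illustration
print(h_index([45, 20, 11, 8, 6, 6, 3, 1, 0]))  # prints 6
```

            The very simplicity of indicators of this kind helps to explain both their appeal to research managers and funders and the scope they offer authors to adjust their publication and citation behaviour, a point returned to in the conclusions below.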

            From the subject matter of bibliometric HCPs, there is also evidence that scientific research more generally is increasingly conducted in the context of application, and is often transdisciplinary in nature, and perhaps also somewhat more institutionally heterogeneous. Certainly, the focus of bibliometric HCPs on policy-related issues, such as research evaluation, science–technology links and collaboration, would seem to reflect a growing need for scientific research to be subject to external accountability, with bibliometrics itself being part of the process of change in the approach to the quality assessment of scientific research.

            However, it is important to bear in mind certain limitations of the approach adopted in this study. First, it focussed on the top HCPs in each of the chosen periods. The themes of these elite publications may, or may not, reflect the themes (or the balance of themes) of the bulk of less cited publications. Secondly, as already remarked, whether a publication is highly cited may bear little relationship to its wider impact, whether on research policy or on the economy and society. To take one example, the article by Pinski and Narin (1976) is credited with playing a significant part in stimulating the extremely important development of Google by Larry Page (see Franceschet, 2011). However, as noted earlier, there is unfortunately no obvious indicator that one might use to assess such wider impact. Thirdly, only leading Web of Science journals were scanned comprehensively; books and other non-Web of Science publications were searched less systematically using keywords, while the scanning of the output of leading authors was inevitably not as comprehensive as one might have liked. Hence, the HCPs identified here should be regarded as being among the most highly cited publications in the field of bibliometrics, rather than as necessarily constituting a definitive list of the most highly cited. However, this is almost certainly sufficient for the subsequent analysis of themes and of trends in those themes.

            Fourthly, identifying the boundary between bibliometrics and library or information science (in order to exclude the latter) is inevitably somewhat subjective, as is the classification of HCPs by theme. Fifthly, the analysis reported here goes back only as far as 1970, and it is possible that the 1960s might show a somewhat different picture of the origins of bibliometrics. Lastly, although we have identified a number of characteristics that might be symptomatic of a shift towards Mode 2 knowledge production, we cannot exclude the possibility that there are other forces at work which result in the same, or at least broadly similar, consequences for the pattern of publication and citation as those observed here.16 For example, as we noted earlier, the influx of researchers from physics and computer science over the last decade or so (and the resulting increase in the apparent level of transdisciplinarity) may be an accidental consequence of the unique attractiveness of the Web of Science database to those seeking to investigate large complex social networks and small world phenomena.

            In conclusion, this study has provided some evidence that bibliometrics, as a field of research, has exhibited a shift towards Mode 2 knowledge production over the last two decades or so. In addition, bibliometrics would seem to have played a significant part in an overall shift towards Mode 2 knowledge production, contributing policy-relevant tools and analyses, helping scientific research to respond to increased demands for external accountability, and stimulating changes in the approach to the quality assessment of research. Indeed, it may also have inadvertently contributed to changes in publication and citation practices as authors choose their research topics, and decide how they write these up, whom they cite, and which journals to publish in, all with an eye very much to maximising their ‘score’ on one or more bibliometric indicators (Martin, 2011). As such, bibliometrics may have shifted to become (at least in part) an instrument of managerialism in universities and the higher education sector – in other words, an intrinsically Mode 2 activity.17 However, any shift from Mode 1 to Mode 2 involves a rather slow and gradual process of evolution. Moreover, as we have seen, the two modes are intertwined to such an extent that it may be some time before we can say with greater certainty that a transition from one to the other has indeed taken place.18

            Acknowledgements

            This paper was originally presented at the Riksbankens Jubileumsfond Symposium, ‘Changes of Science and Policy’, held at Noors Slott, Sweden on 16–18 October 2009. The author is grateful to participants at the symposium for their comments, and subsequently to Diana Hicks, Sylvan Katz, Stuart Macdonald, Ismael Rafols and Ed Steinmueller for their very helpful suggestions, many of which have been incorporated here. However, the usual disclaimers apply.

            References

            1. Albert, M. (2003) 'Universities and the market economy: the differential impact on knowledge production in sociology and economics', Higher Education, 45, 2, pp.147–182.

            2. Broadus, R. (1987) 'Early approaches to bibliometrics', Journal of the American Society for Information Science, 38, 2, pp.127–129.

            3. Choi, B. and Pak, A. (2006) 'Multidisciplinarity, interdisciplinarity and transdisciplinarity in health research, services, education and policy: 1. Definitions, objectives, and evidence of effectiveness', Clinical and Investigative Medicine, 29, 6, pp.351–364.

            4. Etzkowitz, H. and Leydesdorff, L. (2000) 'The dynamics of innovation: from national systems and Mode 2 to a triple helix of university–industry–government relations', Research Policy, 29, 2, pp.109–123.

            5. Ferlie, E. and Wood, M. (2003) 'Novel mode of knowledge production? Producers and consumers in health services research', Journal of Health Services Research & Policy, 8, Supplement 2, pp.51–57.

            6. Franceschet, M. (2011) 'PageRank: standing on the shoulders of giants', Communications of the ACM, 54, 6, pp.92–101.

            7. Gibbons, M., Limoges, C., Nowotny, H., Schwartzman, S., Scott, P. and Trow, M. (1994) The New Production of Knowledge: The Dynamics of Science and Research in Contemporary Societies, Sage, London.

            8. Godin, B. (1998) 'Writing performative history: the new New Atlantis?', Social Studies of Science, 28, 3, pp.465–483.

            9. Godin, B. (2006) 'On the origins of bibliometrics', Scientometrics, 68, 1, pp.109–133.

            10. Godin, B. and Gingras, Y. (2000) 'The place of universities in the system of knowledge production', Research Policy, 29, 2, pp.273–278.

            11. Gulbrandsen, M. and Langfeldt, L. (2004) 'In search of Mode 2: the nature of knowledge production in Norway', Minerva, 42, 3, pp.425–451.

            12. Harvey, J., Pettigrew, A. and Ferlie, E. (2002) 'The determinants of research group performance: towards Mode 2?', Journal of Management Studies, 39, 6, pp.747–774.

            13. Hemlin, S. and Rasmussen, S.B. (2006) 'The shift in academic quality control', Science, Technology & Human Values, 31, 2, pp.173–198.

            14. Hessels, L. and van Lente, H. (2008) 'Re-thinking new knowledge production: a literature review and a research agenda', Research Policy, 37, 4, pp.740–760.

            15. Hicks, D. and Katz, J. (1996) 'Where is science going?', Science, Technology & Human Values, 21, 4, pp.379–406.

            16. Huff, A. (2000) 'Changes in organisational knowledge production', Academy of Management Review, 25, 2, pp.288–293.

            17. Klein, J. (2010) 'A taxonomy of interdisciplinarity' in Frodeman, R., Klein, J. and Mitcham, C. (eds) The Oxford Handbook of Interdisciplinarity, Oxford University Press, Oxford, pp.15–30.

            18. Liu, N. and Cheng, Y. (2005) 'The academic ranking of world universities', Higher Education in Europe, 30, 2, pp.127–136.

            19. Martin, B. (2003) 'The changing social contract for science and the evolution of the university' in Geuna, A., Salter, A. and Steinmueller, W. (eds) Science and Innovation: Rethinking the Rationales for Funding and Governance, Edward Elgar, Cheltenham, pp.7–29.

            20. Martin, B. (2011) 'The Research Excellence Framework and the impact agenda: are we creating a Frankenstein monster?', Research Evaluation, 20, 3, pp.247–254.

            21. Martin, B. (2012a) 'The evolution of science policy and innovation studies', Research Policy (forthcoming).

            22. Martin, B. (2012b) 'Are universities and university research under threat? Towards an evolutionary model of university speciation', Cambridge Journal of Economics (forthcoming).

            23. Martin, B. and Etzkowitz, H. (2000) 'The origin and evolution of the university species', Journal for Science and Technology Studies (Tidskrift för Vetenskaps- och Teknikstudier, VEST), 13, 3–4, pp.9–34.

            24. Martin, B., Nightingale, P. and Yegros-Yegros, A. (2012) 'Science and Technology Studies: exploring the knowledge base', Research Policy (forthcoming).

            25. Morton, S. (2005) 'Academics and the Mode-2 society: shifts in knowledge production in the humanities and social sciences' in Bleiklie, I. and Henkel, M. (eds) Governing Knowledge: A Study of Continuity and Change in Higher Education – A Festschrift in Honour of Maurice Kogan, Higher Education Dynamics Volume 9, Springer, Dordrecht, pp.169–188.

            26. Nowotny, H. (2000) 'The production of knowledge beyond the academy and the market: a reply to Dominique Pestre', Science Technology Society, 5, 2, pp.183–194.

            27. Nowotny, H., Scott, P. and Gibbons, M. (2001) Re-Thinking Science: Knowledge and the Public in an Age of Uncertainty, Polity Press, Cambridge.

            28. Nowotny, H., Scott, P. and Gibbons, M. (2003) 'Mode 2 revisited: The New Production of Knowledge', Minerva, 41, 3, pp.179–194.

            29. Pestre, D. (2000) 'The production of knowledge between academies and markets: historical reading of the book The New Production of Knowledge', Science Technology Society, 5, 2, pp.169–181.

            30. Pestre, D. (2003) 'Regimes of knowledge production in society: towards a more political and social reading', Minerva, 41, 3, pp.245–261.

            31. Pettigrew, A. (1997) 'The double hurdles for management research' in Clark, T. (ed.) Advancement in Organizational Behaviour: Essays in Honour of Derek S. Pugh, Ashgate, Aldershot, pp.277–296.

            32. Pinski, G. and Narin, F. (1976) 'Citation influence for journal aggregates of scientific publications: theory, with application to the literature of physics', Information Processing and Management, 12, 5, pp.297–312.

            33. Porter, A. and Rafols, I. (2009) 'Is science becoming more interdisciplinary? Measuring and mapping six research fields over time', Scientometrics, 81, 3, pp.719–745.

            34. Rip, A. (2002) 'Science for the 21st century' in Tindemans, P., Verrijn-Stuart, A. and Visser, R. (eds) The Future of Science and the Humanities, Amsterdam University Press, Amsterdam, pp.99–148.

            35. Robson, B. and Shove, E. (eds) (1999) Interactions and Influence: Individuals and Institutions, summary report of six pilot studies commissioned by the ESRC, ESRC, Swindon.

            36. Shinn, T. (1999) 'Change or mutation? Reflections on the foundations of contemporary science', Social Science Information, 38, 1, pp.149–176.

            37. Shinn, T. (2002) 'The triple helix and new production of knowledge: prepackaged thinking on science and technology', Social Studies of Science, 32, 4, pp.599–614.

            38. van Raan, A. (ed.) (1988) Handbook of Quantitative Studies of Science and Technology, Elsevier, Amsterdam.

            39. Weingart, P. (1997) 'From finalization to Mode 2: old wine in new bottles?', Social Science Information, 36, 4, pp.591–613.

            Footnotes

            1. The book is included among the most influential contributions to the field of science policy and innovation studies identified in Martin (2012a), and also among those for science and technology studies (STS) examined in Martin et al. (2012).

            2. This is not the place to go into the extensive criticisms of The New Production of Knowledge and Mode 2 – for these, see, for example, Weingart (1997), Godin (1998), Shinn (1999, 2002), Etzkowitz and Leydesdorff (2000), Martin and Etzkowitz (2000), Pestre (2000, 2003), Rip (2002) and Martin (2003). Suffice it to say that one major element of the criticisms focussed on showing that Mode 2 with its various characteristics is not particularly new; indeed, in some respects Mode 2 can even be said to have predated Mode 1 (which emerged only in the second half of the twentieth century). Martin (2003, 2012b) argues that it is probably better to talk about shifts in the relative balance of Mode 1 and Mode 2 over time than about the emergence of a new mode of knowledge production. A response to the early criticisms can be found in Nowotny (2000).

            3. In a section entitled ‘Transdisciplinary publishing’, Hicks and Katz (1996, pp.387–88) looked at the proportion of papers published in journals spanning two or more disciplines, but whether such articles can be considered truly transdisciplinary in the sense of Gibbons et al. (1994) or as used here is debatable.

            4. See also the distinction between the three terms drawn by Choi and Pak (2006) in their examination of health research.

            5. Note that full references to bibliometric HCPs can be found in Tables 1–6. For this reason, they are not repeated in the list of references at the end of this paper.

            6. This is not to imply that the work of Garfield was ever particularly Mode 1 in nature; from the start his research interests were very much in the context of application – providing more efficient tools for literature searches and then commercialising these. (I am grateful to Stuart Macdonald for this point.)

            7. Although the authors cited here studied collaboration for policy-related purposes, there were others, such as those in the science and technology studies (STS) community, who were more interested in it from an internal or Mode 1 perspective. However, this work does not appear to have generated any particularly highly cited publications. Moreover, the substantial resources and specialist expertise required for such bibliometric work may have resulted in a growing focus on policy as the best strategy for raising the necessary resources to pursue such research. (I am indebted to Diana Hicks for this observation.)

            8. An analysis of the early history of bibliometrics can be found in Broadus (1987), while Godin (2006) goes even further back to examine the pre-history of the field, in particular the early work by psychologists.

            9. Although it does not apparently show up in the form of a very highly cited publication, another major contribution during recent years has been the Shanghai ranking of universities (see Liu and Cheng, 2005), again produced by a group external to the established bibliometric community (Ismael Rafols, private communication).

            10. Sylvan Katz (private communication).

            11. Ed Steinmueller (private communication).

            12. Ismael Rafols (private communication).

            13. This can be seen, for example, using the Google Ngram Viewer at http://ngrams.googlelabs.com/ with the terms ‘bibliometric’ and ‘citation analysis’.

            14. However, one exception, as noted earlier, was Garfield, whose research interests were more applied.

            15. Ismael Rafols (private communication).

            16. Ed Steinmueller (private communication).

            17. Stuart Macdonald (private communication).

            18. Sylvan Katz (private communication).

            Author and article information

            Journal: Prometheus: Critical Studies in Innovation, Pluto Journals, ISSN 0810-9028, 1470-1030
            Issue: Vol. 29, No. 4, December 2011, pp.455–479
            DOI: 10.1080/08109028.2011.643540
            Affiliation: (a) Science and Technology Policy Research (SPRU), University of Sussex, UK and Centre for Science and Policy (CSAP) and Centre for Business Research, Judge Business School, University of Cambridge, UK
            Copyright: Taylor & Francis Group, LLC

            All content is freely available without charge to users or their institutions. Users are allowed to read, download, copy, distribute, print, search, or link to the full texts of the articles in this journal without asking prior permission of the publisher or the author. Articles published in the journal are distributed under a Creative Commons Attribution 4.0 licence (http://creativecommons.org/licenses/by/4.0/).

