3

The scholarly ecosystem

Michael Jubb

Abstract:

This chapter examines some key trends in the funding of research, especially the funding of basic research by governments and other public bodies; and highlights the pressure to increase the contribution of research to innovation, economic performance and the fulfilment of social needs. It also examines where basic research is conducted in different countries, in universities and research institutes; how research careers are developed under the increasing pressure of competition; and how universities both support and manage researchers. Finally, it examines how research is conducted by both individuals and teams; how researchers both consume and create information in the course of their research; and how the research process is changing in different disciplines as a result of technological change.

Key words

Research

researchers

funding

competition

information

universities

technology

Introduction

Researchers are driven by a desire to enhance our knowledge and understanding of the world we inhabit: the end product of their work, if successful, is new knowledge, typically based on a combination of newly discovered or developed data and information, set alongside data and information drawn from the work of previous researchers. The result is greater understanding and an enhanced knowledge base.

Scholarly publishing fulfils a relatively small but critical role in the wider landscape of research across the globe, enabling researchers to communicate their findings. Publishing depends on the activities of researchers, both as producers and as consumers of scholarly content. But researchers in turn depend on a publishing infrastructure to enable them to communicate their findings effectively both to their fellow researchers and to wider communities. This chapter presents a picture of the nature and scale of the research landscape, delineates some key features and trends over the past few years, and considers some key aspects of the research process and how it is changing.

Funding of research by governments, business and other organisations

Across the 34 members of the Organisation for Economic Co-operation and Development (OECD), gross expenditure on research and development (R&D) amounted in 2008 to $964 billion.1 Roughly 35 per cent of investment in R&D takes place in North America, 31 per cent in Asia and 28 per cent in Europe; the rest of the world (Latin America and the Caribbean, Africa and the Middle East, and Oceania) together account for about 6 per cent. Expenditure has increased by over 60 per cent in real terms since the mid-1990s, and in major research countries has tended to exceed the rate of growth in gross domestic product (GDP). Thus in the US the average annual growth in R&D expenditure over the past 20 years has been 3.1 per cent in real terms, as compared with average growth in GDP of 2.8 per cent. The result is that across OECD countries as a group, R&D represents a growing proportion of the economy as a whole: R&D expenditure grew as a proportion of GDP from 1.9 per cent in 1981 to 2.3 per cent in 2008.

Of course, not all of this expenditure results in research findings and outputs of the kinds that are reported in scholarly books and journals. The business sector is the major source of funding for R&D among the members of the OECD, and the majority of those funds are devoted to ‘experimental development’: the development of products, processes or services. In the US, for example, development of this kind accounts for over 60 per cent of the total expenditure on R&D. The more fundamental ‘basic’ or ‘applied’ research that is reported in the scholarly literature thus represents just a part of the overall expenditure on R&D. Expenditure on basic research – that is, according to the definitions developed by the OECD, ‘experimental or theoretical work undertaken primarily to acquire new knowledge of the underlying foundation of phenomena and observable facts, without any particular application or use in view’ (OECD, 2002) – thus amounts in the US to 17 per cent and in France to 25 per cent of total expenditure on R&D (National Science Board, 2010).2

In China it accounts for about 5 per cent of activity (Ministry of Science and Technology of the People’s Republic of China, 2007). Governments fund a significant proportion of all recorded R&D expenditure; but they tend to be the major funders of basic and applied research. In the US, for example, the Federal Government accounted in 2008 for about a quarter of all R&D expenditure, but 57 per cent of the funding for basic research. And in the major research-producing countries, governments have tended over the past decade to increase their research budgets quite sharply. Thus in the UK the budgets for the Research Councils increased by 78 per cent in real terms between 1998 and 2008, while the block grant allocated to universities to support their research activities increased by 62 per cent.3 In the US, the Federal budget for basic research rose by 20 per cent in real terms between 2000 and 2008, although the sharpest increases were in the early part of the decade (National Science Board, 2010, Appendix Table 4–18). Such sharp increases have become less common since 2008, but it is notable that the economic stimulus package enacted in early 2009 through the American Recovery and Reinvestment Act provided a considerable one-off increase in the Federal R&D budget of over $18 billion.

Governments have been prepared to increase expenditure in this way because they have believed it necessary for economic success. The UK was typical in adopting the kind of strategy announced in its Science and Innovation Investment Framework, published in 2004 (HM Treasury et al., 2004).4 This announced that: ‘For the UK economy to succeed in generating growth through productivity and employment in the coming decade, it must invest more strongly than in the past in its knowledge base, and translate this knowledge more effectively into business and public service innovation.’

The new strategy promised to make good past under-investment in the science base, and to raise science spending faster than the trend rate of growth of the economy to achieve that end. But the investment was for a purpose, and brought with it a renewed emphasis on the linkages between research and innovation, and on translating the results of research into tangible outcomes for the benefit of society and the economy. ‘Knowledge transfer’ and working collaboratively with business were key themes in this strategy, which was accompanied by the development of targets and performance indicators, along with periodic reviews to track performance and progress. Similar themes, in remarkably similar language, have been repeated in the funding and policy papers issued by the Coalition Government elected in the UK in 2010 (Department for Business Innovation and Skills, 2010):

‘Our world-class science and research base is inherently valuable, as well as critical to promoting economic growth. Investment in science and research creates new businesses and improves existing ones; brings highly skilled people into the job market; attracts international investment and improves public policy and services. The UK’s world-class research base will be a key driver in promoting economic growth.’

None of this is unique to the UK. The OECD’s Ministerial Committee for Scientific and Technological Policy identified in 2004 the pressure for publicly funded research ‘to increase its contribution to innovation, economic performance and the fulfilment of social needs’ (OECD, 2004, 2009). In Japan, the Science and Technology Agency has made investment in research a foundation for its economic strategy,5 and similar points were made in the review of Australia’s innovation system (Cutler and Co., 2008).

The OECD Ministerial Committee also noted, however, that governments were wrestling with questions of ‘how best to restructure and reform public research organisations to improve their contributions to social and economic problems without sacrificing the objectivity and independence of their advice and their ability to pursue curiosity-based research’. More recently, the European Union has noted the need to address: ‘both a competitiveness challenge (closing Europe’s gap in innovation) and a cultural challenge (integrating research and innovation to focus on societal challenges)’ (European Union 2011, p. 1).

The struggle to balance these different kinds of goals continues and is reflected in the strategic aims and objectives of major funding bodies. The Higher Education Funding Council for England (HEFCE), which provides block grants to universities to support their research activities, for example, defines its aim in this area as: ‘to develop and sustain a dynamic and internationally competitive research sector that makes a major contribution to economic prosperity and national wellbeing and to the expansion and dissemination of knowledge.’6

In China, the Law on Science and Technology Progress makes repeated mention of the role of science and technology in ‘economic construction and social development’.7 Funding bodies that provide project-based rather than block grants to support research typically make the link even more explicit. Thus, the US National Institutes of Health (NIH) defines its mission as: ‘to seek fundamental knowledge about the nature and behavior of living systems and the application of that knowledge to enhance health, lengthen life, and reduce the burdens of illness and disability.’8

In similar vein, the Australian National Health and Medical Research Council, like similar funding bodies in other countries, has put an increasing emphasis on translational outcomes of medical and health research; while the more broadly based Australian Research Council talks of ‘capturing and quantifying the outcomes of research and knowledge transfer and the contribution of research to the economic, social, cultural and environmental well-being of Australians.’9

This focus on research as a key underpinning of economic performance and social well-being means that Government funding of scientific research – in both research institutes and universities – has shown an increasing tendency to be based on performance criteria; and in countries such as the UK and Australia, this has been accompanied by large-scale research assessment exercises undertaken by national agencies. The impact on universities will be considered in the next section.

While science policy in many countries is based on the belief that investment in research is intimately tied to economic growth, evidence about the nature and scope of the linkages is far from conclusive. This is not surprising, as the relationship between science and innovation is non-linear and complex, and outcomes may differ substantially between countries and disciplines. We do not fully understand the mechanisms through which investments in R&D − still less investments in basic research − and their immediate results, in the form of new knowledge or technologies, interact with other features of societies and economies to produce innovation and growth. Impacts often come after considerable time-lags, and are difficult to identify and analyse. Hence there is renewed interest in developing capacity and capability to assess the impact of research through programmes such as the National Science Foundation’s (NSF) Science of Science and Innovation Policy (SciSIP) programme in the US.10 The challenges faced by such programmes are, however, formidable.

Research and researchers

There were some 4.2 million researchers in the OECD countries in 2007, an increase of 48 per cent since 1995. The rate of increase has been even sharper, of course, in some emerging research countries, and on some measures China now has more researchers than the US.

Research and where it is conducted

In all the top ten research nations, the business sector is the largest performer of R&D, accounting for expenditure ranging from 78 per cent in Japan and 72 per cent in China and the US to 49 per cent in Italy. But most basic research − on average more than three-quarters in the OECD − is undertaken in universities and Government research institutes. There are significant differences between countries, however, in how the research base is organised, and where researchers are located; and the balance between universities and national research institutes varies hugely. Thus in the Russian Federation, around 90 per cent of basic research is conducted in research institutes; while in the Nordic countries around three-quarters and in the US nearly three-fifths is conducted in universities. Nevertheless, in the US, intramural R&D performed in agencies and laboratories of the federal Government, and in federally funded research and development centres accounts for over 10 per cent of all R&D, and a slightly higher proportion of basic research (National Science Board, 2010, Chapter 4).

In some European countries, recent developments may tend to shift more research activity towards the university sector. In Russia, for example, new federal universities have been established by merging existing universities and research institutions (Kuznetsov and Razumova, 2011). And in China there is a clear shift away from the previous dominance of the Chinese Academy of Sciences and its institutes towards the funding of researchers in universities.11

There are similar differences in the location of researchers. In the UK, there is a relatively small number of research institutes run centrally by the Research Councils, but 73 per cent of researchers are employed in universities, and only 3 per cent in Government departments or institutes. In Germany, on the other hand, with its well-developed infrastructure of institutes run under the auspices of the Fraunhofer Gesellschaft, the Helmholtz Association, the Leibniz Association and the Max Planck Society,12 44 per cent of researchers are located in universities, and 12 per cent in the Government sector.

The proportions of university-based research funded by Government on the one hand, and by business or other organisations on the other, also differ significantly across the OECD. In the major research nations, Government is the largest source of funding for university research, with proportions ranging from 52 per cent in Japan to 65 per cent in the US, and 80 per cent in Germany and Switzerland. Business support for university-based R&D, on the other hand, averages only around 6 per cent across the OECD; but that average conceals a range from 35 per cent in China, and 31 per cent in Russia, to between 1 and 3 per cent in Japan, France and Italy. In the US, it is about 6 per cent. The balance is made up from other sources, including non-profit organisations, funds from overseas and funds controlled by universities themselves. In the US, for example, over 20 per cent of the funds to support research in universities are provided by the universities themselves.

Research careers

Research careers typically begin in universities with study for a doctorate. In the major research nations, research funders and universities devote considerable resources to the support of doctoral students, although the mechanisms and forms of support differ significantly. Doctoral education is increasingly seen as a commodity with measurable economic value. Thus in the US, the NIH and the NSF support large numbers of doctoral students as research assistants funded through grants to universities for academic research. In the UK, the Research Councils earmark doctoral training grants to universities to enable students to carry out a doctoral-level research project together with taught coursework. Both countries seek to attract doctoral students from other countries, and in the US, for example, foreign students on temporary visas account for more than half the doctoral degrees awarded in several disciplines, including engineering, physics, mathematics, computer science and economics (National Science Board, 2010, Chapter 2). Such students are attractive to universities and funders not least because many of them choose to stay once they have gained their doctorate, and pursue their careers as highly qualified researchers in their new country: more than three-quarters of foreign recipients of US doctorates plan to stay in the US.

The US, Germany and the UK are the major producers of doctoral award-holders globally, accounting for nearly half the total among OECD countries.13 A key concern in many countries is the success rate for those entering doctoral programmes, and the time taken to complete a doctoral degree. A recent study in the US indicated that the 10-year completion rate across 30 institutions averaged 56.6 per cent, with a range across disciplines from 41.5 per cent in computer and information sciences to 77.6 per cent in civil engineering (Council of Graduate Schools, 2008). In the UK, the Research Councils as well as individual universities have made it a priority over the past decade to increase completion rates, which for Research Council-supported students are now typically over 80 per cent after four years.14

The next career stage for many researchers is a postdoctoral appointment, the number of which has tended to increase in many countries. Although it is difficult to get precise numbers, this growth has become a major issue of concern in both the US and the UK. Increases in competition for permanent academic posts, and the growth of collaborative research in large teams, have led to large numbers of highly qualified researchers working on short-term contracts with relatively low pay and few benefits, thus delaying the start of their independent careers either within or outside the research community. In the US, about a quarter of those who gain doctorates secure a tenure-track position within four to six years of the award. In the UK, it is estimated that 30 per cent of science PhDs go on to postdoctoral positions, but only about 12 per cent of those go on to secure a permanent academic position (Royal Society, 2010). The problem is that while the number of postdoc posts has increased at the base of the career pyramid, the number of permanent and more senior posts has not increased to anything like the same extent. In the US, for example, while the number of full-time tenured faculty increased by a third between 1979 and 2006, their proportion of the academic workforce fell from 69 to 62 per cent; over the same period, the proportion of postdocs rose from 4 to 9 per cent (National Science Board, 2010, Chapter 2). This position has been made worse by the economic recession, which has made many universities reluctant to appoint new permanent staff. Policy-makers in the US, the UK, Sweden and other countries are struggling to find solutions to what has become an acute problem for the university sector in many countries.

Funding, competition and assessment

Competition for posts and for funding is indeed an increasingly prominent feature of the research landscape, especially in universities. Success in winning research income is a key performance indicator for many universities. It features, alongside bibliometric measures such as numbers of papers published and citations to those papers, in the calculation of the various global league tables produced by the Shanghai Jiao Tong University, Times Higher Education, Quacquarelli Symonds and others, as well as in similar exercises in individual countries. Research income typically comes in two forms: project funding, which is awarded, on the basis of a project specification, to an individual or group to undertake a specific research project that is limited in scope, time and budget; and institutional funding, often in the form of an annual block grant, which is provided to an institution to support its general research activities, without identifying specific projects or activities to be undertaken.

In Germany, for instance, the Federal Government − through the Ministry of Education and Research − and the Länder (the 16 states which form the Federal Republic) provide institutional funding to some 750 research establishments, including universities. Both the Federal Government and the Länder also provide direct funding for specific projects within a framework of programmes in particular areas of research, as well as to basic research projects independent of any programme. They also provide the majority of funds for the German Research Foundation (DFG), whose chief task is to select the best research projects by scientists and academics at universities and research institutions on a competitive basis and to finance those projects.15

In the UK, universities receive block grants to support their research activities through the Higher Education Funding Councils for England, Scotland, Wales and Northern Ireland, respectively. The amount that each university receives is determined primarily by the results of the Research Assessment Exercise (RAE), which takes place every few years; the most recent was in 2008, and the next, under the new title of Research Excellence Framework (REF), will take place in 2013. The key outcome of the process is a rating of the quality of the work produced, in the period between successive exercises, by the researchers entered into some 70 subject-based ‘units of assessment’. These ratings, along with associated volume measures, constitute the basis of a formula which determines the amount of block grant that universities receive each year to support their research activities (not their teaching) until the next exercise is undertaken.
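The precise formula and its weightings are set by the funding councils and have changed over time; the following is only an illustrative sketch of the general quality-weighted-volume idea, with all symbols hypothetical rather than drawn from HEFCE’s published method:

\[
G_u \;\propto\; \sum_{s} c_s \sum_{q} w_q \, V_{u,s,q}
\]

where \(G_u\) is the research block grant for university \(u\), \(V_{u,s,q}\) is the volume of research activity (for example, research-active staff) submitted by that university in unit of assessment \(s\) and rated at quality level \(q\), \(w_q\) is a weight that rises steeply with quality (so that the highest-rated work attracts most of the funding), and \(c_s\) is a subject cost weight reflecting, for instance, the higher cost of laboratory-based research.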

UK universities also bid for project-based grants and doctoral studentship awards from the Research Councils. Here one of the main concerns, for the Councils as well as universities, has been the low success rates for applicants, and what might be done to ameliorate the position when success rates fall to around 20 per cent16 or lower. From the Councils’ perspective, the problem is that they receive, and have to assess through a costly peer-review process, many more high-quality applications than they are able to support. From the universities’ perspective, the problem is the time and effort involved in preparing and submitting many more applications than are likely to receive funding. Similar problems have arisen in Australia, where success rates for some Australian Research Council awards have fallen well below 20 per cent.17

In the US, the Federal Government does not provide support for research in universities in the form of a general block grant, although state funding probably does support some research activity at public universities. The Federal Government channels all its funding for university research through agencies such as the NIH (by far the biggest provider), the NSF and the Department of Energy, in the form of support for specific, individually costed projects. This means that the Government and its agencies are more directly involved than in many other countries in determining which projects will be supported from public funds; but it avoids the need for large-scale assessment exercises such as those in the UK. It is also noticeable that, as in other countries, funding tends to concentrate on a relatively small number of universities: in 2008 the top 20 universities accounted for 30 per cent of all university expenditure on R&D, a proportion that has remained stable for the past two decades (National Science Board, 2010, Chapter 5).

Managing and supporting research

Universities thus face increasing pressure to manage as well as support the activities and performance of their researchers. At the highest level, this means that most universities have established mechanisms for developing a clearly articulated research strategy, typically with the active involvement of a Research Committee; and that a senior member of the university’s management team (in the UK, a Pro Vice Chancellor or Deputy Principal) is responsible together with that committee for monitoring, reviewing and delivering that strategy and its key objectives.

But the increasingly complex and competitive research landscape requires structures of support and management below that strategic level. Institutional funding for research is accompanied by increasing demands from governments for scrutiny and for the demonstration of quality, value-added outputs and impact in return for taxpayer money. And the ways in which research grants and contracts are bid for and won mean that developing and sustaining a university’s or a department’s research portfolio is not straightforward. Moreover, success in securing research funding brings with it a range of obligations: grants and contracts are often tied to tightly specified milestones and outputs, rigorously monitored and heavily audited. Together these developments have created a need to support and manage research portfolios more closely. Activities that in earlier years were left to researchers themselves are now more closely linked to the university’s strategic objectives; and dedicated support services have been established to operate at the interface between researchers and corporate management.

Such developments have led some to detect a strengthening of hierarchical management in universities, and a weakening of academic self-management. There is thus talk of a ‘paradigm shift’ in authority relationships, with university managers experiencing both pressure to improve research performance and more opportunities to manage research activity. But the inherent uncertainty of scientific research, and the diversity of research cultures and peer-group judgements, impose limits on managers’ ability to control. Research teams retain considerable autonomy over how they conduct their research, and research strategies and priorities are informed by the judgements of key members of the research community. Moreover, there is considerable debate over the appropriate mechanisms for research management, and over measures for assessing performance; the literature on performance indicators for research is extensive (Whitley et al., 2010).

Research support units within universities vary hugely in size and shape, and in how they are organised: some are highly centralised, while others operate more as teams distributed across departments and faculties (Green and Langley, 2009). They undertake a wide range of functions including identifying and publicising research funding opportunities; supporting researchers in developing project proposals, especially on matters such as costing; negotiating contracts with external sponsors; project management and financial control; monitoring and reporting to funders; and knowledge transfer, commercialisation and dissemination. Some universities have set up internal research assessment processes and mechanisms for internal peer review of project proposals before they are submitted.

As part of these developments, many universities are developing and implementing current research information systems (CRISs), which bring together data and information about the research projects that have been and are being undertaken across the institution; about funding agencies, programmes, funding schemes and forthcoming calls; about proposals submitted to or under consideration by those agencies; about research results in the form of publications, patents and other intellectual property; and about the performance of individuals, groups and departments. For universities, a CRIS provides a tool to assist in policy-making, evaluating research on the basis of its outputs, documenting research activities and outputs, planning projects, and maintaining a formal log of research in progress.
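At their core, such systems link a small number of record types. The sketch below illustrates the kinds of linked records a CRIS might hold; the entity and field names are hypothetical, chosen for illustration rather than taken from any particular CRIS product or from the CERIF standard that many of them use.

```python
# Illustrative sketch only: entity and field names are hypothetical.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Researcher:
    name: str
    department: str
    orcid: Optional[str] = None      # persistent identifier, if known

@dataclass
class FundingCall:
    funder: str                      # e.g. a research council or charity
    programme: str
    deadline: str                    # ISO date, kept as a string for simplicity

@dataclass
class Proposal:
    title: str
    call: FundingCall
    investigators: List[Researcher]
    status: str = "submitted"        # draft / submitted / awarded / rejected

@dataclass
class Output:
    kind: str                        # journal article, patent, dataset, ...
    title: str
    year: int

@dataclass
class Project:
    title: str
    funder: str
    team: List[Researcher]
    outputs: List[Output] = field(default_factory=list)
```

Linking records in this way is what allows a department’s live grants, pending proposals and published outputs to be reported together, whether for internal review, for funders or for external assessment exercises.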

Not all researchers welcome these kinds of developments, or share the kind of view of the world they represent: many regard the requirement to submit information about their work as a burdensome chore, and resent the monitoring and management of their activities and performance as an unwelcome intrusion. Most researchers are focused on research, not the ancillary things that surround it. Many of them regard budgets, standards, regulatory requirements, financial and progress reporting and so on as at best necessary evils and at worst bureaucratic obstacles that get in the way of their work. Seen from this perspective, research support services can provide a very valuable service in helping to overcome such obstacles. Any requirements that add to the obstacles by introducing new tasks or burdens tend, on the other hand, to be resented.

The more effective and user-focused research support offices, and the more sophisticated CRISs, however, can provide useful support to researchers by alerting them to funding opportunities, locating new contacts and networks, supporting them in developing research proposals as well as reports to funders, providing links to scholarly publications and other outputs, and so on. The key point is to ensure that researchers perceive real benefits for a relatively small amount of input effort.

Researchers and the research process

To create the new knowledge which is the objective of research, researchers need to obtain data. Some of this comes from primary sources (experimentation, observation, archival documents, interviews and so on, depending on the subject matter and the nature of the research), but much is derived from the existing knowledge base. Thus during the course of their work, researchers are both producers and consumers of data and information resources.

Many different models of the research process have been devised, but there is common agreement on the key stages, which involve defining a research question and then:

• identifying existing knowledge which is relevant to the question;

• accessing, analysing and evaluating existing data and knowledge;

• designing a methodology and a process for generating new data and knowledge;

• writing a research proposal and submitting it to a funding body;

• collecting or creating new data and analysing it;

• combining old and new knowledge to answer the research question and to enhance understanding;

• reporting and disseminating the outcomes of the research in a form which is both sustainable and retrievable.

The process is not linear, of course; and it typically involves a number of loops backwards and forwards – as well as stops and starts – between the defining of a research question and the dissemination of the results of a project that seeks to answer it. Moreover, the activities and the detail of the processes involved in information discovery, data collection, processing, analysis, information management, access and dissemination vary hugely across different subjects, disciplines, and individual researchers and research groups. Similar kinds of activities are configured together in very different ways; and the practices of individual groups often involve multiple information cycles leading to intermediate outputs (tools, methodologies, half-processed data and so on) which then form the inputs for other cycles of activity.

Individuals and teams

In the sciences, most current research is conducted by groups or teams of researchers: relatively few researchers undertake research projects as individuals without the support of a team. In the social sciences and the humanities, individual research remains widespread, but team-based research is becoming increasingly common too. There is no typical research group or team: they vary in size, structure and scope, as well as in the roles that individual members perform. But it is common to have a principal investigator (PI) working with a range of colleagues who may include junior and senior tenured staff alongside postdocs and doctoral students, as well as technicians. Larger groups may include lab or project managers.

Individual members of research groups typically have specific roles within the group – often according to their level of seniority – and their activities may vary sharply as a result.

PIs, for example, often combine research with a range of other teaching, management and administrative responsibilities; they may be involved in developing and running a wide variety of research and related activities, involving relationships with a diverse range of people and organisations, including local colleagues, national and international peers, and funding and regulatory bodies. They are often responsible for preparing new research proposals as well as reporting to funders on completed projects, knowledge transfer activities and so on.

Postdocs are often the key team members who devote their efforts mainly and directly to the research activity of their group. In some cases, where they have developed the necessary expertise, they may also provide specialist support in areas such as statistics and modelling. Doctoral students’ work tends to be even more tightly focused on a specific aspect of the group’s research. They tend to have a relatively narrow set of relationships primarily but not exclusively with local collaborators, and their information-handling activities are often less complex than those of other members of the team. Similarly, technicians’ activities tend to focus around the functioning of experimental equipment and protocols. Lab and/or project managers usually focus on managing meetings, meeting reports, staff project reports, visitors’ agendas and so on, and sometimes maintaining the group’s website.

Researchers as information consumers

Although roles and practices vary considerably within and between groups, and across subjects and disciplines, at key stages in the research lifecycle all researchers are information consumers. Nowadays they discover and gain access to the information they need predominantly through web-based resources, including generic search engines such as Google as well as specialist bibliographic search and retrieval tools such as PubMed; online publications; and dedicated websites that they trust as authoritative. Most scientists now use the physical library relatively little; and even in the humanities, researchers are increasingly finding the resources they need online.

Many studies have noted (see, for example, Research Information Network, 2009a) that researchers appear to have a limited awareness of the range of information services and resources available to them, and that the number they use seems surprisingly small. They also show loyalty to particular resources or services that they like or trust and find easy to use. A fundamental reason for this narrow, often opportunistic, choice of information tools and resources is that researchers lack the time to review the whole information landscape, or to learn how to exploit a wider range of resources and services to best effect. They are more likely to supplement their search strategies by seeking advice from colleagues (scientists more than information professionals) as to the most appropriate and useful sources of information.

In the UK and the US at least, articles in scholarly journals are the single most important kind of information resource for researchers, alongside conference proceedings in some disciplines such as engineering. And usage is increasing: surveys in the US indicate that the average number of articles read by a researcher each year rose from around 150 in 1977 to around 280 in 2005 (Tenopir et al., 2009); and usage of e-journals in UK universities has been increasing at an annual rate of over 20 per cent (Research Information Network, 2009b). But there is an increasing array of other kinds of information, including protocols, techniques, standard operating procedures (SOPs), technical product information, reference works and databases, which researchers need to access during the course of their research. Social media are also becoming increasingly important for some researchers, although evidence of take-up across the research community as a whole is as yet slight, probably because there is not yet the critical mass of users needed to make such services worthwhile for the purposes of enhancing research.

Generating and analysing data

Many areas of research are now characterised by the generation of volumes of data that were scarcely imaginable even a few years ago. Low-cost sensors mean that environmental scientists, engineers and other researchers can deploy a range of instruments over long periods to generate data relevant to their work around the clock. Synchrotrons, telescopes, satellites, lasers and other instruments can generate terabytes (10¹² bytes) of data every day; and in the life sciences, high-throughput gene sequencing platforms can produce similar volumes of data in a single experiment. Projects that generate tens of terabytes of data are now common; and at the largest end of the scale, the Large Hadron Collider will produce tens of petabytes (10¹⁵ bytes) of data a year. These developments have given rise to talk of a data deluge, and more recently of a new ‘fourth paradigm’ for research: following the moves from empirical to theoretical to computational science, it is suggested that we now need to think in terms of data-intensive science (Microsoft, 2009).

The data come in many different formats, shapes and sizes: observational, experimental, surveys, models and simulations; and from large international projects down to the work of small teams and individuals. Handling all the data requires new ways of working. E-science and cyberinfrastructure initiatives in the UK, the US and other countries have involved developing the capability and capacity to undertake computationally intensive science in distributed network environments, and projects using very large datasets that require grid computing. The complexity of the software and infrastructural requirements mean that many e-science projects involve large teams, and heavy investment in the processes for curating and analysing the data.

Even beyond the bounds of formally designated ‘e-science’, technological advance has brought fundamental change not only in the methods of scientific research but also in the infrastructure necessary to conduct it. Many researchers face increasing complexity in preparing, managing and analysing data; and as datasets grow larger, established data management techniques and off-the-shelf analysis tools are no longer adequate. In many areas of the life sciences, for example, researchers may need to draw on a wide range of genomics data, as well as data from other disciplines such as chemistry and clinical medicine, or from multi-scale mathematical models. The problem is that the datasets that researchers create and/or wish to use are often only partly connected, and incompatible in format, so that both discovery and integration are significant challenges.

There is thus increasing interest in the need for and development of schemas and ontologies to facilitate the indexing, cross-searching, aggregation and integration of data. There is also an increasing demand for workflows that provide systematic and automated means of undertaking analyses across a range of diverse datasets and applications; and also of capturing the process so that the method and the results can be reviewed, validated, repeated and adapted (see, for example, Taylor et al., 2007). Workflows configured in this way can co-ordinate multiple tasks, such as running a program, submitting a query to a database, submitting a task to a cloud or grid, or summoning a service over the web to use a remote resource. Outputs from one task constitute inputs to subsequent tasks.
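To make the idea concrete, the sketch below shows a minimal workflow in which each step’s output becomes the next step’s input, and each run is logged so that the analysis can be reviewed, validated and repeated. The function and file names are hypothetical and the steps are trivial stand-ins; the sketch is not drawn from any particular workflow system.

```python
# A minimal workflow sketch: fetch -> clean -> summarise, with provenance logging.
# All names and steps are illustrative stand-ins, not a real pipeline.
import json
from datetime import datetime, timezone

def fetch_records(source_path):
    """Read raw records from a local JSON file standing in for a remote query."""
    with open(source_path) as f:
        return json.load(f)

def clean(records):
    """Drop records with missing values; a stand-in for a real cleaning step."""
    return [r for r in records if all(v is not None for v in r.values())]

def summarise(records):
    """Compute a trivial summary; a stand-in for the real analysis."""
    return {"count": len(records)}

def run_workflow(source_path, log_path="workflow_log.json"):
    provenance = {"started": datetime.now(timezone.utc).isoformat(), "steps": []}

    data = fetch_records(source_path)            # output of this step...
    provenance["steps"].append({"step": "fetch", "records": len(data)})

    data = clean(data)                           # ...is the input to the next
    provenance["steps"].append({"step": "clean", "records": len(data)})

    result = summarise(data)
    provenance["steps"].append({"step": "summarise", "result": result})

    # The log captures the method as well as the results, so the run can be
    # reviewed, repeated or adapted later.
    with open(log_path, "w") as f:
        json.dump(provenance, f, indent=2)
    return result
```

Real workflow systems add to this pattern the ability to dispatch individual tasks to databases, grids, clouds or remote web services, and to re-run or adapt the captured workflow.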

Finally, there is increasing interest in the sharing of data across organisations and disciplines, and in making it more generally available as a key output of the research process. Many funding bodies now require applicants to submit data management plans as an integral part of their project proposals, and include a requirement to make data available to others. They see this as part of their commitment to getting best value for the funds they invest, not least by reinforcing open scientific enquiry and stimulating new investigations and analyses. But they also recognise that different fields of study require different approaches, and that what is sensible in one scientific or technological area may not work in others; and that there is thus a need to determine standards and best practice, as well as encouraging the development of scientific cultures in which data sharing is embedded.18

There is also broad recognition of the need to develop an infrastructure (Research Councils UK, 2010) of facilities that link multiple laboratories and observatories, instruments, users and data; and especially large databases and services that allow researchers to ‘bring their computations to the data, rather than the data to the computations’. And developing new capacities and skills in the research base is an essential part of building that infrastructure. New kinds of science require close collaboration between scientists from different domains, and with computer scientists and technologists, to provide new ways of conducting research through high-quality data acquisition, simplified large-scale data management, powerful data modelling and mining, and effective sharing and visualisation.

Communicating research

Researchers publish and disseminate their work in many different ways: through formal publication in books and in learned and professional society journals; through conferences and their proceedings; and through a variety of less formal means, now including social media. The choices researchers make are underpinned by a number of interrelated motives beyond the simple desire to pass on their findings to those who may be interested in them. These motivations include the desire not only to maximise dissemination to a target audience, but to register their claim to the work they have done, and to gain peer esteem and the rewards that may flow from that.

In deciding when, where and how to communicate their work, researchers may have to make choices between speedy dissemination to a desired audience and less speedy publication in a high-status journal. Such choices are made more complex because researchers know that publications serve not only as a means of communication (Research Information Network, 2009c) but also as indicators of quality or impact that can be monitored and measured, in the academic world and more widely. Articles in scholarly journals dominate all other forms of publication, partly because they are more easily ranked and measured using a series of readily available and increasingly sophisticated metrics. But many researchers feel uncomfortable with the dominance of the article – particularly the article published in a high-status journal. They are concerned that communications through other channels – including those that are better suited to applied or practice-based research, and to communicating with non-academic audiences – seem to have low status and prestige in the academic world. There are also concerns about the effect on researcher behaviour in areas such as policy development, which may have a significant social or political impact but do not feature much in research performance metrics. The introduction of impact assessments in the forthcoming Research Excellence Framework in the UK is an attempt to address such concerns.

The only major exceptions to the dominance of the journal article are the continuing high status attached to monographs and edited volumes in the humanities, and to practice-based outputs in the arts. Yet even in the humanities journal articles are now by far the largest publication format by volume, and there are increasing concerns about the decline of the book, attributed variously to shrinking library purchase budgets, publishers’ reluctance to publish them and the pressures of research assessment regimes.

Increasing numbers of researchers in all disciplines are also making at least occasional use of one or more social media for communicating their work; for developing and sustaining networks and collaborations; or for finding out about what others are doing. But frequent or intensive use is rare, and some researchers regard blogs, wikis and other novel forms of communication as a waste of time or even dangerous. Moreover, most who use social media do not see them as comparable to or substitutes for other channels and means of communication, but as having their own distinctive role for specific purposes and at particular stages of research. And frequent use of one kind of tool does not imply frequent use of others as well. Current levels of take-up are therefore relatively low, even though attitudes towards social media are broadly supportive (Research Information Network, 2010; CIBER, 2010).

Commonalities and differences

As noted at several points above, although it is possible to present a generic picture of the research process, practice varies not only across disciplines but also between different groups operating in different institutional settings within similar research fields. Disciplinary cultures have a powerful influence on practice. But so also do other factors, including access to funding; the size of the group or project; the volumes of data being handled; the complexity or heterogeneity of those data; the complexity of the analysis or computation required; and the nature and scale of any collaboration across disciplinary, institutional or national boundaries.

In the life sciences, for example, large-scale proteomic or genomic research programmes are characterised by high-volume sharing of largely standardised (and thus homogeneous) data. Systems biology, which attempts to pull together diverse data (such as genomic, gene expression and metabolic data), is characterised by large-scale processing but of much more heterogeneous kinds of information, which may pose a challenge for researchers seeking to integrate the different taxonomic structures that have emerged in specialist domains.

Similarly, in the humanities, a single researcher in a discipline such as philosophy may operate in a complex set of informal relationships with other scholars, and use a wide range of tools and techniques to organise and analyse their data. Researchers in a field such as corpus linguistics, by contrast, use complex datasets and a range of methods to assemble a corpus which they may then need to clean and reformat before they can begin to annotate and analyse the data, using a range of bespoke software or off-the-shelf packages.

Understanding the practices and the needs of researchers in different subjects and disciplines, or operating in different contexts, is therefore a complex process: what works in one setting may well not work in another.

Competition and collaboration

Researchers have long both competed and collaborated with each other. As funds are finite, individuals, teams, institutions and nations compete for resources, for doctoral students and for research posts in order to pursue their interests and sustain their work. They also compete for impact (in the form of citations, innovations and so on) and for prestige (in the form of prizes, ratings in assessment exercises such as the UK’s RAE, rankings in league tables and so on). And competition is increasing as a result of growing pressure on funding resources, the changing expectations of funding bodies, and globalisation, with new competitors in emerging nations.

Much attention has been paid to the development of ‘big science’ of the kind exemplified by the Large Hadron Collider at CERN, or the Human Genome Project, and the teams of hundreds or thousands of researchers associated with them. But most researchers do not work in big teams with big budgets. Rather, they work in groups that operate in a relatively unstructured way and on a relatively small scale; and they typically have a series of informal as well as formal relationships with other individuals and groups both within the institution in which they work and outside it. Senior researchers, in particular, often operate as part of a number of more or less overlapping collaborations and relationships.

Even in the humanities the ‘lone scholar’ has long been essentially a myth. Such scholars may work for the most part on their own, on projects they themselves design and undertake; but they engage in a wide range of informal discussions and dialogue with colleagues working in cognate areas. And technological developments are both facilitating and driving collaboration between researchers. Internet connectivity means that it is easier than in the past to share ideas, data, tools and workflows. The costs of collaboration are thus falling on the one hand, while on the other the increasing complexity and cost of the infrastructure needed to support many kinds of research create a growing imperative to work across traditional boundaries in order to exploit that infrastructure to the full.

But it is not just technology that is driving growth in collaboration at local, regional, national and international levels. Governments and other funders are also promoting collaboration through their emphasis on multi-disciplinary research that addresses large-scale issues and problems such as environmental change, sustainable energy, and health and well-being.19 There is also growing interest in seeking to develop a deeper understanding of the linkages between research and innovation, or healthcare outcomes; and of how to achieve successful interactions between the business and research sectors.

From a research perspective, collaboration with other universities, with industry and with public and voluntary sector bodies can help to drive success for individuals (in securing grants and contracts, and greater citation and other impacts for their work); for institutions (in helping to build critical mass, leverage of research opportunities and winning funding); and for nations (in supporting innovation and the development of knowledge-based economies). In Europe, the framework programmes for research of the European Commission have been a major driver for collaboration. The Seventh Framework Programme has provided 50 billion euros for research and development over the seven years from 2007 to 2013, the great majority of which has gone to projects requiring participation from several different countries.

One indicator of the extent of collaboration is co-authorship of the articles reporting the results of research. As is well known, the proportion of science and engineering articles that are co-authored has been growing, from 40 per cent of the global total in 1988 to 64 per cent in 2008; and the average number of authors per article has risen too, from 3.1 to 4.7. Of course, part of this simply reflects the growth of research teams, as distinct from collaboration across institutional or other boundaries. But more than half of articles published by authors from US academic institutions now include a co-author from another institution; and globally, the articles that list institutions from more than one country grew from 8 to 22 per cent between 1988 and 2008 (National Science Board 2010, Chapter 5). For major research nations in Europe – France, Germany, Netherlands, Switzerland, the UK – around half or more of science and engineering articles published include an author from another country. Such collaboration, particularly with long-term partners, tends to produce papers with higher citation impact.

Looking forward

This brief overview has highlighted some trends, some of them in tension with each other, in a complex landscape. Most of those trends, and the tensions between them, seem likely to persist for the foreseeable future. Thus research will continue to be driven by the intellectual curiosity of researchers, but also by the imperatives and policies of the major funders of research, primarily Governments. It remains an article of faith for Governments – even those experiencing fiscal difficulties – that continued public investment in research is essential for the success of their economies and for the well-being of their societies; that a high-quality research base makes a country attractive for inward investment by international business; and that publicly funded research plays an important role in raising the productivity of R&D in the business sector and has a positive impact on innovation in the economy as a whole.

But in straitened economic circumstances especially, Governments and other funders seek a return on their investment in research. There are some suggestions that the greatest productivity increases in the long term come through breakthroughs in knowledge and understanding that derive from basic research. But the evidence of relationships between investment in basic research and economic growth is not strong; and Governments tend to wish to see returns in relatively short timescales. Hence the increasing stress on collaboration and knowledge exchange with the business sector and other organisations, and efforts to develop and tighten the linkages between research and innovation. Hence also the development of targeted technology-transfer and knowledge-exchange programmes; and the increasing emphasis on monitoring and assessing the performance of the publicly funded research base. Such monitoring and assessment often covers not only primary results and their quality, but success in patenting, licensing, transfer agreements and co-operative R&D relationships (see, for example, National Science Board 2010, Chapter 4). There is no sign that these trends will weaken.

Nor is there any sign of weakening in the focus of Governments and other funders on addressing major global challenges such as public health or the reduction of poverty. Indeed, this has been part of a long-term trend to encourage researchers from different disciplines and organisations to come together to share ideas, skills and techniques to address complex problems; to provide access to new ideas and insights; to create critical mass in research skills and expertise; to share costs and risks, and to ensure efficient use of expensive facilities; to produce higher-quality results in shorter time frames; and to provide more pathways to achieving economic and social impact.

Multi-disciplinary and cross-disciplinary work co-exists, however, with strong, but changing, disciplinary cultures. Even as disciplines evolve, and new ones such as bioinformatics emerge, there is little sign that such cultures are breaking down, or that they will not remain a key feature of the landscape in the future. Indeed, for many researchers, their allegiance to their discipline is as strong as – or even stronger than – that to their institution. It is from their peers working in cognate areas, after all, that they seek the recognition and esteem they need to advance in their careers; and this becomes increasingly significant as competition for funds intensifies. The forces of globalisation could also help to reinforce disciplinary communities and cultures at the same time that they facilitate collaboration and cross-disciplinary working.

At a global level, as we have seen, China, India, Iran and other Asian countries, along with Brazil, are already playing an increasingly important role in the research landscape; and that role will increase further as the share of Western Europe and North America continues to decline proportionately. It is important to distinguish, however, between a proportionate and an absolute fall in contributions to the global research effort. What we are seeing as a result of the emergence of new research nations is a significant increase in that effort and in the resources – financial, human and infrastructural – devoted to it; there is no sign of significant reductions in contributions from Western countries in absolute terms. Collaboration and movement of researchers between the established and the newer research nations are growing, and will become an increasingly important feature of the landscape. Perhaps more challenging will be the fostering of co-operation with researchers in countries of the developing South, where resources of expertise and facilities are more thinly spread. At present, such collaborations as exist tend to be dominated by researchers from established research nations. Developing more equal relationships and helping to build research capabilities and capacities in the countries of the South is a significant challenge for the future.

References

CIBER. Social Media and Research Workflow. London: CIBER, University College London and Emerald Publishing, 2010.

Council of Graduate Schools. PhD Completion and Attrition: Analysis of Baseline Program Data for the PhD Completion Project. Washington, DC: Council of Graduate Schools, 2008.

Cutler and Co Pty Ltd. Venturous Australia, 2008. Available at http://www.innovation.gov.au/Innovation/Policy/Documents/NISReport.pdf

Department for Business Innovation and Skills. The Allocation of Science and Research Funding 2011/12 to 2014/15: Investing in World-Class Science and Research, 2010.

European Union. Innovation Union Competitiveness Report 2011: Executive Summary, 2011. Available at http://ec.europa.eu/research/innovation-union/pdf/competitiveness-report/2011/executive_summary.pdf

Green, J. and Langley, D. Professionalising Research Management, 2009. Available at researchsupport.leeds.ac.uk/images/uploads/docs/PRMReport.pdf

HM Treasury, Department of Trade and Industry, and Department for Education and Skills. Science and Innovation Investment Framework 2004–2014. London: Her Majesty’s Stationery Office, 2004.

Kuznetsov, A. and Razumova, I. Selling to the BRIC – Russia: scholarly e-products and the Russian market. Learned Publishing, 24(2), 2011.

Microsoft. The Fourth Paradigm: Data-Intensive Scientific Discovery. Redmond, WA: Microsoft Research, 2009.

Ministry of Science and Technology of the People’s Republic of China. China Science & Technology Statistics Data Book 2007, 2007.

National Science Board. Science and Engineering Indicators 2010 (NSB 10-01). Arlington, VA: National Science Foundation, 2010.

OECD. Frascati Manual: Proposed Standard Practice for Surveys on Research and Experimental Development. Paris: OECD, 2002.

OECD. Science and Innovation Policy: Key Challenges and Opportunities, 2004. Available at http://www.oecd.org/dataoecd/24/11/25473397.pdf

OECD. Interim Report on the OECD Innovation Strategy, 2009. Available at http://www.oecd.org/dataoecd/1/42/43381127.pdf

Research Councils UK. Delivering the UK’s e-infrastructure for research and innovation, 2010.

Research Information Network. Patterns of Information Use and Exchange: case studies of researchers in the life sciences, 2009a.

Research Information Network. E-journals: their use, value and impact, 2009b.

Research Information Network. Communicating Knowledge: how and why researchers publish and disseminate their findings, 2009c.

Research Information Network. If You Build It, Will They Come? How Researchers Perceive and Use Web 2.0, 2010.

Royal Society. The Scientific Century: Securing our Future Prosperity, 2010.

Taylor, I.J., Deelman, E., Gannon, D.B. and Shields, M. Workflows for e-Science: Scientific Workflows for Grids. London: Springer, 2007.

Tenopir, C., King, D.W., Edwards, S. and Wu, L. Electronic journals and changes in scholarly article seeking and reading patterns. Aslib Proceedings, 61(1), 2009.

Whitley, R., Gläser, J. and Engwall, L. (eds). Reconfiguring Knowledge Production: Changing Authority Relationships in the Sciences and their Consequences for Intellectual Innovation. Oxford: Oxford University Press, 2010.


1Figures in this section are taken from OECD Science and Technology Main Indicators unless otherwise referenced.

2See also Chapter 4 of Science and Engineering Indicators 2010, National Science Foundation, available at http://www.nsf.gov/statistics/seind10/start.htm

3Table 1.2, SET Statistics, Department for Business, Innovation and Skills, available at http://www.bis.gov.uk/policies/science/science-funding/set-stats. For an overview of the funding of research in the UK, see Making Sense of Research Funding in UK Higher Education, Research Information Network (RIN), 2010, available at http://www.rin.ac.uk/system/files/attachments/Making_sense_of_funding.pdf

4See also the Innovation Nation White Paper, published by the Department for Innovation, Universities and Skills in 2008.

5Japan Science and Technology Agency: www.jst.go.jp/EN

6http://www.hefce.ac.uk/research/

7http://www.most.gov.cn/eng/policies/regulations/200412/t20041228_18309.htm

8http://www.nih.gov/about/mission.htm

9Strategic Plan 2010–11 to 2012–13, Australian Research Council, 2010, p. 14.

10http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=501084

11See the list of grants in Chapter 5 of the Annual Report 2009 of the National Natural Science Foundation of China, available at http://www.nsfc.gov.cn/english/09ar/2009/pdf/005.pdf

12For an overview of the research landscape in Germany, see http://www.research-in-germany.de/dachportal/en/research-landscape/2866/research-landscape.html.

13It should be noted, however, that important countries such as India, China and Brazil supply no PhD data to the OECD.

14See, for example, the BBSRC Annual Report and Accounts, 2009–10, Biotechnology and Biological Sciences Research Council, 2010; and the ESRC Annual Report and Accounts, 2009–10, Economic and Social Research Council, 2010.

15Federal Report on Research and Innovation 2010, Bundesministerium für Bildung und Forschung (BMBF – German Federal Ministry of Education and Research), Berlin, 2010.

16That was the success rate for applicants to the UK Medical Research Council in 2009–10. See http://www.mrc.ac.uk/consumption/idcplg?IdcService=GET_FILE&dID=26844&dDocName=MRC006981&allowInterrupt=1

17See the Australian Research Council Annual Report 2009–10, available at http://www.arc.gov.au/pdf/annual_report_09-10.pdf

18For an example of a policy statement from a major UK research funder, see the Biotechnology and Biological Sciences Research Council’s BBSRC Data Sharing Policy Version 1.1. 2010.

19See, for example, the UK Research Councils’ ‘Cross-Council Research Themes’ at http://www.rcuk.ac.uk/research/xrcprogrammes/Pages/home.aspx
