Implementing Fantasy Science Funding


"Fantasy Science Funding" is an online game played by people with concrete ideas about science funding who are not currently in a position to put these ideas into practice. There are five rules to the game:

1) choose a funding body whose funds you are managing in your fantasy;

2) imagine how their funds could be distributed to the benefit of science;

3) choose areas of science to be "fired" (i.e. whose funding should be decreased with respect to the present state);

4) choose areas of science to be "hired" (where funding should be increased with respect to now);

5) blog about it.

Previous shindigs that I am aware of were hosted by Duncan Hull, Björn Brembs and Cameron Neylon.

Here, following rule 5, I will concentrate on rule 2, because in my fantasies the role of science funders could be fulfilled not only by dedicated funding bodies like the ones featured in rule 1 but also by other organizations, companies, groups of people or individuals. I will add that, given the limits to growth of our economies, research funded by these economies will likewise have to reach a steady state, perhaps around the often-mentioned target of 3% of Gross Domestic Product. I therefore refrain from discussing a concrete budget or changes thereof, and hope that the following fantasies scale well with the amount of money a funder is prepared to invest in research. I also add that even though I am writing about "science", I think the fantasies would be equally implementable in other areas of inquiry.

Rule 2 - imagine:
i) Baseline funding for established scientists not requiring grant submissions;

ii) Research proposals beyond baseline funding not being sent to funding agencies but deposited instead in a public repository (preferably in a machine-readable format);

iii) The review of the proposals not being performed by review committees but by the science community as a whole, resulting in a public rating of each proposal in terms of its scientific quality and related aspects (e.g. appropriateness of the budget);

iv) Science funders (see my definition above, possibly even including scientists who wish to pool part of their baseline grants) being able to browse the available proposals meeting their criteria (potentially even with the aid of automated or semi-automated proposal crawlers) and either to fund them directly or to signal to other funders that they would be willing to fund a proposal in part (such a practice would particularly benefit transdisciplinary projects, which often fall through the cracks in traditional research funding).
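To make fantasy (iv) a bit more concrete, here is a minimal sketch of how a funder might crawl a repository of machine-readable proposals and signal a partial pledge. Everything in it (the `Proposal` schema, its field names, the matching criteria, the funder and proposal names) is invented for illustration; a real repository would define its own data model and protocol.

```python
from dataclasses import dataclass, field

@dataclass
class Proposal:
    """A hypothetical machine-readable proposal record (schema invented
    for illustration; a real repository would define its own)."""
    title: str
    tags: set
    budget: float                                # requested total budget
    pledges: dict = field(default_factory=dict)  # funder name -> partial pledge

    def pledged(self) -> float:
        return sum(self.pledges.values())

    def fully_funded(self) -> bool:
        return self.pledged() >= self.budget

def matching_proposals(repository, interests, max_budget):
    """Crawl the repository for proposals a funder might consider:
    some tag overlap with the funder's interests, and a remaining
    funding gap the funder could plausibly close."""
    return [p for p in repository
            if p.tags & interests
            and p.budget - p.pledged() <= max_budget]

# Funder A has already signalled a partial pledge on a transdisciplinary
# proposal; Funder B browses for proposals whose remaining gap it can close.
repo = [Proposal("Soil microbiome atlas", {"ecology", "genomics"}, 100_000)]
repo[0].pledges["Funder A"] = 50_000
candidate = matching_proposals(repo, {"genomics"}, max_budget=60_000)[0]
candidate.pledges["Funder B"] = 50_000
print(candidate.fully_funded())  # True
```

The point of the sketch is that public, partial pledges turn funding into an open coordination problem: a proposal too broad for any one funder's remit can still be completed by several funders with overlapping interests.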

How could these proposals be put into practice?

i) This may actually be doable: a recent study on the cost effectiveness of the Natural Sciences and Engineering Research Council of Canada found that the costs of research grant peer review exceeded the costs of providing every eligible researcher with a yearly baseline grant of about CAD 30,000 (some discussions of this paper).

ii) Public repositories for publications already exist (indeed, they form the backbone of the "Open Access" movement and represent its green variant), and similar platforms for research projects are beginning to emerge: on Innocentive, funders pose challenges to a community of problem solvers and offer a prize that will be awarded to the best solution; Mechanical Turk allows one to hire a workforce for predefined tasks; and Fund Science enables the public to participate directly in the allocation of research funds. Moreover, there is a long tradition of science prizes (a current example: the Millennium Prize Problems of the Clay Mathematics Institute), and they tend to stimulate innovation in ways hardly foreseeable by research and innovation funders (for discussions, see here and here - subscriptions required).

iii) Timely expert attention is a rare good and hard to get without incentives, as journal editors and funding bodies can attest (for an example, see this putative solution and the discussion thereof). But a strong incentive would exist if a proposal's eligibility for funding were coupled to the condition that each co-investigator on it has peer reviewed another grant proposal in the repository within a certain time. The chances of gaming the system can be reduced if all the reviews are available to the public (a practice already in use at some journals, e.g. most of those published by or on behalf of the European Geosciences Union), irrespective of whether the reviewer reveals their identity immediately, after some time or not at all. The individual proposals (as well as their reviews) could then be rated by individual or institutional members of the scientific community, similar to how feature requests are rated at Mendeley, though it would be advisable to adjust the number of "votes" available to a rater by some measure of relative expertise - initially perhaps derived from the overlap between the tags of the proposals and those of the raters, later on directly from the aggregated ratings of the raters' contributions to the system. A full-fledged rating system of this kind would, incidentally, allow young scientists to enter the system much more smoothly (e.g. if they have already provided a number of comments or reviews that highly rated scientists have found useful) than the current system does (where most of them have similar h-indices and are often judged on the inadequate Journal Impact Factor of the few publications they may have).
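The tag-overlap idea for weighting votes can be sketched in a few lines. The choice of Jaccard overlap as the expertise measure, and the tag sets themselves, are illustrative assumptions, not a fixed design; a mature system would presumably move to weights derived from the aggregated ratings of a rater's own contributions.

```python
def expertise_weight(rater_tags, proposal_tags):
    """Jaccard overlap between a rater's tags and a proposal's tags,
    used as that rater's vote weight (0.0 to 1.0). The overlap measure
    is an illustrative choice, not a fixed design."""
    if not rater_tags or not proposal_tags:
        return 0.0
    return len(rater_tags & proposal_tags) / len(rater_tags | proposal_tags)

def weighted_rating(proposal_tags, votes):
    """votes: list of (score, rater_tags) pairs. Returns the
    expertise-weighted mean score, or None if no rater overlaps."""
    weights = [expertise_weight(tags, proposal_tags) for _, tags in votes]
    total = sum(weights)
    if total == 0:
        return None
    return sum(score * w for (score, _), w in zip(votes, weights)) / total

proposal_tags = {"neuroscience", "statistics"}
votes = [(5, {"neuroscience", "statistics"}),  # full overlap: weight 1.0
         (2, {"art history"})]                 # no overlap: weight 0.0
print(weighted_rating(proposal_tags, votes))   # 5.0
```

In this toy example the vote from outside the proposal's field contributes nothing, so the weighted rating equals the expert's score; with partially overlapping tag sets, every rater would contribute in proportion to their overlap.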

iv) Already now, science funders have a strong position with respect to scientists seeking to be funded. This position would be somewhat weakened by the adoption of fantasy (i) but would remain in effect for fantasies (ii) and (iii), allowing funders to impose, as they have always done, conditions for funding. Apart from the aforementioned obligation of applicants to peer review other proposals, funders could include in their eligibility criteria a mandate (as some have already done, along with some institutions - see this regularly updated list) for publications of the applicants to be made Open Access (green or gold) within a certain time frame relative to the proposal. They could also provide, for instance, a funding supplement for scientists who post all their data (with special provisions for cases like patient data) on public platforms (e.g. OpenWetWare), who commit to regularly commenting in public on research output by others (example discussion thread), whose research has a low ecological footprint, or who decide to abandon stand-alone review articles and put the effort into suitable online encyclopedias instead, which can be expanded and updated collaboratively as new research results come in. Let's consider this latter point in more detail: "suitable" in this sense means reasonably reliable (so not the current Wikipedias) and updated (so not the current Encyclopaedia Britannica). Currently, no such scholarly wiki is anywhere near being comprehensive, detailed, balanced and updated, but the cross-disciplinary platforms Scholarpedia and Citizendium or, even more so, some subject-specific platforms like the Encyclopedia of Earth or the Encyclopedia of Life appear to be on the right track if considered from a long-term perspective (as a side note, long-term projects in general constitute a major weakness of the current funding system, and I assume that the transparency my fantasies would bring to research funding may help to alleviate this problem).
Finally, reasonably comprehensive, detailed, balanced and updated encyclopedic articles on topics relevant to a research proposal could also provide the background for that proposal and thus reduce the writing effort that has to go into individual proposals. With such a comprehensive corpus of structured knowledge at hand, funders could then issue calls for proposals on the basis of rated versions of "Most Wanted" pages, which would identify apparent gaps in knowledge as potential focal points for further research.

In summary, I would "fire" (rule 3) peer review for baseline grants, and for projects beyond baseline, the opacity that currently prevails in research funding. I would "hire" (rule 4) stable baseline funding for established scientists, new opportunities for young scientists and - most of all - transparent, collaborative and sustainable approaches to funding and performing research.



Comment by Duncan Hull:

Hi Daniel, I'd agree with your summary. I just read the NSERC paper that you reference; it is an interesting read. Cheers. Duncan

Comment by daniel:

A tool for public funding of artists can be found at - perhaps some lessons to be learned there for science.

Comment by daniel:

A logo design contest site is at .

Comment by daniel:

Fantasy Science Funding, continued

Ongoing discussions of the topic can be followed via the following embed: