Alexandra Lešková
University of Economics in Bratislava, Bratislava, Slovakia
DOI: https://doi.org/10.31410/eraz.2018.372


4th International Conference – ERAZ 2018 – KNOWLEDGE BASED SUSTAINABLE ECONOMIC DEVELOPMENT, Sofia, Bulgaria, June 7, 2018, CONFERENCE PROCEEDINGS published by: Association of Economists and Managers of the Balkans, Belgrade, Serbia; Faculty of Business Studies, Mediterranean University – Podgorica, Montenegro; University of National and World Economy – Sofia, Bulgaria; Faculty of Commercial and Business Studies – Celje, Slovenia; Faculty of Applied Management, Economics and Finance – Belgrade, Serbia, ISBN 978-86-80194-12-7

Abstract

Public funding mechanisms for excellence are widely used, mainly because they aim to raise the performance of higher education institutions to an excellent level: funding is reallocated on the basis of the competitiveness of institutions or researchers. The approaches currently used for research evaluation are either peer review or bibliometric techniques. Peer review relies on the deep expertise of committees and experts. Its application is nevertheless questioned to some extent, especially on the grounds of ineffectiveness and inefficiency. In Slovakia, a peer review process is applied to the selection of projects by the Scientific Grant Agency. This paper examines whether there is a relationship between the peer review scores of project proposals and research productivity. The case study covers the Scientific Grant Agency and its grant selection in 2009, the first year for which the results of the peer review process were made publicly available. Our results show that in most fields peer review failed to predict the success of projects. Moreover, we observed a potential gender bias in the peer review and grant selection mechanism.
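As a purely illustrative sketch of the kind of analysis described above, the snippet below computes a Spearman rank correlation between peer review scores and a post-hoc research productivity measure, and runs a simple comparison of scores by the principal investigator's gender. The data, column names, and tests are assumptions made for illustration only; they are not the SGA 2009 dataset and do not represent the paper's exact methodology.

```python
# Illustrative sketch only: hypothetical data, not the paper's SGA 2009 dataset.
import pandas as pd
from scipy.stats import spearmanr, mannwhitneyu

# Hypothetical project-level records (assumed columns).
projects = pd.DataFrame({
    "review_score": [92, 85, 78, 74, 70, 66, 61, 55],  # peer review score of the proposal
    "productivity": [14, 9, 11, 6, 8, 3, 5, 2],        # e.g. publications produced by the project
    "pi_gender":    ["F", "M", "M", "F", "M", "F", "M", "F"],
})

# Does a higher peer review score go together with higher research productivity?
rho, p_value = spearmanr(projects["review_score"], projects["productivity"])
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")

# Simple check for a gender gap in review scores (a potential bias signal).
scores_f = projects.loc[projects["pi_gender"] == "F", "review_score"]
scores_m = projects.loc[projects["pi_gender"] == "M", "review_score"]
stat, p_gender = mannwhitneyu(scores_f, scores_m, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p_gender:.3f}")
```

A weak or insignificant correlation in such a setup would be consistent with the finding that peer review scores do not predict project success, while a systematic score gap by gender would flag the kind of potential bias the abstract mentions.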

Key words

peer review, funding for excellence, university research

