TRAMES. A Journal of the Humanities and Social Sciences
ISSN 1736-7514 (Electronic)
ISSN 1406-0922 (Print)
Impact Factor (2022): 0.2
THE IMPORTANCE OF BIBLIOMETRIC INDICATORS FOR THE ANALYSIS OF RESEARCH PERFORMANCE IN GEORGIA; pp. 345–356
doi: 10.3176/tr.2014.4.03

Authors
Teimuraz Matcharashvili, Zurab Tsveraidze, Aleksandre Sborshchikovi, Tamar Matcharashvili
Abstract

The present analysis of the research productivity of scholars in Georgia was motivated by the disadvantageous position of Georgia in international listings of the most cited scientific articles. We used the official databases of the governmental Shota Rustaveli National Scientific Foundation (SRNSF) from 2007 to 2013. In this research we restricted our analysis to the bibliometric indicators of the leaders of awarded projects. Three bibliometric characteristics of project leaders were obtained from the Scopus database: the number of publications, the number of citations, and the h-index. According to our results, only 58% of all leaders of projects awarded in the SRNSF grant competitions have at least one article in the Scopus database for the entire period of their scholarly activity. From our analysis it follows that the quality of reviewing of projects submitted to the SRNSF grant competitions does not promote the selection of the most productive project teams: there is no correlation between SRNSF reviewers' evaluation scores and the bibliometric data of project leaders in the Scopus database. As a result, in 2007–2012, in spite of funding that was large by Georgian standards, the problem of the low productivity and quality of scientific research in Georgia was not resolved. We conclude that, in order to improve the low productivity of research in Georgia, governmental programs of science support should be based on a new system for evaluating the quality of submitted projects; namely, the peer-review approach should be combined with bibliometric methodology. Besides their local interest for Georgian researchers and governmental authorities, the results of the presented research have general importance in light of ongoing international discussions about the necessity of including bibliometric data in procedures for evaluating research productivity. The presented results and discussions will be especially helpful for scholars and research administrators from countries in transition and could facilitate the elaboration of an effective research funding policy.
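
The abstract does not spell out how the indicators were computed. Purely as an illustration, the following Python sketch computes the h-index from a list of per-paper citation counts, in the sense of Hirsch (2005), and a rank correlation between reviewer scores and h-index values. All numbers below are hypothetical, and Spearman's rho is used only as one common choice of rank correlation, not necessarily the measure applied in the study.

from scipy.stats import spearmanr

def h_index(citations):
    """h-index: the largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one project leader's papers.
print(h_index([25, 8, 5, 4, 3, 1, 0]))   # 4: four papers with >= 4 citations each

# Hypothetical reviewer scores and h-indices for a set of project leaders.
# A rank correlation near zero would mean that review scores do not track
# the leaders' bibliometric records, as the abstract reports.
scores = [87, 91, 78, 84, 95, 70, 88]
h_values = [3, 0, 7, 1, 2, 5, 0]
rho, p_value = spearmanr(scores, h_values)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.2f}")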

References

Abramo, G. and C. A. D’Angelo (2011) “Evaluating research: from informed peer review to bibliometrics”. Scientometrics 87, 499–514.
http://dx.doi.org/10.1007/s11192-011-0352-7

Abramo, G., C. A. D’Angelo, and F. Pugini (2008) “The measurement of Italian Universities’ research productivity by a non parametric-bibliometric methodology”. Scientometrics 76, 2, 225–244.
http://dx.doi.org/10.1007/s11192-007-1942-2

Abramo, G., C. A. D’Angelo, and A. Caprasecca (2009) “Allocative efficiency in public research funding: can bibliometrics help?”. Research Policy 38, 1, 206–215.
http://dx.doi.org/10.1016/j.respol.2008.11.001

Aksnes, D. W. and R. E. Taxt (2004) “Peer reviews and bibliometric indicators: a comparative study at a Norwegian university”. Research Evaluation 13, 1, 33–41.
http://dx.doi.org/10.3152/147154404781776563

Allik, J. (2003) “The quality of science in Estonia, Latvia, and Lithuania after the first decade of independence”. Trames 7, 1, 40–52.

Allik, J. (2008) “Quality of Estonian science estimated through bibliometric indicators (1997–2007)”. Proceedings of the Estonian Academy of Sciences 57, 4, 255–264.
http://dx.doi.org/10.3176/proc.2008.4.08

Allik, J. (2013) “Factors affecting bibliometric indicators of scientific quality”. Trames 17, 3, 199–214.
http://dx.doi.org/10.3176/tr.2013.3.01

Chirici, G. (2012) “Assessing the scientific productivity of Italian forest researchers using the Web of Science, SCOPUS and SCIMAGO databases”. iForest 5, 101–107.
http://dx.doi.org/10.3832/ifor0613-005

Glänzel, W. (2006) “On the h-index – a mathematical approach to a new measure of publication activity and citation impact”. Scientometrics 67, 315.
http://dx.doi.org/10.1007/s11192-006-0102-4

Hammouti, B. (2010) “Comparative bibliometric study of the scientific production in Maghreb countries (Algeria, Morocco and Tunisia) in 1996–2009 using Scopus”. Journal of Materials and Environmental Science 1, 2, 70–77.

Hirsch, J.E. (2005) “An index to quantify an individual’s scientific research output”. Proceedings of the National Academy of Sciences of the United States of America 102, 46, 16569–16572.
http://dx.doi.org/10.1073/pnas.0507655102

Lotka, A.J. (1926) “The frequency distribution of scientific productivity”. Journal of the Washington Academy of Sciences 16, 12, 317–323.

May, R.M. (1997) “The scientific wealth of nations”. Science 275, 793–796.
http://dx.doi.org/10.1126/science.275.5301.793

Meho, L. and K. Yang (2007) “Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus SCOPUS and Google Scholar”. Journal of the American Society for Information Science and Technology 58, 13, 2105–2125.
http://dx.doi.org/10.1002/asi.20677

Mikki, S. (2010) “Comparing Google Scholar and ISI Web of Science for earth sciences”. Scientometrics 82, 2, 321–331.
http://dx.doi.org/10.1007/s11192-009-0038-6

Moed, H. F. (2005) Citation analysis in research evaluation. Dordrecht: Springer.

Suleymenov, E., N. Ponomareva, A. Dzhumabekov, T. Kubieva, and G. Kozbagarova (2011) “An assessment of the contributions of Kazakhstan and other CIS countries to global science: the Scopus database”. Scientific and Technical Information Processing 38, 3, 159–165.
http://dx.doi.org/10.3103/S0147688211030051

Valencia, M. (2004) “International scientific productivity of selected universities in the Philippines”. Science Diliman 16, 49–54.

Wolszczak-Derlacz, J. and A. Parteka (2010) Scientific productivity of public higher education institutions in Poland: a comparative bibliometric analysis. Warsaw: Sprawne Panstwo.

Yang, J., M.W. Vannier, F. Wang, Y. Deng, F. Ou, J. Bennett, Y. Liu, and G. Wang (2013) “A bibliometric analysis of academic publication and NIH funding”. Journal of Informetrics 7, 318–324.
http://dx.doi.org/10.1016/j.joi.2012.11.006
