Estonian Academy Publishers
Published since 1997
TRAMES. A Journal of the Humanities and Social Sciences
ISSN 1736-7514 (Electronic)
ISSN 1406-0922 (Print)
Impact Factor (2022): 0.2
FACTORS AFFECTING BIBLIOMETRIC INDICATORS OF SCIENTIFIC QUALITY; pp. 199–214
doi: 10.3176/tr.2013.3.01

Author
Jüri Allik
Abstract

The High Quality Science Index (HQSI) was constructed on the basis of the Essential Science Indicators (Thomson Reuters) release covering the period from January 1, 2002 to August 31, 2012. The HQSI was computed for each country or territory as the sum of the normalised scores of the mean impact (citations per paper) and the percentage of papers that reach the top-1% citation ranking. As expected, countries or territories that produce a larger Gross National Income per capita and allocate a higher percentage of that economic wealth to research and development (R&D) were more likely to achieve prominence in scientific publications. The size of a country and of its population were not important factors in achieving excellence in scientific research. Since economic and socio-demographic factors only partly predicted the quality of science in a given country or territory, there is considerable room for historical and science-policy factors to affect the quality of science in a given country. Several countries that were in almost identical starting positions twenty years ago have developed along completely different trajectories, depending on the policies and decisions made by their policy makers. Possibilities for improving the reliability of measures of scientific quality are discussed.
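The HQSI computation described above — summing normalised scores of two country-level indicators — can be sketched as follows. This is a minimal illustration, not the author's actual procedure: the sample data are invented, and the use of z-score normalisation is an assumption, since the abstract does not specify which normalisation was applied.

```python
# Sketch of the HQSI idea: for each country, sum the normalised scores of
# (a) mean citations per paper and (b) the share of papers in the top-1%
# citation ranking. Z-score normalisation and the data are assumptions.
from statistics import mean, stdev

# Hypothetical (country, citations_per_paper, pct_top1_papers) records.
data = [
    ("A", 12.0, 1.8),
    ("B", 9.5, 1.1),
    ("C", 15.2, 2.4),
    ("D", 7.3, 0.6),
]

def zscores(values):
    """Normalise a list of values to zero mean and unit variance."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

impact_z = zscores([row[1] for row in data])  # mean impact, normalised
top1_z = zscores([row[2] for row in data])    # top-1% share, normalised

# HQSI per country: sum of the two normalised indicator scores.
hqsi = {row[0]: iz + tz for row, iz, tz in zip(data, impact_z, top1_z)}

for country, score in sorted(hqsi.items(), key=lambda kv: -kv[1]):
    print(f"{country}: {score:+.2f}")
```

Because both components are normalised to the same scale before summing, neither the raw citation counts nor the percentage units dominate the composite score.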

