Counting, measuring, comparing quantities, analyzing measurements: quantitative analysis is perhaps the main tool of science. Scientific research itself, and the recording and communication of research results through publications, have become enormous and complex enterprises. The need to measure research performance is largely driven by the necessity of making funding decisions. Traditionally, research has been judged by other scholars in the same research field, that is, by expert review, more widely known as peer review. Measuring the strength of peer review through citation counts allows funders who are not subject experts to make informed decisions. Individual researchers can use bibliometrics to promote their research. Finding bibliometric values is not difficult, but using them requires careful consideration of disciplinary norms and of which data source to use.
Bibliometrics (sometimes called Scientometrics) turns the main tool of science, quantitative analysis, on itself. There are various definitions used for "bibliometrics." Essentially, bibliometrics is the application of quantitative analysis and statistics to publications such as journal articles and their accompanying citation counts. Quantitative evaluation of publication and citation data is now used in almost all nations around the globe with a sizeable science enterprise. Bibliometrics is used in research performance evaluation, especially in university and government labs, and also by policymakers, research directors and administrators, information specialists and librarians, and researchers themselves.
A Two-Pronged Approach
Together, peer review and quantitative analysis better inform research evaluation. Quantitative analysis offers certain advantages in gathering the objective information necessary for decision-making:
* Quantitative analysis of research is global in perspective, offering a "top-down" review that puts the work in context, complementing the local perspective of peer review. Quantitative research analysis provides data on all activity in an area, summaries of these data, and a comprehensive perspective on activity and achievements.
* Weighted quantitative measures, such as papers per researcher or citations per paper, remove characteristics, such as the place of production or past reputation, that color human perceptions of quality.
COMMONLY USED METRICS
JOURNAL METRICS
* Journal metrics are used to compare journals within a chosen academic discipline.
* They give rise to a ranking of journals, with the most highly cited journal at the top.
* Rankings vary according to the formula used, and also on the source data used.
* The Journal Impact Factor (JIF) from Thomson Reuters is the most common metric in this category.
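The standard two-year JIF divides the citations a journal receives in a given year (to items from the previous two years) by the number of citable items it published in those two years. A minimal sketch of the calculation, with invented illustrative numbers:

```python
def journal_impact_factor(citations_in_year, citable_items_prev_two_years):
    """Two-year Journal Impact Factor for year Y: citations received in Y
    to items published in Y-1 and Y-2, divided by the number of citable
    items published in Y-1 and Y-2."""
    return citations_in_year / citable_items_prev_two_years

# Hypothetical journal: its 2014-2015 articles received 600 citations
# during 2016, and it published 200 citable items across 2014-2015.
print(journal_impact_factor(600, 200))  # 3.0
```

Note that the numerator and denominator come from the citation database in use, which is one reason the same journal can receive different scores from different data sources.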
LIMITATIONS OF JOURNAL METRICS
* Journals in different disciplines cannot be compared because citation patterns differ between disciplines; some disciplines, for example, use fewer citations than others.
* The time span used is arbitrary (2, 3, or 5 years); different disciplines may need different timescales.
* Review journals (that is, journals consisting of review articles) attract high numbers of citations, which inflates their scores.
AUTHOR METRICS
* Author metrics are used to compare researchers, but comparisons can only be made within a given academic discipline due to differing citation patterns.
* They are based on the number of articles a researcher has published, and the number of times these articles have been cited by others.
* The average number of citations per author is a fairly crude measure of a person's research impact, so various measures try to overcome this limitation.
* The most commonly used author metric is the h-index.
* The h-index is dependent on the data used to calculate it.
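The h-index is defined as the largest number h such that the researcher has h papers each cited at least h times. A minimal sketch of the computation (the citation counts in the example are invented for illustration):

```python
def h_index(citations):
    """Return the largest h such that at least h of the given papers
    have h or more citations each."""
    counts = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # paper at this rank still has >= rank citations
        else:
            break
    return h

# A hypothetical author with five papers cited 10, 8, 5, 4, and 3 times:
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Because the calculation depends entirely on the citation counts supplied, the same author typically gets different h-index values from Web of Science, Scopus, and Google Scholar.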
LIMITATIONS OF THE H-INDEX
* Assumes that the most highly cited articles are the most important.
* Favours authors in the middle or at the end of their careers.
* Ignores the impact of a small number of very important articles.
* Suffers from incomplete coverage by citation indexes, e.g. in the document types, disciplines, and foreign-language materials covered.
SOURCES OF DATA
* Thomson Reuters provides Journal Citation Reports and the Web of Science, which was the original source for bibliometrics.
* Scopus is provided by Elsevier. The free service SCImago uses Scopus data to generate journal metrics.
* Google Scholar data is used by the website Harzing's Publish or Perish (POP).
REVIEW OF LITERATURE
Malathy, S (2015) studied the Journal of Spacecraft and Technology, an in-house publication of the ISRO Satellite Centre (ISAC) that publishes the research activity of the centre. The paper presents a bibliometric study of the journal from 1991 to 2012, covering 22 volumes with 330 papers and 2,597 citations. The analysis considered parameters such as the year-wise distribution of articles over the study period (1991-2012), length of articles, authorship pattern of contributions, author productivity, degree of collaboration among co-authors, and gender-wise distribution of papers. It also presents institution-wise contribution, group-wise (ISAC only) contribution, a ranked list of prolific/productive authors, the number of citations appearing in papers, and the form-wise distribution of citations. The study provides insight into the journal's development towards excellence.
Maharana, Rabindra K (2015) analyzed Indian researchers' publications on tuberculosis (TB) indexed in the Web of Science (WoS) database from 2004 to 2013. The study also examines publication performance, covering annual outputs, mainstream journals, leading Indian research institutions, the h-index, etc. It is a bibliometric analysis of all Indian TB publications over the past 10 years in national and international journals of repute. Using the WoS database, 5,073 documents of Indian researchers' publication data on TB research were used for the study period 2004 to 2013, and various statistical techniques and bibliometric measures were applied in the analysis. The study exclusively examines the 5,073 research outputs of Indian researchers on TB indexed in Thomson Reuters WoS during 2004-2013; documents published through other channels and sources not indexed in WoS are excluded from the purview of the research.
Taşkin, Zehra (2015) aimed to undertake a bibliometric investigation of the NASA Astrobiology Institute (NAI) funded research published between 2008 and 2012. Using the NAI annual reports, 1,210 peer-reviewed publications were analyzed. The following conclusions are drawn: (1) NAI researchers prefer publishing in high-impact multidisciplinary journals. (2) Astronomy and astrophysics are the most preferred Web of Science subject categories for publication. (3) NAI is indeed a virtual institution; researchers collaborate with other researchers outside their organization and, in some cases, outside the U.S. (4) There are prominent scholars in the NAI co-author network, but none of them dominates astrobiology.
Maddisetty, Balaji (2014) studied open access journals in physical education, which provide free access, from the time of publication, to scientific and scholarly content in peer-reviewed journals that meet high quality standards, in keeping with the Budapest Open Access Initiative's right to read, download, copy, distribute, print, search, or link to the full text of articles. In this paper the author made an effort to study 57 full free e-journals in physical education, analyzing them by language, country, subject headings, keywords, and year, along with the accessibility of the online journals' archives.
Patra, Swapan Kumar (2014) stated that Indian library and...