An analysis of publication productivity for 1225 academic neurosurgeons and 99 departments in the United States

Clinical article


Object

Bibliometrics is defined as the study of statistical and mathematical methods used to quantitatively analyze scientific literature. The application of bibliometrics in neurosurgery is in its infancy. The authors calculate a number of publication productivity measures for almost all academic neurosurgeons and departments within the US.

Methods

The h-index, g-index, m-quotient, and contemporary h-index (hc-index) were calculated for 1225 academic neurosurgeons in 99 (of 101) programs listed by the Accreditation Council for Graduate Medical Education in January 2013. Three currently available citation databases were used: Google Scholar, Scopus, and Web of Science. Bibliometric profiles were created for each surgeon. Comparisons based on academic rank (that is, chairperson, professor, associate, assistant, and instructor), sex, and subspecialties were performed. Departments were ranked based on the summation of individual faculty h-indices. Calculations were carried out from January to February 2013.

Results

The median h-index, g-index, hc-index, and m-quotient were 11, 20, 8, and 0.62, respectively. All indices demonstrated a positive relationship with increasing academic rank (p < 0.001). The median h-index was 11 for males (n = 1144) and 8 for females (n = 81). The h-index, g-index and hc-index significantly varied by sex (p < 0.001). However, when corrected for academic rank, this difference was no longer significant. There was no difference in the m-quotient by sex. Neurosurgeons with subspecialties in functional/epilepsy, peripheral nerve, radiosurgery, neuro-oncology/skull base, and vascular have the highest median h-indices; general, pediatric, and spine neurosurgeons have the lowest median h-indices. By summing the manually calculated Scopus h-indices of all individuals within a department, the top 5 programs for publication productivity are University of California, San Francisco; Barrow Neurological Institute; Johns Hopkins University; University of Pittsburgh; and University of California, Los Angeles.

Conclusions

This study represents the most detailed publication analysis of academic neurosurgeons and their programs to date. The results for the metrics presented should be viewed as benchmarks for comparison purposes. It is our hope that organized neurosurgery will adopt and continue to refine bibliometric profiling of individuals and departments.


Academic advancement in medicine depends on multiple factors, such as clinical volume and outcomes, teaching, board certification, number of years in practice, membership in organizations, participation in administrative duties, acquisition of funding, and conference presentations. Research, and publishing as its end result, has been shown to be one of the most important determinants of promotion.3,8,10,48 The simplest method of quantifying one's research activity is to count publications; however, not all publications should be considered equal. A book chapter, case report, or review article should not be given the same weight as original research. Among original research, how does one differentiate papers that have a substantial impact on a person's specialty from those that do not? In essence, can we establish research quality and impact, and if so, how?

Bibliometrics is defined as the application of statistical and mathematical methods to quantitatively analyze scholarly documents in an effort to establish indicators of research performance.11,18,35,55 Bibliometrics is heavily based on citation analysis, which is the study of references cited in the bibliographies of scholarly publications. Many metrics have been introduced to evaluate academic productivity, but none is more famous or controversial than the h-index.6,16 The h-index was first described in 2005 by physicist Jorge E. Hirsch from the University of California, San Diego.28 The h-index is defined as an individual having h papers with at least h citations. For example, if an author has 20 papers cited at least 20 times, his or her h-index would be 20. This simple yet intuitively descriptive index has generated much discussion in various fields in medicine, including anesthesiology,36 hepatology,39 otolaryngology,47 radiation oncology,40 radiology,41 surgery,50 urology,9 and neurosurgery.2,33,38,45 As of July 2013, Hirsch's landmark paper had been cited almost 3200 times (per Google Scholar).
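
The h-index can be computed directly from an author's list of citation counts. The following is a minimal sketch of that calculation; the citation counts are hypothetical, chosen only to illustrate the definition:

```python
def h_index(citations):
    # Largest h such that the author has h papers each cited at least h times.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(ranked, start=1):  # rank papers by citation count
        if c >= rank:
            h = rank
        else:
            break
    return h

# Five papers have at least 5 citations each, but there are not six with at least 6.
print(h_index([48, 30, 21, 20, 9, 4, 4, 1]))  # -> 5
```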

Although the h-index is the most well-known citation metric, it has a number of weaknesses.5,25,43,52 For example, the h-index favors senior researchers, since h-index values can never decrease; the index also relies on time for researchers to generate a sufficient number of papers and then more time for those papers to be cited. As a result, other metrics have been proposed to counteract the shortcomings of the h-index.1 The contemporary h-index (hc-index)44 and m-quotient28 are examples of metrics that attempt to establish parity when young researchers are compared with more seasoned ones. The hc-index, described by Antonis Sidiropoulos et al. in 2006, considers the age of publications and assigns greater weight to the citations of recent publications.44 The m-quotient is defined as the h-index divided by the number of years since the author's first publication. Once a publication is counted toward the h-index, it has no further impact on the h-index, despite garnering subsequent citations; thus, many argue that the h-index values quantity more than quality. The g-index19,20 and e-index54 are metrics that give credit for the highly cited works that the h-index fails to capture. Another alternative is Google Scholar's i10-index, the number of articles with 10 or more citations.

Another drawback of the h-index is that it is dependent on the size of the field. Researchers in small fields such as neurosurgery, with a consequently smaller readership, will in general have lower indices than researchers in fields with larger audiences, such as general medicine.33 Thus, the h-index should not be used to compare researchers in different areas of expertise. The h-index is also susceptible to self-citation, counts review articles, which are often highly cited, equally with original research, and gives equal value to all authors listed on a publication.

Citation analysis and metrics are founded on citation databases, of which there are currently 3. Prior to 2004, Thomson Reuters' Web of Science was the only database available. In 2004, both Elsevier's Scopus and Google Scholar were released. The h-index was included as an indicator in Web of Science, Scopus, and Google Scholar within 2 years of its publication. The h-index, other metrics, and the citation analysis that creates them have been shown to vary depending on which database is used and, depending on how the search is performed, even within each database.4,23,32 Each database has its advantages and disadvantages. For example, Google Scholar is free, updated several times a week, and has broader coverage than both Scopus and Web of Science. However, Google Scholar does not provide a list of its sources and includes citations in non–peer-reviewed publications, such as conference proceedings, white papers, and books.27

This article evaluates almost all of academic neurosurgery (99 of 101 departments and 1225 academicians) to benchmark the h-index, m-quotient, g-index, and contemporary h-index across academic ranks, departments, and subspecialties using all 3 currently available databases (Scopus, Google Scholar, and Web of Science).

Methods

Selection of Programs

A listing of the 2012 neurosurgery residency–training programs was obtained from the Accreditation Council for Graduate Medical Education (https://www.acgme.org/ads/Public/Reports/ReportRun?ReportId=1&CurrentYear=2012&SpecialtyId=35). Departmental websites were consulted for names, academic ranks, and subspecialties. Nonneurosurgical faculty (for example, neurologists, non–M.D. Ph.D.s, and radiologists) were excluded from this study. If all relevant information could not be obtained from a department's website, we contacted the department via email or telephone. Two programs (Cleveland Clinic and Walter Reed) were excluded because sufficient information could not be obtained from their departmental websites or through subsequent contact. Calculations were carried out from January to February 2013.

Definition of Citation Metrics

h-Index

The h-index is defined as an individual having h papers with at least h citations. In other words, it corresponds to the point at which the citation count crosses the rank of the publications when they are listed in decreasing order of citations.
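
Formally, with an author's papers ranked so that $c_{(1)} \ge c_{(2)} \ge \dots \ge c_{(n)}$ are the citation counts in decreasing order, this definition can be written as

$$ h = \max \{ i : c_{(i)} \ge i \}. $$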

m-Quotient

The m-quotient is the h-index divided by the number of years since the author's first publication.
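
Equivalently, writing $y$ for the number of years since the author's first publication,

$$ m = \frac{h}{y}, $$

so that a hypothetical author with $h = 15$ whose first paper appeared 12 years ago would have $m = 15/12 = 1.25$.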

hc-Index

The contemporary h-index is derived by weighting each article's citations: the citation count is multiplied by 4 and divided by the number of years since publication. Thus, the citation count of an article published this year (2013) would be multiplied by 4, that of a paper from 4 years ago would be multiplied by 1 (that is, 4/4), and that of a paper from 6 years ago would be multiplied by 4/6.
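
Under one consistent reading of this description, with $c_i$ the citation count of paper $i$ and $a_i$ its age in years (a paper published in the current year taking $a_i = 1$), the weighted count is

$$ c_i^{c} = \frac{4\,c_i}{a_i}, $$

and the hc-index is then the h-index computed over these weighted counts.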

g-Index

With articles ranked in decreasing order of the number of citations that they received, the g-index is the largest number such that the top g articles together received at least g² citations.
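
To make the contrast between these metrics concrete, the following is a minimal sketch (with hypothetical citation counts) showing how the g-index can exceed the h-index when one paper is very highly cited, and how the m-quotient rescales the h-index by career length:

```python
def h_index(citations):
    ranked = sorted(citations, reverse=True)
    return max((rank for rank, c in enumerate(ranked, 1) if c >= rank), default=0)

def g_index(citations):
    # Largest g such that the top g papers together have at least g^2 citations.
    # (This simple version caps g at the number of papers.)
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(ranked, 1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

def m_quotient(citations, years_since_first_publication):
    return h_index(citations) / years_since_first_publication

cites = [100, 22, 9, 7, 6, 1]  # hypothetical citation counts
print(h_index(cites))          # -> 5
print(g_index(cites))          # -> 6 (the 100-citation paper lifts g above h)
print(m_quotient(cites, 10))   # -> 0.5 for a 10-year publishing career
```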

Calculation of Citation Metrics by Citation Database

Scopus

The automated h-index from Scopus (http://www.scopus.com) was obtained using the “Author Search” function. Because Scopus does not count citations prior to 1996, a “manually calculated” h-index was calculated for each individual by looking at each of the author's manuscripts (accounting for citations prior to 1996 in Scopus). The m-quotient was calculated by dividing Scopus' manually calculated h-index by the years since the first publication.

Google Scholar

Harzing's Publish or Perish (http://www.harzing.com/pop.htm) application was used to access Google Scholar for h-index, g-index, and contemporary h-index (hc). Publish or Perish uses the Advanced Scholar Search capabilities of Google Scholar.26

Web of Science

The Web of Science (http://wokinfo.com/products_tools/multidisciplinary/webofscience/) was used to determine an individual's h-index using the “Author” search function for each individual author.

An author's first and last initials were used within search strings. Careful examination of the results from each search was performed to determine if the author had a preferred way of listing his or her initials. Further analysis was performed on each search result to determine if, indeed, it represented the individual being searched. This method included looking at the titles of articles, titles of journals, affiliations of authors, and in some instances reading the articles published.

Citation metrics were then summarized for groups of individuals based on academic rank (instructor, assistant, associate, professor, and chairperson), sex, and subspecialty (spine, pediatrics, neuro-oncology/skull base, vascular, general, functional/epilepsy, peripheral nerve, and radiosurgery). A mean departmental h-index was calculated using Scopus, Google Scholar, and Web of Science. We then ranked departments by summing the manually calculated Scopus h-indices of all individuals within a department.

Statistical Analysis

The following a priori statistical comparisons were performed (mean values used): 1) h-index (Scopus, Google Scholar, Web of Science), m-quotient (Scopus), hc-index (Google Scholar), and g-index (Google Scholar) versus academic rank;

2) h-index (Scopus, Google Scholar, Web of Science), m-quotient (Scopus), hc-index (Google Scholar), and g-index (Google Scholar) versus sex (raw analysis and then stratified for academic rank); and

3) h-index (Scopus, Google Scholar, Web of Science), m-quotient (Scopus), hc-index (Google Scholar), and g-index (Google Scholar) versus neurosurgical subspecialty.

All statistics were calculated using SPSS (version 21, IBM SPSS). Values of p < 0.05 were considered significant. Mean values are presented ± SD. Natural logarithms were used to transform nonparametric data.
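
The analyses were run in SPSS; purely as an illustration, the same style of nonparametric comparison can be sketched in Python with SciPy (the h-index samples below are hypothetical, not the study data):

```python
from scipy.stats import kruskal, mannwhitneyu

# Hypothetical h-index samples for three academic ranks.
h_assistant = [3, 5, 6, 8, 9, 12]
h_associate = [8, 10, 11, 13, 15, 20]
h_professor = [15, 20, 22, 24, 30, 41]

stat, p = kruskal(h_assistant, h_associate, h_professor)
print(f"Kruskal-Wallis across ranks: H = {stat:.2f}, p = {p:.4f}")

u, p = mannwhitneyu(h_assistant, h_professor, alternative="two-sided")
print(f"2-tailed Mann-Whitney U: U = {u}, p = {p:.4f}")
```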

Results

Citation Metrics Based on Database Used

Data were obtained from 99 departmental websites for a total of 1225 academic neurosurgeons. The overall mean and median of the calculated citation metrics across all 3 databases are shown in Table 1. The distribution of all metrics was positively skewed so that the means are larger than the medians.

TABLE 1:

Mean and median of calculated citation metrics across the 3 databases used*

Database | h-Index | m-Quotient | hc-Index | g-Index
Scopus, manual† | 14.6 [11] | 0.71 [0.62] | — | —
Scopus, automated | 12.1 [9] | — | — | —
GS (n = 1210) | 14.1 [10] | — | 9.3 [8] | 26.5 [20]
WOS (n = 1216) | 13 [9] | — | — | —

* Mean values are listed outside the brackets. Median values are listed in brackets. GS = Google Scholar; WOS = Web of Science.

† The h-index was calculated for 1180 neurosurgeons and the m-quotient was calculated for 1167 neurosurgeons.

h-Index

Using Scopus, the h-index was calculated for 1180 individuals. The mean manually calculated h-index was 14.6 ± 12.7 with a range of 0–76 and a median of 11. The mean automated Scopus h-index was 12.1 ± 10.4 with a range of 0–68 and a median of 9. For Google Scholar, 1210 neurosurgeons were analyzed. The mean h-index was 14.1 ± 12.9 with a range of 0–83 and a median of 10. Finally, using Web of Science, the mean value for 1216 neurosurgeons was 13 ± 13 with a range of 0–77 and a median of 9.

m-Quotient

The m-quotient was calculated for 1167 individuals using Scopus. The mean was 0.71 ± 0.47 (range of 0–3.67) and the median was 0.62.

hc-Index

The mean hc index was 9.3 ± 7.1 with a range of 0–42 and a median of 8 (Google Scholar, n = 1210).

g-Index

The mean g-index was 26.5 ± 23.8 with a range of 0–131 and a median of 20 (Google Scholar, n = 1210).

Citation Metrics by Groups of Individuals

Table 2 shows the mean and median h-indices for all 3 databases, m-quotient, hc-index, and g-index by academic rank, sex, and subspecialty.

TABLE 2:

Mean and median citation metrics by groups of individuals*

Grouping | h-Index (Scopus†/GS/WOS)‡ | m-Quotient (Scopus)† | hc-Index (GS) | g-Index (GS)
academic rank |  |  |  |
 chair | 31/29/27 [29/22/26] | 1.02 [1] | 17 [16] | 55 [50]
 professor | 24/24/23 [22/17/23] | 0.88 [0.84] | 14 [13] | 45 [42]
 associate professor | 12/12/11 [11/10/11] | 0.69 [0.60] | 9 [8] | 23 [21]
 assistant professor | 8/7/7 [6/6/6] | 0.57 [0.50] | 6 [5] | 14 [11]
 instructor | 7/5/7 [4/4/3] | 0.51 [0.45] | 5 [2] | 12 [4]
 p value | <0.001/<0.001/<0.001 | <0.001 | <0.001 | <0.001
sex |  |  |  |
 male | 15/15/14 [11/11/10] | 0.72 [0.62] | 10 [8] | 28 [21]
 female | 10/9/8 [8/8/6] | 0.64 [0.57] | 7 [6] | 17 [13]
 p value | 0.111/0.291/0.423 | 0.211 | 0.167 | 0.380
subspecialty |  |  |  |
 vascular | 17/16/15 [12/12/11] | 0.87 [0.76] | 11 [9] | 31 [23]
 functional/epilepsy | 16/15/15 [12/12/11] | 0.75 [0.67] | 10 [9] | 29 [23]
 radiosurgery | 18/19/16 [13/13/12] | 0.82 [0.61] | 13 [9] | 35 [24]
 neuro-onc/skull base | 17/18/16 [14/14/12] | 0.82 [0.77] | 11 [10] | 34 [28]
 pediatrics | 13/13/12 [10/10/9] | 0.64 [0.60] | 8 [7] | 23 [18]
 peripheral nerve | 18/17/16 [16/16/12] | 0.70 [0.65] | 11 [10] | 32 [34]
 spine | 12/12/10 [8/9/7] | 0.64 [0.50] | 8 [6] | 22 [15]
 general | 14/14/12 [9/9/9] | 0.58 [0.50] | 9 [6] | 25 [16]
 p value | <0.001/<0.001/<0.001 | <0.001 | <0.001 | <0.001

* Mean values are listed outside the brackets. Median values are listed in brackets. P values were obtained using ANOVA. neuro-onc = neuro-oncology.

† Manually calculated.

‡ From left to right, the values were obtained from the Scopus, Google Scholar, and Web of Science databases.

Academic Rank

Citation metric calculations were carried out for 99 chairs, 290 professors, 261 associate professors, 472 assistant professors, and 21 instructors. In general, there was good agreement among the 3 databases. The ranges in average values for the h-index, m-quotient, hc-index, and g-index between an instructor and a chair were 5–31, 0.51–1.02, 5–17, and 12–55, respectively. All citation metrics were positively correlated with increasing academic rank (Kruskal-Wallis, p < 0.001).

Sex

There were 1144 male and 81 female neurosurgeons. The h-index appeared to vary with sex for all 3 databases (2-tailed Mann-Whitney, p < 0.001); however, when corrected for academic rank, this difference was no longer significant (2-way ANOVA, p = 0.111, p = 0.291, and p = 0.423). The m-quotient did not vary with sex (2-tailed Mann-Whitney, p = 0.211). Similar to the h-index, the hc-index and g-index did not vary with sex when corrected for academic rank (2-way ANOVA, p = 0.167 and p = 0.380, respectively).

Subspecialty

There was a statistically significant difference in h-indices among the various neurosurgical subspecialties (Kruskal-Wallis, p < 0.001). The distribution of values seemed to cluster into 2 groups: 1) general (n = 106), pediatrics (n = 190), and spine (n = 317) with an average h-index range of 10–14; and 2) functional/epilepsy (n = 123), peripheral nerve (n = 17), radiosurgery (n = 43), neuro-oncology/skull base (n = 195), and vascular (n = 167) with an average h-index range of 15–19. The same differences were seen for the m-quotient (0.58–0.64 vs 0.70–0.87), hc-index (8–9 vs 10–13), and g-index (22–25 vs 29–35).

Departmental h-Indices

Table 3 lists a ranking of the 99 programs based on the cumulative manually calculated Scopus h-index. Each department had at least 3 faculty members listed on their website. The top 5 programs were University of California, San Francisco; Barrow Neurological Institute; Johns Hopkins University; University of Pittsburgh; and University of California, Los Angeles.

TABLE 3:

Departmental rankings using the summation of manually calculated Scopus h-indices for all 99 departments included in this study

Program | Rank | Mean h-Index | No. of Faculty | Σ h-Index
University of California, San Francisco | 1 | 23.15 | 27 | 625
Barrow Neurological Institute | 2 | 20.52 | 25 | 513
Johns Hopkins University | 3 | 23.14 | 22 | 509
University of Pittsburgh | 4 | 15.19 | 31 | 471
University of California, Los Angeles | 5 | 23.35 | 20 | 467
Columbia University | 6 | 25.63 | 16 | 410
Massachusetts General Hospital | 7 | 19.30 | 20 | 386
University of Virginia | 8 | 30.08 | 12 | 361
Stanford University | 9 | 14.87 | 23 | 342
Mayo School of Graduate Medical Education | 10 | 26.42 | 12 | 317
University of Pennsylvania | 11 | 18.88 | 16 | 302
Washington University | 12 | 18.69 | 16 | 299
Duke University Hospital | 13 | 19.87 | 15 | 298
University of Southern California | 14 | 21.21 | 14 | 297
University of Washington | 15 | 20.79 | 14 | 291
Harvard/Brigham & Women's Hospital | 16 | 22.23 | 13 | 289
University of Utah | 17 | 18.53 | 15 | 278
Mount Sinai School of Medicine | 18 | 10.62 | 26 | 276
University of Miami | 19 | 16.00 | 17 | 272
Thomas Jefferson University | 20 | 12.52 | 21 | 263
Northwestern University | 21 | 14.50 | 18 | 261
Ohio State University | 22 | 13.58 | 19 | 258
Yale University | 23 | 20.25 | 12 | 243
Oregon Health & Science University | 24 | 21.91 | 11 | 241
Cedars Sinai Medical Center | 25 | 21.09 | 11 | 232
Emory University | 26 | 13.53 | 17 | 230
University of Florida | 27 | 20.82 | 11 | 229
University of California, San Diego | 28 | 17.31 | 13 | 225
University of South Florida | 29 | 14.06 | 16 | 225
Mayfield Clinic/University of Cincinnati | 30 | 11.84 | 19 | 225
Vanderbilt University | 31 | 14.80 | 15 | 222
University of Michigan | 32 | 11.22 | 18 | 202
Indiana University | 33 | 12.63 | 16 | 202
University of Alabama, Birmingham | 34 | 14.00 | 14 | 196
New York University | 35 | 13.93 | 14 | 195
University of Texas Southwestern | 36 | 11.35 | 17 | 193
University of Wisconsin | 37 | 13.57 | 14 | 190
Baylor College of Medicine | 38 | 12.67 | 15 | 190
Allegheny General Hospital | 39 | 10.44 | 18 | 188
Colorado University | 40 | 11.69 | 16 | 187
University at Buffalo | 41 | 16.82 | 11 | 185
Semmes-Murphey Clinic/University of Tennessee, Memphis | 42 | 12.93 | 14 | 181
Cornell University | 43 | 16.27 | 11 | 179
NSLIJ/Hofstra University | 44 | 10.63 | 16 | 170
Henry Ford Hospital | 45 | 17.44 | 9 | 157
Methodist Houston | 46 | 11.85 | 13 | 154
University of Texas, Houston | 47 | 11.77 | 13 | 153
University of Maryland | 48 | 18.63 | 8 | 149
Virginia Commonwealth University | 49 | 14.80 | 10 | 148
Case Western Reserve University | 50 | 11.08 | 13 | 144
University of Illinois, Chicago | 51 | 11.08 | 13 | 144
New York Medical College | 52 | 7.47 | 19 | 142
Medical College of Wisconsin | 53 | 15.67 | 9 | 141
Brown University School of Medicine | 54 | 13.90 | 10 | 139
Albert Einstein College of Medicine | 55 | 12.27 | 11 | 135
University of Iowa | 56 | 18.43 | 7 | 129
Rush University Medical Center | 57 | 17.57 | 7 | 123
SUNY/Upstate Medical University | 58 | 13.56 | 9 | 122
Penn State University | 59 | 10.82 | 11 | 119
University of Chicago | 60 | 14.13 | 8 | 113
University of California, Davis | 61 | 11.89 | 9 | 107
Tufts Medical Center | 62 | 14.00 | 7 | 98
University of Arkansas | 63 | 12.25 | 8 | 98
University of Minnesota | 64 | 12.00 | 8 | 96
Louisiana State University, Shreveport | 65 | 15.67 | 6 | 94
West Virginia University | 66 | 8.55 | 11 | 94
Wake Forest University | 67 | 10.22 | 9 | 92
Dartmouth University | 68 | 15.00 | 6 | 90
University of New Mexico | 69 | 15.00 | 6 | 90
University of Rochester | 70 | 9.00 | 10 | 90
University of Nebraska | 71 | 6.21 | 14 | 87
George Washington University | 72 | 7.08 | 12 | 85
University of Illinois, Peoria | 73 | 10.38 | 8 | 83
Georgia Regents University | 74 | 8.89 | 9 | 80
Medical University of South Carolina | 75 | 7.90 | 10 | 79
University of Kentucky | 76 | 12.67 | 6 | 76
Mayo Florida | 77 | 12.50 | 6 | 75
University of Medicine and Dentistry of New Jersey | 78 | 11.67 | 6 | 70
University of Louisville | 79 | 11.50 | 6 | 69
University of Texas, San Antonio | 80 | 8.63 | 8 | 69
Loma Linda University | 81 | 8.38 | 8 | 67
University of California, Irvine | 82 | 13.20 | 5 | 66
Wayne State University | 83 | 8.13 | 8 | 65
Albany Medical Center | 84 | 7.22 | 9 | 65
University of Oklahoma | 85 | 7.75 | 8 | 62
Saint Louis University | 86 | 9.67 | 6 | 58
University of Vermont | 87 | 14.00 | 4 | 56
University of North Carolina | 88 | 9.33 | 6 | 56
National Institutes of Health | 89 | 11.00 | 5 | 55
University of Kansas | 90 | 10.80 | 5 | 54
Louisiana State University, New Orleans | 91 | 10.00 | 5 | 50
University of Missouri | 92 | 16.00 | 3 | 48
Loyola University | 93 | 9.60 | 5 | 48
University of Arizona | 94 | 10.50 | 4 | 42
University of Mississippi | 95 | 7.00 | 6 | 42
Georgetown University | 96 | 5.13 | 8 | 41
Tulane University | 97 | 5.13 | 8 | 41
University of Puerto Rico | 98 | 3.29 | 7 | 23
Temple University | 99 | 7.33 | 3 | 22

Discussion

Bibliometrics (or citation metrics) is a powerful and useful tool. It allows one to quantitatively evaluate, in as much detail as he or she would like, the scope, breadth, and impact of an author's publication output. These metrics can be used to compare one's achievements with those of his or her peers. It is a tool that can be used by a dean or department chair to determine, along with other factors, promotion or tenure. Some governments, particularly those with centralized and socialized health care delivery systems, are using citation analysis as a means of establishing research quality and impact. For example, the Australian government established the Research Quality Framework to determine the quality and impact of government-funded research, one method of which was the use of quantitative metrics. Some “collectable” quality indicators included 1) the number of highly-cited articles published; 2) the number of articles in high-quality journals; and 3) the number of citations of articles within articles published in high-quality journals.13 In 2004, INSERM (the French National Institutes of Health and Medical Research) introduced bibliometrics as part of its research assessment procedures.42

The use of citation metrics in neurosurgery is still in its infancy.2 It began in 2009, when Lee et al. demonstrated a positive correlation between the h-index and academic rank in a small group of neurosurgeons from 30 programs.33 Two studies have since been undertaken in which data on large numbers of individuals were collected, similar to what we have done in this paper. In 2010, Spearman et al.45 used Google Scholar to calculate the h-index for 1120 academic neurosurgeons in the US. They found that the overall median h-index was 9 (range 0–68); for instructors, assistants, associates, full professors, chairs, and program directors it was 2, 5, 10, 19, 22, and 17, respectively. The following year, Campbell et al. collected information on 986 neurosurgeons from 97 programs.12 The h-index was determined using Scopus and Web of Science. The mean Web of Science h-indices were 12.6 for assistant professors, 15.9 for associate professors, and 26.3 for full professors. The mean Scopus h-indices were 5.6 for assistant professors, 9.7 for associate professors, and 16.0 for full professors. Both papers confirmed the correlation between academic rank and h-index. Earlier this year, we performed 2 pilot studies that evaluated the top 10 U.S. News & World Report programs for academic rank, authorship value, impact factor, subspecialty, and departmental rankings.30,31 These pilot studies revealed to us the need for standardization of bibliometrics in neurosurgery.

Our current study represents the most comprehensive attempt to provide bibliometric “benchmarks” for academic neurosurgery. Almost all academic neurosurgeons and departments in the US were analyzed. In addition to the most well-described metric—the h-index—we have also collected data on a number of other metrics that complement the h-index. The m-quotient allows a more balanced comparison between researchers of significantly differing publication longevity. Chairs and full professors overall were able to keep their m-quotient at or near 1.0, a remarkable accomplishment. This means that over the course of their career, they have been able to increase their h-index by 1 point for each year that they have been in practice. The contemporary or hc-index adds an age-related weighting to each cited article, thus correcting for the property of the h-index never decreasing with time, even if the researcher is no longer publishing. Our data show that the spread of hc-index values is not as great from instructor to chairman (5–17) as it is for the h-index (5–31). This indicates that a significant proportion of the h-index for senior neurosurgeons is due to the ability to continue reaping citation benefits from dated publications that are not yet available to younger neurosurgeons. Finally, the g-index gives more weight to highly cited articles, whose extra citations go unrecognized by the h-index. Again, we see that senior neurosurgeons—chairs (55) and full professors (45)—have a group of very highly cited publications that younger neurosurgeons—instructors (12) and assistant professors (14)—do not have and that associate professors (23) are gradually acquiring.

This is the first study to compare all 3 currently available citation databases (Scopus, Google Scholar, and Web of Science). While we showed that overall there was good agreement between the citation databases, there may be considerable variation from person to person. Widely differing citation counts based on which database is used have been demonstrated repeatedly.4,32,37 Table 4 summarizes the key features of the 3 currently available citation databases. Despite the limitation of not counting citations prior to 1996, we feel that Scopus is most appropriate for analysis at the individual level. Scopus identifies individuals by a unique identifier, whereas Google Scholar and Web of Science are susceptible to contaminated searches. For instance, if we search for “Neil Martin” in Google Scholar or Web of Science, we will retrieve all the papers published by anyone of this name in any field; Scopus allows the user to choose which “Neil Martin” is the correct one. To correct for this limitation in Google Scholar and Web of Science, we paid particular attention to refining searches and limiting our results to studies related to neurosurgery, a process that may be impractical on a day-to-day basis, particularly when analyzing neurosurgeons with common names or with high publication output. Although Scopus' drawback of not counting citations before 1996 will affect neurosurgeons whose publishing careers date back to 1995 and earlier, it will have less and less impact as time goes on and seasoned academic neurosurgeons retire.

TABLE 4:

Characteristics of the 3 currently available citation databases

Parameter | GS | Scopus (Elsevier) | WOS (Thomson Reuters)
update | weekly | daily | weekly
coverage | not provided by GS | as of March 2013, 49 million records & >20,500 peer-reviewed journal titles in the life sciences, social sciences, health sciences, & physical sciences; more coverage of non–English-language literature | >11,000 journals covered by Journal Citation Reports; coverage not well defined
yrs covered | not provided by GS | 1966 to present for some journals; 1996 to present for citations | 1900–present (science), 1956–present (social science), 1975–present (arts & humanities); 1900–present for citations
fee-based | no | yes | yes
origin | US, 2004 | Europe (the Netherlands), 2004 | US, 1960s
restrictions | problematic in delineating individuals w/ common names; includes conference proceedings, patents, books, & white papers | no citations prior to 1996 included | problematic in delineating individuals w/ common names

In addition to academic rank, we also evaluated bibliometrics based on sex and neurosurgical subspecialty. This paper represents the first comparison of publication productivity based on sex in the field of neurosurgery. The bibliometrics in Table 2 suggest that male neurosurgeons, as a whole, are more academically productive than their female counterparts. This is an observation that has been made by others and has been given the term “the productivity puzzle.”15,22,53 Various explanations have been put forth, including 3 hypotheses by American economist Larry Summers: 1) the high-powered job hypothesis; 2) different availability of aptitude at the high end; and 3) different socialization and patterns of discrimination in a search.46 These views were criticized in an editorial in Nature by Ben Barres.7 One potential explanation for the difference is that female researchers produce fewer but higher-quality publications.34,49 Symonds et al. devised an alternative to the h-index, called the “Research Status,” which recognizes those whose publication history has more significant impact.49 Eloy et al. found that, among otolaryngologists, women demonstrated a different productivity curve.21 Female otolaryngologists produced less research output earlier in their careers than men did, but at senior levels they equaled or exceeded the research productivity of men. When we corrected for the confounder of academic rank by stratification, female neurosurgeons had h-index, hc-index, and g-index values similar to those of males. The m-quotient did not vary by sex even without correction for academic rank. Choi et al. also found that sex differences in publication productivity were corrected by academic rank among radiation oncologists.14 A more thorough analysis of the sex differences in publishing among academic neurosurgeons is needed.

There also appears to be a difference in productivity based on neurosurgical subspecialty. General, spine, and pediatric neurosurgeons had lower bibliometrics compared with peripheral nerve, neuro-oncology/skull base, radiosurgery, functional/epilepsy, and vascular neurosurgeons. With the exception of the recent paper by Kalra and Kestle,29 this represents the first attempt to compare publication productivity among neurosurgical subspecialties. Kalra and Kestle used Google Scholar (Harzing's Publish or Perish) to determine the h- and g-indices for 72 academic pediatric neurosurgeons. They found that the overall mean h- and g-indices were 16.6 and 29.5, respectively. These are higher than our figures (14.1 and 26.5; Table 1), likely as a result of those authors selecting pediatric neurosurgeons from fellowship-accredited programs. Their mean h-index as calculated by faculty rank was 7.8 for assistant professors, 13.0 for associate professors, and 27.9 for professors. For the g-index, these values were 14.5 for assistant professors, 25.0 for associate professors, and 48.0 for full professors. These 2 sets of numbers are comparable to ours (Table 2).

Bibliometrics may also be used to compare departments. Ponce and Lozano38 ranked American and Canadian neurosurgical programs by 3 different h-indices: one reflecting the cumulative work attributed to a neurosurgical department—h(c); one restricted to the cumulative work published over the past 10 years—h(10); and one limited to work published in 2 major North American neurosurgical journals—h(NS)(10). The use of these different methods resulted in large shifts in the rankings of the 99 departments, ranging from a rise of 45 positions to a fall of 70. However, their study suffered from methodological drawbacks. They almost certainly included nonneurosurgeons in their analysis (for example, faculty with a Ph.D.), and they used Web of Science, which is generally viewed as the weakest of the 3 citation databases. Furthermore, their outcomes were susceptible to large fluctuations depending on what text was used in each specific search string to define a department.

In our study we sought to develop a more accurate method of creating departmental rankings using the publication productivity of neurosurgeons only. A simple arithmetic mean would not suffice to compare departments with different numbers of individuals. For instance, the University of California, San Francisco, has 27 faculty members in our calculation, and even though Cedars Sinai has a similar overall mean, it is clear that the University of California, San Francisco, has greater research productivity because of its larger faculty. A cumulative h-index would also be inadequate because it would be too susceptible to the publishing history of senior members. In other words, many junior faculty members' h-indices would have no bearing on the cumulative h-index for a department with a chair and other senior faculty who possess high h-index values. Therefore, we used a summation of all individual h-indices within a department. This method provides an accurate representation of a department's publication output by taking into account each member's accomplishments and the number of neurosurgeons within said department.
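
As an illustration of this ranking method, the following is a minimal sketch with hypothetical departments and h-indices (not the study data):

```python
from collections import defaultdict

# Hypothetical (department, individual h-index) pairs.
faculty_h = [
    ("Dept A", 30), ("Dept A", 20), ("Dept A", 12), ("Dept A", 8), ("Dept A", 5),
    ("Dept B", 35), ("Dept B", 28),
]

sums = defaultdict(int)
counts = defaultdict(int)
for dept, h in faculty_h:
    sums[dept] += h
    counts[dept] += 1

# Dept B has the far higher mean (31.5 vs 15.0), but Dept A's larger faculty
# gives it the greater summed output (75 vs 63), which is what we rank on.
for dept in sorted(sums, key=sums.get, reverse=True):
    print(dept, sums[dept], round(sums[dept] / counts[dept], 2))
```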

Our study has a number of limitations. One minor weakness is that we could not account for all individuals or programs. This small amount of missing data, however, would be unlikely to change any of our findings. The greatest potential criticism is the assumption that citations, and the bibliometrics that arise from citation analysis, are surrogates for quality in publishing. While it is generally safe to presume that the more times an article is cited, the greater its impact on neurosurgery, this may not always be the case. Some articles that generate high citation counts may present opinions that are eventually proven wrong. Review articles are often highly referenced, but they should not be viewed in the same fashion as an original research contribution with the same or a greater number of citations. Younger, less well-known researchers may suffer from what is termed the “Mendel syndrome,” defined by Garfield in 1979 as the “. . . inability of citation counts to identify premature discoveries—work that is highly significant but so far ahead of the field that it goes unnoticed.”17,24 On the other hand, veteran and established researchers may have their work recognized more merely because of name recognition rather than the substance of the publication, a phenomenon known as the “Matthew effect.”51 We view citations as an indicator of the “interest” generated by an article within the academic community. Only the passage of time will determine the verdict on its quality and ultimate impact on neurosurgery. As stated by Hirsch, “. . . a single number can never give more than a rough approximation to an individual's multifaceted profile, and many other factors should be considered in combination in evaluating an individual.”28 We could not agree more. But on the specific issue of publication productivity, we do feel that a collection of bibliometrics (that is, a bibliometric “profile”), each metric with its own strengths, can paint a detailed and accurate picture of an individual's publication and research efforts. Academic neurosurgery should embrace and refine such profiles to provide the most objective method of quantifying publications and their impact.

Conclusions

Interest in the application of bibliometrics is growing. Our paper represents a comprehensive analysis of nearly all academic neurosurgeons and departments within the US. We have created the largest bibliometric profile database—h-index, m-quotient, hc-index, and g-index—in an effort to create benchmarks for individuals. All citation metrics varied across academic rank. We have, for the first time, analyzed publication productivity between male and female neurosurgeons. Males are more productive, but not when corrected for the confounder of academic rank. There also appears to be a difference in publication efforts among neurosurgical subspecialties, with vascular, neuro-oncology/skull base, functional, radiosurgery, and peripheral nerve neurosurgeons being the more productive. Departmental academic output can be adequately measured by the department-wide sum of individual faculty members' h-indices. Caution must be exercised when using Google Scholar or Web of Science to calculate individual h-indices; Scopus may provide more accurate results for individual analysis.

Acknowledgment

We thank Andrew J. Gienapp for his expertise in medical editing and contributions to this manuscript.

Disclosure

The authors report no conflict of interest concerning the materials or methods used in this study or the findings specified in this paper.

Author contributions to the study and manuscript preparation include the following. Conception and design: Klimo, Khan. Acquisition of data: Khan, Taylor, Venable, Wham. Analysis and interpretation of data: Klimo, Khan, Thompson. Drafting the article: Klimo, Khan, Michael. Critically revising the article: all authors. Reviewed submitted version of manuscript: all authors. Approved the final version of the manuscript on behalf of all authors: Klimo. Statistical analysis: Khan, Thompson. Administrative/technical/material support: Klimo, Michael. Study supervision: Klimo.

Portions of this paper were presented in a poster at the 2013 Annual Meeting of the Congress of Neurological Surgeons in San Francisco, CA, October 19–23, 2013.

References

1. Alonso S, Cabrerizo FJ, Herrera-Viedma E, Herrera F: h-index: a review focused in its variants, computation and standardization for different scientific fields. J Informetr 3:273–289, 2009
2. Aoun SG, Bendok BR, Rahme RJ, Dacey RG Jr, Batjer HH: Standardizing the evaluation of scientific and academic performance in neurosurgery-critical review of the “h” index and its variants. World Neurosurg [epub ahead of print], 2012
3. Atasoylu AA, Wright SM, Beasley BW, Cofrancesco J Jr, Macpherson DS, Partridge T: Promotion criteria for clinician-educators. J Gen Intern Med 18:711–716, 2003
4. Bakkalbasi N, Bauer K, Glover J, Wang L: Three options for citation tracking: Google Scholar, Scopus and Web of Science. Biomed Digit Libr 3:7, 2006
5. Baldock C, Ma R, Orton CG: Point/counterpoint. The h index is the best measure of a scientist's research productivity. Med Phys 36:1043–1045, 2009
6. Bar-Ilan J: Informetrics at the beginning of the 21st century—a review. J Informetr 2:1–52, 2008
7. Barres BA: Does gender matter? Nature 442:133–136, 2006
8. Beasley BW, Wright SM, Cofrancesco J Jr, Babbott SF, Thomas PA, Bass EB: Promotion criteria for clinician-educators in the United States and Canada. A survey of promotion committee chairpersons. JAMA 278:723–728, 1997
9. Benway BM, Kalidas P, Cabello JM, Bhayani SB: Does citation analysis reveal association between h-index and academic rank in urology? Urology 74:30–33, 2009
10. Bligh J, Brice J: Further insights into the roles of the medical educator: the importance of scholarly management. Acad Med 84:1161–1165, 2009
11. Bornmann L, Daniel HD: What do we know about the h index? J Am Soc Inf Sci Technol 58:1381–1385, 2007
12. Campbell PG, Awe OO, Maltenfort MG, Moshfeghi DM, Leng T, Moshfeghi AA: Medical school and residency influence on choice of an academic career and academic productivity among neurosurgery faculty in the United States. Clinical article. J Neurosurg 115:380–386, 2011
13. Cheek J, Garnham B, Quan J: What's in a number? Issues in providing evidence of impact and quality of research(ers). Qual Health Res 16:423–435, 2006
14. Choi M, Fuller CD, Thomas CR Jr: Estimation of citation-based scholarly activity among radiation oncology faculty at domestic residency-training institutions: 1996–2007. Int J Radiat Oncol Biol Phys 74:172–178, 2009
15. Cole JR, Zuckerman H: The productivity puzzle: persistence and changes in patterns of publication of men and women scientists. Adv Motiv Achiev 2:217–258, 1984
16. Costas R, Bordons M: The h-index: advantages, limitations and its relation with other bibliometric indicators at the micro level. J Informetr 1:193–203, 2007
17. Costas R, van Leeuwen TN, van Raan AF: The “Mendel syndrome” in science: durability of scientific literature and its effects on bibliometric analysis of individual scientists. Scientometrics 89:177–205, 2011
18. Egghe L: The Hirsch index and related impact measures. Ann Rev Info Sci Tech 44:65–114, 2010
19. Egghe L: An improvement of the h-index: the g-index. ISSI Newsletter 2:8–9, 2006
20. Egghe L: Theory and practice of the g-index. Scientometrics 69:131–152, 2006
21. Eloy JA, Svider P, Chandrasekhar SS, Husain Q, Mauro KM, Setzen M: Gender disparities in scholarly productivity within academic otolaryngology departments. Otolaryngol Head Neck Surg 148:215–222, 2013
22. Eloy JA, Svider PF, Cherla DV, Diaz L, Kovalerchik O, Mauro KM: Gender disparities in research productivity among 9,952 academic physicians. Laryngoscope 123:1865–1875, 2013
23. Falagas ME, Pitsouni EI, Malietzis GA, Pappas G: Comparison of PubMed, Scopus, Web of Science, and Google Scholar: strengths and weaknesses. FASEB J 22:338–342, 2008
24. Garfield E: Is citation analysis a legitimate evaluation tool? Scientometrics 1:359–375, 1979
25. Gaster N, Gaster M: A critical assessment of the h-index. Bioessays 34:830–832, 2012
26. Harzing AW: The Publish or Perish Book: Your Guide to Effective and Responsible Citation Analysis. Melbourne, Australia: Tarma Software Research Pty Ltd, 2011
27. Harzing AW, van der Wal R: Google Scholar as a new source for citation analysis? Ethics Sci Environ Polit 8:62–71, 2008
28. Hirsch JE: An index to quantify an individual's scientific research output. Proc Natl Acad Sci U S A 102:16569–16572, 2005
29. Kalra RR, Kestle JR: An assessment of academic productivity in pediatric neurosurgery. Clinical article. J Neurosurg Pediatr 12:262–265, 2013
30. Khan N, Thompson CJ, Choudhri AF, Boop FA, Klimo P Jr: Part I: The application of the h-index to groups of individuals and departments in academic neurosurgery. World Neurosurg [epub ahead of print], 2013
31. Khan NR, Thompson CJ, Taylor DR, Gabrick KS, Choudhri AF, Boop FR: Part II: Should the h-index be modified? An analysis of the m-quotient, contemporary h-index, authorship value, and impact factor. World Neurosurg [epub ahead of print], 2013
32. Kulkarni AV, Aziz B, Shams I, Busse JW: Comparisons of citations in Web of Science, Scopus, and Google Scholar for articles published in general medical journals. JAMA 302:1092–1096, 2009
33. Lee J, Kraus KL, Couldwell WT: Use of the h index in neurosurgery. Clinical article. J Neurosurg 111:387–392, 2009
34. Long JS: Measures of sex differences in scientific productivity. Soc Forces 71:159–178, 1992
35. Moed HF: New developments in the use of citation analysis in research evaluation. Arch Immunol Ther Exp (Warsz) 57:13–18, 2009
36. Pagel PS, Hudetz JA: An analysis of scholarly productivity in United States academic anaesthesiologists by citation bibliometrics. Anaesthesia 66:873–878, 2011
37. Patel VM, Ashrafian H, Almoudaris A, Makanjuola J, Bucciarelli-Ducci C, Darzi A: Measuring academic performance for healthcare researchers with the H index: which search tool should be used? Med Princ Pract 22:178–183, 2013
38. Ponce FA, Lozano AM: Academic impact and rankings of American and Canadian neurosurgical departments as assessed using the h index. Clinical article. J Neurosurg 113:447–457, 2010
39. Poynard T, Thabut D, Munteanu M, Ratziu V, Benhamou Y, Deckmyn O: Hirsch index and truth survival in clinical research. PLoS ONE 5:e12044, 2010
40. Quigley MR, Holliday EB, Fuller CD, Choi M, Thomas CR Jr: Distribution of the h-index in radiation oncology conforms to a variation of power law: implications for assessing academic productivity. J Cancer Educ 27:463–466, 2012
41. Rad AE, Brinjikji W, Cloft HJ, Kallmes DF: The H-index in academic radiology. Acad Radiol 17:817–821, 2010
42. Sahel JA: Quality versus quantity: assessing individual research performance. Sci Transl Med 3:84cm13, 2011
43. Saleem T: The Hirsch index – a play on numbers or a true appraisal of academic output? Int Arch Med 4:25, 2011
44. Sidiropoulos A, Katsaros D, Manolopoulos Y: Generalized Hirsch h-index for disclosing latent facts in citation networks. Scientometrics 72:253–280, 2007
45. Spearman CM, Quigley MJ, Quigley MR, Wilberger JE: Survey of the h index for all of academic neurosurgery: another power-law phenomenon? Clinical article. J Neurosurg 113:929–933, 2010
46. Summers LH: Remarks at NBER conference on diversifying the science and engineering workforce. Harvard University (http://www.harvard.edu/president/speeches/summers_2005/nber.php) [Accessed November 24, 2013]
47. Svider PF, Choudhry ZA, Choudhry OJ, Baredes S, Liu JK, Eloy JA: The use of the h-index in academic otolaryngology. Laryngoscope 123:103–106, 2013
48. Svider PF, Mauro KM, Sanghvi S, Setzen M, Baredes S, Eloy JA: Is NIH funding predictive of greater research productivity and impact among academic otolaryngologists? Laryngoscope 123:118–122, 2013
49. Symonds MR, Gemmell NJ, Braisher TL, Gorringe KL, Elgar MA: Gender differences in publication output: towards an unbiased metric of research performance. PLoS ONE 1:e127, 2006
50. Turaga KK, Gamblin TC: Measuring the surgical academic output of an institution: the “institutional” H-index. J Surg Educ 69:499–503, 2012
51. Wendl MC: H-index: however ranked, citations need context. Nature 449:403, 2007 (Letter)
52. Williamson JR: My h-index turns 40: my midlife crisis of impact. ACS Chem Biol 4:311–313, 2009
53. Xie Y, Shauman KA: Sex differences in research productivity: new evidence about an old puzzle. Am Sociol Rev 63:847–870, 1998
54. Zhang CT: The e-index, complementing the h-index for excess citations. PLoS ONE 4:e5429, 2009
55. Zhang L, Thijs B, Glänzel W: The diffusion of H-related literature. J Informetr 5:583–593, 2011


Article Information

Address correspondence to: Paul Klimo Jr., M.D., M.P.H., Semmes-Murphey Neurologic & Spine Institute, 6325 Humphreys Blvd., Memphis, TN 38120. email: pklimo@semmes-murphey.com.

Please include this information when citing this paper: published online December 20, 2013; DOI: 10.3171/2013.11.JNS131708.

© AANS, except where prohibited by US copyright law.

