Annual Meeting Reports

Bridging the Geographic Science Gap: Editors’ View of Evaluating and Rating Scientific Papers

In this session, the speakers—through case studies in China, Brazil, and Croatia—discussed how journals evaluate scientific research papers quantitatively.

Adrian Stanley, chief executive officer of The Charlesworth Group (USA), Inc, noted China’s increases in the number of institutions of higher learning (from 2000 in 2002 to 4000 in 2005) and in the number of doctorates awarded (from 5000 in 1997 to more than 35,000 in 2009). Researchers in China aim to publish in journals with impact factors of 2 or higher. The Chinese Ministry of Education investigates annually how many papers universities publish in journals that are included in the Science Citation Index. Stanley noted that according to the Thomson Reuters Science Watch 2007 report, China ranked 5th in number of papers and 12th in number of citations among 148 countries.

Stanley said that Thomson Reuters has worked with the Chinese Academy of Sciences on a Chinese Science Citation Database, which covers 1200 top Chinese scholarly periodicals and has nearly 2 million records. He enumerated some criteria that the Institute of Scientific and Technical Information of China uses in evaluating scientific journals. Among the criteria are quotation rate (number of citations by other journals divided by number of all citations) and spreading factor (the average number of journals per 100 citations). He noted the overreliance by some in China on the impact factor in judging the quality of manuscripts, and he said that as Chinese researchers improve their use of English, the Chinese publishing and journal system will improve.
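The two criteria mentioned above are simple ratios over a journal's incoming citations. The sketch below shows one plausible reading of those definitions; the journal names and citation data are illustrative only, not taken from the ISTIC evaluation itself.

```python
def quotation_rate(citing_journals, journal):
    """Citations received from *other* journals divided by all
    citations received (self-citations lower the rate)."""
    total = len(citing_journals)
    external = sum(1 for j in citing_journals if j != journal)
    return external / total if total else 0.0

def spreading_factor(citing_journals):
    """Average number of distinct citing journals per 100 citations:
    higher values mean citations are spread across more journals."""
    total = len(citing_journals)
    return 100 * len(set(citing_journals)) / total if total else 0.0

# Hypothetical citation list for a journal "Chin Phys B":
cites = ["J Appl Phys", "J Appl Phys", "Chin Phys B",
         "Chin Phys B", "Chin Phys B"]
print(quotation_rate(cites, "Chin Phys B"))  # → 0.4 (2 of 5 external)
print(spreading_factor(cites))               # → 40.0 (2 journals per 5 citations)
```

On this reading, a journal cited heavily by itself scores a low quotation rate, and one cited from many different journals scores a high spreading factor.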

Mauricio Rocha e Silva reported on a study evaluating English writing proficiency and publication rates of Brazilian scientists. He said that Brazil is one of the 15 largest economies in the world ($1.2 trillion) but is ranked 50th among 80 nations in human-development indexes and 15th in production of science.

Rocha e Silva mentioned that the Brazilian Ministry of Science and Technology keeps a freely accessible Internet-based registry of curricula vitae of all Brazilian scientists (more than 60,000), called the Lattes Platform. An analysis of the database indicated that scientists who assessed themselves as proficient in English writing published more scientific papers than their counterparts who were not as proficient in English. The research also showed that scientists who were proficient in English published in journals with higher impact factors. “If you have good writing skills, you are cited often,” Rocha e Silva explained. He said that in Brazil—in contrast to China, India, and North Korea—English has not been adopted as a language of instruction, and he argued that Brazilian scientists need to be more proficient in English writing to increase their visibility globally.

Ana Marusic presented a study—of which she was a coauthor—on the citation rates and visibility of large and small journals. The study included an analysis of Croatian journals indexed in SCOPUS (an abstract and citation database of research literature and Web sources): 11 journals had also been indexed in the Web of Science before 2007 and had impact factors, and 16 were newly indexed in the Web of Science.

To explore whether being indexed in the Web of Science (and having the prospect of an impact factor) increased citations in a different database, Marusic and colleagues looked at citation trends in SCOPUS (total number of citations of the two groups of journals in a calendar year divided by the total number of articles in the journals in the same year). They found no statistically significant difference between the two groups of journals. Regardless of the indexing, good journals that published original research received citations and were more visible to the international scientific community.
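The per-group measure described above can be sketched as a single ratio. The data structure and numbers below are hypothetical placeholders, not figures from the study:

```python
def group_citation_rate(journals):
    """Total citations received by a group of journals in a calendar
    year, divided by the total number of articles those journals
    published in the same year."""
    total_cites = sum(j["citations"] for j in journals)
    total_articles = sum(j["articles"] for j in journals)
    return total_cites / total_articles if total_articles else 0.0

# Hypothetical group of two journals for one calendar year:
group = [
    {"name": "Journal A", "citations": 30, "articles": 60},
    {"name": "Journal B", "citations": 10, "articles": 20},
]
print(group_citation_rate(group))  # → 0.5 (40 citations / 80 articles)
```

Computing this ratio per year for each group of journals yields the citation trends that were then compared between the two groups.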

She recommended that journals in small countries put their content online to increase their citations and visibility.