Similar Documents
20 similar documents found (search time: 31 ms)
1.
Questions of definition and measurement continue to constrain a consensus on the measurement of interdisciplinarity. Using Rao-Stirling (RS) Diversity sometimes produces anomalous results. We argue that these unexpected outcomes can be related to the use of “dual-concept diversity” which combines “variety” and “balance” in the definitions (ex ante). We propose to modify RS Diversity into a new indicator (DIV) which operationalizes “variety,” “balance,” and “disparity” independently and then combines them ex post. “Balance” can be measured using the Gini coefficient. We apply DIV to the aggregated citation patterns of 11,487 journals covered by the Journal Citation Reports 2016 of the Science Citation Index and the Social Sciences Citation Index as an empirical domain and, in more detail, to the citation patterns of 85 journals assigned to the Web-of-Science category “information science & library science” in both the cited and citing directions. We compare the results of the indicators and show that DIV provides improved results in terms of distinguishing between interdisciplinary knowledge integration (citing references) versus knowledge diffusion (cited impact). The new diversity indicator and RS diversity measure different features. A routine for the measurement of the various operationalizations of diversity (in any data matrix) is made available online.
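To make the two constructs concrete, here is a minimal Python sketch of the usual Rao-Stirling operationalization and of an ex-post variety × balance × disparity product in the spirit of DIV. The function names, the use of 1 − Gini as the balance term, and the normalizations are assumptions for illustration; the paper's online routine is the authoritative implementation.

```python
import numpy as np

def gini(p):
    """Gini coefficient of a probability vector (0 = perfectly balanced)."""
    p = np.sort(np.asarray(p, dtype=float))
    n = len(p)
    cum = np.cumsum(p)
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

def rao_stirling(p, d):
    """Rao-Stirling diversity: sum over i != j of p_i * p_j * d_ij."""
    p, d = np.asarray(p, float), np.asarray(d, float)
    off = ~np.eye(len(p), dtype=bool)
    return float(np.sum(np.outer(p, p)[off] * d[off]))

def div_expost(p, d, n_classes_total):
    """Ex-post product of variety, balance (1 - Gini), and mean disparity (a sketch)."""
    p, d = np.asarray(p, float), np.asarray(d, float)
    active = p > 0
    n_c = int(active.sum())
    if n_c < 2:
        return 0.0
    variety = n_c / n_classes_total
    balance = 1.0 - gini(p[active])
    off = ~np.eye(n_c, dtype=bool)
    disparity = d[np.ix_(active, active)][off].mean()
    return variety * balance * disparity

# Toy example: citation shares of a journal over 4 subject categories and a
# distance matrix between the categories (e.g., 1 - cosine similarity).
p = np.array([0.55, 0.25, 0.15, 0.05])
d = np.array([[0.0, 0.6, 0.8, 0.9],
              [0.6, 0.0, 0.5, 0.7],
              [0.8, 0.5, 0.0, 0.4],
              [0.9, 0.7, 0.4, 0.0]])
print(rao_stirling(p, d), div_expost(p, d, n_classes_total=10))
```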

2.
The paper articulates the problems of journal publication in a relatively small country such as Romania where locally (i.e. nationally) published journals include most of the national medical scientific output. The starting point was a study ordered by the Cluj University of Medicine and Pharmacy Scientific Council, for the purpose of obtaining an objectively ranked list of all current Romanian biomedical journals that could be used in the evaluation of the scientific activity of the university academic staff. Sixty‐five current biomedical journals were identified—of which more than half were new titles that had appeared over the past 5 years. None of these are included in the Science Citation Index or Journal Citation Reports (JCR). A set of criteria was used for ranking the journals: peer review, inclusion in international databases, publication time lag, language of articles and abstracts, journal specific index and domestic impact factor. The period covered, along with tools and formulas used, are presented. The problems of Romanian biomedical journals as well as ways of improving publishing standards are discussed. Also emphasized is the necessity for increased awareness in the medical scholarly community and the role of the library in this respect.

3.
4.
Publishing articles in a prestigious journal is a golden rule for university professors and researchers nowadays. Impact factor, journal rank, and citation count, included in Science Citation Index managed by Thomson Reuters Web of Science, are the most important indicators for evaluating the quality of academic journals. By listing the journals encompassed in the “Integrative and Complementary Medicine” category of Science Citation Index from 2003 to 2013, this paper examines the publication trends of journals in the category. The examination includes number, country of origin, ranking, and languages of journals. Moreover, newly listed or removed journals in the category, journal publishers, and open access strategies are examined. It is concluded that the role of journal publishers should not be underestimated in the “Integrative and Complementary Medicine” category.

5.
Collaboration practices vary greatly per scientific area and discipline and influence the scientific performance and its scholarly communication. In this study, the collaborative pattern of the Information Retrieval (IR) research field is analyzed using co-authored articles retrieved from Social Science Citation Index for a period of 11 years from 1987 to 1997. The level of collaboration, journal collaborative distribution, disciplinary collaborative distribution and country collaboration are probed according to IR collaborative research. Findings are discussed from the above perspectives in detail. In particular, this study reveals a perceptible upward trend of collaborative IR research with the results of these research efforts being reported in all major core IR journals. The inter-disciplinary and intra-disciplinary scholarly communications in collaborative researches are very much in evidence and cover broad areas like psychology, and computer and medical sciences, respectively.

6.
7.
苏成  潘云涛  袁军鹏  马峥 《编辑学报》2015,27(4):330-333
This study constructs optimized PageRank, HITS, and SALSA algorithms suited to journal citation networks. Using both the original and the optimized versions of each algorithm, weights were computed for 1,989 journals in the 2013 Chinese Science and Technology Paper and Citation Database, and the results were compared with the traditional impact factor and total citation counts. The characteristics, strengths and weaknesses, and applicable scope of each algorithm are analyzed and discussed.
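As context for readers unfamiliar with these link-analysis measures, below is a minimal sketch of plain PageRank applied to a journal-level citation matrix. It is not the optimized variant the paper constructs (which the abstract does not specify); the matrix layout, damping factor, and dangling-node treatment are ordinary defaults, not the authors' choices.

```python
import numpy as np

def journal_pagerank(C, d=0.85, tol=1e-10, max_iter=200):
    """Plain PageRank over a journal citation matrix.

    C[i, j] = number of citations from journal i to journal j.
    Returns a weight vector that sums to 1.
    """
    C = np.asarray(C, dtype=float)
    n = C.shape[0]
    row_sums = C.sum(axis=1, keepdims=True)
    # Row-normalize; journals with no outgoing citations cite all journals uniformly.
    T = np.where(row_sums > 0, C / np.where(row_sums == 0, 1, row_sums), 1.0 / n)
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_new = (1 - d) / n + d * (T.T @ r)
        if np.abs(r_new - r).sum() < tol:
            break
        r = r_new
    return r

# Toy example: journal 2 receives citations from both other journals.
C = np.array([[0, 1, 5],
              [2, 0, 4],
              [0, 1, 0]])
print(journal_pagerank(C))
```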

8.
夏旭 《图书馆论坛》2004,24(6):90-95
Taking as its basis a quantitative analysis of research papers in 《复印报刊资料》 (hereinafter "the Materials") and the 2000–2003 full-text reprint rankings of library and information science journals, this paper tests the reasonableness of the ranking by comparing the composite ranking against 《中文核心期刊要目总览》, 《中文社会科学引文索引》, and 《中国人文社会科学核心期刊要览》, and by checking the funded papers and citation counts of the journals indexed in CNKI and 《中文科技期刊引文数据库》. The results show that the Materials' journal ranking is reasonably sound, and that funded papers and citation counts are objective indicators for measuring journal rankings.

9.
The journal impact factor (JIF) has been questioned considerably during its development in the past half-century because of its inconsistency with scholarly reputation evaluations of scientific journals. This paper proposes a publication delay adjusted impact factor (PDAIF), which takes publication delay into account to reduce its negative effect on the quality of the impact factor calculation. Based on citation data collected from Journal Citation Reports and publication delay data extracted from the journals’ official websites, the PDAIFs for journals from business-related disciplines are calculated. The results show that PDAIF values are, on average, more than 50% higher than JIF results. Furthermore, journal ranking based on PDAIF shows very high consistency with reputation-based journal rankings. Moreover, based on a case study of journals published by ELSEVIER and INFORMS, we find that PDAIF yields a greater impact factor increase for journals with longer publication delays, because more of that negative influence is removed. Finally, practical suggestions for shortening publication delays are provided.
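The abstract does not state the adjustment formula, so the sketch below is purely illustrative: it assumes the two-year JIF counting window is shifted back by the journal's average publication delay, which is one plausible way a delay correction could raise the measured value. The function names, the window rule, and the toy data are all assumptions, not the paper's method.

```python
def two_year_if(cites_by_pub_year, items_by_pub_year, census_year):
    """Classic two-year JIF.  cites_by_pub_year[y] holds the citations received in
    the census year to items the journal published in year y; items_by_pub_year[y]
    holds the number of citable items published in year y."""
    window = (census_year - 1, census_year - 2)
    cites = sum(cites_by_pub_year.get(y, 0) for y in window)
    items = sum(items_by_pub_year.get(y, 0) for y in window)
    return cites / items if items else 0.0

def delay_adjusted_if(cites_by_pub_year, items_by_pub_year, census_year, delay_years):
    """Illustrative delay-adjusted variant (an assumption, not the paper's formula):
    shift the two publication years being counted back by the journal's average
    publication delay, so slowly published cohorts are not under-counted."""
    return two_year_if(cites_by_pub_year, items_by_pub_year,
                       census_year - round(delay_years))

# Toy journal: older cohorts have had more time to accumulate citations.
cites = {2013: 420, 2014: 380, 2015: 260, 2016: 150}
items = {2013: 100, 2014: 100, 2015: 100, 2016: 100}
print(two_year_if(cites, items, 2017))             # counts the 2015-2016 cohorts
print(delay_adjusted_if(cites, items, 2017, 1.5))  # counts the 2013-2014 cohorts
```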

10.
《资料收集管理》2013,38(1-2):77-95
The impact of monographs in a vertebrate zoology collection on the scientific literature was assessed using a randomly selected sample (52 monographs), Science Citation Index and a statistical package. Characteristics of the monographs considered were: copyright date, circulation, citation frequency and subdiscipline (ichthyology, herpetology, ornithology, mammalogy). Citing references were dispersed among journals in a wide array of disciplines. A few monographs proved to be very highly cited (one being cited nearly 600 times), and so generated the majority of the database of 2,971 citations. The ichthyology monographs generated the broadest subject dispersion among citing references. The herpetology collection is less active than are the others in terms of circulation and current citation frequency. The sample has been generating an ever increasing share of the citations in the Science Citation Index. A method that applies citation analysis to the evaluation of monograph collections is outlined.

11.
12.
The Emerging Sources Citation Index (ESCI) was created recently, in 2015, but few assessments of its journal coverage have been made. The present study tries to fill that gap by comparing its coverage with that of other international abstracting and indexing (A&I) databases. Using this measure, it is feasible to benchmark this index against the other citation indexes for acceptance criteria. We analysed 6,296 ESCI‐indexed journals, 8,889 Science Citation Index Expanded (SCIE), 3,258 Social Science Citation Index (SSCI), 1,784 Arts and Humanities Citation Index (AHCI), and 22,749 Scopus journals as indexed in July 2017 to determine their inclusion in 105 databases. We found that 19.3% of the ESCI journals are not covered by any other A&I databases, a high figure compared with only 0.5% SCIE, 0.3% SSCI, 0.3% AHCI, and 5.5% Scopus journals. This low coverage suggests that the selection criteria for ESCI journals are not consistent with the overall trend in the other classical citation indexes.

13.
Citation based approaches, such as the impact factor and h-index, have been used to measure the influence or impact of journals for journal rankings. A survey of the related literature for different disciplines shows that the level of correlation between these citation based approaches is domain dependent. We analyze the correlation between the impact factors and h-indices of the top ranked computer science journals for five different subjects. Our results show that the correlation between these citation based approaches is very low. Since using a different approach can result in different journal rankings, we further combine the different results and then re-rank the journals using a combination method. These new ranking results can be used as a reference for researchers to choose their publication outlets.
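As a small illustration of this kind of analysis, the sketch below computes the Spearman correlation between an impact-factor ranking and an h-index ranking and then combines them by averaging rank positions (a Borda-style combination). The journals, the numbers, and the combination rule are made up for illustration; the paper's own combination method is not given in the abstract.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical journals with an impact factor and an h-index (made-up numbers).
journals = ["J1", "J2", "J3", "J4", "J5"]
impact_factor = np.array([4.2, 1.1, 2.5, 3.0, 0.9])
h_index = np.array([30, 25, 12, 28, 8])

def ranks(scores):
    """Rank positions, 1 = best (highest score)."""
    order = np.argsort(-scores)
    r = np.empty_like(order)
    r[order] = np.arange(1, len(scores) + 1)
    return r

rho, _ = spearmanr(impact_factor, h_index)
print("Spearman correlation between the two indicators:", round(rho, 3))

# One simple combination: average the two rank positions and re-rank by the average.
combined = (ranks(impact_factor) + ranks(h_index)) / 2
for name, avg_rank in sorted(zip(journals, combined), key=lambda x: x[1]):
    print(name, avg_rank)
```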

14.
Examining a comprehensive set of papers (n = 1837) that were accepted for publication by the journal Angewandte Chemie International Edition (one of the prime chemistry journals in the world) or rejected by the journal but then published elsewhere, this study tested the extent to which the use of the freely available database Google Scholar (GS) can be expected to yield valid citation counts in the field of chemistry. Analyses of citations for the set of papers returned by three fee-based databases – Science Citation Index, Scopus, and Chemical Abstracts – were compared to the analysis of citations found using GS data. Whereas the analyses using citations returned by the three fee-based databases show very similar results, the results of the analysis using GS citation data differed greatly from the findings using citations from the fee-based databases. Our study therefore supports, on the one hand, the convergent validity of citation analyses based on data from the fee-based databases and, on the other hand, the lack of convergent validity of the citation analysis based on the GS data.

15.
The "regulation" of the impact factor by self-citation in natural science journals   Total citations: 14; self-citations: 0; citations by others: 14
李运景  侯汉清 《情报学报》2006,25(2):172-178
Using 《中国科技期刊引证报告》 (Chinese S&T Journal Citation Reports), this paper recalculates the impact factors of a number of journals in several disciplines after excluding self-citations, and compares the impact factors and journal rankings before and after the exclusion in order to examine the effect of journal self-citation on the impact factor and on journal rankings. The investigation finds that excessive self-citation by certain journals has already distorted journal rankings. Finally, some suggestions are offered on how to curb this phenomenon.
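The recalculation described amounts to removing journal self-citations from the numerator of the two-year impact factor. The sketch below shows that arithmetic; the variable names and toy numbers are assumptions for illustration.

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Classic two-year impact factor."""
    return citations_to_prev_two_years / citable_items_prev_two_years

def impact_factor_without_self(citations, self_citations, citable_items):
    """Impact factor recomputed after removing journal self-citations
    from the numerator (the comparison described in the abstract)."""
    return (citations - self_citations) / citable_items

# Toy journal whose citations in the window are one-third self-citations.
print(impact_factor(300, 100))                    # 3.0 with self-citations
print(impact_factor_without_self(300, 100, 100))  # 2.0 without self-citations
```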

16.
The standard impact factor allows one to compare scientific journals only within particular scientific subjects. To overcome this limitation, another indicator of citation, viz., the thematically weighted impact factor (TWIF), is proposed. This indicator allows one to compare journals of various subjects and takes the fact that a journal belongs to several subjects into account. Information on the thematic headings of a journal and the value of a standard impact factor is necessary for calculation of the indicator. The TWIF, which is calculated according to the citation index of Journal Citation Reports, is investigated in this article.
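The abstract names the inputs (a journal's thematic headings and its standard impact factor) but not the formula, so the following is only a guess at the general shape of such an indicator: normalize the IF by the mean IF of each subject heading the journal belongs to and average over headings. All names and the normalization rule are assumptions, not the published TWIF definition.

```python
def thematically_weighted_if(journal_if, journal_categories, category_mean_if):
    """Illustrative sketch only: divide the journal's IF by the mean IF of each
    category it belongs to, then average the ratios over its categories."""
    ratios = [journal_if / category_mean_if[c] for c in journal_categories]
    return sum(ratios) / len(ratios)

# Hypothetical field-level mean impact factors.
category_mean_if = {"Mathematics": 0.9, "Engineering": 1.4, "Cell Biology": 5.8}
# A journal assigned to two low-citation fields looks stronger after weighting.
print(thematically_weighted_if(1.8, ["Mathematics", "Engineering"], category_mean_if))
```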

17.
The cost effectiveness and quality of full-text journals are analyzed for four prominent online aggregated journal packages: EBSCOhost Academic Search FullTEXT, UMI Proquest Direct Periodicals Research II, IAC’s Expanded Academic ASAP, and H.W. Wilson’s OmniFile. Price data from EBSCO’s Librarians’ Handbook are used to assess the total and average value of social sciences journals in each package. Quality of social sciences journals coverage is compared based on citation impact factors as recorded in Journal Citation Reports—Social Sciences Edition.

18.
Citation averages, and Impact Factors (IFs) in particular, are sensitive to sample size. Here, we apply the Central Limit Theorem to IFs to understand their scale-dependent behavior. For a journal of n randomly selected papers from a population of all papers, we expect from the Theorem that its IF fluctuates around the population average μ, and spans a range of values proportional to σ/√n, where σ² is the variance of the population's citation distribution. The 1/√n dependence has profound implications for IF rankings: The larger a journal, the narrower the range around μ where its IF lies. IF rankings therefore allocate an unfair advantage to smaller journals in the high IF ranks, and to larger journals in the low IF ranks. As a result, we expect a scale-dependent stratification of journals in IF rankings, whereby small journals occupy the top, middle, and bottom ranks; mid-sized journals occupy the middle ranks; and very large journals have IFs that asymptotically approach μ. We obtain qualitative and quantitative confirmation of these predictions by analyzing (i) the complete set of 166,498 IF & journal-size data pairs in the 1997–2016 Journal Citation Reports of Clarivate Analytics, (ii) the top-cited portion of 276,000 physics papers published in 2014–2015, and (iii) the citation distributions of an arbitrarily sampled list of physics journals. We conclude that the Central Limit Theorem is a good predictor of the IF range of actual journals, while sustained deviations from its predictions are a mark of true, non-random, citation impact. IF rankings are thus misleading unless one compares like-sized journals or adjusts for these effects. We propose the Φ index, a rescaled IF that accounts for size effects, and which can be readily generalized to account also for different citation practices across research fields. Our methodology applies to other citation averages that are used to compare research fields, university departments or countries in various types of rankings.
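The σ/√n scaling is easy to reproduce in simulation. The sketch below draws same-sized "journals" from a heavy-tailed citation distribution and compares the spread of their mean citation rates with the Central Limit Theorem prediction; the log-normal population and all parameter values are assumptions chosen only for illustration, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# A skewed "population" citation distribution (log-normal chosen only to mimic
# heavy-tailed citation counts; the shape parameters are arbitrary).
population = rng.lognormal(mean=1.0, sigma=1.2, size=200_000)
mu, sigma = population.mean(), population.std()

for n in (50, 500, 5000):  # journal sizes, i.e. papers per journal
    # Simulate many same-sized journals by sampling n papers at random;
    # each journal's "IF" is the mean citation rate of its papers.
    ifs = np.array([rng.choice(population, size=n).mean() for _ in range(2000)])
    print(f"n={n:5d}  observed IF spread={ifs.std():.3f}  "
          f"CLT prediction={sigma / np.sqrt(n):.3f}")
```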

19.
This paper explores a new indicator of journal citation impact, denoted as source normalized impact per paper (SNIP). It measures a journal's contextual citation impact, taking into account characteristics of its properly defined subject field, especially the frequency at which authors cite other papers in their reference lists, the rapidity of maturing of citation impact, and the extent to which a database used for the assessment covers the field's literature. It further develops Eugene Garfield's notions of a field's ‘citation potential’ defined as the average length of reference lists in a field and determining the probability of being cited, and the need in fair performance assessments to correct for differences between subject fields. A journal's subject field is defined as the set of papers citing that journal. SNIP is defined as the ratio of the journal's citation count per paper and the citation potential in its subject field. It aims to allow direct comparison of sources in different subject fields. Citation potential is shown to vary not only between journal subject categories – groupings of journals sharing a research field – or disciplines (e.g., journals in mathematics, engineering and social sciences tend to have lower values than titles in life sciences), but also between journals within the same subject category. For instance, basic journals tend to show higher citation potentials than applied or clinical journals, and journals covering emerging topics higher than periodicals in classical subjects or more general journals. SNIP corrects for such differences. Its strengths and limitations are critically discussed, and suggestions are made for further research. All empirical results are derived from Elsevier's Scopus.
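In simplified form, the ratio described here can be written as a few lines of Python. This sketch takes the citation potential to be the mean reference-list length of the papers citing the journal; the production SNIP definition adds further restrictions (citation windows, only database-covered references), so treat the names and toy numbers as assumptions.

```python
def snip(citations_per_paper, citing_paper_reference_lengths):
    """Simplified SNIP sketch: raw impact per paper divided by the field's
    'citation potential', here the mean reference-list length of the papers
    that cite the journal (the journal's subject field in the paper's sense)."""
    citation_potential = (sum(citing_paper_reference_lengths)
                          / len(citing_paper_reference_lengths))
    return citations_per_paper / citation_potential

# Same raw impact, two fields: short reference lists (mathematics-like) versus
# long reference lists (life-sciences-like) give different normalized values.
print(snip(2.0, [12, 15, 10, 18]))  # higher SNIP: low citation potential field
print(snip(2.0, [45, 50, 38, 60]))  # lower SNIP: high citation potential field
```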

20.
《资料收集管理》2013,38(3-4):185-198
This paper combines insights from the Sociology of Science concerning the strong role of thesis advisors in shaping the attitudes of young scientists with an appreciation of Journal Citation Reports in identifying families of related journals. The result is a series of tools for the librarian to use in anticipating or confirming the journal purchase requests of this particular clientele. It is shown that checklists based on the individual master/apprentice relationship can be prepared prior to the first meeting of librarian and young scientist, can be used with confidence in negotiations, and are often superior to lists based on more generalized data. A successful test involving the case studies of 231 young biochemists and their advisors is offered as tentative validation of these procedures.
