A generic lexical URL segmentation framework for counting links, colinks or URLs
Authors: Mike Thelwall, David Wilkinson
Institution: School of Computing and Information Technology, University of Wolverhampton, Wulfruna Street, Wolverhampton WV1 1SB, UK
Abstract: Large sets of Web page links, colinks, or URLs sometimes need to be counted or otherwise summarized by researchers to analyze Web growth or publishing. Computing professionals also use them to evaluate Web sites or optimize search engines. Despite the apparently simple nature of these types of data, many different summarization methods have been used in the past. Some of these methods may not have been optimal. This article proposes a generic lexical framework to unify and extend existing methods through abstract notions of link lists and URL lists. The approach is built upon decomposing URLs by lexical segments, such as domain names, and systematically characterizing the counting options available. In addition, counting method choice recommendations are inferred from a very general set of theoretical research assumptions. The article also offers practical advice for analyzing raw data from search engines.
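To make the idea of lexical URL segmentation concrete, the sketch below is a minimal illustration (not the authors' implementation) of one such counting option: truncating each URL to a chosen lexical segment, such as its domain name or directory, and counting the distinct truncated forms in a URL list. The function names and the "level" parameter are assumptions introduced here for illustration only.

```python
from urllib.parse import urlparse
from collections import Counter

def url_segment(url: str, level: str = "domain") -> str:
    """Truncate a URL to a lexical segment: 'domain', 'directory', or 'page'."""
    parsed = urlparse(url)
    host = parsed.netloc.lower()
    if level == "domain":
        return host
    path = parsed.path
    if level == "directory":
        # Keep the path up to and including the last slash.
        return host + path[: path.rfind("/") + 1]
    return host + path  # 'page': host plus full path, query string discarded

def count_by_segment(urls, level="domain"):
    """Summarize a URL list by counting distinct truncated forms."""
    return Counter(url_segment(u, level) for u in urls)

# Example: three pages from two sites collapse to two domains.
urls = [
    "http://example.ac.uk/dept/index.html",
    "http://example.ac.uk/dept/staff.html",
    "http://other.edu/home.html",
]
print(count_by_segment(urls, "domain"))
# Counter({'example.ac.uk': 2, 'other.edu': 1})
```

Switching the hypothetical "level" parameter from "domain" to "directory" or "page" coarsens or refines the count, which is the kind of choice among counting options that the framework characterizes.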