Similar Documents
Found 20 similar documents (search time: 140 ms)
1.
Most of the relatively mature security models today are access control policy models, used to describe and maintain the confidentiality and integrity of data in a system. Role-based access control (RBAC) draws on the familiar concepts and applications of user groups, permission groups, and separation of duties; role-centered permission management better matches how companies and enterprises actually operate, so research on and application of the RBAC model have developed very quickly. By assigning appropriate roles to users, roles that hold certain permissions become the subjects of access control, which improves management efficiency. This paper presents part of the design of a role-based taxi management system, describes the module design, its application, and the database design philosophy, and successfully implements secure access control for different roles, effectively solving the access control security problem.
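
The core mechanism this abstract describes — users are assigned roles, and roles rather than users hold permissions — fits in a few lines. A minimal sketch (the role and permission names are hypothetical examples, not taken from the taxi system described):

```python
# Minimal RBAC core: permissions attach to roles, and users acquire
# permissions only through their roles. Names are hypothetical.
role_permissions = {
    "dispatcher": {"view_trips", "assign_driver"},
    "driver": {"view_trips", "update_trip_status"},
    "admin": {"view_trips", "assign_driver", "manage_users"},
}
user_roles = {"alice": {"dispatcher"}, "bob": {"driver", "admin"}}

def can(user, permission):
    """A user may act only if some assigned role grants the permission."""
    return any(
        permission in role_permissions.get(role, set())
        for role in user_roles.get(user, set())
    )

assert can("alice", "assign_driver")
assert not can("alice", "manage_users")
```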

2.
Building on the structure of the role-based access control (RBAC) model, this paper proposes a multi-level permission access control model. The model adopts level-by-level authorization, refines the granularity of permission control, and supports dynamic permission configuration and module reuse, enabling more flexible and efficient access control over a system. The model has also been applied in the construction of a real system project.
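
The abstract gives no concrete data model, so the sketch below only illustrates the "level-by-level authorization" idea: a grantor can delegate to the next level only permissions the grantor itself holds, which naturally narrows the granted set at each level. The API and permission names are hypothetical.

```python
# Level-by-level authorization sketch: each grant must be a subset of what
# the grantor already holds, so permissions can only narrow going down.
grants = {"root": {"user.read", "user.write", "report.read", "report.write"}}

def delegate(grantor, grantee, permissions):
    """Grant only the intersection of the request and the grantor's rights."""
    allowed = set(permissions) & grants.get(grantor, set())
    grants[grantee] = grants.get(grantee, set()) | allowed
    return allowed

# Level 1 -> level 2: the department admin receives a narrower set.
delegate("root", "dept_admin", {"user.read", "report.read", "report.write"})
# Level 2 -> level 3: "user.write" is dropped because dept_admin lacks it.
print(delegate("dept_admin", "operator", {"report.read", "user.write"}))
# -> {'report.read'}
```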

3.
Taking a B/S (browser/server) system as an example, and based on the principles of role-based access control (RBAC), this paper implements role-based access control on the PureMVC framework, providing flexible and practical role-based permission management and access control for a system jointly built by several organizations and improving system security. Practical use shows that the system has good extensibility and generality.

4.
Research on RBAC-based access control strategies for e-government information resources (total citations: 1; self-citations: 0; citations by others: 1)
To improve the security of e-government systems and protect government information resources, well-designed access control is especially critical; role-based access control (RBAC) is currently an effective method for unified access control over system information resources. Addressing the security requirements of e-government construction, this paper introduces the relationships among and characteristics of users, roles, and permissions in RBAC, and explains the significance of introducing RBAC into access control for e-government information resources. On this basis, it constructs an application framework for RBAC-based access control of government information resources in an e-government environment.

5.
Role-based access control management for digital libraries (total citations: 1; self-citations: 0; citations by others: 1)
赵雅洁 《现代情报》2009,29(6):76-79
Adopting a role-based approach to user access control management, this paper analyzes and studies the user management mechanism of a digital library, resolves the key problems involved, and gives a concrete design and implementation scheme that offers a useful reference for building digital library user management. A role-page model simplifies the implementation of RBAC, and role inheritance provides automatic permission assignment and simplifies user management operations, thereby improving the security of the whole system.
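
The abstract credits role inheritance with automatic permission assignment but does not show a structure for it. A minimal sketch of one common encoding (parent links plus recursive collection of permissions; role names are hypothetical, and an acyclic hierarchy is assumed):

```python
# Role inheritance sketch: a role's effective permissions are its own plus
# everything inherited from its parents (hierarchy assumed acyclic).
class Role:
    def __init__(self, name, permissions=(), parents=()):
        self.name = name
        self.permissions = set(permissions)
        self.parents = list(parents)

    def effective_permissions(self):
        perms = set(self.permissions)
        for parent in self.parents:
            perms |= parent.effective_permissions()
        return perms

reader = Role("reader", {"catalog.search", "item.view"})
librarian = Role("librarian", {"item.edit"}, parents=[reader])
admin = Role("admin", {"user.manage"}, parents=[librarian])

print(admin.effective_permissions())
# {'catalog.search', 'item.view', 'item.edit', 'user.manage'}
```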

6.
An RBAC-based access control model for information systems (total citations: 5; self-citations: 1; citations by others: 5)
王亚民 《情报杂志》2005,24(10):43-45
This paper points out that the traditional practice of assigning permissions to users directly by functional module no longer serves system administration and extensibility. Role-based access control (RBAC) emphasizes that a user's permissions are determined not by the username but by the user's role in the organization; system resources are accessed indirectly through roles, which keeps permission configuration flexible and secure. The paper studies the RBAC architecture, proposes a permission management mechanism based on users' functional modules, and gives an implementation method.

7.
李学俭  黄晨晖 《中国科技信息》2007,(10):130-130,132
Based on the RBAC design model, this paper implements role-based permission control in a purchase-sale-inventory system. It first describes the basic ideas of RBAC and then focuses on the system architecture and the results achieved. The paper proposes implementing and deploying the permission system using the RBAC permission model and managing users centrally and uniformly by grouping them into roles, an approach of considerable reference value for developing enterprise application systems.

8.
Permission control for roles in a Web environment is a hot topic. Addressing this problem, this paper introduces a practical solution in detail: a method based on the role-based access control model. On this basis, it designs an RBAC-based permission authentication method for an actual project, and finally gives the steps for building the permission subsystem, establishing a preliminary permission management model.

9.
Applications of role-based access control technology in network security (total citations: 2; self-citations: 0; citations by others: 2)
Access control is an important component of information security technology. Role-based access control (RBAC) is a convenient, secure, and efficient access control mechanism. This paper introduces the basic concepts of RBAC and its advantages in application, studies the implementation of the RBAC mechanism, and analyzes the application of role-based access control technology in network security.

10.
This paper briefly reviews discretionary access control (DAC) and the RBAC model, discussing the strengths and weaknesses of each. On this basis it proposes, for a concrete Web application scenario, a model architecture that is role-based but incorporates the ideas of discretionary access control, combining the advantages of both. Practice has shown that this fusion makes permission management more flexible and efficient, and that the model has reasonable applicability.

11.
To respond to user operations without refreshing the browser window, this paper adopts a framework combining the Ajax-based DWR library with Struts to implement asynchronous communication between the browser and the server; when only a small part of a page is updated, the entire page's HTML no longer has to be transmitted, reducing the volume of data sent over the network. The paper discusses this technique and gives an application example from an online homework system.
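
DWR exposes server-side Java methods to browser JavaScript; since the paper's stack (DWR + Struts) is Java-specific, the sketch below only illustrates the underlying payoff — the browser fetches a small JSON fragment instead of the whole page's HTML — using a hypothetical Flask endpoint rather than DWR/Struts.

```python
# Illustration of the partial-update idea behind Ajax/DWR, not DWR itself:
# the server returns only the changed data as JSON and never re-sends the
# full page HTML. The endpoint and fields are hypothetical.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/homework/<int:hw_id>/status")
def homework_status(hw_id):
    # In an online homework system this would be a database lookup;
    # the record is hard-coded for the sketch.
    return jsonify({"id": hw_id, "submitted": 42, "graded": 17})

# Client side (conceptually): fetch('/homework/3/status') and patch the
# one element showing the counts - a few hundred bytes instead of a page.
if __name__ == "__main__":
    app.run()
```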

12.
This research is part of an ongoing study to better understand citation analysis on the Web. It builds on Kleinberg's observation (J. Kleinberg, R. Kumar, P. Raghavan, S. Rajagopalan, A. Tomkins, invited survey at the International Conference on Combinatorics and Computing, 1999) that hyperlinks between web pages constitute a web graph structure, and tries to classify different web graphs in a new coordinate space: (out-degree, in-degree). The out-degree coordinate is the number of pages a given web page links out to; the in-degree coordinate is the number of web pages that point to a given page. In this coordinate space a metric is built to measure how close or far apart different web graphs are. Kleinberg's algorithm for discovering "hub" and "authority" web pages (J. Kleinberg, Proceedings of the ACM-SIAM Symposium on Discrete Algorithms, 1998, pp. 668–677) is applied in this new coordinate space. Some very uncommon phenomena were discovered and interesting new results interpreted. This study does not look at enhancing web retrieval by adding context information; it considers web hyperlinks only as a source for analyzing citations on the web. The author believes that understanding the underlying web pages as a graph will help in designing better web algorithms and enhancing retrieval and web performance, and recommends using graphs as part of the visual aids for search engine designers.
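
A small sketch of the two ingredients the abstract names: every page's (out-degree, in-degree) coordinate, and Kleinberg's HITS power iteration for hub/authority scores. The toy graph is hypothetical; the abstract's web-graph classification metric is not specified, so it is not reproduced here.

```python
# (out-degree, in-degree) coordinates plus HITS hub/authority scores on a
# hypothetical toy graph.
from collections import defaultdict

links = {  # page -> pages it links to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c", "a"],
}

in_deg = defaultdict(int)
for src, dsts in links.items():
    for dst in dsts:
        in_deg[dst] += 1
coords = {p: (len(links[p]), in_deg[p]) for p in links}

# HITS: authority(p) = sum of hub scores of pages pointing to p;
# hub(p) = sum of authority scores of pages p points to; renormalize.
hub = {p: 1.0 for p in links}
auth = {p: 1.0 for p in links}
for _ in range(50):
    auth = {p: sum(hub[q] for q in links if p in links[q]) for p in links}
    norm = sum(v * v for v in auth.values()) ** 0.5
    auth = {p: v / norm for p, v in auth.items()}
    hub = {p: sum(auth[q] for q in links[p]) for p in links}
    norm = sum(v * v for v in hub.values()) ** 0.5
    hub = {p: v / norm for p, v in hub.items()}

print(coords)
print({p: round(auth[p], 3) for p in auth})
print({p: round(hub[p], 3) for p in hub})
```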

13.
We present a term weighting approach for improving web page classification, based on the assumption that the images of a web page are the elements that mainly attract the user's attention. This assumption implies that the text contained in the visual block in which an image is located, called an image-block, should contain significant information about the page contents. In this paper we propose a new metric, called the Inverse Term Importance Metric, aimed at assigning higher weights to important terms contained in important image-blocks identified by visual layout analysis. We propose different methods to estimate the visual importance of image-blocks and to smooth a term's weight according to the importance of the blocks in which the term is located. The traditional TFxIDF model is modified accordingly and used in the classification task. The effectiveness of the new metric and of the proposed block evaluation methods has been validated using different classification algorithms.
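
The abstract does not give the exact form of the Inverse Term Importance Metric, so the sketch below only shows the general idea it describes: scale a term's TFxIDF weight by the estimated visual importance of the image-block containing it. The pages, block importances, and smoothed IDF variant are all stand-ins.

```python
# Block-importance-weighted TFxIDF sketch. Each page is a hypothetical
# list of (term, block_importance) pairs, where block_importance in (0, 1]
# estimates the visual prominence of the image-block holding the term.
import math

pages = [
    [("camera", 0.9), ("lens", 0.9), ("menu", 0.2)],
    [("camera", 0.3), ("login", 0.2), ("menu", 0.2)],
]

n_docs = len(pages)
doc_freq = {}
for page in pages:
    for term in {t for t, _ in page}:
        doc_freq[term] = doc_freq.get(term, 0) + 1

def weights(page):
    """TF x smoothed IDF, scaled by the term's most important block."""
    tf, imp = {}, {}
    for term, block_imp in page:
        tf[term] = tf.get(term, 0) + 1
        imp[term] = max(imp.get(term, 0.0), block_imp)
    return {
        t: tf[t] * math.log(n_docs / doc_freq[t] + 1) * imp[t]
        for t in tf
    }

print(weights(pages[0]))  # "camera"/"lens" outweigh the boilerplate "menu"
```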

14.
王成 《人天科学研究》2011,(10):126-127
With the rapid development of the Internet, malicious web pages have become one of the main problems affecting network security. Rootkits are a technique based on the Windows layered driver model. This paper presents the design of a malicious web page protection system based on Rootkit technology, which offers a useful reference for research on protection against malicious web pages.

15.
A fast and efficient page ranking mechanism for web crawling and retrieval remains a challenging issue. Recently, several link-based ranking algorithms such as PageRank, HITS, and OPIC have been proposed. In this paper, we propose a novel recursive method based on reinforcement learning that treats the distance between pages as punishment, called "DistanceRank", to compute the ranks of web pages. The distance is defined as the number of "average clicks" between two pages. The objective is to minimize punishment, or distance, so that a page at a smaller distance receives a higher rank. Experimental results indicate that DistanceRank outperforms other ranking algorithms in page ranking and crawl scheduling. Furthermore, the complexity of DistanceRank is low. We used the University of California at Berkeley's web for our experiments.
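
The abstract does not spell out the reinforcement-learning update, so the sketch below implements only the distance notion it defines: following a link out of page i is costed in "average clicks" (here taken as log10 of i's out-degree, a common reading of that term), and pages are ranked by their shortest such distance from a seed. The toy graph, seed, and cost choice are assumptions.

```python
# Distance-based ranking sketch (not the full DistanceRank algorithm):
# shortest "average clicks" path from a seed page via Dijkstra, where a
# link out of a page with out-degree k costs log10(k) clicks.
import heapq
import math

links = {
    "seed": ["a", "b"],
    "a": ["b", "c", "d"],
    "b": ["c"],
    "c": [],
    "d": ["c"],
}

def distance_ranks(links, seed):
    dist = {seed: 0.0}
    heap = [(0.0, seed)]
    while heap:
        d, page = heapq.heappop(heap)
        if d > dist.get(page, math.inf):
            continue  # stale heap entry
        out = links.get(page, [])
        cost = math.log10(len(out)) if out else 0.0
        for nxt in out:
            nd = d + cost
            if nd < dist.get(nxt, math.inf):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return sorted(dist, key=dist.get)  # smaller distance => higher rank

print(distance_ranks(links, "seed"))
```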

16.
Based on the characteristics of information updates in enterprise portals and the requirements of enterprise portal information retrieval, this paper introduces into the spider's search strategy the idea of incrementally fetching important Web pages first, and uses multithreading to design a web spider for gathering enterprise portal information, improving the spider's search efficiency.
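
A skeleton of the two ideas in this abstract: a priority frontier so "important" pages are fetched first, and a pool of worker threads fetching in parallel. The fetch() function and importance scores are stand-ins; a real spider would issue HTTP requests and re-score pages incrementally between runs.

```python
# Multithreaded, importance-ordered crawler skeleton (fetch is simulated).
import queue
import threading

frontier = queue.PriorityQueue()  # (-importance, url): important first
seen, seen_lock = set(), threading.Lock()

def fetch(url):
    """Stand-in for an HTTP fetch; returns (outlink, importance) pairs."""
    return {"a": [("b", 0.9), ("c", 0.2)], "b": [("c", 0.5)]}.get(url, [])

def worker():
    while True:
        try:
            _, url = frontier.get(timeout=1)
        except queue.Empty:
            return  # frontier drained
        for link, importance in fetch(url):
            with seen_lock:
                if link in seen:
                    continue
                seen.add(link)
            frontier.put((-importance, link))
        print("crawled", url)
        frontier.task_done()

seen.add("a")
frontier.put((-1.0, "a"))
threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```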

17.
The popularity of Twitter for information discovery, coupled with the automatic shortening of URLs to save space under the 140-character limit, provides cybercriminals with an opportunity to obfuscate the URL of a malicious Web page within a tweet. Once the URL is obfuscated, the cybercriminal can lure a user into clicking on it with enticing text and images before carrying out a cyber attack using a malicious Web server. This is known as a drive-by download. In a drive-by download, a user's computer system is infected while interacting with the malicious endpoint, often without the user being aware that the attack has taken place. An attacker can gain control of the system by exploiting unpatched system vulnerabilities, and this form of attack currently represents one of the most common methods employed. In this paper we build a machine learning model using machine activity data and tweet metadata to move beyond post-execution classification of such URLs as malicious, and instead predict that a URL will be malicious with a 0.99 F-measure (using 10-fold cross-validation) and 0.833 (on an unseen test set) at 1 s into the interaction with the URL. This provides a basis from which to kill the connection to the server before an attack has completed, proactively blocking and preventing the attack rather than reacting and repairing at a later date.
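
A minimal sketch of the classification setup described: a supervised model over early machine-activity features plus tweet metadata, evaluated with 10-fold cross-validation on the F-measure. The feature names, random data, and choice of random forest are stand-ins; the paper's actual feature set and model are not given in the abstract.

```python
# Supervised malicious-URL prediction sketch on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500
# Hypothetical columns: cpu_use_1s, net_bytes_1s, new_processes_1s,
# tweet_url_count, account_age_days.
X = rng.normal(size=(n, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # synthetic "malicious" label

model = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(model, X, y, cv=10, scoring="f1")
print("10-fold F-measure: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```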

18.
Search ‘de novo protein design’ on Google and you will find the name David Baker in every result on the first page. Professor David Baker at the University of Washington and other scientists are opening up a new world of fantastic proteins. Protein is the direct executor of most biological functions, and its structure and function are fully determined by its primary sequence. Baker's group developed the Rosetta software suite, which enabled the computational prediction and design of protein structures. Being able to design proteins from scratch means being able to design executors for diverse purposes and to benefit society in multiple ways. Recently, NSR interviewed Prof. Baker on this fast-developing field and his personal experiences.

19.
In this paper, the modelling and simulation of Chua's chaotic oscillator, which exhibits rich chaotic behaviours, are presented using a bond graph model. Until now, a bond graph model of Chua's chaotic oscillator had not been developed. The non-linear resistor in the circuit is modelled here with linear time-invariant components and ideal switches, using a piecewise linearization approach. The bond graph model of the whole circuit, including the switches, is then generated. Simulations are carried out with the obtained bond graph model via the computer program BOMAS. Finally, Chua's circuit is verified experimentally. All experimental and simulation results agree well with the chaotic behaviours of Chua's circuit.
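
For reference, the standard dimensionless state equations of Chua's circuit, with the piecewise-linear characteristic of the non-linear resistor that the paper models segment by segment with switches (this is the textbook form, not reproduced from the paper itself):

```latex
\begin{aligned}
\dot{x} &= \alpha\,\bigl(y - x - f(x)\bigr),\\
\dot{y} &= x - y + z,\\
\dot{z} &= -\beta\, y,\\
f(x) &= m_1 x + \tfrac{1}{2}(m_0 - m_1)\bigl(\lvert x+1\rvert - \lvert x-1\rvert\bigr),
\end{aligned}
```

where x, y, z are the normalized capacitor voltages and inductor current, and m_0, m_1 are the inner and outer slopes of the piecewise-linear resistor.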

20.
Web sites often provide the first impression of an organization. For many organizations, web sites are crucial to securing sales or procuring services. When a person opens a web site, the first impression is probably formed within a few seconds, and the user will either stay or move on to the next site on the basis of many factors. One factor that may influence users to stay or go is the page aesthetics; another is the user's judgment about the site's credibility. This study explores the possible link between page aesthetics and a user's judgment of the site's credibility. Our findings indicate that when the same content is presented using different levels of aesthetic treatment, the content with the higher aesthetic treatment was judged as having higher credibility. We call this the amelioration effect of visual design and aesthetics on content credibility. Our study suggests that this effect operates within the first few seconds in which a user views a web page. Given the same content, a higher aesthetic treatment will increase perceived credibility.
