Similar Literature
20 similar documents found.
1.
A Discussion on Mixing Entropy
We analyze and discuss several forms of mixing entropy proposed by other authors and give the axioms that a mixing entropy should satisfy. We also propose a new form of mixing entropy and point out some of its possible applications.

2.
纪滨 《微机发展》2008,(6):73-75
As research on rough set theory has deepened, information entropy from information theory has been introduced into rough set research, giving rise to new concepts such as conditional entropy, joint entropy, knowledge entropy, decision entropy, knowledge rough entropy, and rough-set rough entropy. Although these concepts have enriched rough set theory and its applications, their use suffers from inconsistent semantics and sometimes lacks the necessary explanation and proof. This paper gives systematic, rigorous, and standardized definitions and explanations of these valuable new concepts, presents their formulas, reveals the relationships among them through operations on the related entropies, and finally indicates the application scope of each entropy, so that researchers can build further work on a clear understanding of the concepts.

3.
The electroencephalogram (EEG) is a representative signal containing information about the condition of the brain, and the shape of the wave may carry useful information about the brain's state. However, a human observer cannot directly monitor these subtle details; moreover, since bio-signals are highly subjective, symptoms may appear at random on the time scale. EEG signal parameters extracted and analyzed by computer are therefore highly useful in diagnostics. The aim of this work is to compare different entropy estimators applied to EEG data from normal and epileptic subjects. The results indicate that entropy estimators can distinguish normal from epileptic EEG data with more than 95% confidence (using a t-test). The classification ability of the entropy measures is tested with an ANFIS classifier; the results are promising, with a classification accuracy of about 90%.

4.
In this paper, we propose some new approaches for attribute reduction in covering decision systems from the viewpoint of information theory. Firstly, we introduce the information entropy and conditional entropy of a covering and define attribute reduction by means of conditional entropy in consistent covering decision systems. Secondly, for inconsistent covering decision systems, the limitary conditional entropy of a covering is proposed and the corresponding attribute reductions are defined. Finally, using the significance of a covering, algorithms are designed to compute all the reducts of consistent and inconsistent covering decision systems, and we prove that their computational complexity is polynomial. Numerical tests show that the proposed attribute reductions achieve better classification performance than those of traditional rough sets. In addition, in traditional rough set theory, the MIBARK algorithm [G.Y. Wang, H. Hu, D. Yang, Decision table reduction based on conditional information entropy, Chinese J. Comput., 25 (2002) 1-8] cannot ensure that the reduct is the minimal attribute subset that keeps the decision rules invariant in inconsistent decision systems. Here, we solve this problem for inconsistent covering decision systems.
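The conditional entropy driving these reductions can be illustrated on an ordinary (non-covering) decision table; the sketch below, with a toy table invented for the example, computes the Shannon conditional entropy H(D|B) that the covering and limitary variants in the paper generalize:

```python
import math
from collections import Counter, defaultdict

def conditional_entropy(rows, cond_attrs, dec_attr):
    """Shannon conditional entropy H(D | B) of decision attribute D given
    condition attributes B, over a decision table given as a list of dicts."""
    n = len(rows)
    groups = defaultdict(list)
    for r in rows:
        # Rows with equal values on B fall into the same equivalence block.
        key = tuple(r[a] for a in cond_attrs)
        groups[key].append(r[dec_attr])
    h = 0.0
    for decisions in groups.values():
        p_block = len(decisions) / n
        counts = Counter(decisions)
        h_block = -sum((c / len(decisions)) * math.log2(c / len(decisions))
                       for c in counts.values())
        h += p_block * h_block
    return h

# Toy decision table: attribute 'a' fully determines 'd'; 'b' does not.
table = [
    {"a": 0, "b": 0, "d": "no"},
    {"a": 0, "b": 1, "d": "no"},
    {"a": 1, "b": 0, "d": "yes"},
    {"a": 1, "b": 1, "d": "yes"},
]
print(conditional_entropy(table, ["a"], "d"))  # 0.0: {'a'} keeps the rules consistent
print(conditional_entropy(table, ["b"], "d"))  # 1.0: {'b'} leaves the decision uncertain
```

An attribute subset B is a reduct candidate when H(D|B) equals H(D|C) for the full condition set C; here {'a'} already achieves zero conditional entropy, so 'b' is dispensable.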

5.
Approximate entropy has been widely used in nonlinear-dynamics analysis of EEG signals (sleep EEG) because of its advantages, but it also has drawbacks: it is a biased estimator, and its value depends on the signal length. Building on this, the concept of sample entropy was proposed, and this paper attempts to apply sample entropy to sleep EEG analysis. The results show that it distinguishes different sleep stages effectively.
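The sample entropy referred to here can be sketched as follows. This is a minimal, unoptimized Python version of the standard SampEn(m, r) definition with Chebyshev distance, not the authors' implementation; exact template ranges differ slightly across implementations in the literature:

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series.
    Unlike approximate entropy, self-matches are excluded, which removes
    the self-bias mentioned in the abstract."""
    n = len(x)

    def count_matches(length):
        templates = [x[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):  # j > i: no self-matches
                # Chebyshev distance between the two templates.
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    b = count_matches(m)       # matches of length m
    a = count_matches(m + 1)   # matches of length m + 1
    if a == 0 or b == 0:
        return float("inf")    # undefined for too-short or too-irregular series
    return -math.log(a / b)

print(sample_entropy([0, 1] * 5))  # perfectly regular series: low entropy, ln(4/3)
```

A regular series (e.g. an alternating sequence) yields a low value, while noisy data yields a higher one, which is what makes SampEn useful for separating sleep stages.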

6.
A Sample-Entropy-Based Speech Endpoint Detection Method for In-Vehicle Environments
A key problem in speech processing is accurately locating the start and end points of speech, and many existing endpoint detection algorithms fail to give ideal results. Since sample entropy is an improved version of approximate entropy, this paper proposes a sample-entropy-based speech endpoint detection method for in-vehicle environments. Fuzzy C-means clustering and a Bayesian information criterion are used to estimate the threshold of the sample-entropy feature, and a double-threshold method performs the endpoint detection. Experiments on the TIMIT continuous speech corpus show that, under in-vehicle noise, the detection accuracy of both the sample-entropy and approximate-entropy methods is far higher than that of the spectral-entropy and energy-spectral-entropy methods, and the sample-entropy method detects better than the approximate-entropy method, by nearly 10% when the SNR is at or below 0 dB. The sample-entropy method therefore has good application prospects in in-vehicle intelligent speech and can provide accurate speech endpoint detection for in-vehicle navigation.

7.
Entropy and Its Applications in Spatial-Data Uncertainty Research
This paper surveys the origin, development, properties, and applications of entropy and discusses the relationship between entropy and uncertainty. Focusing on the uncertainty of spatial data, it summarizes entropy-based research results on spatial-data uncertainty and proposes using mixing entropy as a unified measure of spatial-data uncertainty.

8.
To address the low accuracy and efficiency of existing algorithms for classifying paintings by artistic style, an information-entropy-based classification algorithm is proposed. Seven representative painting styles are selected as study objects: Western comics, sketches, oil paintings, and watercolors, plus Chinese pyrography, ink-wash painting, and murals; the images are first preprocessed by denoising and normalization. Style features are then extracted: the color entropy, block entropy, and contour entropy of each image are computed and combined into the information entropy of each painting style. For the color entropy, the image is converted to the Lab color space and the entropy is obtained from the a and b channel values with a weighting function; the block entropy is obtained by computing the information entropy of each image block and taking the mean over blocks; the contour entropy is obtained from contour information extracted via the Contourlet transform. The combined color, block, and contour entropies are used to train a support vector machine (SVM), yielding the painting-style classification model; finally, the entropy features of an unknown painting are extracted and classified by the SVM to obtain the final result. The method has few feature dimensions, fast computation, and scale invariance, and experimental results show that it improves both the accuracy and the efficiency of painting-style classification.
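Of the three features, the block entropy is the simplest to illustrate. The sketch below, with invented helper names, computes a mean Shannon entropy over non-overlapping image tiles; the paper's color entropy additionally involves the Lab a/b channels and a weighting function, and the contour entropy involves a Contourlet transform, neither of which is shown here:

```python
import math

def shannon_entropy(values, bins=16):
    """Shannon entropy (bits) of a histogram of pixel values in [0, 256)."""
    hist = [0] * bins
    for v in values:
        hist[min(v * bins // 256, bins - 1)] += 1
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in hist if c)

def block_entropy(image, block=2):
    """Block entropy in the spirit of the abstract: the mean Shannon entropy
    over non-overlapping block x block tiles of a 2-D grayscale image."""
    h, w = len(image), len(image[0])
    ents = []
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            tile = [image[i + di][j + dj]
                    for di in range(block) for dj in range(block)]
            ents.append(shannon_entropy(tile))
    return sum(ents) / len(ents)

flat = [[128] * 4 for _ in range(4)]                      # uniform image
checker = [[0, 255, 0, 255], [255, 0, 255, 0]] * 2        # high local contrast
print(block_entropy(flat))     # 0.0 bits: no local variation
print(block_entropy(checker))  # 1.0 bit per tile on average
```

A uniform canvas gives zero block entropy while a high-contrast texture gives a high value, which is why the feature helps separate, say, ink-wash painting from oil painting.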

9.
Fuzzy entropy and fuzziness are important measures in the uncertainty formulas of rough sets. Based on the fuzzy entropy and a new fuzziness formula in rough-set uncertainty measurement, this paper proposes modified conditional information entropy and relative fuzzy entropy for decision information systems, and proves in two different ways the monotonicity of these entropies during attribute reduction. A forward attribute-addition algorithm is then used for attribute reduction, and the recognition rates of the reducts are tested on the RIDAS (rough-set based intelligent data analysis system) platform. Experiments compare the reduction results of the two new information entropies with those of conditional information entropy, providing a reference for entropy-based attribute reduction.

10.
Research on Entropy-Based Fuzzy Information Measures
Fuzzy information measures (FIM) quantify the similarity between two fuzzy sets and play an important role in pattern recognition, machine learning, and cluster analysis. This paper analyzes fuzzy measures and studies entropy-based fuzzy information measure theory. First, fuzzy measure theory is reviewed and its strengths and weaknesses are pointed out. Second, based on information-entropy theory, fuzzy entropy is studied: an axiomatic system for fuzzy entropy is established, various fuzzy entropies are discussed, and on this basis fuzzy absolute-entropy and fuzzy relative-entropy measures are proposed. Finally, based on cross-entropy theory, a fuzzy cross-entropy theory is established and fuzzy cross-entropy measures are proposed. These measures not only enrich and develop FIM theory but also provide new methods for theoretical and applied research in pattern recognition, machine learning, and cluster analysis.

11.
谭睿璞  张文德 《控制与决策》2016,31(11):2005-2012
For multi-attribute decision-making problems in which the attribute weights are unknown and the attribute values are intuitionistic linguistic numbers, a group decision method based on intuitionistic linguistic entropy and generalized intuitionistic linguistic operators is proposed. Intuitionistic linguistic entropy is defined and used to determine the attribute weights, and three operators are proposed: the generalized intuitionistic linguistic weighted geometric average (GILWGA) operator, the generalized intuitionistic linguistic ordered weighted geometric (GILOWG) operator, and the generalized intuitionistic linguistic hybrid geometric (GILHG) operator. The GILWGA and GILHG operators are used to aggregate the information, and score and accuracy functions of intuitionistic linguistic numbers are used to rank the alternatives and select the best one. A numerical example illustrates the effectiveness and rationality of the method.

12.
The negation of a probability distribution has become an important topic, since some problems are burdensome to deal with directly. Inspired by Yager's negation of probability distributions, an extended model for the negation of a probability distribution is proposed using the idea of nonextensive statistics based on Tsallis entropy. Proofs show that the proposed extension converges to the maximum Tsallis entropy. The proposed model may extend Yager's method to account for the correlations in a system, which yields different convergence routes. Numerical simulation results illustrate the effectiveness of the proposed methodology.
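Yager's negation, which the paper extends, and the Tsallis entropy it converges toward can be sketched as follows; this shows only the baseline negation and the plain Tsallis entropy, not the proposed nonextensive extension:

```python
def yager_negation(p):
    """Yager's negation of a probability distribution:
    each p_i maps to (1 - p_i) / (n - 1), which again sums to 1."""
    n = len(p)
    return [(1 - pi) / (n - 1) for pi in p]

def tsallis_entropy(p, q=2.0):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), q != 1."""
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

# Repeated negation drives any distribution to the uniform one,
# where the Tsallis entropy (for any q) is maximal.
p = [0.7, 0.2, 0.1]
for _ in range(30):
    p = yager_negation(p)
print(p)  # each component approaches 1/3
```

The fixed point solves p = (1 - p)/(n - 1), i.e. p = 1/n, and the map contracts toward it with factor 1/(n - 1), which is the convergence behavior the paper analyzes under Tsallis entropy.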

13.
A Covering Reduction Algorithm Based on Conditional Information Entropy
李永顺  贾瑞玉 《计算机工程》2010,36(16):176-179
To address the conflict between recognition accuracy and generalization ability in covering algorithms, the concept of covering entropy is proposed on the basis of the information-theoretic view of rough set theory. Using the conditional information entropy of the decision attribute relative to the classifier as a constraint, the covering with the largest information entropy within a group of coverings is reduced without lowering the classification ability of the algorithm, thereby reducing the classifier's uncertainty. Experimental results show that the algorithm achieves good recognition accuracy and generalization ability and also handles fuzzy and uncertain data well.

14.
A New Nonlinear Correlation Information Entropy: Definition, Properties, and Applications
Building on a study of correlation information entropy and the HPal entropy, an improved nonlinear correlation information entropy is proposed that replaces event probabilities with eigenvalues and takes the form of an exponential function with base e. Under the condition of a maximal partition of a finite set, several properties of this entropy are derived and proved theoretically; they satisfy the basic properties of Shannon entropy. The new correlation information entropy is a standard for measuring the degree of correlation in multivariate, nonlinear systems: as an uncertainty measure of the correlation among variables, the greater the correlation between variables, the smaller the corresponding entropy value. The new entropy aids information fusion and provides a new method and perspective for research on correlation-analysis theory. A comparison of application results between the new correlation information entropy and the existing one shows that the new entropy is an effective and useful measure of uncertainty in nonlinear systems.

15.
朱平 《计算机科学》2003,30(10):67-69
To meet the needs of English education in China, this paper uses the concept of Shannon's information entropy to propose a theory of language entropy and an application model, the fish-eye driver model. The work lays an initial basis for computer-aided English semantic understanding. Three complex problems that need deeper research are also discussed.

16.
Keeping in view the non-probabilistic nature of experiments, two new measures of weighted fuzzy entropy are introduced, and their essential properties are studied to verify their validity. Since entropy measures can be used to study optimization principles when only partial information is available, the existing as well as the newly introduced weighted measures of fuzzy entropy are applied to study the maximum-entropy principle.

17.
Approximate entropy (ApEn) and sample entropy (SampEn) are promising techniques for extracting the complex characteristics of cardiovascular variability. Ectopic beats, which originate from a site other than the normal one, are artefacts that seriously limit heart rate variability (HRV) analysis. Approaches such as deletion and interpolation are currently used to eliminate the bias produced by ectopic beats. In this study, normal R–R interval time series of 10 healthy subjects and 10 acute myocardial infarction (AMI) patients were analysed after inserting artificial ectopic beats. The effects on ApEn and SampEn of editing the ectopic beats by deletion, degree-zero interpolation, and degree-one interpolation were then assessed. Adding ectopic beats (even 2%) reduced complexity, decreasing the ApEn and SampEn of both healthy and AMI patient data; the reduction was found to depend on the level of ectopic beats. Editing ectopic beats by degree-one interpolation was found to be superior to the other methods.

18.
Building on reference [1], new concepts for Vague sets are defined: partial entropy, correlation entropy, and the correlation-entropy coefficient. Their main properties are discussed, and an application of the correlation-entropy coefficient to similarity measurement is given.

19.
To meet the practical needs of a flow-based Trojan detection system, an information-entropy-based algorithm for deciding whether a data flow is encrypted is proposed. The concept of N-truncated entropy is introduced for computing confidence intervals, and reliable confidence intervals are established through simulation. By inspecting a single packet of a flow, the algorithm can decide whether the whole flow is encrypted, making it efficient enough for real-time online decisions. Experiments verify that the algorithm has high accuracy and a low false-positive rate; it has been deployed in a flow-based Trojan detection system and fully meets the system's requirements.

20.
《Pattern recognition》2014,47(2):806-819
In this paper, we propose the regularized discriminant entropy (RDE), which considers both class information and scatter information of the original data. Based on the results of maximizing the RDE, we develop a supervised feature-extraction algorithm called regularized discriminant entropy analysis (RDEA). RDEA is quite simple and requires no approximation in its theoretical derivation. Experiments on several publicly available data sets show the feasibility and effectiveness of the proposed algorithm, with encouraging results.
