1.
This paper presents a method for ranking air targets. The basic idea is to first construct a threat-assessment indicator system and quantify each indicator, then apply grey relational analysis to assess and rank the threat posed by air targets, identifying the targets most important to the battlefield situation so that firepower can be allocated correctly for engagement. A worked example illustrates the application of grey relational analysis to target ranking.
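The grey relational ranking described above can be sketched numerically. The indicator values, the convention that larger quantified values mean greater threat, and the resolution coefficient ρ = 0.5 are all illustrative assumptions, not values from the paper:

```python
import numpy as np

def grey_relational_grades(X, rho=0.5):
    """Rank alternatives by grey relational grade against the ideal
    reference sequence.  X: (n_targets, n_indicators); assumes every
    indicator is quantified so that larger = more threatening and no
    indicator column is constant."""
    # normalize each indicator column to [0, 1]
    Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    ref = Xn.max(axis=0)                  # ideal (reference) sequence
    delta = np.abs(Xn - ref)              # absolute difference sequences
    dmin, dmax = delta.min(), delta.max()
    # grey relational coefficients, resolution coefficient rho
    xi = (dmin + rho * dmax) / (delta + rho * dmax)
    return xi.mean(axis=1)                # grade = mean coefficient per target

# four hypothetical air targets scored on three quantified threat indicators
X = np.array([[0.9, 0.7, 0.8],
              [0.4, 0.9, 0.5],
              [0.6, 0.3, 0.9],
              [0.2, 0.1, 0.3]])
grades = grey_relational_grades(X)
ranking = np.argsort(-grades)             # most threatening target first
```

Targets with indicator profiles closest to the ideal sequence get the highest grade and are engaged first.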
2.
Threat estimation of air targets based on the Analytic Hierarchy Process (AHP). Cited 1 time (0 self-citations, 1 by others)
To assess and rank the threat posed by air targets scientifically, so that commanders can make correct decisions and issue combat orders, this paper proposes a method and concrete steps for AHP-based air-target threat estimation. It first analyses the specific indicator factors that influence an air target's threat level and constructs a threat factor and evaluation function for each; it then applies an improved Analytic Hierarchy Process to produce a comprehensive threat assessment and ranking. Experimental case results show that applying this assessment method solves... 相似文献
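The core AHP step can be sketched as follows. This is the standard eigenvector method with Saaty's consistency check, not the paper's improved algorithm, and the pairwise comparison matrix over three indicators (say speed, distance, target type) is hypothetical:

```python
import numpy as np

def ahp_weights(A):
    """Indicator weights from a pairwise comparison matrix via the
    principal eigenvector, plus the consistency ratio CR (Saaty).
    Random-index table covers n <= 5 for this sketch."""
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                 # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                             # normalized weights
    ci = (vals.real[k] - n) / (n - 1)        # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]
    cr = ci / ri if ri > 0 else 0.0          # consistency ratio (< 0.1 is OK)
    return w, cr

# hypothetical 1-9 scale comparisons among three threat indicators
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])
w, cr = ahp_weights(A)
```

The resulting weights would then be combined with each target's quantified factor values to produce a threat score per target.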
3.
Estimating the threat posed by incoming air targets is a complex multi-factor evaluation, and some of the factors involved are fuzzy and random. The cloud model is a theoretical model, developed to study fuzziness and uncertainty in systems, that combines qualitative evaluation with quantitative analysis and converts between the two. This paper introduces the digital characteristics of the cloud model and the associated algorithms, builds a criterion set for judging target threat levels, and uses the cloud model to convert between qualitative evaluation terms and quantitative values. A worked example shows that threat estimation with the cloud model yields an accurate and credible threat level, providing a scientific basis for fast and accurate threat judgments.
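The "digital characteristics" mentioned above are the expectation Ex, entropy En, and hyper-entropy He of a qualitative concept. A minimal forward normal cloud generator, with illustrative values for a "high threat" concept on a [0, 1] scale (the paper's actual criterion set and parameters are not reproduced), looks like this:

```python
import numpy as np

def forward_cloud(Ex, En, He, n, rng=None):
    """Forward normal cloud generator: produce n cloud drops (x, mu)
    from the digital characteristics (Ex, En, He)."""
    rng = np.random.default_rng(rng)
    Enp = rng.normal(En, He, n)              # per-drop entropy sample
    x = rng.normal(Ex, np.abs(Enp))          # drop position on the scale
    mu = np.exp(-(x - Ex) ** 2 / (2 * Enp ** 2))  # certainty degree
    return x, mu

# hypothetical "high threat" concept on a normalized threat scale
x, mu = forward_cloud(Ex=0.8, En=0.1, He=0.01, n=2000, rng=42)
```

The reverse generator (estimating Ex, En, He from samples) provides the quantitative-to-qualitative direction of the conversion.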
5.
Environmental quality evaluation is a multi-indicator decision process. Because the indicators are numerous and their relationships complex, this paper improves on PCA-based comprehensive evaluation by using kernel principal component analysis (KPCA), which reduces dimensionality effectively and handles nonlinear problems, to build a comprehensive environmental-quality evaluation model. An empirical study shows that the model reflects the environmental conditions of different regions fairly objectively.
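KPCA replaces the covariance eigenproblem of PCA with an eigendecomposition of a centered kernel (Gram) matrix. A self-contained numpy sketch with an RBF kernel, on stand-in data rather than real environmental indicators (the kernel choice and γ are assumptions):

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Kernel PCA with an RBF kernel via eigendecomposition of the
    double-centered Gram matrix; returns the projected scores."""
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = K.shape[0]
    One = np.ones((n, n)) / n
    Kc = K - One @ K - K @ One + One @ K @ One   # center in feature space
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]  # largest eigenvalues first
    vals, vecs = vals[idx], vecs[:, idx]
    return vecs * np.sqrt(np.maximum(vals, 0))   # scores of each sample

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))                    # stand-in indicator data
Z = kernel_pca(X, n_components=2, gamma=0.5)
```

For a comprehensive evaluation model, the leading scores (or their variance-weighted sum) would serve as the composite index per region.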
9.
《计算机与应用化学》 (Computers and Applied Chemistry), 2017, (2)
Evaluating the degree to which the various factors in a complex system influence a particular target quantity is a common problem. Knowing the degrees of influence, one can identify the main influencing factors and thereby meet research and production goals. This paper proposes an influence-degree factor for evaluating how strongly each factor in a complex system affects the target. The factor is derived from a multiple linear model via partial derivatives and is made dimensionless, so that the influence of factors of different types can be compared. When the number of modelling samples is limited, principal component analysis is first used to reduce the dimensionality of the raw data; a multiple linear model is then built between the principal components and the target quantity, and finally the influence-degree factor of each original factor is obtained. Applying the method to study how nine physical indicators of cigarette products affect the content of four smoke components identified the main influencing factors for each smoke component, providing data support for product development.
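The PCA-then-regression pipeline above can be sketched as follows, on synthetic standardized data (not the cigarette data): fit a linear model on the leading principal component scores, then map its coefficients back through the loadings to get a dimensionless sensitivity per original factor. The exact definition of the paper's influence-degree factor may differ in detail:

```python
import numpy as np

def influence_factors(X, y, n_pc=4):
    """Standardize X and y, reduce X with PCA (SVD), fit a linear model
    on the PC scores, and map the coefficients back to the original
    standardized indicators: f_j approximates d y / d x_j."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (y - y.mean()) / y.std()
    U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
    T = U[:, :n_pc] * s[:n_pc]                      # PC scores
    beta, *_ = np.linalg.lstsq(T, ys, rcond=None)   # linear model on scores
    return Vt[:n_pc].T @ beta                       # influence per indicator

# synthetic low-rank data: 9 "physical indicators" driven by 3 latent factors
rng = np.random.default_rng(1)
F = rng.normal(size=(60, 3))
X = F @ rng.normal(size=(3, 9)) + 0.05 * rng.normal(size=(60, 9))
y = 2.0 * X[:, 0] - X[:, 3] + 0.1 * rng.normal(size=60)
f = influence_factors(X, y, n_pc=4)
```

Because both sides are standardized, the entries of `f` are dimensionless and directly comparable across indicators.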
10.
Drawing on the operating process of an electro-optical countermeasures weapon system, this paper analyses the factors that affect target threat assessment, discusses the shortcomings of commonly used threat-assessment methods, and proposes a neural-network-based threat-estimation algorithm for air targets. Exploiting the network's good adaptivity and self-learning ability, the complex nonlinear relationships among the factors are determined by training on sample data, and a worked example shows how the target threat value is computed. Compared with the Analytic Hierarchy Process, the neural network approximates the weight relationships among the factors well and improves the accuracy and adaptability of air-target threat estimation.
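A minimal version of such a network, trained from scratch in numpy: one hidden layer regressing a threat value from quantified factors. The four input factors and the nonlinear target function are synthetic stand-ins; the paper's network architecture and training data are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic samples: 4 quantified threat factors -> scalar threat value
X = rng.uniform(size=(200, 4))
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] * X[:, 2] + 0.2 * X[:, 3] ** 2)[:, None]

# one-hidden-layer tanh network, trained by full-batch gradient descent
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.1
losses = []
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)                 # hidden activations
    out = H @ W2 + b2                        # predicted threat value
    err = out - y
    losses.append(float((err ** 2).mean())) # mean squared error
    # backpropagation
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)         # tanh derivative
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
```

After training, a new target's factor vector is pushed through the same forward pass to obtain its threat value.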
11.
When studying complex problems, principal component analysis can seize the principal contradictions of a problem, reveal the regularities among its internal factors, and improve the efficiency of the analysis. R is free and powerful software; this study shows that R can carry out the computations of principal component analysis quickly and conveniently, and with high numerical precision.
12.
《Expert systems with applications》2007,32(2):422-426
In this study, we compared classical principal components analysis (PCA), generalized principal components analysis (GPCA), linear principal components analysis using neural networks (PCA-NN), and non-linear principal components analysis using neural networks (NLPCA-NN). Data were taken from a patient-satisfaction questionnaire on satisfaction with hospital staff, administered in 2005 at the outpatient clinics of Trakya University Medical Faculty. We found that the percentages of explained variance of the principal components from PCA-NN and NLPCA-NN were highest for the doctor, nurse, radiology technician, laboratory technician, and other staff items of the patient-satisfaction data set. The results show that the neural-network methods, which explain higher percentages of variance than the classical methods, can be used for dimension reduction.
13.
Pekka J. Korhonen 《Computational statistics & data analysis》1984,2(3):243-255
In this study we deal with the problem of finding subjective principal components for a given set of variables in a data matrix. The principal components are not determined by maximizing their variances; they are specified by a user, who can maximize the absolute values of the correlations between the principal components and the variables important to him. The correlation matrix of the variables is the basic information needed in the analysis. The problem is formulated as a multiple criteria problem and solved by using an interactive procedure. The procedure is convenient to use and easy to implement. We have implemented an experimental version on an APPLE III microcomputer. A graphical display is used as an aid in finding the principal components. An illustrative application is also presented.
14.
Parallel principal component extraction algorithms play an important role in signal feature extraction. Applying a weighting rule is an effective way to turn a principal subspace (PS) tracking algorithm into a parallel principal component extraction algorithm, but theoretical analysis of how the weighting rule affects the state matrix is scarce. Analysing this influence not only gives a detailed understanding of the dynamics of principal component extraction under weighting rules, but also yields conditions for judging whether other subspace tracking algorithms can be turned into parallel principal component extraction algorithms. By comparing Oja's principal subspace tracking algorithm with the weighted Oja parallel principal component extraction algorithm, this paper analyses, through the difference between the two algorithms, how the weighting rule affects the direction of the extraction matrix. First, for two-dimensional input signals, we study how the information criterion of the weighting rule acts on the direction of the state matrix when two principal components are extracted. Then, for input signals of more than two dimensions, we discuss how the weighting rule shapes the extraction of multiple principal components. Finally, MATLAB simulations verify the validity of the proposed theory.
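The symmetry-breaking effect of a weighting rule can be seen in a small batch-mode experiment. Oja's plain subspace rule dW = CW − W(WᵀCW) converges only to a rotated basis of the principal subspace; inserting a diagonal weight matrix with distinct entries makes each column converge to an individual eigenvector. This is a batch sketch of one weighted variant, not necessarily the exact rule analysed in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
# synthetic covariance with known spectrum; eigenvectors = columns of Q
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))
C = Q @ np.diag([3.0, 2.0, 1.0, 0.5]) @ Q.T

# weighted Oja-type batch rule: dW = CW - W (W^T C W) Theta.
# Distinct diagonal weights break the rotational symmetry of the plain
# subspace rule; column i converges to an eigenvector with norm
# 1/sqrt(theta_i), ordered by decreasing eigenvalue.
Theta = np.diag([1.0, 2.0])
W = 0.1 * rng.normal(size=(4, 2))
eta = 0.02
for _ in range(4000):
    W = W + eta * (C @ W - W @ (W.T @ C @ W) @ Theta)
```

With Theta = I the same loop would stall at an arbitrary rotation within the top-2 subspace, which is exactly the difference between subspace tracking and parallel principal component extraction.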
15.
Dimensionality reduction of high-dimensional data by principal component analysis can effectively improve the performance of clustering algorithms, but it tends to discard components that contribute to clustering. To retain such components while reducing dimensionality, this paper proposes an iterative algorithm in which dimensionality reduction and clustering are carried out interactively. Each iteration can be expressed as a constrained optimization problem, and the analytical solution of this problem can be derived, yielding an iterative clustering algorithm called text clustering based on constrained principal component analysis. Experiments on the Reuters-21578 and WebKB document collections show that the method performs better than k-means clustering, clustering by non-negative matrix factorization, and spectral clustering.
16.
Yue-Fei Guo Xiaodong Lin Zhou Teng Xiangyang Xue Jianping Fan 《Pattern recognition》2012,45(3):1211-1219
In this paper, a covariance-free iterative algorithm is developed to achieve distributed principal component analysis on high-dimensional data sets that are vertically partitioned. We have proved that our iterative algorithm converges monotonically at an exponential rate. Different from existing techniques that aim at approximating the global PCA, our covariance-free iterative distributed PCA (CIDPCA) algorithm can estimate the principal components directly without computing the sample covariance matrix. Therefore a significant reduction in transmission costs can be achieved. Furthermore, in comparison to existing distributed PCA techniques, CIDPCA can provide more accurate estimations of the principal components and classification results. We have demonstrated the superior performance of CIDPCA through studies of multiple real-world data sets.
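The covariance-free idea can be illustrated on a single machine: the leading principal component is obtained from repeated products X @ v and Xᵀ(Xv), so the d × d covariance matrix is never materialized. In the vertically partitioned setting each site would compute its share of these products; this sketch shows only the single-machine core, not CIDPCA's actual distributed update:

```python
import numpy as np

def covfree_top_pc(X, n_iter=200):
    """Leading principal component by power iteration expressed purely
    through matrix-vector products with the centered data matrix; the
    sample covariance matrix is never formed."""
    Xc = X - X.mean(axis=0)
    v = np.random.default_rng(0).normal(size=X.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        w = Xc.T @ (Xc @ v)          # = (Xc^T Xc) v, without building Xc^T Xc
        v = w / np.linalg.norm(w)
    return v

# stand-in high-dimensional data with one dominant direction
rng = np.random.default_rng(3)
scales = np.r_[5.0, np.linspace(2.0, 0.5, 49)]
X = rng.normal(size=(500, 50)) * scales
v = covfree_top_pc(X)
```

Further components can be extracted the same way after deflating the contribution of `v` from the data.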
17.
We propose a constrained EM algorithm for principal component analysis (PCA) using a coupled probability model derived from single-standard factor analysis models with isotropic noise structure. The single probabilistic PCA, especially for the case where there is no noise, can find only a vector set that is a linear superposition of principal components and requires postprocessing, such as diagonalization of symmetric matrices. By contrast, the proposed algorithm finds the actual principal components, which are sorted in descending order of eigenvalue size and require no additional calculation or postprocessing. The method is easily applied to kernel PCA. It is also shown that the new EM algorithm is derived from a generalized least-squares formulation.
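For context, the plain zero-noise EM for PCA that the abstract contrasts with (a Roweis-style alternation of two least-squares steps; the paper's constrained, coupled version that outputs sorted components is not reproduced here) can be sketched as:

```python
import numpy as np

def em_pca(X, k, n_iter=100):
    """Zero-noise EM for PCA: alternate a least-squares E-step (latent
    scores) and M-step (loadings).  Converges to the k-dimensional
    principal subspace; as the abstract notes, this plain version needs
    postprocessing (here a QR step) and does not sort the components."""
    Xc = (X - X.mean(axis=0)).T                       # d x n data
    rng = np.random.default_rng(0)
    W = rng.normal(size=(Xc.shape[0], k))
    for _ in range(n_iter):
        Z = np.linalg.solve(W.T @ W, W.T @ Xc)        # E-step: scores
        W = Xc @ Z.T @ np.linalg.inv(Z @ Z.T)         # M-step: loadings
    B, _ = np.linalg.qr(W)                            # postprocessing
    return B                                          # orthonormal basis

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 10)) * np.linspace(4, 1, 10)  # decaying scales
B = em_pca(X, k=2)
```

The returned basis spans the top-k principal subspace but its columns are an arbitrary rotation of the actual components, which is precisely the gap the constrained EM algorithm closes.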
19.
R. M. LARK 《International journal of remote sensing》2013,34(4):779-787
Discriminant analysis and canonical variates analysis on principal components of a number of extracts from multi-spectral images showed that low-order components with large eigenvalues are not necessarily the most important for distinguishing classes of land cover, and that discarding components with small eigenvalues may reduce the accuracy of discrimination. It is therefore inadvisable to use principal components analysis to reduce the number of wavebands used for discriminant analysis.
20.
In this work, we investigate a new ranking method for principal component analysis (PCA). Instead of sorting the principal components in decreasing order of the corresponding eigenvalues, we propose the idea of using the discriminant weights given by separating hyperplanes to select among the principal components the most discriminant ones. The method is not restricted to any particular probability density function of the sample groups because it can be based on either a parametric or non-parametric separating hyperplane approach. In addition, the number of meaningful discriminant directions is not limited to the number of groups, providing additional information to understand group differences extracted from high-dimensional problems. To evaluate the discriminant principal components, separation tasks have been performed using face images and three different databases. Our experimental results have shown that the principal components selected by the separating hyperplanes allow robust reconstruction and interpretation of the data, as well as higher recognition rates using less linear features in situations where the differences between the sample groups are subtle and consequently most difficult for the standard and state-of-the-art PCA selection methods.
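The core idea, rank components by discriminant weight rather than eigenvalue, can be demonstrated on synthetic two-group data where the subtle class difference lives along a low-variance direction. A least-squares linear classifier stands in for the paper's parametric/non-parametric separating hyperplanes:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
# two groups: large shared variance in the first dimensions, but the
# class difference lies along the lowest-variance direction (last dim)
noise = rng.normal(size=(2 * n, 5)) * np.array([5.0, 3.0, 1.0, 1.0, 0.2])
labels = np.repeat([0.0, 1.0], n)
shift = np.zeros(5); shift[4] = 0.6
X = noise + np.outer(labels, shift)

# standard PCA: components sorted by eigenvalue (variance)
Xc = X - X.mean(axis=0)
vals, vecs = np.linalg.eigh(Xc.T @ Xc / len(Xc))
order = np.argsort(vals)[::-1]
P = vecs[:, order]                        # columns = variance-ranked PCs
S = Xc @ P                                # PC scores

# separating hyperplane (least-squares linear discriminant) on the
# scores; rank components by absolute discriminant weight instead
y = np.where(labels == 1, 1.0, -1.0)
w, *_ = np.linalg.lstsq(S, y, rcond=None)
disc_rank = np.argsort(-np.abs(w))
```

Eigenvalue ranking puts the discriminant component last, while the hyperplane weights rank it first, which is the behaviour the abstract reports for subtle group differences.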