Sorted results: 4,212 matches (search time: 15 ms)
1.
We introduce a new probabilistic approach to dealing with uncertainty, based on the observation that probability theory does not require that every event be assigned a probability. For a nonmeasurable event (one to which we do not assign a probability), we can talk about only the inner measure and outer measure of the event. In addition to removing the requirement that every event be assigned a probability, our approach circumvents other criticisms of probability-based approaches to uncertainty. For example, the measure of belief in an event turns out to be represented by an interval (defined by the inner and outer measures), rather than by a single number. Further, this approach allows us to assign a belief (inner measure) to an event E without committing to a belief about its negation ¬E (since the inner measure of an event plus the inner measure of its negation is not necessarily one). Interestingly enough, inner measures induced by probability measures turn out to correspond in a precise sense to Dempster-Shafer belief functions. Hence, in addition to providing promising new conceptual tools for dealing with uncertainty, our approach shows that a key part of the important Dempster-Shafer theory of evidence is firmly rooted in classical probability theory.
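The interval-valued belief described above can be made concrete on a tiny finite space. A minimal Python sketch (the sample space and the probability assignment on the measurable sets are invented for illustration, not taken from the paper):

```python
# Probability is defined only on a sub-algebra of "measurable" sets;
# this toy assignment is illustrative, not from the paper.
omega = frozenset({"a", "b", "c"})
prob = {
    frozenset(): 0.0,
    frozenset({"a"}): 0.3,
    frozenset({"b", "c"}): 0.7,
    omega: 1.0,
}

def inner_measure(event):
    """Largest probability of a measurable set contained in the event."""
    return max(p for s, p in prob.items() if s <= event)

def outer_measure(event):
    """Smallest probability of a measurable set containing the event."""
    return min(p for s, p in prob.items() if event <= s)

e = frozenset({"b"})              # a non-measurable event
belief = inner_measure(e)         # lower bound on belief: 0.0
plausibility = outer_measure(e)   # upper bound: 0.7
```

Note that the inner measures of `e` and its negation sum to 0.3 here, not 1, which is exactly the freedom the abstract describes.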
2.
Centroid-based categorization is one of the most popular algorithms in text classification. In this approach, normalization is an important factor in improving the performance of a centroid-based classifier when documents in the collection have quite different sizes and/or the numbers of documents per class are unbalanced. In the past, most researchers applied document normalization, e.g., document-length normalization, while some considered a simple kind of class normalization, so-called class-length normalization, to address the imbalance problem. However, no intensive work has clarified how these normalizations affect classification performance or whether other useful normalizations exist. The purpose of this paper is threefold: (1) to investigate the effectiveness of document- and class-length normalizations on several data sets, (2) to evaluate a number of commonly used normalization functions, and (3) to introduce a new type of class normalization, called term-length normalization, which exploits the term distribution among documents in a class. The experimental results show that a classifier with the weight-merge-normalize approach (class-length normalization) performs better than one with the weight-normalize-merge approach (document-length normalization) on data sets with unbalanced numbers of documents per class, and is quite competitive on those with balanced numbers. Among normalization functions, normalization based on term weighting performs better than the others on average. Term-length normalization is useful for improving classification accuracy, and the combination of term- and class-length normalizations outperforms pure class-length normalization, pure term-length normalization, and no normalization by margins of 4.29%, 11.50%, and 30.09%, respectively.
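The two orderings of merging and normalizing compared above can be sketched in a few lines. A toy illustration (the term-frequency vectors and class names are invented; this is not the paper's implementation):

```python
import numpy as np

def centroid_merge_normalize(docs_by_class):
    """Class-length normalization: sum raw document vectors per class,
    then normalize the merged centroid to unit length."""
    centroids = {}
    for label, docs in docs_by_class.items():
        c = np.sum(docs, axis=0)
        centroids[label] = c / np.linalg.norm(c)
    return centroids

def centroid_normalize_merge(docs_by_class):
    """Document-length normalization: normalize each document first,
    then average into the class centroid."""
    centroids = {}
    for label, docs in docs_by_class.items():
        centroids[label] = np.mean([d / np.linalg.norm(d) for d in docs], axis=0)
    return centroids

def classify(vec, centroids):
    """Assign the class whose centroid has the highest cosine similarity."""
    v = vec / np.linalg.norm(vec)
    return max(centroids, key=lambda k: np.dot(v, centroids[k] / np.linalg.norm(centroids[k])))

# Toy term-frequency vectors; classes deliberately unbalanced in size.
docs = {
    "sports": [np.array([5.0, 1.0, 0.0]), np.array([4.0, 0.0, 1.0]), np.array([6.0, 1.0, 1.0])],
    "tech": [np.array([0.0, 1.0, 5.0])],
}
print(classify(np.array([1.0, 0.0, 4.0]), centroid_merge_normalize(docs)))  # -> tech
```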
3.
Under the influence of global climate change and intensified human activity, drought disasters have become more frequent and more severe, seriously threatening China's food security and water security. Accurate and timely drought forecasting is of great significance for formulating scientific, effective drought-response strategies and reducing disaster losses. Starting from two directions — forecasting techniques based on statistical models and those based on physically based models — this paper reviews research progress at home and abroad, identifies the problems in current forecasting techniques, and proposes targeted solutions. Future research should focus on improving the quality of drought-monitoring data, achieving breakthroughs in core key technologies, and building a national operational drought-forecasting system, so as to provide strong scientific and technological support for drought relief and disaster mitigation.
4.
As the demand for high-quality stereo images has grown in recent years, stereoscopic image quality assessment (SIQA) has become an important research area in modern image processing. In this paper, we propose a no-reference stereoscopic image quality assessment (NR-SIQA) model that applies heterogeneous ensemble learning to ‘quality-aware’ features extracted from luminance, chrominance, disparity, and cyclopean images via the quaternion wavelet transform (QWT). First, luminance and chrominance images are generated in the CIELAB color space as monocular percepts, and novel disparity and cyclopean images are used to complement the monocular information. Then, a number of ‘quality-aware’ features in the quaternion wavelet domain are extracted, including entropy, texture features, energy features, energy-difference features, and MSCN coefficients of the high-frequency sub-band. Finally, a heterogeneous ensemble model combining support vector regression (SVR), an extreme learning machine (ELM), and a random forest (RF) is proposed to predict the quality score, with bootstrap sampling and rotated feature spaces used to increase the diversity of the data distribution. Compared with state-of-the-art NR-SIQA models, experimental results on four public databases demonstrate the accuracy and robustness of the proposed model.
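The heterogeneous-ensemble idea — different base regressors, each fit on its own bootstrap resample, predictions averaged — can be sketched structurally. The base learners below are a least-squares fit and a k-NN regressor standing in for SVR/ELM/RF, and the data are invented, so this is only a sketch of the ensemble mechanics:

```python
import numpy as np

rng = np.random.default_rng(42)

def linear_learner(X, y):
    """Closed-form least-squares regressor (stand-in for SVR)."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return lambda Xt: Xt @ w

def knn_learner(X, y, k=3):
    """k-nearest-neighbour regressor (stand-in for ELM/RF)."""
    def predict(Xt):
        d = np.abs(Xt[:, None, 0] - X[None, :, 0])
        return y[np.argsort(d, axis=1)[:, :k]].mean(axis=1)
    return predict

def heterogeneous_ensemble(learners, X, y, X_test):
    """Fit each distinct base learner on its own bootstrap resample of
    the training data, then average the predicted quality scores."""
    preds = []
    for fit in learners:
        idx = rng.integers(0, len(X), size=len(X))   # bootstrap sampling
        preds.append(fit(X[idx], y[idx])(X_test))
    return np.mean(preds, axis=0)

# Toy feature -> quality-score data: score = 2 * feature.
X = np.arange(10, dtype=float).reshape(-1, 1)
y = 2.0 * X[:, 0]
score = heterogeneous_ensemble([linear_learner, knn_learner], X, y, np.array([[5.0]]))
```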
5.
In the real world, automatic detection of liver disease is a challenging problem for medical practitioners. The intent of this work is to propose an intelligent hybrid approach for the diagnosis of hepatitis disease. The diagnosis is performed with a combination of k-means clustering and improved ensemble-driven learning. To avoid reliance on clinical experience and to reduce evaluation time, ensemble learning is deployed, which constructs a set of hypotheses by using multiple learners to solve the liver disease problem. The performance of the proposed integrated hybrid system is compared in terms of accuracy, true positive rate, precision, F-measure, kappa statistic, mean absolute error, and root mean squared error. Simulation results show that the intelligent hybrid approach based on enhanced k-means clustering and improved ensemble learning with enhanced adaptive boosting, bagged decision trees, and the J48 decision tree achieved better prediction outcomes than other existing individual and integrated methods.
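The clustering stage of such a hybrid pipeline can be illustrated with a bare-bones k-means. The toy "patient feature" blobs below are invented; the paper's enhanced k-means and boosted classifiers are not reproduced here:

```python
import numpy as np

def kmeans(X, k, iters=10):
    """Plain k-means with farthest-point initialization: the clustering
    stage of the hybrid pipeline. Cluster assignments would then be fed,
    together with the raw features, to the boosted/bagged classifiers."""
    centers = [X[0]]
    for _ in range(1, k):                      # pick the farthest point as next seed
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)              # assign to nearest center
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Two well-separated toy "patient feature" blobs (illustrative data).
X = np.vstack([np.zeros((5, 2)), 10.0 * np.ones((5, 2))])
labels, centers = kmeans(X, k=2)
```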
6.
To address the incomplete extraction and false positives of existing remote-sensing water-body extraction methods, this paper proposes a method for extracting the water bodies of subsidence ponds in mining areas from SPOT-5 multispectral imagery. Building on band synthesis, which adds one usable band, existing water-extraction methods are suitably improved, and a four-level extraction of mining-area water bodies is performed with a decision-tree classifier and the improved methods, ensuring completeness of the extracted water bodies while reducing the false-extraction rate. Finally, extraction accuracy is assessed against field-measured data. Experimental results show that the decision-tree-based water extraction method achieves high accuracy and can meet the practical needs of mining areas.
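A common building block of decision-tree water extraction is a water index combined with thresholds on individual bands. The sketch below uses an NDWI-style rule; the band combination and threshold values are illustrative only and do not reproduce the paper's four-level scheme:

```python
import numpy as np

def extract_water(green, nir, ndwi_thresh=0.0, nir_thresh=0.2):
    """Toy decision-tree water mask: a pixel is water if its
    NDWI = (green - nir) / (green + nir) exceeds a threshold AND its
    NIR reflectance is low (water absorbs strongly in the near-infrared).
    Thresholds here are illustrative, not from the paper."""
    ndwi = (green - nir) / (green + nir + 1e-9)
    return (ndwi > ndwi_thresh) & (nir < nir_thresh)

# Toy 2x2 reflectance "image": left column water-like, right column land-like.
green = np.array([[0.30, 0.10], [0.25, 0.05]])
nir = np.array([[0.05, 0.40], [0.10, 0.30]])
mask = extract_water(green, nir)
```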
7.
Selective ensemble learning is one of the current research hotspots in machine learning. Because selective ensembling is NP-hard, researchers mostly use heuristics that transform it into other problems and solve for approximate optima; since these algorithms differ in their starting points and formulations, the large body of existing selective-ensemble algorithms appears miscellaneous and unsystematic. To help researchers quickly grasp and apply the latest progress in this area, this paper groups selective-ensemble algorithms into four categories according to the core selection strategy: iterative optimization, ranking, clustering, and pattern mining. Using 20 commonly used data sets from the UCI repository, representative algorithms are then compared experimentally in terms of predictive performance, selection time, and the size of the resulting ensemble. Finally, the strengths and weaknesses of each category are summarized, and future research priorities for selective ensembling are discussed.
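Of the four categories, the ranking approach is the simplest to sketch: order the base classifiers by validation accuracy and keep only the top k for voting. The predictions and labels below are invented for illustration:

```python
import numpy as np

def rank_and_select(base_preds, y_val, k):
    """Ranking-style selective ensemble: order base classifiers by
    validation accuracy and keep the indices of the top k."""
    accs = [(preds == y_val).mean() for preds in base_preds]
    return np.argsort(accs)[::-1][:k]

def majority_vote(base_preds, keep):
    """Per-sample majority vote over the selected classifiers."""
    votes = np.array([base_preds[i] for i in keep])
    return (votes.mean(axis=0) >= 0.5).astype(int)

y_val = np.array([1, 0, 1, 1, 0])
base_preds = [
    np.array([1, 0, 1, 1, 0]),  # 100% on validation
    np.array([1, 0, 1, 0, 0]),  # 80%
    np.array([0, 1, 0, 0, 1]),  # 0% -- pruned away
]
keep = rank_and_select(base_preds, y_val, k=2)
final = majority_vote(base_preds, keep)
```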
8.
Breast cancer continues to be a significant public health problem worldwide, and early detection is the key to improving its prognosis. Mammography (breast X-ray) is considered the most reliable method for early detection of breast cancer, but it is difficult for radiologists to evaluate the enormous number of mammograms generated in widespread screening both accurately and uniformly. Microcalcification clusters (MCCs) and masses are the two most important signs of breast cancer, and their automated detection is very valuable for early diagnosis. The main objective is to present a computer-aided detection system that assists radiologists in detecting these abnormalities and improving the accuracy of diagnostic decisions. The system applies a three-step procedure: enhancement using histogram equalization (HE) and morphological enhancement; segmentation of the region of interest based on Otsu's threshold to identify microcalcifications and mass lesions; and finally classification, which first separates normal from microcalcification patterns and then benign from malignant microcalcifications. In the classification stage, three methods were used: a voting k-nearest-neighbor classifier (K-NN) with a prediction accuracy of 73%, a support vector machine classifier (SVM) with 83%, and an artificial neural network classifier (ANN) with 77%.
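The Otsu segmentation step of such a pipeline can be written in a few lines of NumPy. The toy bimodal "image" below stands in for an equalized mammogram region and is illustrative only:

```python
import numpy as np

def otsu_threshold(img, bins=256):
    """Otsu's method: choose the grey level that maximizes the
    between-class variance of the foreground/background split."""
    hist, edges = np.histogram(img, bins=bins, range=(0, 1))
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w = np.cumsum(p)                 # cumulative background weight
    mu = np.cumsum(p * centers)      # cumulative background mass
    mu_total = mu[-1]
    best_t, best_var = centers[0], -1.0
    for i in range(bins - 1):
        w0, w1 = w[i], 1 - w[i]
        if w0 == 0 or w1 == 0:
            continue
        m0 = mu[i] / w0                          # background mean
        m1 = (mu_total - mu[i]) / w1             # foreground mean
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[i]
    return best_t

# Bimodal toy "image": dark background around 0.2, bright spots around 0.8.
img = np.concatenate([np.full(90, 0.2), np.full(10, 0.8)])
t = otsu_threshold(img)
mask = img > t   # segmented candidate lesions
```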
9.
To address the problem that, under varying illumination, traditional spectral-regression algorithms extract features poorly and thus seriously degrade face-recognition performance, this paper proposes a face-recognition algorithm that optimizes spectral-regression classification with local discriminant embedding. The feature vectors of the training samples are computed; using the neighborhood and class relations of the data, local discriminant embedding constructs the embedding required for classification while learning the embedding of each class's submanifold; the projection matrix is then computed with spectral-regression classification, and a nearest-neighbor classifier completes the recognition. Experiments on two large face databases, Extended Yale B and CMU PIE, verify the effectiveness of the algorithm: compared with other spectral-regression algorithms, it achieves a higher recognition rate and better operating characteristics while reducing computational complexity.
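The final projection-then-nearest-neighbor step can be sketched as follows. The projection matrix here is a random stand-in for the one learned by spectral-regression classification, and the gallery data are invented:

```python
import numpy as np

def project_and_classify(W, X_train, y_train, x):
    """Project gallery and query with the learned matrix W, then label
    the query by its nearest projected neighbour."""
    Z = X_train @ W                  # projected gallery
    z = x @ W                        # projected query
    return y_train[np.argmin(np.linalg.norm(Z - z, axis=1))]

rng = np.random.default_rng(1)
W = rng.standard_normal((4, 2))      # random stand-in for the learned projection
X_train = np.array([[1.0, 0.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0, 0.0]])
y_train = np.array(["face_A", "face_B"])
# A query close to the first gallery face stays closest after projection.
label = project_and_classify(W, X_train, y_train, np.array([0.9, 0.1, 0.0, 0.0]))
```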
10.
The introduction of small quantities of lead into calcium hydroxyapatite catalysts produces marked increases in the selectivity to C2+ hydrocarbons, while the conversion of methane remains relatively constant. Small surface concentrations of lead are sufficient to achieve C2+ selectivities of 80 and 90% with oxygen and nitrous oxide, respectively, in contrast with the 18 and 46%, respectively, obtained in the absence of lead. Since a surface concentration of lead species sufficient to stabilize pairs of methyl radicals in close proximity to each other would be expected to facilitate the formation of C2 hydrocarbons, an ensemble effect appears to be operative.