363 results found (search time: 15 ms)
1.
This paper presents a method for reconstructing unreliable spectral components of speech signals using the statistical distributions of the clean components. Our goal is to model the temporal patterns in the speech signal and to exploit correlations between speech features in the time and frequency domains simultaneously. In this approach, a hidden Markov model (HMM) is first trained on clean speech data to model the temporal patterns that appear in the sequences of spectral components. Using this model, and according to the probability of each noisy spectral component occurring in each state, probability distributions for the noisy components are estimated. Then, by applying maximum a posteriori (MAP) estimation to these distributions, the final estimates of the unreliable spectral components are obtained. The proposed method is compared to a common missing-feature method based on probabilistic clustering of the feature vectors, as well as to a state-of-the-art method based on sparse reconstruction. The experimental results exhibit significant improvements in recognition accuracy on a noise-polluted Persian corpus.
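The core imputation step can be sketched in a few lines. This is not the authors' implementation; it assumes (hypothetically) a per-state Gaussian clean-speech prior with means `clean_means`, state posteriors `state_post` from the HMM's forward-backward pass, and the standard missing-feature assumption that the noisy observation is an upper bound on the clean log-spectral value:

```python
import numpy as np

def map_reconstruct(y_noisy, state_post, clean_means):
    """Bounded-MAP sketch for one unreliable log-spectral component.

    Assumptions (illustrative only): each HMM state s contributes a
    Gaussian clean-speech prior with mean clean_means[s], weighted by
    its posterior state_post[s], and the noisy observation y_noisy is
    an upper bound on the clean value (missing-feature imputation).
    """
    prior_mode = float(np.dot(state_post, clean_means))  # posterior-weighted prior mode
    # MAP under the energy bound x <= y_noisy: clip the mode to the bound.
    return min(prior_mode, y_noisy)

# Two states; the likelier state expects a high-energy component,
# so the estimate is capped at the observed noisy value.
est = map_reconstruct(y_noisy=3.0,
                      state_post=np.array([0.2, 0.8]),
                      clean_means=np.array([1.0, 5.0]))
```

With these numbers the unconstrained prior mode is 4.2, so the bound at 3.0 is active and the component is imputed at the observed value.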
3.
Communication overhead is the key obstacle to reaching hardware performance limits. The majority of it is software overhead, a significant portion of which is attributed to message copying. To reduce this copying overhead, we have devised techniques that do not require copying a received message in order to bind it to its final destination. Instead, a late-binding mechanism, which involves address translation and a dedicated cache, gives the consuming process/thread fast access to received messages. We introduce two policies, namely Direct to Cache Transfer (DTCT) and lazy DTCT, which determine whether a message needs to be transferred into the data cache after it is bound. We have studied the proposed methods in simulation and shown their effectiveness in reducing the consuming process's access times to message payloads.
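A toy model of the late-binding idea (names and structure are my own, not the paper's): received payloads stay in the receive buffer, and binding only records an address-to-slot mapping, so no payload bytes are copied at bind time:

```python
class LateBindingBuffer:
    """Toy model of zero-copy late binding of received messages.

    Payloads remain in the receive buffer; a translation table (standing
    in for the paper's address translation plus dedicated cache) maps a
    consumer destination address to the buffer slot holding the payload.
    """

    def __init__(self):
        self.slots = {}       # slot id -> payload (the receive buffer)
        self.translate = {}   # destination address -> slot id
        self.next_slot = 0

    def receive(self, payload):
        # NIC delivers a message into the next free buffer slot.
        slot = self.next_slot
        self.slots[slot] = payload
        self.next_slot += 1
        return slot

    def bind(self, dest_addr, slot):
        # Binding records a mapping instead of copying the payload.
        self.translate[dest_addr] = slot

    def read(self, dest_addr):
        # Consumer access goes through the translation table.
        return self.slots[self.translate[dest_addr]]

buf = LateBindingBuffer()
s = buf.receive(b"hello")
buf.bind(0x1000, s)
payload = buf.read(0x1000)   # payload reached the consumer without a copy
```

The DTCT policies would then decide, at bind time, whether the slot's contents should additionally be pushed into the data cache; that hardware-level decision is outside what this sketch models.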
4.
Data co-clustering refers to the problem of simultaneously clustering two data types. Typically, the data are stored in a contingency or co-occurrence matrix C, whose rows and columns represent the two data types to be co-clustered. An entry C_ij of the matrix signifies the relation between the data type represented by row i and that represented by column j. Co-clustering derives sub-matrices from the larger data matrix by simultaneously clustering its rows and columns. In this paper, we present a novel graph-theoretic approach to data co-clustering. The two data types are modeled as the two vertex sets of a weighted bipartite graph. We then propose the Isoperimetric Co-clustering Algorithm (ICA), a new method for partitioning the bipartite graph. ICA requires only the solution of a sparse system of linear equations, instead of the eigenvalue or SVD problem required by the popular spectral co-clustering approach. Our theoretical analysis and extensive experiments on publicly available datasets demonstrate the advantages of ICA over other approaches in the quality, efficiency, and stability of the resulting partitions.
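The computational contrast with spectral methods, one sparse linear solve instead of an eigenproblem, can be illustrated with an isoperimetric-style bipartition. This is a simplified sketch, not the paper's ICA: the choice of ground vertex and the median threshold are my own assumptions:

```python
import numpy as np
from scipy.sparse import bmat, csr_matrix, diags
from scipy.sparse.linalg import spsolve

def isoperimetric_bipartition(C):
    """Sketch of an isoperimetric-style co-bipartition of the bipartite
    graph induced by a co-occurrence matrix C (rows x columns).

    Builds the bipartite graph's Laplacian, grounds one vertex, and
    solves a single sparse linear system; thresholding the solution
    yields the two co-clusters (no eigenvalue/SVD computation).
    """
    C = csr_matrix(C, dtype=float)
    n_rows, n_cols = C.shape
    # Rows and columns become the two vertex sets of the bipartite graph.
    A = bmat([[None, C], [C.T, None]], format="csr")
    d = np.asarray(A.sum(axis=1)).ravel()        # vertex degrees
    L = (diags(d) - A).tocsr()                   # graph Laplacian
    g = int(np.argmax(d))                        # ground a max-degree vertex
    keep = np.delete(np.arange(A.shape[0]), g)
    L0 = L[keep][:, keep].tocsc()                # grounded (nonsingular) Laplacian
    x = spsolve(L0, d[keep])                     # the single sparse solve
    full = np.zeros(A.shape[0])
    full[keep] = x
    labels = (full > np.median(full)).astype(int)  # threshold into two clusters
    return labels[:n_rows], labels[n_rows:]

# Two dense blocks with weak cross-links: rows/cols 0-1 vs. rows/cols 2-3.
C = np.array([[5, 5, 1, 0],
              [5, 5, 0, 1],
              [1, 0, 5, 5],
              [0, 1, 5, 5]], dtype=float)
row_labels, col_labels = isoperimetric_bipartition(C)
```

Because the grounded Laplacian of a connected graph is symmetric positive definite, the solve is cheap and deterministic, which is the stability and efficiency argument sketched in the abstract.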
6.
Existing service-life prediction models rarely account for the effect of cracks on mass transport and the durability of concrete. To correct this deficiency, transport in fractured porous media must be studied. The objective of this paper is to quantify the water permeability of localized cracks as a function of crack geometry (i.e., width, tortuosity, and surface roughness). Plain and fiber-reinforced mortar disk specimens were cracked by splitting tension; the crack profiles were digitized by image analysis and translated into crack geometric properties. Crack permeability was measured using a Darcian flow-through cell. The results show that permeability is a function of the square of the crack width. Crack tortuosity and roughness reduce the permeability by a factor of 4 to 6 below what the theory predicts for smooth parallel-plate cracks. Although tortuosity and roughness exhibit fractal behavior, they can be measured properly, yielding correct estimates of crack permeability.
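The width-squared scaling comes from smooth parallel-plate theory, where the intrinsic permeability is k = w²/12; the abstract's finding is that real cracks fall a factor of 4 to 6 below this. A minimal sketch, with the reduction factor assumed at the middle of the reported range:

```python
def crack_permeability(w, reduction=5.0):
    """Effective permeability (m^2) of a localized crack of width w (m).

    Smooth parallel-plate theory gives k = w**2 / 12; measured cracks
    fall below this by a factor of roughly 4-6 due to tortuosity and
    roughness. reduction=5.0 is an assumed mid-range value, not a
    number from the paper.
    """
    k_smooth = w**2 / 12.0        # parallel-plate (cubic-law) permeability
    return k_smooth / reduction   # tortuosity/roughness correction

# A 100-micron crack: smooth theory gives ~8.3e-10 m^2,
# corrected value ~1.7e-10 m^2.
k = crack_permeability(100e-6)
```

The strong w² dependence is why even modest crack widening dominates transport: doubling the width quadruples the permeability regardless of the roughness correction.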
9.
Most industrial applications of statistical process control involve more than one quality characteristic to be monitored. These characteristics are usually correlated, which challenges the monitoring methods. Such challenges are addressed by multivariate quality control charts, which have been widely developed in recent years. Nonetheless, multivariate process monitoring methods encounter a problem when the quality characteristics are of the attribute type and follow non-normal distributions such as the multivariate binomial or multivariate Poisson. Since data analysis in the latter case is not as easy as in the normal case, monitoring multiattribute processes involves additional complexity. In this paper, a hybrid procedure is developed to monitor correlated multiattribute processes in which the number of defects in each characteristic is important. The monitoring scheme is designed in two phases. In the first phase, the inherent skewness of the multiattribute Poisson data is largely removed using a root-transformation technique. In the second phase, the transformed data are monitored with a method based on the decision on belief (DOB) concept. Finally, simulation experiments compare the performance of the proposed methodology with the Hotelling T² and MEWMA charts in terms of in-control and out-of-control average run length. The simulation results show that the proposed methodology outperforms the other two methods.
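The first-phase idea, removing Poisson skewness with a root transformation, can be sketched as a one-dimensional search for the power r that minimizes the sample skewness of x**r. This is an illustrative reconstruction, not the paper's exact procedure, and the search grid is an assumption:

```python
import numpy as np

def best_root(x, rs=np.linspace(0.2, 0.8, 61)):
    """Pick the power r in x**r that minimizes absolute sample skewness.

    Sketch of the root-transformation idea: Poisson counts are
    right-skewed, and a power transform with r near 0.5 makes them
    approximately normal before multivariate monitoring is applied.
    The grid of candidate powers rs is an assumed choice.
    """
    def abs_skew(z):
        z = (z - z.mean()) / z.std()
        return abs((z**3).mean())
    return min(rs, key=lambda r: abs_skew(np.power(x.astype(float), r)))

# Simulated defect counts for one attribute characteristic.
rng = np.random.default_rng(0)
counts = rng.poisson(4.0, size=5000)
r = best_root(counts)
transformed = counts.astype(float)**r
```

With the skewness suppressed, each transformed characteristic behaves approximately normally, which is what makes the second-phase DOB monitoring (and comparisons against T² and MEWMA) meaningful.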
10.
A new process, termed here remelting and sedimentation (RAS), was developed to produce functionally graded Al/SiC composites with a smooth concentration gradient of SiC particles along the height of the sample, as opposed to a step change. For this purpose, the settling velocities of different-sized SiC particles in molten A356 aluminum were first measured, and the results showed reasonably good agreement with predictions from the modified Stokes law. Then slices of particulate Al/SiC composites with SiC contents of 5, 10, 15, and 20 vol.% were stacked in a cast-iron mold and heated at 650 °C, remelting and unifying the different composite parts. Guided by the preliminary settling experiments, the composite slurry was held at this temperature for three different times to find the optimum holding time for obtaining a smooth gradient of SiC concentration along the height of the sample. After quenching, the samples were sectioned and subjected to metallographic study and hardness measurements. The results confirmed that holding the melt for 60 s provides sufficient settling and redistribution of the SiC particles and results in the successful production of a functionally graded material.
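The settling-velocity baseline against which the measurements were compared is the classic Stokes law, v = g·d²·(ρ_p − ρ_f)/(18·μ). A minimal sketch follows; the material property values are assumed, order-of-magnitude handbook figures, not numbers taken from the paper:

```python
def stokes_velocity(d, rho_p, rho_f, mu, g=9.81):
    """Terminal settling velocity (m/s) from Stokes' law:

        v = g * d**2 * (rho_p - rho_f) / (18 * mu)

    Valid for small particle Reynolds numbers (creeping flow); the
    paper's "modified" Stokes law adds corrections not modeled here.
    """
    return g * d**2 * (rho_p - rho_f) / (18.0 * mu)

# Illustrative (assumed) values: a 20-micron SiC particle
# (~3210 kg/m^3) settling in molten aluminum (~2380 kg/m^3,
# viscosity ~1.3e-3 Pa*s). Gives v on the order of 0.1 mm/s.
v = stokes_velocity(d=20e-6, rho_p=3210.0, rho_f=2380.0, mu=1.3e-3)
```

The d² dependence explains why holding time is the critical process variable: coarser particles settle an order of magnitude faster, so a single holding time (60 s here) must balance sufficient redistribution of fine particles against over-settling of coarse ones.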