11.
Data-driven face sketch synthesis algorithms that rely on pixel features lack robustness to illumination changes and complex backgrounds and therefore often produce low-quality sketches. To address this problem, this paper proposes a robust face sketch synthesis algorithm based on a deep probabilistic graphical model. A preprocessing step adjusts the illumination and face pose of the test photo so that they are consistent with the training photos. Deep features replace pixel features for nearest-neighbour matching, and the deep probabilistic graphical model jointly models the sketch reconstruction weights and the deep feature weights to obtain the optimal reconstruction representation of the synthesized sketch. A fast nearest-neighbour search method is also proposed to speed up sketch synthesis. Experiments verify the robustness and speed of the proposed algorithm.
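The nearest-neighbour step described above can be illustrated with a short, self-contained sketch. This is not the paper's implementation: the patch decomposition, the feature extractor, and all function names below are hypothetical placeholders, and the joint modelling of reconstruction and feature weights by the deep probabilistic graphical model is omitted.

```python
import numpy as np

def extract_deep_features(patches):
    # Placeholder: a real system would embed each patch with a pretrained CNN;
    # here the raw pixels are simply flattened so the sketch stays runnable.
    return patches.reshape(len(patches), -1).astype(np.float64)

def synthesize_sketch_patches(test_patches, train_photo_patches, train_sketch_patches, k=5):
    """For each test-photo patch, find the k nearest training photo patches in
    deep-feature space and blend the corresponding sketch patches with soft weights."""
    f_test = extract_deep_features(test_patches)
    f_train = extract_deep_features(train_photo_patches)
    out = np.zeros((len(test_patches),) + train_sketch_patches.shape[1:])
    for i, f in enumerate(f_test):
        d = np.linalg.norm(f_train - f, axis=1)      # deep-feature distances
        nn = np.argsort(d)[:k]                       # k nearest neighbours
        w = np.exp(-d[nn])
        w /= w.sum()                                 # soft reconstruction weights
        out[i] = np.tensordot(w, train_sketch_patches[nn].astype(np.float64), axes=1)
    return out
```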
12.
The performance of reliability inference strongly depends on the modeling of the product's lifetime distribution. Many products have complex lifetime distributions whose optimal settings are not easily found, so practitioners often prefer a simpler lifetime distribution that facilitates the data modeling process even when the true distribution is more complex. The effect of model mis-specification on the product's lifetime prediction is therefore an interesting research area. This article presents results on the behavior of the relative bias (RB) and relative variability (RV) of the pth quantile in accelerated lifetime test (ALT) experiments when the generalized Gamma (GG3) distribution is incorrectly specified as a Lognormal or Weibull distribution. Both complete and censored ALT models are analyzed. First, the analytical expression for the expected log-likelihood of the mis-specified model with respect to the true model is derived. The best parameters of the incorrect model are then obtained directly by numerically maximizing this expected log-likelihood, giving the closest approximation the wrong model can provide for the end-goal task. The results demonstrate that the tail quantiles are significantly overestimated (underestimated) when the data are wrongly fitted by the Lognormal (Weibull) distribution. Moreover, the variability of the tail quantiles is significantly enlarged when the model is incorrectly specified as Lognormal or Weibull. In particular, the effect on the tail quantiles is more pronounced when the sample size and censoring ratio are not large enough. Supplementary materials for this article are available online.
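The qualitative effect on tail quantiles can be reproduced with a small Monte Carlo experiment. The sketch below is only illustrative: the article derives the best parameters analytically and also treats censored ALT data, whereas this snippet uses complete samples and assumed GG3 parameter values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_dist = stats.gengamma(a=2.0, c=1.5, scale=100.0)   # assumed GG3 "truth"
p = 0.01                                                 # lower-tail quantile of interest
q_true = true_dist.ppf(p)

n, reps = 50, 2000
q_ln, q_wb = [], []
for _ in range(reps):
    x = true_dist.rvs(size=n, random_state=rng)
    s, loc, scale = stats.lognorm.fit(x, floc=0)         # mis-specified Lognormal fit
    q_ln.append(stats.lognorm.ppf(p, s, loc, scale))
    c, loc, scale = stats.weibull_min.fit(x, floc=0)     # mis-specified Weibull fit
    q_wb.append(stats.weibull_min.ppf(p, c, loc, scale))

for name, q in [("Lognormal", q_ln), ("Weibull", q_wb)]:
    rb = (np.mean(q) - q_true) / q_true                  # relative bias of the quantile
    rv = np.std(q) / q_true                              # crude relative variability
    print(f"{name}: RB = {rb:+.2f}, RV = {rv:.2f}")
```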
13.
Verification has recently become a challenging topic for business process languages. Verification techniques such as model checking make it possible to ensure, prior to execution, that a process complies with domain-specific requirements. To apply full-state verification techniques such as model checking, the state space of the process must be constructed. This state space tends to grow exponentially with the size of the process schema and can even be infinite. We address this issue by means of requirement-specific reduction techniques, i.e., reducing the size of the state space without changing the result of the verification. We present an approach that, for a given requirement the system must fulfill, identifies the tasks relevant to the verification. Our approach then uses these relevant tasks for a reduction that confines the process to the regions of interest for the verification. To evaluate the new technique, we use real-world industrial processes and requirements. Mainly because these processes make heavy use of parallelization, full-state-search verification algorithms are unable to verify them. With our reduction, in turn, even complex processes with many parallel branches can be verified in less than 10 s.
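As a generic illustration of the reduction idea (this is not the authors' algorithm, and the naive merge below can over-approximate behaviour in general), the following snippet projects a labelled transition system onto the tasks a requirement refers to, collapsing states that are connected only by irrelevant transitions:

```python
def reduce_ts(transitions, relevant):
    """transitions: list of (src, label, dst); relevant: set of requirement-relevant task labels."""
    # Union-find over states: endpoints of irrelevant transitions are merged.
    parent = {}
    def find(s):
        parent.setdefault(s, s)
        while parent[s] != s:
            parent[s] = parent[parent[s]]   # path halving
            s = parent[s]
        return s
    def union(a, b):
        parent[find(a)] = find(b)

    for src, label, dst in transitions:
        if label not in relevant:
            union(src, dst)
    return {(find(src), label, find(dst))
            for src, label, dst in transitions if label in relevant}

ts = [("s0", "A", "s1"), ("s1", "log", "s2"), ("s2", "B", "s3"),
      ("s1", "audit", "s4"), ("s4", "B", "s5")]
print(reduce_ts(ts, {"A", "B"}))   # only the A/B structure remains
```

The approach in the paper operates on the process schema and guarantees an unchanged verification result; the point here is only that dropping requirement-irrelevant tasks can shrink the state space a model checker must explore.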
14.
We examined the exhaust performance of a hybrid ventilation strategy for maintaining a safe evacuation environment for tunnel users during a tunnel fire. The hybrid strategy combines longitudinal ventilation with point ventilation, a type of transverse ventilation. The model tunnel developed in this study was scaled to 1/5 of a full-scale tunnel, and the model-scale experiments were performed in accordance with Froude's law of similarity. The measured quantities were the distributions of temperature and smoke concentration inside the tunnel, the longitudinal wind velocity, the mass flow of smoke in the point ventilation duct, and the heat release rate of the fire source. The following main conclusions were obtained: the smoke layer height remained constant even when the extraction rate of smoke from the ceiling vent was varied; the backlayering length and the critical velocity of the smoke flow under the hybrid strategy could be predicted with the methodology developed for the longitudinal strategy; and the hybrid strategy maintained a safe evacuation environment on both sides of the tunnel fire.
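For orientation, the standard Froude-scaling relations for a 1/5 geometric scale (general model-scale conventions, not figures quoted from this study) are:

```latex
\frac{\dot{Q}_m}{\dot{Q}_f}=\left(\frac{L_m}{L_f}\right)^{5/2}=\left(\tfrac{1}{5}\right)^{5/2}\approx\frac{1}{55.9},
\qquad
\frac{u_m}{u_f}=\left(\frac{L_m}{L_f}\right)^{1/2}=\left(\tfrac{1}{5}\right)^{1/2}\approx 0.447,
```

so, for example, a 5 MW full-scale fire corresponds to roughly 90 kW at model scale, measured model velocities are multiplied by about 2.24 to recover full-scale values, and temperature rises carry over 1:1.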
15.
Two-stage vapor compression technology has high potential for performance improvement in cold-climate heat pumps, and there are several types of inter-stage configurations that need to be evaluated before a choice is made. A general model of these configurations is first derived from a subcooler cycle and then extended, by employing an "input domain", to evaluate many other inter-stage configurations. The model is solved with a sequential algorithm, and an analytical initial solution for the intermediate pressure is presented. After experimental validation, with additional checks of the subcooling parameter and the evaporating and condensing pressures, the general model is used to compare and analyze the performance of eight different inter-stage configurations. Finally, case studies show that this general model can compare the performance of cycles with different types of inter-stage configurations, as well as support refrigerant selection and operational analysis.
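The abstract does not give the analytical initial solution itself, but a common textbook starting point for a two-stage cycle is the geometric mean of the evaporating and condensing pressures, which a sequential solver can then refine. The snippet below illustrates this with CoolProp; the refrigerant and operating temperatures are assumptions for the example, not values from the article.

```python
from CoolProp.CoolProp import PropsSI

fluid = "R410A"
T_evap, T_cond = 273.15 - 25.0, 273.15 + 40.0        # assumed cold-climate operating point, K

p_evap = PropsSI("P", "T", T_evap, "Q", 1, fluid)     # saturation (dew) pressure at evaporator
p_cond = PropsSI("P", "T", T_cond, "Q", 0, fluid)     # saturation (bubble) pressure at condenser
p_int0 = (p_evap * p_cond) ** 0.5                     # geometric-mean initial guess

print(f"p_evap = {p_evap/1e3:.0f} kPa, p_cond = {p_cond/1e3:.0f} kPa, "
      f"initial p_int = {p_int0/1e3:.0f} kPa")
```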
16.
Process mining techniques relate observed behavior (i.e., event logs) to modeled behavior (e.g., a BPMN model or a Petri net). Process models can be discovered from event logs, and conformance checking techniques can be used to detect and diagnose differences between observed and modeled behavior. Existing process mining techniques can only uncover these differences; the actual repair of the model is left to the user and is not supported. In this paper we investigate the problem of repairing a process model with respect to a log, such that the resulting model can replay the log (i.e., conforms to it) and is as similar as possible to the original model. To solve the problem, we use an existing conformance checker that aligns the runs of the given process model to the traces in the log. Based on this information, we decompose the log into several sublogs of non-fitting subtraces. For each sublog, either a loop is discovered that can replay the sublog, or a subprocess is derived and added to the original model at the appropriate location. The approach is implemented in the process mining toolkit ProM and has been validated on logs and models from several Dutch municipalities.
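A very coarse, runnable toy of the repair idea (not alignment-based and not the ProM plug-in; the finite-trace "model", the prefix-based grouping, and the example data are all illustrative assumptions) looks like this:

```python
from collections import defaultdict

def common_prefix_len(a, b):
    n = 0
    for x, y in zip(a, b):
        if x != y:
            break
        n += 1
    return n

def repair(model_traces, log):
    """Model = finite set of traces it can replay; deviating subtraces from the log are
    grouped by the prefix where they branch off and added back as optional fragments."""
    model = {tuple(t) for t in model_traces}
    sublogs = defaultdict(set)                      # anchor prefix -> non-fitting subtraces
    for trace in map(tuple, log):
        if trace in model:
            continue
        best = max(model, key=lambda m: common_prefix_len(m, trace))
        k = common_prefix_len(best, trace)
        sublogs[best[:k]].add(trace[k:])
    for prefix, subs in sublogs.items():
        for sub in subs:                            # "discovered" fragment, added as an option
            model.add(prefix + sub)
    return model

model = [("register", "check", "ship")]
log = [("register", "check", "ship"),
       ("register", "check", "reject", "notify")]
print(repair(model, log))
```

The actual approach instead aligns model runs with log traces, groups non-fitting subtraces into sublogs, and discovers loops or subprocesses that are inserted at the appropriate location, which keeps the repaired model close to the original.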
17.
The generic model query language GMQL is designed to query collections of conceptual models created in arbitrary graph-based modelling languages. Querying conceptual models means searching for particular model subgraphs that comply with a predefined pattern query. Such a query specifies the structural and semantic properties of the model fragment to be returned. In this paper, we derive requirements for a generic model query language from the literature and formally specify the language's syntax and semantics. We analyze GMQL's theoretical and practical runtime performance and conclude that it returns query results within satisfactory time. Given its generic nature, GMQL contributes to a broad range of model analysis scenarios, ranging from business process compliance management to model translation and business process weakness detection. Because GMQL returns results with acceptable runtime performance, it can be used to query large collections of hundreds or thousands of conceptual models containing not only process models but also data models or organizational charts. Furthermore, we evaluate GMQL against the backdrop of existing query approaches, carving out its advantages and limitations and pointing toward future research.
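To make the notion of a pattern query concrete (in generic graph terms, not in GMQL's actual syntax), the toy below represents a conceptual model and a query pattern as typed directed graphs and enumerates the model subgraphs that match the pattern:

```python
import networkx as nx
from networkx.algorithms import isomorphism

model = nx.DiGraph()
model.add_nodes_from([
    ("check_order", {"type": "Activity"}), ("ok?", {"type": "Gateway"}),
    ("ship", {"type": "Activity"}), ("cancel", {"type": "Activity"}),
])
model.add_edges_from([("check_order", "ok?"), ("ok?", "ship"), ("ok?", "cancel")])

# Pattern: an Activity followed by a Gateway followed by an Activity.
pattern = nx.DiGraph()
pattern.add_nodes_from([("a1", {"type": "Activity"}), ("g", {"type": "Gateway"}),
                        ("a2", {"type": "Activity"})])
pattern.add_edges_from([("a1", "g"), ("g", "a2")])

matcher = isomorphism.DiGraphMatcher(
    model, pattern, node_match=isomorphism.categorical_node_match("type", None))
for mapping in matcher.subgraph_isomorphisms_iter():
    print(mapping)   # e.g. {'check_order': 'a1', 'ok?': 'g', 'ship': 'a2'}
```

GMQL additionally lets queries constrain semantic properties and is designed to scale to large model collections; the snippet only shows the basic structural-matching idea.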
18.
This paper aims to find a practical solution that reduces oscillation in Smith Predictor (SP) based designs under dead-time (DT) uncertainty, making the design less sensitive to DT changes and more effective in disturbance rejection. First, a conditional feedback mechanism is introduced into the SP to reduce the oscillation caused by model inaccuracies in the DT parameter. Then, to address the oscillation caused by the phase lag of a traditional PI controller and by uncertain dynamics, this conditional SP is combined with active disturbance rejection control (ADRC), assisted by knowledge of the process dynamics. A practical tuning method is provided for practicing engineers. The proposed approach is validated in extensive simulation studies with different types of plants and in a frequency-domain analysis. The simulation results show significant improvements in performance robustness and transient response.
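A minimal baseline simulation helps make the dead-time-mismatch problem concrete. The sketch below implements only a classical Smith Predictor with a PI controller on an assumed first-order-plus-dead-time plant and a 20% dead-time error; the conditional feedback mechanism and the ADRC combination proposed in the paper are not implemented, and all numerical values are illustrative.

```python
import numpy as np

dt, T = 0.01, 40.0
n = int(T / dt)
K, tau = 1.0, 4.0                   # plant gain and time constant
L_true, L_model = 3.0, 2.4          # true vs. modelled dead time (20% mismatch)
d_true, d_model = int(L_true / dt), int(L_model / dt)

kp, ki = 1.2, 0.4                   # PI gains tuned for the delay-free model
r = 1.0                             # unit-step setpoint

y = ym = integ = 0.0                # plant output, delay-free model output, PI integrator
u_hist = np.zeros(n)                # control history feeding the two delay lines
ym_hist = np.zeros(n)               # delay-free model output history
out = np.zeros(n)

for k in range(n - 1):
    # Smith Predictor feedback: plant output + delay-free model - delayed model output
    ym_delayed = ym_hist[k - d_model] if k >= d_model else 0.0
    e = r - (y + ym - ym_delayed)
    integ += e * dt
    u = kp * e + ki * integ

    # Delay-free internal model and true plant (first order, explicit Euler)
    ym += dt / tau * (-ym + K * u)
    u_delayed = u_hist[k - d_true] if k >= d_true else 0.0
    y += dt / tau * (-y + K * u_delayed)

    u_hist[k] = u
    ym_hist[k] = ym
    out[k + 1] = y

print(f"overshoot ≈ {out.max() - r:+.3f}, final value ≈ {out[-1]:.3f}")
```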
19.
Building Information Models (BIMs) are becoming the official standard in the construction industry for encoding, reusing, and exchanging information about structural assets. Automatically generating such representations for existing assets attracts the interest of various industrial, academic, and governmental parties, as it is expected to have a high economic impact. The purpose of this paper is to provide a general overview of the as-built modelling process, with a focus on the geometric modelling side. Relevant works from the Computer Vision, Geometry Processing, and Civil Engineering communities are presented and compared in terms of their potential to lead to automatic as-built modelling.
20.
The evaluation of functional features of manufactured workpieces is based on GO/NO-GO test results, which are obtained by comparing measured geometric characteristics with the nominal dimensions and tolerances specified by the designer. These geometrical specifications rely on a tolerancing system that was originally defined for the mating-capability function. However, many other new functions (such as reduction of flow resistance, light absorption, reduction of friction, diffraction of light, self-cleaning, or mass transmission) are now to be realized with products, particularly through micro- and nano-scaled features. If the verification process can deliver a prediction of the achievable degree of functionality, the usability of a part can be assessed more accurately, and in consequence quality and economics can be improved. A new principle for tolerancing and verification therefore turns out to be necessary. In this paper, the fundamental deficit of the current tolerancing and specification systems GPS and ASME Y14.5 is derived, and a path for extending the system by placing a functional model upstream is shown. To verify the functional capability of workpieces, an approach based on simulations with a parameterized mathematical-physical model of the function is suggested. The advantages of this approach are discussed and demonstrated with examples involving microstructured inking rolls, crankshafts, and injection valves.