2,874 results found (search time: 31 ms)
71.
The permeability of the weakly permeable bedrock beneath secure landfills for hazardous solid waste is in most cases determined by laboratory and field methods. The applicability of a given method depends on the rock type, the condition of the rock mass, and the order of magnitude of the permeability. Applying different test methods within the same test interval commonly produces discrepancies of up to two orders of magnitude, and even greater uncertainty arises from the use of different evaluation methods. Because reliable, citable permeability values are of major importance for assessing geological barriers in the construction of secure waste landfills, it is essential that coordinated research programmes be established to comprehensively analyse and evaluate all rock types that may serve as geological barriers, to determine the correlations between the different test methods, and to harmonise the evaluation methods as far as possible.
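As an example of one classical laboratory method of the kind compared here, a falling-head permeameter test gives hydraulic conductivity as k = (aL / (At)) · ln(h0/h1). The sketch below is illustrative only; all parameter values are invented and are not taken from the abstract.

```python
import math

def falling_head_k(a, L, A, t, h0, h1):
    """Hydraulic conductivity (m/s) from a falling-head permeameter test.

    a:  cross-section of the standpipe (m^2)
    L:  length of the specimen (m)
    A:  cross-section of the specimen (m^2)
    t:  elapsed time (s) for the head to fall from h0 to h1 (m)
    """
    return (a * L) / (A * t) * math.log(h0 / h1)

# Illustrative values for a weakly permeable specimen:
k = falling_head_k(a=1e-5, L=0.1, A=1e-3, t=3600.0, h0=1.0, h1=0.5)
```

A difference of two orders of magnitude between methods, as described above, would mean two such k values differing by a factor of about 100.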
72.
Domino Reactions in Organic Synthesis   Total citations: 1 (self-citations: 0, by others: 1)
Tietze LF. Chemical Reviews, 1996, 96(1): 115-136
73.
Percutaneous radiofrequency ablation (RFA) is becoming a standard minimally invasive clinical procedure for the treatment of liver tumors. However, planning the applicator placement such that the malignant tissue is completely destroyed is a demanding task that requires considerable experience. In this work, we present a fast GPU-based real-time approximation of the ablation zone incorporating the cooling effect of liver vessels. Weighted distance fields of varying RF applicator types are derived from complex numerical simulations to allow a fast estimation of the ablation zone. Furthermore, the heat-sink effect of the cooling blood flow close to the applicator's electrode is estimated by means of a preprocessed thermal equilibrium representation of the liver parenchyma and blood vessels. Utilizing the graphics card, the weighted distance field incorporating the cooling blood flow is calculated using a modular shader framework, which facilitates the real-time visualization of the ablation zone in projected slice views and in volume rendering. The proposed methods are integrated in our software assistant prototype for planning RFA therapy. The software allows the physician to interactively place virtual RF applicator models. The real-time visualization of the corresponding approximated ablation zone facilitates interactive evaluation of the tumor coverage in order to optimize the applicator's placement such that all cancer cells are destroyed by the ablation.
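A loose sketch of the weighted-distance-field idea: the ablation zone is approximated by thresholding a distance field around the electrode, with the effective radius locally shrunk where a cooling vessel passes nearby. The exponential heat-sink weighting and all parameter values below are invented for illustration; the paper derives its weights from numerical simulations.

```python
import math

def ablation_zone_radius(p, electrode, vessels, base_radius, cooling=0.5):
    """Return True if voxel p falls inside the estimated ablation zone.

    p, electrode: 3-D positions; vessels: positions on vessel centrelines.
    """
    d_e = math.dist(p, electrode)
    # Distance to the nearest vessel (inf if no vessel is modelled).
    d_v = min((math.dist(p, v) for v in vessels), default=math.inf)
    # Heat-sink weighting: the effective radius shrinks near cooling vessels.
    weight = 1.0 - cooling * math.exp(-d_v / base_radius)
    return d_e <= base_radius * weight
```

Evaluating this predicate per voxel on the GPU (as a shader) is what makes the real-time slice and volume visualization described above possible; the sketch only shows the per-voxel logic.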
74.
Medium-sized, open-participation Open Source Software (OSS) projects do not usually perform explicit software process improvement on any routine basis. It would be useful to understand how to get such a project to accept a process improvement proposal and hence to perform process innovation. We want to determine an effective and feasible qualitative research method for studying the above question. We present (narratively) a case study of how we worked towards and eventually found such a research method. The case involves four attempts at collecting suitable data about innovation episodes (direct participation (twice), polling developers for episodes, manually finding episodes in mailing list archives) and the adaptation of the Grounded Theory data analysis methodology. Direct participation allows gathering rather rich data, but does not allow for observing a sufficiently large number of innovation episodes. Polling developers for episodes did not prove to be useful. Using mailing list archives to find data to be analyzed is both feasible and effective. We also describe how the data thus found can be analyzed based on the Grounded Theory Method with suitable adjustments. By and large, our findings ought to apply to studying various phenomena in OSS development processes that are similarly heavyweight and infrequent. However, specific details may block this possibility and we cannot predict which details that might be. The amount of effort involved in direct participation approaches to qualitative research can easily be underestimated. Also, survey approaches are not well-suited for many process issues in OSS, because too few developers are sufficiently process-conscious. An approach based on passive observation is a viable alternative in the OSS context due to the availability of large amounts of fairly complete archival data.
75.
Model-based performance evaluation methods for software architectures can help architects to assess design alternatives and save costs for late life-cycle performance fixes. A recent trend is component-based performance modelling, which aims at creating reusable performance models; a number of such methods have been proposed during the last decade. Their accuracy and the needed effort for modelling are heavily influenced by human factors, which are so far hardly understood empirically. Do component-based methods allow to make performance predictions with a comparable accuracy while saving effort in a reuse scenario? We examined three monolithic methods (SPE, umlPSI, Capacity Planning (CP)) and one component-based performance evaluation method (PCM) with regard to their accuracy and effort from the viewpoint of method users. We conducted a series of three experiments (with different levels of control) involving 47 computer science students. In the first experiment, we compared the applicability of the monolithic methods in order to choose one of them for comparison. In the second experiment, we compared the accuracy and effort of this monolithic and the component-based method for the model creation case. In the third, we studied the effort reduction from reusing component-based models. Data were collected based on the resulting artefacts, questionnaires and screen recording. They were analysed using hypothesis testing, linear models, and analysis of variance. For the monolithic methods, we found that using SPE and CP resulted in accurate predictions, while umlPSI produced over-estimates. Comparing the component-based method PCM with SPE, we found that creating reusable models using PCM takes more (but not drastically more) time than using SPE and that participants can create accurate models with both techniques. 
Finally, we found that reusing PCM models can save time, because effort to reuse can be explained by a model that is independent of the inner complexity of a component. The tasks performed in our experiments reflect only a subset of the actual activities when applying model-based performance evaluation methods in a software development process. Our results indicate that sufficient prediction accuracy can be achieved with both monolithic and component-based methods, and that the higher effort for component-based performance modelling will indeed pay off when the component models incorporate and hide a sufficient amount of complexity.
76.
The construction of a new generation of MEMS which includes micro-assembly steps in the current microfabrication process is a big challenge. It is necessary to develop new production means, termed micromanufacturing systems, in order to perform these new assembly steps. The classical "top-down" approach, which consists of a functional analysis and a definition of task sequences, is insufficient for micromanufacturing systems. Indeed, the technical and physical constraints of the microworld (e.g. the adhesion phenomenon) must be taken into account in order to design reliable micromanufacturing systems. A new method of designing micromanufacturing systems is presented in this paper. Our approach combines the general "top-down" approach with a "bottom-up" approach which takes into account technical constraints. The method makes it possible to build a modular architecture for micromanufacturing systems. In order to obtain this modular architecture, we have devised an original technique for identifying modules and a technique for associating them. This work has been used to design the controller of an experimental robotic micro-assembly station.
77.
For a number of programming languages, among them Eiffel, C, Java, and Ruby, Hoare-style logics and dynamic logics have been developed. In these logics, pre- and postconditions are typically formulated using potentially effectful programs. In order to ensure that these pre- and postconditions behave like logical formulae (that is, enjoy some kind of referential transparency), a notion of purity is needed. Here, we introduce a generic framework for reasoning about purity and effects. Effects are modelled abstractly and axiomatically, using Moggi's idea of encapsulation of effects as monads. We introduce a dynamic logic (from which, as usual, a Hoare logic can be derived) whose logical formulae are pure programs in a strong sense. We formulate a set of proof rules for this logic, and prove it to be complete with respect to a categorical semantics. Using dynamic logic, we then develop a relaxed notion of purity which allows for observationally neutral effects such as writing on newly allocated memory.
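A very loose illustration of the underlying idea, effects encapsulated as a monad, using the Writer monad over a log of outputs: a program here is observationally pure when it never extends the log. This is not the paper's calculus, only a sketch of Moggi-style encapsulation.

```python
# Writer monad: a computation is a pair (value, log of emitted outputs).

def unit(x):
    """Inject a pure value: no effects, so the log is empty."""
    return (x, [])

def bind(m, f):
    """Sequence two computations, concatenating their logs."""
    v, log = m
    v2, log2 = f(v)
    return (v2, log + log2)

def tell(msg):
    """An effectful primitive: emits one log entry."""
    return (None, [msg])

pure_inc = lambda x: unit(x + 1)                               # pure
noisy_inc = lambda x: bind(tell("inc"), lambda _: unit(x + 1)) # effectful
```

In this toy model, a purity check is simply that the log component stays empty, which is the kind of observational criterion that lets effectful-looking programs still serve as logical formulae.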
78.
The performance of the DNDC and Daisy models in simulating the water dynamics in a floodplain soil of the North China Plain was tested and compared. While the DNDC model uses a simple cascade approach, the Daisy model applies the physically based Richards equation for simulating water movement in soil. For model testing, a three-year record of the soil water content from the Dong Bei Wang experimental station near Beijing was used. There, the effect of nitrogen fertilization, irrigation and straw removal on soil water and nitrogen dynamics was investigated in a three-factorial field experiment applying a split-split-plot design with 4 replications. The dataset of one treatment was used for model testing and calibration. Two other independent datasets from further treatments were employed for validating the models. For both models, the simulation results were not satisfying when default parameters were used. After parameter optimisation and the use of site-specific van Genuchten parameters, however, the Daisy model performed well. For the DNDC model, in contrast, parameter optimisation failed to improve the simulation result. Because many biological processes such as plant growth, nitrification or denitrification depend strongly on the soil water content, we conclude that the site-specific suitability of the DNDC model for simulating the soil water dynamics should be tested before further simulation of other processes.
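The cascade ("tipping bucket") water routing used by DNDC-type models, in contrast to the Richards-equation PDE solved by Daisy, can be sketched in a few lines. The layer capacities and time-step logic below are invented for illustration and are not the models' actual code.

```python
def cascade_step(water, capacity, infiltration):
    """Advance one time step of a tipping-bucket soil water cascade.

    water:        current water content per layer, top to bottom (mm)
    capacity:     field capacity per layer (mm)
    infiltration: water entering the top layer this step (mm)

    Returns (updated layer water, deep drainage leaving the profile).
    """
    flux = infiltration
    new = []
    for w, c in zip(water, capacity):
        w += flux
        flux = max(0.0, w - c)   # surplus above field capacity cascades down
        new.append(min(w, c))    # the layer retains at most its capacity
    return new, flux
```

The scheme is fast and needs only capacities, but it cannot represent upward flow or suction-driven redistribution, which is one reason a physically based Richards solver can track measured soil moisture more closely after calibration.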
79.
Standard path control laws of autonomous vehicles use the shortest distance between the vehicle's position and the path as a control error. In order to determine this distance, the projection point onto the path needs to be determined continuously. This requires fast algorithms that feature high numerical reliability in the field of vehicle application. This paper presents two different observer-based approaches for the projection problem. The identity observer reconstructs all states of interest for path control. The second one, a reduced observer, only possesses the curve parameter as a state and calculates the other values by algebraic formulas. Both algorithms consider the continuous movement of the vehicle, the run of the curve, and work without any approximation of the curve. Furthermore, they are applicable for arbitrary parameterized smooth curves, guarantee the required numerical stability, have short computation times, and show good statistical properties. The performance is shown in several simulations as well as under real conditions.
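The projection problem the observers address can be stated as finding the curve parameter s at which the error vector is orthogonal to the tangent, g(s) = (c(s) − x) · c′(s) = 0. The sketch below is not the paper's observer; it simply tracks this root with a few Newton steps, warm-started from the previous parameter, on an example circle of radius R.

```python
import math

R = 5.0
def c(s):   return (R * math.cos(s), R * math.sin(s))    # the path
def dc(s):  return (-R * math.sin(s), R * math.cos(s))   # tangent c'(s)
def d2c(s): return (-R * math.cos(s), -R * math.sin(s))  # curvature c''(s)

def project(x, s0, iters=10):
    """Newton iteration on g(s) = (c(s)-x) . c'(s); returns the parameter
    of the projection point of vehicle position x onto the curve."""
    s = s0
    for _ in range(iters):
        e = (c(s)[0] - x[0], c(s)[1] - x[1])
        g = e[0] * dc(s)[0] + e[1] * dc(s)[1]
        dg = (dc(s)[0] ** 2 + dc(s)[1] ** 2
              + e[0] * d2c(s)[0] + e[1] * d2c(s)[1])
        s -= g / dg
    return s
```

An observer-based formulation, as in the paper, additionally propagates s with the vehicle's motion between measurements, so each cycle starts very close to the root and the iteration stays numerically reliable.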
80.
Minimum size of a graph or digraph of given radius   Total citations: 1 (self-citations: 0, by others: 1)
In this paper we show that a connected graph of order n, radius r and minimum degree δ has at least edges, for n large enough, and this bound is sharp. We also present a similar result for digraphs.  相似文献   
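The quantities in the theorem (size = number of edges, radius = minimum eccentricity over all vertices, minimum degree = smallest vertex degree) can be computed directly on a small example; the bound itself, omitted in this extract, is given in the paper. The 6-cycle below is an arbitrary illustration.

```python
from collections import deque

def bfs_ecc(adj, src):
    """Eccentricity of src: the greatest BFS distance to any vertex."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return max(dist.values())

def radius(adj):
    """Radius of a connected graph: the minimum eccentricity."""
    return min(bfs_ecc(adj, v) for v in adj)

# Example: the 6-cycle, order n = 6, minimum degree 2.
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
size = sum(len(nbrs) for nbrs in cycle6.values()) // 2  # each edge counted twice
```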