Query results: 519 articles in total (search time 31 ms).
11.
Automatic prediction tools play a key role in enabling non-functional requirements analysis: they simplify the selection and assembly of components for component-based software systems and reduce the need for strong mathematical skills among software designers. By exploiting the paradigm of Model-Driven Engineering (MDE), design models can be automatically transformed into analytical models, thus enabling formal property verification. MDE is the core paradigm of the KlaperSuite framework presented in this paper, which exploits the KLAPER pivot language to fill the gap between the design and the analysis of component-based systems with respect to reliability properties. KlaperSuite is a family of tools that empowers designers to capture and analyze quality-of-service views of their systems by building a one-click bridge towards a number of established verification instruments. In this article, we concentrate on the reliability-prediction capabilities of KlaperSuite and evaluate them on several case studies from the literature and industry.
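The abstract says that design models are transformed into analytical models for reliability prediction, but does not spell out what such an analytical model looks like. The sketch below is only illustrative of one common target of this kind of transformation, a Cheung-style absorbing Markov chain over components; the component reliabilities, the usage-profile transition matrix and the choice of this particular model are assumptions for illustration, not KLAPER's actual transformation output.

```python
# A minimal, illustrative reliability analysis of the kind an MDE tool chain
# can generate from a design model: a Cheung-style Markov model over three
# hypothetical components. All numbers are made up.
import numpy as np

# Probability that each component executes without failure.
reliability = np.array([0.999, 0.995, 0.990])

# P[i, j]: probability that control is transferred from component i to
# component j after a successful execution; the last component is terminal.
P = np.array([
    [0.0, 0.7, 0.3],
    [0.0, 0.0, 1.0],
    [0.0, 0.0, 0.0],
])

# M[i, j] = R_i * P[i, j] combines execution success and control flow.
M = np.diag(reliability) @ P
# S = (I - M)^-1 accumulates the probability of all failure-free paths;
# system reliability = P(reach the final component) * R_final.
S = np.linalg.inv(np.eye(len(reliability)) - M)
system_reliability = S[0, -1] * reliability[-1]
print(round(system_reliability, 4))   # -> 0.9855
```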
12.
During software system evolution, software architects intuitively trade off architecture alternatives with respect to their extra-functional properties, such as performance, maintainability, reliability, security, and usability. Researchers have proposed numerous model-driven prediction methods based on queuing networks or Petri nets, which are claimed to be more cost-effective and less error-prone than current practice. Practitioners are reluctant to apply these methods because the prediction accuracy and the required effort are unknown. We have applied a novel model-driven prediction method called Q-ImPrESS to a large-scale process control system from ABB consisting of several million lines of code. This paper reports on the achieved performance-prediction accuracy, the reliability-prediction sensitivity analyses, and the effort in person-hours required to obtain these results.
13.
The current Web Services Agreement specification draft proposes a simple request-response protocol for agreement creation that addresses only bilateral offer exchanges. This paper proposes a framework that augments WS-Agreement to enable negotiations according to a variety of bilateral and multilateral negotiation protocols. The framework design is based on a thorough analysis of negotiation taxonomies from the literature, so that a variety of different negotiation models can be captured within a single, WS-Agreement-compatible framework. To provide the intended flexibility, the proposed protocol takes a two-stage approach: a meta-protocol is first conducted among the interested parties to agree on a common negotiation protocol, and the actual negotiation is then carried out in the second stage according to the protocol established in the first.
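The two-stage control flow can be pictured as follows. The sketch is purely illustrative: the class names, protocol names and message shapes are invented for this example and are not taken from the WS-Agreement specification or from the proposed framework.

```python
# Illustrative two-stage negotiation: stage 1 (meta-protocol) fixes a common
# negotiation protocol, stage 2 exchanges offers under that protocol.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Party:
    name: str
    supported_protocols: set            # negotiation protocols this party can run

@dataclass
class NegotiationSession:
    parties: list
    protocol: Optional[str] = None      # fixed in stage 1
    offers: list = field(default_factory=list)

    def meta_negotiate(self) -> str:
        """Stage 1: agree on a negotiation protocol all parties support."""
        common = set.intersection(*(p.supported_protocols for p in self.parties))
        if not common:
            raise RuntimeError("parties share no negotiation protocol")
        self.protocol = sorted(common)[0]   # any deterministic tie-break will do
        return self.protocol

    def negotiate(self, offer: dict) -> dict:
        """Stage 2: exchange offers under the protocol fixed in stage 1."""
        if self.protocol is None:
            raise RuntimeError("run meta_negotiate() first")
        self.offers.append(offer)
        return {"protocol": self.protocol, "current_offer": offer}

# Usage: a bilateral setup; the same flow extends to multilateral negotiations.
consumer = Party("consumer", {"offer-counteroffer", "english-auction"})
provider = Party("provider", {"offer-counteroffer"})
session = NegotiationSession([consumer, provider])
session.meta_negotiate()                                    # -> "offer-counteroffer"
print(session.negotiate({"availability": "99.9%", "price": 10}))
```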
14.
15.
Model-based performance evaluation methods for software architectures can help architects assess design alternatives and save the costs of late life-cycle performance fixes. A recent trend is component-based performance modelling, which aims at creating reusable performance models; a number of such methods have been proposed during the last decade. Their accuracy and the modelling effort they require are heavily influenced by human factors, which are so far hardly understood empirically. Do component-based methods allow performance predictions of comparable accuracy while saving effort in a reuse scenario? We examined three monolithic methods (SPE, umlPSI, Capacity Planning (CP)) and one component-based performance evaluation method (PCM) with regard to their accuracy and effort from the viewpoint of method users. We conducted a series of three experiments (with different levels of control) involving 47 computer science students. In the first experiment, we compared the applicability of the monolithic methods in order to choose one of them for comparison. In the second experiment, we compared the accuracy and effort of this monolithic method and the component-based method for the model-creation case. In the third, we studied the effort reduction gained from reusing component-based models. Data were collected from the resulting artefacts, questionnaires and screen recordings, and were analysed using hypothesis testing, linear models, and analysis of variance. For the monolithic methods, we found that SPE and CP yielded accurate predictions, while umlPSI produced overestimates. Comparing the component-based method PCM with SPE, we found that creating reusable models with PCM takes more (but not drastically more) time than with SPE, and that participants can create accurate models with both techniques. Finally, we found that reusing PCM models can save time, because the reuse effort can be explained by a model that is independent of a component's inner complexity. The tasks performed in our experiments reflect only a subset of the activities involved in applying model-based performance evaluation methods in a software development process. Our results indicate that sufficient prediction accuracy can be achieved with both monolithic and component-based methods, and that the higher effort of component-based performance modelling will indeed pay off when the component models incorporate and hide a sufficient amount of complexity.
16.
Several missions with an Unmanned Aerial Vehicle (UAV) in different realistic safety, security, and rescue field tests are presented. First, results from two safety and security missions at the 2009 European Land Robot Trials (ELROB) are presented: a UAV in the form of an Airrobot AR100-B is used in a reconnaissance scenario and in a camp-security scenario. The UAV is capable of autonomous waypoint navigation using onboard GPS processing. A digital video stream from the vehicle is used to create photo maps (also known as mosaicking) in real time at the operator station. This mapping is done using an enhanced version of Fourier-Mellin-based registration, which turns out to be very fast and robust. Furthermore, results from a rescue-oriented scenario at the 2010 Response Robot Evaluation Exercises (RREE) at Disaster City, Texas are presented. The registration for the aerial mosaicking is supplemented by an uncertainty metric and embedded into Simultaneous Localization and Mapping (SLAM), which further enhances the photo maps as the main mission deliverable.
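The abstract does not describe the enhanced registration itself, but the standard Fourier-Mellin idea it builds on is well known: the magnitude spectrum is invariant to translation, and after a log-polar warp a rotation (and scale) of the image becomes a shift of that spectrum, which phase correlation can recover. The sketch below shows that standard recipe for rotation only, using scikit-image; the test image and all parameter choices are illustrative and are not taken from the paper.

```python
# Minimal sketch of standard Fourier-Mellin rotation recovery (not the
# paper's enhanced variant).
import numpy as np
from skimage import data
from skimage.registration import phase_cross_correlation
from skimage.transform import rotate, warp_polar

def estimate_rotation(ref: np.ndarray, mov: np.ndarray) -> float:
    """Estimate the rotation (in degrees) between two equally sized frames."""
    radius = min(ref.shape) // 2
    # Magnitude spectra are translation-invariant.
    spec_ref = np.abs(np.fft.fftshift(np.fft.fft2(ref)))
    spec_mov = np.abs(np.fft.fftshift(np.fft.fft2(mov)))
    # Log-polar warping turns a rotation of the spectrum into a shift
    # along axis 0 (the angle axis), which phase correlation estimates.
    warped_ref = warp_polar(spec_ref, radius=radius, scaling="log")
    warped_mov = warp_polar(spec_mov, radius=radius, scaling="log")
    shift, _, _ = phase_cross_correlation(warped_ref, warped_mov)
    return shift[0] * 360.0 / warped_ref.shape[0]   # angular bins -> degrees

# Usage: rotate a test image by 17 degrees and recover the angle (the magnitude
# should be close to 17; the sign depends on the rotation convention).
frame = data.camera().astype(float)
print(estimate_rotation(frame, rotate(frame, angle=17)))
```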
17.
3D mapping is very challenging in the underwater domain, especially due to the lack of high-resolution, low-noise sensors. A new spectral registration method is presented that can determine the spatial 6-DOF transformation between pairs of very noisy 3D scans with only partial overlap. The approach is hence suited to cope with sonar, the predominant underwater sensor. The spectral registration method is based on Phase-Only Matched Filtering (POMF) on non-trivially resampled spectra of the 3D data.
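The abstract does not give the resampling scheme, but the POMF core itself is standard phase correlation: only the phase of the cross-power spectrum is kept, and its inverse FFT peaks at the sought offset. The minimal sketch below recovers a pure 3D translation of a synthetic grid this way; the paper applies the same core to non-trivially resampled spectra of real sonar scans to obtain the full 6-DOF transformation.

```python
# Minimal sketch of phase-only matched filtering (phase correlation)
# between two equally sized 3D arrays; recovers an integer translation.
import numpy as np

def pomf_shift(a: np.ndarray, b: np.ndarray) -> tuple:
    """Estimate the cyclic shift that maps array `b` onto array `a`."""
    cross = np.fft.fftn(a) * np.conj(np.fft.fftn(b))
    cross /= np.maximum(np.abs(cross), 1e-12)        # keep the phase, drop magnitude
    corr = np.fft.ifftn(cross).real                  # correlation surface
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices past the midpoint correspond to negative (wrapped-around) shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, a.shape))

# Usage: shift a random 3D grid by a known offset and recover it.
rng = np.random.default_rng(0)
grid = rng.random((32, 32, 32))
shifted = np.roll(grid, shift=(3, -5, 7), axis=(0, 1, 2))
print(pomf_shift(shifted, grid))                     # -> (3, -5, 7)
```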
18.
The controlled dispersion of fluids, particularly of biologically relevant solutions in micro-volumes, is of high practical interest in biotechnology and medicine. Pharmaceutical test assays, for example, need a method for the fast and defined deposition of fluid samples. Most current micro-dispensing methods, e.g. contact-based pin printing, have problems such as time delays, limited dosing velocity, minimum volumes or high interference that limit biological applications. Spraying techniques suffer from a lack of reproducibility; a defined deposition of samples on targets is not possible. Here, we introduce a new method for the parallel and spatially defined dispersion of many micro-volumes that overcomes the disadvantages of common micro-dispensers. The overall approach is that a fluid drop, produced by a droplet generator, falls on a free trajectory with a defined kinetic energy and is split by a masking unit placed perpendicular to the flight direction into at least two smaller droplets (Zimmermann et al. in Method and device for dosing fluid media, WO/2002/102515, Germany, 2002). On the target, the resulting droplets form reproducible patterns, which are enlarged and scalable images of the grid pattern. Possible applications of this method are non-contact cell patterning, cell encapsulation, cryopreservation and fast mixing processes in micro-volumes. Here, we use the method for the direct and defined parallel positioning of cell suspensions on specific substrates, which can be useful for test assays, tissue engineering and cryopreservation.
19.
Many firms are considering ‘bring-your-own-device’ (BYOD) programs, under which their employees are allowed to bring their own devices to work and use them for both private and business purposes. This study examines which factors determine an employee's intention to participate in a corporate BYOD program and how such programs affect employer attractiveness. We approach the acceptance of corporate BYOD programs from the perspective of technology acceptance research and propose a modified and extended UTAUT model for this purpose. The model was tested by surveying students in their final term (n = 444). We show that performance expectancies have the strongest positive effect on intention, while perceived threats negatively affect intention. Finally, behavioural intention was positively associated with employer attractiveness, which provides clear guidance for companies considering corporate BYOD programs. BYOD seems to play an increasingly important role in attracting and retaining future talent.
20.
The direct observation of cells over time using time-lapse microscopy can provide deep insights into many important biological processes. Reliable analyses of the motility, proliferation, invasive potential or mortality of cells are essential to many studies involving live-cell imaging and can aid biomarker discovery and diagnostic decisions. Given the vast amount of image and time-series data produced by modern microscopes, automated analysis is a key feature for capitalizing on the potential of time-lapse imaging devices. To provide fast and reproducible analyses of multiple aspects of cell behaviour, we developed TimeLapseAnalyzer. Apart from general-purpose image enhancement and segmentation procedures, this extensible, self-contained, modular cross-platform package provides dedicated modules for fast and reliable multi-target cell tracking, scratch wound-healing analysis, cell counting and tube-formation analysis in high-throughput screening of live-cell experiments. TimeLapseAnalyzer is freely available (MATLAB, Open Source) at http://www.informatik.uni-ulm.de/ni/mitarbeiter/HKestler/tla.
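TimeLapseAnalyzer itself is a MATLAB package and the abstract does not detail its algorithms; the sketch below only illustrates the simplest building block of automated cell counting on a single frame (global thresholding followed by connected-component labelling with a size filter), using a synthetic image and made-up parameters rather than the tool's actual pipeline.

```python
# Minimal single-frame cell counting: Otsu threshold + connected components.
import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu

def count_cells(frame: np.ndarray, min_area: int = 20) -> int:
    """Count bright blob-like objects in a grayscale microscopy frame."""
    mask = frame > threshold_otsu(frame)           # foreground/background split
    labels, n = ndimage.label(mask)                # connected components
    areas = ndimage.sum(mask, labels, index=range(1, n + 1))
    return int(np.sum(np.asarray(areas) >= min_area))  # drop tiny noise specks

# Usage with a synthetic frame containing three bright spots.
frame = np.zeros((100, 100))
for y, x in [(20, 20), (50, 70), (80, 40)]:
    frame[y - 3:y + 3, x - 3:x + 3] = 1.0
print(count_cells(frame))                          # -> 3
```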