31 query results found (search time: 578 ms).
1.
This paper proposes a new method for the control of continuous large-scale systems in which the measurement and control functions are distributed over computing nodes that can be shared with other applications and connected through digital communication networks. First, the nonlinear large-scale system is described by a Takagi-Sugeno (TS) fuzzy model. Then, using a fuzzy Lyapunov-Krasovskii functional, sufficient conditions for the asymptotic stability of the decentralized networked control system (DNCS) are derived in terms of linear matrix inequalities (LMIs). Finally, a numerical example and simulation results illustrate the proposed approach.
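Stability conditions of this kind reduce to a feasibility problem over matrix variables. As a minimal, hedged sketch (not the paper's decentralized fuzzy Lyapunov-Krasovskii conditions), the code below checks the basic Lyapunov LMI A^T P + P A < 0, P > 0 for a single linear subsystem using the cvxpy library; the matrix `A` is an arbitrary example chosen for illustration.

```python
import numpy as np
import cvxpy as cp

# Example subsystem matrix (assumed for illustration only).
A = np.array([[-2.0, 1.0],
              [0.5, -3.0]])
n = A.shape[0]

# Lyapunov matrix variable P = P^T > 0.
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [
    P >> eps * np.eye(n),                 # P positive definite
    A.T @ P + P @ A << -eps * np.eye(n),  # A^T P + P A negative definite
]

# Pure feasibility problem: any solution certifies asymptotic stability.
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print("LMI feasible:", prob.status == cp.OPTIMAL)
```

In the TS fuzzy setting, one would impose an LMI of this form per fuzzy rule (per local linear model) plus coupling terms contributed by the Krasovskii functional; the single-matrix check above only illustrates the numerical machinery.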
2.
Accelerated life testing (ALT) is widely used in high-reliability product assessment to obtain relevant information about an item's performance and its failure mechanisms. To analyse observed ALT data, reliability practitioners need to select a suitable accelerated life model based on the nature of the stress and the physics involved. A statistical model consists of (i) a lifetime distribution that represents the scatter in product life and (ii) a relationship between life and stress. In practice, several accelerated life models could be used for the same failure mode, and the choice of the best model is far from trivial. For this reason, an efficient procedure for discriminating between a set of competing accelerated life models is of great importance to practitioners. In this paper, accelerated life model selection is approached using the Approximate Bayesian Computation (ABC) method, with a likelihood-based approach used for comparison. To demonstrate the efficiency of the ABC method in calibrating and selecting accelerated life models, an extensive Monte Carlo simulation study is carried out using different distances to measure the discrepancy between the empirical and simulated times-to-failure data. The ABC algorithm is then applied to real accelerated fatigue life data in order to select the most likely model among five plausible candidates. The ABC method is shown to outperform the likelihood-based approach in terms of reliability predictions, mainly at the lower percentiles that are particularly useful in reliability engineering and risk assessment applications. Moreover, ABC is shown to mitigate the effects of model misspecification through an appropriate choice of the distance function.
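As a rough illustration of the idea (not the paper's models, priors, distances, or data), the sketch below performs ABC rejection sampling to choose between two hypothetical Weibull life-stress models, an Arrhenius-type and an inverse-power-type relationship; the tolerance and priors are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Observed failure times at a single stress level (synthetic placeholder data).
stress = 1.5
observed = rng.weibull(2.0, size=30) * 100.0 / stress**2

def simulate(model, theta, stress, n):
    """Draw n failure times from a candidate life-stress model."""
    shape, scale0, p = theta
    if model == "arrhenius":
        scale = scale0 * np.exp(p / stress)   # life ~ exp(B / stress)
    else:  # "inverse_power"
        scale = scale0 / stress**p            # life ~ 1 / stress^p
    return rng.weibull(shape, size=n) * scale

def distance(x, y):
    """Discrepancy between empirical samples (sorted L1 distance)."""
    return np.mean(np.abs(np.sort(x) - np.sort(y)))

def sample_prior():
    """Vague priors over (Weibull shape, baseline scale, stress exponent)."""
    return (rng.uniform(0.5, 4.0), rng.uniform(10.0, 200.0), rng.uniform(0.5, 3.0))

# ABC rejection: draw (model, parameters) from the prior and keep draws whose
# simulated data fall within the tolerance of the observed data.
models = ["arrhenius", "inverse_power"]
tolerance, n_draws = 15.0, 20000
accepted = {m: 0 for m in models}
for _ in range(n_draws):
    m = rng.choice(models)
    theta = sample_prior()
    sim = simulate(m, theta, stress, len(observed))
    if distance(observed, sim) < tolerance:
        accepted[m] += 1

total = sum(accepted.values()) or 1
for m in models:
    print(f"posterior probability of {m}: {accepted[m] / total:.2f}")
```

The acceptance proportions approximate the posterior model probabilities; changing the distance function (as the paper investigates) changes which discrepancies between observed and simulated failure times the selection is sensitive to.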
3.
A Case-Based Reasoning (CBR) system for medical diagnosis mimics the way doctors make a diagnosis. Given a new case, its accuracy in practice depends on the successful retrieval of similar cases. CBR systems have had some success with simple diseases because of the robustness of their case bases. However, their diagnostic accuracy suffers when dealing with complex diseases, particularly those that span multiple medical domains. An example of such a condition is premenstrual syndrome (PMS), which falls under both gynaecology and psychiatry. To address this issue, the paper proposes a CBR-based expert system that uses the k-nearest neighbour (KNN) algorithm to retrieve the k most similar cases based on the Euclidean distance measure. The novelty of the system lies in the design of a flexible auto-set tolerance (T), which serves as a threshold to extract cases whose similarities exceed the assigned value of T. A prototype software tool with a menu-driven graphical user interface (GUI) has been developed for case input, analysis of results, and case adaptation within the system. Finally, the performance of the tool has been checked on a set of real-world PMS cases.
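For orientation, the sketch below shows KNN case retrieval with a Euclidean similarity threshold in Python. The abstract does not specify how T is auto-set, so the heuristic used here (the mean similarity of the k nearest neighbours) and the toy case base are purely illustrative assumptions.

```python
import numpy as np

def retrieve_similar_cases(case_base, new_case, k=5, T=None):
    """Retrieve stored cases similar to new_case.

    Similarity is 1 / (1 + Euclidean distance). If no tolerance T is given,
    it is auto-set here as the mean similarity of the k nearest neighbours
    (a hypothetical heuristic; the paper's auto-set rule may differ).
    """
    features = np.asarray([c["features"] for c in case_base], dtype=float)
    distances = np.linalg.norm(features - np.asarray(new_case, dtype=float), axis=1)
    similarities = 1.0 / (1.0 + distances)

    nearest = np.argsort(similarities)[::-1][:k]
    if T is None:
        T = similarities[nearest].mean()

    # Keep only neighbours whose similarity exceeds the tolerance T.
    matches = [(case_base[i]["diagnosis"], similarities[i])
               for i in nearest if similarities[i] > T]
    return matches, T

# Toy case base: feature vectors of symptom scores with known diagnoses.
case_base = [
    {"features": [7, 2, 5], "diagnosis": "PMS"},
    {"features": [1, 8, 2], "diagnosis": "not PMS"},
    {"features": [6, 3, 6], "diagnosis": "PMS"},
    {"features": [2, 7, 1], "diagnosis": "not PMS"},
]
matches, T = retrieve_similar_cases(case_base, [6, 2, 5], k=3)
print(f"auto-set T = {T:.3f}; retrieved: {matches}")
```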
4.
In this study, the effects of annealing time and substrate nature on the physical properties of CuSbS2 thin films were investigated. CuSbS2 thin films were prepared on various substrates via the thermal evaporation technique. The as-deposited films were annealed in air for 60 and 120 min at 250 °C. Atomic force microscopy micrographs of the as-made and annealed thin films show that the surface morphology is affected by both annealing time and substrate. X-ray diffraction results show that crystallinity increases with annealing time. The microstructural parameters, crystallite size and dislocation density, were calculated. The optical properties were obtained from analysis of the experimentally recorded transmittance and reflectance spectra over the 300–1800 nm wavelength range. High absorption coefficients (10⁵–10⁶ cm⁻¹) are reached. The band gap values (Eg) are close to the theoretical optimum for efficient conversion of solar radiation into electrical power, making the material suitable for photovoltaic applications.
5.
A. Rabhi, B. Rezig, Materials Letters, 2008, 62(20): 3576-3578
Post-growth treatments in a vacuum atmosphere were performed on CuSbS2 films prepared on glass substrates by the single-source thermal evaporation method. The films were annealed under vacuum for 2 h in the temperature range 130–200 °C. The effect of this thermal treatment on the structural, optical and electrical properties of the films was studied. X-ray diffraction (XRD) patterns indicated that the films exhibited an amorphous structure for annealing temperatures below 200 °C and a polycrystalline structure with CuSbS2 as the principal phase. For the films annealed at temperatures below 200 °C, one direct optical transition in the range 1.8–2 eV was found. For the films annealed at 200 °C, two direct optical transitions emerged at 1.3 and 1.79 eV, corresponding to the CuSbS2 and Sb2S3 values respectively. The electrical measurements showed a conversion from low resistivities (3×10⁻²–9×10⁻² Ω cm) for the samples annealed at temperatures below 200 °C to relatively high resistivities (2 Ω cm) for the samples annealed at 200 °C. In all cases the samples exhibited p-type conductivity.
6.
The ToxicFarm Integrated Cooperation Framework for Virtual Teams
Developing a collaboration solution that scales to an entire organization, offers an integrated collection of cooperation tools, is general enough to address a large range of applications, and is easy for most people to deploy remains an open challenge. This paper presents the ToxicFarm services, which form an integral part of a framework for hosting Internet virtual teams. The originality of this work lies in synthesizing contributions from different domains, including version management from software engineering, process management from data engineering, and awareness from groupware tools. The paper describes the overall services offered, discusses the design choices behind their integration and implementation, relates them to existing work, and describes their use in several emerging e-business application domains such as e-finance, e-learning and e-telecom.
7.
Event data analysis is attracting increasing interest from academic researchers looking for patterns in their data. Unlike domain experts in large companies who have access to IT staff and expensive software infrastructures, researchers find it harder to manage their event data analysis efficiently by themselves. In particular, user-driven rule management becomes a challenge as analysis rules grow in size and complexity over time. In this paper, we propose an event data analysis platform called EP-RDR, intended for non-IT experts, that facilitates the evolution of event processing rules according to changing requirements. The platform integrates a rule learning framework called Ripple-Down Rules (RDR) operating in conjunction with an event pattern detection component invoked as a service (EPDaaS). We have built a prototype to demonstrate this solution on a real-life scenario involving financial data analysis.
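Ripple-Down Rules refine a knowledge base incrementally by attaching exception rules to the rule that produced a wrong conclusion, so each correction stays local to the context in which the error was observed. The sketch below is a minimal single-classification RDR node in Python, written as a generic illustration of the technique rather than EP-RDR's actual rule representation; the toy price-movement rules are invented for the example.

```python
class RDRNode:
    """A minimal single-classification Ripple-Down Rules node.

    Each node holds a condition and a conclusion. If the condition fires,
    the 'except' child is tried for a more specific conclusion; if it does
    not fire, the 'else' child is tried instead.
    """

    def __init__(self, condition, conclusion):
        self.condition = condition        # callable: case dict -> bool
        self.conclusion = conclusion
        self.except_child = None          # refines a rule that fired
        self.else_child = None            # alternative when the rule did not fire

    def classify(self, case):
        if self.condition(case):
            if self.except_child:
                refined = self.except_child.classify(case)
                if refined is not None:
                    return refined
            return self.conclusion
        return self.else_child.classify(case) if self.else_child else None


# Toy event-processing rules over price-movement events.
root = RDRNode(lambda c: True, "normal")                      # default rule
spike = RDRNode(lambda c: c["price_change"] > 0.05, "alert")  # exception to default
spike.except_child = RDRNode(lambda c: c["volume"] < 100, "ignore (thin trading)")
root.except_child = spike

print(root.classify({"price_change": 0.08, "volume": 5000}))  # -> alert
print(root.classify({"price_change": 0.08, "volume": 50}))    # -> ignore (thin trading)
print(root.classify({"price_change": 0.01, "volume": 5000}))  # -> normal
```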
8.
One of the main activities in data-intensive science is data analysis. Although there are many popular technologies that can assist scientists in various isolated aspects of data analysis, supporting analysis processes in holistic ways that promote system interoperability, integration and automation, as well as scientific reproducibility and efficient data handling, presents many challenges. A common solution is to find efficient ways of integrating various existing technologies to meet the analysis needs of scientists (which is similar to the idea behind science gateways). We believe that this solution is essentially an exercise in software design, and that in many situations these challenges should be tackled from a software design perspective. Consequently, this paper reviews different architectural design approaches that can be used to address these challenges and proposes a service-oriented framework called the Ad Hoc Data Grid Environment, which consists of an architectural pattern and its associated operational guidelines. The guidelines prescribe a number of activities based on an iterative decomposition approach to produce and evolve software architectures according to constantly changing user needs. The framework is demonstrated on a case study involving the analysis processes required for conducting financial event studies.
9.
Ellouzi Hasna, Rabhi Mokded, Khedher Saloua, Debez Ahmed, Abdelly Chedly, Zorrig Walid, SILICON, 2023, 15(1): 37-60
Seed priming has recently gained considerable attention as a means of inducing salt tolerance in several crop plants. In the present study, we evaluated the effect of seed priming with silicon (Si)...
10.
Rabhi F.A., Benatallah B., IEEE Network, 2002, 16(1): 15-19
This article studies current developments and trends in the area of capital market systems. In particular, it defines the trading lifecycle and the activities associated with it. The article then investigates opportunities for integrating legacy systems and existing communication protocols through distributed integrated services that correspond to established business processes. These integrated services link to basic services such as an exchange, a settlement, or a registry service. Examples of such integrated services include pre-trade services (e.g., analytics) and post-trade services (e.g., surveillance). The article then presents the various levels of integration in capital market systems and discusses the standards in place. It establishes that most interactions occur at low levels of abstraction, such as the network (e.g., TCP/IP), data format (e.g., FIX, XML), and middleware (e.g., CORBA) levels. Finally, the article discusses a software development methodology based on the use of design patterns. These design patterns address the essential aspects of managing integrated services in a technology-independent fashion: service wrapping, service composition, service contracting, service discovery, and service execution. The objective of the methodology is to facilitate the rapid development of new integrated services that correspond to emerging business opportunities.
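The design patterns are named but not detailed in this abstract. As a hedged sketch of the first two aspects only (service wrapping and service composition), the code below adapts a hypothetical legacy analytics component to a uniform service interface and chains it with a second service; all class and method names are invented for the example.

```python
from abc import ABC, abstractmethod

class Service(ABC):
    """Uniform, technology-independent service interface (illustrative)."""
    @abstractmethod
    def execute(self, request: dict) -> dict: ...

class LegacyAnalyticsEngine:
    """Stand-in for a legacy component with its own ad hoc API."""
    def run_report(self, ticker: str) -> float:
        return 42.0  # pretend analytics result

class AnalyticsService(Service):
    """Service wrapping: adapts the legacy API to the uniform interface."""
    def __init__(self, engine: LegacyAnalyticsEngine):
        self._engine = engine
    def execute(self, request: dict) -> dict:
        value = self._engine.run_report(request["ticker"])
        return {**request, "analytics": value}

class SurveillanceService(Service):
    """A post-trade service flagging unusual analytics values."""
    def execute(self, request: dict) -> dict:
        return {**request, "flagged": request["analytics"] > 40.0}

class CompositeService(Service):
    """Service composition: runs a pipeline of services in order."""
    def __init__(self, *stages: Service):
        self._stages = stages
    def execute(self, request: dict) -> dict:
        for stage in self._stages:
            request = stage.execute(request)
        return request

pipeline = CompositeService(AnalyticsService(LegacyAnalyticsEngine()),
                            SurveillanceService())
print(pipeline.execute({"ticker": "XYZ"}))
```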