Paid full text: 28 articles
Free: 1 article
Industrial technology: 29 articles
Articles by year:
  2022: 1
  2015: 1
  2013: 4
  2012: 1
  2011: 1
  2010: 1
  2009: 2
  2008: 3
  2007: 1
  2003: 3
  2002: 1
  2001: 1
  2000: 3
  1999: 2
  1998: 2
  1984: 1
  1981: 1
A total of 29 results found (search time: 375 ms).
11.
A recent work proposed to simplify fat-trees with adaptive routing by means of a load-balancing deterministic routing algorithm. The resulting network achieves performance figures comparable to those of the more complex adaptive-routing fat-trees when packets need to be delivered in order. In a second work by the same authors, published in IEEE CAL, they propose to simplify the fat-tree further into a unidirectional multistage interconnection network (UMIN) that uses the same load-balancing deterministic routing algorithm, and they show that comparable performance is achieved with much lower network complexity. In this comment, we show that the proposed load-balancing deterministic routing is in fact the routing scheme used by the butterfly network. Moreover, we show that the properties of the simplified UMIN they propose are intrinsic to the standard butterfly and other existing UMINs.
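For context, destination-tag routing in a butterfly is fully deterministic: at each stage the switch output port is selected by one digit of the destination address, independently of the source. The following minimal Python sketch (illustrative names and parameters, not taken from either of the commented papers) shows the idea:

```python
# Minimal sketch of destination-tag (butterfly) routing in a k-ary,
# n-stage unidirectional MIN. The digit of the destination address that
# corresponds to the current stage selects the switch output port.
# The exact digit order depends on how the stages are wired; here the
# most-significant digit is consumed at the first stage.

def butterfly_route(dest: int, radix: int, stages: int) -> list[int]:
    """Return the output port used at each stage to reach `dest`."""
    digits = []
    for _ in range(stages):
        digits.append(dest % radix)
        dest //= radix
    return list(reversed(digits))

# Example: binary butterfly with 3 stages, destination 5 = 0b101
# -> ports [1, 0, 1]; the path depends only on the destination,
# which is what makes the routing deterministic.
print(butterfly_route(5, radix=2, stages=3))
```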
12.
CORDIC-based algorithms to compute cos and are proposed. The implementation requires a standard CORDIC module plus a module to compute the direction of rotation, this being the same hardware required for the extended CORDIC vectoring, recently proposed by the authors [T. Lang and E. Antelo, IEEE Transactions on Computers, vol. 47, no. 7, 1998, pp. 736–749.]. Although these functions can be obtained as a special case of this extended vectoring, the specific algorithm we propose here presents two significant improvements: (1) it uses the same datapath width as the standard CORDIC, even when t has 2n bits (to achieve a granularity of 2–n for the whole range). In contrast, the extended vectoring unit requires about 2n bits. (2) no repetitions of iterations are needed (the extended vectoring needs some repetitions). The proposed algorithm is compatible with the extended vectoring and, in contrast with previous implementations, the number of iterations and the delay of each iteration are the same as for the conventional CORDIC algorithm.  相似文献   
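As background on the baseline module the abstract builds on, here is a minimal software model of the standard circular CORDIC in rotation mode (this is not the authors' extended algorithm; all names and parameter choices are illustrative):

```python
import math

# Standard circular CORDIC, rotation mode: approximate (cos, sin) with
# shift-and-add iterations. Valid for |theta| <= ~1.74 rad.

def cordic_rotation(theta: float, n: int = 32) -> tuple[float, float]:
    """Approximate (cos(theta), sin(theta)) with n iterations."""
    angles = [math.atan(2.0 ** -i) for i in range(n)]   # arctan(2^-i) table
    k = 1.0
    for i in range(n):
        k *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))     # inverse CORDIC gain

    x, y, z = 1.0, 0.0, theta
    for i in range(n):
        d = 1.0 if z >= 0 else -1.0                     # direction of rotation
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return x * k, y * k                                 # (cos, sin)

# Example: cordic_rotation(math.pi / 3) ≈ (0.5, 0.866)
```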
13.
14.
In this work, the systematic approach to plant-wide control design developed in a previous work by the authors [Antelo, L. T., Otero-Muras, I., Banga, J. R., & Alonso, A. A. (2007). A systematic approach to plant-wide control based on thermodynamics. Computers & Chemical Engineering, 31(7), 677–691] is applied to derive robust decentralized controllers for the challenging benchmark of the Tennessee Eastman Process (TEP). The hierarchical control design is based on the link between thermodynamics and passivity theory, as well as on the fundamentals of process networks. First, the TEP process network is decomposed into abstract mass and energy inventory networks. Conceptual inventory control loops are then developed over these subnetworks to guarantee the convergence of the mass and energy inventories to a given compact region defined by constant inventories. Thermodynamics provides a function (the entropy) with definite curvature (concavity) over these compact regions of the state space. This function can then be employed on these regions to derive natural storage function candidates that can be used to design controllers for stabilizing the network. The last step of the systematic design procedure is the realization of the controllers using the available degrees of freedom of the process: the inventory control laws are obtained as a composition of control loops implemented on the real manipulated variables (degrees of freedom).
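To make the inventory-control idea concrete, a generic mass-inventory loop (an illustrative sketch, not the paper's actual control law) can be written as

\[
\frac{dM}{dt} = \sum_i F_i^{\mathrm{in}} - \sum_i F_i^{\mathrm{out}},
\qquad
\sum_i F_i^{\mathrm{out}} := \sum_i F_i^{\mathrm{in}} + k\,(M - M^{*}),\quad k > 0,
\]

which gives \(\dot{M} = -k\,(M - M^{*})\), so the total mass inventory \(M\) converges exponentially to the set-point inventory \(M^{*}\); the energy inventory loops follow the same pattern.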
15.
The effects of the neurotoxin domoic acid (DA) on the central nervous system of rodents (essentially rats and mice) after intraperitoneal administration have been profusely studied in the past. These observations have shown that the toxin induces similar symptoms and pathology in both species, but that lethality varies greatly. This article addresses the common and specific histopathological effects in rats and mice and the difference in sensitivity of these species to DA. Various sublethal and lethal doses (from 3 mg/kg to 8 mg/kg) were employed in mice to observe neurotoxicity using different histological techniques, and these results were compared with the pathological effects after administration of the LD50 in rats (2.5 mg/kg). Additionally, we detected the presence of the toxin in various tissues by means of immunohistochemistry. Our results showed that rats are more vulnerable than mice to the neurotoxic effects of DA after intraperitoneal inoculation: lethality was extremely high in rats and the toxin produced hippocampal damage in rats surviving the intoxication, while no lesions were observed in DA-inoculated mice. As for similarities between rats and mice, both displayed similar clinical signs, and in both species the toxin was detected by immunohistochemistry in the hypophysis, a brain region not previously reported as a target of the toxin. Microsc. Res. Tech. 78:396–403, 2015. © 2015 Wiley Periodicals, Inc.
16.
In this contribution, we consider mixed-integer nonlinear programming problems subject to differential-algebraic constraints. This class of problems arises frequently in process design, and we consider the particular case of integrated process and control system design. Since these problems are frequently non-convex, local optimization techniques usually fail to locate the global solution. Here, we propose a global optimization algorithm based on extensions of the Tabu Search metaheuristic in order to solve this challenging class of problems in an efficient and robust way. The ideas of the methodology are explained and, on the basis of two case studies, the performance of the approach is evaluated. The first benchmark problem is a wastewater treatment plant model for nitrogen removal [Alex, J., Béteau, J. F., Copp, J. B., Hellinga, C., Jeppsson, U., Marsili-Libelli, S., et al. (1999). Benchmark for evaluating control strategies in wastewater treatment plants. In Proceedings of the ECC'99 conference], and the second case study is the well-known Tennessee Eastman Process [Downs, J. J., & Vogel, E. F. (1993). A plant-wide industrial process control problem. Computers & Chemical Engineering, 17, 245–255]. Numerical experiments indicate that our new method achieves improved performance in both cases and outperforms several other recent competitive solvers on these two challenging case studies.
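For readers unfamiliar with the metaheuristic, the following generic Tabu Search skeleton in Python illustrates its basic mechanics (short-term memory of visited solutions, acceptance of the best admissible move even if non-improving); it is only a sketch and does not reproduce the authors' extensions for MINLPs with differential-algebraic constraints:

```python
# Generic Tabu Search skeleton (illustrative only).

def tabu_search(objective, start, neighbours, iterations=200, tenure=10):
    """Minimize `objective` from `start`; `neighbours(x)` lists candidate moves."""
    current = best = start
    best_val = objective(best)
    tabu = []                                     # recently visited solutions

    for _ in range(iterations):
        candidates = [n for n in neighbours(current) if n not in tabu]
        if not candidates:
            break
        current = min(candidates, key=objective)  # best admissible move
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)                           # short-term memory
        if objective(current) < best_val:         # keep the incumbent best
            best, best_val = current, objective(current)
    return best, best_val

# Toy usage: minimize (x-3)^2 + (y+1)^2 over integer points.
f = lambda p: (p[0] - 3) ** 2 + (p[1] + 1) ** 2
nbrs = lambda p: [(p[0] + dx, p[1] + dy)
                  for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]
print(tabu_search(f, start=(0, 0), neighbours=nbrs))   # -> ((3, -1), 0)
```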
17.
In this work we present an implementation of the exponential function in double precision, in a unit that supports IEEE floating-point arithmetic. As in existing proposals, the implementation is based on the use of a floating-point multiplier plus additional hardware. We decompose the computation into three subexponentials. The first and third subexponentials are computed in a conventional way (table look-up and polynomial approximation). The second subexponential is computed by transforming the slow radix-2 digit-recurrence algorithm into a fast computation that uses the multiplier and the additional hardware. We present a design process that permits selecting the most convenient trade-off between hardware complexity and latency. We discuss the algorithm and the implementation, and perform a rough comparison with three previously proposed designs. Our estimations indicate that the implementation proposed in this work offers a better trade-off between hardware complexity and latency than the compared designs.
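As a rough illustration of the table-look-up-plus-polynomial idea mentioned above, here is a software model with illustrative parameters; it does not reproduce the paper's three-way hardware decomposition or its transformed digit-recurrence step:

```python
import math

# Generic exp(x) via range reduction, a small table, and a short polynomial.
# Toy parameters: relative error is around 1e-11; a real double-precision
# unit needs a larger table and/or wider intermediate precision.

T_BITS = 7                                    # table indexed by top bits of r
TABLE = {i: math.exp(i * 2.0 ** -T_BITS)      # e^(r_hi) for r_hi = i / 2^7
         for i in range(-64, 65)}

def exp_approx(x: float) -> float:
    # Step 1: e^x = 2^k * e^r with r = x - k*ln2, |r| <= ln2/2.
    k = round(x / math.log(2.0))
    r = x - k * math.log(2.0)
    # Step 2: split r into a coarse part (table look-up) and a residual.
    i = round(r * 2 ** T_BITS)
    r_lo = r - i * 2.0 ** -T_BITS             # |r_lo| <= 2^-8
    # Step 3: short polynomial for the residual subexponential.
    p = 1.0 + r_lo + r_lo * r_lo / 2.0 + r_lo ** 3 / 6.0
    return math.ldexp(TABLE[i] * p, k)        # multiply by 2^k

# exp_approx(1.0) ≈ 2.7182818...
```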
18.
Large-scale deployment of biometric systems for web-based services has to tackle technological issues related to security, interoperability and accuracy, as well as social issues related to privacy protection and the acceptance of the biometric acquisition process. The variety of biometric traits, capturing devices, targeted populations and working scenarios makes the development of a universal, all-purpose solution quite a difficult task. This paper describes the design, implementation and applicability of an open framework for distributed biometric authentication oriented to access control in web environments. Its open design makes the framework a novel and practical tool for testing and integrating biometric algorithms and devices from third parties. Special attention has been paid to security and interoperability standards in order to ease the concurrent integration and testing of biometric trait matchers developed by different laboratories or companies. Finally, to demonstrate the versatility and usability of the framework, we describe the construction of a distributed multibiometric database acquisition tool based on it.
19.
A technique based on high-resolution infrared (IR) imaging was developed to track and analyze damage evolution in thermal barrier coatings (TBCs) during controlled mechanical testing of a TBC specimen. Coating debonding and spallation were examined during a monotonic load-to-TBC-failure test. The infrared imaging, in concert with a controlled thermal gradient in the specimen, was particularly effective in identifying and tracking localized damage evolution, because damage in the TBC was always associated with a measurable surface-temperature change. It is demonstrated that the combined use of high-resolution infrared imaging and controlled mechanical testing of TBCs is an effective method to characterize the evolution of their failure.
20.
Freeze drying (lyophilization) is an attractive dehydration method for valuable food and biological products because it preserves product quality and biological activity while extending shelf life. Despite these benefits, however, freeze drying is also a notoriously energy-intensive and time-consuming process. This expensive operation therefore calls for an efficient decision-making tool able to drive the process along the most effective paths, minimizing drying time while maximizing product quality. Here we propose an integrated approach to the operational design and control of the freeze-drying process that combines dynamic modeling with efficient off-line and on-line optimization-based control. The required mass and energy balance equations remain inherently nonlinear, even in their lumped-parameter version. This results in a set of complex, computationally costly dynamic optimization problems, which are solved with selected global stochastic optimization algorithms. Real-time disturbances and model uncertainties are addressed via the proposed hierarchical multilevel approach, which allows the required control strategies to be recalculated. The developed framework proves to be a useful tool for systematically defining off-line and on-line optimal operation policies for many food and biological processing units.
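The kind of dynamic optimization problem described above can be stated generically as follows (the symbols are illustrative and not taken from the paper):

\[
\min_{u(t),\,t_f}\; t_f
\quad \text{s.t.}\quad
\dot{x}(t) = f\big(x(t), u(t)\big),\;\; x(0) = x_0,\;\;
g\big(x(t), u(t)\big) \le 0,\;\;
q\big(x(t_f)\big) \ge q_{\min},
\]

where \(x\) collects the lumped product and chamber states, \(u(t)\) the manipulated inputs (e.g., shelf temperature and chamber pressure), \(g\) the path constraints, and \(q\) a terminal product-quality measure.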