1.
In the context of future dynamic applications, systems will exhibit unpredictably varying platform resource requirements. To cope with this, they will need to be programmable not only in terms of instruction-set processors; at least partial reconfigurability will also be required. In this context, it is important for applications to exploit the memory hierarchy optimally under varying memory availability. This article presents a mapping strategy for wavelet-based applications: depending on the encountered conditions, it switches to different memory-optimized instantiations, or localizations, permitting up to 51% energy gains in memory accesses. Systematic, parameterized mapping guidelines indicate which localization should be selected under which algorithmic wavelet parameters. The results have been formalized and generalized so that they apply to wavelet-based applications more broadly.
2.
Many bottlenecks in drug discovery have been addressed by new assay and instrument technologies. However, storing and processing chemical compounds for screening remains a challenge for many drug discovery laboratories. Although automated storage and retrieval systems are commercially available for medium to large collections of chemical samples, these samples are usually stored at a central site and are not readily accessible to satellite research labs.

Drug discovery relies on the rapid testing of new chemical compounds in relevant biological assays. Newly synthesized compounds must therefore be readily available, in various formats, to the biologists performing screening assays. Until recently, our compounds were distributed in screw-cap vials to assayists, who would manually transfer and dilute each sample into an "assay-ready" compound plate for screening. The vials were then managed by individuals in an ad hoc manner.

To relieve assayists of searching for compounds and preparing their own assay-ready plates, a customized compound storage system with an ordering software application was implemented at our research facility, eliminating these bottlenecks. The system stores and retrieves compounds in 1-mL mini-tubes or microtiter plates; supports compound searches by identifier or structure; fills orders at varying concentrations in specified wells of 96- or 384-well plates; and handles requests for the addition of controls (vehicle or reference compounds). Orders are processed automatically and delivered to the assayist the following day for screening. An overview of our system demonstrates that it minimizes compound waste and ensures compound integrity and availability.
3.
Probabilistic analysis is an emerging field of structural engineering that is particularly significant for structures of great importance, such as dams and nuclear reactors. In this work, a Neural Network (NN)-based Monte Carlo Simulation (MCS) procedure is proposed for the vulnerability analysis of large concrete dams, in conjunction with a non-linear finite element analysis that predicts the bearing capacity of the dam using the Continuum Strong Discontinuity Approach. The use of NNs was motivated by the approximate concepts inherent in vulnerability analysis and by the time-consuming repeated analyses required for MCS. The Rprop algorithm is implemented to train the NN on information generated from selected non-linear analyses. The trained NN is then used within the MCS procedure to compute the peak load of the structure for different sets of basic random variables, leading to close predictions of the probability of failure. In this way, rigorous estimates of the probability of failure and of the fragility curves are obtained for the Scalere (Italy) dam at various predefined damage levels and under various flood scenarios. The uncertain properties (modeled as random variables) considered, for both test examples, are the Young's modulus, the Poisson's ratio, the tensile strength and the specific fracture energy of the concrete.
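The surrogate-assisted Monte Carlo loop described in this abstract can be sketched generically. The sketch below is not the authors' implementation: it replaces the non-linear finite element analysis with a hypothetical closed-form peak-load function, substitutes a least-squares surrogate for the Rprop-trained neural network, and assumes distributions for two of the four random variables; only the overall pattern (few expensive analyses → cheap surrogate → brute-force MCS) mirrors the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the expensive non-linear FE analysis: peak load as a
# function of two random material properties (hypothetical closed form).
def peak_load(E, ft):
    return 0.8 * E + 1.5 * ft - 0.02 * E * ft

# Step 1: run a small number of "expensive" analyses for training data.
E_tr  = rng.normal(30.0, 3.0, 50)    # Young's modulus [GPa], assumed stats
ft_tr = rng.normal(3.0, 0.4, 50)     # tensile strength [MPa], assumed stats
y_tr  = peak_load(E_tr, ft_tr)

# Step 2: fit a cheap surrogate (least squares here; the paper trains a
# neural network with Rprop -- the principle is the same).
X = np.column_stack([np.ones_like(E_tr), E_tr, ft_tr, E_tr * ft_tr])
coef, *_ = np.linalg.lstsq(X, y_tr, rcond=None)

def surrogate(E, ft):
    return coef[0] + coef[1] * E + coef[2] * ft + coef[3] * E * ft

# Step 3: brute-force MCS on the surrogate: P(peak load < demand).
N = 200_000
E_mc  = rng.normal(30.0, 3.0, N)
ft_mc = rng.normal(3.0, 0.4, N)
demand = 25.0                        # hypothetical flood-load demand
pf = np.mean(surrogate(E_mc, ft_mc) < demand)
print(f"estimated probability of failure: {pf:.4f}")
```

Because every MCS sample evaluates only the surrogate, the expensive analysis count stays fixed (50 here) no matter how many samples the failure-probability estimate needs.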
4.
Progress in the understanding and eventual management of lakes depends on iterative interaction between model-guided measurements and measurement-tested model development. An example of this progress is provided by the measurement campaigns mounted on Lake Trichonis, in central Greece. Numerical simulations and analytical theories were tested against measured currents, surface seiche and temperatures. Water gauges, current meters, anemometric stations, thermometers, and sediment and water samplers were operated simultaneously to verify the model. The agreement between the theoretically predicted and experimentally determined data was satisfactory for most of the verifications; the most significant error was due to atmospheric pressure. The computed surface seiche showed excellent agreement with the observations, even under spectral analysis, and the computed currents showed circulation patterns very similar to those measured in the field. The computed temperature distributions throughout the lake, however, were not in good agreement, because of incoming water at the bottom.
5.
In the present paper, the weighted integral method, in conjunction with Monte Carlo simulation, is used for the stochastic finite element-based reliability analysis of space frames. The limit state analysis required at each Monte Carlo simulation is performed using a non-holonomic, step-by-step elasto-plastic analysis based on the plastic node method, together with efficient solution techniques. This implementation yields cost-effective solutions in terms of both computing time and storage requirements. The numerical results presented demonstrate that the approach provides a realistic treatment of the stochastic finite element-based reliability analysis of large-scale, three-dimensional building frames.
7.
Accurate, reliable and timely burn severity maps are necessary for planning, management and rehabilitation after wildfires. This study assessed the ability of the Sentinel-2A satellite to detect burnt areas and to separate burn severity levels. It also measured the spectral separability of the different bands and of the derived indices commonly used to detect burnt areas, and briefly investigated whether environmental variables present in the burnt landscape were correlated with severity. As a case study, a wildfire that occurred in the Sierra de Gata region of the province of Cáceres, in western Spain, was used. A range of spectral indices was computed, including the Normalized Difference Vegetation Index (NDVI) and the Normalized Burn Ratio (NBR), and the potential added value of the three new Red Edge bands of the Sentinel-2A MSI sensor was also examined. Slope, aspect, fractional vegetation cover and terrain roughness were derived as environmental variables. Burn severity was classified using the Spectral Angle Mapper (SAM) classifier, and the European Environment Agency's CORINE land cover map was used to characterize the land cover types found in the burned area. A grading map of the fire produced by the Copernicus Emergency Management Service from 0.5 m resolution Pleiades imagery served as reference. The results showed a variable degree of correlation between burn severity and the spectral indices tested here. The visible part of the electromagnetic spectrum was not well suited to discerning burned from unburned land cover, whereas the NBR computed with band 12 (short-wave infrared 2, SWIR2) produced the best results for detecting burnt areas. SAM achieved a 73% overall accuracy in the thematic mapping, and none of the environmental variables appeared to have a significant impact on burn severity.

All in all, the results show that the Sentinel-2 MSI sensor can be used to discern burnt areas and burn severity. However, further studies in different regions, using the same dataset types and methods, should be carried out before the results of the current study are generalized.
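The two spectral tools named in this abstract, NBR and SAM, follow standard formulas and are easy to illustrate. The sketch below uses made-up per-pixel reflectances for Sentinel-2 bands B8A (narrow NIR) and B12 (SWIR2) and an assumed "burnt" reference spectrum; real inputs would come from an atmospherically corrected Sentinel-2 product, not from these toy values.

```python
import numpy as np

# Hypothetical reflectances for four pixels, ordered from healthy
# vegetation to severely burnt ground (illustrative values only).
nir   = np.array([0.45, 0.30, 0.12, 0.08])   # B8A
swir2 = np.array([0.10, 0.15, 0.20, 0.28])   # B12

# Normalized Burn Ratio: high for healthy vegetation, low or negative
# over burns, because burning lowers NIR and raises SWIR reflectance.
nbr = (nir - swir2) / (nir + swir2)

# Spectral Angle Mapper: angle (radians) between a pixel spectrum and a
# reference endmember; a smaller angle means a closer spectral match.
def spectral_angle(pixel, reference):
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

burnt_ref = np.array([0.08, 0.28])           # assumed burnt endmember (B8A, B12)
angles = [spectral_angle(np.array([n, s]), burnt_ref) for n, s in zip(nir, swir2)]
print("NBR:   ", np.round(nbr, 3))
print("angles:", np.round(angles, 3))
```

As the pixels progress toward the burnt end, NBR falls from strongly positive to negative and the SAM angle to the burnt endmember shrinks toward zero, which is the separation behavior the study evaluates.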
8.
This paper addresses the modeling of the erase operation in a realistic flash-EEPROM cell, based on a three-dimensional (3-D) device-simulation code in which models for higher-order physical effects have been incorporated, specifically Fowler-Nordheim (FN) and band-to-band tunneling. The ability of the code to determine the floating-gate potential self-consistently is shown, and the distribution of the band-to-band generation rate within the device during erasing is investigated. The experimental characteristics of the erase process of a memory cell are successfully reproduced.
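The Fowler-Nordheim mechanism this abstract refers to has the standard form J = A·E²·exp(−B/E). The constants in the sketch below are illustrative order-of-magnitude values for the Si/SiO2 system, not the calibrated parameters of the paper's simulation code; the point is only the extremely steep field dependence that makes FN tunneling usable for erasing.

```python
import math

# Fowler-Nordheim tunneling current density J = A * E^2 * exp(-B / E).
# A and B are barrier/material constants; the values below are typical
# order-of-magnitude figures for Si/SiO2, assumed for illustration.
A = 1.0e-6      # A / V^2 (illustrative)
B = 2.4e8       # V / cm  (illustrative)

def fn_current_density(E):
    """Tunneling current density (A/cm^2) for oxide field E (V/cm)."""
    return A * E * E * math.exp(-B / E)

# The exponential term makes the current turn on very steeply with field,
# so modest oxide-field changes swing the erase current by many decades:
for E in (6e6, 8e6, 1e7):
    print(f"E = {E:.0e} V/cm -> J = {fn_current_density(E):.3e} A/cm^2")
```

This steepness is why a device simulator must track the floating-gate potential self-consistently: the oxide field, and hence the erase current, changes continuously as charge leaves the gate.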
10.
Continuous thin films of Pt on (100) SrTiO3 substrates were dewetted to form Pt particles at 1,150 °C, at an oxygen partial pressure of 10^-20 atm. After retraction of thick (50 or 100 nm) Pt films, anisotropic SrTiO3 rods, slightly depleted in Ti, were found on the surface of the substrate; rods did not form after dewetting of thinner (10 nm) Pt films. After dewetting, a ~10 nm thick interfacial phase was found between the Pt and the SrTiO3. This interfacial phase, based on Sr and containing ~25 at% oxygen, is believed to be a transient state formed by Ti depletion from the substrate, which results in a Pt(Ti) solution in the particles. The interfacial phase forms because of the low oxygen partial pressure used to equilibrate the system, and is expected to influence the electrical properties of devices based on Pt–SrTiO3.