20 related documents retrieved.
1.
Microsystem Technologies - This work is intended to describe the design aspects and to characterize the functionality of a novel flip-chip (FC) structure applicable to THz camera assembly. The...
2.
Jeroen Breebaart Ileana Buhan Emile Kelkboom 《Electronic Commerce Research and Applications》2011,10(6):605
Biometric authentication has great potential to improve security, reduce cost, and enhance customer convenience in payment systems. Despite these benefits, biometric authentication has not yet been adopted by large-scale point-of-sale and automated teller machine systems. This paper aims at providing a better understanding of the benefits and limitations associated with the integration of biometrics into a PIN-based payment authentication system. Based on a review of the market drivers and deployment hurdles, a method is proposed in which biometrics can be seamlessly integrated into a PIN-based authentication infrastructure. By binding a fixed, renewable binary string to a noisy biometric sample, data privacy and interoperability between issuing and acquiring banks can improve considerably compared to conventional biometric approaches. The biometric system security, cost aspects, and customer convenience are subsequently compared to PIN by means of simulations using fingerprints. The results indicate that biometric authentication performance is not negatively influenced by the incorporation of key binding and release processes, and that the security, expressed as guessing entropy of the biometric key, is virtually identical to that of the current PIN. The data also suggest that, for the fingerprint database under test, the claimed benefits of cost reduction, improved security and customer convenience do not convincingly materialize when compared to PIN. This result can in part explain why large-scale biometric payment systems are virtually non-existent in Europe and the United States, and suggests that biometric modalities other than fingerprints may be more appropriate for payment systems.
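The key-binding step described above can be illustrated with a code-offset ("fuzzy commitment") construction. The sketch below uses a simple repetition code in place of the stronger error-correcting code a real deployment would use; all parameters and helper names are illustrative, not taken from the paper.

```python
import hashlib
import secrets
import numpy as np

REP = 9  # each key bit repeated 9 times; majority vote tolerates up to 4 flips per group

def encode(key_bits):
    return np.repeat(key_bits, REP)

def decode(code_bits):
    groups = code_bits.reshape(-1, REP)
    return (groups.sum(axis=1) > REP // 2).astype(np.uint8)   # majority vote

def enroll(biometric_bits, n_key_bits=16):
    """Bind a fresh random key to the enrolment sample; store only helper data + key hash."""
    key = np.frombuffer(secrets.token_bytes(n_key_bits), dtype=np.uint8) % 2
    helper = encode(key) ^ biometric_bits
    return helper, hashlib.sha256(key.tobytes()).hexdigest()

def release(biometric_bits, helper):
    """Recover the key from a fresh, noisy sample and the stored helper data."""
    key = decode(helper ^ biometric_bits)
    return hashlib.sha256(key.tobytes()).hexdigest()

rng = np.random.default_rng(0)
template = rng.integers(0, 2, 16 * REP).astype(np.uint8)     # enrolment sample (toy bits)
helper, stored_hash = enroll(template)

noisy = template.copy()
flips = rng.choice(len(noisy), size=10, replace=False)        # fresh sample with bit errors
noisy[flips] ^= 1
print("genuine match:", release(noisy, helper) == stored_hash)
```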
3.
Raghvendra Sahai Saxena Arun Panwar Sukhvinder Singh Lamba R.K. Bhan 《Sensors and actuators. A, Physical》2011,171(2):138-145
In this paper, we present a generic circuit model of a microbolometer infrared detector that can be used to simulate the electrical and thermal performance of microbolometers in a SPICE-like circuit simulator. Using this model, we have studied the effects of various parameters on microbolometer performance through PSPICE simulations for its verification. We have validated the model against the performance of the titanium microbolometers being developed at our laboratory. We have tuned the model parameters for these microbolometers and have shown that the simulated performance agrees reasonably well with the measured performance over a variety of measurement conditions. The validated model has been used to fine-tune the design of our titanium microbolometers, as it allows us to monitor internal parameters that are not easy to measure in practical devices, such as the instantaneous temperature of the microbolometer under varying incident IR intensity. The proposed model is generic; following a similar procedure, it may also be applied to other types of microbolometers, such as amorphous-Si or vanadium-oxide devices, once it is validated for them.
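For orientation, the lumped heat-balance behaviour such a circuit model captures can be sketched as follows. This is a minimal Python stand-in for the paper's SPICE model, and all parameter values are illustrative assumptions rather than the authors' measured data.

```python
import numpy as np

R0 = 200e3        # detector resistance at substrate temperature [ohm] (assumed)
alpha = 0.0035    # TCR of a titanium-like film [1/K] (assumed)
G_th = 1e-7       # thermal conductance to substrate [W/K] (assumed)
C_th = 1e-9       # thermal capacitance [J/K] (assumed)
eta = 0.8         # IR absorption efficiency (assumed)
I_bias = 2e-6     # constant bias current [A] (assumed)

def simulate(P_ir, dt=1e-5, t_end=0.1):
    """Euler integration of C_th*dT/dt = I^2*R(T) + eta*P_ir(t) - G_th*T_rel."""
    n = int(t_end / dt)
    T_rel = 0.0                        # temperature rise above substrate [K]
    v_out = np.empty(n)
    for k in range(n):
        R = R0 * (1 + alpha * T_rel)   # linear TCR model for a metal bolometer
        P_joule = I_bias ** 2 * R      # bias self-heating
        T_rel += dt * (P_joule + eta * P_ir(k * dt) - G_th * T_rel) / C_th
        v_out[k] = I_bias * R0 * (1 + alpha * T_rel)
    return v_out

dark = simulate(lambda t: 0.0)
lit = simulate(lambda t: 50e-9 if t > 0.05 else 0.0)   # 50 nW IR step at t = 50 ms
print("approx. responsivity [V/W]:", (lit[-1] - dark[-1]) / 50e-9)
```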
4.
Small-target detection against low-contrast, complex backgrounds has long been both a research hotspot and a difficult problem; the difficulty stems mainly from the complexity of the background noise and the weakness of the target. This paper analyzes and studies a morphological dilation algorithm combined with the Mean Shift algorithm: morphological dilation effectively enhances the target, while Mean Shift improves the contrast between target and background, which facilitates effective target segmentation. Small-target detection based on this method is demonstrated for two different scenarios, and the experiments show that the algorithm is effective and robust. Moreover, an adaptive threshold method is used for the final target selection. Experimental analysis shows that the algorithm consists essentially of fixed-point operations, is efficient, and lends itself to real-time hardware implementation.
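A rough sketch of the enhancement-plus-adaptive-threshold idea is given below; the Mean Shift contrast step is not reproduced, a local mean-plus-k-sigma threshold stands in for the final segmentation, and all parameters are assumptions.

```python
import numpy as np
from scipy.ndimage import grey_dilation, uniform_filter

def detect_small_targets(img, struct=3, win=21, k=4.0):
    enhanced = grey_dilation(img, size=(struct, struct))       # strengthen weak bright targets
    local_mean = uniform_filter(enhanced, size=win)            # local background statistics
    local_sq = uniform_filter(enhanced ** 2, size=win)
    local_std = np.sqrt(np.maximum(local_sq - local_mean ** 2, 0.0))
    return enhanced > local_mean + k * local_std               # adaptive threshold

rng = np.random.default_rng(0)
frame = rng.normal(0.2, 0.05, (128, 128))      # cluttered low-contrast background
frame[64, 64] += 0.4                            # one dim point target
mask = detect_small_targets(frame)
print("detections:", np.argwhere(mask))
```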
5.
Sauro Liberatore Jason L. Speyer Andy Chunliang Hsu 《Automatica》2006,42(7):1199-1209
A model-based fault detection filter is developed for structural health monitoring of a simply supported beam. The structural damage represented in the plant model is shown to decompose into a known fault direction vector maintaining a fixed direction, dependent on the damage location, and an arbitrary fault magnitude representing the extent of the damage. According to detection filter theory, if damage occurs, under certain circumstances the fault will be uniquely detected and identified through an associated invariance in the direction imposed on the fault detection filter residuals. The spectral algorithm used to design the detection filter is based on a left eigenstructure assignment approach which accommodates system sensitivities that are revealed as ill-conditioned matrices formed from the eigenvectors in the construction of the detection filter gains. The detection filter is applied to data from an aluminum simply supported beam with four piezoelectric sensors and one piezoelectric actuator. By exciting the structure at the first natural frequency, damage in the form of a 5 mm saw cut made to one side of the beam is detected and localized.
6.
Djeziri Mohand A. Djedidi Oussama Morati Nicolas Seguin Jean-Luc Bendahan Marc Contaret Thierry 《Applied Intelligence》2022,52(6):6065-6078
Applied Intelligence - Air toxicity and pollution phenomena are on the rise across the planet. Thus, the detection and control of gas pollution are nowadays major economic and environmental...
7.
K. S. Ravichandran Badrinath Narayanamurthy Gopinath Ganapathy Sri Ravalli Jaladhanki Sindhura 《Neural computing & applications》2014,25(1):105-114
This paper presents a novel approach to automatic detection of the erythemato-squamous diseases based on a fuzzy extreme learning machine (FELM). Enormous computational effort is required to classify these erythemato-squamous diseases. Earlier approaches have relied on fuzzy logic, artificial neural networks and neuro-fuzzy models. FELM-based differential diagnosis of these diseases combines decisions made by fuzzy logic and an extreme learning machine (ELM), with greater efficiency in both time and accuracy. In this paper, we develop a user-friendly interface; this tool will be useful for a dermatologist to estimate the six types of erythemato-squamous diseases with the help of a patient's histopathological and clinical data. The developed interface has built-in support for neural networks, an adaptive neuro-fuzzy inference system and FELM. A dataset containing records of 366 patients with 34 features that define six disease characteristics was taken, of which 310 records were used as training data and the remaining 56 as testing data. The dataset was preprocessed to obtain fuzzy values as input, to get more accurate results from FELM. Given a training set of such records, the ELM approach is applied. By combining fuzzy logic and ELM, more accurate results with increased performance are obtained with less computational effort. Finally, the proposed FELM model proves to be a potential solution for the diagnosis of erythemato-squamous diseases, with significant improvement in computational time and accuracy compared with other models discussed in the recent literature.
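The ELM component at the core of this approach can be sketched in a few lines: a random hidden layer followed by a closed-form least-squares solve for the output weights. The fuzzy preprocessing of the paper is not reproduced, and the synthetic data below merely stands in for the 366-record dermatology dataset.

```python
import numpy as np

class ELM:
    """Basic extreme learning machine classifier: random hidden layer, pseudo-inverse output weights."""
    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_classes = int(y.max()) + 1
        T = np.eye(n_classes)[y]                          # one-hot targets
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                  # random hidden features
        self.beta = np.linalg.pinv(H) @ T                 # least-squares output weights
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return np.argmax(H @ self.beta, axis=1)

# toy usage on synthetic 6-class data with a 310/56 train/test split
rng = np.random.default_rng(1)
X = rng.normal(size=(366, 34))
y = rng.integers(0, 6, size=366)
X[np.arange(366), y] += 3.0                               # make the classes separable
model = ELM(n_hidden=200).fit(X[:310], y[:310])
print("test accuracy:", round(float(np.mean(model.predict(X[310:]) == y[310:])), 3))
```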
8.
Tiago Esteves Pedro Quelhas Ana Maria Mendonça Aurélio Campilho 《Machine Vision and Applications》2012,23(4):623-638
Computational methods used in microscopy cell image analysis have largely augmented the impact of imaging techniques, becoming fundamental for biological research. The understanding of cell regulation processes is very important in biology, and confocal fluorescence imaging in particular plays a relevant role in the in vivo observation of cells. However, most biology researchers still analyze cells by visual inspection alone, which is time consuming and prone to subjective bias. This makes automatic cell image analysis essential for large-scale, objective studies of cells. While the classic approach to automatic cell analysis is image segmentation, for in vivo confocal fluorescence microscopy images of plants such an approach is neither trivial nor robust to image quality variations. To analyze plant cells in in vivo confocal fluorescence microscopy images with robustness and increased performance, we propose the use of local convergence filters (LCF). These filters are based on gradient convergence and as such can handle illumination variations, noise and low contrast. We apply a range of existing convergence filters to cell nuclei analysis of the Arabidopsis thaliana plant root tip. To further increase contrast invariance, we present an augmentation of local convergence approaches based on image phase information. Through the use of convergence index filters we improved the results for cell nuclei detection and shape estimation when compared with baseline approaches. Using phase congruency information we were able to further increase performance by 11% for nuclei detection accuracy and 4% for shape adaptation. Shape regularization was also applied, but with no significant gain, which indicates that shape estimation was already good for the applied filters.
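The basic convergence index idea — average, over a support region, the cosine of the angle between each local gradient and the direction pointing back toward the pixel of interest — can be sketched directly. The specific filters and phase-congruency augmentation of the paper are not reproduced here, and the radius is an assumed parameter.

```python
import numpy as np

def convergence_index(img, radius=5, eps=1e-8):
    """Average cosine between the gradient at each support point and the direction
    from that point back toward the centre pixel, for every pixel in the image."""
    gy, gx = np.gradient(img.astype(float))    # gradients along rows, columns
    mag = np.hypot(gy, gx) + eps
    ci = np.zeros(img.shape, dtype=float)
    count = 0
    for di in range(-radius, radius + 1):
        for dj in range(-radius, radius + 1):
            d = np.hypot(di, dj)
            if d == 0 or d > radius:
                continue
            uy, ux = -di / d, -dj / d          # unit vector from support point toward centre
            gy_q = np.roll(gy, (-di, -dj), axis=(0, 1))
            gx_q = np.roll(gx, (-di, -dj), axis=(0, 1))
            mag_q = np.roll(mag, (-di, -dj), axis=(0, 1))
            ci += (gy_q * uy + gx_q * ux) / mag_q
            count += 1
    return ci / count

# bright blob on a noisy background: the response should be strongest near the blob centre
rng = np.random.default_rng(0)
img = rng.normal(0, 0.1, (64, 64))
yy, xx = np.mgrid[:64, :64]
img += np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / 30.0)
response = convergence_index(img, radius=5)
print("peak response at", np.unravel_index(response.argmax(), response.shape))
```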
9.
Benchmarking classifiers to optimally integrate terrain analysis and multispectral remote sensing in automatic rock glacier detection
Alexander Brenning 《Remote sensing of environment》2009,113(1):239-247
The performance improvements that can be achieved by classifier selection and by integrating terrain attributes into land cover classification are investigated in the context of rock glacier detection. While exposed glacier ice can easily be mapped from multispectral remote-sensing data, the detection of rock glaciers and debris-covered glaciers is a challenge for multispectral remote sensing. Motivated by the successful use of digital terrain analysis in rock glacier distribution models, the predictive performance of a combination of terrain attributes derived from SRTM (Shuttle Radar Topography Mission) digital elevation models and Landsat ETM+ data for detecting rock glaciers in the San Juan Mountains, Colorado, USA, is assessed. Eleven statistical and machine-learning techniques are compared in a benchmarking exercise, including logistic regression, generalized additive models (GAM), linear discriminant techniques, the support vector machine, and bootstrap-aggregated tree-based classifiers such as random forests. Penalized linear discriminant analysis (PLDA) yields mapping results that are significantly better than all other classifiers, achieving a median false-positive rate (mFPR, estimated by cross-validation) of 8.2% at a sensitivity of 70%, i.e. when 70% of all true rock glacier points are detected. The GAM and standard linear discriminant analysis were second best (mFPR: 8.8%), followed by polyclass. For comparison, the predictive performance of the best three techniques is also evaluated using (1) only terrain attributes as predictors (mFPR: 13.1-14.5% for the best three techniques), and (2) only Landsat ETM+ data (mFPR: 19.4-22.7%), yielding significantly higher mFPR estimates at 70% sensitivity. The mFPR of the worst three classifiers was about one-quarter higher than that of the best three, and the combination of terrain attributes and multispectral data reduced the mFPR by more than one-half compared to remote sensing only. These results highlight the importance of combining remote-sensing and terrain data for mapping rock glaciers and other debris-covered ice, and of choosing the optimal classifier based on unbiased error estimators. The proposed benchmarking methodology is more generally suitable for comparing the utility of remote-sensing algorithms and sensors.
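The evaluation protocol — cross-validated scores for several classifiers, summarized as the false-positive rate at 70% sensitivity — can be sketched as below. Penalized LDA is not available in scikit-learn, so standard LDA, logistic regression and a random forest stand in, and the synthetic features are only a placeholder for the terrain and Landsat ETM+ predictors.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve
from sklearn.model_selection import cross_val_predict

# synthetic, imbalanced stand-in for rock glacier presence/absence points
X, y = make_classification(n_samples=3000, n_features=12, n_informative=6,
                           weights=[0.9, 0.1], random_state=0)

def fpr_at_sensitivity(y_true, scores, sensitivity=0.7):
    fpr, tpr, _ = roc_curve(y_true, scores)
    return fpr[np.searchsorted(tpr, sensitivity)]   # first operating point reaching 70% TPR

classifiers = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "LDA": LinearDiscriminantAnalysis(),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, clf in classifiers.items():
    scores = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
    print(f"{name}: FPR at 70% sensitivity = {fpr_at_sensitivity(y, scores):.3f}")
```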
10.
Correlation filter: an accurate approach to detect and locate low contrast character strings in complex table environment
Li Y Wang Z Zeng H 《IEEE transactions on pattern analysis and machine intelligence》2004,26(12):1639-1644
Correlation has been used extensively in the object detection field. In this paper, two kinds of correlation filters, minimum average correlation energy (MACE) and extended maximum average correlation height (EMACH), are applied as adaptive shift locators to detect and locate smudgy character strings in complex tabular color flight coupon images. These strings in the irregular tabular coupons are computer-printed characters but of low contrast, and they may be shifted out of the table, so traditional algorithms cannot detect and locate them. In our experiment, strings are extracted in a preprocessing phase by removing the background, and then, based on geometric information, the two correlation filters are applied to locate the expected fields. We compare the results from the two correlation filters and demonstrate that this algorithm is a highly accurate approach.
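As a point of reference, the standard MACE filter can be built in the frequency domain as h = D^-1 X (X^H D^-1 X)^-1 u, where X holds the FFTs of the training images and D their average power spectrum. The sketch below implements that textbook formulation on toy data, not the paper's EMACH variant or its coupon preprocessing.

```python
import numpy as np

def mace_filter(train_imgs, u=None):
    """Minimum Average Correlation Energy filter in the frequency domain."""
    X = np.stack([np.fft.fft2(im).ravel() for im in train_imgs], axis=1)   # d x N
    d, N = X.shape
    if u is None:
        u = np.ones(N)                         # desired correlation values at the origin
    Dinv = 1.0 / (np.mean(np.abs(X) ** 2, axis=1) + 1e-12)                 # inverse average power spectrum
    DinvX = Dinv[:, None] * X
    A = X.conj().T @ DinvX                     # N x N constraint matrix
    h = DinvX @ np.linalg.solve(A, u)          # frequency-domain filter
    return h.reshape(train_imgs[0].shape)

def correlate(img, H):
    """Circular correlation plane of an image with a frequency-domain filter."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.conj(H)))

# toy usage: the correlation peak should track the shift of the test pattern
rng = np.random.default_rng(1)
base = np.zeros((32, 32))
base[12:20, 12:20] = 1.0
train = [base + 0.05 * rng.normal(size=base.shape) for _ in range(3)]
H = mace_filter(train)
plane = correlate(np.roll(base, (5, 5), axis=(0, 1)), H)
print("correlation peak at", np.unravel_index(plane.argmax(), plane.shape))
```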
11.
In this work, a model-based procedure exploiting analytical redundancy for the detection and isolation of faults in a simulated gas turbine process is presented. The main point of the paper is the exploitation of an identification scheme in connection with dynamic observer or filter design procedures for diagnostic purposes. Black-box modelling and output estimation approaches to fault diagnosis are thus particularly advantageous in terms of solution complexity and achieved performance. Moreover, the suggested scheme is especially useful when robust solutions are considered for minimising the effects of modelling errors and noise, while maximising fault sensitivity. In order to verify the robustness of the obtained solution experimentally, the proposed FDI strategy has been applied to simulation data of a single-shaft industrial gas turbine plant in the presence of measurement and modelling errors. Extensive simulations of the test-bed process and Monte Carlo analysis are the tools for experimentally assessing the capabilities of the developed FDI scheme, which is also compared with different data-driven diagnosis methods.
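The underlying identification-plus-residual idea can be illustrated in a few lines: fit a black-box ARX model on fault-free data, generate output residuals, and raise an alarm when they exceed a threshold derived from the healthy residual statistics. This is only a generic sketch on synthetic data, not the paper's observer/filter design or its gas turbine model.

```python
import numpy as np

def fit_arx(u, y, na=2, nb=2):
    """Least-squares ARX identification: y[k] = sum a_i*y[k-i] + sum b_j*u[k-j]."""
    n = max(na, nb)
    Phi = np.array([np.concatenate([y[k - na:k][::-1], u[k - nb:k][::-1]])
                    for k in range(n, len(y))])
    theta, *_ = np.linalg.lstsq(Phi, y[n:], rcond=None)
    return theta, n

def residuals(u, y, theta, n, na=2, nb=2):
    r = np.zeros_like(y)
    for k in range(n, len(y)):
        phi = np.concatenate([y[k - na:k][::-1], u[k - nb:k][::-1]])
        r[k] = y[k] - phi @ theta
    return r

# synthetic healthy data from a simple first-order process, then an additive sensor fault
rng = np.random.default_rng(0)
u = rng.normal(size=2000)
y = np.zeros(2000)
for k in range(1, 2000):
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1] + 0.02 * rng.normal()

theta, n = fit_arx(u[:1000], y[:1000])                 # identify on fault-free data
y_fault = y.copy()
y_fault[1500:] += 0.3                                   # abrupt sensor bias fault
r = residuals(u, y_fault, theta, n)
thr = 5 * np.std(residuals(u[:1000], y[:1000], theta, n)[n:])   # threshold from healthy residuals
alarm = np.where(np.abs(r) > thr)[0]
print("first alarm at sample", alarm[0] if alarm.size else None)
```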
12.
Zulkifley Mohd Asyraf Abdani Siti Raihanah Zulkifley Nuraisyah Hani 《Multimedia Tools and Applications》2019,78(24):34563-34584
Multimedia Tools and Applications - Automatic pterygium detection is an essential screening tool for health community service groups. It allows non-experts to perform the screening process without the...
13.
Nanda Umakanta Acharya Debiprasad Priyabrata Patra Sarat Kumar 《Microsystem Technologies》2017,23(3):533-539
The blind zone of a phase frequency detector (PFD) increases the phase noise in a charge-pump PLL. This paper presents a novel technique to reduce the blind zone, which reduces the reference spur as well. In the proposed work, a variable delay element is incorporated in the reset path of the PFD. The overall PFD delay is maintained at a small positive value to avoid the blind zone while keeping phase noise low. The performance analysis is carried out in the Cadence design environment and compared with the performance of a PFD using a fixed delay element in its reset path. The comparison shows that the phase noise is improved by 6 dB.
14.
Moon and Bien (1991) proposed a new accommodation filter for the state estimation and the failure diagnosis, and in addition a minimum variance accommodation filter was suggested for the stochastic system. However, in the above paper the design of the filter for the deterministic system was heuristic, whereas for the stochastic system the gain used was not a minimum variance filter gain. In this correspondence item we present a systematic method of gain determination for the accommodation filter as well as a minimum variance filter gain for the noisy system.
15.
Parihar Ashish Singh Chakraborty Swarnendu Kumar 《The Journal of supercomputing》2021,77(12):14305-14355
The Journal of Supercomputing - The problem of mutual exclusion is an intensively studied area in distributed architectures. To avoid inconsistency in data, mutual exclusion ensures that no two...
16.
17.
Over the last 15 years much effort has been devoted to the segmentation of videos into scenes. We give a comprehensive overview of the published approaches and classify them into seven groups based on three basic classes of low-level features used for the segmentation process: (1) visual-based, (2) audio-based, (3) text-based, (4) audio-visual-based, (5) visual-textual-based, (6) audio-textual-based and (7) hybrid approaches. We aim to make video scene detection approaches easier to assess and compare by categorizing the evaluation strategies used, including the size and type of the datasets as well as the evaluation metrics. Furthermore, to help the reader make use of the survey, we list eight possible application scenarios, including a dedicated section on interactive video scene segmentation, and identify the algorithms that can be applied to them. Finally, current challenges for scene segmentation algorithms are discussed. The appendix summarizes the most important characteristics of the algorithms presented in this paper in table form.
18.
T. G. Petrov S. V. Chebanov 《Automatic Documentation and Mathematical Linguistics》2016,50(5):202-213
An approach to the enhancement of mental alertness in order to extract implicit knowledge and convert it into new information is considered. The approach is based on multifaceted analysis of certain objects, situations, problems, ideas, or confusions that appear to be important for a person or a legal entity. The distinctive feature of the approach is the formulation of a large number of "relevant" statements at the initial stage. Each of these statements is used to construct (1) a set of "factors," or statements that express the causes, the pretexts, and the conditions (including starting and boundary conditions), and (2) a set of "effects," or statements that present the consequences, the results, the conclusions, and novel requirements and/or suggestions concerning the conditions of implementation, etc. Each factor and effect is subsequently considered as a statement that requires analysis of the same type as the primary statements. The procedure is repeated until the detected factors and effects are limited to factors that are not amenable to further analysis or effects that can be regarded as boundary effects relative to all other effects. The analytical procedure has been validated and is used to solve diverse tasks in pedagogics, to resolve conflicts, and to conduct research.
19.
Tarik Cakar Mehmet Bayram Yildirim Mehmet Barut 《Journal of Intelligent Manufacturing》2005,16(4-5):453-462
In this paper, we propose a neuro-genetic decision support system coupled with simulation to design a job shop manufacturing system that achieves predetermined values of targeted performance measures such as flow time, number of tardy jobs, total tardiness and machine utilization at each work center. When a manufacturing system is designed, management has to make decisions on the availability of resources or capacity, in our setting the number of identical machines at each work station, and on the dispatching rule to be used on the shop floor to achieve the desired performance values. Four different priority rules are used: Earliest Due Date (EDD), Shortest Processing Time (SPT), Critical Ratio (CR) and First Come First Served (FCFS). In reaching the final decision, design alternatives obtained from the proposed system are evaluated in terms of the performance measures. An illustrative example is provided to explain the procedure.
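For concreteness, the four priority rules named in the abstract reduce to simple sort keys over the queued jobs. The sketch below shows one common formulation (lower key = dispatched first) with made-up job data; the paper's simulation and neuro-genetic search are not reproduced.

```python
from dataclasses import dataclass

@dataclass
class Job:
    arrival: float       # time the job entered the queue
    processing: float    # processing time on this machine
    due: float           # due date

def priority(job, rule, now):
    """Sort key for the four dispatching rules; smaller means dispatched first."""
    if rule == "EDD":    # earliest due date
        return job.due
    if rule == "SPT":    # shortest processing time
        return job.processing
    if rule == "CR":     # critical ratio = remaining slack / remaining work
        return (job.due - now) / job.processing
    if rule == "FCFS":   # first come, first served
        return job.arrival
    raise ValueError(rule)

queue = [Job(0.0, 5.0, 20.0), Job(1.0, 2.0, 8.0), Job(2.0, 7.0, 9.0)]
now = 3.0
for rule in ("EDD", "SPT", "CR", "FCFS"):
    order = sorted(queue, key=lambda j: priority(j, rule, now))
    print(rule, [queue.index(j) for j in order])
```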
20.
Lourdes Araujo Juan Julián Merelo 《Soft Computing - A Fusion of Foundations, Methodologies and Applications》2010,14(3):211-227
This paper presents an evolutionary algorithm for modeling the arrival dates in time-stamped data sequences such as newscasts, e-mails, IRC conversations, scientific journal articles or weblog postings. These models are applied to the detection of buzz (i.e. terms that occur with a higher-than-normal frequency), which has attracted a lot of interest in the online world with the increasing number of periodic content producers. That is why in this paper we have used this kind of online sequence to test our system, though it is also valid for other types of event sequences. The algorithm assigns frequencies (number of events per time unit) to time intervals so that it produces an optimal fit to the data. The optimization procedure is a trade-off between accurately fitting the data and avoiding too many frequency changes, thus overcoming the noise inherent in these sequences. This process has traditionally been performed using dynamic programming algorithms, which are limited by memory and efficiency requirements. This limitation can be a problem when dealing with long sequences, and suggests the application of alternative search methods with some degree of uncertainty to achieve tractability, such as the evolutionary algorithm proposed in this paper. This algorithm is able to reach the same solution quality as the classical dynamic programming algorithms, but in a shorter time. We also test different cost functions and propose a new one that yields better fits than the one originally proposed by Kleinberg on real-world data. Finally, several distributions of states for the finite state automata are tested, with the result that a uniform distribution produces much better fits than the geometric distribution also proposed by Kleinberg. We also present a variant of the evolutionary algorithm, which achieves a fast fit of a sequence extended with new data by taking advantage of the fit obtained for the original subsequence.
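The core optimization — assign each time unit one of a discrete set of frequency levels, trading goodness of fit against the number of level changes — can be sketched with a toy evolutionary loop. The cost function, mutation operator and parameters below are illustrative assumptions, not the authors' or Kleinberg's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(42)

# synthetic event counts per time unit: a quiet period, a "buzz", then quiet again
counts = np.concatenate([rng.poisson(2, 40), rng.poisson(12, 15), rng.poisson(2, 45)])
rates = np.array([1.0, 2.0, 4.0, 8.0, 16.0])   # candidate frequency levels (assumed grid)
gamma = 3.0                                     # penalty per frequency change (assumed)

def cost(states):
    lam = rates[states]
    nll = np.sum(lam - counts * np.log(lam))    # Poisson NLL, dropping the constant count! term
    return nll + gamma * np.count_nonzero(np.diff(states))

def mutate(states):
    child = states.copy()
    i, j = sorted(rng.integers(0, len(child) + 1, size=2))
    child[i:j] = rng.integers(len(rates))       # reassign a random segment to one level
    return child

# simple (mu + lambda) evolutionary loop
pop = [rng.integers(len(rates), size=len(counts)) for _ in range(30)]
for _ in range(400):
    children = [mutate(pop[rng.integers(len(pop))]) for _ in range(30)]
    pop = sorted(pop + children, key=cost)[:30]

best = pop[0]
print("best cost:", round(float(cost(best)), 1))
print("time units assigned a high rate (>= 8):", np.flatnonzero(rates[best] >= 8))
```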