Prediction of a stock index remains a challenging task in financial time series forecasting. Random fluctuations in the index make it difficult to predict. Time series prediction is usually based on observations of past trends over a period of time. In general, the curve followed by the time series data has a linear part and a non-linear part. Predicting the linear part from past history is not difficult, but predicting the non-linear segments is. Although various non-linear prediction models are in use, their accuracy does not improve beyond a certain level. It is observed that nearby data positions are more informative, whereas distant data positions mislead the prediction of such non-linear segments. Beyond the existing data positions, exploring a few additional nearby data positions significantly enhances the prediction accuracy of the non-linear segments. In this study, an evolutionary virtual data position (EVDP) exploration method for financial time series is proposed. The model is built using a multilayer perceptron and a genetic algorithm. The performance of the proposed model is compared with three deterministic methods (linear, Lagrange and Taylor interpolation) as well as two stochastic methods (uniform and Gaussian). Ten stock indices from across the globe are used in the experiment, and in the majority of cases the proposed EVDP exploration method performs better. Some stylized facts exhibited by the financial time series are also documented.
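The general idea described in the abstract (a genetic algorithm searching for an informative virtual data position that improves a one-step forecast) can be sketched roughly as follows. This is an illustrative reconstruction under stated assumptions, not the paper's implementation: `forecast` is a hypothetical stand-in for the trained multilayer perceptron (here a simple slope extrapolator), and the GA operators (truncation selection, arithmetic crossover, Gaussian mutation) are assumptions.

```python
import random

def forecast(window):
    """Stand-in one-step predictor (hypothetical; the paper uses an MLP):
    extrapolate the mean slope of the window."""
    slope = (window[-1] - window[0]) / (len(window) - 1)
    return window[-1] + slope

def fitness(virtual, window, target):
    """Forecast error when a candidate virtual data position is appended
    after the last real observation."""
    return abs(forecast(window + [virtual]) - target)

def evolve_virtual_position(window, target, pop_size=30, generations=60,
                            mut_scale=0.5, seed=0):
    """Genetic algorithm searching for the virtual data position that
    minimises the one-step forecast error on a held-out target value."""
    rng = random.Random(seed)
    lo, hi = min(window), max(window)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda v: fitness(v, window, target))
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2                   # arithmetic crossover
            child += rng.gauss(0, mut_scale)      # Gaussian mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda v: fitness(v, window, target))
```

On a non-linear series such as the squares [0, 1, 4, 9] with next value 16, the plain slope extrapolation under-shoots, while the evolved virtual position bends the augmented window toward the true continuation; the MLP in the actual method plays the role of a far more flexible predictor.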
An information security policy is a vital part of any organisation's set of controls, yet most organisations have poorly written policies. This paper examines possible shortfalls in policy writing. Starting from the meaning and etymological roots of the word, the theory behind particular structures is discussed. The case is made for a hierarchy of a single policy and many standards. Structural and linguistic considerations, including a sample statement hierarchy, provide practical suggestions on how to build such a hierarchy. Authorship considerations and some fine details on how to convey the will and purpose of the signatories complete the paper.
Multimedia Tools and Applications - This paper discusses the development of an efficient and automated system for the recognition of facial expressions, which is essentially an application...
The differences in the bioavailability of different drug products are most frequently caused by differences in the dissolution rates of the active ingredient. In the case of magnesium oxide, drug release can be determined directly by a reaction kinetics method based on acid neutralization.
For a more precise study of the factors influencing the kinetic characteristics of the neutralization rates, it is advisable to use homogeneous granule fractions. Before granulation the substance was pretreated with silicone oil. Granulation of the resulting grains, which have a hydrophobic surface, was carried out in an AEROMATIC STREA-I laboratory fluidization apparatus with Eudragit polymer dissolved in isopropyl alcohol.
To determine the acid neutralization kinetics of the granules, the “constant pH” method and the Rossett-Rice test were used.
As a result of granulation, the neutralization rate decreased. The granules can be considered an Eudragit matrix containing the pretreated magnesium oxide in embedded form. During the chemical reaction, the resulting salt (magnesium chloride) leaves the surface of the unreacted magnesium oxide unless it reacts chemically with the polymer. Meanwhile, the residual matrix forms a mesh, which increases the viscosity of the solution and the thickness of the diffusion layer. The dissolution rate decreases in both cases.
Under identical conditions, the kinetic values of the neutralization vary by several orders of magnitude depending on the method used. In this way, different drug systems can be created whose reaction capacity is tailored to the intended physiological purpose.
The sponge Tethya lyncurium from the Northern Adriatic was used as the experimental species. A method is outlined for the preparation of DNA which yields a highly purified DNA with a double-strand (ds) molecular weight of 25 M-dalton between single-strand (ss) breaks; when suitably damaged, it can be cut opposite the ss-breaks with nuclease S1. The molecular weights of the resulting ds-DNA pieces and their distribution were evaluated from electron microscope photographs. Sponges exposed to benzo[a]pyrene (BaP) in the dark incorporate BaP derivatives (BaPD) only in small amounts, if at all. In the presence of light, however, derivatization to BaP derivatives enables effective coupling to occur, as shown previously (R. K. Zahn et al., 1981). Sponges were exposed to radiolabeled BaP in the presence of light, and both the coupling of BaPD to the DNA and the induction of ss-breaks were measured. Light-mediated coupling is concentration dependent from 0.01 to 20 ppb BaP, with a correlation coefficient of r = 0.84. Under conditions of possible repair, ss-breaks disappear completely from sponge DNA in the course of three weeks, while a substantial fraction of the BaP derivatives persists. Double-label experiments show that substantial DNA synthesis occurs during this time. Pollution causes a decrease in the molecular weight of unnicked DNA; re-incubation in clean water causes an increase. A DNA species of 24 M-dalton seems to play a critical role: if its percentage in the DNA population drops below a critical level, recovery is no longer possible. DNA damage by PAH and its repair in sponges thus seem to differ from those of most eukaryotes.
Frameworks are useful guides to the thought processes of information security professionals building their solutions. Frameworks are not solutions, only guides; they ensure that nothing is left out and that the work is done thoroughly and well. Unfortunately, the quality of frameworks is not consistent, and following a framework that does not fit the business requirements can create false assurance. This paper discusses a methodology for building a fitting framework. Asking pertinent questions forms the basis of such a framework; the questions, and the process of asking them, determine the quality of the solution. A set of example questions is described, with an explanation of how they define the areas necessary to enable sound solution development. Some common errors and misconceptions are highlighted, together with pointers to how they can be avoided or overcome. The methodology for developing the areas identified by the questions completes the paper.