Similar Documents
20 similar documents found.
1.
We introduce a technique of time series analysis, potential forecasting, which is based on dynamical propagation of the probability density of a time series. We employ polynomial coefficients of the orthogonal approximation of the empirical probability distribution and extrapolate them in order to forecast the future probability distribution of the data. The method is tested on artificial data, used for hindcasting observed climate data, and then applied to forecast Arctic sea-ice time series. The proposed methodology completes a framework for ‘potential analysis’ of tipping points, which altogether serves to anticipate, detect and forecast nonlinear changes, including bifurcations, using several independent techniques of time series analysis. Although applied here to climatological series, the method is very general and can be used to forecast dynamics in time series of any origin.

2.
We consider periodic and chaotic dynamics of discrete nonlinear maps in the presence of dynamical noise. We show that dynamical noise corrupting the dynamics of a nonlinear map may be treated as a measurement “pseudonoise” with a distribution determined by the Jacobian of the map. The formula for the distribution of the measurement “pseudonoise” for one-dimensional quadratic maps is also obtained in explicit form. We expect that our results apply to an arbitrary distribution of low-level dynamical noise and hope that they could help to find a universal method of discriminating dynamical from measurement noise.

3.
Factor analysis is a well known statistical method for describing the variability among observed variables in terms of a smaller number of unobserved latent variables called factors. When dealing with multivariate time series, the temporal correlation structure of the data may be modeled by including correlations in the latent factors, but a crucial choice is the covariance function to be implemented. We show that analyzing multivariate time series in terms of latent Gaussian processes, which are mutually independent but each characterized by exponentially decaying temporal correlations, leads to an efficient implementation of the expectation–maximization algorithm for the maximum likelihood estimation of parameters, thanks to the properties of block-tridiagonal matrices. The proposed approach resolves an ambiguity known as the identifiability problem, which renders the solution of factor analysis determined only up to an orthogonal transformation. Samples with just two temporal points are sufficient for the parameter estimation: hence the proposed approach may be applied even in the absence of prior information about the correlation structure of the latent variables, by fitting the model to pairs of points with varying time delay. Our modeling allows one to make predictions of the future values of the time series, and we illustrate the method by applying it to an analysis of published gene expression data from HeLa cell cultures.

4.
Change detection is a crucial subject in the study of dynamical systems. Suitable methods exist for detecting changes in linear systems, and some exist for nonlinear systems, but methods for chaotic systems are lacking. This paper presents change detection techniques for dynamical systems with chaos. We consider dynamical systems described by time series originating from ordinary differential equations and from real-world phenomena. We assume that the change parameters are unknown and that the change can be either slight or drastic. The detection process is based on characteristic invariants of the dynamical system: changes in the invariants' values are the indicators of change. We propose a detection method based on the fractal dimension and the recurrence plot, with automatic detection provided by control charts. The methods were checked using small data sets and stream data.
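The recurrence-plot invariant used in abstracts like the one above can be illustrated with a minimal sketch. This is not the authors' implementation: the scalar (non-embedded) state, the threshold `eps`, and the use of the recurrence rate as the tracked invariant are illustrative assumptions.

```python
import numpy as np

def recurrence_matrix(x, eps):
    # R[i, j] = 1 when states i and j are closer than the threshold eps
    x = np.asarray(x, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])
    return (dist <= eps).astype(int)

def recurrence_rate(x, eps):
    # Fraction of recurrent pairs, excluding the trivial diagonal
    R = recurrence_matrix(x, eps)
    n = len(R)
    return (R.sum() - n) / (n * (n - 1))
```

A drift in the recurrence rate computed over sliding windows could then serve as the change indicator monitored by a control chart.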

5.
A new approach based on Wasserstein distances, which are the numerical costs of an optimal transportation problem, allows us to analyze nonlinear phenomena in a robust manner. The long-term behavior is reconstructed from time series, resulting in a probability distribution over phase space. Each pair of probability distributions is then assigned a numerical distance that quantifies the differences in their dynamical properties. From the totality of these distances a low-dimensional representation in a Euclidean space is derived, in which the time series can be classified and statistically analyzed. This representation shows the functional relationships between the dynamical systems under study. It allows us to assess synchronization properties and also offers a new way of performing numerical bifurcation analysis. The statistical techniques for this distance-based analysis of dynamical systems are presented, filling a gap in the literature, and their application is discussed for a few example datasets arising in physiology and neuroscience, and for the well-known Hénon system.
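For scalar observables, the W1 (Wasserstein-1) cost between two empirical distributions reduces to pairing the sorted samples. A minimal sketch, with the equal-sample-size restriction as a simplifying assumption not taken from the paper:

```python
import numpy as np

def wasserstein_1d(a, b):
    # On the real line, optimal transport between two equal-size
    # empirical measures simply matches the sorted samples.
    a = np.sort(np.asarray(a, dtype=float))
    b = np.sort(np.asarray(b, dtype=float))
    if len(a) != len(b):
        raise ValueError("equal sample sizes assumed in this sketch")
    return float(np.mean(np.abs(a - b)))
```

Computing this distance for every pair of reconstructed distributions yields the distance matrix from which a low-dimensional Euclidean representation (e.g. via multidimensional scaling) can be derived.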

6.
This study explores temporal changes in the dynamics of the Holocene ENSO proxy record of the Laguna Pallcacocha sedimentary data using two entropy quantifiers. In particular, we analyze the possible connections between changes in entropy and epochs of rapid climate change (RCC). Our results indicate that the dynamics of the ENSO proxy record during the RCC interval 9000-8000 BP displays very low entropy (high predictability) that is remarkably different from that of the other RCCs of the Holocene. Both entropy quantifiers point to the existence of cycles with a period close to 2000 years during the mid-to-late Holocene. Within these cycles, we find a tendency for entropy to increase (predictability to decrease) during the two longer RCC periods (6000-5000 and 3500-2500 BP), which might be associated with the reported increased aridity of the low tropics.

7.
The concepts of symbolic dynamics, entropy and complexity measures have been widely utilized for the analysis of measured time series. However, little attention has been devoted to investigating the effects of choosing different partitions to obtain the coarse-grained symbolic sequences. Because the theoretical concept of generating partitions mostly fails in the case of empirical data, one commonly introduces a homogeneous partition which ensures roughly equidistributed symbols. We show that such a choice may lead to spurious results for the estimated entropy and will not fully reveal the randomness of the sequence. Received 1st September 2000
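The homogeneous (equiprobable) partition discussed above can be sketched using empirical quantiles, followed by a block (word) entropy estimate. The partition size `k` and word length `m` below are illustrative choices, not values from the paper:

```python
import numpy as np
from collections import Counter

def symbolize(x, k):
    # Cut the data range at empirical quantiles so that each of the
    # k symbols is roughly equally populated (homogeneous partition).
    x = np.asarray(x, dtype=float)
    cuts = np.quantile(x, np.linspace(0, 1, k + 1)[1:-1])
    return np.searchsorted(cuts, x)

def block_entropy(symbols, m):
    # Shannon entropy (in bits) of overlapping words of length m
    words = Counter(tuple(symbols[i:i + m])
                    for i in range(len(symbols) - m + 1))
    p = np.array(list(words.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())
```

The paper's caveat applies directly: entropies estimated from such a partition can differ from those obtained with a generating partition, so the result should not be read as the full randomness of the sequence.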

8.
M. Vahabi, L. Hedayatifar, Physica A 389(9) (2010) 1915-1921
Problems regarding pollution must be handled carefully and precisely, which requires a well-organized and accurate program. In this article, we suggest that different programs should be used for different time scales (short-term or long-term) to manage and control pollution; we call these multiscale programs. We follow the level crossing (LC) method to introduce an optimum program for different time scales. The analysis uses the historical pollution data available for Tehran.
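The level-crossing quantity at the heart of the LC method can be sketched as a simple upcrossing count; the level value is an illustrative parameter, and the waiting-time estimate below is a simplified stand-in for the paper's analysis:

```python
def upcrossings(x, level):
    # Count upward crossings: x passes from below `level` to at/above it
    return sum(1 for a, b in zip(x, x[1:]) if a < level <= b)

def mean_waiting_time(x, level):
    # Average spacing between successive upcrossings, in samples;
    # infinite when the level is never upcrossed
    nu = upcrossings(x, level)
    return len(x) / nu if nu else float("inf")
```

Scanning `mean_waiting_time` over a range of pollution levels gives the time-scale-dependent picture on which a multiscale program could be based.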

9.
The correlation dimension D2 and the correlation entropy K2 are both important quantifiers in nonlinear time series analysis. However, D2 has been used more commonly than K2 as a discriminating measure, partly because D2 is a static measure that can be easily evaluated from a time series. In many cases, however, especially those involving coloured noise, K2 is regarded as the more useful measure. Here we present an efficient algorithmic scheme to compute K2 directly from time series data and show that K2 can be a more effective measure than D2 for analysing practical time series involving coloured noise.
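The standard route to K2 via correlation sums (a textbook scheme, not necessarily the authors' algorithm) estimates K2 from the decay of C_m(r) with embedding dimension m. The embedding delay, radius and dimension below are illustrative:

```python
import numpy as np

def corr_sum(x, m, r, tau=1):
    # Correlation sum C_m(r): fraction of delay-embedded point pairs
    # closer than r in the maximum (Chebyshev) norm.
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    emb = np.array([x[i:i + (m - 1) * tau + 1:tau] for i in range(n)])
    d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
    iu = np.triu_indices(n, k=1)
    return float(np.mean(d[iu] < r))

def k2_estimate(x, m, r, tau=1):
    # K2 ~ (1/tau) * ln(C_m(r) / C_{m+1}(r)) for small r and large m
    return float(np.log(corr_sum(x, m, r, tau)
                        / corr_sum(x, m + 1, r, tau)) / tau)
```

For a regular (e.g. constant or periodic) signal the correlation sums do not decay with m and the estimate is zero, while for chaotic or noisy data it is positive.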

10.
The upper and lower bounds of the linear variance decay (LVD) dimension density are analytically deduced using multivariate series with uncorrelated and perfectly correlated component series. The normalized LVD dimension density, δ_norm^LVD, is then introduced. In order to measure the complexity of a scalar series with δ_norm^LVD, a pseudo-multivariate series is constructed from the scalar time series using time-delay embedding, and δ_norm^LVD is used to characterize its complexity. The results from model systems and from fMRI data of subjects with anxiety reveal that this method can be used to analyze short and noisy time series.

11.
刘杰, 石书婷, 赵军产, Chinese Physics B 22(1) (2013) 010505
The three most widely used methods for reconstructing the underlying time series from the recurrence plots (RPs) of a dynamical system are compared with each other in this paper. We reconstruct a toy series, a periodic series, a random series, and a chaotic series to compare the effectiveness of these methods in terms of signal correlation analysis. Applying the most effective algorithm to the typical chaotic Lorenz system verifies its correctness. It is verified that, based on unthresholded RPs, one can reconstruct the original attractor by choosing different RP thresholds based on the Hirata algorithm. It is shown that, in real applications, it is possible to reconstruct the underlying dynamics using relatively little information from observations of real dynamical systems. Moreover, rules for choosing the threshold in the algorithm are also suggested.

12.
We use detrended fluctuation analysis (DFA), detrended cross-correlation analysis (DCCA), and magnitude and sign decomposition analysis to study the fluctuations in turbulent time series and to probe long-term nonlinear levels of complexity in weakly and highly turbulent flows. The DFA results indicate a time-scaling region in the fluctuation function, segregating regimes with different scaling exponents; we argue that this region is related to the inertial range in turbulent flows. The DCCA exponent implies the presence of power-law cross-correlations. In addition, we conclude that the series is multifractal at high Reynolds numbers in the inertial range. Further, we find that turbulent time series exhibit complex features through their magnitude and sign scaling exponents.
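A minimal DFA-1 sketch of the fluctuation function F(s); the scales and series below are illustrative, and the paper's DCCA and magnitude/sign decomposition are not reproduced here:

```python
import numpy as np

def dfa(x, scales):
    # DFA-1: integrate the series, split into windows of size s,
    # remove a local linear trend, and collect the RMS fluctuation F(s).
    y = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))
    F = []
    for s in scales:
        nseg = len(y) // s
        t = np.arange(s)
        var = []
        for i in range(nseg):
            seg = y[i * s:(i + 1) * s]
            coef = np.polyfit(t, seg, 1)
            var.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(var)))
    return np.array(F)

def dfa_exponent(x, scales):
    # Slope of log F(s) vs log s: ~0.5 for white noise,
    # ~1.5 for its running sum (a random walk)
    F = dfa(x, scales)
    return float(np.polyfit(np.log(scales), np.log(F), 1)[0])
```

A change of slope between scale regimes, of the kind the abstract describes for the inertial range, shows up as a kink in the log F(s) vs log s plot.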

13.
The ambiguity that can exist, for short datasets, between the observational power spectra of dynamical fractals and low-order linear memory processes is demonstrated and explained. It is argued that it would be broadly useful to have a highly practical rule of thumb for assessing whether a data record is sufficiently long to permit distinguishing the two types of processes, and, if it is not, to produce an approximate estimate of the amount of additional data that would be required to do so. Such an expression is developed using the AR(1) process as a loose benchmark. Various aspects of the technique are successfully tested using synthetic time series generated by a range of prescribed models, and its application and relevance to observational datasets is then demonstrated using examples from mathematical ecology (wild steelhead population size), geophysics (river flow volume), and econophysics (stock price volatility).
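The AR(1) benchmark is easy to sketch. Its power spectrum is a Lorentzian, S(f) proportional to 1 / (1 + phi^2 - 2*phi*cos(2*pi*f)), which flattens at low frequencies, whereas a dynamical fractal's spectrum follows a power law down to the lowest frequencies; on a short record the Lorentzian shoulder may not be resolved, which is the ambiguity the paper quantifies. Parameter values below are illustrative:

```python
import numpy as np

def ar1(n, phi, sigma=1.0, seed=0):
    # x_t = phi * x_{t-1} + sigma * eps_t, with eps_t ~ N(0, 1)
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(n)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + sigma * eps[t]
    return x

def lag1_autocorr(x):
    # Sample lag-1 autocorrelation; estimates phi for an AR(1) process
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))
```

Fitting phi to a data record and comparing the fitted Lorentzian with a fitted power law is the kind of comparison the rule of thumb formalizes.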

14.
We present an independent test of the recently developed methods of potential analysis and degenerate fingerprinting, which aim, respectively, to identify the number of states in a system and to forecast bifurcations. Several samples of modelled data of unknown origin were provided by one author, and the methods were used by the two other authors to investigate their properties. The main idea of the test was to investigate whether the techniques are capable of identifying the character of data of unknown origin, including potentiality, possible transitions, and bifurcations. Based on the results of the analysis, models were proposed that simulated data equivalent to the test samples. The results obtained were compared with the initial simulations for a critical evaluation of the performance of the methods. In most cases, the methods successfully detected the number of states in a system and the occurrence of transitions between states. The derived models were able to reproduce the test data accurately. However, noise-induced abrupt transitions between existing states cannot be forecast, owing to the lack of any change in the underlying potential.

15.
Nowadays we are often faced with huge databases resulting from the rapid growth of data storage technologies, and this is particularly true of music databases. In this context, it is essential to have techniques and tools able to discriminate properties within these massive sets. In this work, we report a statistical analysis of more than ten thousand songs aiming to obtain a complexity hierarchy. Our approach is based on estimating the permutation entropy combined with an intensive complexity measure, building up the complexity-entropy causality plane. The results indicate that this representation space is very promising for discriminating songs and for relative quantitative comparison among songs. Additionally, we believe that the method reported here may be applied in practical situations, since it is simple, robust, and has a fast numerical implementation.
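Permutation entropy, one axis of the causality plane, is simple to sketch; the order m and delay below are illustrative choices, and the intensive complexity measure (the plane's second axis) is omitted:

```python
import math
from collections import Counter
import numpy as np

def permutation_entropy(x, m=3, tau=1):
    # Bandt-Pompe: histogram the ordinal patterns of m-point windows,
    # then return the Shannon entropy normalized to [0, 1] by log(m!).
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    patterns = Counter(tuple(np.argsort(x[i:i + (m - 1) * tau + 1:tau]))
                       for i in range(n))
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    H = float(-(p * np.log(p)).sum())
    return H / math.log(math.factorial(m))
```

A perfectly ordered signal gives a value near 0 and white noise a value near 1, which is what places songs at different positions along the entropy axis.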

16.
Earthquakes (EQs) are large-scale fracture phenomena in the Earth’s heterogeneous crust. Fracture-induced physical fields allow real-time monitoring of damage evolution in materials during mechanical loading. Electromagnetic (EM) emissions in a wide frequency spectrum ranging from kHz to MHz are produced by opening cracks; these can be considered the so-called precursors of general fracture. We emphasize that the MHz radiation appears earlier than the kHz radiation on both laboratory and geophysical scales. An important challenge in this field of research is to distinguish characteristic epochs in the evolution of precursory EM activity and identify them with the equivalent last stages in the EQ preparation process. Recently, we proposed the following two-stage model. (i) The first epoch, which includes the initial emergent MHz EM emission, is thought to be due to the fracture of a highly heterogeneous system that surrounds a family of large high-strength asperities distributed along the activated fault sustaining the system. (ii) The second epoch, which includes the emergent strong impulsive kHz EM radiation, is due to the fracture of the asperities themselves. A catastrophic EQ of magnitude Mw = 6.3 occurred on 6 April 2009 in central Italy; the majority of the damage occurred in the city of L’Aquila. Clear kHz–MHz EM anomalies had been detected prior to the L’Aquila EQ. Here, we investigate the seismogenic origin of the MHz part of the anomalies. The analysis, in terms of intermittent dynamics of critical fluctuations, reveals that the candidate EM precursor (i) can be described as analogous to a thermal continuous phase transition and (ii) has anti-persistent behavior. These features suggest that this candidate precursor was triggered by microfractures in the highly disordered system that surrounded the backbone of asperities of the activated fault. A criterion for underlying strong critical behavior is introduced.
In this field of research, reproducibility of results is desirable, and it is best achieved by analyzing a number of precursory MHz EM emissions. We refer to previous studies of precursory MHz EM activities associated with nine significant EQs that have occurred in Greece in recent years. We conclude that all the MHz EM precursors studied, including the present one, can be described as analogous to a continuous second-order phase transition with strong criticality and anti-persistent behavior.

17.
M. Vahabi, G.R. Jafari, Physica A 388(18) (2009) 3859-3865
Privatization, a political as well as an economic policy, is generally defined as the transfer of a property, or the responsibility for it, from the public to the private sector. But privatization is not merely a transfer of ownership; the efficiency of the market should also be considered. A successful privatization program induces better profitability and efficiency, higher output, more investment, etc. The main method of privatization is introducing new stocks to the market to motivate competition. However, for a successful privatization, the capability of a market to absorb the new stock should also be considered; without attention to this aspect, privatization through the introduction of new stocks may reduce market efficiency. We study, based on complexity theory and in particular the concept of level crossing, the effect of the stages of development, activity, risk, and the waiting times for special events on privatization.

18.
We derive the nonlinear equations satisfied by the coefficients of linear combinations that maximize their skewness when their variance is constrained to take a specific value. In order to numerically solve these nonlinear equations we develop a gradient-type flow that preserves the constraint. In combination with the Karhunen-Loève decomposition this leads to a set of orthogonal modes with maximal skewness. For illustration purposes we apply these techniques to atmospheric data; in this case the maximal-skewness modes correspond to strongly localized atmospheric flows. We have also checked that the results are statistically significant in spite of the finite length of the data. We show how these ideas can be extended, for example to maximal-flatness modes.

19.
A power law classification scheme (PLCS) for time series correlations is proposed. It is shown that PLCS can classify nonlinear correlations and measure their stability. PLCS has been applied to the gross domestic product (GDP) per capita of the G20 members, and their correlations are analysed. The method not only recognises linear correlations properly, but also identifies converging time series and distinguishes nonlinear correlations. PLCS is capable of crash recognition, as shown in the example of Argentina. Finally, the strength of correlations and the stability of correlation matrices are used to construct a minimum spanning tree (MST). The results are compared with those based on the ultrametric distance (UD). Comparing the MST structures obtained from UD and PLCS indicates that the latter is more complicated but better fits the expected economic relations within the G20.
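The MST construction from a correlation-based distance can be sketched as follows. The Mantegna distance d = sqrt(2*(1 - rho)) is the common choice in the econophysics literature (an assumption here, not necessarily the paper's PLCS-based distance), and Prim's algorithm builds the tree:

```python
import numpy as np

def corr_distance(X):
    # Rows of X are time series; d_ij = sqrt(2 * (1 - rho_ij)),
    # so perfectly correlated series are at distance 0.
    rho = np.corrcoef(X)
    return np.sqrt(np.clip(2.0 * (1.0 - rho), 0.0, None))

def mst_edges(D):
    # Prim's algorithm on a dense symmetric distance matrix
    n = len(D)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and (best is None or D[i, j] < best[2]):
                    best = (i, j, D[i, j])
        edges.append((best[0], best[1]))
        in_tree.add(best[1])
    return edges
```

Replacing `corr_distance` with a PLCS-derived distance would yield the alternative tree the paper compares against the UD-based one.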

20.
Recently, the visibility graph (VG) algorithm was proposed for mapping a time series to a graph, so that the complexity and fractality of the time series can be studied through the complexity of its graph. The visibility graph algorithm converts a fractal time series into a scale-free graph. VG has been used to investigate fractality in the dynamic behavior of both artificial and natural complex systems. However, the robustness and performance of the power of scale-freeness of VG (PSVG) as a method for measuring fractality have not been investigated. Since noise is unavoidable in real-life time series, the robustness of a fractality measure is of paramount importance. To improve the accuracy and robustness to noise of PSVG for measuring the fractality of biological time series, an improved PSVG is presented in this paper. The proposed method is evaluated using two examples: a synthetic benchmark time series and a complicated real-life electroencephalogram (EEG)-based diagnostic problem, namely distinguishing autistic children from non-autistic children. It is shown that the proposed improved PSVG is less sensitive to noise and therefore more robust than PSVG. Further, it is shown that using the improved PSVG in the wavelet-chaos neural network model of Adeli and co-workers, in place of the Katz fractal dimension, results in a more accurate diagnosis of autism, a complicated neurological and psychiatric disorder.
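The natural visibility criterion is easy to sketch: nodes i and j are connected when the straight line between (i, x_i) and (j, x_j) passes above every intermediate point. A minimal O(n^2) version of the basic VG (the improved PSVG of the paper is not reproduced here):

```python
def visibility_edges(x):
    # Natural visibility graph of a scalar series
    n = len(x)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            clear = all(
                x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if clear:
                edges.append((i, j))
    return edges

def degrees(x):
    # Degree sequence of the visibility graph; PSVG fits a power law
    # to the resulting degree distribution
    deg = [0] * len(x)
    for i, j in visibility_edges(x):
        deg[i] += 1
        deg[j] += 1
    return deg
```

For a fractal series the degree distribution follows a power law, and the fitted exponent is the scale-freeness quantity that PSVG measures.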


