Similar Documents
20 similar documents found (search time: 500 ms)
1.
2.
Entropy analysis and wavelet transforms are used to study the Southern Oscillation Index (SOI) and Multivariate ENSO Index (MEI) for the El Niño–Southern Oscillation. Trends toward long decreasing symbolic chains responsible for La Niña, and interrupted increasing chains for El Niño, are found in the MEI persistence series. These are all located in regions where the wavelet transforms of both indexes reveal the existence of mid-range correlations. The SOI and MEI indexes are mutually correlated in time in a non-trivial manner.
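
As a rough sketch of this kind of analysis (not the authors' code), one can symbolize the month-to-month moves of an index, measure the lengths of the resulting increasing/decreasing chains, and scan for mid-range correlations with a continuous wavelet transform. The series mei below is synthetic, and the PyWavelets package (pywt) is assumed to be available:

import numpy as np
import pywt

# Synthetic stand-in for a monthly index series (the real analysis uses MEI/SOI)
rng = np.random.default_rng(0)
mei = 0.1 * np.cumsum(rng.standard_normal(600))

# Symbolize month-to-month moves: +1 for an increase, -1 for a decrease
symbols = np.sign(np.diff(mei)).astype(int)

# Lengths of maximal constant-sign runs ("chains" in the persistence series)
change = np.flatnonzero(np.diff(symbols) != 0)
runs = np.diff(change)

# Shannon entropy of the chain-length distribution
counts = np.bincount(runs)
p = counts[counts > 0] / counts.sum()
H = -np.sum(p * np.log2(p))

# Continuous wavelet transform (Morlet) to locate mid-range correlations
coefs, freqs = pywt.cwt(mei, scales=np.arange(1, 64), wavelet='morl')
print(f"chain-length entropy: {H:.2f} bits")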

3.
The three-parameter Generalized Gamma function solution of a recent maximum entropy formalism (MEF) formulation used to derive liquid spray drop-size distributions is applied to sprays resulting from three different atomization processes. The objectives of these applications are to determine the sign of the parameters for which this function provides a more reliable fit, and to further understand the parameter-stability problem reported elsewhere. It is found that the lack of stability of the parameters is related to a characteristic feature of the mathematical function and appears for a series of spray drop-size distributions with constant shape. For each situation analyzed in the present study, the Generalized Gamma function provides a very good fit with parameters that are either constant or correlated with the working conditions. As far as the sign of the parameters is concerned, the results show that the best formulation depends on the spray and that it is impossible to know, a priori, which parameter sign will yield the best fit. Finally, for one situation, it is found that the Generalized Gamma function allows extrapolation of drop sizes outside the measured range. All of the results converge to the conclusion that the three-parameter Generalized Gamma function, which is identical to the well-known Nukiyama–Tanasawa distribution, accumulates valuable attributes for representing liquid spray drop-size distributions.
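
For reference, one standard way to write a three-parameter Generalized Gamma drop-size distribution is (the symbols \alpha, \beta, \gamma here are generic, not necessarily the paper's notation):

f(D) = \frac{\gamma}{\beta^{\alpha}\,\Gamma(\alpha/\gamma)}\, D^{\alpha-1} \exp\left[-(D/\beta)^{\gamma}\right], \qquad D > 0,

which reduces to the ordinary Gamma distribution for \gamma = 1 and matches the Nukiyama–Tanasawa form f(D) \propto D^{p} e^{-b D^{q}} up to normalization.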

4.
Robert H. Swendsen, Physica A 389(15) (2010) 2898–2901
Although Ludwig Boltzmann was one of the primary founders of the field of statistical mechanics, very few contemporary physicists have actually read his papers. As a result, some of his ideas have been distorted or even lost over the course of time. In this paper, I will discuss some of the reasons for the neglect of Boltzmann’s writings and try to reintroduce one of his most important ideas, the definition of entropy.
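
The definition in question is the one later inscribed on Boltzmann's tombstone, in Planck's notation:

S = k_B \ln W,

where W is the number of microstates compatible with a given macrostate and k_B is Boltzmann's constant.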

5.
Two categories of life are currently recognized, chemosynthetic and photosynthetic, indicating their principal free-energy resource as either chemicals or electromagnetic radiation. Building on recent developments in thermodynamics, we posit a third category of life, thermosynthetic life (TL), which relies on environmental heat rather than traditional free-energy sources. Since thermal energy is more abundant than chemicals or light in many settings, thermosynthesis offers compelling evolutionary possibilities for new life forms. Based on variants of standard cellular machinery, a physical model is proposed for the conversion of thermal energy into biochemical work. Conditions favorable to thermosynthetic life and prospects for its discovery are assessed. Terrestrially, deep-subsurface unicellular anaerobic superthermophiles are deduced to be likely TL candidates.
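
For orientation (this is the textbook constraint, not the paper's model): any cyclic converter of heat into work operating between a hot reservoir at T_h and a cold reservoir at T_c is bounded by the Carnot limit,

W \le Q_h \left(1 - \frac{T_c}{T_h}\right),

so a thermosynthetic organism in a nearly isothermal environment must either exploit a real temperature gradient or rely on the non-standard thermodynamic developments the authors invoke.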

6.
7.
The applicability of the collective coordinate method (saddle-point approximation) for large-N planar models is discussed. Some unstated assumptions are clarified. Statements that Wilson loops form a complete set of gauge invariant operators are also examined and a set of generalized algebraic Mandelstam relations among Wilson loops is presented. The inclusion of loops that wind around themselves and cross many times, as independent variables, is stressed.
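
The simplest concrete example of such algebraic relations (illustrative only; the paper's generalized relations cover larger N and self-intersecting loops) arises for SU(2), where Cayley–Hamilton gives U + U^{-1} = (\operatorname{tr} U)\,\mathbf{1}, and hence for any two loop holonomies U and V

\operatorname{tr}(U)\,\operatorname{tr}(V) = \operatorname{tr}(UV) + \operatorname{tr}(U^{-1}V).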

8.
9.
In this paper, an entropy-consistent flux is developed, continuing from the work of the previous paper. To achieve entropy consistency, second- and third-order differential terms are added to the entropy-conservative flux. This new flux function is tested on several one-dimensional problems and compared with the original Roe flux. The new flux function exactly preserves the stationary contact discontinuity and does not capture the unphysical rarefaction shock. For steady shock problems, the new flux predicts a slightly more diffused profile, whereas for unsteady cases the captured shock is very similar to those produced by the Roe flux. Shock stability is also studied in one dimension. Unlike the original Roe flux, the new flux is completely stable, making it a candidate for combating multidimensional shock instability, particularly the carbuncle phenomenon.
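
To illustrate the underlying construction (an entropy-conservative baseline plus controlled dissipation) on the simplest possible case, here is a sketch for the 1-D Burgers equation u_t + (u^2/2)_x = 0. The paper works with the Euler equations, so this is an analogy, not its flux:

import numpy as np

def ec_flux(u_left, u_right):
    # Tadmor's entropy-conservative flux for Burgers, f(u) = u^2 / 2:
    # satisfies (v_R - v_L) f* = psi_R - psi_L for the entropy U = u^2 / 2
    return (u_left**2 + u_left * u_right + u_right**2) / 6.0

def es_flux(u_left, u_right):
    # Entropy-stable flux: EC baseline minus Roe-type dissipation
    # proportional to the jump in the entropy variable (here just u)
    a = 0.5 * (u_left + u_right)   # Roe-averaged wave speed for Burgers
    return ec_flux(u_left, u_right) - 0.5 * abs(a) * (u_right - u_left)

# Example: interface flux across a right-moving shock, u_L = 2, u_R = 0
print(es_flux(2.0, 0.0))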

10.
11.
A perfect diffuser would place 100% of the light leaving the projector in that small region of space where there will be audience eyes to observe it. It would not allow light from sources other than the projector to reach the eyes from the screen. The screen should be affordably priced and cosmetically unremarkable, e.g. seamless. The image seen by any observer should be equally bright over the whole screen. I discuss a way to approximate the perfect projection screen using kinoform diffusers, a Fresnel lens and a mirrored surface.
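
A back-of-envelope version of the payoff (my numbers, not the paper's): if a screen redirected all projector light uniformly into just the audience's solid angle Omega, its on-axis gain over a matte (Lambertian) screen, whose peak intensity is total flux divided by pi, would be roughly pi/Omega.

import numpy as np

# Hypothetical audience window as seen from the screen:
# +/- 30 degrees horizontally, +/- 10 degrees vertically
half_h = np.radians(30.0)
half_v = np.radians(10.0)

# Solid angle of a rectangular cone with those half-angles (steradians)
omega = 4.0 * np.arcsin(np.sin(half_h) * np.sin(half_v))

# Ideal gain over a Lambertian screen (same total flux, peak I = flux / pi)
gain = np.pi / omega
print(f"audience solid angle ~ {omega:.2f} sr, ideal gain ~ {gain:.0f}x")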

12.
13.
Two of the present authors have put forward a projective-geometry-based model of rational trading that implies a model for subjective demand/supply profiles if one considers the closing of a position as a random process. Here we present an analysis of subjectivity in such trading models. In our model, the trader attains the maximal profit intensity when the probability of a transaction is ∼0.5853. We also present a comparison with the model based on the Maximum Entropy Principle. To the best of our knowledge, this is one of the first analyses to exhibit a concrete situation in which the trader's optimal profit is attained within a class of price-negotiating algorithms (strategies) that produce non-monotonic demand (supply) curves of the Rest of the World (a collective opponent). Our model suggests that there might be a new class of rational trader strategies that (almost) neglects the supply-demand profile of the market. This class emerges when one tries to minimize the information that strategies reveal.

14.
Approximate and Sample Entropy are two widely used techniques for measuring system complexity or regularity, based on chosen parameters such as the pattern length m and the tolerance r. In this paper, we investigate how different values of the time-delay parameter τ can be used in conjunction with standard values of m and r in the computation of Approximate and Sample Entropy. The results show that for time series generated by nonlinear dynamics with long-range correlation, a time delay equal to the first zero crossing or minimum of the autocorrelation function can provide additional insight into the characteristics of the time series that may be useful in comparative analysis. With unity delay, we demonstrate that Approximate and Sample Entropy may be measuring only the (linear) autocorrelation properties of the signal, which are highly invariant under surrogate-data generation methods. Hence, when this occurs, the complexity measures of the surrogate and original data are not statistically different.
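
A minimal Sample Entropy implementation with the delay parameter made explicit (a common simplified variant; the function and parameter names are mine, and tau=1 recovers the standard definition):

import numpy as np

def sample_entropy(x, m=2, r=None, tau=1):
    # SampEn = -ln(A/B), where B counts pairs of length-m templates within
    # tolerance r (Chebyshev distance) and A counts the same at length m+1.
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()          # conventional default tolerance

    def match_count(dim):
        n = len(x) - (dim - 1) * tau
        # Delay embedding: row i is (x[i], x[i+tau], ..., x[i+(dim-1)*tau])
        emb = np.column_stack([x[j * tau : j * tau + n] for j in range(dim)])
        count = 0
        for i in range(n - 1):     # pairs i < j, so self-matches are excluded
            d = np.max(np.abs(emb[i + 1:] - emb[i]), axis=1)
            count += np.count_nonzero(d <= r)
        return count

    B = match_count(m)
    A = match_count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

# Example: tau near the first zero crossing of the autocorrelation (~16 here)
rng = np.random.default_rng(1)
series = np.sin(0.1 * np.arange(3000)) + 0.1 * rng.standard_normal(3000)
print(sample_entropy(series, m=2, tau=1), sample_entropy(series, m=2, tau=16))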

15.
It is shown that direct photons provide a leading-twist mechanism for diffractive jet production in which the jets carry away all of the momentum lost by the proton. Two-photon processes are thus expected to asymptotically dominate “super-hard” pomeron events in ep collisions. We report the expected rates for these events under recent ZEUS and H1 data cuts. We also estimate the direct-photon contribution to the “super-hard” pomeron events observed by the CERN UA8 group in p p̄ collisions. It is again argued that direct photons are the leading mechanism for these events. We find that direct photons are an appreciable fraction of the events seen by UA8.

16.
17.
One of the first publications by the ATLAS collaboration using data from the Large Hadron Collider at CERN dealt with the measurement of the production cross section of the W boson. The collaboration “rediscovered” the W in order to, among other things, check whether the detector and analysis methods were working well. Originally, the discovery of the W had been announced in 1983 by the CERN management, referring mainly to work done by its UA1 collaboration. In both the discovery and the “rediscovery”, the convergence of two distinct sets of criteria of data selection was an important concern of the researchers. In 1983, this concern figured prominently in the published paper whereas in 2010 it was mainly dealt with inside the collaboration.

18.
19.
20.