Similar Documents
Found 20 similar documents (search time: 15 ms)
1.
This paper introduces a new framework for human contour tracking and action sequence recognition. Given a gallery of labeled human contour sequences, we define each contour as a “word” and encode all of them into a contour dictionary. This dictionary is then used to translate the video. To this end, a contour graph is constructed by connecting all neighboring contours, and the motion in a video is viewed as an instance of a random walk on this graph. As a result, we avoid explicitly parameterizing the contour curves and modeling the dynamical system for contour updating. In this setting, only a few state variables need to be estimated when using the sequential Monte Carlo (SMC) approach to realize the random walk. In addition, the walk on the graph also performs implicit sequence comparisons with those in the predefined gallery, from which statistics about class labels are evaluated for action recognition. Experiments on diving tracking and recognition illustrate the validity of our method.
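As a loose illustration of the idea, the sketch below runs a particle filter (sequential Monte Carlo) as a random walk on a small graph; the contour descriptors, neighbourhood structure, and Gaussian observation model are all invented stand-ins for the paper's contour dictionary, not its actual implementation.

```python
# Hypothetical sketch: SMC (particle filter) as a random walk on a contour graph.
import numpy as np

rng = np.random.default_rng(0)
n_nodes = 50
# Adjacency: each contour "word" is linked to its gallery neighbours (a ring here).
adj = [[(i - 1) % n_nodes, (i + 1) % n_nodes] for i in range(n_nodes)]
features = rng.normal(size=(n_nodes, 8))     # stand-in contour descriptors

def likelihood(node, observation, sigma=1.0):
    """Toy observation model: Gaussian similarity between the observed
    frame descriptor and the node's contour descriptor."""
    d = np.linalg.norm(features[node] - observation)
    return np.exp(-0.5 * (d / sigma) ** 2)

def smc_walk(observations, n_particles=200):
    """Propagate particles along graph edges, reweight and resample per frame."""
    particles = rng.integers(0, n_nodes, size=n_particles)
    for obs in observations:
        # Random-walk proposal: each particle steps to a random neighbour.
        particles = np.array([rng.choice(adj[p]) for p in particles])
        weights = np.array([likelihood(p, obs) for p in particles])
        weights /= weights.sum()
        particles = rng.choice(particles, size=n_particles, p=weights)
    return particles

video = rng.normal(size=(10, 8))             # stand-in frame descriptors
print(np.bincount(smc_walk(video), minlength=n_nodes).argmax())
```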

2.
We present a novel method for representing “extruded” distributions. An extruded distribution is an M-dimensional manifold in the parameter space of the component distribution. Representations of that manifold are “continuous mixture models”. We present a method for forming one-dimensional continuous Gaussian mixture models of sampled extruded Gaussian distributions via ridges of goodness-of-fit. Using Monte Carlo simulations and ROC analysis, we explore the utility of a variety of binning techniques and goodness-of-fit functions. We demonstrate that extruded Gaussian distributions are more accurately and consistently represented by continuous Gaussian mixture models than by finite Gaussian mixture models formed via maximum likelihood expectation maximization.
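The following numerical sketch illustrates one reading of a one-dimensional continuous Gaussian mixture: the component mean is “extruded” along a path mu(t) = t, and the mixture density integrates the component over t. The path, weights, and grids are illustrative assumptions.

```python
# Illustrative continuous Gaussian mixture: f(x) = integral of w(t) N(x; mu(t), sigma) dt.
import numpy as np

def component_pdf(x, mu, sigma=1.0):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

t = np.linspace(0.0, 5.0, 501)               # manifold parameter (M = 1)
dt = t[1] - t[0]
w = np.full_like(t, 1.0 / (t[-1] - t[0]))    # uniform mixing weight along the path
x = np.linspace(-4.0, 9.0, 400)

# Riemann-sum approximation of the integral; the mean is extruded linearly: mu(t) = t.
density = (w[None, :] * component_pdf(x[:, None], t[None, :])).sum(axis=1) * dt
print("total mass ~", float(density.sum() * (x[1] - x[0])))   # ~ 1.0: a proper density
```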

3.
It is well known that there is no analytic expression for the electrical capacitance of the unit cube. However, there are several Monte Carlo methods that have been used to numerically estimate this capacitance to high accuracy. These include a Brownian dynamics algorithm [H.-X. Zhou, A. Szabo, J.F. Douglas, J.B. Hubbard, A Brownian dynamics algorithm for calculating the hydrodynamic friction and the electrostatic capacitance of an arbitrarily shaped object, J. Chem. Phys. 100 (5) (1994) 3821–3826] coupled to the “walk on spheres” (WOS) method [M.E. Müller, Some continuous Monte Carlo methods for the Dirichlet problem, Ann. Math. Stat. 27 (1956) 569–589]; the Green’s function first-passage (GFFP) algorithm [J.A. Given, J.B. Hubbard, J.F. Douglas, A first-passage algorithm for the hydrodynamic friction and diffusion-limited reaction rate of macromolecules, J. Chem. Phys. 106 (9) (1997) 3721–3771]; an error-controlling Brownian dynamics algorithm [C.-O. Hwang, M. Mascagni, Capacitance of the unit cube, J. Korean Phys. Soc. 42 (2003) L1–L4]; an extrapolation technique coupled to the WOS method [C.-O. Hwang, Extrapolation technique in the “walk on spheres” method for the capacitance of the unit cube, J. Korean Phys. Soc. 44 (2) (2004) 469–470]; the “walk on planes” (WOP) method [M.L. Mansfield, J.F. Douglas, E.J. Garboczi, Intrinsic viscosity and the electrical polarizability of arbitrarily shaped objects, Phys. Rev. E 64 (6) (2001) 061401:1–061401:16; C.-O. Hwang, M. Mascagni, Electrical capacitance of the unit cube, J. Appl. Phys. 95 (7) (2004) 3798–3802]; and the random “walk on the boundary” (WOB) method [M. Mascagni, N.A. Simonov, The random walk on the boundary method for calculating capacitance, J. Comp. Phys. 195 (2004) 465–473]. Monte Carlo methods are convenient and efficient for problems whose solution includes singularities. In the calculation of the unit cube capacitance, there are edge and corner singularities in the charge density distribution. In this paper, we review the above Monte Carlo methods for computing the electrical capacitance of a cube and compare their effectiveness. We also provide a new result. We will focus our attention particularly on two Monte Carlo methods: WOP [M.L. Mansfield, J.F. Douglas, E.J. Garboczi, Intrinsic viscosity and the electrical polarizability of arbitrarily shaped objects, Phys. Rev. E 64 (6) (2001) 061401:1–061401:16; C.-O. Hwang, M. Mascagni, Electrical capacitance of the unit cube, J. Appl. Phys. 95 (7) (2004) 3798–3802; C.-O. Hwang, T. Won, Edge charge singularity of conductors, J. Korean Phys. Soc. 45 (2004) S551–S553] and the random WOB [M. Mascagni, N.A. Simonov, The random walk on the boundary method for calculating capacitance, J. Comp. Phys. 195 (2004) 465–473] methods.
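As a concrete illustration of the simplest of these methods, here is a minimal walk-on-spheres sketch for the unit-cube capacitance in the spirit of the Brownian dynamics algorithm above; the launch radius, tolerance, walker count, and the crude return-to-sphere step are illustrative simplifications (the cited papers handle the escape step exactly). With capacitance measured in units of length, C ≈ p·R, where p is the probability that a walker launched from an enclosing sphere of radius R is absorbed on the cube rather than escaping to infinity.

```python
# Minimal walk-on-spheres sketch for the capacitance of the unit cube.
import numpy as np

rng = np.random.default_rng(1)
HALF = 0.5          # unit cube centered at the origin, side 1
R = 2.0             # launch-sphere radius enclosing the cube
EPS = 1e-4          # absorption tolerance at the cube surface

def dist_to_cube(p):
    """Euclidean distance from an exterior point p to the cube surface."""
    d = np.abs(p) - HALF
    return np.linalg.norm(np.maximum(d, 0.0))

def random_unit_vector():
    v = rng.normal(size=3)
    return v / np.linalg.norm(v)

def one_walk():
    p = R * random_unit_vector()                 # start on the launch sphere
    while True:
        r = np.linalg.norm(p)
        if r > R:
            # Return to the launch sphere with probability R/r, else escape.
            if rng.random() > R / r:
                return 0
            p *= R / r   # crude re-projection; the cited papers use the exact kernel
        step = dist_to_cube(p)
        if step < EPS:
            return 1                             # absorbed on the cube
        p = p + step * random_unit_vector()      # jump on the maximal empty sphere

n = 50_000
p_abs = sum(one_walk() for _ in range(n)) / n
print("C(unit cube) ~", p_abs * R)               # literature value ~ 0.6607
```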

4.
Monte Carlo simulations may involve skewed, heavy-tailed distributions. When the variances of those distributions exist, statistically valid confidence intervals can be obtained using the central limit theorem, provided that the simulation is run “long enough.” If variances do not exist, however, valid confidence intervals are difficult or impossible to obtain. The main result in this paper establishes that replacing ordinary Monte Carlo sampling of such heavy-tailed distributions with ex post facto sampling ensures estimates having finite moments of all orders for the most common class of infinite-variance distributions. We conjecture that this phenomenon applies to all distributions (having finite means) when the ex post facto process is iterated.
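The coverage problem the abstract describes is easy to reproduce. The sketch below (an illustration of the problem, not of the paper's ex post facto scheme) checks the coverage of nominal 95% CLT intervals for the mean of a Pareto distribution with tail index α = 1.5, which has a finite mean but infinite variance.

```python
# Coverage of nominal 95% CLT confidence intervals under an infinite-variance law.
import numpy as np

rng = np.random.default_rng(2)
alpha, n, reps = 1.5, 1_000, 2_000
true_mean = alpha / (alpha - 1.0)      # Pareto with x_m = 1: mean = alpha/(alpha-1)

hits = 0
for _ in range(reps):
    x = (1.0 - rng.random(n)) ** (-1.0 / alpha)   # inverse-CDF Pareto sample
    half = 1.96 * x.std(ddof=1) / np.sqrt(n)
    hits += abs(x.mean() - true_mean) < half
print(f"empirical coverage: {hits / reps:.3f}  (nominal 0.95)")
# Coverage typically falls below the nominal level: the sample variance is a
# poor guide when the population variance does not exist.
```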

5.
6.
We present a framework for 3D model reconstruction, with potential applications to a spectrum of engineering problems affecting rapid design and prototyping, shape analysis, and virtual reality. The framework, composed of four main components, provides a systematic solution for reconstructing a geometric model from the surface mesh of an existing object. First, the input mesh is pre-processed to filter out noise. Second, the mesh is partitioned into segments to obtain individual geometric feature patches. Then, two integrated solutions, a solid-feature-based strategy and a surface-feature-based strategy, are exploited to reconstruct primitive features from the segmented feature patches. Finally, modeling operations, such as solid Boolean and surface trimming operations, are performed to “assemble” the primitive features into the final model. The concepts of “feature”, “constraint”, and “modeling history” are introduced into the entire reconstruction process so that the design intents are retrieved and exhibited in the final model with geometric accuracy, topological consistency, and flexible editability. A variety of industrial parts have been tested to illustrate the effectiveness and robustness of our framework.

7.
Molecular beam epitaxy is an important method for growing thin films and nanostructures. One of the scientific challenges is to understand the fundamental processes that control the evolution of thin-film structure and morphology. The results of kinetic Monte Carlo simulations carried out to study the dependence of the submonolayer scaled island-size distribution on the critical island size are presented and compared with experiments. A recently developed method, which involves a self-consistent coupling of evolution equations for the capture-zone distributions with rate equations for the island-size distribution, is also described. Our method explicitly takes into account the existence of a denuded (“capture”) zone around every island and the correlations between the size of an island and the corresponding average capture zone, and has been used to develop a quantitative rate-equation approach to irreversible submonolayer growth on a two-dimensional substrate. The resulting predictions for the capture zones, capture numbers, and island-size distributions are in excellent agreement with experimental results and kinetic Monte Carlo simulations.
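A minimal kinetic Monte Carlo sketch of irreversible submonolayer growth (critical island size i = 1) is given below: atoms are deposited at rate F per site, free monomers hop at rate D, and a monomer freezes as soon as it touches an occupied neighbour. The lattice size, D/F ratio, and target coverage are illustrative, and islands are labeled without periodic wrap-around.

```python
# Minimal KMC sketch of irreversible submonolayer island growth (i = 1).
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
L, D_over_F, target_cov = 128, 1e5, 0.1
occupied = np.zeros((L, L), dtype=bool)
monomers = []                              # positions of mobile monomers

def neighbours(x, y):
    return [((x + 1) % L, y), ((x - 1) % L, y), (x, (y + 1) % L), (x, (y - 1) % L)]

deposited = 0
while deposited < target_cov * L * L:
    # Pick the next event: deposition (total rate ~ L^2 * F) or a monomer hop
    # (rate D per monomer); only the ratio D/F matters.
    w_dep, w_hop = float(L * L), D_over_F * len(monomers)
    if rng.random() < w_dep / (w_dep + w_hop):
        x, y = int(rng.integers(L)), int(rng.integers(L))
        if not occupied[x, y]:
            occupied[x, y] = True
            deposited += 1
            monomers.append((x, y))
    elif monomers:
        i = int(rng.integers(len(monomers)))
        x, y = monomers[i]
        nx, ny = neighbours(x, y)[int(rng.integers(4))]
        if not occupied[nx, ny]:
            occupied[x, y], occupied[nx, ny] = False, True
            monomers[i] = (nx, ny)
    # Irreversible attachment: any monomer touching another atom freezes.
    monomers = [(x, y) for (x, y) in monomers
                if not any(occupied[nx, ny] for nx, ny in neighbours(x, y))]

labels, n_islands = ndimage.label(occupied)      # island-size distribution
sizes = np.bincount(labels.ravel())[1:]
print("coverage:", occupied.mean(), "islands:", n_islands, "mean size:", sizes.mean())
```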

8.
We develop an EPQ (economic production quantity) inventory model to determine the optimal buffer inventory for stochastic market demand during preventive maintenance or repair of a manufacturing facility in an imperfect production system. Preventive maintenance, an essential element of the just-in-time structure, may cause shortages, which are reduced by buffer inventory. The products are sold with a free minimal repair warranty (FRW) policy. The production system may shift from an “in-control” state to an “out-of-control” state after a random time that follows a given probability density function. Defective (non-conforming) items produced in either state are reworked at a cost just after the regular production time. Finally, an expected cost function comprising the inventory cost, unit production cost, preventive maintenance cost, and shortage cost is minimized analytically. We also develop a second case in which both the buffer inventory and the production rate are decision variables and the expected unit cost is optimized. Numerical examples illustrate the behaviour and application of the model, and a sensitivity analysis with respect to key system parameters is carried out.
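As a toy version of the buffer-stock trade-off, the sketch below minimizes an assumed expected cost with a holding cost h, a shortage cost s, and normally distributed demand during the maintenance stop; this cost structure is an illustration, not the paper's exact model.

```python
# Illustrative buffer-stock optimization under normal demand during maintenance.
import numpy as np
from scipy import stats, optimize

h, s = 2.0, 25.0              # holding and shortage cost rates (assumed)
mu, sigma = 100.0, 30.0       # demand during the maintenance stop: N(mu, sigma^2)

def expected_cost(B):
    z = (B - mu) / sigma
    short = sigma * (stats.norm.pdf(z) - z * (1.0 - stats.norm.cdf(z)))  # E[(D-B)^+]
    excess = short + B - mu                                              # E[(B-D)^+]
    return h * excess + s * short

res = optimize.minimize_scalar(expected_cost, bounds=(0.0, 400.0), method="bounded")
print(f"optimal buffer ~ {res.x:.1f}, expected cost ~ {res.fun:.2f}")
# Sanity check against the newsvendor fractile: Phi((B*-mu)/sigma) = s/(s+h).
print("fractile solution:", mu + sigma * stats.norm.ppf(s / (s + h)))
```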

9.
Cure fraction models are commonly used to model lifetime data with long-term survivors. In this article, we introduce a Bayesian analysis of the four-parameter generalized modified Weibull (GMW) distribution in the presence of a cure fraction, censored data, and covariates. To include the proportion of “cured” patients, both mixture and non-mixture model formulations are considered. To demonstrate the applicability of this model to real data, we consider an application to data from patients with gastric adenocarcinoma. Inferences are obtained using MCMC (Markov chain Monte Carlo) methods.
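A sketch of the mixture-cure likelihood and a random-walk Metropolis sampler is given below, with a two-parameter Weibull baseline substituted for the four-parameter GMW to keep the example short; the simulated data, priors, and proposal scales are all illustrative.

```python
# Mixture cure model sketch: S_pop(t) = pi + (1 - pi) * S_weibull(t).
# Censored observations contribute S_pop(t); events contribute (1 - pi) * f_weibull(t).
import numpy as np

rng = np.random.default_rng(4)

# Simulated data: 30% cured; Weibull(shape=1.5, scale=2) for the susceptible.
n, pi_true = 300, 0.3
cured = rng.random(n) < pi_true
t_event = np.where(cured, np.inf, 2.0 * rng.weibull(1.5, n))
t_cens = rng.uniform(0.1, 8.0, n)
time = np.minimum(t_event, t_cens)
event = (t_event <= t_cens).astype(float)

def log_lik(pi, k, lam):
    if not (0 < pi < 1 and k > 0 and lam > 0):
        return -np.inf
    S = np.exp(-(time / lam) ** k)
    f = (k / lam) * (time / lam) ** (k - 1) * S
    return np.sum(event * np.log((1 - pi) * f) + (1 - event) * np.log(pi + (1 - pi) * S))

# Random-walk Metropolis with flat priors (illustrative choices throughout).
theta = np.array([0.5, 1.0, 1.0])
cur = log_lik(*theta)
draws = []
for it in range(20_000):
    prop = theta + rng.normal(scale=[0.05, 0.1, 0.1])
    new = log_lik(*prop)
    if np.log(rng.random()) < new - cur:
        theta, cur = prop, new
    if it >= 10_000:
        draws.append(theta.copy())

print("posterior means (pi, shape, scale):", np.array(draws).mean(axis=0).round(3))
```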

10.
The task of accurately locating fluid-crystal phase boundaries by computer simulation is hampered by problems associated with traversing mixed-phase states. We describe a recently introduced Monte Carlo (MC) method that circumvents this problem by implementing a global coordinate transformation (“phase switch”) which takes the system from one pure phase to the other in a single MC step. The method is potentially quite general. We illustrate it by application to the freezing of hard spheres.

11.
Thanks to the dramatic decrease in computer costs, the no less dramatic increase in computer capabilities, and the availability of free software and libraries that make it possible to set up small parallel computing installations, the scientific community is now in a position where parallel computation is within easy reach even of moderately budgeted research groups. The software package PMCD (Parallel Monte Carlo Driver) was developed to drive the Monte Carlo simulation of a wide range of user-supplied models in parallel computing environments. A typical Monte Carlo simulation involves using a software implementation of a function to repeatedly generate function values, and such implementations are usually developed for sequential runs. Our driver enables the Monte Carlo simulation to run in parallel with minimal changes to the original code implementing the function of interest. In this communication we present the main goals and characteristics of our software, together with a simple study of its expected performance. Monte Carlo simulations are informally classified as “embarrassingly parallel”, meaning that the gains from parallelizing a Monte Carlo run should be close to ideal, i.e. with speed-ups close to linear. Our study shows that, without compromising ease of use and implementation, one can obtain performance very close to this ideal.
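In the same embarrassingly parallel spirit (though not PMCD itself), a driver can farm replications of an unchanged user function out to worker processes; the model function and parameters below are illustrative.

```python
# Minimal parallel Monte Carlo driver sketch using worker processes.
import numpy as np
from multiprocessing import Pool

def model(seed):
    """User-supplied sequential code: one Monte Carlo replication.
    Here: estimate pi by sampling points in the unit square."""
    rng = np.random.default_rng(seed)
    pts = rng.random((100_000, 2))
    return 4.0 * np.mean((pts ** 2).sum(axis=1) < 1.0)

def run_parallel(n_reps, n_workers=4):
    # Distinct seeds per replication keep the random streams independent.
    with Pool(n_workers) as pool:
        return pool.map(model, range(n_reps))

if __name__ == "__main__":
    estimates = run_parallel(64)
    print(np.mean(estimates), "+/-", np.std(estimates) / np.sqrt(len(estimates)))
```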

12.
Improving the reliability of bootstrap tests with the fast double bootstrap
Two procedures are proposed for estimating the rejection probabilities (RPs) of bootstrap tests in Monte Carlo experiments without actually computing a bootstrap test for each replication. These procedures are only about twice as expensive (per replication) as estimating RPs for asymptotic tests. Then a new procedure is proposed for computing bootstrap P values that will often be more accurate than ordinary ones. This “fast double bootstrap” (FDB) is closely related to the double bootstrap, but it is far less computationally demanding. Simulation results for three different cases suggest that the FDB can be very useful in practice.
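A sketch of the FDB p-value computation, as the procedure is usually stated for tests that reject for large statistics, is given below; `stat` and `resample` are placeholder hooks the user would supply, and the toy test is illustrative.

```python
# Fast double bootstrap (FDB) p-value sketch.
import numpy as np

def fdb_pvalue(data, stat, resample, B=999, rng=None):
    rng = rng or np.random.default_rng()
    tau_hat = stat(data)
    tau1 = np.empty(B)   # first-level bootstrap statistics
    tau2 = np.empty(B)   # ONE second-level statistic per first-level sample
    for j in range(B):
        boot = resample(data, rng)
        tau1[j] = stat(boot)
        tau2[j] = stat(resample(boot, rng))
    p1 = np.mean(tau1 > tau_hat)                   # ordinary bootstrap p-value
    q = np.quantile(tau2, 1.0 - p1)                # (1 - p1) quantile of level 2
    return np.mean(tau1 > q)                       # FDB p-value

# Toy illustration: one-sample t-like test of H0: mean = 0; resampling
# recenters the data so the null holds in the bootstrap world.
rng = np.random.default_rng(5)
x = rng.normal(0.2, 1.0, size=100)
stat = lambda d: abs(d.mean()) * np.sqrt(len(d)) / d.std(ddof=1)
resample = lambda d, g: g.choice(d - d.mean(), size=len(d), replace=True)
print("FDB p-value:", fdb_pvalue(x, stat, resample, rng=rng))
```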

13.
This research builds on prior work on developing near-optimal solutions to product line design problems within the conjoint analysis framework. We investigate and compare different genetic algorithm operators; in particular, we systematically examine the impact of alternative population maintenance strategies and mutation techniques within our problem context. Two alternative population maintenance methods, which we term the “Emigration” and “Malthusian” strategies, govern how individual product lines in one generation are carried over to the next. We also allow for two reproduction methods: “Equal Opportunity”, in which the parents to be paired for mating are selected with equal probability, and “Queen bee”, in which the best string in the current generation is always chosen as one parent while the other parent is randomly selected from the set of parent strings. We also examine the impact of integrating this artificial intelligence approach with a traditional optimization approach by seeding the GA with solutions obtained from a dynamic programming heuristic proposed by others. A detailed statistical analysis, by means of a Monte Carlo simulation study, determines the impact of various problem and technique aspects on multiple measures of performance. Our results indicate that the proposed procedures provide multiple “good” solutions, giving decision makers the flexibility to select from a number of very good product lines. The results obtained using our approaches are encouraging, with statistically significant improvements averaging 5% or more over the traditional benchmark of the heuristic dynamic programming technique.
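The sketch below illustrates plausible readings of the “Queen bee” pairing and the “Malthusian” maintenance step on a toy bit-string objective; the operators and parameters are stand-ins, not the paper's conjoint product-line encoding.

```python
# Toy GA: "Queen bee" pairing + "Malthusian" survival of the fittest.
import numpy as np

rng = np.random.default_rng(6)
POP, BITS, GENS, MUT = 40, 32, 60, 0.02
fitness = lambda pop: pop.sum(axis=1)           # "onemax" placeholder objective

pop = rng.integers(0, 2, size=(POP, BITS))
for _ in range(GENS):
    f = fitness(pop)
    queen = pop[f.argmax()]                     # best string of the generation
    mates = pop[rng.integers(0, POP, size=POP)] # other parent chosen at random
    cut = rng.integers(1, BITS, size=POP)       # one-point crossover positions
    mask = np.arange(BITS)[None, :] < cut[:, None]
    kids = np.where(mask, queen[None, :], mates)
    kids ^= (rng.random(kids.shape) < MUT)      # bit-flip mutation
    merged = np.vstack([pop, kids])             # "Malthusian": parents and
    pop = merged[np.argsort(fitness(merged))[-POP:]]  # offspring compete
print("best fitness:", fitness(pop).max(), "of", BITS)
```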

14.
The two-dimensional Ising model in the geometry of a long stripe can be regarded as a model system for the study of nanopores. As a quasi-one-dimensional system, it also exhibits rather interesting “phase behavior”: at low temperatures the stripe is filled with either “liquid” or “gas”, and the “densities” are similar to those in the bulk. When we approach a “pseudo-critical point” (below the critical point of the bulk) at which the correlation length becomes comparable to the length of the stripe, several interfaces emerge and the system contains multiple “liquid” and “gas” domains. The transition depends on the size of the stripe and occurs at lower temperatures for larger stripes. Our results are corroborated by simulations of the three-dimensional Asakura–Oosawa model in cylindrical geometry, which displays qualitatively similar behavior. Thus our simulations explain the physical basis for the occurrence of “hysteresis critical points” in corresponding experiments.
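A minimal Metropolis sketch of the stripe geometry is given below; sizes, temperature, and sweep counts are illustrative. Sign changes in the column-averaged magnetization profile mark the interfaces between “liquid” and “gas” domains.

```python
# Metropolis sketch of the 2D Ising model on a long stripe (periodic boundaries).
import numpy as np

rng = np.random.default_rng(7)
Lx, Ly, T, sweeps = 256, 16, 2.1, 500        # stripe: long in x, narrow in y
spins = rng.choice([-1, 1], size=(Lx, Ly))

for _ in range(sweeps):
    for _ in range(Lx * Ly):
        x, y = int(rng.integers(Lx)), int(rng.integers(Ly))
        nb = (spins[(x + 1) % Lx, y] + spins[(x - 1) % Lx, y]
              + spins[x, (y + 1) % Ly] + spins[x, (y - 1) % Ly])
        dE = 2.0 * spins[x, y] * nb
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[x, y] *= -1

# Column-averaged magnetization along the stripe: sign changes mark interfaces
# between "liquid" (m > 0) and "gas" (m < 0) domains.
profile = spins.mean(axis=1)
print("interfaces (sign changes):", int(np.count_nonzero(np.diff(np.sign(profile)))))
```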

15.
Product quality in mechanical assemblies is determined by controlling the propagation of manufacturing variations as the structure is built. This paper focuses on straight-build assembly and uses a probabilistic approach to analyse the influence of component variation on the eccentricity of the build. Connective models are used to predict assembly variations arising from individual component variations, and a probabilistic approach is used to calculate the probability density function (pdf) for the eccentricity of the build. The probabilistic approach considers three different straight-build scenarios: (i) direct build; (ii) best build; and (iii) worst build, for two-dimensional “axi-symmetric” assemblies. The probabilistic approach is much more efficient than Monte Carlo simulation. The paper also uses numerical examples to investigate the accuracy of the probabilistic approach in comparison to Monte Carlo simulation.
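For the direct-build scenario, a Monte Carlo baseline is straightforward; the sketch below accumulates assumed i.i.d. Gaussian component offsets and compares the resulting eccentricity with the Rayleigh closed form (the paper's probabilistic approach computes such pdfs without sampling).

```python
# Monte Carlo baseline for direct-build eccentricity with i.i.d. Gaussian offsets.
import numpy as np

rng = np.random.default_rng(8)
n_components, sigma, n_runs = 8, 0.01, 100_000   # per-stage offset std (assumed, mm)

offsets = rng.normal(0.0, sigma, size=(n_runs, n_components, 2))
eccentricity = np.linalg.norm(offsets.sum(axis=1), axis=1)

# For i.i.d. Gaussian offsets the direct-build eccentricity is Rayleigh
# with scale sigma * sqrt(n); compare the Monte Carlo mean to the closed form.
print("MC mean:      ", eccentricity.mean())
print("Rayleigh mean:", sigma * np.sqrt(n_components) * np.sqrt(np.pi / 2.0))
```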

16.
The rapid development of information and communication technology and the popularization of the Internet have given a boost to digitization technologies. Since 2001, the National Science Council (NSC) of Taiwan has invested a large amount of funding in the National Digital Archives Program (NDAP) to develop digital content. Some studies have indicated that most respondents had no confidence in particular digital archive websites. Thus, with the Technology Acceptance Model (TAM) as a theoretical basis, the present study set out to identify the factors influencing usage; the roles of perceived playfulness and interface design were also explored to identify reasons why digital archives might not be accepted by some users. The study used random sampling to distribute questionnaires to digital archive users via e-mail, and Structural Equation Modeling (SEM) was used to verify the appropriateness of the study model and to test the hypotheses. The results indicated that “interface design” is an important factor influencing people's use of digital archives, and that it is distinct from the “human factor” and the “human–computer interface” (HCI). Moreover, HCI had a significant impact on “perceived ease of use” and on “usage intentions,” whereas the human factor interface had a significant impact only on “perceived ease of use.” With respect to the hypotheses regarding “usage intentions,” neither “perceived usefulness,” “perceived ease of use,” “attitude,” nor “perceived playfulness” was related to “usage intentions.” It is therefore necessary to consider the quality of interface design in the development of digital archives in order to promote usage.

17.
Malaria transmission is highly influenced by environmental and climatic conditions, but their effects are often not linear. The climate–malaria relation is unlikely to be the same over large areas covered by different agro-ecological zones. Similarly, spatial correlation in malaria transmission, which arises mainly from spatially structured covariates (environmental and human-made factors), can vary across the agro-ecological zones, introducing non-stationarity. Malaria prevalence data from West Africa extracted from the “Mapping Malaria Risk in Africa” database were analyzed to produce regional parasitaemia risk maps. A non-stationary geostatistical model was developed by assuming that the underlying spatial process is a mixture of separate stationary processes within each zone. Non-linearity in the environmental effects was modeled by separate P-splines in each agro-ecological zone, with the model allowing smoothing at the borders between zones. The P-splines approach has better predictive ability than the alternative of categorizing the covariates to model non-linearity. Model fit and prediction were handled within a Bayesian framework, using Markov chain Monte Carlo (MCMC) simulations.

18.
Illumination at a surface point is formulated as an integral of the BRDF and the incident radiance over the hemisphere domain. A popular method to compute the integral is Monte Carlo integration, in which the surface illumination is computed as the sum of the integrand evaluated with stochastically sampled rays. Although its simplicity is practically attractive, it incurs the serious drawback of noise artifacts due to estimator variance. In this paper, we propose a novel noiseless Monte Carlo rendering algorithm running in real time on a GPU. The main contribution is a novel importance sampling scheme, which constructs spatially continuous sample rays over a surface. For each evenly spaced polar angle of the eye ray, denoted by θ, incident rays are sampled with a PDF (probability density function) derived from a target BRDF lobe. We develop a force-based update method to create a sequence of consistent ray sets along the θ values. Finally, corresponding rays in the sequence of ray sets are linearly connected to form a continuous ray curve, referred to as a sample thread. When rendering, illumination at a surface point is computed with rays, each of which is given as a point on a sample thread. Because a thread provides a sample ray that varies continuously over the surface, the random variance of the illumination, which would otherwise cause visual noise in Monte Carlo rendering, is eliminated. A thread set is precomputed for each BRDF to free the GPU from the burden of sampling during real-time rendering. According to extensive experiments, approximately 100 threads are sufficient for most measured BRDFs with acceptable rendering quality for interactive applications.
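As background for the importance sampling step, the sketch below samples a Phong-style lobe p(ω) ∝ cosⁿα about the lobe axis using the standard inverse-CDF formulas; the exponent and constant environment radiance are illustrative, and the continuous “sample thread” construction itself is not reproduced.

```python
# Importance sampling a Phong-style BRDF lobe: p(omega) = (n+1)/(2*pi) * cos^n(alpha).
import numpy as np

rng = np.random.default_rng(9)
n_exp = 50                                    # lobe sharpness (illustrative)

def sample_lobe(n_samples):
    """Sample directions about the lobe axis (local z) via the inverse CDF."""
    u1, u2 = rng.random(n_samples), rng.random(n_samples)
    cos_a = u1 ** (1.0 / (n_exp + 1.0))       # inverse-CDF in cos(alpha)
    sin_a = np.sqrt(1.0 - cos_a ** 2)
    phi = 2.0 * np.pi * u2
    return np.stack([sin_a * np.cos(phi), sin_a * np.sin(phi), cos_a], axis=1)

def estimate(n_samples):
    d = sample_lobe(n_samples)
    cos_a = d[:, 2]
    pdf = (n_exp + 1.0) / (2.0 * np.pi) * cos_a ** n_exp
    radiance = 1.0                            # constant environment light (assumed)
    brdf = (n_exp + 2.0) / (2.0 * np.pi) * cos_a ** n_exp  # normalized Phong lobe
    return np.mean(radiance * brdf * cos_a / pdf)          # standard MC estimator

print(estimate(10_000))   # low variance: the pdf closely matches the integrand
```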

19.
The analysis of point-level (geostatistical) data has historically been plagued by computational difficulties, owing to the high dimension of the nondiagonal spatial covariance matrices that need to be inverted. This problem is greatly compounded in hierarchical Bayesian settings, since these inversions need to take place at every iteration of the associated Markov chain Monte Carlo (MCMC) algorithm. This paper offers an approach for modeling the spatial correlation at two separate scales. This reduces the computational problem to a collection of lower-dimensional inversions that remain feasible within the MCMC framework. The approach yields full posterior inference for the model parameters of interest, as well as the fitted spatial response surface itself. We illustrate the importance and applicability of our methods using a collection of dense point-referenced breast cancer data collected over the mostly rural northern part of the state of Minnesota. Substantively, we wish to discover whether women who live more than a 60-mile drive from the nearest radiation treatment facility tend to opt for mastectomy over breast conserving surgery (BCS, or “lumpectomy”), which is less disfiguring but requires 6 weeks of follow-up radiation therapy. Our hierarchical multiresolution approach resolves this question while still properly accounting for all sources of spatial association in the data.

20.
We study several fundamental problems arising from biological sequence analysis. Given a sequence of real numbers, we present two linear-time algorithms, one for locating the “longest” sum-constrained segment, and the other for locating the “shortest” sum-constrained segment. These two algorithms are based on the same framework and run in an online manner, hence they are capable of handling data stream inputs. Our algorithms also yield online linear-time solutions for finding the longest and shortest average-constrained segments by a simple reduction.
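An offline linear-time sketch of the “longest” variant (longest segment with sum at least K) is given below, using prefix sums and a monotonic stack of strictly decreasing prefix values; the paper's algorithms additionally run online, which this sketch does not attempt.

```python
# Offline O(n): longest segment with sum >= K, values may be negative.
def longest_segment_with_sum_at_least(a, K):
    prefix = [0.0]
    for v in a:
        prefix.append(prefix[-1] + v)
    # Candidate left endpoints: indices where the prefix hits a new minimum.
    stack = [0]
    for i in range(1, len(prefix)):
        if prefix[i] < prefix[stack[-1]]:
            stack.append(i)
    best = -1
    for j in range(len(prefix) - 1, 0, -1):    # scan right endpoints rightmost-first
        while stack and prefix[j] - prefix[stack[-1]] >= K:
            best = max(best, j - stack.pop())  # a popped i cannot do better later
        if not stack:
            break
    return best                                # -1 if no such segment exists

print(longest_segment_with_sum_at_least([1.0, -2.0, 3.0, -1.0, 2.0, -5.0, 4.0], 3.0))  # 5
```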
