Similar articles
1.
Random uncertainties in finite element models in linear structural dynamics are usually modeled using parametric models. This means that: (1) the uncertain local parameters occurring in the global mass, damping and stiffness matrices of the finite element model have to be identified; (2) appropriate probabilistic models of these uncertain parameters have to be constructed; and (3) functions mapping the domains of the uncertain parameters into the global mass, damping and stiffness matrices have to be constructed. In the low-frequency range, a reduced matrix model can then be constructed using the generalized coordinates associated with the structural modes corresponding to the lowest eigenfrequencies. In this paper we propose an approach for constructing a random uncertainties model of the generalized mass, damping and stiffness matrices. This nonparametric model does not require identifying the uncertain local parameters and, consequently, obviates the construction of functions mapping their domains into the generalized matrices. Instead, it rests on the direct construction of a probabilistic model of the generalized mass, damping and stiffness matrices, using only the available information: their mean values. This paper describes the explicit construction of the theory of such a nonparametric model.
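As an illustration of the nonparametric idea, the sketch below draws random 2×2 "generalized" matrices whose mean equals a prescribed nominal matrix, using a normalized Wishart-type construction. This is a toy stand-in for the paper's probability model: the dimension, the number of draws and the Wishart form are illustrative assumptions.

```python
import random

def cholesky2(m):
    """Cholesky factor L (lower triangular) of a symmetric positive-definite 2x2 matrix."""
    a, b, c = m[0][0], m[0][1], m[1][1]
    l11 = a ** 0.5
    l21 = b / l11
    l22 = (c - l21 * l21) ** 0.5
    return [[l11, 0.0], [l21, l22]]

def random_spd_with_mean(mean, n_draws, rng):
    """Draw one random symmetric positive-definite matrix A with E[A] = mean.

    G = (1/n) sum v v^T with v ~ N(0, I) has E[G] = I, so A = L G L^T
    (with L L^T = mean) has the prescribed mean.
    """
    L = cholesky2(mean)
    G = [[0.0, 0.0], [0.0, 0.0]]
    for _ in range(n_draws):
        v = [rng.gauss(0, 1), rng.gauss(0, 1)]
        for i in range(2):
            for j in range(2):
                G[i][j] += v[i] * v[j] / n_draws
    # A = L G L^T
    LG = [[sum(L[i][k] * G[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
    return [[sum(LG[i][k] * L[j][k] for k in range(2)) for j in range(2)] for i in range(2)]

rng = random.Random(0)
M = [[2.0, 0.3], [0.3, 1.0]]   # nominal (mean) generalized matrix
A = random_spd_with_mean(M, 50, rng)
```

Each draw `A` is symmetric positive definite and fluctuates around `M`; averaging many draws recovers `M`.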

2.
In this article, we consider the issue of assessing the influence of observations in the general class of beta regression models introduced by Simas et al. (Comput. Stat. Data Anal. 54:348–366, 2010), which is very useful in situations in which the response is restricted to the standard unit interval (0,1). Our results generalize those in Espinheira et al. (Comput. Stat. Data Anal. 52:4417–4431, 2008a; J. Appl. Stat. 35:407–419, 2008b), which apply only to linear beta regression models. We define residuals and a portmanteau test for serial correlation. Further, influence methods such as the global, local, and total local influence of an individual and generalized leverage are discussed. Moreover, we derive the normal curvatures of local influence under various perturbation schemes. Finally, simulation results and an application to real data show the usefulness of our results.
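For intuition, a Pearson-type residual under the usual mean/precision parameterization of the beta law, where Var(y) = μ(1−μ)/(1+φ), can be computed as follows. This is a generic sketch, not the specific residuals defined in the article:

```python
import math

def beta_pearson_residuals(y, mu, phi):
    """Pearson-type residuals for a beta regression fit.

    Under the mean/precision parameterization, Var(y) = mu*(1-mu)/(1+phi),
    so each residual is (y_i - mu_i) scaled by the fitted standard deviation.
    """
    return [(yi - mi) / math.sqrt(mi * (1 - mi) / (1 + phi))
            for yi, mi in zip(y, mu)]

# three observations on (0,1) with fitted means and an assumed precision phi
res = beta_pearson_residuals([0.2, 0.5, 0.9], [0.25, 0.5, 0.8], phi=9.0)
```

Large absolute residuals flag observations poorly accommodated by the fitted mean, the starting point for the influence diagnostics discussed above.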

3.
It is well known that model order reduction techniques that project the solution of the problem at hand onto a low-dimensional subspace present difficulties when this solution lies on a nonlinear manifold. To overcome these difficulties (notably, an undesirable increase in the number of required modes in the solution), several remedies have been suggested. Among them, we can cite the use of nonlinear dimensionality reduction techniques or, alternatively, of local linear reduced-order approaches. The latter usually present the difficulty of ensuring continuity between the local models. Here, a new method is presented that ensures this continuity by resorting to the partition-of-unity paradigm while employing proper generalized decompositions at each local patch.
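The continuity mechanism can be illustrated with a 1D partition of unity: hat-function weights are non-negative and sum to one everywhere, so blending local models with them yields a globally continuous approximation. The local models below are arbitrary placeholders, not PGD approximations:

```python
def hat_weights(x, centers):
    """Piecewise-linear partition-of-unity weights over 1D patch centers.

    The weights are hat functions: non-negative and summing to 1 at every x,
    which is what guarantees continuity of the blended approximation.
    """
    w = [0.0] * len(centers)
    if x <= centers[0]:
        w[0] = 1.0
        return w
    if x >= centers[-1]:
        w[-1] = 1.0
        return w
    for k in range(len(centers) - 1):
        a, b = centers[k], centers[k + 1]
        if a <= x <= b:
            t = (x - a) / (b - a)
            w[k], w[k + 1] = 1.0 - t, t
            return w

def blended(x, centers, local_models):
    """Global approximation: sum_k w_k(x) * u_k(x)."""
    w = hat_weights(x, centers)
    return sum(wk * u(x) for wk, u in zip(w, local_models))

centers = [0.0, 1.0, 2.0]
# three placeholder local models, one per patch
models = [lambda x: 1.0 + x, lambda x: 2.0 * x, lambda x: 4.0 - x]
```

At a patch center the blend reduces exactly to that patch's local model, and between centers it transitions continuously.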

4.
Large-scale data analysis problems have become increasingly common across many disciplines. While a large volume of data offers more statistical power, it also brings computational challenges. The orthogonalizing expectation–maximization (EM) algorithm by Xiong et al. is an efficient method for dealing with large-scale least-squares problems from a design point of view. In this article, we propose a reformulation and generalization of the orthogonalizing EM algorithm. Computational complexity and convergence guarantees are established. The reformulation leads to a reduction in computational complexity for least-squares and penalized least-squares problems. The resulting method, named the GOEM (generalized orthogonalizing EM) algorithm, can incorporate a wide variety of convex and nonconvex penalties, including the lasso, group lasso, and minimax concave penalty. The GOEM algorithm is further extended to a wider class of models, including generalized linear models and Cox's proportional hazards model. Synthetic and real data examples are included to illustrate its use and efficiency compared with standard techniques. Supplementary materials for this article are available online.
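The flavour of the orthogonalizing-EM update for plain least squares can be sketched as a fixed-point iteration with a scalar majorizer d of X'X (here d = trace(X'X), which bounds the largest eigenvalue of a PSD matrix). This is a simplified illustration of the iteration's structure, not the GOEM algorithm itself:

```python
def oem_least_squares(X, y, iters=500):
    """Fixed-point iteration beta <- beta + (X'y - X'X beta)/d for least squares.

    With d at least as large as the top eigenvalue of X'X, the iteration
    contracts to the ordinary least-squares solution.
    """
    n, p = len(X), len(X[0])
    XtX = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)] for i in range(p)]
    Xty = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    d = sum(XtX[i][i] for i in range(p))  # trace >= largest eigenvalue (PSD)
    beta = [0.0] * p
    for _ in range(iters):
        grad = [Xty[i] - sum(XtX[i][j] * beta[j] for j in range(p)) for i in range(p)]
        beta = [beta[i] + grad[i] / d for i in range(p)]
    return beta

X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
y = [1.0, 3.0, 5.0, 7.0]   # exactly y = 1 + 2x
beta = oem_least_squares(X, y)
```

Penalized variants insert a thresholding step after each update; the per-iteration cost stays linear in the number of parameters once X'X and X'y are formed.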

5.
Real-time monitoring is an important task in process control. It often relies on estimates of process parameters in Phase I and Phase II and triggers signals when significant differences between the estimates are identified. Real-time contrast (RTC) control charts use classification methods to separate the Phase I and Phase II data and monitor the classification probabilities. However, since the classification probability statistics take discretely distributed values, the corresponding RTC charts lose detection efficiency. In this paper, we propose distance-based RTC statistics for process monitoring, which are derived from the distance from observations to the classification boundary. We illustrate the idea using the kernel linear discriminant analysis (KLDA) method and develop three distance-based KLDA statistics for RTC monitoring. The performance of the KLDA distance-based charting methods is compared with that of classification probability-based control charts. Our results indicate that the distance-based RTC charts are more efficient than the class of probability-based control charts. A real example illustrates the performance of the proposed method.
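The distance-based statistic can be illustrated with an ordinary linear discriminant instead of KLDA: the monitored quantity is the signed distance from an observation to the separating hyperplane. An identity within-class covariance is assumed here purely for the sketch, so the discriminant direction is just the difference of class means:

```python
import math

def boundary_distance(x, w, b):
    """Signed distance from observation x to the hyperplane w.x + b = 0."""
    dot = sum(wi * xi for wi, xi in zip(w, x))
    return (dot + b) / math.sqrt(sum(wi * wi for wi in w))

# Linear discriminant between two class means, assuming (for this sketch)
# identity within-class covariance: w = mu1 - mu0, boundary through the midpoint.
mu0, mu1 = [0.0, 0.0], [4.0, 0.0]
w = [m1 - m0 for m0, m1 in zip(mu0, mu1)]
b = -sum(wi * (m0 + m1) / 2 for wi, m0, m1 in zip(w, mu0, mu1))
```

Unlike a classification probability, the distance varies continuously, which is precisely why distance-based RTC statistics avoid the discreteness problem noted above.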

6.
Spatial connectivity plays an important role in mosquito-borne disease transmission. Connectivity can arise for many reasons, including shared environments, vector ecology and human movement. This systematic review synthesizes the spatial methods used to model mosquito-borne diseases, their spatial connectivity assumptions and the data used to inform spatial model components. We identified 248 papers eligible for inclusion. Most used statistical models (84.2%), although mechanistic models are increasingly used. We identified 17 spatial models which used one of four methods (spatial covariates, local regression, random effects/fields and movement matrices). Over 80% of studies assumed that connectivity was distance-based, despite this approach ignoring distant connections and potentially oversimplifying the process of transmission. Studies were more likely to assume connectivity was driven by human movement if the disease was transmitted by an Aedes mosquito. Connectivity arising from human movement was more commonly assumed in studies using a mechanistic model, likely influenced by a lack of statistical models able to account for these connections. Although models have been increasing in complexity, it is important to select the most appropriate, parsimonious model available based on the research question, the disease transmission process, the spatial scale, the availability of data, and the way spatial connectivity is assumed to occur.
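A minimal example of the distance-based connectivity assumption discussed above: a row-normalized movement matrix with exponential distance decay. The decay kernel and its scale are illustrative choices, and the limitation is visible in the code: connection strength depends only on distance, so distant but strongly linked locations are underweighted.

```python
import math

def connectivity_matrix(coords, decay=1.0):
    """Row-normalized distance-based connectivity (exponential decay).

    c_ij is proportional to exp(-d_ij / decay); each row sums to 1, so row i
    reads as the distribution of connections out of location i.
    """
    n = len(coords)
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                d = math.dist(coords[i], coords[j])
                C[i][j] = math.exp(-d / decay)
        s = sum(C[i])
        C[i] = [c / s for c in C[i]]
    return C

# three locations on a line; the third is far from the first two
C = connectivity_matrix([(0, 0), (1, 0), (5, 0)], decay=2.0)
```

Movement-matrix approaches replace the distance kernel with observed flows (e.g. mobility data) while keeping the same row-stochastic structure.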

7.
Rectangular models of material microstructure are described by their 1- and 2-point (spatial) correlation statistics of placement of local state. In the procedure described here the local state space is described in discrete form, and the focus is on placement of local state within a finite number of cells comprising rectangular models. It is illustrated that effective elastic properties (generalized Hashin–Shtrikman bounds) can be obtained that are linear in components of the correlation statistics. Within this framework the concept of an eigen-microstructure within the microstructure hull is useful. Given the practical innumerability of the microstructure hull, however, we introduce a method for generating a sequence of archetypes of eigen-microstructure from the 2-point correlation statistics of local state, assuming that the 1-point statistics are stationary. The method is illustrated by obtaining an archetype for an imaginary two-phase material where the objective is to maximize the combination $C_{xxxx}^{*} + C_{xyxy}^{*}$.

8.
Earthwork planning is considered in this article and a generic block partitioning and modelling approach is devised to provide strategic plans of various levels of detail. Conceptually, this new approach is more accurate and comprehensive than others, for instance those that are section based. In response to recent environmental concerns, the metric for decision making was fuel consumption or emissions. Haulage distance and gradient, however, are important components of these metrics and are also included. Advantageously, the fuel consumption metric is generic and captures the physical difficulties of travelling over inclines of different gradients, which is consistent across all hauling vehicles. For validation, the proposed models and techniques are applied to a real-world road project. The numerical investigations demonstrate that the models can be solved with relatively little CPU time. The proposed block models also result in solutions of superior quality, i.e. they have reduced fuel consumption and cost. Furthermore, the plans differ considerably from those based solely on a distance-based metric, thus demonstrating a need for the industry to reflect on its current practices.
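A gradient-aware fuel metric of the kind described above can be sketched as a per-leg cost that adds an uphill surcharge to a flat haul rate. The rates, and the choice to charge downhill legs at the flat rate only, are made-up assumptions for illustration, not the article's calibrated model:

```python
def haul_fuel(distance_m, gradient, flat_rate=1.0, grade_factor=8.0):
    """Fuel for one haul leg: a flat per-metre rate plus a surcharge
    proportional to a positive (uphill) gradient; downhill legs are
    charged at the flat rate only in this simplified sketch."""
    return distance_m * (flat_rate + grade_factor * max(0.0, gradient))

def route_fuel(legs):
    """Total fuel over a list of (distance_m, gradient) legs."""
    return sum(haul_fuel(d, g) for d, g in legs)

flat = route_fuel([(100.0, 0.0), (100.0, 0.0)])       # level route
hilly = route_fuel([(100.0, 0.05), (100.0, -0.05)])   # same length, over a hill
```

The two routes have identical haulage distance, yet the hilly one costs more fuel, which is exactly why distance-only metrics and fuel metrics produce different plans.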

9.
10.
In this article, a family of models approximating the primitive equations of the atmosphere, which are known to be the fundamental equations of the atmosphere, is presented. The primitive equations are used as a starting point, and asymptotic expansions with respect to the Rossby number are considered to derive the nth-order approximate equations. Simple global models of the atmosphere are obtained. These higher-order models are linear and of the same form (with different right-hand sides, depending on the lower-order approximations) as the (first-order) global quasi-geostrophic equations derived in an earlier article. From a computational point of view, there are two advantages. Firstly, all the models are linear, so that they are easy to implement. Secondly, the models at every order have the same form, so that, with slight modifications, the numerical code for the (first-order) global quasi-geostrophic model can be employed for all higher-order models. From a physical point of view, the higher-order models capture more physical phenomena, such as the meridional flows, even though they are small in magnitude. Of course, there are still many subtle issues involved in this project, such as the convergence of the asymptotics; they will be addressed elsewhere. The article is concluded by a presentation of numerical simulations based on these models.
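The structure of the hierarchy described above can be sketched schematically (illustrative notation, with $\varepsilon$ the Rossby number and $\mathcal{L}$ the common linear operator; the actual operators and right-hand sides are those of the primitive equations):

```latex
u = u^{(0)} + \varepsilon\, u^{(1)} + \varepsilon^{2} u^{(2)} + \cdots,
\qquad
\mathcal{L}\, u^{(n)} = f^{(n)}\!\left(u^{(0)}, \dots, u^{(n-1)}\right).
```

Every order thus solves the same linear problem; only the right-hand side changes, assembled from the lower-order approximations, which is why one numerical code serves all orders.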

11.
Modeling a response in terms of the factors that affect it is often required in quality applications. While the normal scenario is commonly assumed in such modeling efforts, leading to the application of linear regression analysis, there are cases when the assumptions underlying this scenario are not valid and alternative approaches need to be pursued, like the normalization of the data or generalized linear modeling. Recently, a new response modeling methodology (RMM) has been introduced, which seems to be a natural generalization of various current scientific and engineering mainstream models, where a monotone convex (concave) relationship between the response and the affecting factor (or a linear combination of factors) may be assumed. The purpose of this paper is to provide the quality practitioner with a survey of these models and demonstrate how they can be derived as special cases of the new RMM. A major implication of this survey is that RMM can be considered a valid approach for quality engineering modeling and, thus, may be conveniently applied where theory-based models are not available or the goodness-of-fit of current empirically-derived models is unsatisfactory. A numerical example demonstrates the application of the new RMM to software reliability-growth modeling. The behavior of the new model when the systematic variation vanishes (there is only random variation) is also briefly explored. Copyright © 2003 John Wiley & Sons, Ltd.

12.
In this paper, we report predicted results for texture evolution in FCC metals under a uniaxial compression test. These results are computed using a newly developed nonlinear rigid viscoplastic crystal plasticity model based on an intermediate interaction law. This interaction law is formulated by the minimization of a normalized error function which combines the local fields' deviations, from the macroscopic ones, obtained by the classical upper bound (Taylor) and lower bound (Sachs) models. The interaction law leads to results lying between the upper and lower bound approaches by simply varying a scalar weight function ϕ (0 < ϕ < 1). A simple interaction law based on a linear mixture of the fields from the Taylor and Sachs models is also used. The results from both the linear and nonlinear intermediate approaches are shown in terms of texture evolution under uniaxial compression. These results are discussed in comparison with well-known experimental textures in compressed FCC metals. Finally, we show that the linear intermediate approach yields fairly acceptable texture predictions under compression and that the fully nonlinear approach predicts much better results.
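The Taylor/Sachs mixture can be illustrated in the simplest possible setting, a 1D two-phase bar: Taylor imposes equal strain in every phase, Sachs imposes equal stress, and the intermediate estimate blends the two local fields with the scalar weight φ. This is a schematic analogue of the linear mixture, not the crystal plasticity model itself:

```python
def taylor_sachs_blend(moduli, fractions, macro_strain, phi):
    """Per-phase strain from a linear mixture of Taylor and Sachs estimates
    in a 1D two-phase bar (illustrative analogue of the interaction law).

    Taylor (upper bound): every phase sees the macroscopic strain.
    Sachs (lower bound): every phase sees the macroscopic stress.
    """
    # Taylor: equal strain everywhere
    eps_taylor = [macro_strain for _ in moduli]
    # Sachs: equal stress; macroscopic stress from the Reuss (series) modulus
    E_reuss = 1.0 / sum(f / E for f, E in zip(fractions, moduli))
    sigma = E_reuss * macro_strain
    eps_sachs = [sigma / E for E in moduli]
    # linear mixture with weight phi in [0, 1]
    return [phi * et + (1.0 - phi) * es for et, es in zip(eps_taylor, eps_sachs)]

# two phases, equal volume fractions, one phase three times stiffer
eps = taylor_sachs_blend([1.0, 3.0], [0.5, 0.5], macro_strain=1.0, phi=0.5)
```

At φ = 1 the blend recovers the Taylor field, at φ = 0 the Sachs field, and intermediate φ interpolates, mirroring how the interaction law spans the two bounds.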

13.
In this paper, an incremental-secant modulus iteration scheme using the extended/generalized finite element method (XFEM) is proposed for the simulation of the cracking process in quasi-brittle materials described by cohesive crack models whose softening law is composed of linear segments. The leading term of the displacement asymptotic field at the tip of a cohesive crack (which ensures a displacement discontinuity normal to the cohesive crack face) is used as the enrichment function in the XFEM. The opening component of the same field is also used as the initial guess opening profile of a newly extended cohesive segment in the simulation of cohesive crack propagation. A statically admissible stress recovery (SAR) technique is extended to cohesive cracks with special treatment of non-homogeneous boundary tractions. The application of locally normalized co-ordinates to eliminate possible ill-conditioning of SAR, and the influence of different weight functions on SAR are also studied. Several mode I cracking problems in quasi-brittle materials with linear and bilinear softening laws are analysed to demonstrate the usefulness of the proposed scheme, as well as the characteristics of global responses and local fields obtained numerically by the XFEM. Copyright © 2006 John Wiley & Sons, Ltd.
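The softening laws "composed of linear segments" referred to above can be written down directly. A sketch of the linear and bilinear traction–separation laws, with ft the tensile strength, wc the critical opening, and (w1, t1) the kink point of the bilinear law:

```python
def linear_softening(w, ft, wc):
    """Cohesive traction for a linear softening law: the traction drops from
    the tensile strength ft at zero opening to 0 at the critical opening wc."""
    if w >= wc:
        return 0.0
    return ft * (1.0 - w / wc)

def bilinear_softening(w, ft, w1, t1, wc):
    """Bilinear softening through the points (0, ft) -> (w1, t1) -> (wc, 0)."""
    if w >= wc:
        return 0.0
    if w <= w1:
        return ft + (t1 - ft) * w / w1
    return t1 * (wc - w) / (wc - w1)
```

The area under either curve is the fracture energy, and the piecewise-linear form is what makes the incremental-secant iteration tractable segment by segment.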

14.
It is in many cases very instructive and useful to be able to treat three-dimensional problems by means of two-dimensional models. This always implies a reduction in computing cost, which is particularly significant in the presence of nonlinearities, arising for instance from contact between the solids involved in the problem. The term generalized plane problem is adopted for a three-dimensional problem in a homogeneous linear elastic cylindrical body where strains and stresses are the same in all transversal sections. This concept covers many practical cases (for instance in the field of composites), a particular situation called generalized plane strain (strains, stresses and displacements are the same in all transversal sections) being the most frequently analyzed. In this paper, a new formulation is developed in a systematic way to solve generalized plane problems for anisotropic materials, with possible frictional contact zones, as two-dimensional problems. The numerical solution of these problems is formulated by means of the boundary element method. An explicit expression of a new particular solution of the problem associated with constant body forces is introduced and applied to avoid domain integrations. Some numerical results are presented to show the performance and advantages of the formulation developed.

15.
This article considers some links between classical test theory (CTT) and modern test theory (MTT), such as item response theory (IRT) and the Rasch model, in the context of the two-level hierarchical generalized linear model (HGLM). Conceptualizing items as nested within subjects, both the CTT model and the MTT model can be reformulated as an HGLM where item difficulty parameters are represented by fixed effects and subjects' abilities are represented by random effects. In this HGLM framework, the CTT and MTT models differ only in the level 1 sampling model and the associated link function. This article also contrasts the Rasch and two-parameter IRT models by considering the property of specific objectivity in the context of CTT. It is found that the essentially tau-equivalent model exhibits specific objectivity if the data fit the model, but the congeneric measures model does not. Data from English composition scores on essay writing used by Jöreskog (1971) are reanalyzed for illustration.
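The contrast between the Rasch and two-parameter IRT models is easy to see numerically: with discrimination a = 1 (Rasch), the log-odds difference between two persons is the same on every item (specific objectivity), while for a ≠ 1 it is scaled by the item's discrimination. A minimal sketch of the item response function:

```python
import math

def irt_prob(theta, b, a=1.0):
    """Probability of a correct response under the two-parameter IRT model,
    P = 1 / (1 + exp(-a*(theta - b))); a = 1 reduces it to the Rasch model."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))
```

Under the Rasch model, logit P = theta − b, so comparing two persons cancels the item parameter b entirely; the two-parameter model breaks this item-free comparison.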

16.
Yun Li  Kay Chen Tan 《Sadhana》2000,25(2):97-110
To overcome the deficiency of 'local model network' (LMN) techniques, an alternative 'linear approximation model' (LAM) network approach is proposed. Such a network models a nonlinear or practical system with multiple linear models fitted along operating trajectories, where individual models are simply networked through output or parameter interpolation. The linear models are valid for the entire operating trajectory and hence overcome the local validity of LMN models, which impose the predetermination of a scheduling variable that predicts characteristic changes of the nonlinear system. LAMs can be evolved from sampled step response data directly, eliminating the need for local linearisation upon a pre-model using derivatives of the nonlinear system. The structural difference between a LAM network and an LMN is that the overall model of the latter is a parameter-varying system and hence nonlinear, while the former remains linear time-invariant (LTI). Hence, existing LTI and transfer function theory applies to a LAM network, which is therefore easy to use for control system design. Validation results show that the proposed method offers a simple, transparent and accurate multivariable modelling technique for nonlinear systems.
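Output interpolation of local linear models, the networking mechanism described above, can be sketched with two static-gain models blended along an operating coordinate. Using static gains instead of dynamic models is a simplification for the sketch:

```python
def interp_weights(x, x0, x1):
    """Linear interpolation weights between two operating points x0 and x1,
    clamped outside the interval."""
    t = (x - x0) / (x1 - x0)
    t = min(1.0, max(0.0, t))
    return 1.0 - t, t

def lam_output(u, x, x0, x1, gain0, gain1):
    """Output-interpolated network of two linear (static-gain) models.

    Each local model is applied over the whole trajectory; the network blends
    their outputs rather than switching between them."""
    w0, w1 = interp_weights(x, x0, x1)
    return w0 * gain0 * u + w1 * gain1 * u

y = lam_output(u=2.0, x=0.5, x0=0.0, x1=1.0, gain0=1.0, gain1=3.0)
```

Because each constituent model is LTI and the blend is a fixed function of the operating coordinate, standard transfer-function reasoning can be applied model by model.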

17.
We propose a U-statistics-based test for null variance components in linear mixed models and obtain its asymptotic distribution (for an increasing number of units) under mild regularity conditions that include only the existence of the second moment for the random effects and of the fourth moment for the conditional errors. We employ contiguity arguments to derive the distribution of the test under local alternatives, assuming additionally the existence of the fourth moment of the random effects. Our proposal is easy to implement and may be applied to a wide class of linear mixed models. We also consider a simulation study to evaluate the behaviour of the U-test in small and moderate samples and compare its performance with that of exact F-tests and of generalized likelihood ratio tests obtained under the assumption of normality. A practical example in which the normality assumption is not reasonable is included as illustration.
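For intuition about the U-statistics machinery, the classical textbook example below estimates a variance as a U-statistic with kernel h(x1, x2) = (x1 − x2)²/2 averaged over all pairs, which coincides with the unbiased sample variance. This is an illustration of the general construction, not the proposed variance-component test:

```python
from itertools import combinations

def u_statistic_variance(x):
    """U-statistic estimate of the variance: the kernel h(x1, x2) = (x1-x2)^2/2
    averaged over all unordered pairs equals the unbiased sample variance."""
    pairs = list(combinations(x, 2))
    return sum((a - b) ** 2 / 2 for a, b in pairs) / len(pairs)

v = u_statistic_variance([1.0, 2.0, 3.0, 4.0])
```

The appeal of U-statistics, here as in the test above, is that their asymptotics need only moment conditions on the kernel, not a normality assumption.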

18.
A meshless local boundary integral equation (MLBIE) method with an efficient technique to deal with the time variable is presented in this article to analyze transient heat conduction in continuously nonhomogeneous functionally graded materials (FGMs). In space, the method is based on local boundary integral equations and the moving least squares (MLS) approximation of the temperature and heat flux. In time, the MLS again approximates the equivalent Volterra integral equation derived from the heat conduction problem. That is, the MLS is used for approximation in both the time and space domains, avoiding finite difference discretization or Laplace transform methods for the time variable. The method finally leads to a single generalized Sylvester equation rather than many linear systems of equations. The method is computationally attractive, as shown in a couple of numerical examples for a finite strip and a hollow cylinder with an exponential spatial variation of material parameters.
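The MLS ingredient can be illustrated in its simplest form, a constant-basis (Shepard) approximation with Gaussian weights in 1D. The weight scale h is an assumed parameter, and the paper uses richer bases in both time and space:

```python
import math

def mls_shepard(x, nodes, values, h=1.0):
    """Moving least squares approximation with a constant basis (Shepard
    interpolation): a Gaussian-weighted average centred at the evaluation
    point, so nearby nodes dominate."""
    w = [math.exp(-((x - xi) / h) ** 2) for xi in nodes]
    s = sum(w)
    return sum(wi * vi for wi, vi in zip(w, values)) / s

# approximate a nodal field between scattered 1D nodes
u_h = mls_shepard(0.5, [0.0, 1.0, 2.0], [0.0, 1.0, 4.0])
```

A constant basis reproduces constant fields exactly; higher-order MLS adds polynomial basis terms to reproduce linear and quadratic fields as well.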

19.
Mixed model prediction and small area estimation
Jiming Jiang  P. Lahiri 《TEST》2006,15(1):1-96
Over the last three decades, mixed models have been frequently used in a wide range of small area applications. Such models offer great flexibility in combining information from various sources, and thus are well suited for solving most small area estimation problems. The present article reviews major research developments in the classical inferential approach for linear and generalized linear mixed models that are relevant to different issues concerning small area estimation and related problems.
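A standard building block of small area estimation under a one-way random-effects model is the composite (shrinkage) estimator, sketched below. The variance components are assumed known here, which a real analysis would estimate:

```python
def composite_estimate(area_mean, area_n, overall_mean, sigma2_v, sigma2_e):
    """Composite (shrinkage) small-area estimate under a one-way
    random-effects model: the weight gamma = sigma2_v / (sigma2_v + sigma2_e/n)
    pulls the direct area mean toward the overall mean, more strongly for
    small areas (small n)."""
    gamma = sigma2_v / (sigma2_v + sigma2_e / area_n)
    return gamma * area_mean + (1.0 - gamma) * overall_mean

# same direct mean, very different sample sizes
big = composite_estimate(10.0, 100, 8.0, sigma2_v=1.0, sigma2_e=4.0)
small = composite_estimate(10.0, 1, 8.0, sigma2_v=1.0, sigma2_e=4.0)
```

The small area is shrunk much harder toward the overall mean, which is exactly how mixed models borrow strength across areas.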

20.
In this article, a new Monte Carlo hybrid local search algorithm (Hyb-LS) is proposed for solving the uncapacitated facility location problem. Hyb-LS is based on repeated sampling using two local search strategies based on best improvement and randomized neighbourhood search. A major advantage of Hyb-LS for practical use is that the number of restarts is its only tuning parameter. The algorithm is also simple to reimplement, scalable and robust to changes in coefficients within a problem instance. The stopping criterion for local search is learned automatically. Experimental results are presented for four representative and contrasting cost and distance models. The results obtained by Hyb-LS are compared to the optimal or near-optimal solutions found by a mixed integer linear programming (MILP) solver with a generous time limit. For three of the four models, Hyb-LS obtains better solutions than the upper bound found by the MILP solver for at least one instance.
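Of Hyb-LS's two strategies, the best-improvement neighbourhood can be sketched with a single open/close flip move on a toy instance. The costs below are invented, and Hyb-LS additionally uses randomized neighbourhood search and repeated restarts:

```python
def uflp_cost(open_set, fixed, assign):
    """Total cost: fixed charges of open facilities plus, for every customer,
    the cheapest assignment cost to an open facility."""
    if not open_set:
        return float("inf")
    total = sum(fixed[j] for j in open_set)
    for costs in assign:
        total += min(costs[j] for j in open_set)
    return total

def best_improvement_flip(fixed, assign, start):
    """Best-improvement local search: repeatedly apply the single open/close
    flip that most reduces the cost, until no flip improves it."""
    current = set(start)
    cost = uflp_cost(current, fixed, assign)
    improved = True
    while improved:
        improved = False
        best_move, best_cost = None, cost
        for j in range(len(fixed)):
            trial = current ^ {j}            # flip facility j open/closed
            c = uflp_cost(trial, fixed, assign)
            if c < best_cost:
                best_move, best_cost = trial, c
        if best_move is not None:
            current, cost, improved = best_move, best_cost, True
    return current, cost

fixed = [4.0, 3.0, 4.0]                 # facility opening costs (invented)
assign = [[1.0, 10.0, 10.0],            # assign[i][j]: cost of customer i at facility j
          [10.0, 10.0, 1.0],
          [10.0, 1.0, 10.0]]
sol, cost = best_improvement_flip(fixed, assign, start={1})
```

From the single open facility {1}, the search opens the other two facilities over successive flips because cheaper assignments outweigh the extra fixed charges.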
