Similar Documents
20 similar documents found.
1.
We provide a case study for the generation of pure hexahedral meshes for the numerical simulation of physiological stress scenarios of the human mandible. Due to its complex and very detailed free-form geometry, the mandible model is very demanding. This test case is used as a running example to demonstrate the applicability of a combinatorial approach for the generation of hexahedral meshes by means of successive dual cycle eliminations, which has been proposed by the second author in previous work. We report on the progress and recent advances of the cycle elimination scheme. The given input data, a surface triangulation obtained from computed tomography data, requires a substantial mesh reduction and a suitable conversion into a quadrilateral surface mesh as a first step, for which we use mesh clustering and b-matching techniques. Several strategies for improved cycle elimination orders are proposed. They lead to a significant reduction in the mesh size and a better structural quality. Based on the resulting combinatorial meshes, gradient-based optimized smoothing with the condition number of the Jacobian matrix as objective, together with mesh untangling techniques, yielded embeddings of satisfactory quality. To test our hexahedral meshes for the mandible model within an FEM simulation we used the scenario of a bite on a ‘hard nut’. Our simulation results are in good agreement with observations from biomechanical experiments.
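To make the smoothing objective above concrete, here is a hedged NumPy sketch (not the authors' code) of the Frobenius condition number of a hexahedral corner Jacobian, the kind of quantity such gradient-based smoothing minimizes; the corner construction and example coordinates are illustrative assumptions.

import numpy as np

def corner_jacobian(p0, p1, p2, p3):
    """Jacobian at hex corner p0, built from the three incident edge vectors."""
    return np.column_stack((p1 - p0, p2 - p0, p3 - p0))

def condition_number(J):
    """Frobenius condition number |J|_F * |J^{-1}|_F; larger means poorer shape."""
    return np.linalg.norm(J) * np.linalg.norm(np.linalg.inv(J))

# Example: a unit-cube corner is ideal (value 3); a sheared corner scores worse.
p0 = np.zeros(3)
good = condition_number(corner_jacobian(p0, np.array([1.0, 0, 0]), np.array([0, 1.0, 0]), np.array([0, 0, 1.0])))
bad = condition_number(corner_jacobian(p0, np.array([1.0, 0, 0]), np.array([0.9, 0.1, 0]), np.array([0, 0, 1.0])))
print(good, bad)

A smoother of this kind would sum such corner values over all elements and move interior nodes to reduce the total, combined with untangling when inverted elements occur.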

2.
Numerical treatment for a fractional differential equation (FDE) is proposed and analysed. The solution of the FDE may be singular near certain domain boundaries, which leads to numerical difficulty. We apply the upwind finite difference method to the FDE. The stability properties and a posteriori error analysis for the discrete scheme are given. Then, an a posteriori adapted mesh based on the a posteriori error analysis is constructed by equidistributing an arc-length monitor function. Numerical experiments illustrate that the upwind finite difference method on the a posteriori adapted mesh is more accurate than the method on a uniform mesh.
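The adapted mesh is obtained by equidistributing an arc-length monitor function; the following is a minimal NumPy sketch of that step only, assuming a known discrete solution u on a current mesh (the boundary-layer profile and mesh sizes below are illustrative, not the paper's test problem).

import numpy as np

def adapt_mesh(x, u, n):
    """Return n+1 mesh points equidistributing the arc-length monitor
    M = sqrt(1 + (du/dx)^2) computed on the current mesh x."""
    du = np.gradient(u, x)
    M = np.sqrt(1.0 + du**2)
    # cumulative trapezoidal integral of M, normalised to [0, 1]
    s = np.concatenate(([0.0], np.cumsum(0.5 * (M[1:] + M[:-1]) * np.diff(x))))
    s /= s[-1]
    # invert: place new points at equal increments of the monitor integral
    return np.interp(np.linspace(0.0, 1.0, n + 1), s, x)

# toy solution with a boundary layer near x = 0
x0 = np.linspace(0.0, 1.0, 401)
u0 = 1.0 - np.exp(-x0 / 0.01)
x_adapted = adapt_mesh(x0, u0, 32)   # points cluster near the layer at x = 0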

3.
BMSweep is a new algorithm to determine the location of interior nodes while generating hexahedral meshes using the volume sweeping method. Volume sweeping is performed on two-and-one-half-dimensional volumes by identifying a ‘source’ surface which is meshed with quadrilaterals. These quadrilaterals are then swept through the volume towards a ‘target’ surface, generating layers of hexahedra along the way. BMSweep uses background mesh interpolation to locate interior nodes during sweeping. The interpolation method provides for quality element creation while allowing the volume boundary to vary. The cross-section of the volume can vary along the length of the sweep, the sweep path need not be linear, and the source and target areas need not be flat. Three-dimensional volumes can be swept using BMSweep after being decomposed into two-and-one-half-dimensional subvolumes.
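As a toy illustration of placing interior nodes layer by layer during a sweep, the sketch below linearly blends matching source and target surface nodes; BMSweep's background-mesh interpolation is more general (curved sweep paths, varying cross-sections), so this linear blend is only a simplified stand-in under the assumption of a one-to-one source/target correspondence.

import numpy as np

def place_layers(source_nodes, target_nodes, n_layers):
    """Interior node positions for each sweep layer, as a linear blend of the
    matching source and target surface nodes (simplified stand-in for
    background-mesh interpolation)."""
    layers = []
    for k in range(1, n_layers):          # k = 0 is the source, k = n_layers the target
        t = k / n_layers
        layers.append((1.0 - t) * source_nodes + t * target_nodes)
    return layers

source = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
target = np.array([[0.1, 0.2, 3.0], [1.1, 0.2, 3.0]])
interior = place_layers(source, target, n_layers=4)   # three interior layers of nodes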

4.
A. Sgarro, Calcolo, 1978, 15(1): 41–49
Summary: The informational divergence between stochastic matrices is not a metric. In this paper we show that consistent definitions can nevertheless be given of ‘spheres’, ‘segments’ and ‘straight lines’ using the divergence as a sort of ‘distance’ between stochastic matrices. The geometric nature of many ‘reliability functions’ of Information Theory and Mathematical Statistics is thus clarified. This work has been done within the GNIM-CNR research activity.
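The divergence in question is the Kullback–Leibler (informational) divergence; a small numerical check of the non-metric behaviour (it is not symmetric), applied row-wise to two stochastic matrices under the simplifying assumption of uniform row weights, which may differ from the paper's exact definition.

import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for probability vectors p, q."""
    return float(np.sum(p * np.log(p / q)))

def divergence(P, Q):
    """Row-wise informational divergence between stochastic matrices (uniform row weights assumed)."""
    return np.mean([kl(p, q) for p, q in zip(P, Q)])

P = np.array([[0.9, 0.1], [0.5, 0.5]])
Q = np.array([[0.6, 0.4], [0.2, 0.8]])
print(divergence(P, Q), divergence(Q, P))   # the two values differ: no symmetry, hence no metric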

5.
Over the years, there have been a number of practical studies with working definitions of ‘mesh’ as related to computational simulation; however, there are only a few theoretical papers with formal definitions of mesh. Algebraic topology papers are available that define tetrahedral meshes in terms of simplices. Algebraic topology and polytope theory have also been utilized to define hexahedral meshes. Additional literature is also available describing particular properties of the dual of a mesh. In this paper, we give several formal definitions in relation to hexahedral meshes and the dual of hexahedral meshes. Our main goal is to provide useful, understandable and minimal definitions specifically for computer scientists or mathematicians working in hexahedral meshing. We also extend these definitions to some useful classifications of hexahedral meshes, including definitions for ‘fundamental’ hexahedral meshes and ‘minimal’ hexahedral meshes.

6.
The Knuth–Bendix ordering (KBO) is one of the term orderings in widespread use. We present a new algorithm to compute KBO, which is (to our knowledge) the first asymptotically optimal one. Starting with an ‘obviously correct’ version, we use program transformation to stepwise develop an efficient version, making clear the essential ideas, while retaining correctness. By theoretical analysis we show that the worst-case behavior is thereby changed from quadratic to linear. Measurements show the practical improvements of the different variants.
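For orientation, a hedged Python sketch of an ‘obviously correct’ (and deliberately non-optimal, since weights and variable counts are recomputed at every recursive call) KBO comparison; the weights and precedence are illustrative, and the sketch assumes every function symbol has positive weight so KBO's special unary-symbol case never applies.

from collections import Counter

# A term is either a variable (a plain string like "x") or a tuple
# (function_symbol, arg1, arg2, ...).  WEIGHT and PREC are illustrative.
WEIGHT = {"f": 1, "g": 1, "a": 1, "b": 1}
VAR_WEIGHT = 1
PREC = {"a": 0, "b": 1, "g": 2, "f": 3}        # larger value = larger in the precedence

def is_var(t):
    return isinstance(t, str)

def weight(t):
    """Naive term weight, recomputed on every call (hence quadratic overall)."""
    if is_var(t):
        return VAR_WEIGHT
    return WEIGHT[t[0]] + sum(weight(s) for s in t[1:])

def var_count(t):
    """Multiset of variable occurrences in t."""
    if is_var(t):
        return Counter([t])
    c = Counter()
    for s in t[1:]:
        c += var_count(s)
    return c

def kbo_greater(s, t):
    """s >_KBO t, written in the 'obviously correct' style."""
    vs, vt = var_count(s), var_count(t)
    if any(vs[x] < n for x, n in vt.items()):
        return False                           # variable condition fails
    ws, wt = weight(s), weight(t)
    if ws != wt:
        return ws > wt
    if is_var(s) or is_var(t):
        return False                           # equal weight with a variable involved
    if s[0] != t[0]:
        return PREC[s[0]] > PREC[t[0]]
    for a, b in zip(s[1:], t[1:]):             # same head: first differing argument decides
        if a != b:
            return kbo_greater(a, b)
    return False

# f(g(x), b) vs f(x, a): the variable condition holds and the left term is heavier.
print(kbo_greater(("f", ("g", "x"), "b"), ("f", "x", "a")))   # True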

7.
This article offers a research update on a 3-year programme initiated by the Kamloops Art Gallery and the University College of the Cariboo in Kamloops, British Columbia. The programme is supported by a ‘Community–University Research Alliance’ grant from the Social Sciences and Humanities Research Council of Canada, and the collaboration focuses on the cultural future of small cities – on how cultural and arts organisations work together (or fail to work together) in a small city setting. If not by definition, then certainly by default, ‘culture’ is associated with big city life: big cities are commonly equated with ‘big culture’; small cities with something less. The Cultural Future of Small Cities research group seeks to provide a more nuanced view of what constitutes culture in a small Canadian city. In particular, the researchers are exploring notions of social capital and community asset building: in this context, ‘visual and verbal representation’, ‘home’, ‘community’ and the need to define a local ‘sense of place’ have emerged as important themes. As the Small Cities programme begins its second year, a unique but key aspect has emerged: the artist-as-researcher. Correspondence and offprint requests to: L. Dubinsky, Kamloops Art Gallery, 101–465 Victoria Street, Kamloops, BC V2C 2A9 Canada. Tel.: 250-828-3543; Email: ldubinsky@museums.ca

8.
When the Extended Kalman Filter (EKF) is used to solve the SLAM problem for a nonlinear system, the linearization error can lead to severe estimation error or even make the method diverge. After analyzing the linearization principle of the Kalman filter family, two improved methods are suggested to decrease the linearization error. These two methods improve posterior estimation accuracy by revising the observation-update step. Simulation results indicate that the two methods are feasible. The method named ‘Mean Extended Kalman Filter’ (MEKF) performs much better than the EKF and UKF for nonlinear SLAM, and even the iterated versions of the EKF and UKF fall behind MEKF in estimation accuracy. In addition, MEKF is computationally efficient. With a view to both estimation accuracy and computational complexity, MEKF seems to be the best filter of the Kalman filter family for nonlinear SLAM. Experiments are carried out with the ‘Car Park Dataset’ and ‘Victoria Park Dataset’ to evaluate the performance of MEKF-based SLAM solutions, and the experimental results validate the effectiveness of MEKF in real SLAM applications.
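For reference, a minimal NumPy sketch of the standard EKF observation-update step, which is the step the two proposed methods revise; the measurement model below is a placeholder, not the paper's SLAM formulation, and MEKF's specific correction is not reproduced here.

import numpy as np

def ekf_update(x_pred, P_pred, z, h, H_jac, R):
    """Textbook EKF observation update: linearise h at the predicted state x_pred.
    The paper's improvements modify how this correction is computed; this is
    only the baseline."""
    H = H_jac(x_pred)                          # measurement Jacobian at the prior mean
    S = H @ P_pred @ H.T + R                   # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))       # corrected state
    P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_new, P_new

# toy range measurement of a 2-D position (placeholder model)
h = lambda x: np.array([np.hypot(x[0], x[1])])
H_jac = lambda x: np.array([[x[0], x[1]]]) / np.hypot(x[0], x[1])
x, P = ekf_update(np.array([3.0, 4.0]), np.eye(2), np.array([5.2]), h, H_jac, np.eye(1) * 0.1)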

9.
Tackling data with gap-interval time is an important issue faced by the temporal database community. While a number of interval logics have been developed, less work has been reported on gap-interval time. To represent and handle data with time, a ‘when’ clause is generally added to each conventional operator so as to incorporate the time dimension in temporal databases; this ‘when’ clause is really a temporal logical sentence. Unfortunately, though several temporal database models have dealt with data with gap-interval time, they still apply interval calculus methods to gap-intervals. Certainly, it is inadequate to tackle data with gap-interval time using interval calculus methods in historical databases. Consequently, what temporal expressions are valid in the ‘when’ clause for tackling data with gap-interval time? Further, what temporal operations and relations can be used in the ‘when’ clause? To solve these problems, a formal tool for supporting data with gap-interval time must be explored. For this reason, a gap-interval-based logic for historical databases is established in this paper. In particular, we discuss how to determine the temporal relationships after an event explodes; this can be used to describe the temporal forms of tuple splitting in historical databases. Received 2 February 1999 / Revised 9 May 1999 / Accepted in revised form 20 November 1999

10.
LES of reacting flows is rapidly becoming mature and providing levels of precision which cannot be reached with any RANS (Reynolds-Averaged) technique. In addition to the multiple subgrid-scale models required for such LES and to the questions raised by the required numerical accuracy of LES solvers, various issues related to the reliability, mesh independence and repeatability of LES must still be addressed, especially when LES is used on massively parallel machines. This talk discusses some of these issues: (1) the existence of non-physical waves (known as ‘wiggles’ by most LES practitioners) in LES, (2) the effects of mesh size on LES of reacting flows, (3) the growth of rounding errors in LES on massively parallel machines and, more generally, (4) the ability to qualify an LES code as ‘bug free’ and ‘accurate’. Examples range from academic cases (minimum non-reacting turbulent channel) to applied configurations (a sector of a helicopter combustion chamber).
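Issue (1) can be illustrated in one dimension, far away from any LES solver: a non-dissipative centred difference generates spurious new extrema (‘wiggles’) near a sharp front, while a first-order upwind difference stays monotone. A small NumPy sketch under assumed grid and CFL settings:

import numpy as np

n, c = 200, 0.4                                   # grid points, CFL number (assumed values)
u = np.where(np.arange(n) < n // 2, 1.0, 0.0)     # sharp front, values in [0, 1]
uc, uu = u.copy(), u.copy()
for _ in range(20):
    # centred difference with forward Euler: non-dissipative (and unstable), so
    # spurious new extrema appear near the front -- the 'wiggles'
    uc[1:-1] -= 0.5 * c * (uc[2:] - uc[:-2])
    # first-order upwind: dissipative and monotone, no new extrema
    uu[1:-1] -= c * (uu[1:-1] - uu[:-2])
print(uc.max(), uc.min())   # overshoots above 1 and oscillates: non-physical wiggles
print(uu.max(), uu.min())   # bounded by the initial data: no wiggles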

11.
BAN logic, an epistemic logic for analyzing security protocols, contains an unjustifiable inference rule. The inference rule assumes that possession of H(X) (i.e., the cryptographic hash value of X) counts as a proof of possession of X, which is not the case. As a result, BAN logic exhibits a problematic property, which is similar to unsoundness, but not strictly equivalent to it. We will call this property ‘unsoundness’ (with quotes). The property is demonstrated using a specially crafted protocol, the two parrots protocol. The ‘unsoundness’ is proven using the partial semantics which is given for BAN logic. Because of the questionable character of the semantics of BAN logic, we also provide an alternative proof of ‘unsoundness’ which we consider more important.
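The flaw can be illustrated in a few lines: a party holding only H(X) cannot recover X; at best it can verify a candidate value, which is strictly weaker than possessing X. A trivial Python sketch with an arbitrary secret (nothing here comes from the paper's protocol):

import hashlib

secret = b"X: the message the principal is claimed to possess"
digest = hashlib.sha256(secret).hexdigest()

# A party that only holds `digest` cannot recover `secret` from it; it can merely
# verify a candidate value, which is weaker than possessing X itself.
def can_verify(candidate: bytes) -> bool:
    return hashlib.sha256(candidate).hexdigest() == digest

print(can_verify(b"a guess"), can_verify(secret))   # False True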

12.
We consider the problem of locating a watermark in pages of archaic documents that have been both scanned and back-lit: the problem is of interest to codicologists in identifying and tracking paper materials. Commonly, documents of interest are worn or damaged, and all information is victim to very unfavourable signal-to-noise ratios; this is especially true of ‘hidden’ data such as watermarks and chain lines. We present an approach to recto removal, followed by highlighting of such ‘hidden’ data. The result is still of very low signal quality, and we also present a statistical approach to locate watermarks from a known lexicon of fragments. Results are presented from a comprehensively scanned nineteenth-century copy of the Qur’ān. The approach has lent itself to immediate exploitation in improving known watermarks and distinguishing between twin copies. Mr Hiary was supported by the University of Jordan in pursuing this work.

13.
Benchmarking quality measurement
This paper gives a simple benchmarking procedure for companies wishing to develop measures for software quality attributes of software artefacts. The procedure does not require that a proposed measure is a consistent measure of a quality attribute. It requires only that the measure shows agreement most of the time. The procedure provides summary statistics for measures of quality attributes of a software artefact. These statistics can be used to benchmark subjective direct measurement of a quality attribute by a company’s software developers. Each proposed measure is expressed as a set of error rates for measurement on an ordinal scale and these error rates enable simple benchmarking statistics to be derived. The statistics can also be derived for any proposed objective indirect measure or prediction system for the quality attribute. For an objective measure or prediction system to be of value to the company it must be ‘better’ or ‘more objective’ than the organisation’s current measurement or prediction capability; and thus confidence that the benchmark’s objectivity has been surpassed must be demonstrated. By using Bayesian statistical inference, the paper shows how to decide whether a new measure should be considered ‘more objective’ or whether a prediction system’s predictive capability can be considered ‘better’ than the current benchmark. Furthermore, the Bayesian inferential approach is easy to use and provides clear advantages for quantifying and inferring differences in objectivity.
Correspondence: John Moses
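The paper's Bayesian machinery is not reproduced here, but the flavour of the comparison can be sketched with a simple Beta–Binomial model: the posterior probability that a proposed measure's agreement rate exceeds the current benchmark's. The prior, the counts and the model choice below are assumptions for illustration only.

import numpy as np

rng = np.random.default_rng(0)

def posterior_agreement(agree, total, a=1.0, b=1.0, draws=100_000):
    """Posterior samples of an agreement rate under a Beta(a, b) prior and a
    Binomial likelihood (a simplification of ordinal-scale error rates)."""
    return rng.beta(a + agree, b + total - agree, size=draws)

benchmark = posterior_agreement(agree=72, total=100)    # current subjective benchmark (illustrative counts)
proposed = posterior_agreement(agree=83, total=100)     # candidate objective measure (illustrative counts)
prob_better = float(np.mean(proposed > benchmark))
print(prob_better)   # posterior probability the proposed measure agrees more often than the benchmark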

14.
For a variety of reasons, the relative impact of neural-net inputs on the output of a network’s computation is valuable information to obtain. In particular, it is desirable to identify the significant features, or inputs, of a data-defined problem before the data is sufficiently preprocessed to enable high-performance neural-net training. We have defined and tested a technique for assessing such input impacts, which will be compared with a method described in a paper published earlier in this journal. The new approach, known as the ‘clamping’ technique, offers efficient impact assessment of the input features of the problem. Results of the clamping technique prove to be robust under a variety of different network configurations. Differences in architecture, training parameter values and subsets of the data all deliver much the same impact rankings, which supports the notion that the technique ranks an inherent property of the available data rather than a property of any particular feedforward neural network. The success, stability and efficiency of the clamping technique are shown to hold for a number of different real-world problems. In addition, we subject the previously published technique, which we will call the ‘weight product’ technique, to the same tests in order to provide directly comparable information.
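A hedged sketch of the general idea behind such a clamping assessment, with a tiny fixed network and random data standing in for a trained model: clamp one input at a time to its mean value and rank features by how much the output moves. The details of the published technique may differ.

import numpy as np

rng = np.random.default_rng(1)

def net(X, W1, W2):
    """A small fixed feedforward net standing in for any trained model."""
    return np.tanh(X @ W1) @ W2

def clamping_impacts(X, predict):
    """Impact of each input feature: mean absolute change in the output when
    that feature is clamped to its mean value across the data set."""
    base = predict(X)
    impacts = []
    for j in range(X.shape[1]):
        Xc = X.copy()
        Xc[:, j] = X[:, j].mean()          # 'clamp' feature j
        impacts.append(np.mean(np.abs(predict(Xc) - base)))
    return np.array(impacts)

X = rng.normal(size=(500, 4))
W1, W2 = rng.normal(size=(4, 8)), rng.normal(size=(8, 1))
print(clamping_impacts(X, lambda X: net(X, W1, W2)))   # larger value = more influential input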

15.
16.
The notion of a graph type is introduced by a collection of axioms. A graph of a given type (a typed graph) is defined as a set of edges whose structure is specified by the type. From this, general notions of subgraph and isomorphism of typed graphs are derived. A Cantor-Bernstein (CB) result for typed graphs is presented as an illustration of a general proof for different types of graphs. By definition, a relation ⊑ on typed graphs satisfies the CB property if A ⊑ B and B ⊑ A imply that A and B are isomorphic. In general, the relation ‘isomorphic to a subgraph’ does not satisfy the CB property. However, by requiring the subgraph to be disconnected from the remainder of the graph, a relation that does satisfy the CB property is obtained. A similar result is shown for typed graphs with multiple edges. Received: 25 October 1996 / 5 February 1998

17.
This paper develops a semantics with control over scope relations, using Vermeulen’s stack-valued assignments as information states. This makes available a limited form of scope reuse and name switching. The goal is to have a general system that fixes the available scoping effects to those that are characteristic of natural language. The resulting system is called Scope Control Theory, since it provides a theory about what scope has to be like in natural language. The theory is shown to replicate a wide range of grammatical dependencies, including options for, and constraints on, ‘donkey’, ‘binding’, ‘movement’, ‘Control’ and ‘scope marking’ dependencies.

18.
The need for information technology-mediated cooperation seems obvious. However, what is not obvious is what this means and what social demands such cooperation may imply. To explore this is the intention of the paper. As a first step the paper performs an etymological analysis of the words telecooperation and telecoordination. Such an analysis indicates that cooperation happens when people engage in the production of a work as if ‘one mind or body’, where their activities fuse together in a way that makes the suggestion of separation seem incomprehensible. In the work they do not merely aim to achieve an outcome, they also ‘insert’ themselves ‘in’ the work in a way that makes it a human achievement rather than a mere product – this is cooperation as working-together. With this notion of cooperation in mind the paper then proceeds to analyse the social conditions for cooperation as working-together. It shows, using the work of Wittgenstein, that language is fundamental to cooperation and the sharing of knowledge – not language as a system for the exchange of information but language as a medium for the co-creation of a local way of doing, a local language, to capture the local distinctions that make a particular local activity significant and meaningful to the participants. The paper then proceeds to question this strong notion of cooperation. It argues that most cooperative activities tend not to conform with such stringent demands. The paper suggests that a cooperative problem is best viewed as a situation in which ambiguity is accepted as a structural element of the interaction. From this perspective the paper suggests that hermeneutics may be a productive way to understand the creation of shared interpretative spaces that makes mediated cooperation possible. The paper concludes with some implications for mediated cooperative work.

19.
This paper focuses on orchestration work in the first iteration of a mobile game called Day Of The Figurines, which explores the potential to exploit text messaging as a means of creating an engaging gaming experience. By focusing on orchestration we are especially concerned with the ‘cooperative work that makes the game work’. While the assemblage or family of orchestration practices uncovered by our ethnographic study is specific to the game – including the ways in which behind-the-scenes staff make sense of messages, craft appropriate responses, and manage and track the production of gameplay narratives as the game unfolds – orchestration work is of general significance to our understanding of new gaming experiences. The focus on orchestration work reveals that behind-the-scenes staff are co-producers of the game and that the playing of games is, therefore, inseparably intertwined with their orchestration. Furthermore, orchestration work is ‘ordinary’ work that relies upon the taken-for-granted skills and competences of behind-the-scenes staff; ‘operators’ and ‘authors’ in this case. While we remain focused on the specifics of this game, explication of the ordinary work of orchestration highlights challenges and opportunities for the continued development of gaming experiences more generally. Indeed, understanding the specificities of orchestration work might be said to be a key ingredient of future development.

20.
An algorithm for the construction of the medial axis of a three-dimensional body given by a triangulation of its bounding surface is described. The indirect construction is based on the Delaunay triangulation of a set of sample points on the bounding surface. The point set is refined automatically so as to capture the correct topology of the medial axis. The computed medial axis (or, better, medial surface) is then used for hex-dominant mesh generation. Quad-dominant meshes are generated on the medial subfaces first and extruded to the boundary of the body at both sides. The resulting single cell layer is subdivided in the direction normal to the boundary, yielding columns of hexahedral and three-sided prismatic cells. The resulting volume mesh is orthogonal at the boundary and ‘semi-structured’ between the boundary and the medial surface. Mixed cell types (tets, pyramids, degenerate hexahedra) may result along the medial surface. An advancing front algorithm (paving) is used for meshing the subfaces of the medial surface. Development of the mesh generator has not yet been fully completed with respect to degenerate parts of the medial axis; first medium-complexity bodies have been meshed, however, with moderate meshing times.
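A much-simplified two-dimensional analogue of the indirect construction (the paper works in three dimensions with surface triangulations): sample the boundary, build the Voronoi diagram of the samples with SciPy, and keep the Voronoi vertices that fall inside the body as an approximation of the medial axis. The elliptical body and sampling density are illustrative assumptions.

import numpy as np
from scipy.spatial import Voronoi

# boundary samples of an ellipse (the 'body'); denser sampling gives a better medial axis
t = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
a, b = 2.0, 1.0
boundary = np.column_stack((a * np.cos(t), b * np.sin(t)))

vor = Voronoi(boundary)
inside = (vor.vertices[:, 0] / a) ** 2 + (vor.vertices[:, 1] / b) ** 2 < 1.0
medial_axis_pts = vor.vertices[inside]       # interior Voronoi vertices approximate the medial axis
print(len(medial_axis_pts))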
