Similar Documents
20 similar documents were retrieved.
1.
2.
We develop a general formalism for computing high quality, low-cost solutions to nonconvex combinatorial optimization problems expressible as distributed interacting local constraints. For problems of this type, generalized deterministic annealing (GDA) avoids the performance-related sacrifices of current techniques. GDA exploits the localized structure of such problems by assigning K-state neurons to each optimization variable. The neuron values correspond to the probability densities of K-state local Markov chains and may be updated serially or in parallel; the Markov model is derived from the Markov model of simulated annealing (SA), although it is greatly simplified. Theorems are presented that firmly establish the convergence properties of GDA, as well as supplying practical guidelines for selecting the initial and final temperatures in the annealing process. A benchmark image enhancement application is provided where the performance of GDA is compared to other optimization methods. The empirical data taken in conjunction with the formal analytical results suggest that GDA enjoys significant performance advantages relative to current methods for combinatorial optimization.
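For intuition, here is a minimal mean-field-style sketch of the kind of K-state neuron update the abstract describes: each variable holds a probability vector over K states that is relaxed under a Gibbs distribution of assumed local energies while the temperature is lowered. The function name, the local_energy callback, and the schedule parameters are illustrative assumptions; the actual GDA update is derived from the Markov model of simulated annealing and differs in detail.

```python
import numpy as np

def anneal_labels(local_energy, n_vars, K,
                  t_init=10.0, t_final=0.05, cooling=0.9, sweeps=10):
    """Mean-field / deterministic-annealing-style relaxation (illustrative only).

    Each variable i keeps a probability vector p[i] over its K states (the
    "K-state neuron").  local_energy(i, k, p) is assumed to return the expected
    local cost of putting variable i in state k given the current neuron values
    of its neighbours; it encodes the distributed local constraints.
    """
    p = np.full((n_vars, K), 1.0 / K)           # start from uniform neurons
    T = t_init
    while T > t_final:
        for _ in range(sweeps):
            for i in range(n_vars):             # serial update; parallel also possible
                e = np.array([local_energy(i, k, p) for k in range(K)])
                w = np.exp(-(e - e.min()) / T)  # Gibbs weights at temperature T
                p[i] = w / w.sum()
        T *= cooling                            # annealing schedule
    return p.argmax(axis=1)                     # hard assignment after cooling
```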

3.
This paper describes an implementation of the deterministic parsing system described in Nordgård (1993). The syntax of heuristic rules, and how the rules interact with the basic operations of the parser, constitute the bulk of the article. The implementation is written in Medley Interlisp, and the system can be run on Sun or Xerox workstations. Torbjørn Nordgård is an associate professor of computational linguistics in the Department of Linguistics and Phonetics, University of Bergen. His research interests include computational linguistics, theoretical linguistics (in particular formal syntax and semantics), and philosophy of science. He has published A GB-Related Parser for Norwegian (Bern: Peter Lang) and is one of the authors of Generativ syntaks: Ei innføring via norsk (Oslo: Novus).

4.
5.
Schmeiser and Barnard described a method for producing the left parse at the end of the bottom-up parsing process. We improve their method in the sense that the left parse is actually produced during the bottom-up parsing process (i.e., with considerably less delay).
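As a toy illustration of the relationship being exploited, the sketch below assumes a parse tree built from the reduce steps of a bottom-up parse (the reduces arrive in reverse rightmost-derivation order) and reads off the left parse as the preorder listing of rule numbers. This reconstructs the left parse only after parsing finishes, i.e. the baseline situation the abstract improves on; the node format and grammar are hypothetical.

```python
# A bottom-up (shift-reduce) parse delivers its productions in reverse
# rightmost-derivation order; the left parse is the same productions read
# in preorder over the parse tree.  Hypothetical node format: (rule, children).

def left_parse(node):
    """Preorder listing of rule numbers = the left parse."""
    out = [node[0]]
    for child in node[1]:
        if isinstance(child, tuple):          # nonterminal child
            out.extend(left_parse(child))
    return out

# Grammar: 1) S -> A S   2) S -> A   3) A -> 'a'.  For input "a a" the reduces
# arrive bottom-up as 3, 3, 2, 1; rebuilding the tree and walking it in
# preorder yields the left parse 1, 3, 2, 3.
tree = (1, [(3, ['a']), (2, [(3, ['a'])])])
print(left_parse(tree))                       # -> [1, 3, 2, 3]
```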

6.
Several deterministic identification problems, with partial realization being a special case, are unified in the framework of the mathematical problem "generalized dynamic covers." An algorithm to find such a minimal dynamic cover as well as a uniqueness criterion is given, which yields several identifiability results. This problem also includes the "observer" and the "exact model matching" problems, as well as the problem of finding "minimal inverses for linear systems with arbitrary initial states." It is shown that the problem of finding minimal realizations from a transfer function matrix can also be solved in this framework.

7.
We present a new algorithm for the on-line construction of position trees. Reading the input string from left to right, we generate its position tree with the aid of the general concept of infix trees. An additional chain structure within the trees, called the tail node connection, enables us to construct the tree within the best possible time (proportional to the number of nodes).
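For reference, the brute-force construction below builds a position tree the slow way: for every position it finds the shortest substring starting there that occurs nowhere else, and inserts it into a trie of nested dicts. This is an assumed O(n²) baseline for illustration only; the on-line, tail-node-connection algorithm of the paper achieves time proportional to the number of nodes.

```python
def position_tree(s):
    """Naive position-tree construction (a unique terminator '$' is appended).

    For each position i the tree stores the shortest substring starting at i
    that occurs at no other position (its position identifier).  Brute-force
    reference version, not the paper's linear on-line algorithm.
    """
    s = s + '$'
    n = len(s)
    root = {}
    for i in range(n):
        # shortest prefix of the suffix s[i:] that is unique to position i
        k = 1
        while any(j != i and s[j:j + k] == s[i:i + k] for j in range(n)):
            k += 1
        node = root
        for ch in s[i:i + k]:
            node = node.setdefault(ch, {})
        node['pos'] = i                       # leaf labelled with its position
    return root
```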

8.
9.
Two simple but powerful tools are introduced in LR theory, viz. item grammars and parsing automata. Parsing automata are used to define a large class of correct-prefix parsers operating in linear time; DeRemer's and Pager's parsing methods turn out to be special cases. LL and LC tests, as well as inclusion theorems for the classes of LL, LC, and LR grammars, are also based on parsing automata.

10.
11.
This paper proposes an efficient pre-processing technique for shaded faces that exploits frontal-face symmetry. The standard PCA approach to face recognition suffers from illumination variation, which lowers recognition performance on shaded faces. The study aims to improve performance by using the symmetry between the left and right halves of the face. To evaluate the proposed method, experiments were run on the Yale face database with left/right shadows, comparing standard PCA, PCA with the first three eigenfaces excluded, histogram equalization, and the proposed method. The results show that the proposed method achieves excellent recognition performance (98.9%).
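A rough sketch of the pre-processing idea, under the assumption of a roughly centred frontal face: if one half of the image is much darker than the other, rebuild it as the mirror image of the brighter half before the eigenface/PCA step. The threshold and the exact mirroring rule are illustrative guesses, not the paper's procedure.

```python
import numpy as np

def symmetrize_shaded_face(img, ratio=0.6):
    """Mirror the better-lit half of a roughly centred frontal face image.

    If one half is much darker than the other (strong side illumination),
    rebuild it from the mirror image of the brighter half before running the
    PCA / eigenface step.  Threshold and centring assumption are illustrative.
    """
    _, w = img.shape
    mid = w // 2
    left, right = img[:, :mid], img[:, w - mid:]
    out = img.astype(np.float64).copy()
    if left.mean() < ratio * right.mean():        # left half heavily shaded
        out[:, :mid] = right[:, ::-1]             # mirror of the right half
    elif right.mean() < ratio * left.mean():      # right half heavily shaded
        out[:, w - mid:] = left[:, ::-1]
    return out                                    # input to the recognition stage
```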

12.
Robustness, the ability to analyze any input regardless of its grammaticality, is a desirable property for any system dealing with unrestricted natural language text. Error-repair parsing approaches achieve robustness by considering ungrammatical sentences as corrupted versions of valid sentences. In this article we present a deductive formalism, based on Sikkel’s parsing schemata, that can be used to define and relate error-repair parsers and study their formal properties, such as correctness. This formalism allows us to define a general transformation technique to automatically obtain robust, error-repair parsers from standard non-robust parsers. If our method is applied to a correct parsing schema verifying certain conditions, the resulting error-repair parsing schema is guaranteed to be correct. The required conditions are weak enough to be fulfilled by a wide variety of popular parsers used in natural language processing, such as CYK, Earley and Left-Corner.

13.
This paper presents a parsing technique with certain advantages for a compiler system. A review and a comparison with other techniques are given.

14.
We present a new algorithm to construct a (generalized) deterministic Rabin automaton for an LTL formula \(\varphi \). The automaton is the product of a co-Büchi automaton for \(\varphi \) and an array of Rabin automata, one for each \({\mathbf {G}}\)-subformula of \(\varphi \). The Rabin automaton for \({\mathbf {G}}\psi \) is in charge of recognizing whether \({\mathbf {F}}{\mathbf {G}}\psi \) holds. This information is passed to the co-Büchi automaton that decides on acceptance. As opposed to standard procedures based on Safra’s determinization, the states of all our automata have a clear logical structure, which allows for various optimizations. Experimental results show improvement in the sizes of the resulting automata compared to existing methods.

15.
A language implementation with proper compositionality enables a compiler developer to divide-and-conquer the complexity of building a large language by constructing a set of smaller languages. Ideally, these small language implementations should be independent of each other such that they can be designed, implemented and debugged individually, and later be reused in different applications (e.g., building domain-specific languages). However, the language composition offered by several existing parser generators resides at the grammar level, which means all the grammar modules need to be composed together and all corresponding ambiguities have to be resolved before generating a single parser for the language. This produces tight coupling between grammar modules, which harms information hiding and affects independent development of language features. To address this problem, we have developed a novel parsing algorithm that we call Component-based LR (CLR) parsing, which provides code-level compositionality for language development by producing a separate parser for each grammar component. In addition to shift and reduce actions, the algorithm extends general LR parsing by introducing switch and return actions to empower the parsing action to jump from one parser to another. Our experimental evaluation demonstrates that CLR increases the comprehensibility, reusability, changeability and independent development ability of the language implementation. Moreover, the loose coupling among parser components enables CLR to describe grammars that contain LR parsing conflicts or require ambiguous token definitions, such as island grammars and embedded languages.
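The control flow that the new switch and return actions add can be pictured with the toy sketch below: a host parser hands the token stream to a separately written embedded-language parser when it sees a marker token, and resumes where that component returns. The hand-written functions and marker tokens are hypothetical stand-ins for generated LR parser components; the actual CLR algorithm realises this jump inside the LR automaton itself.

```python
# Sketch of the "switch" / "return" control flow between separately produced
# parser components (host language plus an embedded island language).

def parse_host(tokens):
    ast, i = [], 0
    while i < len(tokens):
        if tokens[i] == "SQL_BEGIN":                      # switch: hand control
            node, i = parse_embedded_sql(tokens, i + 1)   # to the embedded component
            ast.append(node)                              # ...and resume here on return
        else:
            ast.append(("stmt", tokens[i]))
            i += 1
    return ("host", ast)

def parse_embedded_sql(tokens, i):
    body = []
    while tokens[i] != "SQL_END":
        body.append(tokens[i])
        i += 1
    return ("sql", body), i + 1                           # return action: back to caller

print(parse_host(["x", "SQL_BEGIN", "SELECT", "1", "SQL_END", "y"]))
# -> ('host', [('stmt', 'x'), ('sql', ['SELECT', '1']), ('stmt', 'y')])
```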

16.
Efficient error-correcting Viterbi parsing
The problem of error-correcting parsing (ECP) using an insertion-deletion-substitution error model and a finite state machine is examined. The Viterbi algorithm can be straightforwardly extended to perform ECP, though the resulting computational complexity can become prohibitive for many applications. We propose three approaches to achieve an efficient implementation of Viterbi-like ECP that are compatible with beam-search acceleration techniques. Language processing and shape recognition experiments which assess the performance of the proposed algorithms are presented.
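A minimal sketch of the underlying dynamic program, assuming a finite-state machine given as a transition table: it computes the cheapest way to account for the input with match, substitution, insertion and deletion edits, i.e. the straightforward (and potentially expensive) Viterbi-style extension the abstract starts from. The beam-search-compatible speed-ups that are the paper's contribution are not reproduced; names and cost parameters are illustrative.

```python
import math

def viterbi_ecp(fsm, start, finals, x, sub_cost=1.0, ins_cost=1.0, del_cost=1.0):
    """Minimum-cost error-correcting match of string x against a finite-state machine.

    fsm: dict mapping state -> list of (symbol, next_state) transitions.
    Straightforward Viterbi/dynamic-programming extension with insertion,
    deletion and substitution edits; no beam search or other acceleration.
    """
    INF = math.inf
    states = list(fsm)
    # D[j][q] = best cost of accounting for x[:j] and ending in state q
    D = [{q: INF for q in states} for _ in range(len(x) + 1)]
    D[0][start] = 0.0

    def relax_deletions(col):
        # a "deletion" takes a model transition without consuming input,
        # so it is relaxed within one column until no cost improves
        changed = True
        while changed:
            changed = False
            for q in states:
                for _sym, r in fsm[q]:
                    if col[q] + del_cost < col[r]:
                        col[r] = col[q] + del_cost
                        changed = True

    relax_deletions(D[0])
    for j in range(1, len(x) + 1):
        a = x[j - 1]
        for q in states:
            base = D[j - 1][q]
            if base == INF:
                continue
            if base + ins_cost < D[j][q]:                   # insertion: spurious input symbol
                D[j][q] = base + ins_cost
            for sym, r in fsm[q]:
                c = base + (0.0 if sym == a else sub_cost)  # match or substitution
                if c < D[j][r]:
                    D[j][r] = c
        relax_deletions(D[j])
    return min(D[len(x)][q] for q in finals)

# Example: machine accepting "abc"; the input "abxc" needs one edit
fsm = {0: [('a', 1)], 1: [('b', 2)], 2: [('c', 3)], 3: []}
print(viterbi_ecp(fsm, start=0, finals=[3], x="abxc"))      # -> 1.0
```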

17.
This entry describes in detail how to parse ontologies with Jena2, covering how to load an ontology model from an OWL file; how to extract classes, subclasses, instances, object properties, and datatype properties; and how to find the restriction values associated with a given object property.
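The abstract concerns the Java Jena2 API; as a loose Python analogue of the same extraction steps (not Jena2 itself), the sketch below uses rdflib to load an OWL file and pull out classes, subclass pairs, instances, object and datatype properties, and the restrictions attached to a given property. The file name is hypothetical.

```python
from rdflib import Graph
from rdflib.namespace import RDF, RDFS, OWL

g = Graph()
g.parse("ontology.owl")                                  # load the ontology model

classes    = set(g.subjects(RDF.type, OWL.Class))        # classes
subclasses = list(g.subject_objects(RDFS.subClassOf))    # (subclass, superclass) pairs
obj_props  = set(g.subjects(RDF.type, OWL.ObjectProperty))
dt_props   = set(g.subjects(RDF.type, OWL.DatatypeProperty))
instances  = [(s, c) for c in classes for s in g.subjects(RDF.type, c)]

def restrictions_on(prop):
    """owl:Restriction nodes whose owl:onProperty is the given property."""
    return [r for r in g.subjects(OWL.onProperty, prop)
            if (r, RDF.type, OWL.Restriction) in g]
```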

18.
Making use of the fact that two-level grammars (TLGs) may be thought of as finite specifications of context-free grammars (CFGs) with infinite sets of productions, known techniques for parsing CFGs are applied to TLGs by first specifying a canonical CFG, called the skeleton grammar, obtained from the cross-reference of the TLG. Under very natural restrictions it can be shown that for such a grammar pair there exists a one-to-one correspondence between leftmost derivations in the TLG and leftmost derivations in its skeleton grammar. With these results a straightforward parsing algorithm for restricted TLGs is given.

19.
The Recursive Descent method of parsing is well established in practice. An Incremental Parsing algorithm using the Recursive Descent method is presented. The algorithm is applicable to LL(1) grammars. The algorithm has been implemented for a subset of Pascal.
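As a reminder of the parsing style involved, here is a minimal recursive descent parser for a small LL(1) expression grammar; the grammar and names are illustrative, and the incremental machinery (reusing parse results around an edit) that the abstract adds is not shown.

```python
# Minimal recursive descent parser for an LL(1) expression grammar
#   E -> T ('+' T)*      T -> NUM | '(' E ')'

def parse_expr(tokens):
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat(expected):
        nonlocal pos
        assert peek() == expected, f"expected {expected!r}, got {peek()!r}"
        pos += 1

    def expr():                      # E -> T ('+' T)*
        node = term()
        while peek() == '+':
            eat('+')
            node = ('+', node, term())
        return node

    def term():                      # T -> NUM | '(' E ')'
        if peek() == '(':
            eat('(')
            node = expr()
            eat(')')
            return node
        tok = peek()
        eat(tok)
        return ('num', tok)

    tree = expr()
    assert peek() is None, "trailing input"
    return tree

print(parse_expr(['1', '+', '(', '2', '+', '3', ')']))
# -> ('+', ('num', '1'), ('+', ('num', '2'), ('num', '3')))
```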

20.