Similar Documents
 20 similar documents found (search time: 343 ms)
1.
Cobham stated that sequences over a finite alphabet can be generated by an r-substitution if and only if they are r-automatic. Those sequences, linked with automata as well as with substitutions, may nevertheless possess sets of subwords which, if interpreted as languages, are not even context-free. This result is obtained by a study of the properties of paperfolding sequences. It is shown that any context-free set consisting of subwords of paperfolding sequences is finite.
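As a small, concrete illustration of the objects involved (my own Python sketch, not code from the paper; the function names and prefix length are illustrative choices only), the regular paperfolding sequence and its sets of subwords can be generated as follows:

    def paperfolding(n):
        """Return the first n terms (as 0/1) of the regular paperfolding sequence.

        Term k (1-indexed) is found by writing k = m * 2^j with m odd:
        the term is 1 if m % 4 == 1, and 0 otherwise.
        """
        seq = []
        for k in range(1, n + 1):
            m = k
            while m % 2 == 0:
                m //= 2
            seq.append(1 if m % 4 == 1 else 0)
        return seq

    def subwords(seq, length):
        """The set of factors (contiguous subwords) of the given length."""
        return {tuple(seq[i:i + length]) for i in range(len(seq) - length + 1)}

    prefix = paperfolding(1024)
    print(len(subwords(prefix, 6)))  # number of distinct length-6 subwords in this prefix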

2.
Motivated by certain coding techniques for reliable DNA computing, we consider the problem of characterizing nontrivial languages D that are maximal with the property that D* is contained in the subword closure of a given set S of words of some fixed length k. This closure is simply the set of all words all of whose subwords of length k are in S. We provide a deep structural characterization of these languages D, which leads to polynomial-time algorithms for computing such languages.
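To make the subword-closure condition concrete, here is a minimal Python sketch (my own illustration, not taken from the paper) that tests whether a word lies in the subword closure of a set S of words of a fixed length k, i.e. whether all of its length-k subwords belong to S:

    def in_subword_closure(word, S, k):
        """True iff every length-k factor of `word` belongs to S.

        Words shorter than k have no length-k factors and are accepted vacuously.
        """
        return all(word[i:i + k] in S for i in range(len(word) - k + 1))

    # Hypothetical example over the alphabet {a, b} with k = 2.
    S = {"ab", "ba", "aa"}
    print(in_subword_closure("aabab", S, 2))  # True: factors aa, ab, ba, ab are all in S
    print(in_subword_closure("abba", S, 2))   # False: the factor bb is not in S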

3.
4.
5.
High-dimensional data and the small-sample-size problem occur in many modern pattern classification applications such as face recognition and gene expression data analysis. To deal with such data, one important step is dimensionality reduction. Principal component analysis (PCA) and between-group analysis (BGA) are two commonly used methods, and various extensions of both exist. The principle behind the two approaches comes from their best-approximation property. From a pattern recognition perspective, we show that PCA, which is based on the total scatter matrix, preserves linear separability, and that BGA, which is based on the between-class scatter matrix, retains the distance between class centroids. Moreover, we propose an automatic, nonparametric uncorrelated discriminant analysis (UDA) algorithm based on the maximum margin criterion (MMC). The features extracted via UDA are statistically uncorrelated. UDA combines rank-preserving dimensionality reduction with constrained discriminant analysis and also serves as an effective solution to the small-sample-size problem. Experiments with face images and gene expression data sets are conducted to evaluate UDA in terms of classification accuracy and robustness.
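For readers less familiar with the scatter matrices mentioned above, the following NumPy sketch (a simplified illustration under my own conventions, not the authors' UDA implementation) computes the total and between-class scatter matrices on which PCA and BGA respectively rely:

    import numpy as np

    def scatter_matrices(X, y):
        """Total scatter S_t and between-class scatter S_b.

        X : (n_samples, n_features) data matrix
        y : (n_samples,) integer class labels
        """
        mean_total = X.mean(axis=0)
        Xc = X - mean_total
        S_t = Xc.T @ Xc                       # total scatter, used by PCA
        S_b = np.zeros((X.shape[1], X.shape[1]))
        for c in np.unique(y):
            Xk = X[y == c]
            d = Xk.mean(axis=0) - mean_total
            S_b += len(Xk) * np.outer(d, d)   # between-class scatter, used by BGA
        return S_t, S_b

    X = np.random.randn(20, 5)
    y = np.array([0] * 10 + [1] * 10)
    S_t, S_b = scatter_matrices(X, y)
    print(S_t.shape, S_b.shape)               # (5, 5) (5, 5)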

6.
Motivated by a study of similar problems stated for factors of words, we study forbidden subwords of a language or a word. A procedure for obtaining the set of all words avoiding a given set of subwords is presented. On the other hand, an algorithm for computing the set of minimal forbidden subwords for a word is given. The case of a two-letter alphabet appears to be particularly interesting and it is considered separately.
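The following brute-force Python sketch (my own illustration, written under the reading that "subword" here means scattered subword, i.e. subsequence; it is not the paper's algorithm) computes the minimal forbidden subwords of a word:

    from itertools import product

    def is_subword(v, w):
        """Scattered-subword test: is v a subsequence of w?"""
        it = iter(w)
        return all(ch in it for ch in v)

    def minimal_forbidden_subwords(w, alphabet):
        """Words v that are not subwords of w although every single-letter
        deletion of v is a subword of w; such v have length at most len(w) + 1."""
        result = []
        for n in range(1, len(w) + 2):
            for v in map("".join, product(alphabet, repeat=n)):
                if not is_subword(v, w) and all(
                    is_subword(v[:i] + v[i + 1:], w) for i in range(len(v))
                ):
                    result.append(v)
        return result

    print(minimal_forbidden_subwords("abba", "ab"))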

7.
An Etyma-Based Method for Chinese Sign Language Recognition
To date, the biggest obstacle in sign language recognition has been continuous recognition over a vocabulary that can easily be extended. This paper proposes a large-vocabulary continuous Chinese Sign Language recognition method that uses etyma (word roots) as the recognition units. Because the number of such units is limited, HMM-based training and recognition of sign language signals become easier to handle, and recognition over a larger vocabulary becomes feasible. In addition, the proposed method facilitates mixed recognition of gestural signs and fingerspelling. More than 2,400 etyma were compiled from Chinese Sign Language, a parallel HMM was built for each etymon, and the HMMs of the individual data streams were clustered to determine the recognition units. Sign words are decomposed according to these units, a tree-structured search network is built, and Gaussian-density clustering at the state nodes, a language model, and an N-best method are used to improve the system's speed and accuracy. In experiments on 5,119 sign-language words, the recognition rate for continuous sentences exceeded 90%.

8.
We study the repetition of subwords in languages generated by morphisms. Fundamental to our approach is the notion of quasi-repetitive elements. Using these elements we present a new characterization for repetitive morphisms, from which we derive a simple proof for the fact that a D0L-language is repetitive if and only if it is strongly repetitive (Ehrenfeucht and Rozenberg, Inform. and Control 59 (1983) 13–35). From this proof we obtain a structurally simple polynomial-time algorithm for deciding whether such a language is repetitive. From further results on quasi-repetitive elements we obtain as a consequence a complete characterization of all those morphisms on a two-letter alphabet that are repetitive, a result which is closely related to a result of Séébold (Bull. EATCS 36 (1988) 137–151) on the D0L periodicity problem. Finally, we characterize those morphisms f on a two-letter alphabet for which the language L(f) generated by f or the language SL(f) of subwords of L(f) is context-free or even regular.

9.
To achieve maximum efficiency, modern embedded processors for media applications exploit single instruction multiple data (SIMD) instructions. SIMD instructions provide a form of vectorization where a large machine word is viewed as a vector of subwords and the same operation is performed on all subwords in parallel. Systematic usage of SIMD instructions can significantly improve program performance. With C becoming the dominant language for programming embedded devices, there is a clear need for C compilers that use SIMD instructions whenever appropriate. However, SIMD instructions typically require each memory access to be aligned with the instruction's data access size. Therefore an important problem in designing the compiler is to determine whether a C pointer is aligned, i.e. whether it refers to the beginning of a machine word. In this paper, we describe our SIMD generation algorithm and present an analysis method which determines the alignment of pointers at compile time. The alignment information is used to reduce the number of dynamic alignment checks and the overhead incurred by them. Our method uses an interprocedural analysis which propagates pointer alignment information in function bodies and through function calls. The effectiveness of our method is supported by experimental results which show that in typical programs the alignments of about 50% of the pointers can be statically determined.
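As a rough illustration of the kind of static reasoning involved (my own sketch, not the authors' interprocedural analysis; the SIMD width and helper names are assumptions), pointer alignment can be tracked as a byte offset modulo the data access size and propagated through pointer arithmetic:

    SIMD_WIDTH = 16      # assumed data access size in bytes
    UNKNOWN = None       # alignment not statically known

    def align_base(declared_aligned):
        """Alignment fact for a base pointer (offset 0 if its alignment is known)."""
        return 0 if declared_aligned else UNKNOWN

    def align_add(fact, byte_offset):
        """Propagate the alignment fact through p + byte_offset."""
        if fact is UNKNOWN:
            return UNKNOWN
        return (fact + byte_offset) % SIMD_WIDTH

    def needs_dynamic_check(fact):
        """A dynamic alignment check can be dropped only if the offset is provably 0."""
        return fact != 0

    p = align_base(declared_aligned=True)   # e.g. a suitably aligned array
    q = align_add(p, 4 * 8)                 # p plus 32 bytes: still aligned
    r = align_add(p, 4)                     # p plus 4 bytes: offset 4, not aligned
    print(needs_dynamic_check(q), needs_dynamic_check(r))  # False True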

10.
We study a weak stability property called recurrence for a class of hybrid systems. An open set is recurrent if there are no finite escape times and every complete trajectory eventually reaches the set. Under sufficient regularity properties for the hybrid system we establish that the existence of a smooth, radially unbounded Lyapunov function that decreases along solutions outside an open, bounded set is a necessary and sufficient condition for recurrence of that set. Recurrence of open, bounded sets is robust to sufficiently small state dependent perturbations and this robustness property is crucial for establishing the existence of a Lyapunov function that is smooth. We also highlight some connections between recurrence and other well studied properties like asymptotic stability and ultimate boundedness.
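Spelled out in a simplified, continuous-time form (my own rendering, specialized from the hybrid setting to a flow \dot{x} = f(x), with \mathcal{O} the open, bounded set in question; the notation is not taken from the paper), the Lyapunov condition described above reads:

    \exists\, V \in C^{\infty}(\mathbb{R}^{n}, \mathbb{R}_{\ge 0}) \ \text{with} \ \lim_{\|x\|\to\infty} V(x) = \infty
    \quad\text{such that}\quad
    \langle \nabla V(x),\, f(x) \rangle < 0 \ \ \text{for all } x \notin \mathcal{O}.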

11.
In Ref. [19], Toyama proved that the union of two confluent term-rewriting systems that share absolutely no function symbols or constants is likewise confluent, a property called modularity. The proof of this beautiful modularity result, technically based on slicing terms into a homogeneous cap and a so-called alien, possibly heterogeneous substitution, was later substantially simplified in Refs. [8,12]. In this paper, we present a further simplification of the proof of Toyama's result for confluence, which shows that the crux of the problem lies in two different properties: a cleaning lemma, whose goal is to anticipate the application of collapsing reductions, and a modularity property of ordered completion that allows the caps and alien substitutions of two equivalent terms obtained from the cleaning lemma to be matched pairwise. The approach allows for arbitrary kinds of rules, and scales up to rewriting modulo arbitrary sets of equations.

12.
We establish a first step towards a “Rice theorem” for tilings: for non-trivial sets, it is undecidable whether two different tile sets produce the same tilings of the plane. We then study quasiperiodicity functions associated with tilings; such a function is a way to measure the regularity of a tiling. We prove not only that almost all recursive functions can be obtained as quasiperiodicity functions, but also that there is a quasiperiodicity function which grows faster than any recursive function.

13.
We consider the transition graphs of regular ground tree (or term) rewriting systems. The vertex set of such a graph is a (possibly infinite) set of trees. Thus, with a finite tree automaton one can represent a regular set of vertices. It is known that the backward closure of sets of vertices under the rewriting relation preserves regularity, i.e., for a regular set T of vertices the set of vertices from which one can reach T can be accepted by a tree automaton. The main contribution of this paper is to lift this result to the recurrence problem, i.e., we show that the set of vertices from which one can reach infinitely often a regular set T is regular, too. Since this result is effective, it implies that the problem whether, given a tree t and a regular set T, there is a path starting in t that infinitely often reaches T, is decidable. Furthermore, it is shown that the problems whether all paths starting in t eventually (respectively, infinitely often) reach T, are undecidable. Based on the decidability result we define a fragment of temporal logic with a decidable model-checking problem for the class of regular ground tree rewriting graphs.

14.
Finite test sets are a useful tool for deciding the membership problem for the universal closure of a given tree language, that is, for deciding whether a term has all its ground instances in the given language. A uniform test set for the universal closure must serve the following purpose: In order to decide membership of a term, it is sufficient to check whether all its test set instances belong to the underlying language. A possible application, and our main motivation, is ground reducibility, an essential concept for many approaches to inductive reasoning. Ground reducibility modulo some rewrite system is membership in the universal closure of the set of reducible ground terms. Here, test sets always exist, and several algorithmic approaches are known. The resulting sets, however, are often unnecessarily large. In this paper we consider regular languages and linear closure operators. We prove that universal as well as existential closure, defined analogously, preserve regularity. By relating test sets to tree automata and to appropriate congruence relations, we show how to characterize, how to compute, and how to minimize ground and non-ground test sets. In particular, optimal solutions now replace previous ad hoc approximations for the ground reducibility problem.

15.
This article explores how the effectiveness of learning to parse with neural networks can be improved by including two architectural features relevant to language: generalisations across syntactic constituents and bounded resource effects. A number of neural network parsers have recently been proposed, each with a different approach to the representational problem of outputting parse trees. In addition, some of the parsers have explicitly attempted to capture an important regularity within language, which is to generalise information across syntactic constituents. A further property of language is that natural bounds exist for the number of constituents which a parser need retain for later processing. Both the generalisations and the resource bounds may be captured in architectural features which enhance the effectiveness and efficiency of learning to parse with neural networks. We describe a number of different types of neural network parser, and compare them with respect to these two features. These features are both explicitly present in the Simple Synchrony Network parser, and we explore and illustrate their impact on the process of learning to parse in some experiments with a recursive grammar.

16.
Harmonic periods have wide applicability in industrial real-time systems. Rate monotonic (RM) scheduling is able to schedule task sets with harmonic periods up to 100% utilization. Also, if there is no release jitter or execution-time variation, RM and EDF generate the same schedule for each instance of a task. As a result, all instances of a task are interfered with by the same amount of workload. This property decreases the jitter that occurs during sampling and actuation of the tasks and hence increases the quality of service in control systems. In this paper, we consider the problem of optimal period assignment where the periods are constrained to be harmonic and the task set is required to be feasible. We study two variants of this problem. In the first one, the objective is to maximize the system utilization, while in the second one, the goal is to minimize the total weighted sum of the periods. First, we assume that an interval is determined a priori for each task from which its period can be selected. We show that both variants of the problem are (at least) weakly NP-hard. This is shown by reducing the NP-complete number partitioning problem to the mentioned harmonic period assignment problems. Afterwards, we consider a variant of the second problem in which the periods are not restricted to a special interval. We present two approximation algorithms with polynomial-time complexity for this problem and show that the maximum relative error of these algorithms is bounded by a factor of 1.125. Our evaluations show that, on average, the results of the approximation algorithms are very close to an optimal solution.
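To fix terminology, the following Python sketch (my own illustration, not one of the paper's approximation algorithms) checks whether a set of periods is harmonic and computes the resulting utilization, which rate monotonic scheduling can push up to 1 for harmonic task sets:

    def is_harmonic(periods):
        """True iff, after sorting, each period divides the next (pairwise divisibility)."""
        ps = sorted(periods)
        return all(ps[i + 1] % ps[i] == 0 for i in range(len(ps) - 1))

    def utilization(execution_times, periods):
        """Total utilization U = sum of C_i / T_i."""
        return sum(c / t for c, t in zip(execution_times, periods))

    C = [1, 2, 4]    # assumed worst-case execution times
    T = [4, 8, 16]   # harmonic periods: 4 | 8 | 16
    print(is_harmonic(T), utilization(C, T))  # True 0.75 -> schedulable under RM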

17.
We consider two single-machine bicriteria scheduling problems in which jobs belong to either of two disjoint sets, each set having its own performance measure. The problem has been referred to as interfering job sets in the scheduling literature and has also been called multi-agent scheduling, where each agent's objective function is to be minimized. In the first problem (P1) we look at minimizing total completion time and the number of tardy jobs for the two sets of jobs and present a forward SPT-EDD heuristic that attempts to generate the set of non-dominated solutions. This specific problem is NP-hard; however, pseudo-polynomial algorithms suggested by earlier researchers have been used to benchmark the results of the proposed heuristic. In the second problem (P2) we look at minimizing total weighted completion time and maximum lateness. This is an established NP-hard problem for which we propose a forward WSPT-EDD heuristic that attempts to generate the set of supported points, and we compare our solution quality with MIP formulations. For both problems, we assume that all jobs are available at time zero and that jobs may not be preempted.

18.
The problem of reconstructing a word from a set of its subwords is considered. It is assumed that the set is generated by unit shifts of a fixed window along an unknown word. For the problem without constraints on the unknown word, a reconstruction method is proposed based on the search for Euler paths or Euler cycles in a de Bruijn multidigraph. The search is based on symbolic multiplication of adjacency matrices with special operations of multiplication and addition of edge names. The method makes it possible to find the reconstructed words and the number of reconstructions.
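A compact way to see the underlying idea (a sketch of my own; the paper works with symbolic multiplication of adjacency matrices rather than the explicit path search below) is to build the de Bruijn multigraph of the window contents and read a reconstruction off an Eulerian path:

    from collections import defaultdict

    def reconstruct(windows):
        """Rebuild one word from its length-k windows obtained by unit shifts.

        Each window w contributes an edge w[:-1] -> w[1:] in a de Bruijn multigraph;
        an Eulerian path spells a reconstruction (other paths give other reconstructions).
        Assumes that at least one reconstruction exists.
        """
        graph = defaultdict(list)
        out_deg, in_deg = defaultdict(int), defaultdict(int)
        for w in windows:
            u, v = w[:-1], w[1:]
            graph[u].append(v)
            out_deg[u] += 1
            in_deg[v] += 1

        # Start at the vertex whose out-degree exceeds its in-degree, if any (path case),
        # otherwise at an arbitrary vertex (cycle case).
        start = next((u for u in graph if out_deg[u] - in_deg[u] == 1), next(iter(graph)))

        # Hierholzer's algorithm for an Eulerian path.
        stack, order = [start], []
        while stack:
            u = stack[-1]
            if graph[u]:
                stack.append(graph[u].pop())
            else:
                order.append(stack.pop())
        order.reverse()
        return order[0] + "".join(v[-1] for v in order[1:])

    windows = ["abc", "bca", "cab", "abd"]  # length-3 windows of "abcabd"
    print(reconstruct(windows))             # "abcabd"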

19.
We consider a constrained equational logic where the constraints are membership conditions t ∈ s, where s is interpreted as a regular tree language. Our logic includes a fragment of second-order equational logic (without projections) where second-order variables range over regular sets of contexts. The problem with constrained equational logics is the failure of the critical pair lemma. This is the reason why we propose new deduction rules for which the critical pair lemma is restored. Computing critical pairs requires, however, solving some constraints in a second-order logic with membership constraints. In a second paper we give a terminating set of transformation rules for these formulas, which decides the existence of a solution, thus yielding a new term scheme unification algorithm. Since an order-sorted signature is nothing but a bottom-up tree automaton, order-sorted equational logic falls within the scope of our study; our results show how to perform order-sorted completion without regularity and without sort-decreasingness. They also show how to perform unification in the order-sorted case, with some higher-order variables (without any regularity assumption).

20.
Point pattern matching is an important problem in computer vision and pattern recognition. Under the assumption that three pairs of globally corresponding points are known between the two point patterns to be matched, and building on projective coordinates and p^2-invariants, which remain invariant simultaneously under projective transformations and permutations of the points, a generalized distance is defined and a new algorithm is derived for matching two planar point patterns with unequal numbers of points under a perspective transformation. Theoretical analysis and simulation experiments show that the algorithm is fast and effective.
