Similar Documents
20 similar documents found; search took 105 ms.
1.
This paper combines image processing with a 3D magnetic tracking method to develop a scalpel for haptic simulation in surgical cutting. First, a cutting parameter acquisition setup is presented and its performance is validated on soft-tissue cutting. Then, based on the acquired input-output data pairs, a method for fuzzy system modeling is presented: after partitioning each input space equally and fixing the premises and the total number of fuzzy rules, the consequent parameters and the fuzzy membership functions (MF) of the input variables are learned and optimized via a neurofuzzy modeling technique. Finally, a haptic scalpel implemented with the established cutting model is described. Preliminary results show the feasibility of the haptic display system for real-time interaction.
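As a rough illustration of the kind of neurofuzzy model sketched above, the following Python snippet builds a zero-order Sugeno-type fuzzy system with equally partitioned Gaussian membership functions and fits the rule consequents by least squares on synthetic input-output pairs. The partition count, the two inputs, and the least-squares fitting are illustrative assumptions, not the paper's actual setup or optimization procedure.

```python
import numpy as np

# Hypothetical setup: 2 inputs (e.g. cutting depth, velocity), 1 output (force).
# Each input space is partitioned equally into n_mf Gaussian membership functions.
n_mf = 3

def gaussian_mf_params(x, n_mf):
    """Equally spaced Gaussian MF centers over the observed range of x."""
    lo, hi = x.min(), x.max()
    centers = np.linspace(lo, hi, n_mf)
    sigma = (hi - lo) / (2.0 * (n_mf - 1))
    return centers, sigma

def firing_strengths(X, params):
    """Product of per-input Gaussian memberships for every rule in the grid."""
    memberships = []
    for j, (centers, sigma) in enumerate(params):
        m = np.exp(-0.5 * ((X[:, j:j+1] - centers[None, :]) / sigma) ** 2)
        memberships.append(m)                      # shape: (N, n_mf)
    # One rule per combination of MFs (n_mf ** n_inputs rules on the grid).
    w = np.einsum('ni,nj->nij', memberships[0], memberships[1]).reshape(len(X), -1)
    return w / w.sum(axis=1, keepdims=True)        # normalized firing strengths

# Placeholder random data stands in for the measured input-output pairs.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2             # stand-in for measured cutting force

params = [gaussian_mf_params(X[:, j], n_mf) for j in range(2)]
W = firing_strengths(X, params)

# Zero-order Sugeno output = sum_r w_r * c_r.  Learn consequents c by least
# squares (a simple stand-in for the neurofuzzy optimization in the abstract).
c, *_ = np.linalg.lstsq(W, y, rcond=None)
pred = W @ c
print("RMS error:", np.sqrt(np.mean((pred - y) ** 2)))
```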

2.
This paper introduces some improvements to the intelligent backtracking strategy for forward chaining theorem proving. How to decide a minimal useful consequent atom set for a refutation derived at a node in a proof tree is discussed. In most cases, an unnecessary non-Horn clause used for forward chaining will be split only once, so the increase in the search space caused by invoking unnecessary forward-chaining clauses is nearly linear rather than exponential. The principle of the proposed method and its correctness are presented, and some examples are provided to show that the proposed approach is powerful for forward chaining theorem proving.

3.
This paper first describes the Grid concept and its current research status. Access Grid is a complement to the computational grid. The paper gives a deep analysis of Access Grid and presents a basic model of it. After describing the software and hardware configuration and deployment of an Access Grid Node, the paper explains how events are managed on the Access Grid. Finally, it outlines new directions in which Access Grid research is heading.

4.
This paper proposes a new graph model for multibus multiprocessor systems. Based on the link representation of the cut set of the hypergraph, the ring sum, and its algebraic properties, we can directly calculate the degrees of bus fault tolerance and processor fault tolerance for any multibus multiprocessor system. Algorithms are listed and all theorems and proofs are stated.

5.
This paper proposes NNF, a fuzzy Petri net system based on neural networks for propositional logic representation, and gives the formal definition of NNF. For the NNF model, a forward reasoning algorithm, a backward reasoning algorithm, and a knowledge learning algorithm are discussed, based on the weight training algorithm of neural networks, the back-propagation algorithm. NNF is thus endowed with the ability to learn rules. The paper concludes with a discussion on extending NNF to predicate logic, forming NNPrF, and proposing the formal definition and a reasoning algorithm of NNPrF.
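For readers unfamiliar with fuzzy Petri nets, the following is a minimal sketch of forward reasoning in a common fuzzy Petri net formulation (truth tokens on places, thresholds and certainty factors on transitions). It only illustrates the general idea; the exact NNF definitions and the BP-based weight learning are those given in the paper and are not reproduced here.

```python
# Minimal sketch of forward reasoning over a fuzzy Petri net in the common
# formulation: places hold truth degrees, transitions carry thresholds and
# certainty factors.  Illustration only; not the NNF definitions from the paper.

# Each transition: (input places, output place, threshold, certainty factor)
transitions = [
    (("p1", "p2"), "p3", 0.3, 0.9),
    (("p3",),      "p4", 0.5, 0.8),
]

tokens = {"p1": 0.7, "p2": 0.6, "p3": 0.0, "p4": 0.0}

def forward_reason(tokens, transitions, max_iters=10):
    for _ in range(max_iters):
        changed = False
        for inputs, output, threshold, cf in transitions:
            strength = min(tokens[p] for p in inputs)   # conjunction of premises
            if strength >= threshold:
                new_value = strength * cf               # weaken by certainty factor
                if new_value > tokens[output]:          # keep the strongest support
                    tokens[output] = new_value
                    changed = True
        if not changed:
            break
    return tokens

print(forward_reason(tokens, transitions))   # p3 -> 0.54, p4 -> 0.432 (approx.)
```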

6.
This paper introduces the design and implementation of BCL-3, a high-performance low-level communication software system running on a cluster of SMPs (CLUMPS) called DAWNING-3000. BCL-3 provides flexible and sufficient functionality to fulfill the communication requirements of the fundamental system software developed for DAWNING-3000 while guaranteeing security, scalability, and reliability. Important features of BCL-3 are presented in the paper, including special support for SMP and heterogeneous network environments, semi-user-level communication, reliable and ordered data transfer, and scalable flow control. A performance evaluation of BCL-3 over Myrinet is also given.

7.
Accessing information from the Web is problematic because of HTML's weakness in knowledge representation and the WWW's shortcomings in knowledge handling. This paper presents an object-oriented hypertext markup language, HTML++, an enhanced version of HTML, designed to optimize the information structure of hypertext and give it the capability of semantic representation. HTML++ provides the WWW with a practicable approach to importing information-processing capabilities such as reasoning. This makes it possible to access information in an ad hoc, direct, precise, and customized style.

8.
A Web Personalized Recommendation Method Based on Association Rules with Indefinite-Length Consequents   Total citations: 2 (self-citations: 0, citations by others: 2)
Web usage mining plays an important part in supporting personalized recommendation on the Web, and association rules uncover interesting relations among items hidden in data. The paper proposes an association-rule merging-deleting step based on an analysis of association rule characteristics and applies it to rule preparation before Web personalized recommendation. Furthermore, based on comparisons of the precision, coverage, and F1 of the recommendation system and of the numbers of rules used for three kinds of association rules, a Web personalized recommendation method based on rules with indefinite-length consequents is put forward. After an integrative analysis of several recommendation methods, the method given in the paper can be considered a good choice. Finally, several page-weighting techniques are introduced.
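To make the recommendation step concrete, here is a small illustrative sketch of applying association rules whose consequents have different lengths to a user session: rules whose antecedents are contained in the session contribute their consequent pages, which are then ranked. The rule set, the matching policy, and the scoring by confidence are assumptions for illustration; the paper's rule merging-deleting procedure and page-weighting techniques are not reproduced.

```python
# Illustrative sketch: recommending pages from association rules whose
# consequents may have different lengths.  The rules, matching policy, and
# scoring are assumptions, not the paper's merge-delete procedure.

# Each rule: (antecedent pages, consequent pages, confidence)
rules = [
    (frozenset({"A", "B"}), frozenset({"C"}),      0.8),
    (frozenset({"A"}),      frozenset({"C", "D"}), 0.6),
    (frozenset({"B"}),      frozenset({"E"}),      0.5),
]

def recommend(session, rules, top_n=3):
    session = set(session)
    scores = {}
    for antecedent, consequent, confidence in rules:
        if antecedent <= session:                       # rule fires on this session
            for page in consequent - session:           # skip already-visited pages
                scores[page] = max(scores.get(page, 0.0), confidence)
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [page for page, _ in ranked[:top_n]]

print(recommend({"A", "B"}, rules))   # ['C', 'D', 'E']
```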

9.
In this paper, a novel technique adopted in HarkMan is introduced. HarkMan is a keyword spotter designed to automatically spot the given words of a vocabulary-independent task in unconstrained Chinese telephone speech. The speaking manner and the number of keywords are not limited. This paper focuses on the novel techniques that address acoustic modeling, the keyword-spotting network, search strategies, robustness, and rejection. The underlying technologies used in HarkMan are useful not only for keyword spotting but also for continuous speech recognition. The system has achieved a figure-of-merit value over 90%.

10.
This paper introduces the concept of virtual reality telecommunication (VRT) in a broad sense and presents the structure and reference model of VRT. Finally, a realization method of VRT is discussed.

11.
In most auction systems the values of bids are known to the auctioneer. This allows him to manipulate the outcome of the auction. Hence, one might be interested in hiding these values. Some cryptographically secure protocols for electronic auctions have been presented in the last decade. Our work extends these protocols in several ways. On the basis of garbled circuits, i.e., encrypted circuits, we present protocols for sealed-bid auctions that fulfill the following requirements: 1) the protocols are information-theoretically t-private for honest-but-curious parties; 2) the number of bits that can be learned by malicious adversaries is bounded by the output length of the auction; 3) the computational requirements for participating parties are very low: only random bit choices and bitwise computation of the XOR function are necessary. Note that one can distinguish between the protocol that generates a garbled circuit for an auction and the protocol to evaluate the auction. In this paper we address both problems. We present a t-private protocol for the construction of a garbled circuit that reaches the lower bound of 2t + 1 parties, and a more randomness-efficient protocol for (t + 1)^2 parties. Finally, we address the problem of bid changes in an auction.
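The remark that parties only need random bit choices and XOR computations can be illustrated with the basic primitive such protocols build on, n-out-of-n XOR secret sharing of a bid. The sketch below shows only this primitive, with assumed parameter names; it is not the paper's garbled-circuit construction.

```python
import secrets

# Minimal sketch of XOR-based secret sharing of a bid: random bit choices and
# bitwise XOR only.  This is the basic building block, not the full protocol.

def share_bid(bid: int, n_parties: int, n_bits: int = 32):
    """Split `bid` into n shares whose XOR equals the bid."""
    shares = [secrets.randbits(n_bits) for _ in range(n_parties - 1)]
    last = bid
    for s in shares:
        last ^= s
    return shares + [last]

def reconstruct(shares):
    value = 0
    for s in shares:
        value ^= s
    return value

shares = share_bid(1337, n_parties=5)
assert reconstruct(shares) == 1337
print(shares, "->", reconstruct(shares))
```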

12.
We propose a non-iterative solution to the PnP problem (the estimation of the pose of a calibrated camera from n 3D-to-2D point correspondences) whose computational complexity grows linearly with n. This is in contrast to state-of-the-art methods that are O(n^5) or even O(n^8), without being more accurate. Our method is applicable for all n ≥ 4 and properly handles both planar and non-planar configurations. Our central idea is to express the n 3D points as a weighted sum of four virtual control points. The problem then reduces to estimating the coordinates of these control points in the camera referential, which can be done in O(n) time by expressing these coordinates as a weighted sum of the eigenvectors of a 12×12 matrix and solving a small constant number of quadratic equations to pick the right weights. Furthermore, if maximal precision is required, the output of the closed-form solution can be used to initialize a Gauss-Newton scheme, which improves accuracy with a negligible amount of additional time. The advantages of our method are demonstrated by thorough testing on both synthetic and real data.
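This method, commonly known as EPnP, is available in OpenCV, so a minimal usage sketch may help. The camera intrinsics, the synthetic correspondences, and the use of cv2.solvePnPRefineLM as a stand-in for the Gauss-Newton refinement are assumptions of this sketch.

```python
import numpy as np
import cv2

# Minimal usage sketch of the O(n) PnP method via OpenCV's SOLVEPNP_EPNP flag.
# The intrinsics and correspondences below are synthetic placeholders.
object_points = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1],
                          [1, 1, 0], [1, 0, 1]], dtype=np.float64)
camera_matrix = np.array([[800, 0, 320],
                          [0, 800, 240],
                          [0,   0,   1]], dtype=np.float64)
dist_coeffs = np.zeros(5)

# Project the 3D points with a known pose to fabricate 2D correspondences.
rvec_true = np.array([0.1, -0.2, 0.05])
tvec_true = np.array([0.3, -0.1, 5.0])
image_points, _ = cv2.projectPoints(object_points, rvec_true, tvec_true,
                                    camera_matrix, dist_coeffs)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs,
                              flags=cv2.SOLVEPNP_EPNP)
print(ok, rvec.ravel(), tvec.ravel())   # should be close to the true pose

# Optional refinement, analogous to the Gauss-Newton step mentioned above.
rvec, tvec = cv2.solvePnPRefineLM(object_points, image_points,
                                  camera_matrix, dist_coeffs, rvec, tvec)
```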

13.
Given a graph with a source and a sink node, the NP-hard maximum k-splittable s,t-flow (MkSF) problem is to find a flow of maximum value from s to t with a flow decomposition using at most k paths. The multicommodity variant of this problem is a natural generalization of disjoint paths and unsplittable flow problems. Constructing a k-splittable flow requires two interdependent decisions: one has to decide on k paths (routing) and on the flow values for the paths (packing). We give efficient algorithms for computing exact and approximate solutions by decoupling the two decisions into a first packing step and a second routing step. Usually the routing is considered before the packing. Our main contributions are as follows: (i) We show that for constant k a polynomial number of packing alternatives containing at least one packing used by an optimal MkSF solution can be constructed in polynomial time. If k is part of the input, we obtain a slightly weaker result. In this case we can guarantee that, for any fixed ε>0, the computed set of alternatives contains a packing used by a (1−ε)-approximate solution. The latter result is based on the observation that (1−ε)-approximate flows only require constantly many different flow values. We believe that this observation is of interest in its own right. (ii) Based on (i), we prove that, for constant k, the MkSF problem can be solved in polynomial time on graphs of bounded treewidth. If k is part of the input, this problem is still NP-hard and we present a polynomial time approximation scheme for it.
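To illustrate the packing/routing decoupling, the sketch below takes a fixed packing (a list of k flow values) and greedily routes each value along an s-t path whose residual bottleneck capacity suffices. The greedy routine, the graph encoding, and the example graph are illustrative assumptions, not the paper's exact routing step.

```python
from collections import deque

# Illustration of "packing first, routing second": given a candidate packing
# (k flow values), try to route each value along some s-t path with enough
# remaining bottleneck capacity.  Greedy and illustrative only.

def find_path(capacity, s, t, demand):
    """BFS over edges with residual capacity >= demand; return a path or None."""
    parent = {s: None}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        if u == t:
            path, node = [], t
            while node is not None:
                path.append(node)
                node = parent[node]
            return list(reversed(path))
        for v, cap in capacity.get(u, {}).items():
            if v not in parent and cap >= demand:
                parent[v] = u
                queue.append(v)
    return None

def route_packing(capacity, s, t, packing):
    """Try to realize a k-splittable flow for the given packing (flow values)."""
    paths = []
    for demand in sorted(packing, reverse=True):      # route large values first
        path = find_path(capacity, s, t, demand)
        if path is None:
            return None                               # this packing is infeasible here
        for u, v in zip(path, path[1:]):
            capacity[u][v] -= demand
        paths.append((demand, path))
    return paths

graph = {"s": {"a": 4, "b": 3}, "a": {"t": 4}, "b": {"t": 3}}
print(route_packing(graph, "s", "t", packing=[3, 3]))
```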

14.
The question of the contemporary relevance of Heidegger’s reflections on technology to today’s advanced technology is here explored with reference to the notion of “entanglement” towards a review of Heidegger’s understanding of technology and media, including the entertainment industry and modern digital life. Heidegger’s reflections on Gelassenheit have been connected with the aesthetics of the tea ceremony, disputing the material aesthetics of porcelain versus plastic. Here by approaching the art of wabi-sabi as the art of Verfallenheit, I argue that Gelassenheit may be understood in these terms.

15.
Ranking queries, also known as top-k queries, produce results that are ordered on some computed score. Typically, these queries involve joins, where users are usually interested only in the top-k join results. Top-k queries are dominant in many emerging applications, e.g., multimedia retrieval by content, Web databases, data mining, middleware, and most information retrieval applications. Current relational query processors do not handle ranking queries efficiently, especially when joins are involved. In this paper, we address supporting top-k join queries in relational query processors. We introduce a new rank-join algorithm that makes use of the individual orders of its inputs to produce join results ordered on a user-specified scoring function. The idea is to rank the join results progressively during the join operation. We introduce two physical query operators based on variants of ripple join that implement the rank-join algorithm. The operators are nonblocking and can be integrated into pipelined execution plans. We also propose an efficient heuristic designed to optimize a top-k join query by choosing the best join order. We address several practical issues and optimization heuristics to integrate the new join operators in practical query processors. We implement the new operators inside a prototype database engine based on PREDATOR. The experimental evaluation of our approach compares recent algorithms for joining ranked inputs and shows superior performance. Received: 23 December 2003 / Accepted: 31 March 2004 / Published online: 12 August 2004. Edited by S. Abiteboul. Extended version of the paper published in the Proceedings of the 29th International Conference on Very Large Databases (VLDB 2003), Berlin, Germany, pp. 754-765.
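A simplified sketch of the progressive rank-join idea may help: both inputs arrive sorted by score, join results are buffered, and a result is reported once its combined score can no longer be beaten by any combination not yet formed. The summed scoring function, the flat (key, score) tuples, and the pulling strategy are assumptions of this sketch, not the paper's physical ripple-join-based operators.

```python
import heapq

# Simplified sketch of progressive rank-join: pull from two score-sorted
# inputs, buffer join results, and report a result once its combined score
# cannot be beaten by any join result not yet formed.

def rank_join(left, right, k):
    """left/right: lists of (join_key, score), each sorted by score descending."""
    seen_left, seen_right = [], []
    buffer, results = [], []                       # max-heap via negated scores
    i = j = 0
    while len(results) < k and (i < len(left) or j < len(right)):
        pull_left = i < len(left) and (j >= len(right) or i <= j)
        if pull_left:
            key, score = left[i]; i += 1
            seen_left.append((key, score))
            partners = seen_right
        else:
            key, score = right[j]; j += 1
            seen_right.append((key, score))
            partners = seen_left
        for pkey, pscore in partners:              # join the new tuple with the other side
            if pkey == key:
                heapq.heappush(buffer, (-(score + pscore), key))
        # Threshold: best combined score any not-yet-formed combination could reach.
        top_l = left[0][1] if seen_left else float("inf")
        top_r = right[0][1] if seen_right else float("inf")
        last_l = seen_left[-1][1] if seen_left else float("inf")
        last_r = seen_right[-1][1] if seen_right else float("inf")
        threshold = max(top_l + last_r, last_l + top_r)
        while buffer and -buffer[0][0] >= threshold and len(results) < k:
            neg_score, key = heapq.heappop(buffer)
            results.append((key, -neg_score))
    while buffer and len(results) < k:             # both inputs exhausted: flush buffer
        neg_score, key = heapq.heappop(buffer)
        results.append((key, -neg_score))
    return results

L = [("x", 9), ("y", 7), ("z", 4)]
R = [("y", 8), ("x", 5), ("z", 3)]
print(rank_join(L, R, k=2))   # [('y', 15), ('x', 14)]
```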

16.
17.
In real-world applications, information is often provided by multiple sources having different priority levels reflecting, for instance, their reliability. This paper investigates "Prioritized Removed Sets Revision" (PRSR) for revising stratified DL-Lite knowledge bases when a new, sure piece of information, called the input, is added. The revision strategy is based on inconsistency minimization and consists in determining the smallest subsets of assertions (prioritized removed sets) that should be dropped from the current stratified knowledge base in order to restore consistency and accept the input. We consider different forms of input: a membership assertion, or a positive or negative inclusion axiom. To characterize our revision approach, we first rephrase Hansson's postulates for belief base revision within a DL-Lite setting, and we then give logical properties of PRSR operators. In some situations, the revision process leads to several possible revised knowledge bases, where defining a selection function is required to keep results within the DL-Lite fragment. The last part of the paper shows how to use the notion of hitting set to compute the PRSR outcome. We also study the complexity of PRSR operators and show that, in some cases, the result can be computed in polynomial time.

18.
K-optimal rule discovery finds the K rules that optimize a user-specified measure of rule value with respect to a set of sample data and user-specified constraints. This approach avoids many limitations of the frequent itemset approach of association rule discovery. This paper presents a scalable algorithm applicable to a wide range of K-optimal rule discovery tasks and demonstrates its efficiency.
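As an illustration of the problem setting only (not the paper's scalable algorithm), the brute-force sketch below enumerates candidate rules, scores them with leverage as the user-specified measure, and keeps the k best under simple constraints; the transactions, the measure, and the constraints are assumptions.

```python
import heapq
from itertools import combinations

# Brute-force sketch of the k-optimal rule discovery setting: enumerate rules,
# score with a user-specified measure (leverage here), keep the k best subject
# to simple constraints.  Not the paper's scalable search algorithm.

transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk", "butter"},
    {"bread", "milk"},
]

def support(itemset):
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

def k_optimal_rules(k, items, max_antecedent=2, min_coverage=0.2):
    best = []                                       # min-heap of (score, rule)
    for size in range(1, max_antecedent + 1):
        for antecedent in combinations(sorted(items), size):
            a = frozenset(antecedent)
            if support(a) < min_coverage:           # coverage constraint
                continue
            for consequent in items - a:
                c = frozenset({consequent})
                leverage = support(a | c) - support(a) * support(c)
                entry = (leverage, tuple(sorted(a)), consequent)
                if len(best) < k:
                    heapq.heappush(best, entry)
                elif entry > best[0]:
                    heapq.heapreplace(best, entry)
    return sorted(best, reverse=True)

items = set().union(*transactions)
for leverage, antecedent, consequent in k_optimal_rules(k=3, items=items):
    print(f"{antecedent} -> {consequent}  leverage={leverage:.3f}")
```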

19.
With the popularization of wireless networks and mobile intelligent terminals, mobile crowd sensing is becoming a promising sensing paradigm. Tasks are assigned to users with mobile devices, who then collect and submit ambient information to the server. The composition of participants largely determines the quality and cost of the collected information. This paper aims to select the fewest participants that achieve the quality required by a sensing task. The requirement, namely "t-sweep k-coverage", means that every target location should be sensed by at least k participants within every time interval of length t. The participant selection problem for "t-sweep k-coverage" crowd sensing tasks is NP-hard. Through delicate matrix stacking, linear programming can be adopted to solve the problem when it is small. We further propose a participant selection method based on a greedy strategy. The two methods are evaluated through simulated experiments using users' call detail records. The results show that for small problems, both methods can find a participant set meeting the requirement; the number of participants picked by the greedy method is roughly twice that of the linear programming method. However, when problems become larger, the linear programming method performs unstably, while the greedy method can still output a reasonable solution.
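A sketch of a greedy strategy of this kind, modeled as plain set multi-cover over (location, time-slot) cells that each need k selected participants, is given below; the data layout and the coverage model are illustrative assumptions, not the paper's exact "t-sweep k-coverage" formulation.

```python
# Greedy participant selection for a coverage requirement modeled as set
# multi-cover: each (location, time slot) cell must be covered by at least k
# selected participants.  Illustrative assumption of the data layout only.

def greedy_select(participants, cells, k):
    """participants: {name: set of (location, slot) cells the user can sense}."""
    remaining = {cell: k for cell in cells}          # uncovered demand per cell
    selected = []
    candidates = dict(participants)
    while any(demand > 0 for demand in remaining.values()) and candidates:
        # Pick the participant covering the most still-unmet demand.
        def gain(name):
            return sum(1 for cell in candidates[name] if remaining.get(cell, 0) > 0)
        best = max(candidates, key=gain)
        if gain(best) == 0:
            break                                    # requirement cannot be met
        selected.append(best)
        for cell in candidates.pop(best):
            if remaining.get(cell, 0) > 0:
                remaining[cell] -= 1
    return selected, remaining

participants = {
    "u1": {("L1", 0), ("L1", 1), ("L2", 0)},
    "u2": {("L1", 0), ("L2", 1)},
    "u3": {("L1", 1), ("L2", 0), ("L2", 1)},
}
cells = [("L1", 0), ("L1", 1), ("L2", 0), ("L2", 1)]
print(greedy_select(participants, cells, k=1))
```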

20.
In spite of significant improvements in video data retrieval, a system has not yet been developed that can adequately respond to a user's query. Typically, the user has to refine the query many times and view query results until eventually the expected videos are retrieved from the database. The complexity of video data and questionable query structuring by the user aggravate the retrieval process. Most previous research in this area has focused on retrieval based on low-level features. Managing imprecise queries using semantic (high-level) content is no easier than querying based on low-level features, due to the absence of a proper continuous distance function. We provide a method to help users search for clips and videos of interest in video databases. The video clips are classified as interesting and uninteresting based on user browsing. The attribute values of clips are classified by commonality, presence, and frequency within each of the two groups and used in computing the relevance of each clip to the user's query. In this paper, we provide an intelligent query structuring system, called I-Quest, to rank clips based on user browsing feedback, where template generation from the sets of interesting and uninteresting clips is impossible or yields poor results.
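As a rough illustration of ranking clips from browsing feedback, the sketch below counts attribute values in the interesting and uninteresting groups and scores candidates by how much their values lean toward the interesting group. The scoring formula and the toy attributes are assumptions; this is not the actual I-Quest relevance measure.

```python
from collections import Counter

# Illustrative ranking from browsing feedback: attribute values are counted in
# the "interesting" and "uninteresting" groups, and a candidate clip is scored
# by how much its values lean toward the interesting group.

def value_frequencies(clips):
    counts = Counter()
    for clip in clips:
        counts.update(clip.items())                  # count (attribute, value) pairs
    return counts

def rank_candidates(candidates, interesting, uninteresting):
    pos = value_frequencies(interesting)
    neg = value_frequencies(uninteresting)
    def score(clip):
        return sum(pos[pair] / max(1, len(interesting)) -
                   neg[pair] / max(1, len(uninteresting))
                   for pair in clip.items())
    return sorted(candidates, key=score, reverse=True)

interesting = [{"genre": "news", "speaker": "anchor"},
               {"genre": "news", "speaker": "reporter"}]
uninteresting = [{"genre": "sports", "speaker": "anchor"}]
candidates = [{"genre": "news", "speaker": "anchor"},
              {"genre": "sports", "speaker": "reporter"}]
print(rank_candidates(candidates, interesting, uninteresting))
```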
