Similar Documents
20 similar documents found; search time: 109 ms
1.
On connectionism, rule extraction, and brain-like learning (total citations: 4; self-citations: 0; by others: 4)
There is a growing body of work showing that both fuzzy and symbolic rule systems can be implemented using neural networks. This body of work also shows that these fuzzy and symbolic rules can be retrieved from those networks, once they have been learned, by procedures that generally fall under the category of rule extraction. The paper argues that the idea of rule extraction from a neural network involves certain procedures, specifically the reading of parameters from a network, that are not allowed by the connectionist framework on which these neural networks are based. It argues that such rule extraction procedures imply a greater freedom and latitude about the internal mechanisms of the brain than connectionism permits, but that such latitude is permitted by the recently proposed control theoretic paradigm for the brain. The control theoretic paradigm basically suggests that there are parts of the brain that control other parts, and it places far fewer restrictions on the kinds of procedures that can be called "brain-like". The paper shows that this control theoretic paradigm is supported by new evidence from neuroscience about the role of neuromodulators and neurotransmitters in the brain. In addition, it shows that the control theoretic paradigm is also used in connectionist algorithms, although this is never acknowledged explicitly. The paper suggests that far better learning and rule extraction algorithms can be developed using these control theoretic notions, and that they would be consistent with the more recent understanding of how the brain works and learns.

2.
It is well known that in an asynchronous system where processes are prone to crash, it is impossible to design a protocol that provides each process with the set of processes that are currently alive. Basically, this comes from the fact that it is impossible to distinguish a crashed process from a process that is very slow or with which communication is very slow. Nevertheless, designing protocols that provide the processes with good approximations of the set of currently alive processes remains a real challenge in fault-tolerant distributed computing. This paper proposes such a protocol, plus a second protocol that makes it possible to cope with heterogeneous communication networks. These protocols consider a realistic computation model where the processes are provided with nonsynchronized local clocks and a function α(Δ) that takes a local duration Δ as a parameter and returns an integer estimate of the number of processes that could have crashed during that duration. A simulation-based experimental evaluation of the proposed protocols is also presented. These experiments show that the protocols are practically relevant.
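
A minimal sketch of the kind of alive-set approximation this computation model supports, assuming only a local monotonic clock and heartbeats; the linear form of alpha() and all names here are invented for illustration and are not taken from the paper:

```python
import time

def alpha(delta: float, crash_rate: float = 0.01) -> int:
    """Hypothetical alpha(): integer estimate of how many processes
    could crash within a window of delta seconds (toy linear model)."""
    return int(crash_rate * delta)

class AliveSetEstimator:
    """Approximates the set of currently alive processes from heartbeat
    receipt times stamped on the local, nonsynchronized clock."""

    def __init__(self, processes, delta):
        self.delta = delta                                  # staleness bound (s)
        self.last_heard = {p: time.monotonic() for p in processes}

    def on_heartbeat(self, sender):
        self.last_heard[sender] = time.monotonic()          # local clock only

    def alive_estimate(self):
        now = time.monotonic()
        recent = {p for p, t in self.last_heard.items()
                  if now - t <= self.delta}
        # Up to alpha(delta) of these may have crashed since they were
        # last heard, so the estimate is returned with that margin.
        return recent, alpha(self.delta)
```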

3.
It is argued that Penrose's use of Gödel's theorem to show that no machine can reproduce the achievements of a human mathematician depends on extra assumptions that would also show that no machine could ever check the validity of mathematical proofs; and it is claimed that the latter result contradicts views about the possibility of communicating mathematical proofs that are widely held and that are endorsed by Penrose.

4.
In Panteley and Loria (2017), a framework for the study of synchronisation and collective behaviour of networked heterogeneous systems was introduced. It was underlined that in such a scenario an emergent collective behaviour arises, one that is inherent to the network and independent of the interconnection strength. Therefore, the natural way to make a complete study of synchronisation is to investigate, on one hand, the stability of the emergent dynamical system and, on the other, to assess the difference between the motion of each individual system and that of the emergent one. Thus, if all systems' motions approach that of the emergent dynamics, we say that they reach dynamic consensus. In this paper, we study dynamic consensus of a fairly general class of nonlinear heterogeneous oscillators, called Stuart–Landau oscillators. We establish that the emergent dynamics is that of an 'averaged' oscillator whose global attractor is a limit cycle and, moreover, we determine its frequency of oscillation. Then, we show that the heterogeneous oscillators achieve practical dynamic consensus, that is, their synchronisation errors measured relative to the collective motion are ultimately bounded.
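
A toy numerical sketch of the setting (not the paper's analysis): a few heterogeneous Stuart–Landau oscillators under all-to-all diffusive coupling, integrated alongside a single oscillator with averaged parameters; the parameter values and coupling choice are invented for illustration:

```python
import numpy as np

# Heterogeneous Stuart-Landau dynamics, assumed here in the common form
# dz_i/dt = (mu_i + i*w_i - |z_i|^2) z_i + gamma * mean_j(z_j - z_i),
# compared against an "averaged" oscillator using mean parameters.
N, gamma, dt, steps = 5, 10.0, 1e-3, 50_000
rng = np.random.default_rng(0)
mu = rng.uniform(0.5, 1.5, N)            # heterogeneous growth rates
w = rng.uniform(0.8, 1.2, N)             # heterogeneous frequencies
z = rng.standard_normal(N) + 1j * rng.standard_normal(N)
z_avg = z.mean()                         # state of the averaged oscillator

for _ in range(steps):
    coupling = gamma * (z.mean() - z)    # all-to-all diffusive coupling
    z += dt * ((mu + 1j * w - np.abs(z) ** 2) * z + coupling)
    z_avg += dt * (mu.mean() + 1j * w.mean() - abs(z_avg) ** 2) * z_avg

# Synchronisation errors measured relative to the collective (mean) motion:
print("errors vs. mean motion:", np.abs(z - z.mean()))
print("averaged-oscillator radius (~sqrt(mean mu)):", abs(z_avg))
```

Under strong coupling the errors relative to the mean motion should settle to a small but nonzero bound, which is the flavour of the 'practical dynamic consensus' the abstract describes.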

5.
One of the most influential arguments against the claim that computers can think is that while our intentionality is intrinsic, that of computers is derived: it is parasitic on the intentionality of the programmer who designed the computer program. Daniel Dennett chose a surprising strategy for arguing against this asymmetry: instead of denying that the intentionality of computers is derived, he endeavours to argue that human intentionality is derived too. I intend to examine the biological plausibility of Dennett's suggestion and show that Dennett's argument for the claim that human intentionality is derived because it was designed by natural selection is based on a misunderstanding of how natural selection works.

6.
International Journal of Computer Mathematics, 2012, 89(11): 2462–2476
The algebra of the Kronecker products of matrices is recapitulated using a notation that reveals the tensor structures of the matrices. It is claimed that many of the difficulties that are encountered in working with the algebra can be alleviated by paying close attention to the indices that are concealed beneath the conventional matrix notation. The vectorization operations and the commutation transformations that are common in multivariate statistical analysis alter the positional relationship of the matrix elements. These elements correspond to numbers that are liable to be stored in contiguous memory cells of a computer, which should remain undisturbed. It is suggested that, in the absence of an adequate index notation that enables the manipulations to be performed without disturbing the data, even the most clear-headed of computer programmers is liable to perform wholly unnecessary and time-wasting operations that shift data between memory cells.
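
A short NumPy sketch of the bookkeeping at issue, assuming the standard column-major vec() convention: the Kronecker identity vec(AXB) = (Bᵀ ⊗ A) vec(X) checked numerically, and the observation that a transpose (the data movement behind a commutation matrix) can be a zero-copy view:

```python
import numpy as np

rng = np.random.default_rng(1)
A, X, B = rng.random((2, 3)), rng.random((3, 4)), rng.random((4, 5))

vec = lambda M: M.reshape(-1, order="F")   # column-major vectorization

# The classic identity relating matrix products to Kronecker products:
lhs = vec(A @ X @ B)
rhs = np.kron(B.T, A) @ vec(X)
assert np.allclose(lhs, rhs)

# The commutation matrix K satisfies K vec(X) = vec(X^T); as a data
# movement it is just a strided transpose, i.e. a view in NumPy:
assert np.shares_memory(X, X.T)            # no memory cells disturbed
print("vec(AXB) identity holds; transpose is a zero-copy view")
```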

7.
The issue of what we consider to be the identity of a person has become increasingly complex as we have made ever greater use of the facilities and services made available by developing technologies and the Internet. In the past people normally had one identity, while in the current environment it is acceptable to maintain separate 'identities' for different aspects of our online interactions.

Proving beyond a reasonable doubt that an individual suspected of a crime based on the technologies we increasingly rely on was the actual perpetrator has always been problematic. It is relatively easy to determine the device that was used, but proving that the suspect was the person who used it has always been more difficult.

This paper looks at a range of issues that have affected what we consider to be reasonable proof of identity, and a number of the problems this causes in identifying the perpetrator of a crime.

8.
The common metric temporal logics for continuous time were shown to be insufficient when it was proved that they cannot express a modality suggested by Pnueli. Moreover, no finite temporal logic can express all the natural generalizations of this modality. It followed that if we look for an optimal decidable metric logic, we must either accept infinitely many modalities or adopt a different formalism.

Here we identify a fragment of the second-order monadic logic of order with the "+1" function that expresses all the Pnueli modalities and much more. Its main advantage over the temporal logics is that it enables us to say not just that within a prescribed time there is a point where some punctual event will occur, but also that within a prescribed time some process that starts now (or that started before, or that will start soon) will terminate. We prove that this logic is decidable with respect to satisfiability and validity over continuous time. The proof depends heavily on the theory of compositionality. In particular, every temporal logic that has truth tables in this logic is automatically decidable. We extend this result by proving that any temporal logic that has all its modalities defined, by means more general than truth tables, in a logic stronger than the one just described has a decidable satisfiability problem. We suggest that this monadic logic can be the framework in which temporal logics can be safely defined, with the guarantee that their satisfiability problem is decidable.
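
As a hedged illustration of the kind of property at stake (the formula and predicate names are invented here, not drawn from the paper), a bounded-termination statement in a monadic logic of order with the "+1" function might look like:

```latex
% Illustrative only: "a process that starts at t0 terminates within one
% time unit", expressed with the order relation and the +1 function over
% continuous time; Start, Run, End are invented monadic predicates.
\exists t \,\bigl( t_0 < t \;\wedge\; t < t_0 + 1
    \;\wedge\; \mathrm{Start}(t_0) \;\wedge\; \mathrm{End}(t)
    \;\wedge\; \forall s \,( t_0 < s \wedge s < t \rightarrow \mathrm{Run}(s) ) \bigr)
```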

9.
10.
We describe a new incremental algorithm for training linear threshold functions: the Relaxed Online Maximum Margin Algorithm, or ROMMA. ROMMA can be viewed as an approximation to the algorithm that repeatedly chooses the hyperplane that classifies previously seen examples correctly with the maximum margin. It is known that such a maximum-margin hypothesis can be computed by minimizing the length of the weight vector subject to a number of linear constraints. ROMMA works by maintaining a relatively simple relaxation of these constraints that can be efficiently updated. We prove a mistake bound for ROMMA that is the same as that proved for the perceptron algorithm. Our analysis implies that the maximum-margin algorithm also satisfies this mistake bound; this is the first worst-case performance guarantee for this algorithm. We describe some experiments using ROMMA and a variant that updates its hypothesis more aggressively as batch algorithms to recognize handwritten digits. The computational complexity and simplicity of these algorithms are similar to those of the perceptron algorithm, but their generalization is much better. We show that a batch algorithm based on aggressive ROMMA converges to the fixed-threshold SVM hypothesis.
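
A minimal sketch of the relaxed update step, assuming (as in the usual presentation of ROMMA) that on a misclassified example the new weight vector is the minimum-norm w satisfying w·wₜ ≥ ‖wₜ‖² and yₜ(w·xₜ) ≥ 1; the closed form below follows from making both constraints active, and the toy data are invented:

```python
import numpy as np

def romma_step(w, x, y):
    """One relaxed online maximum-margin update; y must be +1 or -1."""
    if y * (w @ x) > 0:                 # correctly classified: keep w
        return w
    W, X, P = w @ w, x @ x, w @ x       # ||w||^2, ||x||^2, and w.x
    denom = W * X - P ** 2
    if denom == 0:                      # degenerate case (e.g. w = 0):
        return w + y * x                # fall back to a perceptron step
    c = (W * X - y * P) / denom         # coefficients of the minimum-norm
    d = W * (y - P) / denom             # solution w' = c*w + d*x
    return c * w + d * x

# Toy usage on linearly separable data:
rng = np.random.default_rng(0)
w = np.zeros(2)
for _ in range(200):
    x = rng.uniform(-1.0, 1.0, 2)
    y = 1.0 if x[0] + x[1] > 0 else -1.0
    w = romma_step(w, x, y)
print("learned separator normal:", w)
```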

11.
We propose a tone-mapping algorithm that minimizes a function representing the visual sensation distortion that occurs after tone mapping. For this function, we consider both brightness and local band-limited contrast, to reduce the darkening of high-dynamic-range images that occurs when they are displayed on low-dynamic-range devices with conventional histogram-based, contrast-oriented tone mapping. By exploiting human visual characteristics, we simplify the problem and find a closed-form solution that minimizes the visual sensation distortion function. In both subjective and objective evaluations, the proposed algorithm produces processed images that are the most similar to the originals and have the best subjective image quality.
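
For context, a minimal sketch of the conventional histogram-based tone mapping that the abstract uses as its point of comparison; this is the baseline, not the proposed algorithm, and the log-domain equalization choice here is an assumption:

```python
import numpy as np

def histogram_tonemap(luminance: np.ndarray, bins: int = 256) -> np.ndarray:
    """Map HDR luminance to [0, 1] by histogram equalization on log
    luminance: display value = CDF(log L). Dense luminance ranges get
    most of the output range, which can darken the rest of the image."""
    log_l = np.log(luminance + 1e-6)
    hist, edges = np.histogram(log_l, bins=bins)
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]                               # normalize CDF to [0, 1]
    return np.interp(log_l, edges[:-1], cdf)     # apply CDF as the curve

hdr = np.random.lognormal(mean=0.0, sigma=2.0, size=(4, 4))  # toy HDR data
print(histogram_tonemap(hdr))
```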

12.
Cairns-Smith has proposed that life began as structural patterns in clays that self-replicated during cycles of crystal growth and fragmentation. Complex, evolved crystal forms could then have catalyzed the formation of a more advanced genetic material. A crucial weakness of this theory is that it is unclear how complex crystals might arise through Darwinian evolution and selection. Here we investigate whether complex crystal patterns could evolve using a model system for crystal growth, DNA tile crystals, that is amenable to both theoretical and experimental inquiry. It was previously shown that in principle, the evolution of crystals assembled from a set of thousands of DNA tile types under very specific environmental conditions could produce arbitrarily complex patterns. Here we show that evolution driven only by the dearth of one monomer type could produce complex crystals from just 12 monomer types. When a monomer type is rare, crystals that use few of this monomer type are selected for. We use explicit enumeration to show that there are situations in which crystal species that use a particular monomer type less frequently will grow faster, yet to do so requires that the information contained in the crystal become more complex. We show that this feature of crystal organization could allow more complex crystal morphologies to be selected for in the right environment, using both analysis in a simple model of self-assembly and stochastic kinetic simulations of crystal growth. The proposed mechanism of evolution is simple enough to test experimentally and is sufficiently general that it may apply to other DNA tile crystals or even to natural crystals, suggesting that complex crystals could evolve from simple starting materials because of relative differences in concentrations of the materials needed for growth.
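
A deliberately crude toy (not the paper's tile model, and with invented rates and counts) of the selection mechanism described: when one monomer type is scarce, a "genotype" that encodes a pattern using that monomer less often grows faster and comes to dominate:

```python
import random

random.seed(0)

MONOMERS = {"scarce": 0.05, "common": 1.0}       # relative availabilities
GENOTYPES = {
    "A (simple)":  {"scarce": 4, "common": 8},   # scarce-monomer uses per unit
    "B (complex)": {"scarce": 1, "common": 11},
}

def growth_rate(usage):
    # Growth is limited by the slowest-to-arrive required monomer type.
    return min(MONOMERS[m] / count for m, count in usage.items())

pop = {g: 1.0 for g in GENOTYPES}
for _ in range(40):                              # growth-fragmentation cycles
    for g, usage in GENOTYPES.items():
        pop[g] *= 1.0 + growth_rate(usage)
        if random.random() < 0.5:
            pop[g] *= 0.5                        # fragmentation splits crystals

total = sum(pop.values())
for g, n in pop.items():
    print(f"{g}: {n / total:.1%} of crystal mass")
```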

13.
Subtleties of Transactional Memory Atomicity Semantics (total citations: 1; self-citations: 0; by others: 1)
Transactional memory has great potential for simplifying multithreaded programming by allowing programmers to specify regions of the program that must appear to execute atomically. Transactional memory implementations then optimistically execute these transactions concurrently to obtain high performance. This work shows that the same atomic guarantees that give transactions their power also have unexpected and potentially serious negative effects on programs that were written assuming narrower scopes of atomicity. We make four contributions: (1) we show that a direct translation of lock-based critical sections into transactions can introduce deadlock into otherwise correct programs, (2) we introduce the terms strong atomicity and weak atomicity to describe the interaction of transactional and non-transactional code, (3) we show that code that is correct under weak atomicity can deadlock under strong atomicity, and (4) we demonstrate that sequentially composing transactional code can also introduce deadlocks. These observations invalidate the intuition that transactions are strictly safer than lock-based critical sections, that strong atomicity is strictly safer than weak atomicity, and that transactions are always composable.
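
A runnable Python caricature of contribution (1): a handshake that works with fine-grained locks (the waiter holds no lock while spinning) deadlocks once each critical section becomes a single atomic region. Here `atomic` is a stand-in for a transactional region, implemented as one global lock purely for illustration, not a real transactional memory:

```python
import threading

atomic = threading.Lock()     # stand-in: one "transaction" at a time
flag = False

def waiter():
    with atomic:              # transaction: spin until flag is set...
        while not flag:       # ...but setter() can never commit its
            pass              # own transaction to set it -> deadlock

def setter():
    global flag
    with atomic:              # transaction that would set the flag
        flag = True

t1 = threading.Thread(target=waiter, daemon=True)
t2 = threading.Thread(target=setter, daemon=True)
t1.start(); t2.start()
t2.join(timeout=1.0)
print("setter committed:", not t2.is_alive())   # False: handshake deadlocked
```

With ordinary locks the waiter could spin outside any critical section, letting the setter proceed; widening the scope of atomicity is what introduces the deadlock.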

14.
Some anecdotal accounts and research reports have suggested that obsessive social media involvement can turn into a compulsive behavior among university students. Unfortunately, research that sheds light on the possible conditional nature of that relationship is scarce at best. Therefore, this study addresses this issue by developing a contingency-based model and testing it using data gathered from a sample of university students. The model postulates that compulsive social media use arises due to self-awareness factors, and that together these in turn predict problematic learning outcomes. It also postulates that these relationships are moderated by the influence of technological factors. The results indicate that self-esteem has a significant negative influence on compulsive social media use and that interaction anxiousness has a significant positive influence on the same. The results also reveal that only compulsive social media use has a significant direct influence on problematic learning outcomes, and that social media complementarity plays a moderating role in the model. We discuss the implications of these findings for research and practice.

15.
In this paper we provide a psychological account of the nature and development of explanation. We propose that an explanation is an account that provides a conceptual framework for a phenomenon that leads to a feeling of understanding in the reader/hearer. The explanatory conceptual framework goes beyond the original phenomenon, integrates diverse aspects of the world, and shows how the original phenomenon follows from the framework. We propose that explanations in everyday life are judged on the criteria of empirical accuracy, scope, consistency, simplicity, and plausibility. We conclude that explanations in science are evaluated by the same criteria, plus those of precision, formalisms, and fruitfulness. We discuss several types of explanation that are used in everyday life – causal/mechanical, functional, and intentional. We present evidence to show that young children produce explanations (often with different content from those of adults) that have the same essential form as those used by adults. We also provide evidence that children use the same evaluation criteria as adults, but may not apply those additional criteria for the evaluation of explanations that are used by scientists.

16.
17.
Berkeley [Minds and Machines 10 (2000) 1] described a methodology that showed the subsymbolic nature of an artificial neural network system that had been trained on a logic problem originally described by Bechtel and Abrahamsen [Connectionism and the Mind. Blackwell, Cambridge, MA, 1991]. It was also claimed in the conclusion of that paper that the evidence was suggestive that the network might, in fact, count as a symbolic system. Dawson and Piercey [Minds and Machines 11 (2001) 197] took issue with this latter claim. They described some lesioning studies that, they argued, showed that Berkeley's (2000) conclusions were premature. In this paper, those lesioning studies are replicated, and it is shown that the effects that Dawson and Piercey rely upon for their argument are merely an artifact of the threshold function they chose to employ. When a threshold function much closer to that deployed in the original studies is used, the significant effects disappear.
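
A toy numerical illustration (with invented values, not the replication itself) of how the choice of threshold function alone can create or erase apparent lesioning effects:

```python
import numpy as np

# The same post-lesion net inputs, passed through a hard step versus a
# steep-but-graded sigmoid. Units near zero flip to extreme 0/1 outputs
# under the step yet stay intermediate under the sigmoid, so apparent
# "significant effects" can be an artifact of the cutoff choice.
net_input = np.array([-0.10, -0.02, 0.03, 0.60])    # invented lesioned inputs

step_out = (net_input > 0).astype(float)            # hard threshold function
sigm_out = 1.0 / (1.0 + np.exp(-20.0 * net_input))  # graded activation

print("step:   ", step_out)
print("sigmoid:", np.round(sigm_out, 2))
```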

18.
To exploit the potential offered by integrated intelligent control systems, it is essential that underlying architectures are available that are able to integrate not only hardware and software, but also human beings, into the control loop. By studying the way that a human workforce operates, this paper suggests how a human-systems analogy can be drawn up that highlights some fundamental differences between the way that human-based and computer-based systems operate. One of the key differences is that humans are able to reason not only logically, but also in terms of time. This paper addresses the philosophy behind the DENIS architecture — a distributed architecture that enables this high level of compatibility between autonomous intelligent agents.

19.
Knowledge, 2000, 13(6): 361–368
This paper takes the approach that designing is situated and that concepts are formed as a consequence of the situatedness of designing. The paper presents a framework for concept formation that draws on a structure of a design agent that includes sensors, perceptors and conceptors that interact with each other and with the external and internal environment of the agent to produce the situation that is a contingent basis for the formation and use of concepts.

20.
An experimental procedure that evaluates the quality of a grasp is developed. In this procedure human subjects grasp a rigid object that is subjected to an external load. Three formulations that capture the sense of grasping quality are considered: the energy level stored in the gripper, the maximum value of the applied finger forces, and the distribution of the grasping forces. The applied finger forces are measured, and the quality values of the grasp based on these three formulations are computed. These grasping quality values are compared with numerical human assessments, which are obtained via a psychophysical magnitude-estimation method. We derive an augmented weighted functional that combines the three formulations and show that it exhibits a high correlation with human quality assessment. Our results demonstrate that the most dominant mechanism characterizing the quality of a rigid-body grasp is the uniformity level of the contact forces. © 1997 John Wiley & Sons, Inc.
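
A hedged sketch of the three formulations and an augmented weighted combination; the specific functionals and the weights below are invented stand-ins rather than the paper's fitted values:

```python
import numpy as np

def grasp_quality(forces: np.ndarray, weights=(0.2, 0.2, 0.6)):
    """Combine three grasp-quality cues for given finger-force magnitudes.
    Lower is better here (less effort, more even loading)."""
    energy = np.sum(forces ** 2)                   # energy stored in gripper
    peak = np.max(forces)                          # maximum applied force
    uniformity = np.std(forces) / np.mean(forces)  # spread of contact forces
    w1, w2, w3 = weights                           # invented weights; the
    return w1 * energy + w2 * peak + w3 * uniformity  # paper fits to humans

balanced = np.array([2.0, 2.1, 1.9])   # nearly uniform finger forces
lopsided = np.array([4.5, 1.0, 0.5])   # similar total load, uneven forces
print(grasp_quality(balanced), "<", grasp_quality(lopsided))
```

With the uniformity term weighted most heavily, the even grasp scores better, mirroring the abstract's finding that force uniformity dominates perceived grasp quality.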
