A note on evidence and confirmation in machine learning
Authors: James P. Delgrande
Affiliation: School of Computing Science, Simon Fraser University, Burnaby, B.C., Canada V5A 1S6
Abstract: This paper addresses the problem in inductive generalization of determining when a general hypothesis is supported by a particular instance. If we accept, first, that some facts do indeed support a general hypothesis and, second, that an instance which supports a hypothesis also supports all logical consequences of that hypothesis, then unintuitive and problematic results are immediately forthcoming. These assumptions lead, for example, to the conclusion that a blue Honda is confirming evidence for the hypothesis that ravens are black. This problem is variously known as the paradoxes of confirmation or Hempel's paradox. In this paper I develop a formal characterization of the problem. The assumption that whatever supports a hypothesis also supports all classical consequences of that hypothesis is rejected. Rather, I argue that a weaker notion of consequence should be adopted for determining which consequences of a hypothesis are supported by the same evidence. An extant formal system for learning from examples is used to address these problems of evidential support, and it is shown that in this framework the problems do not arise.
Keywords: induction; learning from examples; confirmation; evidence; ravens
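The reasoning behind the paradox mentioned in the abstract can be sketched as follows; the predicate symbols R (raven) and B (black) are illustrative choices, not notation taken from the paper:

\[
\forall x\,(Rx \rightarrow Bx) \;\equiv\; \forall x\,(\neg Bx \rightarrow \neg Rx)
\]

If an observed instance of ¬Bh ∧ ¬Rh (for example, a blue Honda h) confirms the right-hand hypothesis, and if confirmation is assumed to carry over to all logically equivalent or entailed hypotheses, then the same observation also confirms "all ravens are black". The paper's proposal, as described in the abstract, is to weaken the second assumption by restricting which consequences of a hypothesis inherit its evidential support.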