Similar Literature
20 similar records found (search time: 31 ms)
1.
This paper aims to shed light on the dynamics of information systems (IS) as a discipline in the making. We use the ideas of the sociologist Abbott to propose three different stages of a discipline's development: differentiation, competition, and absorption. These stages reflect how disciplines go through different cycles and how they acquire, consolidate or lose elements of knowledge. We map these stages using citation and co-citation analyses of two main IS academic journals (EJIS and MISQ) from 1995 to 2008. Our results indicate that IS is currently in a stage of absorption, with research being consolidated around the theme of ‘IS acceptance’. Dominant models and frameworks related to this theme are predictive and thus lend themselves to positivist and quantitative research. In this stage there is also a healthy degree of variety in IS, including dormant elements which could reignite. Implications derived from our findings aim to help in the consolidation and extension of knowledge about IS in both academia and practice.
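As a rough illustration of the co-citation analysis the abstract mentions, the sketch below counts how often two works are cited together across reference lists; pairs with high counts form the clusters such studies map. The helper name and the citation labels are hypothetical, not data from the study:

```python
from itertools import combinations
from collections import Counter

def cocitation_counts(reference_lists):
    """Count how often each pair of works appears together in a reference list.

    reference_lists: iterable of reference lists, one per citing article.
    Returns a Counter mapping frozenset({a, b}) -> co-citation count.
    """
    counts = Counter()
    for refs in reference_lists:
        # Each unordered pair in one article's references is one co-citation.
        for a, b in combinations(sorted(set(refs)), 2):
            counts[frozenset((a, b))] += 1
    return counts

# Hypothetical citing articles and their reference lists.
articles = [
    ["Davis1989", "DeLone1992", "Orlikowski1991"],
    ["Davis1989", "DeLone1992"],
    ["Davis1989", "Orlikowski1991"],
]
cc = cocitation_counts(articles)
print(cc[frozenset(("Davis1989", "DeLone1992"))])  # 2
```

In a real bibliometric study the resulting counts would feed a clustering or multidimensional-scaling step; this sketch stops at the raw pair counts.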

2.
Knowledge acquisition has been a critical bottleneck in building knowledge-based systems. In past decades, several methods and systems have been proposed to cope with this problem, most of them designed to acquire domain knowledge from a single expert. However, as multiple experts may have different experiences and knowledge of the same application domain, it is necessary to elicit and integrate knowledge from multiple experts when building an effective expert system. Moreover, recent literature has shown that “time” is an important parameter that can significantly affect the accuracy of an expert system's inference results; therefore, when eliciting domain expertise from multiple experts, taking the “time” factor into consideration becomes a challenging and important issue. To cope with these problems, in this study we propose a Delphi-based approach to eliciting knowledge from multiple experts. An application to the diagnosis of Severe Acute Respiratory Syndrome demonstrates the superiority of the novel approach.
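A Delphi process generally runs experts' estimates through anonymous feedback rounds until they converge. The toy sketch below mimics only that generic convergence loop; it is a hedged illustration of the Delphi idea, not the paper's actual approach, and the function, weight, and numbers are invented:

```python
import statistics

def delphi_rounds(estimates, weight=0.5, tol=1.0, max_rounds=10):
    """Pull each expert's estimate toward the group median each round,
    mimicking Delphi feedback, until the spread falls below tol.

    Returns (consensus estimate, number of feedback rounds used).
    """
    rounds = 0
    while rounds < max_rounds:
        median = statistics.median(estimates)
        spread = max(estimates) - min(estimates)
        if spread <= tol:
            break  # experts have (approximately) converged
        # Each expert revises partway toward the group's median opinion.
        estimates = [e + weight * (median - e) for e in estimates]
        rounds += 1
    return statistics.median(estimates), rounds

# Three hypothetical experts estimating, say, an incubation period in days.
consensus, n_rounds = delphi_rounds([10.0, 14.0, 30.0])
print(consensus, n_rounds)
```

A real Delphi study would collect revised judgments from the human experts themselves rather than simulate the revision, and the paper additionally weights knowledge by its time of validity.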

3.
Coalition logic (CL) is one of the most influential logical formalisms for reasoning about the strategic abilities of multi-agent systems. CL can specify what a group of agents can achieve through choices of their actions: [C]φ states that the group of agents C has a strategy to bring about φ by collective action, no matter what the other agents do. However, CL lacks a temporal dimension and thus cannot capture the dynamic aspects of a system. Therefore, CL cannot formalize the evolution of the agents' rational mental attitudes, such as knowledge, which has been shown to be very useful in the specification and verification of distributed systems and has received a substantial amount of study. In this paper, we introduce coalition logic of temporal knowledge (CLTK) by incorporating a temporal logic of knowledge (Halpern and Vardi's logic CKL_n) into CL, equipping CL with the power to formalize how agents' knowledge (individual or group) evolves over time through coalitional forces, as well as the temporal properties of strategic abilities. Furthermore, we provide an axiomatic system for CLTK and prove that it is sound and complete; the complexity of the satisfiability problem is shown to be EXPTIME-complete.
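The one-step semantics of [C]φ (some joint choice by coalition C guarantees a φ-state against every choice of the remaining agents) can be checked by brute force over action profiles. The sketch below illustrates only that basic CL semantics; the encoding of the game is a made-up example and does not implement CLTK:

```python
from itertools import product

def coalition_can_enforce(agents, coalition, moves, outcome, goal_states):
    """One-step [C]phi check: does some joint choice for the coalition
    lead to a goal state for every joint choice of the other agents?"""
    others = [a for a in agents if a not in coalition]
    for c_choice in product(*(moves[a] for a in coalition)):
        profile_c = dict(zip(coalition, c_choice))
        if all(
            outcome({**profile_c, **dict(zip(others, o_choice))}) in goal_states
            for o_choice in product(*(moves[a] for a in others))
        ):
            return True  # this coalition choice enforces the goal
    return False

# Hypothetical two-agent game: the outcome is 'win' iff agent 1 plays 'a',
# so agent 1 alone can enforce 'win' while agent 2 alone cannot.
agents = [1, 2]
moves = {1: ["a", "b"], 2: ["x", "y"]}
outcome = lambda prof: "win" if prof[1] == "a" else "lose"
print(coalition_can_enforce(agents, [1], moves, outcome, {"win"}))  # True
print(coalition_can_enforce(agents, [2], moves, outcome, {"win"}))  # False
```

The full logic adds temporal operators and knowledge modalities on top of this ability operator, which is what makes its satisfiability problem EXPTIME-complete.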

4.
5.
Rapid advances in image acquisition and storage technology underline the need for real-time algorithms that are capable of solving large-scale image processing and computer-vision problems. The minimum s-t cut problem, a classical combinatorial optimization problem, is a prominent building block in many vision and imaging algorithms such as video segmentation, co-segmentation, stereo vision, multi-view reconstruction, and surface fitting, to name a few. Finding a real-time algorithm that optimally solves this problem is therefore of great importance. In this paper, we introduce to computer vision Hochbaum's pseudoflow (HPF) algorithm, which optimally solves the minimum s-t cut problem. We compare the performance of HPF, in terms of execution times and memory utilization, with three leading published algorithms: (1) Goldberg and Tarjan's push-relabel (PRF); (2) Boykov and Kolmogorov's augmenting paths (BK); and (3) Goldberg's partial augment-relabel. While the common practice in computer vision is to use either the BK or the PRF algorithm, our results demonstrate that, in general, the HPF algorithm is more efficient and uses less memory than these three algorithms. This strongly suggests that HPF is a great option for the many real-time computer-vision problems that require solving the minimum s-t cut problem.
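For readers unfamiliar with the problem: by the max-flow/min-cut theorem, the weight of a minimum s-t cut equals the maximum s-t flow, so any max-flow routine solves it. The sketch below uses the textbook Edmonds–Karp method rather than HPF (whose pseudoflow machinery is beyond a short example), on a small hypothetical graph:

```python
from collections import deque

def max_flow(capacity, s, t):
    """Edmonds-Karp max-flow on an adjacency-matrix graph. The returned
    value equals the weight of a minimum s-t cut (max-flow/min-cut theorem)."""
    n = len(capacity)
    residual = [row[:] for row in capacity]
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and residual[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return flow  # no augmenting path left: flow is maximum
        # Find the bottleneck capacity along the path, then push flow.
        bottleneck, v = float("inf"), t
        while v != s:
            bottleneck = min(bottleneck, residual[parent[v]][v])
            v = parent[v]
        v = t
        while v != s:
            residual[parent[v]][v] -= bottleneck
            residual[v][parent[v]] += bottleneck
            v = parent[v]
        flow += bottleneck

# Tiny hypothetical 4-node graph: source s=0, sink t=3.
cap = [
    [0, 3, 2, 0],
    [0, 0, 1, 2],
    [0, 0, 0, 2],
    [0, 0, 0, 0],
]
print(max_flow(cap, 0, 3))  # 4
```

In vision applications the graph typically has one node per pixel plus the two terminals, which is why the constant factors and memory behaviour compared in the paper matter so much at scale.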

6.
We consider the processes of achieving alignment in coordinated inter-organizational networks through a case study of a system development project in ARC Transistance, a network of European automobile clubs that cooperate to provide pan-European service. The theoretical contribution of the paper is, first, an extended strategic alignment model for inter-organizational networks that distinguishes between integration of IS with business strategy and infrastructure, and what we label ‘accordance’ between the strategies and infrastructures of the network and the member firms. Second, we propose that for a network organization, network and member strategies might be complementary as well as tightly coupled. We similarly argue that IS architectures for networks should strive to be ‘business strategy-neutral’ to more easily accommodate the diversity of members. Finally, we discuss how the process of developing a network information system can be a driver towards network alignment, but how the lack of effective governance structures makes alignment harder to achieve.

7.
This paper profiles the types of research activity that have been published in EJIS from 1997 to 2007. Our analysis includes variables such as the most productive authors, citation analysis, universities associated with the most research publications, geographic diversity, authors’ background, subject areas most often investigated, unit of analysis and research methodologies. The classification of the topics and methodologies used by the most highly published authors will help prospective authors gauge whether their paper is suitable for EJIS. The major geographical source of information system (IS) research published in EJIS is AIS region 2 (Europe, the Middle East and Africa), but with substantial contributions from AIS region 1 (American-based researchers and universities) and AIS region 3 (Asia–Pacific). The most common research method used is the case study approach, with other methods such as surveys and library research also used frequently. IS management and IS development are the two most researched IS topics published in EJIS. The research and results reported in this paper are comparable with a previous paper published about the Information Systems Journal. Any further such studies will thus be able to make similar comparisons between these journals and any others that have subsequently been covered in this way. The paper concludes with the need for more substantive research on the topic if journal comparisons are to achieve their potential.

8.
There has been growing interest in theory building in Information Systems (IS) research. We extend this literature by examining theory building perspectives. We define a perspective as a researcher’s choice of the types of concepts and relationships used to construct a theory, and we examine three perspectives – process, variance, and systems. We contribute by clarifying these perspectives and explaining how they can be used more flexibly in future research. We illustrate the value of this more flexible approach by showing how researchers can use different theoretical perspectives to critique and extend an existing theoretical model (in our case, the IS Success Model). Overall, we suggest a shift from the traditional process-variance dichotomy to a broader view defined by conceptual latitude (the types of concepts and relationships available) and conceptual fit (the types of concepts and relationships appropriate for a given study). We explain why this shift should help researchers as they engage in the knowledge generation process.

9.
Strategic information systems (IS) planning is not an easy task and knowing which critical areas to manage certainly enhances IS planning success. Studies of critical success factors (CSFs) usually dealt with specific systems or management technique implementation, such as manufacturing resource planning (MRP) and total quality management (TQM). There exists little empirical research on CSFs per se in strategic IS planning. This paper is an effort to enhance existing knowledge on how strategic IS planning should be effectively managed. Using data from a survey on IS planning conducted in 1996 by the National University of Singapore, we identified and rank-ordered the CSFs in strategic IS planning in the Singapore context. We also examined the sources of assistance and expertise that companies undertaking IS planning in Singapore can tap.

10.
The authors propose a model for an intelligent assistant to aid in building knowledge-based systems (KBSs) and discuss a preliminary implementation. The assistant participates in KBS construction, including acquisition of an initial model of a problem domain, acquisition of control and task-specific inference knowledge, testing and validation, and long-term maintenance of encoded knowledge. The authors present a hypothetical scenario in which the assistant and a KBS designer cooperate to create an initial domain model and then discuss five categories of knowledge the assistant requires to offer such help. They discuss two software technologies on which the assistant is based: an object-oriented programming language and a user-interface framework.

11.
About 20 years ago, Markus and Robey noted that most research on IT impacts had been guided by deterministic perspectives and had neglected to use an emergent perspective, which could account for contradictory findings. They further observed that most research in this area had been carried out using variance theories at the expense of process theories. Finally, they suggested that more emphasis on multilevel theory building would likely improve empirical reliability. In this paper, we reiterate the observations and suggestions made by Markus and Robey on the causal structure of IT impact theories and carry out an analysis of empirical research published in four major IS journals, Management Information Systems Quarterly (MISQ), Information Systems Research (ISR), the European Journal of Information Systems (EJIS), and Information and Organization (I&O), to assess compliance with those recommendations. Our final sample consisted of 161 theory-driven articles, accounting for approximately 21% of all the empirical articles published in these journals. Our results first reveal that 91% of the studies in MISQ, ISR, and EJIS focused on deterministic theories, while 63% of those in I&O adopted an emergent perspective. Furthermore, 91% of the articles in MISQ, ISR, and EJIS adopted a variance model; this compares with 71% from I&O that applied a process model. Lastly, mixed levels of analysis were found in 14% of all the surveyed articles. Implications of these findings for future research are discussed.

12.
In order to explore scientific writing in Information Systems (IS) journals, we adopt a combination of historical and rhetorical approaches. We first investigate the history of universities, business schools, learned societies and scientific articles. This perspective allows us to capture the legacy of scientific writing standards, which emerged in the 18th and 19th centuries. Then, we focus on two leading IS journals (EJIS and MISQ). An historical analysis of both outlets is carried out, based on data related to their creation, evolution of editorial statements, and key epistemological and methodological aspects. We also focus on argumentative strategies found in a sample of 436 abstracts from both journals. Three main logical anchorages (sometimes combined) are identified, and related to three argumentative strategies: ‘deepening of knowledge’, ‘solving an enigma’ and ‘addressing a practical managerial issue’. We relate these writing norms to historical imprints of management and business studies, in particular: enigma-focused rhetorics, interest in institutionalized literature, neglect of managerially grounded rhetoric and lack of reflexivity in scientific writing. We explain this relation as a quest for academic legitimacy. Lastly, some suggestions are offered to address the discrepancies between these writing norms and more recent epistemological and theoretical stances adopted by IS researchers.

13.
This paper presents a framework for justifying generalization in information systems (IS) research. First, using evidence from an analysis of two leading IS journals, we show that the treatment of generalization in many empirical papers in leading IS research journals is unsatisfactory. Many quantitative studies need clearer definition of populations and more discussion of the extent to which ‘significant’ statistics and use of non-probability sampling affect support for their knowledge claims. Many qualitative studies need more discussion of boundary conditions for their sample-based general knowledge claims. Second, the proposed new framework is presented. It defines eight alternative logical pathways for justifying generalizations in IS research. Three key concepts underpinning the framework are the need for researcher judgment when making any claim about the likely truth of sample-based knowledge claims in other settings; the importance of sample representativeness and its assessment in terms of the knowledge claim of interest; and the desirability of integrating a study's general knowledge claims with those from prior research. Finally, we show how the framework may be applied by researchers and reviewers. Observing the pathways in the framework has potential to improve both research rigour and practical relevance for IS research.

14.
This study reports on a follow-up analysis of a prior study (Teo & King, 1996, Information and Management) of the impact of the integration of information systems planning (ISP) with business planning (BP) on organizational performance. The empirical data are re-analysed using path analysis in order to determine the direct and indirect impacts of BP-ISP integration on intermediate performance measures related to ISP process and output problems, as well as on five perceptual measures of organizational performance. The results empirically substantiate the importance of BP-ISP integration, since higher levels of integration were found to have a significant inverse relationship with the extent of both process and output varieties of ISP problems and a significant positive relationship with each of the five perceptual measures of the extent of IS contributions to overall organization performance.
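Path analysis decomposes a predictor's total effect into a direct effect plus indirect effects routed through mediators (each indirect effect being the product of the path coefficients along its route). The sketch below illustrates this on synthetic data; the variable names and coefficient values are invented and do not reproduce the study's model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Synthetic model: integration -> problems -> performance, plus a direct path.
integration = rng.normal(size=n)                    # BP-ISP integration
problems = -0.6 * integration + rng.normal(size=n)  # ISP problems (mediator)
performance = 0.4 * integration - 0.5 * problems + rng.normal(size=n)

def slopes(y, *xs):
    """OLS slope coefficients of y on xs (intercept dropped)."""
    X = np.column_stack([np.ones(n), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

a, = slopes(problems, integration)                  # integration -> problems
b_direct, b_mediator = slopes(performance, integration, problems)
indirect = a * b_mediator          # integration -> problems -> performance
total = b_direct + indirect
print(round(b_direct, 2), round(indirect, 2), round(total, 2))
```

With the coefficients used here, the recovered direct effect should sit near 0.4 and the indirect effect near (-0.6)(-0.5) = 0.3, showing how a negative link to problems still yields a positive indirect contribution to performance.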

15.
Instance matching is the problem of determining whether two instances describe the same real-world entity or not. Instance matching plays a key role in data integration and data cleansing, especially for building a knowledge base. For example, we can regard each article in an encyclopedia as an instance, and a group of articles which refer to the same real-world object as an entity. Therefore, articles about Washington should be distinguished and grouped into different entities such as Washington, D.C. (the capital of the USA), George Washington (the first president of the USA), Washington (a state of the USA), Washington (a village in West Sussex, England), Washington F.C. (a football club based in Washington, Tyne and Wear, England), and Washington, D.C. (a novel). In this paper, we propose a novel instance matching approach, Active Instance Matching with Pairwise Constraints, which brings the human into the loop of instance matching. The proposed approach generates candidate pairs in advance to reduce the computational complexity, and then iteratively selects the most informative pairs according to the uncertainty, influence, connectivity and diversity of the pairs. We evaluated our approach on two publicly available datasets, AMINER and WIDE-NUS, and then applied it to two large-scale real-world datasets, Baidu Baike and Hudong Baike, to build a Chinese knowledge base. The experiments and practice illustrate the effectiveness of our approach.
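Two of the ingredients the abstract names, candidate-pair generation and informativeness-based selection, can be sketched in a few lines. The blocking key, similarity function, and records below are hypothetical illustrations of the general technique, not the paper's actual method (which also weighs influence, connectivity and diversity):

```python
from collections import defaultdict

def candidate_pairs(records):
    """Blocking: only records sharing at least one name token become
    candidate pairs, avoiding the full O(n^2) comparison."""
    buckets = defaultdict(list)
    for rid, name in records.items():
        for token in name.lower().split():
            buckets[token].append(rid)
    pairs = set()
    for ids in buckets.values():
        for i in range(len(ids)):
            for j in range(i + 1, len(ids)):
                pairs.add(tuple(sorted((ids[i], ids[j]))))
    return pairs

def most_uncertain(pairs, score):
    """Active selection: the pair whose match score is closest to 0.5 is
    the most informative one to show a human annotator."""
    return min(pairs, key=lambda p: abs(score(p) - 0.5))

# Hypothetical instances about "Washington".
records = {1: "George Washington", 2: "Washington", 3: "Washington State University"}

def jaccard(pair):
    a = set(records[pair[0]].lower().split())
    b = set(records[pair[1]].lower().split())
    return len(a & b) / len(a | b)

pairs = candidate_pairs(records)
print(sorted(pairs))                    # [(1, 2), (1, 3), (2, 3)]
print(most_uncertain(pairs, jaccard))   # (1, 2)
```

Pair (1, 2) has a Jaccard score of exactly 0.5, so it is the one the human would be asked to label first; its answer then constrains later rounds.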

16.
This paper presents an experimental study into the processes that may contribute to building a better knowledge-based system. A model that defines quality to be composed of two related aspects, internal and external quality, is introduced. To test the model, 24 subjects developed a knowledge-based system for MBA course planning in an experimental setting over a 7 week period. Subjects were factored by development methodology (structured vs rapid prototyping), knowledge representation scheme used (rule-based vs hybrid) and programmer quality (naive vs experienced). The major finding is that an appropriate mix of development methodology, knowledge representation and personnel is necessary. No single development methodology or knowledge representation scheme is best, and a considerable number of interactions were observed in the experiment. Rapid prototyping combined with rule-based representation produced the best external quality in terms of functionality, but when combined with hybrid representation produced the worst. Similarly, rapid prototyping combined with hybrid representation produced the worst usability. Programmer quality had a positive effect on coding productivity, which in turn resulted in an increase in system usability. As with conventional software, increasing programmer quality can be very beneficial to both process and content. The study presented provides some evidence of the anomalies that are generated in the course of system development, and how they relate to internal quality. As might be expected, experienced programmers produced significantly fewer anomalies. A relationship was found between internal quality and usability, but not functionality.

17.
This paper addresses how technology-mediated mass collaboration offers a dramatically innovative alternative for producing IS research. We refer to this emerging genre as the crowdsourced research genre and develop a framework to structure discourse on how it may affect the production of IS research. After systematically traversing the alternative genre’s landscape using the framework, we propose a research agenda of the most substantial and imminent issues for the successful development of the genre, including contributor incentives, scholarly contribution assessment, anonymity, governance, intellectual property ownership, and value propositions. In addressing this research agenda, we reflect on what might be learned from other areas in which crowdsourcing has been established with success.

18.
This study offers an alternative interpretation to Banville and Landry’s (B&L, 1989) Can the Field of MIS Be Disciplined?, the canonical text that argued persuasively against the adoption of the Kuhnian view of scientific progress for the information systems (IS) field. Much has transpired in the quarter of a century since its publication, which provides us with new sources of understanding about paradigms and how they relate to the challenges faced by the IS field. On the basis of the hermeneutical principles of tradition, prejudice, temporal distance, history of effect and application, this study describes the context from which B&L was written, its dependence on Whitley’s (1984) The Intellectual and Social Organization of the Sciences, and examines several of its claims and assertions. In contrast to B&L, this study finds the Kuhnian model of scientific progress well suited for a multidisciplinary and pluralistic field like IS and concludes with guidelines on how to reclaim the more transformative aspects of the paradigm concept, engender a culture of contextual borrowing from reference disciplines, and encourage conceptual development and autonomous theory construction.

19.
Kurt Gödel’s incompleteness theorem is well known in mathematics, logic and philosophy circles. Gödel found a way, for any given P(UTM) (read as “P of UTM”, for “Program of a Universal Truth Machine”), actually to write down a complicated polynomial that has a solution iff (if and only if) G is true, where G stands for a Gödel sentence. So, if G’s truth is a necessary condition for the truth of a given polynomial, then P(UTM) has first to answer that G is true in order to secure the truth of the said polynomial. But, interestingly, P(UTM) could never answer that G was true. This necessarily implies that there is at least one truth that a P(UTM), however large it may be, cannot speak out. Daya Krishna and Karl Potter’s controversy regarding the construal of India’s philosophies dates back to Potter’s publication of Presuppositions of India’s Philosophies (1963, Englewood Cliffs: Prentice-Hall). In attacking many of India’s philosophies, Daya Krishna appears to have unwittingly touched a crucial point: how can there be knowledge of a ‘non-cognitive’ mokṣa? [‘Mokṣa’ is the final state of existence of an individual away from the Social Contract; see this author’s Indian Social Contract and its Dissolution (2008).] Mokṣa does not permit knowledge of one’s own self in the ordinary way, with the threefold distinction subject–knowledge–object, i.e., knower–knowledge–known. But what is important is to demonstrate whether such ‘knowledge’ of the non-cognitive mokṣa state can be logically shown, in a language, to be possible to attain, and that there is no contradiction involved in such a demonstration, because no one can possibly express the ‘experience-itself’ in language. Hence, if such ‘knowledge’ can be shown to be logically not impossible in language, then not only are Daya Krishna’s arguments against ‘non-cognitive mokṣa’ refuted, but the possibility of achieving ‘completeness’ in its truest sense, as opposed to Gödel’s ‘incompleteness’, is also shown.
In such circumstances, man would himself become a Universal Truth Machine, because the final state of mokṣa is construed in Advaita as the state of complete knowledge. This possibility of ‘completeness’ is set in this paper against the backdrop of Śrī Śaṅkarācārya’s Advaitic (non-dualistic) claim involved in the mahāvākyas (extraordinary propositions). (The mahāvākyas that Śaṅkara refers to are basically taken from different Upaniṣads; for example, “Aham Brahmāsmi” is from the Bṛhadāraṇyaka Upaniṣad, and “Tattvamasi” is from the Chāndogya Upaniṣad. Śrī Śaṅkarācārya wrote extensively; his main works include his commentaries (Bhāṣyas) on the Brahma-Sūtras, on the major Upaniṣads, and on the Śrīmad Bhagavad Gītā. Almost all these works are available in English translation published by Advaita Ashrama, 5 Dehi Entally Road, Calcutta 700014.) On the other hand, the ‘incompleteness’ of Gödel is due to the intervening G-sentence, which has an adverse self-referential element. Gödel’s incompleteness theorem in its mathematical form, with an elaborate introduction by R.W. Braithwaite, can be found in Meltzer (Kurt Gödel: On Formally Undecidable Propositions of Principia Mathematica and Related Systems. Oliver & Boyd, Edinburgh, 1962).
The present author believes, first, that semantic content cannot be substituted by any amount of arithmoquining (arithmoquining, or arithmetization, means, as Braithwaite says: “Gödel’s novel metamathematical method is that of attaching numbers to the signs, to the series of signs (formulae) and to the series of series of signs (‘proof-schemata’) which occur in his formal system… Gödel invented what might be called co-ordinate metamathematics…”, Meltzer, 1962, p. 7; in Antone (2006) it is said: “The problem is that he (Gödel) tries to replace an abstract version of the number (which can exist) with the concept of a real-number version of that abstract notion. We can state the abstraction of what the number needs to be [the arithmoquining of a number cannot be a proof-pair and an arithmoquine], but that is a concept that cannot be turned into a specific number, because by definition no such number can exist.”), especially so where first-hand personal experience is called for. Therefore, what ultimately rules is semanticity, as in a first-hand experience. Similar points are voiced, albeit implicitly, in Antone (Who Understands Gödel’s Incompleteness Theorem, 2006): “…it is so important to understand that Gödel’s theorem only is true with respect to formal systems—which is the exact opposite of the analogous UTM” (Antone 2006, webpage 2). And galatomic says in the same discussion thread that “saying that it (is) only true for formal systems is more significant… We only know the world through ‘formal’ categories of understanding… If the world as it is in itself has no incompleteness problem, which I am sure is true, it does not mean much, because that is not the world of time and space that we experience. So it is more significant that formal systems are incomplete than that the inexperienceable ‘World in Itself’ has no such problem.—galatomic” (Antone 2006, webpage 2). Nevertheless, galatomic certainly, though unwittingly, succeeds in highlighting the possibility of experiencing ‘completeness’. Second, even if any formal system, including the system of Advaita of Śaṅkara, is to be subsumed or interpreted under Gödel’s theorem or Tarski’s semantic unprovability theses, the ultimate appeal would lie with the point of human involvement in realizing completeness, since any formal system is always ‘incomplete’ by its very nature as ‘objectual’ and fails to comprehend the ‘subject’ within its fold.

20.