20 similar documents found; search time: 265 ms
1.
Thomas Kristensen 《Network Security》2004,2004(2):15
A big hole in a big wall. The most interesting vulnerability is certainly the vulnerability in the world's most widely deployed commercial firewall, Check Point Firewall-1, which can be exploited to gain control of the firewall.
2.
Thomas Kristensen 《Network Security》2004,2004(3):19-20
Two serious flaws in popular client firewall software were uncovered in February.
3.
Thomas Kristensen 《Network Security》2004,2004(1):19-20
Zero-day attacks and unpatched flaws
A vulnerability discovered in Internet Explorer at the beginning of December 2003 was an early Xmas present to scammers, who could exploit it to lure unsuspecting users into disclosing credit card information via an attack known as “phishing”. It didn't take long for the Web bandits to try out their present and it has yet to be taken away from them…
4.
5.
6.
Jakob Balle 《Network Security》2004,2004(5):18-20
More and more vulnerabilities appear to be exploited, and more rapidly than they used to be. Over the last two months, we have seen exploitation of the ICQ hole in certain ISS products and of the LSASS and PCT vulnerabilities in Microsoft Windows. All were exploited within a very short time after information about the vulnerabilities was published.
7.
8.
9.
10.
12.
If a technology (or idea) does not achieve mainstream status quickly enough, it dies. Video on demand (interactive TV), the information superhighway (ISDN), and massively parallel supercomputing may be examples. These ideas are okay, but they could die for lack of legs. At present, consumers are simply shunning them, illustrating the power of Information Age mainstreaming. A corollary to this law is that a technology (or idea) thrives, even if it is a bad technology or idea, as long as it quickly achieves mainstream status. Microsoft Windows, Java, C++ and others illustrate the overwhelming power of mainstreaming. It's positive feedback. Simply put, the rich get richer, especially when they hold a monopoly. In the Information Age, the definition of wealth includes domination of standards as well as having cash in the bank. The problem with software is that software companies don't get paid unless they reap a profit within the time limit set by the mainstreaming law. Commercial software companies have to hit the big time, or else.
13.
A. Rezaee Jordehi 《Neural computing & applications》2014,25(6):1329-1335
The big bang–big crunch (BBBC) algorithm is a fairly novel gradient-free optimisation algorithm inspired by theories of the evolution of the universe, namely the big bang and big crunch. The major challenge with BBBC is that it is easily trapped in local optima. In this paper, chaotic strategies are incorporated into BBBC to tackle this challenge. Five chaotic BBBC strategies with three different chaotic map functions are investigated, and the best is selected as the proposed chaotic strategy for BBBC. The results of applying the proposed chaotic BBBC to various unimodal and multimodal benchmark functions show that it yields quality solutions, significantly outperforming conventional BBBC, cuckoo search optimisation and the gravitational search algorithm.
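The abstract above can be illustrated with a minimal sketch of a chaotic BBBC optimiser. This is an assumption-laden illustration, not the paper's actual method: it uses one arbitrary chaotic variant (a logistic map replacing uniform random numbers when scattering candidates around the centre of mass), while the paper compares five strategies and three map functions. All names and parameter values here are invented for the example.

```python
import random

def chaotic_bbbc(objective, dim, bounds, pop_size=30, iters=100, seed=1):
    """Sketch of a chaotic Big Bang-Big Crunch minimiser (illustrative only)."""
    rng = random.Random(seed)
    lo, hi = bounds

    # Chaotic sequence from the logistic map x' = 4x(1 - x),
    # used instead of uniform random offsets in the big-bang phase.
    x = 0.7
    def chaos():
        nonlocal x
        x = 4.0 * x * (1.0 - x)
        return x

    # Initial big bang: random population over the search box.
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=objective)

    for k in range(1, iters + 1):
        # Big crunch: collapse the population to a fitness-weighted
        # centre of mass (better candidates pull harder).
        weights = [1.0 / (1e-12 + objective(p)) for p in pop]
        total = sum(weights)
        centre = [sum(w * p[d] for w, p in zip(weights, pop)) / total
                  for d in range(dim)]
        # Big bang: scatter new candidates around the centre with a
        # radius that shrinks as iterations progress.
        radius = (hi - lo) / k
        pop = [[min(hi, max(lo, centre[d] + radius * (chaos() - 0.5)))
                for d in range(dim)] for _ in range(pop_size)]
        cand = min(pop, key=objective)
        if objective(cand) < objective(best):
            best = cand
    return best

# Toy usage on the sphere function, one of the standard unimodal benchmarks.
sphere = lambda p: sum(v * v for v in p)
sol = chaotic_bbbc(sphere, dim=2, bounds=(-5.0, 5.0))
```

The shrinking scatter radius is what drives convergence; the chaotic sequence is meant to diversify the scatter pattern and reduce the chance of stagnating in a local optimum.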
14.
Wendy Arianne Günther Mohammad H. Rezazade Mehrizi Marleen Huysman Frans Feldberg 《The Journal of Strategic Information Systems》2017,26(3):191-209
Big data has been considered to be a breakthrough technological development over recent years. Notwithstanding, we have as yet limited understanding of how organizations translate its potential into actual social and economic value. We conduct an in-depth systematic review of IS literature on the topic and identify six debates central to how organizations realize value from big data, at different levels of analysis. Based on this review, we identify two socio-technical features of big data that influence value realization: portability and interconnectivity. We argue that, in practice, organizations need to continuously realign work practices, organizational models, and stakeholder interests in order to reap the benefits from big data. We synthesize the findings by means of an integrated model.
15.
Trends in big data analytics
Karthik Kambatla Giorgos Kollias Vipin Kumar Ananth Grama 《Journal of Parallel and Distributed Computing》2014
One of the major applications of future generation parallel and distributed systems is in big-data analytics. Data repositories for such applications currently exceed exabytes and are rapidly increasing in size. Beyond their sheer magnitude, these datasets and associated applications’ considerations pose significant challenges for method and software development. Datasets are often distributed and their size and privacy considerations warrant distributed techniques. Data often resides on platforms with widely varying computational and network capabilities. Considerations of fault-tolerance, security, and access control are critical in many applications (Dean and Ghemawat, 2004; Apache Hadoop). Analysis tasks often have hard deadlines, and data quality is a major concern in yet other applications. For most emerging applications, data-driven models and methods, capable of operating at scale, are as-yet unknown. Even when known methods can be scaled, validation of results is a major issue. Characteristics of hardware platforms and the software stack fundamentally impact data analytics. In this article, we provide an overview of the state-of-the-art and focus on emerging trends to highlight the hardware, software, and application landscape of big-data analytics.
16.
At present, the health care industry is developing vigorously and has become one of the most widely developed industries in the world. Medical centers and service centers in various regions have begun to shift from a treatment-centered model to a health care model; the field-programmable gate array (FPGA) offers great advantages in this respect, and patient-centered nursing is one of the guiding principles. With the vigorous development of machine learning, its range of applications keeps widening, and it is now commonly used to process big data in the medical field. Better managing patient data and realizing patient-centered care requires analyzing large volumes of health data, which traditional management tools cannot support. Advanced big data processing technology and up-to-date tools are therefore needed to meet current medical needs. In this work, signal-processing-based big data evaluation is performed on an FPGA. The proposed system consists of three machine-learning-driven stages: a preprocessing stage that removes noise from the images and discards irrelevant data; a feature-selection stage based on a decision-tree technique; and a classification stage that uses a machine-learning technique to analyze the accuracy of the big data analysis. An FPGA-based implementation of the machine-learning technique is used to achieve better results for the proposed system.
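The three-stage pipeline described above (preprocessing, feature selection, classification) can be sketched in software. This is a minimal stand-in, not the paper's system: the abstract gives no concrete filtering rules, so preprocessing here simply drops records with missing values, feature selection uses a simple class-separation score as a lightweight proxy for the decision-tree technique, and classification uses nearest-centroid. All data, function names, and thresholds are invented for illustration.

```python
import statistics

# Stage 1 - preprocessing: drop records with missing values (a stand-in
# for the noise/irrelevant-data removal the abstract describes).
def preprocess(records):
    return [r for r in records if all(v is not None for v in r[0])]

# Stage 2 - feature selection: rank features by how far apart the two
# class means sit relative to the pooled spread, then keep the top k.
def select_features(records, k):
    xs = [r[0] for r in records]
    ys = [r[1] for r in records]
    scores = []
    for j in range(len(xs[0])):
        col0 = [x[j] for x, y in zip(xs, ys) if y == 0]
        col1 = [x[j] for x, y in zip(xs, ys) if y == 1]
        spread = statistics.pstdev([x[j] for x in xs]) or 1.0
        sep = abs(statistics.mean(col0) - statistics.mean(col1)) / spread
        scores.append((sep, j))
    return [j for _, j in sorted(scores, reverse=True)[:k]]

# Stage 3 - classification: nearest class centroid on the kept features.
def classify(train, test_x, feats):
    cents = {}
    for label in {y for _, y in train}:
        pts = [[x[j] for j in feats] for x, y in train if y == label]
        cents[label] = [statistics.mean(c) for c in zip(*pts)]
    px = [test_x[j] for j in feats]
    return min(cents,
               key=lambda l: sum((a - b) ** 2 for a, b in zip(px, cents[l])))

# Toy health-record features with binary labels; the record containing
# None is the "noisy" one removed in stage 1.
data = [([1.0, 5.0, 0.2], 0), ([1.1, 4.8, 0.3], 0),
        ([3.0, 5.1, 0.2], 1), ([3.2, 4.9, 0.4], 1),
        ([None, 5.0, 0.1], 0)]
clean = preprocess(data)
feats = select_features(clean, k=1)
label = classify(clean, [3.1, 5.0, 0.3], feats)
```

Each stage feeds the next, mirroring the pipeline structure in the abstract; a hardware (FPGA) realisation would implement the same dataflow with fixed-point arithmetic rather than Python objects.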
17.
18.
Roger Clarke 《Information Systems Journal》2016,26(1):77-90
The ‘big data’ literature, academic as well as professional, has a very strong focus on opportunities. Far less attention has been paid to the threats that arise from repurposing data, consolidating data from multiple sources, applying analytical tools to the resulting collections, drawing inferences, and acting on them. On the basis of a review of quality factors in ‘big data’ and ‘big data analytics’, illustrated by means of scenario analysis, this paper draws attention to the moral and legal responsibility of computing researchers and professionals to temper their excitement, and apply reality checks to their promotional activities.
19.
Big data has been an extremely popular term over the past two years, and big data applications have begun to attract attention. This paper introduces the origins of big data, its characteristics, and its applications, and analyses the problems big data poses for information technology as well as its development trends.