131.
Alongside the emergence of fieldwork studies for design there has been a discussion of how best such studies can inform system development. Concerns have been expressed as to whether their most appropriate contribution is a list of requirements or a set of design recommendations. This article explores awareness, a recurrent issue that has emerged from fieldwork studies in Computer-Supported Cooperative Work, and, with respect to a particular system development project, discusses some of the implications for the development and deployment of one particular kind of technology, image recognition systems, in particular organizational settings. In the setting in question, surveillance centers or operations rooms, staff utilize a range of practices to maintain awareness. Rather than extending field studies so that they can better assist design, we consider how workplace studies can contribute to a respecification of key concepts, such as awareness, that are critical to an understanding of how technologies are used and deployed in everyday environments.
132.
In this study, we first show that while both the perceived usefulness and the perceived enjoyment of enterprise social networks affect employees' intentions for continuous participation, the network's utilitarian value significantly outweighs its hedonic value. Second, we show that the utilitarian value is constituted by the network's digital infrastructure characteristics: versatility, adaptability, interconnectedness and invisibility-in-use. The study is set within a software engineering company and is based on quantitative survey research, applying partial least squares structural equation modeling.
133.
The evolution of the web has outpaced itself: A growing wealth of information and increasingly sophisticated interfaces necessitate automated processing, yet existing automation and data extraction technologies have been overwhelmed by this very growth. To address this trend, we identify four key requirements for web data extraction, automation, and (focused) web crawling: (1) interact with sophisticated web application interfaces, (2) precisely capture the relevant data to be extracted, (3) scale with the number of visited pages, and (4) readily embed into existing web technologies. We introduce OXPath as an extension of XPath for interacting with web applications and extracting data thus revealed—matching all the above requirements. OXPath's page-at-a-time evaluation guarantees memory use independent of the number of visited pages, yet remains polynomial in time. We experimentally validate the theoretical complexity and demonstrate that OXPath's resource consumption is dominated by page rendering in the underlying browser. With an extensive study of sublanguages and properties of OXPath, we pinpoint the effect of specific features on evaluation performance. Our experiments show that OXPath outperforms existing commercial and academic data extraction tools by a wide margin.
134.
Query optimizers rely on statistical models that succinctly describe the underlying data. Models are used to derive cardinality estimates for intermediate relations, which in turn guide the optimizer to choose the best query execution plan. The quality of the resulting plan is highly dependent on the accuracy of the statistical model that represents the data. It is well known that small errors in the model estimates propagate exponentially through joins, and may result in the choice of a highly sub-optimal query execution plan. Most commercial query optimizers make the attribute value independence assumption: all attributes are assumed to be statistically independent. This reduces the statistical model of the data to a collection of one-dimensional synopses (typically in the form of histograms), and it permits the optimizer to estimate the selectivity of a predicate conjunction as the product of the selectivities of the constituent predicates. However, this independence assumption is more often than not wrong, and is considered to be the most common cause of sub-optimal query execution plans chosen by modern query optimizers. We take a step towards a principled and practical approach to performing cardinality estimation without making the independence assumption. By carefully using concepts from the field of graphical models, we are able to factor the joint probability distribution over all the attributes in the database into small, usually two-dimensional distributions, without a significant loss in estimation accuracy. We show how to efficiently construct such a graphical model from the database using only two-way join queries, and we show how to perform selectivity estimation in a highly efficient manner. We integrate our algorithms into the PostgreSQL DBMS. Experimental results indicate that estimation errors can be greatly reduced, leading to orders of magnitude more efficient query execution plans in many cases. Optimization time is kept in the range of tens of milliseconds, making this a practical approach for industrial-strength query optimizers.
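The attribute value independence assumption can be illustrated with a minimal sketch (hypothetical data; this is neither the paper's estimator nor PostgreSQL's actual implementation): with correlated attributes, the product of one-dimensional selectivities misestimates the true joint selectivity, which a two-dimensional distribution recovers.

```python
from collections import Counter

# Hypothetical table: (city, country) pairs; the two attributes are correlated.
rows = [("Paris", "FR")] * 40 + [("Lyon", "FR")] * 10 + [("Berlin", "DE")] * 50
n = len(rows)

# One-dimensional synopses, as kept under the independence assumption.
city_freq = Counter(city for city, _ in rows)
country_freq = Counter(country for _, country in rows)

# Predicate: city = 'Paris' AND country = 'FR'
sel_city = city_freq["Paris"] / n              # 0.4
sel_country = country_freq["FR"] / n           # 0.5
independent_estimate = sel_city * sel_country  # 0.2 -- underestimates

# A two-dimensional distribution captures the correlation between the
# attributes and yields the true selectivity of the conjunction.
joint_freq = Counter(rows)
true_selectivity = joint_freq[("Paris", "FR")] / n  # 0.4

print(independent_estimate, true_selectivity)
```

Because every Paris row is also an FR row, the independence-based product halves the true selectivity; through a chain of joins such errors compound multiplicatively.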
135.
Support for generic programming was added to the Java language in 2004, representing perhaps the most significant change to one of the most widely used programming languages today. Researchers and language designers anticipated this addition would relieve many long-standing problems plaguing developers, but surprisingly, no one has yet measured how generics have been adopted and used in practice. In this paper, we report on the first empirical investigation into how Java generics have been integrated into open source software by automatically mining the history of 40 popular open source Java programs, traversing more than 650 million lines of code in the process. We evaluate five hypotheses and research questions about how Java developers use generics. For example, our results suggest that generics sometimes reduce the number of type casts and that generics are usually adopted by a single champion in a project, rather than all committers. We also offer insights into why some features may be adopted sooner and other features may be held back.
136.
When implementing a propagator for a constraint, one must decide about variants: When implementing min, should one also implement max? Should one implement linear constraints both with unit and non-unit coefficients? Constraint variants are ubiquitous: implementing them requires considerable (if not prohibitive) effort and decreases maintainability, but will deliver better performance than resorting to constraint decomposition. This paper shows how to use views to derive propagator variants, combining the efficiency of dedicated propagator implementations with the simplicity and effortlessness of decomposition. A model for views and derived propagators is introduced. Derived propagators are proved to be perfect in that they inherit essential properties such as correctness and domain and bounds consistency. Techniques for systematically deriving propagators such as transformation, generalization, specialization, and type conversion are developed. The paper introduces an implementation architecture for views that is independent of the underlying constraint programming system. A detailed evaluation of views implemented in Gecode shows that derived propagators are efficient and that views often incur no overhead. Views have proven essential for implementing Gecode, substantially reducing the amount of code that needs to be written and maintained.
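The idea of deriving propagator variants through views can be sketched as follows. This is an illustrative Python reconstruction under simplified assumptions (interval domains, bounds propagation only), not Gecode's C++ architecture: a minus view lets a min propagator act as a max propagator, since max(x, y) = -min(-x, -y).

```python
class IntVar:
    """Integer variable with an interval domain [lo, hi]."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def tighten(self, lo, hi):
        self.lo, self.hi = max(self.lo, lo), min(self.hi, hi)

class MinusView:
    """View presenting a variable x as -x, without copying it."""
    def __init__(self, x):
        self.x = x
    @property
    def lo(self): return -self.x.hi
    @property
    def hi(self): return -self.x.lo
    def tighten(self, lo, hi):
        self.x.tighten(-hi, -lo)

def propagate_min(z, x, y):
    """Bounds propagation for z = min(x, y)."""
    z.tighten(min(x.lo, y.lo), min(x.hi, y.hi))
    x.tighten(z.lo, x.hi)  # both operands must be at least min's lower bound
    y.tighten(z.lo, y.hi)

def propagate_max(z, x, y):
    """Derived propagator: max(x, y) = -min(-x, -y)."""
    propagate_min(MinusView(z), MinusView(x), MinusView(y))

x, y, z = IntVar(0, 10), IntVar(3, 7), IntVar(-5, 5)
propagate_max(z, x, y)
print((z.lo, z.hi), (x.lo, x.hi), (y.lo, y.hi))
```

Only the min propagator is written and maintained; the max variant comes for free through the views, mirroring the derivation techniques the paper calls transformation.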
137.
Libraries as we know them today can be described by the term Library 1.0. This denotes the model in which resources are kept on shelves or on a computer behind a login. These resources can be taken from a shelf, checked out by the library staff, taken home for a certain length of time and absorbed, and then returned to the library for someone else to use. Library 1.0 is a one-directional service that takes people to the information they require. Library 2.0, or L2 as it is now more commonly known, aims instead to take the information to the people by bringing the library service onto the Internet and getting users more involved by encouraging feedback and participation. This paper presents an overview of Library 2.0 and introduces Web 2.0 concepts.
138.
The Network Mobility (NEMO) protocol is needed to support the world-wide mobility of aircraft mobile networks across different access networks in the future IPv6-based aeronautical telecommunications network (ATN). NEMO, however, suffers from the constraint that all traffic has to be routed via the home agent. The existing correspondent router (CR) protocol solves this triangular routing problem and makes it possible to route packets on a direct path between the mobile network and the ground-based correspondent nodes. We identify security deficiencies of this protocol that make it unsuitable for use within the ATN. We therefore propose a new route optimization procedure based on the CR protocol that provides a higher level of security. We evaluate our new protocol in three ways. We first conduct a simulation-based handover performance study using an implementation of a realistic aeronautical access technology. We then investigate the mobility signaling overhead. Finally, we specify a threat model applicable to the aeronautical environment and use it to perform a security analysis of both the old and the new protocol. It is shown that our protocol is not only more secure but also provides better handover latency, smaller overhead in the aeronautical scenario, and a higher level of resilience compared to the original CR protocol.
139.
We study a motion planning problem where items have to be transported from the top room of a tower to the bottom of the tower, while simultaneously other items have to be transported in the opposite direction. Item sets are moved in two baskets hanging on a rope and pulley. To guarantee stability of the system, the weight difference between the contents of the two baskets must always stay below a given threshold. We prove that it is $\varPi_{2}^{p}$ -complete to decide whether some given initial situation of the underlying discrete system can lead to a given goal situation. Furthermore we identify several polynomially solvable special cases of this reachability problem, and we also settle the computational complexity of a number of related questions.
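Under simplifying assumptions (each item identified by its integer weight, one basket load per move, load weights differing by at most the threshold), the reachability question can be explored for small instances by brute-force breadth-first search. This toy sketch only illustrates the discrete state space; it is not the paper's hardness construction, and it enumerates states exponentially in the number of items.

```python
from itertools import chain, combinations
from collections import deque

def subsets(items):
    """All subsets of an iterable, as tuples."""
    items = list(items)
    return chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))

def reachable(top, bottom, threshold):
    """Breadth-first search over (items-at-top, items-at-bottom) states.
    One move loads a subset of the top items into the upper basket and a
    subset of the bottom items into the lower basket; the loads may only
    travel if their weight difference stays within the threshold, after
    which the two loads swap floors. Returns True if all initial top items
    can end up at the bottom and vice versa."""
    start = (frozenset(top), frozenset(bottom))
    goal = (frozenset(bottom), frozenset(top))
    seen, queue = {start}, deque([start])
    while queue:
        t, b = queue.popleft()
        if (t, b) == goal:
            return True
        for down in subsets(t):
            for up in subsets(b):
                if abs(sum(down) - sum(up)) > threshold:
                    continue  # baskets would be too unbalanced
                nxt = (t - set(down) | set(up), b - set(up) | set(down))
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return False

print(reachable({3, 5}, {4, 6}, threshold=2))
```

With threshold 2, the instance above is solvable in two moves (send 3 down against 4, then 5 down against 6); lowering the threshold to 0 makes the exchange impossible.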
140.
This study sought to assess sediment contamination by trace metals (cadmium, chromium, cobalt, copper, manganese, nickel, lead and zinc), to localize contaminated sites and to identify the environmental risk for aquatic organisms in wadis of the Kebir Rhumel basin in northeast Algeria. Water and surficial sediments (0-5 cm) were sampled in winter, spring, summer and autumn from 37 sites along permanent wadis of the Kebir Rhumel basin. Sediment trace metal contents were measured by flame atomic absorption spectroscopy. Median trace metal concentrations in sediments followed a decreasing order: Mn > Zn > Pb > Cr > Cu > Ni > Co > Cd. Extreme values (dry weight) of the trace metals are as follows: 0.6-3.4 μg/g for Cd, 10-216 μg/g for Cr, 9-446 μg/g for Cu, 3-20 μg/g for Co, 105-576 μg/g for Mn, 10-46 μg/g for Ni, 11-167 μg/g for Pb, and 38-641 μg/g for Zn. Relative to natural background concentrations worldwide, all sediments collected were considered contaminated by one or more elements. Comparing measured concentrations with American sediment quality guidelines (Threshold Effect Level, TEL, and Probable Effect Level, PEL) showed that biological effects could be occasionally observed at the cadmium, chromium, lead and nickel levels but frequently observed at the copper and zinc levels. Sediment quality was shown to be excellent for cobalt and manganese but medium to bad for cadmium, chromium, copper, lead, nickel and zinc regardless of site.