101.
In a mobile environment, querying a database at a stationary server from a mobile client is expensive due to the limited bandwidth of a wireless channel and the instability of the wireless network. We address this problem by maintaining a materialized view in a mobile client's local storage. Such a materialized view can be considered a data warehouse: it contains the results of common queries in which the mobile client is interested. In this paper, we address the view update problem of maintaining consistency between a materialized view at a mobile client and the database server. The content of a materialized view can become inconsistent with the database server when the server's content changes and/or when the client's location changes. Existing view update mechanisms are 'push-based': the server is responsible for notifying all clients whose views might be affected by changes in the database or by client mobility. This is not appropriate in a mobile environment due to frequent wireless channel disconnections. Furthermore, it is not easy for a server to keep track of client movements in order to update each client's location-dependent view. We propose a 'pull-based' approach that allows a materialized view to be updated at a client incrementally, requiring the client to request changes to its view from the server. We demonstrate the feasibility of our approach with experimental results. Received 27 January 1999 / Revised 26 November 1999 / Accepted 17 April 2000
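The pull-based scheme described above can be sketched as follows. All names are illustrative, not the paper's API; the actual protocol also handles location-dependent views and disconnections, which this minimal version omits. The key idea is a versioned change log on the server so the client can pull only the delta it has not yet applied.

```python
# Minimal sketch of pull-based incremental view maintenance
# (hypothetical classes; the paper's protocol is richer).

class Server:
    """Keeps a versioned change log so clients can pull deltas."""
    def __init__(self):
        self.version = 0
        self.log = []          # list of (version, key, value) changes
        self.data = {}

    def update(self, key, value):
        self.version += 1
        self.data[key] = value
        self.log.append((self.version, key, value))

    def changes_since(self, version):
        """Return only the changes a client has not yet seen."""
        return [(v, k, val) for (v, k, val) in self.log if v > version]


class MobileClient:
    """Maintains a materialized view and pulls deltas on demand."""
    def __init__(self, server):
        self.server = server
        self.view = {}
        self.version = 0       # last server version applied locally

    def refresh(self):
        # Client-initiated ("pull"): ask for changes, apply incrementally.
        for v, k, val in self.server.changes_since(self.version):
            self.view[k] = val
            self.version = v
```

After `refresh()`, the client's view matches the server without retransmitting unchanged rows, which is the bandwidth saving the abstract argues for.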
102.
We are interested in information management for decision support applications, especially those that monitor distributed, heterogeneous databases to assess time-critical decisions. Users of such applications can easily be overwhelmed with data that may change rapidly, may conflict, and may be redundant. Developers are faced with a dilemma: either filter out most information and risk excluding critical items, or gather possibly irrelevant or redundant information, and overwhelm the decision maker. This paper describes a solution to this dilemma called decision-centric information monitoring (DCIM). First, we observe that decision support systems should monitor only information that can potentially change some decision. We present an architecture for DCIM that meets the requirements implied by this observation. We describe techniques for identifying the highest value information to monitor and techniques for monitoring that information despite autonomy, distribution, and heterogeneity of data sources. Finally, we present lessons learned from building LOOKOUT, which is to our knowledge the first implementation of a top-to-bottom system performing decision-centric information monitoring.
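The central observation, that only information capable of changing some decision is worth monitoring, can be illustrated with a toy filter. This is a hypothetical simplification, not the LOOKOUT architecture: a quantity is monitored only when its plausible range straddles a decision threshold, so that a change could flip the decision.

```python
# Toy illustration of decision-centric filtering (hypothetical model,
# not the LOOKOUT system): monitor a quantity only if its plausible
# range could flip the decision it feeds into.

def worth_monitoring(current, uncertainty, threshold):
    """True when the value's uncertainty interval straddles the
    decision threshold, i.e. new data could change the decision."""
    return current - uncertainty < threshold < current + uncertainty
```

A value far from its threshold is filtered out even if it changes rapidly, which is exactly the redundancy-versus-completeness trade the abstract describes.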
103.
Replica Placement Strategies in Data Grid
Replication is a technique used in Data Grid environments that helps to reduce access latency and network bandwidth utilization. Replication also increases data availability, thereby enhancing system reliability. This research addresses the problem of replication in a Data Grid environment by investigating a set of highly decentralized dynamic replica placement algorithms. The replica placement algorithms are based on heuristics that consider both network latency and user requests to select the best candidate sites at which to place replicas. Due to the dynamic nature of the Grid, the sites currently holding replicas may not be the best sites from which to fetch replicas in subsequent periods. Therefore, a replica maintenance algorithm is proposed to relocate replicas to different sites if the performance metric degrades significantly. The study of our replica placement algorithms is carried out using a model of the EU Data Grid Testbed 1 [Bell et al. Comput. Appl., 17(4), 2003] sites and their associated network geometry. We validate our replica placement algorithms with total file transfer times, the number of local file accesses, and the number of remote file accesses.
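A heuristic that weighs both factors the abstract names, user demand and network latency, might look like the sketch below. The scoring rule and the relocation threshold are illustrative assumptions, not the paper's actual algorithms.

```python
# Sketch of a demand- and latency-aware placement heuristic
# (illustrative scoring; the paper's algorithms are more elaborate).

def best_replica_site(candidates):
    """candidates: {site: (request_count, latency_ms)}.
    Favor sites serving many requests over low-latency links."""
    return max(candidates, key=lambda s: candidates[s][0] / candidates[s][1])

def should_relocate(current_score, best_score, degradation=0.5):
    """Replica maintenance: relocate when the current site's score
    has degraded below a fraction of the best candidate's score."""
    return current_score < degradation * best_score
```

The maintenance check mirrors the paper's observation that a good placement now may not remain good, so replicas are moved only when the metric degrades significantly rather than on every fluctuation.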
104.
We have developed a unique device, a dynamic diamond anvil cell (dDAC), which repetitively applies a time-dependent load/pressure profile to a sample. This capability allows studies of the kinetics of phase transitions and metastable phases at compression (strain) rates of up to 500 GPa/s (approximately 0.16 s^-1 for a metal). Our approach adapts electromechanical piezoelectric actuators to a conventional diamond anvil cell design, which enables precise specification and control of a time-dependent applied load/pressure. Existing DAC instrumentation and experimental techniques are easily adapted to the dDAC to measure the properties of a sample under the varying load/pressure conditions. This capability addresses the sparsely studied regime of dynamic phenomena between static research (diamond anvil cells and large volume presses) and dynamic shock-driven experiments (gas guns, explosives, and laser shock). We present an overview of a variety of experimental measurements that can be made with this device.
105.
Although recent years have seen significant advances in the spatial resolution possible in the transmission electron microscope (TEM), the temporal resolution of most microscopes is limited to video rate at best. This lack of temporal resolution means that our understanding of dynamic processes in materials is extremely limited. High temporal resolution in the TEM can be achieved, however, by replacing the normal thermionic or field emission source with a photoemission source. In this case the temporal resolution is limited only by the ability to create a short pulse of photoexcited electrons in the source, and this can be as short as a few femtoseconds. The operation of the photoemission source and the control of the subsequent pulse of electrons (containing as many as 5 x 10^7 electrons) create significant challenges for a standard microscope column that is designed to operate with a single electron in the column at any one time. In this paper, the generation and control of electron pulses in the TEM to obtain a temporal resolution <10^-6 s will be described, and the effect of the pulse duration and current density on the spatial resolution of the instrument will be examined. The potential of these levels of temporal and spatial resolution for the study of dynamic materials processes will also be discussed.
106.
Although developments in software agents have led to useful applications in the automation of routine tasks such as electronic mail filtering, there is a scarcity of research that empirically evaluates the performance of a software agent against that of the human reasoner whose problem-solving capabilities the agent embodies. In the context of a game of chance, namely Yahtzee©, we identified strategies deployed by expert human reasoners and developed a decision tree for agent development. This paper describes the computer implementation of the Yahtzee game as well as the software agent. It also presents a comparison of the performance of humans versus the automated agent. Results indicate that, in this context, the software agent embodies human expertise with a high level of fidelity.
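One branch of such a decision tree, a rule an expert player commonly applies early in a turn, can be sketched as follows. This single rule is illustrative only; the paper's tree encodes many more expert strategies and their ordering.

```python
# One illustrative rule from a Yahtzee keep/reroll decision tree
# (hypothetical; the paper's expert tree is far more detailed).
from collections import Counter

def keep_most_frequent(dice):
    """Keep all dice showing the most frequent face; reroll the rest.
    This chases n-of-a-kind scores, a common expert opening move."""
    face, _ = Counter(dice).most_common(1)[0]
    keep = [d for d in dice if d == face]
    reroll = [d for d in dice if d != face]
    return keep, reroll
```

Evaluating an agent built from such rules against human play is then a matter of comparing final scores over many games, as the paper does.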
107.
A Classified Grid Scheduling Algorithm under Cost and Time Constraints
In a Grid environment, where resources are widely distributed, heterogeneous, dynamic, and spread across multiple administrative domains, scheduling a set of independent tasks with multiple QoS requirements, such as cost and time, is a very important problem. To address the cost and execution-time requirements of Grid tasks, we propose an optimized, classification-based Grid scheduling algorithm that builds on the Grid economy model and classifies tasks according to their actual execution cost and budgeted cost. Simulation experiments on realistic Grid task scheduling show that the algorithm satisfies the needs of different users in a Grid environment.
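The classification step at the core of this approach, splitting tasks by whether their estimated execution cost fits their budget, can be sketched as follows. The function names and the two-way split are illustrative assumptions; the paper's algorithm layers scheduling decisions on top of this classification.

```python
# Sketch of budget-based task classification (illustrative): tasks whose
# estimated cost exceeds their budget would be steered to cheaper
# (typically slower) resources by the scheduler.

def classify_tasks(tasks):
    """tasks: list of (name, estimated_cost, budget).
    Returns (within_budget, over_budget) lists of task names."""
    within = [n for n, cost, budget in tasks if cost <= budget]
    over = [n for n, cost, budget in tasks if cost > budget]
    return within, over
```

Under a Grid economy model, the within-budget class can then be optimized for completion time while the over-budget class is optimized for cost, matching each user's stated QoS priority.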
108.
Pipeline architectures provide a versatile and efficient mechanism for constructing visualizations, and they have been implemented in numerous libraries and applications over the past two decades. In addition to allowing developers and users to freely combine algorithms, visualization pipelines have proven to work well when streaming data and scale well on parallel distributed-memory computers. However, current pipeline visualization frameworks have a critical flaw: they are unable to manage time-varying data. As data flows through the pipeline, each algorithm has access to only a single snapshot in time of the data. This prevents the implementation of algorithms that do any temporal processing, such as particle tracing; plotting over time; or interpolation, fitting, or smoothing of time series data. As data acquisition technology improves, as simulation time-integration techniques become more complex, and as simulations save output less frequently and less regularly, the ability to analyze the time-behavior of data becomes more important. This paper describes a modification to the traditional pipeline architecture that allows it to accommodate temporal algorithms. Furthermore, the architecture allows temporal algorithms to be used in conjunction with algorithms expecting a single time snapshot, thus simplifying software design and allowing adoption into existing pipeline frameworks. Our architecture also continues to work well in parallel distributed-memory environments. We demonstrate our architecture by modifying the popular VTK framework and exposing the functionality to the ParaView application. We use this framework to apply time-dependent algorithms on large data with a parallel cluster computer, thereby exercising a functionality that previously did not exist.
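The essence of the modification, letting a filter request a window of time steps instead of receiving a single snapshot, can be sketched with a toy executive. The class and method names are hypothetical and do not reflect the actual VTK API; they only illustrate the request/supply negotiation.

```python
# Toy sketch of a temporal-request pipeline (hypothetical API, not VTK's):
# a filter declares which time steps it needs, and the pipeline
# executive gathers those snapshots before invoking it.

class TemporalSmooth:
    """Averages each time step with its neighbors, so it must request
    a window of steps rather than a single snapshot."""
    def required_steps(self, t, num_steps):
        # Clamp the window at the ends of the time series.
        return [s for s in (t - 1, t, t + 1) if 0 <= s < num_steps]

    def execute(self, snapshots):
        # snapshots: data values for the requested time steps.
        return sum(snapshots) / len(snapshots)

def run_pipeline(source, flt, t):
    """Executive: ask the filter what it needs, then feed it."""
    steps = flt.required_steps(t, len(source))
    return flt.execute([source[s] for s in steps])
```

A snapshot-only filter is just the special case whose `required_steps` returns `[t]`, which is why temporal and non-temporal algorithms can coexist in one pipeline.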
109.
OBJECTIVE: This study examined the way in which the type and preexisting strength of association between an auditory icon and a warning event affects the ease with which the icon/event pairing can be learned and retained. BACKGROUND: To be effective, an auditory warning must be audible, identifiable, interpretable, and heeded. Warnings consisting of familiar environmental sounds, or auditory icons, have potential to facilitate identification and interpretation. The ease with which pairings between auditory icons and warning events can be learned and retained is likely to depend on the type and strength of the preexisting icon/event association. METHOD: Sixty-three participants each learned eight auditory-icon/denotative-referent pairings and attempted to recall them 4 weeks later. Three icon/denotative-referent association types (direct, related, and unrelated) were employed. Participants rated the strength of the association for each pairing on a 7-point scale. RESULTS: The number of errors made while learning pairings was greater for unrelated than for either related or direct associations, whereas the number of errors made while attempting to recall pairings 4 weeks later was greater for unrelated than for related associations and for related than for direct associations. Irrespective of association type, both learning and retention performance remained at very high levels, provided the strength of the association was rated greater than 5. CONCLUSION: This suggests that strong preexisting associations are used to facilitate learning and retention of icon/denotative-referent pairings. APPLICATION: The practical implication of this study is that auditory icons having either direct or strong, indirect associations with warning events should be preferred.
110.
In radiotherapy treatment planning, tumor volumes and anatomical structures are manually contoured for dose calculation, which takes clinicians considerable time. This study examines the use of semi-automated segmentation of CT images. A few high-curvature points are manually drawn on a CT slice, and Fourier interpolation is then used to complete the contour. Next, optical flow, a deformable image registration method, is used to map the original contour onto other slices. This technique has been applied successfully to contour anatomical structures and tumors. The maximum difference between the mapped contours and manually drawn contours was 6 pixels, which is similar in magnitude to the difference one would see between contours manually drawn by different clinicians. The technique fails when the region to contour is topologically different between two slices. In that case, the recommended solution is to manually delineate contours on a sparse subset of slices and then map in both directions to fill in the remaining slices.
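The contour-completion step can be sketched as trigonometric (Fourier) interpolation of a closed curve: treat the hand-picked points as evenly spaced samples of a periodic function in the complex plane, take their DFT, and evaluate the resulting trigonometric polynomial densely. The even-spacing assumption is a simplification; the paper picks points at high-curvature locations, which in practice would need an arc-length style parameterization.

```python
# Sketch of closing a contour by Fourier (trigonometric) interpolation,
# assuming the sample points are evenly spaced along the curve parameter.
import cmath

def fourier_interpolate(points, m):
    """points: complex contour samples (x + iy); returns m points on
    the band-limited closed curve passing through the samples."""
    n = len(points)
    # Discrete Fourier coefficients of the sample sequence.
    coeffs = [sum(points[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                  for j in range(n)) / n
              for k in range(n)]
    out = []
    for i in range(m):
        t = i / m
        z = 0
        for k in range(n):
            # Map frequency index to the symmetric range [-n//2, n//2)
            # so the interpolant uses the lowest possible frequencies.
            freq = k if k < n / 2 else k - n
            z += coeffs[k] * cmath.exp(2j * cmath.pi * freq * t)
        out.append(z)
    return out
```

Because the interpolant is periodic by construction, the contour closes automatically, and the dense output can then be propagated to neighboring slices by the optical-flow mapping the abstract describes.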