41.
Recently, periodic pattern mining from time series data has been studied extensively. However, an interesting type of periodic pattern, called partial periodic (PP) correlation in this paper, has not been investigated. An example of PP correlation is that power consumption is high either on Monday or Tuesday but not on both days. In general, a PP correlation is a set of offsets within a particular period such that the data at these offsets are correlated with a certain user-desired strength. In the above example, the period is a week (7 days), and each day of the week is an offset of the period. PP correlations can provide insightful knowledge about the time series and can be used for predicting future values. This paper introduces an algorithm to mine time series for PP correlations based on the principal component analysis (PCA) method. Specifically, given a period, the algorithm maps the time series data to data points in a multidimensional space, where the dimensions correspond to the offsets within the period. A PP correlation is then equivalent to correlation of data when projected to a subset of the dimensions. The algorithm discovers, with one sequential scan of data, all those PP correlations (called minimum PP correlations) that are not unions of some other PP correlations. Experiments using both real and synthetic data sets show that the PCA-based algorithm is highly efficient and effective in finding the minimum PP correlations. Zhen He is a lecturer in the Department of Computer Science at La Trobe University. His main research areas are database systems optimization, time series mining, wireless sensor networks, and XML information retrieval. Prior to joining La Trobe University, he worked as a postdoctoral research associate at the University of Vermont. He holds Bachelor's, Honours, and Ph.D. degrees in Computer Science from the Australian National University.
X. Sean Wang received his Ph.D. degree in Computer Science from the University of Southern California in 1992. He is currently the Dorothean Chair Professor in Computer Science at the University of Vermont. He has published widely in the general area of databases and information security, and was a recipient of the US National Science Foundation Research Initiation and CAREER awards. His research interests include database systems, information security, data mining, and sensor data processing. Byung Suk Lee is associate professor of Computer Science at the University of Vermont. His main research areas are database systems, data modeling, and information retrieval. He has held positions in industry and academia: Gold Star Electric, Bell Communications Research, Datacom Global Communications, the University of St. Thomas, and currently the University of Vermont. He was also a visiting professor at Dartmouth College and a participating guest at Lawrence Livermore National Laboratory. He has served international conferences as a program committee member, a publicity chair, and a special session organizer, and has also served on a US federal funding proposal review panel. He holds a BS degree from Seoul National University, an MS from the Korea Advanced Institute of Science and Technology, and a Ph.D. from Stanford University. Alan C. H. Ling is an assistant professor in the Department of Computer Science at the University of Vermont. His research interests include combinatorial design theory, coding theory, sequence designs, and applications of design theory.
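The offset-mapping step described in the abstract can be pictured with a small sketch: fold the series by its period so that each offset becomes a column, then test correlation between offset columns. This is a minimal illustration of the idea under assumptions of ours (pairwise Pearson correlation on synthetic data), not the paper's PCA-based algorithm, and all function names are ours.

```python
import random
from itertools import combinations
from math import sqrt

def offset_matrix(series, period):
    """Fold a 1-D series into rows of length `period`; row i is cycle i,
    and column j holds the values at offset j of the period."""
    n = len(series) // period
    return [series[i * period:(i + 1) * period] for i in range(n)]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

def correlated_offset_pairs(series, period, strength=0.9):
    """Report offset pairs whose columns correlate with |r| >= strength."""
    rows = offset_matrix(series, period)
    cols = list(zip(*rows))  # one column per offset within the period
    return [(i, j) for i, j in combinations(range(period), 2)
            if abs(pearson(cols[i], cols[j])) >= strength]

# Synthetic weekly data mimicking the abstract's example: consumption is
# high on Monday XOR Tuesday, and the remaining days are noise.
random.seed(0)
series = []
for week in range(52):
    hi = random.choice([0, 1])  # which of Mon(0)/Tue(1) is high this week
    series.extend([1.0 if d == hi else 0.0 if d <= 1 else random.random()
                   for d in range(7)])

print(correlated_offset_pairs(series, 7))  # → [(0, 1)]
```

Because Monday and Tuesday are exactly complementary in this toy data, offsets 0 and 1 come out perfectly anticorrelated, which is the kind of within-period structure the paper calls a PP correlation.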
42.
Custom software development and maintenance is one of the key expenses associated with developing automated systems for mass customization. This paper presents a method for reducing the risk associated with this expense by developing a flexible environment for determining and executing dynamic workflow paths. Strategies for developing an autonomous agent-based framework and for identifying and creating web services for specific process tasks are presented. The proposed methods are outlined in two different case studies to illustrate the approach for both a generic process with complex workflow paths and a more specific sequential engineering process.
43.
We present an interactive algorithm for continuous collision detection between deformable models. We introduce multiple techniques to improve the culling efficiency and the overall performance of continuous collision detection. First, we present a novel formulation for continuous normal cones and use these normal cones to efficiently cull large regions of the mesh as part of self-collision tests. Second, we introduce the concept of “procedural representative triangles” to remove all redundant elementary tests between nonadjacent triangles. Finally, we exploit the mesh connectivity and introduce the concept of “orphan sets” to eliminate redundant elementary tests between adjacent triangle primitives. In practice, we can reduce the number of elementary tests by two orders of magnitude. These culling techniques have been combined with bounding volume hierarchies and can result in one order of magnitude performance improvement as compared to prior collision detection algorithms for deformable models. We highlight the performance of our algorithm on several benchmarks, including cloth simulations, N-body simulations, and breaking objects.  相似文献   
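The broad-phase culling that the bounding-volume stage performs can be illustrated with a toy example. The sketch below uses plain axis-aligned bounding boxes to discard triangle pairs that cannot possibly collide; it is a generic illustration of culling, not the paper's continuous normal cones, representative triangles, or orphan sets, and all names are ours.

```python
def aabb(tri):
    """Axis-aligned bounding box of a triangle: (min corner, max corner)."""
    xs, ys, zs = zip(*tri)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

def boxes_overlap(a, b):
    """True if two AABBs intersect on every axis; disjoint boxes mean the
    enclosed triangles cannot collide, so the exact test is skipped."""
    (amin, amax), (bmin, bmax) = a, b
    return all(amin[k] <= bmax[k] and bmin[k] <= amax[k] for k in range(3))

def cull_pairs(tris):
    """Return candidate triangle index pairs that survive AABB culling."""
    boxes = [aabb(t) for t in tris]
    return [(i, j) for i in range(len(tris)) for j in range(i + 1, len(tris))
            if boxes_overlap(boxes[i], boxes[j])]

t0 = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
t1 = [(0.5, 0.5, 0.0), (1.5, 0.5, 0.0), (0.5, 1.5, 0.0)]   # overlaps t0's box
t2 = [(10.0, 10.0, 10.0), (11.0, 10.0, 10.0), (10.0, 11.0, 10.0)]  # far away
print(cull_pairs([t0, t1, t2]))  # → [(0, 1)]
```

Only the surviving pairs would be passed on to the expensive elementary tests, which is the role the paper's hierarchy and normal-cone tests play at much larger scale.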
44.
45.
Fluorescence yield near-edge spectroscopy (FYNES) above the carbon K edge and temperature programmed reaction spectroscopy (TPRS) have been used as the methods for characterizing the reactivity and structure of adsorbed aniline and aniline derived species on the Ni(100) and Ni(111) surfaces over an extended range of temperatures and hydrogen pressures. The Ni(100) surface shows appreciably higher hydrogenolysis activity towards adsorbed aniline than the Ni(111) surface. Hydrogenolysis of aniline on the Ni(100) surface results in benzene formation at 470 K, both in reactive hydrogen atmospheres and in vacuum. External hydrogen significantly enhances the hydrogenolysis activity for aniline on the Ni(100) surface. Based on spectroscopic evidence, we believe that the dominant aniline hydrogenolysis reaction is preceded by partial hydrogenation of the aromatic ring of aniline in the presence of 0.001 Torr of external hydrogen on the (100) surface. In contrast, very little adsorbed aniline undergoes hydrogen induced C-N bond activation on the Ni(111) surface for hydrogen pressures as high as 10⁻⁷ Torr below 500 K. Thermal dehydrogenation of aniline dominates with increasing temperature on the Ni(111) surface, resulting in the formation of a previously observed polymeric layer which is stable up to 820 K. Aniline is adsorbed at a smaller angle relative to the Ni(111) surface than the Ni(100) surface at temperatures below the hydrogenolysis temperature. We believe that the proximity and strong π-interaction between the aromatic ring of the aniline and the surface is one major factor which controls the competition between dehydrogenation and hydrogen addition. In this case the result is a substantial enhancement of aniline dehydrogenation relative to hydrogenation on the Ni(111) surface.
46.
Aggregation induced emission (AIE) has attracted considerable interest for the development of fluorescence probes. However, controlling the bioconjugation and cellular labeling of AIE dots is a challenging problem. Here, this study reports a general approach for preparing small and bioconjugated AIE dots for specific labeling of cellular targets. The strategy is based on the synthesis of oxetane-substituted AIEgens to generate compact and ultrastable AIE dots via photo-crosslinking. A small amount of polymer enriched with oxetane groups is cocondensed with most of the AIEgens to functionalize the nanodot surface for subsequent streptavidin bioconjugation. Due to their small sizes, good stability, and surface functionalization, the cell-surface markers and subcellular structures are specifically labeled by the AIE dot bioconjugates. Remarkably, stimulated emission depletion imaging with AIE dots is achieved for the first time, and the spatial resolution is significantly enhanced to ≈95 nm. This study provides a general approach for small functional molecules for preparing small sized and ultrastable nanodots.
47.
48.
Congenital facial anomalies, such as microtia (malformation of the external ear), lead to significant psychosocial effects starting from early childhood. Three-dimensional (3D) scanning and advanced manufacturing are being investigated as a cheaper and more personalised method of fabricating reconstructive treatments for patients compared to traditional approaches. To date, most case studies have used expensive 3D scanners, yet there is potential for low-cost devices to provide comparable results. This study aimed to investigate these different approaches. Both ears of 16 adult participants were scanned with three devices: the Artec Spider (Artec Group), the Intel® RealSense (Intel), and the Apple iPhone® 7 (Apple Inc.) combined with photogrammetry using 90, 60 and 30 photographs. The scanning time, processing time, accuracy, completeness, resolution and repeatability of each technique were assessed using the Artec Spider as a reference scanner. Our results show that the iPhone had the longest processing time; however, this decreased nine-fold when the number of photos was reduced from 90 to 30. There was no significant difference in the accuracy, completeness or repeatability of the iPhone scans with 90 photographs (1.4 ± 0.6 mm, 79.9%, 1.0 ± 0.1 mm), 60 photographs (1.2 ± 0.2 mm, 79.3%, 0.9 ± 0.2 mm) or 30 photographs (1.2 ± 0.3 mm, 74.3%, 1.0 ± 0.2 mm). The Intel RealSense performed significantly worse on each parameter (1.8 ± 0.3 mm, 46.6%, 1.4 ± 0.5 mm). Additionally, the RealSense had significantly lower resolution, with not enough detail captured for the application. These results demonstrate that the ear can be accurately 3D scanned using iPhone photographs. We would recommend capturing between 30 and 60 photographs of the ear to create a scan that is accurate but without the downfall of a long processing time. Using these methods, we aim to provide a more comfortable setting for the patient and a lower-cost and more personalised ear prosthesis.
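As a rough illustration of how scan accuracy against a reference scanner can be quantified, the sketch below computes a simple cloud-to-cloud error between two point sets. This is a generic nearest-point metric of our own devising, not the evaluation pipeline used in the study, and the brute-force nearest-neighbour search is only suitable for small point counts.

```python
from math import dist  # Euclidean distance, Python 3.8+

def mean_surface_deviation(scan_pts, ref_pts):
    """Crude accuracy metric: for each scanned point, the distance to the
    nearest reference point, averaged over the scan."""
    return sum(min(dist(p, q) for q in ref_pts)
               for p in scan_pts) / len(scan_pts)

# Hypothetical example: a flat 3x3 patch of reference points, and a "scan"
# of the same patch offset by 0.1 mm along x.
ref = [(float(x), float(y), 0.0) for x in range(3) for y in range(3)]
scan = [(x + 0.1, y, z) for x, y, z in ref]
print(mean_surface_deviation(scan, ref))  # ≈ 0.1
```

A real comparison would first register (align) the scan to the reference mesh and measure point-to-surface rather than point-to-point distances, but the averaged-deviation idea is the same.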
49.
A major feature of the emerging geo-social networks is the ability to notify a user when any of his friends (also called buddies) happens to be geographically in proximity. This proximity service is usually offered by the network itself or by a third party service provider (SP) using location data acquired from the users. This paper provides a rigorous theoretical and experimental analysis of the existing solutions for the location privacy problem in proximity services. This is a serious problem for users who do not trust the SP to handle their location data and would only like to release their location information in a generalized form to participating buddies. The paper presents two new protocols providing complete privacy with respect to the SP and controllable privacy with respect to the buddies. The analytical and experimental analysis of the protocols takes into account privacy, service precision, and computation and communication costs, showing the superiority of the new protocols compared to those that have appeared in the literature to date. The proposed protocols have also been tested in a full system implementation of the proximity service.
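A common building block in such schemes is spatial generalization: a user releases only the grid cell they occupy, not their exact position, and proximity is then answered at cell granularity. The sketch below is a generic illustration of that idea under assumptions of ours (a fixed lat/lon grid, toy coordinates); it is not either of the two protocols proposed in the paper.

```python
def generalize(lat, lon, cell_deg):
    """Snap a coordinate to the lower-left corner of its grid cell, so that
    only the cell (not the exact position) is released to buddies."""
    return (lat // cell_deg * cell_deg, lon // cell_deg * cell_deg)

def possibly_near(cell_a, cell_b, cell_deg):
    """Equal or adjacent cells: the two users may be within about one cell
    of each other, so a proximity alert is warranted."""
    return (abs(cell_a[0] - cell_b[0]) <= cell_deg and
            abs(cell_a[1] - cell_b[1]) <= cell_deg)

# Hypothetical buddies a few hundred metres apart (coordinates are made up).
alice = generalize(45.4642, 9.1900, 0.1)
bob = generalize(45.4712, 9.1895, 0.1)
print(possibly_near(alice, bob, 0.1))  # → True
```

The cell size trades privacy against service precision: larger cells reveal less to the SP and the buddies but produce coarser (and more often spurious) proximity alerts, which is exactly the trade-off the paper's analysis quantifies.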
50.
The development of instructional content using Information Technologies is an expensive, time-consuming and complex process that requires new methodologies. It was in this context that the concept of Learning Objects (LOs) was proposed in order to promote reuse. However, this goal is not yet fully attained, and new contributions to increase reuse are still welcome. Moreover, even when content is conveyed in LOs that are easier to reuse, those LOs must be combined and sequenced in order to build more elaborate and complex content. This paper presents a strategy to deal with these problems based on the definition of small LOs, here called Component Objects (COs). These COs are structured and combined according to a conceptual metamodel, which is the basis for the definition of conceptual schemas representing the existing material, including not only content but also practice. This strategy for searching, extracting, and sequencing COs supports a teacher in better controlling the implementation of complex content, reducing errors in the authoring process. The approach includes a specification language and an algorithm for semi-automatically sequencing learning content and practice. Finally, a case study illustrating the proposed approach and some results of using the algorithm are presented.
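The sequencing step can be pictured as ordering a prerequisite graph over COs so that every prerequisite precedes its dependents. The sketch below uses a standard topological sort (Python's `graphlib`) with example CO names of ours; it illustrates the idea only, not the paper's semi-automatic sequencing algorithm or its specification language.

```python
from graphlib import TopologicalSorter  # Python 3.9+

def sequence_components(prereqs):
    """Order component objects (COs) so that every prerequisite CO comes
    before the COs that depend on it; `prereqs` maps a CO to its
    prerequisite set."""
    return list(TopologicalSorter(prereqs).static_order())

# Hypothetical COs: a practice item depends on a concept CO, which in turn
# depends on an introductory CO.
prereqs = {"fraction-practice": {"fraction-concept"},
           "fraction-concept": {"fraction-intro"}}
print(sequence_components(prereqs))
# → ['fraction-intro', 'fraction-concept', 'fraction-practice']
```

A semi-automatic sequencer along the paper's lines would additionally let the teacher constrain or reorder the candidate sequences that such a sort produces, since a prerequisite graph usually admits many valid orderings.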
Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号-23; 京公网安备 11010802026262号)