1.
The abstract of an academic document is a condensed account of its main content, and the different moves of an abstract carry different kinds of information, so automatic move recognition and extraction have important applications in downstream research on academic abstracts. Yet research on move recognition remains relatively scarce, and the effectiveness of existing algorithms still needs improvement. To address these problems, this paper proposes a move recognition algorithm based on an ERNIE-BiGRU model. Drawing on Chinese syntactic analysis theory, the algorithm first introduces a multi-move structure splitting method based on syntactic dependency relations, which automatically splits the multi-move structure of an abstract into several single-move structures. It then builds a corpus of single-move structures for training and uses the knowledge-enhanced semantic representation pre-training model (ERNIE) to train sentence-level word vectors. Finally, the trained single-move word vectors are fed into a bidirectional gated recurrent unit (BiGRU) for automatic move recognition, with good results. Experiments show that the algorithm is robust and accurate, achieving recognition accuracies of 96.57% on structured abstracts and 93.75% on unstructured abstracts.
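As a rough illustration of the move-recognition task (not the paper's ERNIE-BiGRU model), a minimal keyword-cue baseline might look like this; the move labels and cue phrases are assumptions chosen for illustration:

```python
import re

# Hypothetical cue phrases per rhetorical move; a real system would learn
# these representations rather than hard-code them.
MOVE_CUES = {
    "purpose": ("aim", "propose", "针对", "提出"),
    "method":  ("using", "based on", "algorithm", "model"),
    "result":  ("results show", "accuracy", "achieved", "表明"),
}

def label_move(sentence: str) -> str:
    """Return the first move whose cue phrase appears in the sentence."""
    s = sentence.lower()
    for move, cues in MOVE_CUES.items():
        if any(cue in s for cue in cues):
            return move
    return "background"  # default move when no cue matches

def segment_abstract(abstract: str) -> list:
    """Split an abstract on sentence delimiters and label each move."""
    sentences = [t.strip() for t in re.split(r"[.。]", abstract) if t.strip()]
    return [(label_move(s), s) for s in sentences]
```

A classifier like the paper's would replace `label_move` with a trained model over sentence embeddings; this sketch only shows the input/output shape of the task.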
2.
Most current semantic parsing methods are based on compositional semantics, and the core of such methods is the lexicon. A lexicon is a collection of lexical entries defining the mapping from words in natural-language sentences to predicates in the knowledge-base ontology. Semantic parsing has long suffered from insufficient lexical coverage. To address this problem, building on existing work, this paper proposes a bridging-based lexicon learning method that can automatically introduce and learn new lexical entries during training. To further improve the accuracy of the newly learned entries, we design new word-to-binary-predicate feature templates and use a voting-based method to obtain a core lexicon. Comparative experiments on two public datasets (WebQuestions and Free917) show that the method learns new lexical entries and improves lexical coverage, thereby improving the performance of the semantic parsing system, especially its recall.
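The bridging idea above can be sketched as follows; the lexicon entries, predicate names, and type check are hypothetical and not drawn from the paper:

```python
# A lexicon maps words to KB predicates; bridging fills coverage gaps by
# proposing every predicate whose range type matches the expected answer
# type. All names below are illustrative placeholders.
LEXICON = {"born": "people.person.place_of_birth"}

KB_PREDICATES = {
    "people.person.place_of_birth": "location",
    "people.person.nationality": "country",
    "film.film.directed_by": "person",
}

def lookup(word, expected_type=None):
    """Return the lexicon entry, or bridge to type-compatible predicates."""
    if word in LEXICON:
        return [LEXICON[word]]
    # Bridging: fall back to predicates whose range matches the type;
    # a learned model (or the paper's voting step) would then rank them.
    return [p for p, rng in KB_PREDICATES.items() if rng == expected_type]
```

In the paper, candidates introduced this way are scored with word–binary-predicate features and voted into a core lexicon; this sketch only shows where bridging plugs in.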
3.
Organic devices like organic light emitting diodes (OLEDs) or organic solar cells degrade quickly when exposed to ambient air. Hence, thin films acting as permeation barriers are needed for their protection. Atomic layer deposition (ALD) is known as one of the best technologies for producing barriers with a low defect density under gentle process conditions. ALD barriers are also among the thinnest reported, with a critical thickness – the thickness defining a continuous barrier film – as low as 5–10 nm for ALD-processed Al2O3. In this work, we investigate the barrier performance of Al2O3 films processed by ALD at 80 °C with trimethylaluminum and ozone as precursors. The coverage of defects in such films is investigated on a 5 nm thick Al2O3 film, i.e. below the critical thickness, on calcium using atomic force microscopy (AFM). For this sub-critical thickness regime, we find that all spots giving rise to water ingress in the 20 × 20 μm2 scan range lie on nearly flat surface sites without particles or large substrate features. Hence, below the critical thickness, ALD leaves open or at least weakly covered spots even on feature-free surface sites. The thickness-dependent performance of these barrier films is investigated for thicknesses ranging from 15 to 100 nm, i.e. above the assumed critical film thickness of this system. To measure the barrier performance, electrical calcium corrosion tests are used to measure the water vapor transmission rate (WVTR), electrodeposition is used to decorate and count defects, and dark spot growth on OLEDs is used to confirm the results for real devices. For 15–25 nm barrier thickness, we observe an exponential decrease in defect density with barrier thickness, which explains the likewise exponential decrease in WVTR and OLED degradation rate. Above 25 nm, a further increase in barrier thickness leads to a further exponential decrease in defect density, but only a sub-exponential decrease in WVTR and OLED degradation rate. In conclusion, the performance of the thin Al2O3 permeation barrier is dominated by its defect density, which decreases exponentially with increasing barrier thickness up to at least 25 nm.
4.
仲红艳 《微机发展》2006,16(11):59-61
The decoupling pattern used by lightweight containers is known as "Inversion of Control" or "Dependency Injection": the dependencies between components are determined by the container (the runtime environment) at run time, which considerably reduces the coupling between components. This paper discusses the principle of this decoupling pattern in detail, describes the three main forms of dependency injection, and compares and summarizes them.
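The three injection forms mentioned above can be sketched in Python; the Mailer/ReportService names are illustrative, not from the paper:

```python
class Mailer:
    """A dependency that components need but should not construct themselves."""
    def send(self, msg: str) -> str:
        return f"sent: {msg}"

# 1. Constructor injection: the container passes the dependency when it
#    builds the object.
class ReportService:
    def __init__(self, mailer: Mailer):
        self.mailer = mailer

    def publish(self) -> str:
        return self.mailer.send("report")

# 2. Setter injection: the container calls a setter after construction.
class SetterReportService:
    def set_mailer(self, mailer: Mailer) -> None:
        self.mailer = mailer

# 3. Interface injection: the component implements an injection interface
#    that the container knows how to invoke.
class MailerAware:
    def inject_mailer(self, mailer: Mailer) -> None:
        raise NotImplementedError

class InterfaceReportService(MailerAware):
    def inject_mailer(self, mailer: Mailer) -> None:
        self.mailer = mailer
```

In all three forms the component never constructs its own `Mailer`; the container wires it in, which is the coupling reduction the paper describes.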
5.
Mining process models with non-free-choice constructs
Process mining aims at extracting information from event logs to capture the business process as it is being executed. Process mining is particularly useful in situations where events are recorded but there is no system forcing people to work in a particular way. Consider for example a hospital where the diagnosis and treatment activities are recorded in the hospital information system, but where health-care professionals determine the "careflow." Many process mining approaches have been proposed in recent years. However, in spite of many researchers' persistent efforts, there are still several challenging problems to be solved. In this paper, we focus on mining non-free-choice constructs, i.e., situations where there is a mixture of choice and synchronization. Although most real-life processes exhibit non-free-choice behavior, existing algorithms are unable to adequately deal with such constructs. Using a Petri-net-based representation, we show that there are two kinds of causal dependencies between tasks, i.e., explicit and implicit ones. We propose an algorithm that is able to deal with both kinds of dependencies. The algorithm has been implemented in the ProM framework, and experimental results show that the algorithm indeed significantly improves existing process mining techniques.
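The explicit causal dependency the paper builds on can be illustrated with a directly-follows computation over a toy event log (a standard alpha-algorithm-style relation; the paper's handling of implicit dependencies is not shown):

```python
def direct_succession(log):
    """Collect all directly-follows pairs (a > b) from a list of traces."""
    return {pair for trace in log for pair in zip(trace, trace[1:])}

def causal_pairs(log):
    """a -> b iff a is directly followed by b in some trace, but b is
    never directly followed by a (explicit causal dependency)."""
    follows = direct_succession(log)
    return {(a, b) for (a, b) in follows if (b, a) not in follows}

# Toy log: "check" and "pay" occur in both orders, so neither causes the
# other; "register" causes both.
log = [["register", "check", "pay"],
       ["register", "pay", "check"]]
```

Implicit dependencies are precisely those that never show up as direct succession in the log, which is why relations like the one above are insufficient for non-free-choice constructs.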
6.
Combining static dependency analysis with dynamic adjustment, this paper proposes an execution model for logic programs that exploits not only AND-parallelism but also a degree of OR-parallelism, thereby effectively speeding up the execution of logic programs.
7.
We present a method for recovering from syntax errors encountered during parsing. The method provides a form of minimum-distance repair, has linear time complexity, and is completely automatic. A formal method for evaluating the performance of error recovery methods, based on global minimum-distance error correction, is also presented. The minimum-distance error recovery method achieves a theoretically best performance on 80% of the Pascal programs in the weighted Ripley-Druseikis collection. Comparisons of performance with other error recovery methods are given.
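The minimum-distance criterion can be illustrated with a generic token-level edit distance (a textbook dynamic program, not the paper's parser-integrated method): the fewer token edits a recovery implies between the erroneous input and a valid program, the better the repair.

```python
def token_edit_distance(a, b):
    """Minimum insertions/deletions/substitutions turning token list a into b."""
    m, n = len(a), len(b)
    dist = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dist[i][0] = i          # delete all of a[:i]
    for j in range(n + 1):
        dist[0][j] = j          # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,         # deletion
                             dist[i][j - 1] + 1,         # insertion
                             dist[i - 1][j - 1] + cost)  # substitution
    return dist[m][n]
```

A global minimum-distance correction chooses, among all syntactically valid token sequences, one at minimal edit distance from the input; the paper's contribution is approximating this in linear time during parsing.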
8.
Intensional negative adjectives (alleged, artificial, fake, false, former, and toy) are unusual adjectives that, depending on context, may or may not be restricting functions. A formal theory of their semantics, pragmatics, and context is presented that uniformly accounts for their complex mathematical and computational characteristics and captures some peculiarities of individual adjectives.
Such adjectives are formalized as new concept builders: negation-like functions that operate on the values of intensional properties of the concepts denoted by their arguments and yield new concepts whose intensional properties have values consistent with the negation of the old values. Understanding these new concepts involves the semantics, pragmatics, and context-dependency of natural language. It is argued that intensional negative adjectives can be viewed as a special-purpose, weaker, context-dependent negation in natural language. The theory explains and predicts many inferences licensed by expressions involving such adjectives. Implementation of sample examples demonstrates its computational feasibility. Computation of context-dependent interpretation is discussed.
The theory allows one to enhance a knowledge representation system with similar concept-building, negation-like, context-dependent functions, the availability of which appears to be a distinct characteristic of natural languages.
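A toy sketch of the concept-builder idea, with hypothetical property names, might look like this (it does not capture the paper's formal theory or its context-dependency):

```python
def fake(concept: dict) -> dict:
    """Build the 'fake X' concept from the concept for X: negate the
    functional property while appearance-level properties survive."""
    new = dict(concept)
    new["is_real"] = False   # the negation-like part of the adjective
    new["function"] = None   # a fake gun does not fire
    return new

# Illustrative concept with made-up property names.
gun = {"is_real": True,
       "appearance": "gun-shaped",
       "function": "fires bullets"}

fake_gun = fake(gun)
```

The point of the sketch is that `fake` is not classical set-intersection modification: a fake gun is not in the denotation of "gun", yet inherits some of its intensional properties.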
9.
The processing of images obtained from satellites often involves highly repetitive calculations on very large amounts of data. This processing is extremely time-consuming when the calculations are performed on sequential machines. Parallel computers are well suited to computationally expensive operations such as higher-order interpolations on large data sets. This paper describes work undertaken to develop parallel implementations of a set of resampling procedures on an Alliant VFX/4. Each resampling procedure has been optimised in three stages. First, the algorithm was restructured so that two-dimensional resampling is performed as two one-dimensional resampling operations. Second, each procedure was reprogrammed to exploit the autoparallelisation provided by the FX/Fortran compiler. Third, data-dependency analysis of each procedure was performed to achieve full optimisation; each procedure was restructured where appropriate to circumvent data dependencies that inhibit vectorisation and concurrency. The nature and extent of the code optimisation achieved for each procedure is presented in this paper. The original code for the most computationally expensive procedure, as targeted at a sequential machine, had an execution time of 4900 seconds on the Alliant VFX/4 when compiled with regular compiler optimisation options. Following the algorithmic redesign and reprogramming of stages 1 and 2, the execution time was reduced to 248 seconds. Restructuring of the code following the data-dependency analysis of stage 3, to avoid data dependencies and allow concurrency and vectorisation, further reduced the execution time to 162 seconds. As a consequence of this work, higher-order resampling methods that had not previously been practical are now routinely performed on the Alliant VFX/4 at the University of Dundee.
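The stage-1 restructuring, two 1-D passes in place of one 2-D resampling, can be sketched in pure Python with linear interpolation (the original work used higher-order kernels in Fortran; this is only a shape-of-the-algorithm sketch):

```python
def resample_1d(row, new_len):
    """Linearly interpolate a 1-D sequence to new_len samples."""
    if new_len == 1:
        return [row[0]]
    scale = (len(row) - 1) / (new_len - 1)
    out = []
    for i in range(new_len):
        x = i * scale
        j = min(int(x), len(row) - 2)    # left neighbour index
        frac = x - j                     # fractional offset
        out.append(row[j] * (1 - frac) + row[j + 1] * frac)
    return out

def resample_2d(image, new_h, new_w):
    """Separable 2-D resampling: resample every row, then every column."""
    rows = [resample_1d(r, new_w) for r in image]                  # pass 1
    cols = [resample_1d([r[c] for r in rows], new_h)               # pass 2
            for c in range(new_w)]
    return [[cols[c][r] for c in range(new_w)] for r in range(new_h)]
```

Separability is what makes the parallelisation natural: each row in pass 1, and each column in pass 2, can be processed independently, which is exactly the concurrency the compiler stages then exploit.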
10.
Reviews the book, The chemically dependent: Phases of treatment and recovery edited by Barbara C. Wallace (see record 1992-98403-000). While this book is ambitious, interesting, educational, and useful, it is also disappointing, repetitious, and incomplete. Because it tries to accomplish so much, it may appear to have succeeded too little. This book is organized around, and explicative of, several basic ideas which might have been controversial if not heretical had this book been published ten years ago. Section I, purporting to link specific "phases of recovery" to particular forms and functions of treatment, will certainly be useful for novice clinicians but falls short of its overstated goals and is thereby disappointing. Section II is a collection of moderately redundant chapters describing the etiology and treatment of substance abusers from the viewpoints of psychoanalytic, psychodynamic, ego psychology, and object-relations theorists and therapists. Section III focuses on cognitive-behavioral, self-help, and relapse-prevention treatments. Section IV is quite uneven in quality of writing and applicability of content, and could have benefited from closer editorial scrutiny or selectivity. The final section focuses on special needs of particular subpopulations of substance abusers: African-Americans, prison inmates, HIV/AIDS patients, persons who are homeless, those who have been sexually and physically abused, and others. According to the reviewer this is not the best book on substance abuse treatment, but it does present some clinically useful ideas and it is worth reading.