7,879 results found (search time: 0 ms)
71.
In this paper we present two tests which can decide whether a given point x0 ∈ ℝ^N is locally efficient or not with respect to a given finite set of real-valued continuously differentiable functions defined on ℝ^N. Examples indicate that the tests may fail on a nowhere dense set.
72.
Two-Photon Lithography, thanks to its very high sub-diffraction resolution, has become the lithographic technique par excellence in applications requiring small feature sizes and complex 3D patterning. Despite this, the fabrication times required for extended structures remain much longer than those of competing techniques (UV mask lithography, nanoimprinting, etc.), and this low throughput prevents its wide adoption in industrial applications. Over the years, different solutions have been proposed to increase it, although their usage is difficult to generalize and may be limited depending on the specific application. A promising strategy to further increase the throughput of Two-Photon Lithography, opening a concrete window for its adoption in industry, lies in its combination with holography: in this way it is possible to generate dozens of foci from a single laser beam, thus parallelizing the fabrication of periodic structures, or to engineer the intensity distribution on the writing plane in a complex way, obtaining 3D microstructures with a single exposure. Here, the fundamental concepts behind high-speed Two-Photon Lithography and its combination with holography are discussed, and the recent literature exploiting these techniques is reviewed and contextualized by topic.
73.
Bacterial trapping using nanonets is a ubiquitous immune defense mechanism against infectious microbes. These nanonets can entrap microbial cells, effectively arresting their dissemination and rendering them more vulnerable to locally secreted microbicides. Inspired by this evolutionarily conserved anti-infective strategy, a series of 15- to 16-residue-long synthetic β-hairpin peptides is herein constructed with the ability to self-assemble into nanonets in the presence of bacteria, enabling spatiotemporal control over microbial killing. Using the amyloid-specific K114 assay and confocal microscopy, the membrane components lipoteichoic acid and lipopolysaccharide are shown to play a major role in determining the amyloid-nucleating capacity triggered by Gram-positive and Gram-negative bacteria, respectively. These nanonets displayed both trapping and killing functionalities, hence offering a direct improvement over the trap-only biomimetics in the literature. By substituting a single turn residue of the non-amyloidogenic BTT1 peptide, the nanonet-forming BTT1-3A analog is produced with comparable antimicrobial potency. With the same sequence-manipulation approach, the BTT2-4A analog modified from the BTT2 peptide showed improved antimicrobial potency against colistin-resistant clinical isolates. The peptide nanonets also demonstrated robust stability against proteolytic degradation, along with promising in vivo efficacy and biosafety profiles. Overall, these bacteria-responsive peptide nanonets are promising clinical anti-infective alternatives for circumventing antibiotic resistance.
74.
Software Quality Journal - The number of electronic control units (ECUs) installed in vehicles is growing rapidly. Manufacturers must improve software quality and reduce cost by proposing...
75.
The investigation of possible failures in composite materials is of great importance, and the Tsai-Wu criterion is an effective criterion for analyzing such flaws in anisotropic materials and determining whether the material will suffer structural failure under a given load. In this study, an optimization procedure is proposed to minimize the maximum Tsai-Wu failure index of laminated composite tubes subject to axial loading. Artificial neural networks and genetic algorithms are chosen as optimization tools; the maximum Tsai-Wu value is used as the objective function, and the fiber orientations are the constraints in the optimization process. The results of this study show that the developed algorithm converges faster; the results yielded by the two tools are compared and discussed, and the optimal results are compared against the usual initial design. The design approach is recommended for structures where composites are the key load-carrying members, such as orthopedic prostheses.
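As background to the criterion named above, the plane-stress Tsai-Wu failure index can be sketched as follows. This is a minimal illustration, not the paper's implementation; the F12 interaction term uses a common textbook default, and all strength values in the usage example are hypothetical:

```python
def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Plane-stress Tsai-Wu failure index for a unidirectional lamina.

    s1, s2 : in-plane normal stresses along and across the fibers
    t12    : in-plane shear stress
    Xt, Xc : longitudinal tensile/compressive strengths (both positive)
    Yt, Yc : transverse tensile/compressive strengths (both positive)
    S      : in-plane shear strength
    Failure is predicted when the index reaches 1.
    """
    F1 = 1.0 / Xt - 1.0 / Xc
    F2 = 1.0 / Yt - 1.0 / Yc
    F11 = 1.0 / (Xt * Xc)
    F22 = 1.0 / (Yt * Yc)
    F66 = 1.0 / S**2
    F12 = -0.5 * (F11 * F22) ** 0.5  # common default interaction coefficient
    return (F1 * s1 + F2 * s2
            + F11 * s1**2 + F22 * s2**2 + F66 * t12**2
            + 2.0 * F12 * s1 * s2)
```

By construction the index is 0 at zero stress and reaches exactly 1 when the lamina is loaded to Xt in pure longitudinal tension, which is a quick sanity check for any implementation.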
76.
This paper describes an inverse procedure to determine the constitutive constants and the friction conditions in machining processes using Finite Element (FE) simulations. In general, FE modeling of machining processes is an effective tool for analyzing the machinability of materials under different cutting conditions; however, reliable rheological and friction models are the basis of a correct numerical investigation. The presented inverse procedure was based on numerical results obtained with a commercial FE code and was formulated as an optimization problem in which the objective function to be minimized is the experimental/numerical error, solved by a routine developed in a commercial optimization package. To verify the accuracy and robustness of the methodology, it was applied to orthogonal machining of a Super Duplex Stainless Steel (SDSS) and an Austenitic Stainless Steel (AUSS). The work focused on identifying the Johnson-Cook (JC) coefficients (A, B, C, n, and m) and on calibrating a Coulomb friction model for the SAF2507 SDSS and an AISI 316 Based AUSS Alloy (AISI 316 ASBA). The identification phases used experimental force and temperature data collected in two dedicated experimental campaigns in which orthogonal cutting tests were carried out under different cutting-parameter conditions.
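For reference, the Johnson-Cook flow-stress model whose coefficients (A, B, C, n, m) are identified above has the standard form σ = (A + Bεⁿ)(1 + C ln(ε̇/ε̇₀))(1 − T*ᵐ), with homologous temperature T* = (T − T_room)/(T_melt − T_room). A minimal sketch follows; the reference strain rate and temperatures are illustrative placeholders, not values from the paper:

```python
import math

def jc_flow_stress(eps, eps_rate, T, A, B, C, n, m,
                   eps_rate0=1.0, T_room=293.0, T_melt=1700.0):
    """Johnson-Cook flow stress (MPa if A and B are in MPa).

    eps      : equivalent plastic strain
    eps_rate : equivalent plastic strain rate (1/s)
    T        : current temperature (K)
    The three factors model strain hardening, strain-rate
    sensitivity, and thermal softening, respectively.
    """
    T_star = (T - T_room) / (T_melt - T_room)
    return ((A + B * eps**n)
            * (1.0 + C * math.log(eps_rate / eps_rate0))
            * (1.0 - T_star**m))
```

At zero plastic strain, reference strain rate, and room temperature the model reduces to the yield stress A, which makes the coefficients easy to sanity-check during an inverse identification.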
77.
The purpose of this paper is to establish a basis for a criticality analysis of complex in-service engineering assets, considered here as a prerequisite, the first required step in reviewing current maintenance programs. Review is understood as a reality check: testing whether current maintenance activities are well aligned with actual business objectives and needs. This paper describes an efficient and rational working process and a model, based on risk analysis and cost-benefit principles, that yields a hierarchy of assets ranked according to their importance for the business to meet specific goals. Starting from a multicriteria analysis, the proposed model converts the relevant criteria impacting equipment criticality into a single score representing the criticality level. Although detailed implementation of techniques like Root Cause Failure Analysis and Reliability Centered Maintenance will be recommended for further optimization of maintenance activities, the reasons why criticality analysis deserves the attention of engineers and of maintenance and reliability managers are precisely explained here. A case study is presented to help the reader understand the process and operationalize the model. Copyright © 2015 John Wiley & Sons, Ltd.
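The multicriteria-to-single-score aggregation described above can be illustrated with a simple weighted-sum sketch. The criteria names, weights, and ratings below are entirely hypothetical, and the paper's actual model may apply different normalization and risk/cost-benefit logic:

```python
def criticality_score(ratings, weights):
    """Collapse per-criterion ratings (each normalized to 0..1)
    into a single criticality score via a weighted sum."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * ratings[c] for c in weights)

def rank_assets(assets, weights):
    """Return asset names ordered from most to least critical."""
    return sorted(assets,
                  key=lambda a: criticality_score(assets[a], weights),
                  reverse=True)

# Hypothetical criteria and assets, for illustration only.
weights = {"safety": 0.5, "production_loss": 0.3, "repair_cost": 0.2}
assets = {
    "pump":  {"safety": 0.9, "production_loss": 0.8, "repair_cost": 0.4},
    "valve": {"safety": 0.2, "production_loss": 0.3, "repair_cost": 0.5},
}
```

The resulting ranking is what feeds the hierarchy of assets: high-scoring equipment is the natural target for deeper techniques such as RCFA or RCM.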
78.
We review quantum information processing with cold neutral particles, that is, atoms or polar molecules. First, we analyze which degrees of freedom of these particles are best suited for storing quantum information, and then we discuss both single- and two-qubit gate implementations. We focus mainly on collisional quantum gates, which are best suited for atom-chip-like devices, as well as on gate proposals conceived for optical lattices. Additionally, we analyze schemes for cold atoms confined in optical cavities and hybrid approaches to entanglement generation, and we show how optimal control theory can be a powerful tool both to speed up gate operations and to achieve the high fidelities required for fault-tolerant quantum computation.
79.
Diabetes therapy management in AAL environments, such as the homes of elderly people and diabetes patients, is a very difficult task, since many factors affect a patient's blood sugar levels. Illness, treatments, physical and psychological stress, physical activity, drugs, intravenous fluids, and changes in the meal plan cause unpredictable and potentially dangerous fluctuations in blood sugar levels. At present, dosage decisions are based on insulin infusion protocol boards provided by physicians to patients. These boards do not consider very influential factors such as the glycemic index of the diet; consequently, patients must estimate the dosage themselves, leading to dose errors that culminate in hyperglycemia and hypoglycemia episodes. Correct insulin infusion calculation therefore needs to be supported by the next generation of personal-care devices. For this reason, a personal device has been developed to assist with insulin therapy dosage calculation while taking more factors into account. The proposed solution is based on the Internet of Things in order to, on the one hand, support a patient-profile management architecture based on personal RFID cards and, on the other hand, provide global connectivity between the developed 6LoWPAN-based patient personal device, a nurses'/physicians' desktop application for managing personal health cards, a glycemic index information system, and a patient web portal. This solution has been evaluated by a multidisciplinary group of patients, physicians, and nurses.
80.
Orthogonal variant moments features in image analysis
Moments are statistical measures used to obtain relevant information about an object under study (e.g., signals, images, or waveforms), for instance to describe the shape of an object to be recognized by a pattern recognition system. Invariant moments (e.g., the Hu invariant set) are a special kind of these statistical measures designed to remain constant after transformations such as object rotation, scaling, translation, or image illumination changes, in order to improve, for example, the reliability of a pattern recognition system. The classical moment-invariants methodology is based on determining a set of transformations (or perturbations) under which the system must remain unaltered. Although very well established, classical moment-invariants theory has mainly been used for processing single static images (i.e., snapshots); the use of image moments to analyze image sequences or video from a dynamic point of view has not been sufficiently explored and is a subject of much current interest. In this paper, we propose the use of variant moments as an alternative to the classical approach. This approach presents clear differences from the classical moment-invariants approach, which in specific domains give it important advantages. The difference between the classical invariant approach and the proposed variant approach is mainly (but not solely) conceptual: invariants are sensitive to any image change or perturbation for which they are not invariant, so any unexpected perturbation will affect the measurements (i.e., they are subject to uncertainty); on the contrary, a variant moment is designed to be sensitive to a specific perturbation, i.e., to measure a transformation rather than be invariant to it, so if that specific perturbation occurs it will be measured, while any unexpected disturbance will not affect the objective of the measurement, thus confronting uncertainty. Furthermore, given that the proposed variant moments are orthogonal (i.e., uncorrelated), the total inherent uncertainty can be considerably reduced. The presented approach has been applied to interesting open problems in computer vision such as shape analysis, image segmentation, tracking object deformations, and object motion tracking, obtaining encouraging results and proving the effectiveness of the proposed approach.
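To make the notion of image moments concrete: the raw moment m_pq = Σ_x Σ_y x^p y^q I(x, y) and its centralized counterpart (translation-invariant, the building block of the classical Hu invariants) can be sketched as follows. This is generic background on moments, not the paper's variant-moment construction:

```python
def raw_moment(img, p, q):
    """Raw image moment m_pq = sum over pixels of x^p * y^q * intensity.
    img is a 2D list of grayscale intensities, indexed img[y][x]."""
    return sum(x**p * y**q * v
               for y, row in enumerate(img)
               for x, v in enumerate(row))

def centroid(img):
    """Intensity centroid (x_bar, y_bar) = (m10/m00, m01/m00)."""
    m00 = raw_moment(img, 0, 0)
    return raw_moment(img, 1, 0) / m00, raw_moment(img, 0, 1) / m00

def central_moment(img, p, q):
    """Central moment mu_pq, computed about the centroid so that the
    value is unchanged when the object is translated in the image."""
    xc, yc = centroid(img)
    return sum((x - xc)**p * (y - yc)**q * v
               for y, row in enumerate(img)
               for x, v in enumerate(row))
```

For a uniform 2x2 patch of intensity 1, m00 is the total mass 4, the first-order central moments vanish by construction, and mu20 measures the horizontal spread about the centroid.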