Paid full text: 4,805 articles
Free: 237 articles
Free (domestic): 9 articles
Subject area: Industrial Technology (5,051 articles)
Articles by year:
2023: 43   2022: 36   2021: 130  2020: 85   2019: 103
2018: 117  2017: 106  2016: 143  2015: 129  2014: 169
2013: 301  2012: 308  2011: 357  2010: 304  2009: 288
2008: 288  2007: 245  2006: 212  2005: 192  2004: 165
2003: 145  2002: 132  2001: 73   2000: 83   1999: 67
1998: 104  1997: 74   1996: 69   1995: 50   1994: 58
1993: 50   1992: 30   1991: 31   1990: 25   1989: 32
1988: 20   1987: 25   1986: 23   1985: 23   1984: 22
1983: 23   1982: 27   1981: 16   1980: 18   1979: 19
1978: 11   1977: 17   1976: 14   1974: 8    1973: 7
5,051 query results found (search time: 31 ms)
91.
Full-field identification methods are increasingly used to identify the constitutive parameters that describe the mechanical behavior of materials. This paper compares the recently introduced one-step method of integrated digital image correlation (IDIC) against the most commonly used two-step method of finite element model updating (FEMU), which relies on a subset-based DIC algorithm. To make the comparison as objective as possible, both methods are implemented in the most equivalent manner and use the same FE model. Various virtual test cases are studied to assess the performance of both methods when subjected to different error sources: (1) systematic errors, (2) poor initial guesses for the constitutive parameters, (3) image noise, (4) constitutive model errors, and (5) experimental errors. Results show that, despite the mathematical similarity of both methods, IDIC produces less erroneous and more reliable results than FEMU, particularly for the more challenging test cases exhibiting small displacements, complex kinematics, specimen misalignment, and image noise. Copyright © 2015 John Wiley & Sons, Ltd.
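To make the two-step idea concrete, here is a minimal FEMU-style sketch (illustrative only, not the paper's implementation): a toy `fe_displacements` function stands in for the FE solver, synthetic noisy data stand in for the DIC measurement, and a least-squares loop updates the constitutive parameter. All names and values are assumptions.

```python
# Minimal FEMU-style sketch (illustrative, not the paper's code): update a
# constitutive parameter until an FE model's displacement field matches a
# DIC-measured field. `fe_displacements` is a toy stand-in for an FE solver.
import numpy as np
from scipy.optimize import least_squares

def fe_displacements(params, coords):
    """Toy 'FE solve': axial displacements of a bar under uniaxial tension,
    parameterized by Young's modulus E (MPa)."""
    (E,) = params
    sigma = 100.0                        # applied stress in MPa (assumed)
    return sigma / E * coords[:, 0]      # u_x = (sigma / E) * x

rng = np.random.default_rng(0)
coords = rng.uniform(0.0, 10.0, size=(50, 2))          # measurement points
u_measured = fe_displacements([70e3], coords)          # synthetic "DIC" field
u_measured += rng.normal(0.0, 1e-5, size=50)           # image noise

# FEMU loop: minimize the measured-vs-simulated displacement residual.
result = least_squares(
    lambda p: fe_displacements(p, coords) - u_measured,
    x0=[50e3],                                         # poor initial guess
)
print(f"identified E = {result.x[0]:.0f} MPa")         # close to 70,000
```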
92.
Selective acoustic activation of defects based on the concept of local defect resonance (LDR) considerably enhances the intensity of defect vibrations and makes it possible to reduce input acoustic power to levels permissible for noncontact nondestructive inspection. Since the LDR frequencies of cm-size defects in composite materials lie in the low-kHz range, resonant noncontact activation shifts into the audible frequency range and can be provided by conventional sonic equipment. In this paper, the feasibility of resonant noncontact inspection is validated for the most "problematic" methodologies of nonlinear, thermosonic, and shearosonic NDE, which usually require elevated acoustic power and, therefore, reliable contact between the specimen and the transducer. In contrast, the noncontact versions developed here employ commercial loudspeakers, which can insonify large areas simultaneously and can be applied for contactless sonic inspection of different materials and components at various scales.
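The low-kHz claim for cm-size defects can be cross-checked with a frequency estimate commonly cited in the LDR literature (not stated in this abstract; treat it as an outside reference) for a circular flat-bottomed hole of radius a and residual layer thickness h in a material with Young's modulus E, density ρ, and Poisson's ratio ν:

```latex
% Commonly cited LDR frequency estimate for a circular flat-bottomed hole
% (radius a, residual thickness h); an approximation from the LDR
% literature, not from this paper.
f_{\mathrm{LDR}} \;\approx\; \frac{1.6\,h}{a^{2}}
  \sqrt{\frac{E}{12\,\rho\,\left(1-\nu^{2}\right)}}
```

With assumed composite values (E ≈ 10 GPa, ρ ≈ 1500 kg m⁻³, ν ≈ 0.3) and a defect with a ≈ 2 cm and h ≈ 1 mm, this gives on the order of 3 kHz, consistent with the audible range mentioned above.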
93.
Diamond-dispersed copper matrix (Cu/D) composite materials with different interfacial configurations are fabricated through powder metallurgy and their thermal performances are evaluated. An innovative solution to chemically bond copper (Cu) to diamond (D) has been investigated and compared to the traditional Cu/D bonding process involving carbide-forming additives such as boron (B) or chromium (Cr). The proposed solution consists of coating diamond reinforcements with Cu particles through a gas–solid nucleation and growth process. The Cu particle coating acts as a chemical bonding agent at the Cu–D interface during hot pressing, leading to cohesive and thermally conductive Cu/D composites with no carbide-forming additives. Investigation of the microstructure of the Cu/D materials through scanning electron microscopy, transmission electron microscopy, and atomic force microscopy analyses is coupled with thermal performance evaluations through thermal diffusivity, dilatometry, and thermal cycling. Cu/D composites fabricated with 40 vol% of Cu-coated diamonds exhibit a thermal conductivity of 475 W m⁻¹ K⁻¹ and a thermal expansion coefficient of 12 × 10⁻⁶ °C⁻¹. These promising thermal performances are superior to those of B-carbide-bonded Cu/D composites and similar to those of Cr-carbide-bonded Cu/D composites fabricated in this study. Moreover, the Cu/D composites fabricated with Cu-coated diamonds exhibit higher thermal cycling resistance than carbide-bonded materials, which are affected by the brittleness of the carbide interphase upon repeated heating and cooling cycles. The as-developed materials can be applied as heat spreaders for thermal management of power electronic packages. The copper–carbon chemical bonding solution proposed in this article may also be of interest in other areas of electronic packaging, such as brazing solders, direct bonded copper substrates, and polymer coatings.
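As a rough cross-check of the reported 475 W m⁻¹ K⁻¹, the classic Maxwell mixing model gives the conductivity a Cu/40 vol% diamond composite would have with a perfect, resistance-free interface; the handbook-style inputs below are assumptions, not values from the paper.

```python
# Maxwell mixing-model estimate for a particle composite, ignoring
# interfacial thermal resistance. Input conductivities are assumed
# nominal handbook values, not numbers from the paper.
def maxwell_k(k_matrix, k_filler, vol_frac):
    """Effective conductivity for spherical inclusions in a matrix."""
    num = k_filler + 2 * k_matrix + 2 * vol_frac * (k_filler - k_matrix)
    den = k_filler + 2 * k_matrix - vol_frac * (k_filler - k_matrix)
    return k_matrix * num / den

k_cu, k_diamond = 400.0, 1500.0     # W/(m K) (assumed)
print(f"{maxwell_k(k_cu, k_diamond, 0.40):.0f} W/(m K)")   # ~684 W/(m K)
```

The measured 475 W m⁻¹ K⁻¹ sitting well below this interface-free estimate is consistent with the paper's emphasis on interfacial bonding as the factor that governs heat transfer across the Cu–D interface.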
94.
In the information age, the storage and accessibility of data are of vital importance, and there are several ways to fulfill this task. Magnetic data storage is a well-established method, and the range of materials used for it is continuously expanding. In this study, the magnetic remanence of thermally sprayed tungsten carbide–cobalt (WCCo) coatings is examined as a function of their thickness. Two magnetic fields differing in value and geometry are imprinted into the coatings and the resulting remanence field is measured. It is found that two effects, in combination, determine the effective value of the magnetic remanence usable for magnetic data storage.
95.
96.
It is shown that several recursive least squares (RLS) type equalization algorithms, such as decision-directed schemes and orthogonalized constant modulus algorithms, possess a common algorithmic structure and are therefore rather straightforwardly implemented on a triangular array (filter structure) for RLS estimation with inverse updating. While the computational complexity of such algorithms is O(N²), where N is the problem size, the throughput rate of the array implementation is O(1), i.e., independent of the problem size. Such a throughput rate cannot be achieved with standard (Gentleman-Kung-type) RLS/QR-updating arrays because of feedback loops in their computational schemes.
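For readers unfamiliar with RLS, the sketch below shows the textbook O(N²)-per-sample update (the conventional inverse-correlation form, not the paper's QR/inverse-updating triangular array); all names and values are illustrative.

```python
# Textbook recursive least squares (RLS) update for an adaptive filter tap
# vector w. This is the conventional O(N^2)-per-sample form, shown only to
# make the complexity claim concrete; the paper maps an equivalent
# computation onto a triangular array with O(1) throughput.
import numpy as np

def rls_step(w, P, x, d, lam=0.99):
    """One update: input vector x, desired sample d, forgetting factor lam;
    P tracks the inverse of the input correlation matrix."""
    k = P @ x / (lam + x @ P @ x)          # gain vector       -- O(N^2)
    e = d - w @ x                          # a-priori error    -- O(N)
    w = w + k * e                          # tap update        -- O(N)
    P = (P - np.outer(k, x @ P)) / lam     # inverse update    -- O(N^2)
    return w, P, e

N = 8
rng = np.random.default_rng(0)
w, P = np.zeros(N), 100.0 * np.eye(N)
w_true = rng.normal(size=N)
for _ in range(500):
    x = rng.normal(size=N)
    w, P, _ = rls_step(w, P, x, d=w_true @ x)   # noiseless training data
print(np.allclose(w, w_true, atol=1e-5))        # True: taps have converged
```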
97.
We demonstrate controlled transport of superparamagnetic beads against the direction of a laminar flow. A permanent magnet assembles 200 nm magnetic particles into bead chains about 200 μm long that align parallel to the magnetic field lines. Due to a magnetic field gradient, the bead chains are attracted toward the wall of a microfluidic channel. Rotating the permanent magnet causes the bead chains to rotate in the opposite direction. Due to friction at the surface, the bead chains roll along the channel wall, even against the flow, up to a maximum counter-flow velocity of 8 mm s⁻¹. Based on this approach, magnetic beads can be accurately manoeuvred within microfluidic channels. This counter-flow motion can be used efficiently in Lab-on-a-Chip systems, e.g. for implementing washing steps in DNA purification.
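A back-of-envelope cross-check under a strong assumption of my own (not the paper's): if a chain of length L rotating at frequency f rolled on the wall without slipping, the reported 8 mm s⁻¹ implies a rotation rate of roughly 13 Hz.

```python
# Order-of-magnitude check under an assumed no-slip rolling model: a chain
# of length L rotating at frequency f would translate at v = pi * L * f
# (rod rolling about its centre, contact radius L/2). The paper's friction-
# driven motion may well slip, so treat this only as a rough sketch.
import math

L = 200e-6          # chain length in m (from the abstract)
v_max = 8e-3        # max counter-flow velocity in m/s (from the abstract)
print(f"implied rotation rate = {v_max / (math.pi * L):.0f} Hz")   # ~13 Hz
```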
98.
A procedure to find the optimal design of a flywheel with a split-type hub is presented. Since cost plays a decisive role in stationary flywheel energy storage applications, a trade-off between energy and cost is required. Applying a scaling technique, the multi-objective design problem is reduced to the maximization of the energy-per-cost ratio as the single objective. Both an analytical and a finite element model were studied. The latter was found to be more than three orders of magnitude more computationally expensive than the analytical model, while the analytical model can only be regarded as a coarse approximation. Multifidelity approaches were examined to reduce the computational expense while retaining the high accuracy and large modeling depth of the finite element model. Using a surrogate-based optimization strategy, the computational cost was reduced to one third of that incurred when using the finite element model alone. A nonlinear interior-point method was employed to find the optimal rim thicknesses and rotational speed. The benefits of the split-type hub architecture were demonstrated.
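To make the energy-per-cost objective concrete, here is a back-of-envelope version for a single thin rim with an assumed hoop-stress speed limit; all material numbers are assumptions, and the model is far coarser than either of the paper's models.

```python
# Back-of-envelope energy-per-cost for a single thin flywheel rim. The
# speed limit uses the thin-ring hoop stress sigma = rho * (omega * r)^2.
# All numbers are assumed for illustration, not taken from the paper.
import numpy as np

rho, sigma_allow, cost_per_kg = 1600.0, 800e6, 40.0   # kg/m^3, Pa, EUR/kg
r_i, r_o, h = 0.25, 0.30, 0.10                        # rim geometry in m

mass = rho * np.pi * (r_o**2 - r_i**2) * h            # rim mass
inertia = 0.5 * mass * (r_i**2 + r_o**2)              # annular-ring inertia
omega_max = np.sqrt(sigma_allow / rho) / r_o          # hoop-stress speed limit
energy = 0.5 * inertia * omega_max**2                 # stored energy in J

print(f"energy per cost = {energy / (mass * cost_per_kg) / 3600:.2f} Wh/EUR")
```

In the paper's setting the rim thicknesses and rotational speed are the design variables, so an optimizer would search over r_i, r_o, and omega subject to the stress model rather than evaluating a single point as done here.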
99.
In recent years, several approaches have been proposed to derive string kernels from probability distributions. For instance, the Fisher kernel uses a generative model M (e.g. a hidden Markov model) and compares two strings according to how they are generated by M. On the other hand, marginalized kernels allow the computation of the joint similarity between two instances by summing conditional probabilities. In this paper, we adapt this approach to edit distance-based conditional distributions and we present a way to learn a new string edit kernel. We show that the practical computation of such a kernel between two strings x and x′ built from an alphabet Σ requires (i) learning edit probabilities in the form of the parameters of a stochastic state machine and (ii) calculating an infinite sum over Σ* by resorting to the intersection of probabilistic automata, as done for rational kernels. We show on a handwritten character recognition task that our new kernel outperforms not only state-of-the-art string kernels and string edit kernels but also the standard edit distance used by a neighborhood-based classifier.
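To show the general shape of an edit-distance-based similarity, here is a deliberately simplified version with fixed unit costs; the paper instead learns the edit probabilities as a stochastic state machine and sums over Σ* via automata intersection. The names below are illustrative only.

```python
# Deliberately simplified edit-distance "kernel": fixed unit-cost
# Levenshtein distance mapped through exp(-gamma * d). The paper instead
# learns edit probabilities (a stochastic state machine) and sums over all
# of Sigma*; note that exp(-gamma * d) over Levenshtein distance is not
# guaranteed to be positive semidefinite -- a known caveat of this shortcut.
import math

def levenshtein(x: str, y: str) -> int:
    prev = list(range(len(y) + 1))
    for i, cx in enumerate(x, start=1):
        curr = [i]
        for j, cy in enumerate(y, start=1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (cx != cy)))   # substitution
        prev = curr
    return prev[-1]

def edit_similarity(x: str, y: str, gamma: float = 0.5) -> float:
    return math.exp(-gamma * levenshtein(x, y))

print(edit_similarity("kernel", "kernels"))    # distance 1 -> ~0.61
print(edit_similarity("kernel", "marginal"))   # larger distance -> smaller
```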
100.
The SHARC framework for data quality in Web archiving
Web archives preserve the history of born-digital content and offer great potential for sociologists, business analysts, and legal experts on intellectual property and compliance issues. Data quality is crucial for these purposes. Ideally, crawlers should gather coherent captures of entire Web sites, but the politeness etiquette and completeness requirement mandate very slow, long-duration crawling while Web sites undergo changes. This paper presents the SHARC framework for assessing the data quality in Web archives and for tuning capturing strategies toward better quality with given resources. We define data quality measures, characterize their properties, and develop a suite of quality-conscious scheduling strategies for archive crawling. Our framework includes single-visit and visit-revisit crawls. Single-visit crawls download every page of a site exactly once, in an order that aims to minimize the "blur" in capturing the site. Visit-revisit strategies revisit pages after their initial downloads to check for intermediate changes; the revisiting order aims to maximize the "coherence" of the site capture (the number of pages that did not change during the capture). The quality notions of blur and coherence are formalized in the paper. Blur is a stochastic notion that reflects the expected number of page changes that a time-travel access to a site capture would accidentally see, instead of the ideal view of an instantaneously captured, "sharp" site. Coherence is a deterministic quality measure that counts the number of unchanged and thus coherently captured pages in a site snapshot. Strategies that aim to either minimize blur or maximize coherence are based on prior knowledge of, or predictions for, the change rates of individual pages. Our framework includes fairly accurate classifiers for change prediction. All strategies are fully implemented in a testbed and shown to be effective by experiments with both synthetically generated sites and a periodic crawl series for different Web sites.
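A toy version of the coherence reasoning, under a Poisson change model that is my assumption for illustration: if page i changes at rate λᵢ and is downloaded t seconds after the reference capture instant, it is still coherent with probability exp(−λᵢ·t), so candidate download orders can be compared by their expected number of coherent pages. The real SHARC strategies and measures are considerably richer.

```python
# Toy expected-coherence comparison under an assumed Poisson change model:
# a page with change rate lam, downloaded t seconds after the reference
# capture instant, is unchanged (coherent) with probability exp(-lam * t).
# Illustrative only; the actual SHARC strategies are considerably richer.
import math

def expected_coherence(rates_in_download_order, slot_seconds=10.0):
    """Expected number of coherent pages if page k in the given order is
    downloaded k * slot_seconds after the reference instant."""
    return sum(math.exp(-lam * i * slot_seconds)
               for i, lam in enumerate(rates_in_download_order))

rates = [0.05, 0.002, 0.5, 0.0001, 0.01]   # per-page change rates, 1/s (toy)
print(f"as-is order:   {expected_coherence(rates):.2f} coherent pages")
print(f"hottest first: {expected_coherence(sorted(rates, reverse=True)):.2f}")
```

Which order wins depends on the rates and the slot spacing, which is exactly why prediction-driven scheduling of the kind SHARC develops matters.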