21.
Bonded concrete overlays of asphalt pavements (BCOAs) are becoming a common rehabilitation technique for distressed hot mix asphalt (HMA) roadways. The original design procedures were based primarily on data from instrumented pavements and finite element modelling, and were governed by the assumption that the failure mechanism was a function of the overlay thickness. However, field observations have indicated that the actual failure modes are dictated by slab size. The newly developed Bonded Concrete Overlay of Asphalt Mechanistic-Empirical design procedure (BCOA-ME) presented here is valid for overlays between 2.5 and 6.5 in (64–165 mm) thick, and includes five primary enhancements to the Portland Cement Association and Colorado Department of Transportation procedures that have traditionally been used: (1) the failure mode is dictated by the joint spacing; (2) a new structural model for longitudinal cracking in 6-ft × 6-ft (1.8 m × 1.8 m) concrete overlays has been developed to better predict the critical stresses; (3) the stress adjustment factors have been calibrated with performance data; (4) the equivalent temperature gradients used as design input are defined based on the pavement structure and geographical location of the project; and (5) the effect of temperature change on underlying HMA stiffness is considered. Finally, validation studies were completed on the new procedure; comparisons between the revised procedure and actual performance data for five separate projects showed reasonable results. A sensitivity analysis also revealed that the predicted thickness obtained using the revised procedure was sensitive to HMA thickness, the modulus of rupture of the Portland cement concrete, and the level of traffic, as would be expected.
22.
Acetaminophen (paracetamol) is available in a wide range of oral formulations designed to meet the needs of the population across the age spectrum, but for people with impaired swallowing (dysphagia), both solid and liquid medications can be difficult to swallow without modification. The effect of a commercial polysaccharide thickener, designed to be added to fluids to promote safe swallowing by dysphagic patients, on rheology and acetaminophen dissolution was tested using crushed immediate-release tablets in water, effervescent tablets in water, elixir and suspension. The inclusion of the thickener, comprising xanthan gum and maltodextrin, had a considerable impact on dissolution: acetaminophen release from modified medications reached only 12–50% in 30 min, which does not meet the pharmacopeia specification for immediate-release preparations. Flow curves reflect the high zero-shear viscosity and the apparent yield stress of the thickened products. The weak-gel nature, in combination with high G′ values relative to G″ (viscoelasticity) and high apparent yield stress, impedes drug release. The restriction on drug release from these formulations is not influenced by the theoretical state of the drug (dissolved or dispersed), and the approach typically used in clinical practice (mixing crushed tablets into pre-prepared thickened fluid) cannot be improved by altering the order of incorporation or mixing method.
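Flow curves with an apparent yield stress and shear-thinning behaviour, as described above, are commonly summarised by fitting a Herschel–Bulkley model, σ = σ₀ + K·γ̇ⁿ. The sketch below is not from the study: it fits that model to synthetic flow-curve data, and every parameter value is an assumption chosen for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def herschel_bulkley(shear_rate, sigma0, K, n):
    # sigma0: apparent yield stress (Pa); K: consistency index (Pa s^n);
    # n < 1 indicates shear-thinning behaviour
    return sigma0 + K * shear_rate ** n

# Synthetic flow curve for a yield-stress, shear-thinning fluid (illustrative values)
rate = np.logspace(-2, 2, 40)          # shear rates, 1/s
rng = np.random.default_rng(1)
stress = herschel_bulkley(rate, 12.0, 4.0, 0.45) * (1 + 0.01 * rng.standard_normal(rate.size))

# Recover the model parameters from the noisy curve
(sigma0, K, n), _ = curve_fit(herschel_bulkley, rate, stress, p0=[1.0, 1.0, 0.5])
```

The fitted σ₀ quantifies the apparent yield stress that, together with the high zero-shear viscosity, restricts drug release from the thickened matrix.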
24.
On modern architectures, the performance of 32-bit operations is often at least twice as fast as the performance of 64-bit operations. By using a combination of 32-bit and 64-bit floating point arithmetic, the performance of many dense and sparse linear algebra algorithms can be significantly enhanced while maintaining the 64-bit accuracy of the resulting solution. The approach presented here can apply not only to conventional processors but also to other technologies such as Field Programmable Gate Arrays (FPGA), Graphics Processing Units (GPU), and the STI Cell BE processor. Results on modern processor architectures and the STI Cell BE are presented.

Program summary

Program title: ITER-REF
Catalogue identifier: AECO_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECO_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 7211
No. of bytes in distributed program, including test data, etc.: 41 862
Distribution format: tar.gz
Programming language: FORTRAN 77
Computer: desktop, server
Operating system: Unix/Linux
RAM: 512 Mbytes
Classification: 4.8
External routines: BLAS (optional)
Nature of problem: On modern architectures, the performance of 32-bit operations is often at least twice as fast as the performance of 64-bit operations. By using a combination of 32-bit and 64-bit floating point arithmetic, the performance of many dense and sparse linear algebra algorithms can be significantly enhanced while maintaining the 64-bit accuracy of the resulting solution.
Solution method: Mixed precision algorithms stem from the observation that, in many cases, a single precision solution of a problem can be refined to the point where double precision accuracy is achieved. A common approach to the solution of linear systems, either dense or sparse, is to perform the LU factorization of the coefficient matrix using Gaussian elimination. First, the coefficient matrix A is factored into the product of a lower triangular matrix L and an upper triangular matrix U. Partial row pivoting is in general used to improve numerical stability, resulting in a factorization PA=LU, where P is a permutation matrix. The solution for the system is achieved by first solving Ly=Pb (forward substitution) and then solving Ux=y (backward substitution). Due to round-off errors, the computed solution, x, carries a numerical error magnified by the condition number of the coefficient matrix A. In order to improve the computed solution, an iterative process can be applied, which produces a correction to the computed solution at each iteration, yielding the method commonly known as the iterative refinement algorithm. Provided that the system is not too ill-conditioned, the algorithm produces a solution correct to the working precision.
Running time: seconds/minutes
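The solution method described above (single-precision LU factorization followed by double-precision iterative refinement) can be sketched compactly. The following is a minimal illustration in Python with NumPy/SciPy rather than the distributed FORTRAN 77 code; the function name, tolerance, and iteration cap are assumptions for illustration:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def mixed_precision_solve(A, b, tol=1e-12, max_iter=30):
    """Solve Ax = b: factor in single precision, refine in double precision."""
    # LU factorization with partial pivoting in fast 32-bit arithmetic
    lu, piv = lu_factor(A.astype(np.float32))
    # Initial solution from the single-precision factors, promoted to double
    x = lu_solve((lu, piv), b.astype(np.float32)).astype(np.float64)
    for _ in range(max_iter):
        r = b - A @ x  # residual computed in double precision
        if np.linalg.norm(r, np.inf) <= tol * np.linalg.norm(A, np.inf) * np.linalg.norm(x, np.inf):
            break
        # Correction step reuses the cheap single-precision factorization
        x += lu_solve((lu, piv), r.astype(np.float32)).astype(np.float64)
    return x
```

Each refinement step costs only a residual computation and a pair of triangular solves, so nearly all of the work runs at 32-bit speed while the converged solution carries 64-bit accuracy.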
25.
Motorcycle protective clothing can be uncomfortably hot during summer, and this experiment was designed to evaluate the physiological significance of that burden. Twelve males participated in four 90-min trials (cycling at 30 W) across three environments (25, 30 and 35 °C, all at 40% relative humidity). Clothing was modified between full and minimal injury protection. Both ensembles were tested at 25 °C, with only the more protective ensemble investigated at 30 and 35 °C. At 35 °C, auditory canal temperature rose at 0.02 °C min−1 (SD 0.005), deviating from all other trials (p < 0.05). The thresholds for moderate (>38.5 °C) and profound hyperthermia (>40.0 °C) were predicted to occur within 105 min (SD 20.6) and 180 min (SD 33.0), respectively. Profound hyperthermia might eventuate in ~10 h at 30 °C, but should not occur at 25 °C. These outcomes demonstrate a need to enhance the heat dissipation capabilities of motorcycle clothing designed for summer use in hot climates, but without compromising impact protection.

Practitioner’s Summary:

Motorcycle protective clothing can be uncomfortably hot during summer. This experiment was designed to evaluate the physiological significance of this burden across climatic states. In the heat, moderate (>38.5 °C) and profound hyperthermia (>40.0 °C) were predicted to occur within 105 and 180 min, respectively.
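Predicted times to hyperthermia thresholds of this kind amount to extrapolating the measured heating rate. A minimal linear sketch follows; the 37.0 °C starting temperature is an assumption, not a value reported in the abstract, so the resulting times differ from the study's trial-specific predictions:

```python
def minutes_to_threshold(current_temp_c, rate_c_per_min, threshold_c):
    """Linearly extrapolate core temperature to a hyperthermia threshold."""
    if rate_c_per_min <= 0:
        return float("inf")  # no warming trend: the threshold is never reached
    return (threshold_c - current_temp_c) / rate_c_per_min

# At the 0.02 degC/min rate observed at 35 degC, from an assumed 37.0 degC start:
t_moderate = minutes_to_threshold(37.0, 0.02, 38.5)  # time to moderate hyperthermia
t_profound = minutes_to_threshold(37.0, 0.02, 40.0)  # time to profound hyperthermia
```

The study's longer reported times (105 and 180 min) reflect the actual starting temperatures and within-trial temperature trajectories rather than this simplified straight-line assumption.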

26.
In early or preparatory design stages, an architect or designer sketches out rough ideas, not only about the object or structure being considered, but also about its relation to its spatial context. This is an iterative process, where the sketches are the primary means not only for testing and refining ideas, but also for communicating among a design team and to clients. Hence, sketching is the preferred medium for artists and designers during the early stages of design, albeit with a major drawback: sketches are 2D, and effects such as view perturbations or object movement are not supported, thereby inhibiting the design process. We present an interactive system that allows for the creation of a 3D abstraction of a designed space, built primarily by sketching in 2D within the context of an anchoring design or photograph. The system is progressive in the sense that the interpretations are refined as the user continues sketching. As a key technical enabler, we reformulate the sketch interpretation process as a selection optimization over a set of context-generated canvas planes in order to retrieve a regular arrangement of planes. We demonstrate our system (available at http://geometry.cs.ucl.ac.uk/projects/2016/smartcanvas/) with a wide range of sketches and design studies.
27.
We define the notion of a controlled hybrid language, which allows information sharing and interaction between a controlled natural language (specified by a context-free grammar) and a controlled visual language (specified by a Symbol-Relation grammar). We present the controlled hybrid language INAUT, used to represent nautical charts of the French Naval Hydrographic and Oceanographic Service (SHOM) and their companion texts (Instructions nautiques).
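A controlled natural language specified by a context-free grammar admits a straightforward recogniser. The toy grammar and vocabulary below are illustrative assumptions only, not the INAUT grammar, and are restricted to a fragment simple enough to match with patterns:

```python
import re

# Toy controlled-language fragment (NOT the INAUT grammar):
#   SENTENCE -> FEATURE "is" STATE
#             | FEATURE "lies" DIRECTION "of" FEATURE
FEATURES = {"the lighthouse", "the buoy", "the channel"}
STATES = {"lit", "unlit", "submerged"}
DIRECTIONS = {"north", "south", "east", "west"}

def accepts(sentence):
    """Return True iff the sentence belongs to the toy controlled language."""
    s = sentence.strip().rstrip(".").lower()
    m = re.fullmatch(r"(.+?) is (\w+)", s)
    if m:
        return m.group(1) in FEATURES and m.group(2) in STATES
    m = re.fullmatch(r"(.+?) lies (\w+) of (.+)", s)
    return bool(m) and m.group(1) in FEATURES and m.group(2) in DIRECTIONS and m.group(3) in FEATURES
```

In a hybrid language, sentences accepted by the textual grammar would be linked to chart symbols governed by the Symbol-Relation grammar; that visual half is beyond this sketch.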
29.
Statistical tests are often performed to discover which experimental variables are reacting to specific treatments. Time-series statistical models usually require the researcher to make assumptions about the distribution of measured responses that may not hold. Randomization tests can be applied to data in order to generate null distributions non-parametrically. However, large numbers of randomizations are required for the precise p-values needed to control false discovery rates. When testing tens of thousands of variables (genes, chemical compounds, or otherwise), significant q-value cutoffs can be extremely small (on the order of 10−5 to 10−8). This requires high-precision p-values, which in turn require large numbers of randomizations. The NVIDIA® Compute Unified Device Architecture® (CUDA®) platform for General-Purpose computing on the Graphics Processing Unit (GPGPU) was used to implement an application that performs high-precision randomization tests via Monte Carlo sampling, for quickly screening custom test statistics in experiments with large numbers of variables, such as microarrays, Next-Generation sequencing read counts, chromatographic signals, or other abundance measurements. The software has been shown to achieve more than a 12-fold speedup on a Graphics Processing Unit (GPU) compared to a powerful Central Processing Unit (CPU). The main limitation is concurrent random access of shared memory on the GPU. The software is available from the authors.
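The Monte Carlo randomization-test core that such a GPGPU application accelerates can be sketched on the CPU in a few lines. The difference-of-means statistic, the add-one correction, and all names below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def randomization_pvalue(group_a, group_b, n_perm=10_000, seed=None):
    """Monte Carlo two-sample randomization test on the difference of means."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([group_a, group_b])   # a copy; inputs stay intact
    n_a = len(group_a)
    observed = abs(np.mean(group_a) - np.mean(group_b))
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                       # random relabelling of samples
        stat = abs(pooled[:n_a].mean() - pooled[n_a:].mean())
        if stat >= observed:
            hits += 1
    # Add-one correction keeps p > 0, which matters when q-value cutoffs
    # are tiny and a reported p of exactly zero would be misleading
    return (hits + 1) / (n_perm + 1)
```

Resolving p-values near 10−8 requires on the order of 10⁸–10⁹ permutations per variable, which is precisely the embarrassingly parallel workload that maps well onto a GPU.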
30.
Outsourcing continues to capture the attention of researchers as more companies move to outsourcing models as part of their business practice. Two areas frequently researched and reported in the literature are the reasons why a company decides to outsource, and outsourcing success factors. This paper describes an in-depth, longitudinal case study that explores both the reasons why the company decided to outsource and factors that impact on success. The paper describes how Alpha, a very large Australian communications company, approached outsourcing and how its approach matured over a period of 9 years. The paper concludes that although a number of reasons are proposed for a company's decision to outsource, lowering costs was the predominant driver in this case. We also describe other factors identified as important for outsourcing success such as how contracts are implemented, the type of outsourcing partner arrangement, and outsourcing vendor capabilities.