Full-text access type
Paid full text | 9,554 articles
Free | 599 articles
Free (domestic) | 16 articles
Subject classification
Industrial technology | 10,169 articles
Publication year
2024 | 20 articles
2023 | 110 articles
2022 | 361 articles
2021 | 485 articles
2020 | 302 articles
2019 | 363 articles
2018 | 411 articles
2017 | 365 articles
2016 | 439 articles
2015 | 332 articles
2014 | 466 articles
2013 | 811 articles
2012 | 677 articles
2011 | 785 articles
2010 | 532 articles
2009 | 548 articles
2008 | 495 articles
2007 | 449 articles
2006 | 326 articles
2005 | 279 articles
2004 | 258 articles
2003 | 202 articles
2002 | 193 articles
2001 | 93 articles
2000 | 88 articles
1999 | 78 articles
1998 | 111 articles
1997 | 74 articles
1996 | 62 articles
1995 | 57 articles
1994 | 50 articles
1993 | 42 articles
1992 | 36 articles
1991 | 40 articles
1990 | 30 articles
1989 | 25 articles
1988 | 17 articles
1987 | 19 articles
1986 | 9 articles
1985 | 17 articles
1984 | 23 articles
1983 | 14 articles
1982 | 15 articles
1980 | 11 articles
1979 | 6 articles
1978 | 6 articles
1977 | 10 articles
1976 | 8 articles
1975 | 6 articles
1974 | 5 articles
Sort order: 10,000 results returned (search time: 15 ms)
41.
Montserrat Bóo, Francisco Argüello, Javier D. Bruguera, Emilio L. Zapata. The Journal of VLSI Signal Processing, 1997, 17(1): 57-73
A rate 1/n binary convolutional encoder is a shift-register circuit whose inputs are information bits and whose outputs are blocks of n bits generated as linear combinations of the appropriate shift-register contents. The output of a convolutional encoder can be decoded with the well-known Viterbi algorithm. The communication pattern of the Viterbi algorithm is given by a graph, called a trellis, associated with the state diagram of the corresponding encoder. In this paper we present a methodology that permits the efficient mapping of the Viterbi algorithm onto a column of an arbitrary number of processors. This is done by representing the data flow with mathematical operators that have an immediate hardware projection. A single operator string has been obtained to represent a generic encoder through a study of the data flow of feed-forward and feedback encoders. The formal model developed is employed to partition the computations among an arbitrary number of processors in such a way that the data are recirculated, optimizing the use of the processors and the communications. As a result, we obtain a highly regular and modular architecture suitable for VLSI implementation.
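The encoder/trellis relationship described in this abstract can be made concrete with a small example. The sketch below is not the paper's architecture; the generator polynomials (7 and 5 octal) and all names are illustrative. It implements a rate-1/2 feed-forward convolutional encoder and a hard-decision Viterbi decoder over its 4-state trellis:

```python
# Illustrative sketch: rate-1/2 feed-forward convolutional encoder
# (generators 7 and 5 octal, constraint length 3) and a textbook
# hard-decision Viterbi decoder on the corresponding 4-state trellis.

G = [0b111, 0b101]  # generator polynomials (octal 7, 5) -- an assumption
K = 3               # constraint length -> 2**(K-1) = 4 trellis states

def encode(bits):
    """Shift-register encoder: each input bit emits n = len(G) output bits."""
    state = 0
    out = []
    for b in bits:
        reg = (b << (K - 1)) | state          # input bit joins the register
        out.extend((bin(reg & g).count("1") & 1) for g in G)
        state = reg >> 1                      # shift the register forward
    return out

def viterbi(received):
    """Minimum-Hamming-distance path search over the encoder's trellis."""
    n_states = 1 << (K - 1)
    INF = float("inf")
    metric = [0.0] + [INF] * (n_states - 1)   # encoder starts in state 0
    paths = [[] for _ in range(n_states)]
    for t in range(0, len(received), len(G)):
        block = received[t:t + len(G)]
        new_metric = [INF] * n_states
        new_paths = [None] * n_states
        for s in range(n_states):
            if metric[s] == INF:
                continue
            for b in (0, 1):                  # hypothesize each input bit
                reg = (b << (K - 1)) | s
                expected = [(bin(reg & g).count("1") & 1) for g in G]
                dist = sum(x != y for x, y in zip(block, expected))
                ns = reg >> 1
                if metric[s] + dist < new_metric[ns]:
                    new_metric[ns] = metric[s] + dist
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(n_states), key=lambda s: metric[s])
    return paths[best]

bits = [1, 0, 1, 1, 0, 0]
enc = encode(bits)          # 12 coded bits for 6 information bits
decoded = viterbi(enc)      # recovers the information bits exactly
```

Because the code's free distance exceeds 2, the decoder also recovers the input when a single early bit of `enc` is flipped, which is the error-correcting behavior the trellis search provides.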
42.
Participating media with an inhomogeneous index of refraction make light follow curved paths. Simulating this in a global illumination environment has usually been neglected due to the complexity of the calculations involved, sacrificing physically accurate simulation for efficient visual results. This paper aims to simulate non-linear media in a more reasonable time than previous works without losing physical correctness. Accuracy is achieved by solving the Eikonal equation of geometrical optics, which describes the path followed by a light beam that traverses a non-linear medium. This equation is used in the context of a photon mapping extension.
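For context, the Eikonal equation mentioned in this abstract relates the optical path length (eikonal) S to a spatially varying refractive index n; the statement below is the standard textbook form, not quoted from the paper:

```latex
\lvert \nabla S(\mathbf{x}) \rvert = n(\mathbf{x})
\qquad\Longrightarrow\qquad
\frac{\mathrm{d}}{\mathrm{d}s}\!\left( n(\mathbf{x})\,\frac{\mathrm{d}\mathbf{x}}{\mathrm{d}s} \right) = \nabla n(\mathbf{x})
```

The second form is the ray equation that follows from the first: when n is constant its right-hand side vanishes and rays are straight lines, which is why curved light paths arise only in inhomogeneous (non-linear) media.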
43.
Luis Armando Rosas Rivera, Norma F. Hubele, Frederick P. Lawrence. Computers & Industrial Engineering, 1995, 29(1-4): 55-58
Process capability indices (PCIs) are used in industry to assess percentages of nonconforming parts. An underlying assumption is that the output process measurements are distributed as normal random variables. When normal distributions are assumed but different distributions are present - such as skewed, heavy-tailed, and short-tailed distributions - the percentages of nonconforming parts are significantly different from what the computed PCIs indicate. Data arising from nonnormal distributions can sometimes be transformed to conform to the normality assumption and the PCIs computed for the transformed data. In this paper, the effect of the transformation on the estimate of nonconforming parts is examined for three examples of nonnormal distributions - gamma, lognormal, and Weibull. The results of this experimental analysis suggest that data transformation can be useful for estimating an interval for Cpk values and the number of nonconforming parts.
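The transformation idea in this abstract can be illustrated with a short, self-contained sketch. All limits, distribution parameters, and function names below are illustrative assumptions, not values from the paper: a normal-theory estimate of the nonconforming fraction is badly biased on raw lognormal data, but agrees with the observed fraction once the data and the specification limits are log-transformed.

```python
# Illustrative sketch: Cpk and the nonconforming fraction implied by a
# normality assumption, on raw lognormal data vs. log-transformed data.
import math
import random

def cpk(xs, lsl, usl):
    """Usual Cpk index: min(USL - mean, mean - LSL) / (3 * sample sd)."""
    mu = sum(xs) / len(xs)
    sd = math.sqrt(sum((x - mu) ** 2 for x in xs) / (len(xs) - 1))
    return min(usl - mu, mu - lsl) / (3 * sd)

def norm_cdf(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def implied_nonconforming(xs, lsl, usl):
    """Fraction outside the limits *if* the data were normal."""
    mu = sum(xs) / len(xs)
    sd = math.sqrt(sum((x - mu) ** 2 for x in xs) / (len(xs) - 1))
    return norm_cdf((lsl - mu) / sd) + (1 - norm_cdf((usl - mu) / sd))

random.seed(0)
data = [random.lognormvariate(0.0, 0.4) for _ in range(5000)]
lsl, usl = 0.3, 3.0  # illustrative specification limits

# Raw (skewed) data: the normal-theory estimate is badly biased high.
raw = implied_nonconforming(data, lsl, usl)
actual = sum(x < lsl or x > usl for x in data) / len(data)

# Log transform the data AND the limits: the transformed data are normal,
# so the normal-theory estimate now tracks the observed fraction.
logged = [math.log(x) for x in data]
trans = implied_nonconforming(logged, math.log(lsl), math.log(usl))
```

The key detail is that the specification limits must be mapped through the same transformation as the data before Cpk or tail fractions are computed.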
44.
Campillo Mauricio, Sedaghati Ramin, Drew Robin A. L., Alfonso Ismeli, Pérez Luis. Engineering with Computers, 2021, 38(3): 1767-1785
Engineering with Computers - In this article, a methodology based on Discrete Element Method (DEM) and Finite Elements Method (FEM) combined with modified Approximate Periodic Boundary Condition...
45.
Nowadays, cities are the most relevant type of human settlement, and their populations have been growing steadily for decades. At the same time, we are witnessing an explosion of digital data that capture many different aspects and details of city life. This allows human mobility patterns in urban areas to be detected in more detail than ever before. In this context, based on the fusion of mobility data from different and heterogeneous sources, such as public transport, transport-network connectivity and Online Social Networks, this study puts forward a novel approach to uncover the actual land use of a city. Unlike previous solutions, our work avoids a time-invariant approach and considers the temporal factor, based on the assumption that urban areas are not used by citizens in the same manner at all times. We have tested our solution in two different cities, obtaining high accuracy rates.
46.
The Journal of Supercomputing - This article presents a set of linear regression models to predict the impact of task migration on different objectives, like performance and energy consumption. It...
47.
Quesada-Barriuso Pablo, Blanco Heras Dora, Argüello Francisco. The Journal of Supercomputing, 2021, 77(9): 10040-10052
The Journal of Supercomputing - The high computational cost of the superpixel segmentation algorithms for hyperspectral remote sensing images makes them ideal candidates for parallel computation....
48.
Moutafis Panagiotis, García-García Francisco, Mavrommatis George, Vassilakopoulos Michael, Corral Antonio, Iribarne Luis. Distributed and Parallel Databases, 2021, 39(3): 733-784
Distributed and Parallel Databases - Given two datasets of points (called Query and Training), the Group K Nearest-Neighbor (GKNN) query retrieves the K points of the Training dataset with the smallest sum...
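The query semantics described in this abstract can be stated in a few lines. This brute-force sketch is illustrative only (the paper's contribution is a distributed algorithm, not this naive scan): it returns the K training points whose summed distance to all query points is smallest.

```python
# Illustrative sketch: brute-force Group K Nearest-Neighbor (GKNN) query.
# The group distance of a training point is its summed Euclidean distance
# to every point of the Query set; the K smallest win.
import heapq
import math

def gknn(query, training, k):
    def group_dist(p):
        return sum(math.dist(p, q) for q in query)
    return heapq.nsmallest(k, training, key=group_dist)

query = [(0.0, 0.0), (2.0, 0.0)]
training = [(1.0, 0.0), (1.0, 5.0), (10.0, 10.0), (0.5, 0.1)]
result = gknn(query, training, 2)
```

Here `(1.0, 0.0)` lies between both query points (group distance 2.0) and `(0.5, 0.1)` is the next closest, so those two are returned, in ascending order of group distance.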
49.
Tierny J, Daniels J, Nonato LG, Pascucci V, Silva CT. IEEE Transactions on Visualization and Computer Graphics, 2012, 18(10): 1650-1663
Creating high-quality quad meshes from triangulated surfaces is a highly nontrivial task that necessitates consideration of various application-specific metrics of quality. In our work, we follow the premise that automatic reconstruction techniques may not generate outputs meeting all the subjective quality expectations of the user. Instead, we put the user at the center of the process by providing a flexible, interactive approach to quadrangulation design. By combining scalar field topology and combinatorial connectivity techniques, we present a new framework, following a coarse-to-fine design philosophy, which allows for explicit control of the subjective quality criteria on the output quad mesh, at interactive rates. Our quadrangulation framework uses the new notion of Reeb atlas editing to define, with a small number of interactions, a coarse quadrangulation of the model, capturing the main features of the shape, with user-prescribed extraordinary vertices and alignment. Fine-grain tuning is easily achieved with the notion of connectivity texturing, which allows for additional extraordinary vertex specification and explicit feature alignment, to capture the high-frequency geometries. Experiments demonstrate the interactivity and flexibility of our approach, as well as its ability to generate quad meshes of arbitrary resolution with high-quality statistics, while meeting the user's own subjective requirements.
50.
Julián Luengo, José A. Sáez, Francisco Herrera. Soft Computing - A Fusion of Foundations, Methodologies and Applications, 2012, 16(5): 863-881
Fuzzy rule-based classification systems (FRBCSs) are known for their ability to deal with low-quality data and obtain good results in these scenarios. However, their application to problems with missing data is uncommon, even though in real-life data mining the information is frequently incomplete because of missing attribute values. Several schemes have been studied to overcome the drawbacks produced by missing values in data mining tasks; one of the best known is based on preprocessing and is known as imputation. In this work, we focus on FRBCSs and present and analyze 14 different approaches to the treatment of missing attribute values. The analysis involves three different methods, in which we distinguish between Mamdani and TSK models. The results obtained establish the convenience of using imputation methods for FRBCSs with missing values. The analysis suggests that each type behaves differently, and that the use of particular missing-value imputation methods can improve the accuracy obtained. Thus, the choice of imputation method should be conditioned on the type of FRBCS.