81.
In this paper, a new approach to off-line signature verification is proposed, based on two-class classifiers combined in an ensemble of expert decisions. Different methods to extract sets of local and global features from the target sample are detailed, and a normalization by confidence voting is used to decrease the final equal error rate (EER). In one approach each set of features is processed by a single expert; in the other, the decisions of the individual classifiers are combined using weighted votes. Experimental results are given on a subcorpus of the large MCYT signature database for random and skilled forgeries. The results show that the weighted combination significantly outperforms the individual classifiers. The best EERs obtained were 6.3% for skilled forgeries and 2.31% for random forgeries.
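As a rough illustration of the weighted-vote fusion described above, the following Python sketch combines hypothetical per-expert confidence scores; the weights, scores and threshold are made up and are not taken from the paper.

```python
import numpy as np

# Toy sketch of weighted-vote fusion of expert decisions (hypothetical values,
# not the authors' implementation). Each expert outputs a confidence in [0, 1]
# that the questioned signature is genuine.
expert_scores = np.array([0.72, 0.40, 0.65])   # one score per feature set / expert
expert_weights = np.array([0.5, 0.2, 0.3])     # e.g. derived from validation performance

# Normalize the weights and fuse the confidences by weighted voting.
expert_weights = expert_weights / expert_weights.sum()
fused_score = float(expert_weights @ expert_scores)

# The decision threshold would be tuned on a validation set to balance
# false acceptances and false rejections (the operating point near the EER).
threshold = 0.5
decision = "genuine" if fused_score >= threshold else "forgery"
print(fused_score, decision)
```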
82.
Fuzzy rule-based classification systems (FRBCSs) are known for their ability to deal with low-quality data and to obtain good results in such scenarios. However, their application to problems with missing data is uncommon, even though real-life data in data mining is frequently incomplete due to missing attribute values. Several schemes have been studied to overcome the drawbacks produced by missing values in data mining tasks; one of the best known is based on preprocessing, usually called imputation. In this work, we focus on FRBCSs and present and analyze 14 different approaches to the treatment of missing attribute values. The analysis involves three different FRBCS methods, distinguishing between Mamdani and TSK models. The results confirm the convenience of using imputation methods for FRBCSs with missing values. The analysis also suggests that each type of FRBCS behaves differently and that certain imputation methods can improve the accuracy obtained, so the choice of imputation method should be conditioned on the type of FRBCS.
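As a generic illustration of the imputation preprocessing step mentioned above (not any specific method studied in the paper), a minimal mean/mode imputer might look like this:

```python
import numpy as np

def impute_mean_mode(X, categorical):
    """Fill missing values (np.nan) column by column: mean for numeric
    attributes, most frequent value for categorical ones. A generic sketch
    of the imputation preprocessing step, not a method from the paper."""
    X = X.astype(float).copy()
    for j in range(X.shape[1]):
        col = X[:, j]
        missing = np.isnan(col)
        if not missing.any():
            continue
        observed = col[~missing]
        if categorical[j]:
            values, counts = np.unique(observed, return_counts=True)
            fill = values[np.argmax(counts)]   # mode of the observed values
        else:
            fill = observed.mean()             # mean of the observed values
        col[missing] = fill
    return X

# Example: two numeric attributes and one categorical attribute (encoded as codes).
X = np.array([[1.0, np.nan, 0], [2.0, 3.0, 1], [np.nan, 5.0, 1]])
print(impute_mean_mode(X, categorical=[False, False, True]))
```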
83.
Adaptive anisotropic refinement of finite element meshes allows one to reduce the computational effort required to achieve a specified accuracy in the solution of a PDE problem. We present a new approach to adaptive refinement and demonstrate that it allows the construction of algorithms which generate very flexible and efficient anisotropically refined meshes, even improving the convergence order compared to adaptive isotropic refinement when the problem permits.
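A toy sketch of the idea behind anisotropic (as opposed to isotropic) refinement, assuming hypothetical directional error indicators eta_x and eta_y for a rectangular cell; this is not the paper's algorithm:

```python
def mark_anisotropic(eta_x, eta_y, anisotropy_ratio=4.0):
    """Toy marking rule for a rectangular cell with directional error
    indicators eta_x, eta_y (hypothetical names, not the paper's method).
    Anisotropic refinement bisects only the direction carrying most of the
    error; when the directional errors are comparable, refine isotropically."""
    if eta_x > anisotropy_ratio * eta_y:
        return ("x",)            # error dominated by the x-direction
    if eta_y > anisotropy_ratio * eta_x:
        return ("y",)            # error dominated by the y-direction
    return ("x", "y")            # comparable errors: refine both directions

# Example: a boundary-layer-like cell where the error is strongly directional.
print(mark_anisotropic(eta_x=1.0e-2, eta_y=1.0e-4))   # -> ('x',)
print(mark_anisotropic(eta_x=3.0e-3, eta_y=2.0e-3))   # -> ('x', 'y')
```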
84.
Model-based testing focuses on testing techniques that rely on the use of models. The diversity of systems and software to be tested implies the need for research on a variety of models and methods for test automation. We briefly review this research area and introduce several papers selected from the 22nd International Conference on Testing Software and Systems (ICTSS).
85.
Membrane Computing is a discipline that aims to abstract formal computing models, called membrane systems or P systems, from the structure and functioning of living cells as well as from the cooperation of cells in tissues, organs, and other higher-order structures. This framework provides polynomial-time solutions to NP-complete problems by trading space for time, but its efficient simulation poses challenges in three different respects: the intrinsic massive parallelism of P systems, the exponential computational workspace, and the non-floating-point-intensive nature of the computations. In this paper, we analyze the simulation of a family of recognizer P systems with active membranes that solves the Satisfiability problem in linear time on different instances of Graphics Processing Units (GPUs). For efficient handling of the exponential workspace created by the P system computation, we enable different data policies to increase memory bandwidth and exploit data locality through tiling and dynamic queues. The parallelism inherent to the target P system is also managed to demonstrate that GPUs offer a valid alternative for high-performance computing at a considerably lower cost. Furthermore, scalability is demonstrated up to the largest problem size we were able to run, also considering Nvidia's newer hardware generation, Fermi, for a total speed-up exceeding four orders of magnitude when running our simulations on a Tesla S2050 server.
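The space-for-time trade can be illustrated with a sequential toy sketch in which each "membrane division" step doubles the number of membranes, each carrying a partial truth assignment; this is a conceptual illustration only, not the GPU simulator described in the paper:

```python
def sat_by_membrane_division(clauses, n_vars):
    """Conceptual sketch of how recognizer P systems with active membranes
    trade space for time: each step divides every membrane in two, fixing the
    next variable to False/True, so after n steps 2**n membranes each hold a
    full assignment that could be checked in parallel. Sequential toy code,
    not the authors' GPU simulator."""
    membranes = [()]                       # each membrane carries a partial assignment
    for _ in range(n_vars):
        membranes = [a + (v,) for a in membranes for v in (False, True)]
    # Each clause is a list of literals: positive int i means x_i, negative means not x_i.
    def satisfied(assignment):
        return all(any(assignment[abs(l) - 1] == (l > 0) for l in clause)
                   for clause in clauses)
    return any(satisfied(a) for a in membranes)

# (x1 or not x2) and (x2 or x3)
print(sat_by_membrane_division([[1, -2], [2, 3]], n_vars=3))   # True
```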
86.
We discuss how standard Cost-Benefit Analysis (CBA) should be modified in order to take risk (and uncertainty) into account. We propose different approaches used in finance (Value at Risk, Conditional Value at Risk, Downside Risk Measures, and the Efficiency Ratio) as useful tools to model the impact of risk in project evaluation. After introducing the concepts, we show how they can be used in CBA and provide some simple examples to illustrate how such concepts can be applied to evaluate the desirability of a new infrastructure project.
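A minimal sketch of how Value at Risk and Conditional Value at Risk could be computed from a simulated distribution of a project's net present value; the distribution parameters below are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Simple sketch of Value at Risk (VaR) and Conditional Value at Risk (CVaR)
# applied to a simulated distribution of a project's net present value (NPV).
rng = np.random.default_rng(0)
npv = rng.normal(loc=10.0, scale=15.0, size=100_000)   # simulated NPV outcomes

alpha = 0.95
# VaR at level alpha: the loss threshold exceeded with probability (1 - alpha).
var = -np.quantile(npv, 1 - alpha)
# CVaR: the expected loss given that the loss exceeds the VaR threshold.
cvar = -npv[npv <= np.quantile(npv, 1 - alpha)].mean()

print(f"Expected NPV: {npv.mean():.2f}")
print(f"VaR({alpha:.0%}):  {var:.2f}")
print(f"CVaR({alpha:.0%}): {cvar:.2f}")
```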
87.
An assessment was made of the microbiological quality of the final product (different retail cuts) produced by two retail supermarket groups (A and B). The influence of sanitary conditions on the microbiological quality of the final product was evaluated, and possible influences on shelf life were determined. Supermarket group A (Sup group A) received carcasses with significantly lower surface counts of micro-organisms than supermarket group B (Sup group B), while the latter had a more efficient overall sanitation programme. Five cuts were monitored for total aerobic counts, psychrotrophic counts, lactobacilli, Enterobacteriaceae and Pseudomonadaceae. A shelf-life study was also executed by repeating these enumerations on the same meat samples after refrigerated storage at 5°C for 2 and 4 days, respectively. It is generally accepted that a good refrigeration or chilling regime will preserve the inherent meat quality, but in this study neither served as a guarantee of quality. The more stringent hygiene at retail level of Sup group B yielded consistently lower mean counts of the different bacterial groups for all the meat cuts monitored, and thus meat with an extended shelf life. The total count (at 30°C) on meat cuts was the highest, followed by the psychrotrophs, the Pseudomonadaceae, the Enterobacteriaceae and the lactobacilli. Minced meat generally had the highest mean aerobic total microbial counts; this count might be a suitable indicator for monitoring the overall sanitary condition of a retail premises. The results re-emphasized the multi-factorial complexity of fresh meat quality and shelf life: the microbial quality of the raw material (carcasses), the maintenance of the cold chain, the sanitary condition of premises, equipment and personnel surfaces, and general management practices collectively determine the microbiological quality of the product.
88.
The influence of different centralised pre-packaging systems (PVC; modified atmosphere packaging (MAP), 25% CO₂ and 75% O₂; vacuum skin packaging (VSP); and the mother bag concept, 100% CO₂) on the shelf-life (0, 7, 14 and 21 days at 0°C) of fresh pork was determined using microbiological, colour, odour and acceptability characteristics. All the packaging treatments were equally efficient for the first 4 days of retail display. In the extended shelf-life study (7, 14 and 21 days) the mother bag centralised packaging system gave the most promising shelf-life results (21 days) and was also judged superior in terms of odour. Modified atmosphere packaging (14 days) and VSP (7 days) may be considered as other possible options.
89.
The statistical properties of the EEG and the MEG are described mathematically as the result of randomly distributed dipoles. These dipoles represent the interactions of cortical neurons. For certain dipole distributions, the first- and second-order moments of the electric and magnetic fields are derived analytically. If the dipoles are in a spherical volume conductor and have no preference for any direction, the variance of a differentially measured EEG signal is only a function of the electrode distance. In this paper, the theoretically derived variance function is compared with EEG and MEG measurements. It is shown that a dipole with a fixed position and a randomly fluctuating amplitude is an adequate model for the alpha rhythm. An expression for the covariance between the magnetic field and a differentially measured EEG signal is derived. This covariance is considered as a function of the magnetometer position and is compared with the measurements of Chapman et al. [23]. The theory can be used to obtain a (spatial) covariance matrix of the background noise that occurs in evoked potential measurements. Such a covariance matrix can be used to obtain a maximum likelihood estimator of the dipole parameters in evoked potential studies, to evaluate the merits of the so-called "Laplacian derivation," and for the interpolation of electromagnetic data.
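A generic sketch of how such a spatial noise covariance matrix enters a maximum likelihood (generalized least squares) estimate of source parameters in a linear forward model; the forward matrix and covariance below are synthetic and not derived from the paper's dipole model.

```python
import numpy as np

# Generic sketch: maximum likelihood estimation of source amplitudes in a
# linear forward model y = A @ q + n, with spatially correlated Gaussian
# background noise of covariance C. Under these assumptions the ML estimate
# is the generalized least squares solution. A and C are synthetic here.
rng = np.random.default_rng(1)
n_sensors, n_sources = 32, 2
A = rng.standard_normal((n_sensors, n_sources))   # forward (lead-field) matrix
# Synthetic spatial covariance: noise correlation decays with "sensor distance".
idx = np.arange(n_sensors)
C = 0.5 ** np.abs(idx[:, None] - idx[None, :])

q_true = np.array([2.0, -1.0])
noise = rng.multivariate_normal(np.zeros(n_sensors), C)
y = A @ q_true + noise

# Generalized least squares / ML estimate: q = (A^T C^-1 A)^-1 A^T C^-1 y.
Cinv_A = np.linalg.solve(C, A)
Cinv_y = np.linalg.solve(C, y)
q_hat = np.linalg.solve(A.T @ Cinv_A, A.T @ Cinv_y)
print(q_true, q_hat)
```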
90.
This study attempted to identify the major sources of work-related stress among telephone operators, with special emphasis on computer monitoring and telephone surveillance. A cross-sectional random sample of over 700 telephone operators participated in a questionnaire survey (response rate = 88%). The survey included items designed to measure perceived stress, management practices, specific job stressors and monitoring preferences. Call-time pressure items were most strongly linked to job stress by operators, with 70% reporting that the difficulty of serving a customer well while still keeping call time down contributed to their feelings of stress to a large or very large extent. About 55% of operators reported that telephone monitoring contributed to their feelings of job stress. If given the opportunity, 44% of operators stated they would prefer not to be monitored by telephone at all, while 23% stated they would prefer some monitoring; 33% had no preference. The setting of inappropriate individual call-time objectives, which may be consistently unachievable for some operators and which create conflict between management demands for quantity and quality and between workers' values concerning quality and productivity demands, appears to be the most stress-inducing aspect of the job. In terms of telephone surveillance, the issues of timeliness and specificity of feedback appear to be less important than call-time pressure.