101.
With the wide availability of digital video content on the internet, users need more assistance in accessing digital videos. A variety of research has been done on video summarization and semantic video analysis to help satisfy these needs. These works develop condensed versions of a full-length video stream by identifying the most important and pertinent content within the stream. Most of the existing work in these areas focuses on event mining. Event mining from video streams improves the accessibility and reusability of large media collections, and it has been an active area of research with notable recent progress. Event mining covers a wide range of multimedia domains such as surveillance, meetings, broadcast, news, sports, documentaries, and films, as well as personal and online media collections. Given the variety and abundance of event mining techniques, in this paper we propose an analytical framework to classify event mining techniques and to evaluate them against important functional measures. This framework can enable empirical and technical comparison of event mining methods and the development of more efficient structures in the future.
102.
Robot manufacturers will be required to demonstrate objectively that all reasonably foreseeable hazards have been identified in any robotic product design that is to be marketed commercially. This is problematic for autonomous mobile robots because conventional methods, developed for automatic systems, do not help safety analysts identify non-mission interactions: interactions with environmental features that are not directly associated with the robot's design mission, and which may comprise the majority of the required tasks of autonomous robots. In this paper we develop a new variant of preliminary hazard analysis explicitly aimed at identifying non-mission interactions by means of new sets of guidewords not normally found in existing variants. We develop the required features of the method and describe its application in several small trials conducted at Bristol Robotics Laboratory in the 2011–2012 period.
103.
Cloud computing takes the form of distributed computing, using multiple computers to execute computations simultaneously on the service side. To process the growing quantity of multimedia data, numerous large-scale multimedia data storage and computing techniques have been developed for the cloud. Among them, Hadoop plays a key role: a computing cluster built from low-priced hardware, Hadoop can carry out parallel computation over petabytes of multimedia data and features high reliability, high efficiency, and high scalability. These large-scale multimedia data computing techniques include not only the core techniques, Hadoop and MapReduce, but also data collection techniques such as the File Transfer Protocol and Flume, as well as distributed system configuration allocation, automatic installation, and monitoring platform construction and management. Only by integrating all of these techniques can a reliable large-scale multimedia data platform be offered. In this paper, we show how cloud computing can make a breakthrough by proposing a multimedia social network dataset on the Hadoop platform and implementing a prototype version. Detailed specifications and design issues are discussed as well. An important finding of this article is that multimedia social networking analysis takes less time on the cloud Hadoop platform than on a single computer. The advantages of cloud computing over traditional data processing practices are fully demonstrated in this article. Applicable framework designs and tools available for large-scale data processing are also proposed, and we report the experimental multimedia data, including data sizes and processing times.
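The abstract gives no code; as a minimal sketch of the MapReduce style such a platform builds on, the following Hadoop Streaming-style mapper and reducer count interactions per user in a hypothetical tab-separated multimedia social network log (the record layout `user_id<TAB>media_id<TAB>action` is an assumption for illustration, not taken from the paper).

```python
#!/usr/bin/env python3
# Hadoop Streaming sketch: count interactions per user.
# Hypothetical input record: user_id<TAB>media_id<TAB>action
import sys

def mapper():
    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if len(fields) >= 3:
            print(f"{fields[0]}\t1")           # emit (user_id, 1)

def reducer():
    current, count = None, 0
    for line in sys.stdin:                     # Hadoop sorts by key between phases
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = key, 0
        count += int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```

The same two functions can be tested locally without a cluster: `cat log.tsv | ./script.py map | sort | ./script.py`.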
104.
A new variant of Differential Evolution (DE), called ADE-Grid, is presented in this paper; it adapts the mutation strategy, crossover rate (CR), and scale factor (F) during the run. In ADE-Grid, learning automata (LA), which are powerful decision-making machines, are used to adaptively determine the proper values of the parameters CR and F and the suitable strategy for constructing a mutant vector for each individual. The proposed automata-based DE is able to maintain diversity among the individuals and encourage them to move toward several promising areas of the search space as well as the best position found. Numerical experiments are conducted on a set of twenty-four well-known benchmark functions and one real-world engineering problem. The performance comparison between ADE-Grid and other state-of-the-art DE variants indicates that ADE-Grid is a viable approach for optimization. The results also show that ADE-Grid improves the performance of DE in terms of both convergence speed and quality of the final solution.
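For context, here is a minimal sketch of the classic DE/rand/1/bin baseline with fixed F and CR; ADE-Grid's contribution, adapting F, CR, and the mutation strategy per individual through learning automata, is not reproduced here.

```python
import numpy as np

def de_rand_1_bin(fitness, bounds, pop_size=30, F=0.5, CR=0.9,
                  generations=200, seed=0):
    """Plain DE/rand/1/bin (minimization); ADE-Grid adapts F, CR and the
    mutation strategy per individual instead of keeping them fixed."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([fitness(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            r1, r2, r3 = rng.choice(
                [j for j in range(pop_size) if j != i], 3, replace=False)
            mutant = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), lo, hi)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True    # at least one gene from mutant
            trial = np.where(cross, mutant, pop[i])
            f_trial = fitness(trial)
            if f_trial <= fit[i]:              # greedy one-to-one selection
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]

# Example: minimize the sphere function on [-5, 5]^10
# x, fx = de_rand_1_bin(lambda v: float(v @ v), np.array([[-5.0, 5.0]] * 10))
```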
105.
In the present study, a group method of data handling (GMDH) network was utilized to predict the scour depth below pipelines. The GMDH network was developed using back propagation. The input parameters considered to affect the scour depth were sediment size, pipeline geometry, and approaching flow characteristics. Training and testing of the GMDH networks were carried out using nondimensional data sets collected from the literature. These data sets cover the two main situations in pipeline scour experiments, namely clear-water and live-bed conditions. The testing performance was compared with support vector machines (SVM) and existing empirical equations. The GMDH network with back propagation produced lower scour depth prediction error than the SVM and the empirical equations. The effects of the individual input parameters on the scour depth were also investigated.
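As a rough illustration of the GMDH structure (the paper trains its network with back propagation; this sketch instead uses the classical least-squares fit of quadratic partial descriptions, and for brevity ranks neurons by training RMSE where real GMDH would use a separate validation set):

```python
import numpy as np

def design(xi, xj):
    """Design matrix of one quadratic partial description:
    y ~ a0 + a1*xi + a2*xj + a3*xi^2 + a4*xj^2 + a5*xi*xj."""
    return np.column_stack([np.ones_like(xi), xi, xj, xi**2, xj**2, xi * xj])

def gmdh_layer(X, y, keep=4):
    """Fit one neuron per input pair and keep the `keep` best by RMSE.
    The outputs of the survivors become the next layer's inputs."""
    _, d = X.shape
    candidates = []
    for i in range(d):
        for j in range(i + 1, d):
            A = design(X[:, i], X[:, j])
            coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
            pred = A @ coeffs
            rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
            candidates.append((rmse, pred))
    candidates.sort(key=lambda t: t[0])
    return np.column_stack([p for _, p in candidates[:keep]])
```

Stacking such layers until the error stops improving yields the self-organizing polynomial network that the study fits to the nondimensional scour data.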
106.
In this paper, a novel algorithm for image encryption based on a hash function is proposed. In our algorithm, a 512-bit external secret key is used as the input to the Salsa20 hash function. First, the hash function is modified to generate a key stream more suitable for image encryption. Then the final encryption key stream is produced by correlating the key stream with the plaintext, which yields both key sensitivity and plaintext sensitivity. The scheme achieves high sensitivity, high complexity, and high security with only two rounds of diffusion. In the first round, the original image is partitioned horizontally into an array of 1,024 sections of size 8 × 8. In the second round, the same operation is applied vertically to the transpose of the resulting array. The main idea of the algorithm is to use the average of the image data for encryption: to encrypt each section, the average of the other sections is employed. The algorithm therefore uses different averages when encrypting different input images (even with the same hash-based key sequence), which significantly increases the resistance of the cryptosystem against known/chosen-plaintext and differential attacks. It is demonstrated that the 2D correlation coefficient (CC), peak signal-to-noise ratio (PSNR), encryption quality (EQ), entropy, mean absolute error (MAE), and decryption quality satisfy the security and performance requirements (CC <0.002177, PSNR <8.4642, EQ >204.8, entropy >7.9974 and MAE >79.35). The number of pixels change rate (NPCR) analysis reveals that when only one pixel of the plain image is modified, almost all of the cipher pixels change (NPCR >99.6125 %), and the unified average changing intensity is high (UACI >33.458 %). Moreover, the proposed algorithm is very sensitive to small changes (e.g., modification of only one bit) in the external secret key (NPCR >99.65 %, UACI >33.55 %). The algorithm is shown to yield better security performance than other algorithms.
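The NPCR and UACI figures quoted above have standard definitions: NPCR is the percentage of pixel positions at which two cipher images differ, and UACI is their mean absolute intensity difference as a percentage of the full scale (255). A minimal implementation:

```python
import numpy as np

def npcr_uaci(c1, c2):
    """NPCR/UACI between two equal-shape uint8 cipher images."""
    c1 = c1.astype(np.int16)                   # avoid uint8 wrap-around
    c2 = c2.astype(np.int16)
    npcr = 100.0 * np.mean(c1 != c2)           # % of differing pixels
    uaci = 100.0 * np.mean(np.abs(c1 - c2)) / 255.0
    return npcr, uaci
```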
107.
We have developed novel catalysts for the gasification of biomass with much higher energy efficiency than conventional methods (no catalyst, dolomite, or a commercial steam-reforming Ni catalyst). From the results of cellulose gasification over the novel Rh/CeO2/SiO2 catalysts, we found that the gasification process consists of the reforming of tar and the combustion of solid carbon. We also tested Rh/CeO2/SiO2 in gasification with air, pyrogasification, and steam reforming of cedar wood; Rh/CeO2/SiO2 gave a higher yield of syngas than the conventional steam-reforming Ni catalyst. Furthermore, we compared the performance of single- and dual-bed reactors. The single-bed reactor was effective for the gasification of cedar; however, it was not suitable for the gasification of rice straw, since rapid deactivation was observed. Gasification of rice straw, jute stick, and bagasse using the fluidized dual-bed reactor with Rh/CeO2/SiO2 was also investigated. In particular, the catalyst stability in the gasification of rice straw was clearly enhanced by using the fluidized dual-bed reactor.
108.
Joining of sintered Si3N4 was performed using a high-temperature brazing technique. Ni-based brazing alloys with the same Ni:Cr ratio as AWS BNi-5 (Ni·18Cr·19Si (at.%)) but different Si contents were used as the brazing filler metals. Joining experiments were performed at 1220°C under a N2 partial pressure of 15 Pa for times between 5 and 15 min. The highest room-temperature four-point bend strength of the joints was 115 MPa, whereas 220 MPa was achieved when the joints were tested at 900°C. The high strength of the experimental joints was attributed to the reduction of residual stresses and the formation of a CrN reaction layer at the ceramic/filler-metal interface.
109.
The cultivation of toxic lignocellulosic hydrolyzates has become a challenging research topic in recent decades. Although several cultivation methods have been proposed, numerous questions remain regarding their industrial application. The current work presents a solution to this problem with good potential for application on an industrial scale. A toxic dilute-acid hydrolyzate was cultivated continuously using a high-cell-density flocculating yeast in single and serial bioreactors equipped with a settler to recycle the cells back to the bioreactors. No prior detoxification of the hydrolyzate was necessary, as the flocs were able to detoxify it in situ. The experiments were carried out successfully at dilution rates up to 0.52 h−1. The cell concentration inside the bioreactors was between 23 and 35 g-DW/L, while the concentration in the settler effluent was 0.32 ± 0.05 g-DW/L. An ethanol yield of 0.42–0.46 g/g consumed sugar was achieved, and the residual sugar concentration was less than 6% of the initial fermentable sugar (glucose, galactose, and mannose) of 35.2 g/L.
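A back-of-the-envelope check of these figures (an illustration derived from the reported numbers, not a calculation from the paper):

```python
# Ethanol balance implied by the reported yield and residual sugar.
initial_sugar = 35.2                    # g/L fermentable sugar (glu + gal + man)
consumed = initial_sugar * (1 - 0.06)   # residual < 6%  ->  ~33.1 g/L consumed
for y in (0.42, 0.46):                  # reported yield range, g ethanol / g sugar
    print(f"yield {y:.2f} g/g -> ~{y * consumed:.1f} g/L ethanol")
# roughly 13.9-15.2 g/L ethanol in the effluent
```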
110.
In clustering algorithms, one of the main challenges is to solve the global allocation of the clusters rather than merely tuning the partition borders locally. Despite this, all external cluster validity indexes calculate only point-level differences between two partitions, with no direct information about how similar their cluster-level structures are. In this paper, we introduce a cluster-level measure called the centroid index. The measure is intuitive, simple to implement, fast to compute, and applicable in the case of model mismatch as well. To a certain extent, we expect it to generalize to clustering models beyond centroid-based k-means.
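A minimal sketch of the idea as described here, assuming the index counts "orphan" centroids under nearest-neighbour mapping taken in both directions (the paper's exact definition may differ in details):

```python
import numpy as np

def centroid_index(A, B):
    """Cluster-level dissimilarity between two centroid sets (rows).
    Map every centroid in one set to its nearest centroid in the other;
    centroids that are never chosen are orphans. CI = 0 means the two
    clusterings have matching cluster-level structure."""
    def orphans(src, dst):
        d = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=2)
        nearest = d.argmin(axis=1)             # nearest dst for each src
        mapped = np.zeros(len(dst), dtype=bool)
        mapped[nearest] = True
        return int((~mapped).sum())
    return max(orphans(A, B), orphans(B, A))
```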