21.
Water splitting for H2 production by harvesting sunlight is widely used to counter the existing energy crisis and the environmental problems caused by excessive use of fossil fuels. We report a versatile and facile method to fabricate ordered silicon nanohorns (SiNHs) by employing a prefabricated metal nano-gap template on Si. A close-packed monolayer is used to develop the nanohole template, which enables the generation of SiNHs via metal-assisted controlled chemical etching. By varying the monolayer parameters and etching sequences, SiNHs with the desired dimensions were obtained. Growth along the ⟨100⟩ direction of the base substrate, with a consistent bend at the tip of each SiNH, was observed. The resulting SiNHs exhibited enhanced photoelectrochemical properties, with a short-circuit photocurrent density more than four times that of planar Si, along with enhanced trapping of photogenerated carriers. A photocurrent density of ~4.8 mA/cm² was observed at a potential of -1 V vs. RHE. Further, electrochemical impedance spectroscopy (EIS) was carried out to understand the photoelectrochemical activity and charge transfer kinetics of the SiNH system. These nanostructures enhance light absorption and may be a low-cost alternative for optical devices, sensors, and hydrogen evolution.
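As a rough sanity check on the reported photocurrent (not a figure from the paper), Faraday's law bounds the hydrogen evolution rate that ~4.8 mA/cm² could sustain, assuming 100% faradaic efficiency:

```python
# Back-of-envelope check (illustrative, not from the paper): converting
# the reported short-circuit photocurrent density into an upper-bound H2
# evolution rate via Faraday's law, assuming 100% faradaic efficiency.

F = 96485.0           # Faraday constant, C/mol
J = 4.8e-3            # photocurrent density, A/cm^2 (~4.8 mA/cm^2 at -1 V vs. RHE)
electrons_per_H2 = 2  # 2 H+ + 2 e- -> H2

rate_mol = J / (electrons_per_H2 * F)  # mol H2 per second per cm^2
print(f"{rate_mol * 3600 * 1e6:.1f} umol H2 h^-1 cm^-2")  # ~= 89.5
```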
22.
In wireless sensor networks (WSNs), clustering has been shown to be an efficient technique for improving scalability and network lifetime. In clustered networks, however, clustering creates an unequal load distribution between cluster heads (CHs) and cluster member (CM) nodes. As a result, the entire network is subject to premature death once too few active nodes remain. In this paper, we present an energy-aware clustering-based routing algorithm, the "green cluster-based routing scheme," that balances the trade-off between load distribution and network lifetime and prevents the premature death of large-scale dense WSNs. To deal with the uncertainty present in network information, a fuzzy rule-based node classification model is proposed for clustering. Its primary benefits are flexibility in selecting effective CHs, reliability in distributing CH overload among the other nodes, and reduced communication overhead and cluster formation time in highly dense areas. In addition, we propose a routing scheme that balances the load among sensors. The proposed scheme is evaluated through simulations against existing algorithms from the literature. The numerical results show the relevance and improved efficiency of our scheme.
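As an illustration of the fuzzy rule-based classification idea, the sketch below scores a node's cluster-head eligibility from two assumed inputs, residual energy and node degree; the paper's actual inputs, membership functions, and rule base are not given in the abstract:

```python
# Hypothetical sketch of fuzzy rule-based CH-eligibility scoring; the
# inputs, membership functions, and rules below are illustrative
# assumptions, not the paper's model.

def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def ch_score(residual_energy, node_degree):
    # Fuzzify: both inputs normalized to [0, 1].
    e_high = tri(residual_energy, 0.4, 1.0, 1.6)
    d_high = tri(node_degree, 0.3, 1.0, 1.7)
    e_low = tri(residual_energy, -0.6, 0.0, 0.6)
    # Mamdani-style rules, min for AND:
    # R1: energy HIGH and degree HIGH -> strong CH candidate
    # R2: energy LOW                  -> weak CH candidate
    strong = min(e_high, d_high)
    weak = e_low
    # Defuzzify as a weighted average of rule consequents (1.0 / 0.0).
    total = strong + weak
    return strong / total if total else 0.5

# Nodes with the highest score in a neighborhood become cluster heads.
print(ch_score(0.9, 0.8), ch_score(0.2, 0.9))
```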
23.
A microgrid is a small-scale centralized electricity system for a small community such as a village or commercial area. It consists of micro-sources such as distributed generators and solar and wind units, and it serves purposes such as reliability, cost reduction, emission reduction, efficiency improvement, use of renewable sources, and continuity of supply. In a microgrid, the energy management system must solve the Economic Load Dispatch (ELD) and Combined Economic Emission Dispatch (CEED) problems, which are commonly optimized with meta-heuristic techniques. The key objective of this paper is to solve the CEED problem to obtain the optimal system cost. CEED schedules the generating units within their bounds while minimizing both fuel cost and emission values. The recently introduced Interior Search Algorithm (ISA) is applied to the solution of the ELD and CEED problems. The minimization of total cost and total emission is obtained for four scenarios: all sources included, all sources without solar energy, all sources without wind energy, and all sources without solar and wind energy. In each scenario, ISA is compared with the Reduced Gradient Method (RGM), Ant Colony Optimization (ACO), and the Cuckoo Search Algorithm (CSA) for two cases: ELD without emission and CEED with emission. The results are calculated for different power demands over 24 h. ISA gives comparatively better cost reduction than RGM, ACO, and CSA, which shows the effectiveness of the algorithm.
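For reference, a minimal sketch of the combined dispatch objective that such meta-heuristics minimize. The quadratic fuel-cost and emission coefficients and the penalty factor h below are illustrative placeholders, not the paper's test-system data:

```python
# Sketch of a standard CEED objective of the kind ISA would minimize;
# the coefficients and penalty factor here are illustrative only.

def ceed_cost(P, fuel, emis, h):
    """Combined cost for a dispatch P (MW per unit).

    fuel/emis: per-unit quadratic coefficients (a, b, c) so that
    cost_i = a + b*P_i + c*P_i**2; h is the price penalty factor
    that folds emissions into a single objective.
    """
    total = 0.0
    for p, (a, b, c), (al, be, ga) in zip(P, fuel, emis):
        total += (a + b * p + c * p * p) + h * (al + be * p + ga * p * p)
    return total

# Two hypothetical units meeting a 300 MW demand (P1 + P2 = 300):
fuel = [(100.0, 2.00, 0.0010), (120.0, 1.80, 0.0015)]
emis = [(10.0, 0.05, 0.0002), (12.0, 0.04, 0.0003)]
print(ceed_cost([150.0, 150.0], fuel, emis, h=2.0))
```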
24.
A nine-level hybrid symmetric cascaded multilevel converter (MLC) fed induction motor drive is proposed in this paper. The proposed converter produces nine output voltage levels using the same number of power cells as a conventional five-level symmetric cascaded H-bridge converter. Each phase consists of one five-level transistor-clamped H-bridge (TCHB) power cell and one three-level H-bridge power cell with equal dc link voltages, connected in cascade. Because of the cascade connection and equal dc link voltages, the power shared by each power cell is nearly equal. This near-equal power sharing makes it possible to improve input current quality by using an appropriate phase-shifting multi-winding transformer at the converter input. In this paper, the operation of the converter is explained using staircase and hybrid multi-carrier sine PWM techniques. Further, a detailed analysis of the variations in the dc link capacitor voltages and the dc link mid-point voltage in the TCHB power cell is carried out, and the resulting analytical expressions are presented. The performance of the proposed system is analysed by simulating a 500 hp induction motor drive system in the MATLAB/Simulink environment. A laboratory prototype is also developed to validate the claims experimentally.
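To see where the nine levels come from, one can sum the per-cell output levels. The level sets below are the ones typically attributed to these topologies with equal dc links Vdc, assumed here rather than quoted from the paper:

```python
# Quick check of the level count for a five-level TCHB cell and a
# three-level H-bridge cell in cascade with equal dc links Vdc. The
# per-cell level sets are the usual ones for these topologies.

Vdc = 1.0
tchb_levels = [-Vdc, -Vdc / 2, 0.0, Vdc / 2, Vdc]  # 5-level TCHB cell
hbridge_levels = [-Vdc, 0.0, Vdc]                  # 3-level H-bridge cell

phase_levels = sorted({a + b for a in tchb_levels for b in hbridge_levels})
print(len(phase_levels), phase_levels)
# -> 9 [-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0]
```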
25.
Attacks on computer systems are attracting increased attention. While current trends in software vulnerability discovery indicate that the number of newly discovered vulnerabilities remains significant, the time between the public disclosure of a vulnerability and the release of an automated exploit is shrinking. Assessing vulnerability exploitability risk is therefore critical, because it allows decision-makers to prioritize among vulnerabilities, allocate resources to patch and protect systems, and choose between alternatives. Common Vulnerability Scoring System (CVSS) metrics have become the de facto standard for assessing the severity of vulnerabilities. However, the CVSS exploitability measures assign subjective values based on the views of experts. Two of the factors in CVSS, Access Vector and Authentication, are the same for almost all vulnerabilities. CVSS does not specify how the third factor, Access Complexity, is measured, and hence it is unknown whether it considers software properties at all. In this work, we introduce a novel measure, Structural Severity, which is based on software properties: attack entry points, vulnerability location, the presence of dangerous system calls, and reachability analysis. These properties represent metrics that can be objectively derived from attack surface analysis, vulnerability analysis, and exploitation analysis. To illustrate the proposed approach, 25 reported vulnerabilities of the Apache HTTP server and 86 reported vulnerabilities of the Linux kernel were examined at the source code level. The results show that the proposed approach, which uses more detailed information, can objectively measure vulnerability exploitability risk, and that its results can differ from the CVSS base scores.
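The abstract names the inputs to Structural Severity but not how they are combined, so the scoring and weights in the following sketch are purely hypothetical, meant only to show how such software properties could feed an objective score:

```python
# Illustrative sketch only: the aggregation and weights below are
# hypothetical, not the paper's Structural Severity formula.

from dataclasses import dataclass

@dataclass
class VulnProperties:
    reachable_from_entry: bool  # reachability analysis result
    entry_point_count: int      # attack entry points reaching the flaw
    in_privileged_code: bool    # vulnerability location
    dangerous_syscalls: int     # e.g. exec/system calls on the path

def structural_severity(v: VulnProperties) -> float:
    if not v.reachable_from_entry:
        return 0.0              # unreachable code cannot be exploited
    score = min(v.entry_point_count, 5) / 5.0
    score += 0.5 if v.in_privileged_code else 0.0
    score += 0.1 * min(v.dangerous_syscalls, 5)
    return round(min(score, 2.0), 2)

print(structural_severity(VulnProperties(True, 3, True, 2)))  # 1.3
```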
26.
In recent years, the problem of clustering microarray data has been gaining significant attention. However, most clustering methods attempt to find groups of genes when the number of clusters is known a priori. This motivated us to develop a new real-coded, improved differential evolution based automatic fuzzy clustering algorithm that automatically evolves both the number of clusters and the proper partitioning of a gene expression data set. To improve the result further, the clustering method is integrated with a support vector machine (SVM), a well-known technique for supervised learning. A fraction of the gene expression data points, selected from different clusters based on their proximity to the respective centers, is used for training the SVM. The cluster assignments of the remaining gene expression data points are thereafter determined using the trained classifier. The performance of the proposed clustering technique is demonstrated on five gene expression data sets by comparing it with differential evolution based automatic fuzzy clustering, variable-length genetic algorithm based fuzzy clustering, and the well-known Fuzzy C-Means algorithm. A statistical significance test has been carried out to establish the statistical superiority of the proposed clustering approach. A biological significance test has also been carried out using a web-based gene annotation tool to show that the proposed method produces biologically relevant clusters of genes. The processed data sets and the MATLAB version of the software are available at http://bio.icm.edu.pl/~darman/IDEAFC-SVM/.
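A minimal sketch of the cluster-then-refine step described above, with scikit-learn's KMeans standing in for the paper's differential-evolution-based fuzzy clustering (the DE operators, fuzzy memberships, and cluster-number evolution are omitted):

```python
# Sketch of the cluster-then-SVM refinement; KMeans is a stand-in for
# the paper's DE-evolved fuzzy clustering, and the 25% training
# fraction is an illustrative choice.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))  # toy "gene expression" matrix

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
dist = np.linalg.norm(X - km.cluster_centers_[km.labels_], axis=1)

# Train the SVM on the quarter of each cluster closest to its center...
confident = np.zeros(len(X), dtype=bool)
for c in range(3):
    idx = np.where(km.labels_ == c)[0]
    confident[idx[np.argsort(dist[idx])[: len(idx) // 4]]] = True

svm = SVC(kernel="rbf").fit(X[confident], km.labels_[confident])
# ...then reassign the remaining points with the trained classifier.
labels = km.labels_.copy()
labels[~confident] = svm.predict(X[~confident])
```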
27.
MgO nano-rods several microns in length and 50–100 nm in width were prepared by calcining the nesquehonite phase, obtained by simple precipitation using (NH4)2CO3 under ambient conditions. The MgO nano-rods, with reasonably high surface areas (75–120 m² g⁻¹), exhibit strong activity in solvent-free base-catalyzed Claisen-Schmidt condensation, giving 99% conversion in 2 h, and are easily recyclable with no significant change in catalytic activity. The observed effect is attributed to the presence of numerous basic sites of different strengths (surface hydroxyl groups, low-coordinate O²⁻ sites).
28.
Restrained shrinkage cracking is a critical issue that raises concerns about the widespread use of high-performance concrete (HPC) in bridge decks. The present study compares different HPC mixtures and proposes a concept for using local field data to define a threshold for cracking potential. We developed 18 HPC mixtures suitable for bridge decks in shrinkage-prone locations, using supplementary cementitious materials (fly ash, slag, silica fume, and metakaolin) and local aggregates at three different w/cm ratios: 0.40, 0.35, and 0.30. Basic properties as well as shrinkage and cracking properties were evaluated. In addition to comparing HPC performance, commonly measured parameters such as strength, shrinkage, and modulus of elasticity were correlated with the cracking onset obtained from ring tests. Finally, field data from crack-free, well-performing bridges were used to define a safe threshold. This concept can be used to design HPC mixtures that reduce cracking potential from a materials point of view in other locations.
29.
Journal of Materials Science - We have reported a novel route to develop highly conductive graphene sheets using camphor as a natural precursor followed by nitrogen doping via low-temperature...
30.
Data broadcasting is an efficient method to disseminate information to a large group of requesters with common interests. Performing such broadcasts typically involves determining a broadcast schedule that maximizes the quality of service provided by the broadcast system. Earlier studies have proposed solutions to this problem in the form of heuristics and local search techniques designed to achieve minimal deadline misses or maximal utility. An often-ignored factor in these studies is that data items may not be available locally and must instead be fetched from data servers distributed over a network, inducing a certain level of stochasticity in the actual time required to serve a data item. This stochasticity arises because the data servers themselves dynamically manage the data requests they serve. In this paper we revisit the problem of real-time data broadcasting under such a scenario. We investigate the efficiency of heuristics that embed the stochastic nature of the problem in their design and compare their performance with heuristics proposed for non-stochastic broadcast scheduling. Further, we extend our analysis to understand the various factors in the problem structure that influence these heuristics and that a better-performing heuristic often exploits.
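A toy sketch of a stochasticity-aware heuristic of the general kind investigated here (not the paper's specific heuristics): it picks the pending item with the best ratio of still-servable requests to expected fetch time, where the expectation models the stochastic server-side delay:

```python
# Hypothetical greedy broadcast heuristic: rank pending items by
# expected utility per unit of expected (stochastic) fetch time,
# counting only requests whose deadlines the expected fetch would meet.

def pick_next(now, pending, exp_fetch):
    """pending: {item: [deadline, ...]}; exp_fetch: {item: E[fetch time]}."""
    best, best_score = None, -1.0
    for item, deadlines in pending.items():
        done = now + exp_fetch[item]
        servable = sum(1 for d in deadlines if d >= done)
        score = servable / exp_fetch[item]  # utility per expected second
        if score > best_score:
            best, best_score = item, score
    return best

pending = {"a": [5.0, 9.0], "b": [20.0, 21.0, 22.0]}
exp_fetch = {"a": 2.0, "b": 6.0}           # expectations of fetch delay
print(pick_next(0.0, pending, exp_fetch))  # "a": 2/2 = 1.0 beats 3/6 = 0.5
```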