101.
OBJECTIVE: We examined the effects of aprotinin on graft patency, prevalence of myocardial infarction, and blood loss in patients undergoing primary coronary surgery with cardiopulmonary bypass. METHODS: Patients from 13 international sites were randomized to receive intraoperative aprotinin (n = 436) or placebo (n = 434). Graft angiography was obtained a mean of 10.8 days after the operation. Electrocardiograms, cardiac enzymes, and blood loss and replacement were evaluated. RESULTS: In 796 assessable patients, aprotinin reduced thoracic drainage volume by 43% (P < .0001) and the requirement for red blood cell administration by 49% (P < .0001). Among 703 patients with assessable saphenous vein grafts, occlusions occurred in 15.4% of aprotinin-treated patients and 10.9% of patients receiving placebo (P = .03). After adjustment for risk factors associated with vein graft occlusion, the aprotinin-versus-placebo risk ratio decreased from 1.7 to 1.05 (90% confidence interval, 0.6 to 1.8). These factors included female gender, lack of prior aspirin therapy, small size and poor quality of the distal vessel, and possibly the use of aprotinin-treated blood as excised-vein perfusate. At United States sites, patients had characteristics more favorable for graft patency, and occlusions occurred in 9.4% of the aprotinin group and 9.5% of the placebo group (P = .72). At Danish and Israeli sites, where patients had more adverse characteristics, occlusions occurred in 23.0% of aprotinin-treated and 12.4% of placebo-treated patients (P = .01). Aprotinin did not affect the occurrence of myocardial infarction (aprotinin: 2.9%; placebo: 3.8%) or mortality (aprotinin: 1.4%; placebo: 1.6%). CONCLUSIONS: In this study, the probability of early vein graft occlusion was increased by aprotinin, but this outcome was promoted by multiple risk factors for graft occlusion.
102.
103.
Peer-to-Peer (P2P) networks have attracted significant interest because of their capacity for resource sharing and content distribution. Content distribution applications allow personal computers to function in a coordinated manner as a distributed storage medium by contributing, searching for, and obtaining digital content. Searching in unstructured P2P networks is an important problem that has received considerable research attention. An acceptable search technique must provide a high coverage rate, low traffic load, and low latency. This paper reviews flooding-based search techniques in unstructured P2P networks and analytically compares their coverage rates and traffic overheads. Our simulation experiments validate the analytical results.
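The basic mechanism such surveys analyze can be sketched as a TTL-bounded flood. The sketch below is illustrative only (the toy overlay graph, TTL value, and message counter are our own assumptions, not the paper's simulation setup); it reports the two quantities flooding variants are compared on, coverage and traffic:

```python
from collections import deque

def flood_search(graph, source, ttl):
    """Flood a query from `source` with a time-to-live (TTL).

    Returns (covered, messages): the set of peers reached and the
    number of query messages sent. For simplicity a peer forwards
    to all neighbors, including the one it heard the query from.
    """
    covered = {source}
    messages = 0
    frontier = deque([(source, ttl)])
    while frontier:
        node, hops = frontier.popleft()
        if hops == 0:
            continue
        for neigh in graph[node]:
            messages += 1            # every forwarded copy counts as traffic
            if neigh not in covered:
                covered.add(neigh)
                frontier.append((neigh, hops - 1))
    return covered, messages

# Toy unstructured overlay: a ring of six peers plus one chord (0-3)
overlay = {
    0: [1, 5, 3], 1: [0, 2], 2: [1, 3],
    3: [2, 4, 0], 4: [3, 5], 5: [4, 0],
}
covered, messages = flood_search(overlay, source=0, ttl=2)
print(len(covered), messages)   # prints: 6 10
```

Raising the TTL increases coverage but the message count grows with every duplicate forward, which is exactly the coverage-versus-traffic trade-off flooding variants try to balance.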
104.

The volleyball premier league (VPL) algorithm, which simulates phenomena of the volleyball game, was presented recently. This powerful algorithm models the competition and interaction between teams over a season, and also imitates the coaching process during a match. Volleyball metaphors, including substitution, coaching, and learning, are used by VPL to move toward better solutions. However, the learning phase has the largest effect on the performance of the VPL algorithm, and this phase can cause VPL to become trapped in a local optimum. This paper therefore proposes a modified VPL that uses the sine cosine algorithm (SCA): the SCA operators are applied in the learning phase to obtain more accurate solutions. By incorporating the SCA operators into VPL, the resulting approach inherits their advantages, finds optimal solutions more efficiently, and avoids the limitations of the traditional VPL algorithm. The proposed VPLSCA algorithm is tested on 25 benchmark functions. The results obtained by VPLSCA are compared with those of other metaheuristic algorithms, including cuckoo search, the social-spider optimization algorithm, the ant lion optimizer, the grey wolf optimizer, the salp swarm algorithm, the whale optimization algorithm, moth flame optimization, the artificial bee colony, SCA, and VPL. Furthermore, three typical engineering design optimization problems are solved using VPLSCA. The obtained results show that the proposed algorithm is very reasonable and promising compared with the others.

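For reference, the standard SCA position update that such a learning phase borrows has the form X ← X + r1·sin(r2)·|r3·P − X| (or the cosine branch), where P is the best solution so far and r1 decays over time. The sketch below is a minimal, generic SCA loop on a toy sphere function, not the VPLSCA implementation; all parameter values are illustrative:

```python
import math, random

def sca_step(pop, best, t, t_max, a=2.0, rng=random):
    """One sine cosine algorithm (SCA) update of a real-valued population.

    r1 decays linearly with iteration t, shifting the search from
    exploration toward exploitation around the destination `best`.
    """
    r1 = a - t * (a / t_max)
    new_pop = []
    for x in pop:
        new_x = []
        for xi, pi in zip(x, best):
            r2 = rng.uniform(0.0, 2.0 * math.pi)
            r3 = rng.uniform(0.0, 2.0)
            step = r1 * (math.sin(r2) if rng.random() < 0.5 else math.cos(r2))
            new_x.append(xi + step * abs(r3 * pi - xi))
        new_pop.append(new_x)
    return new_pop

# Toy run: plain SCA minimizing the 2-D sphere function
sphere = lambda x: sum(v * v for v in x)
rng = random.Random(0)
pop = [[rng.uniform(-5.0, 5.0) for _ in range(2)] for _ in range(20)]
best = min(pop, key=sphere)
init_f = sphere(best)
for t in range(200):
    pop = sca_step(pop, best, t, 200, rng=rng)
    cand = min(pop, key=sphere)
    if sphere(cand) < sphere(best):
        best = cand
print(sphere(best) <= init_f)   # best is only ever replaced on improvement
```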
105.

Ultra-high-performance concrete (UHPC) is a recent class of concrete with improved rheological, mechanical and durability properties compared to traditional concrete. The production cost of UHPC is considerably high due to the large amount of cement used and the high price of other required constituents such as quartz powder, silica fume, fibres and superplasticisers. To achieve specific requirements such as the desired production cost, strength and flowability, the proportions of UHPC's constituents must be well adjusted. The traditional mixture design of concrete requires a cumbersome, costly and extensive experimental programme. Therefore, mathematical optimisation, design of experiments (DOE) and statistical mixture design (SMD) methods have been used in recent years, particularly for meeting multiple objectives. In traditional methods, simple regression models such as multiple linear regression are used as objective functions according to the requirements. Once the model is constructed, mathematical programming and simplex algorithms are usually used to find optimal solutions. However, a more flexible procedure is required, one enabling the use of high-accuracy nonlinear models and the definition of different scenarios for multi-objective mixture design, particularly when the data are not well enough structured to fit simple regression models such as multiple linear regression. This paper demonstrates a procedure that integrates machine learning (ML) algorithms, such as Artificial Neural Networks (ANNs) and Gaussian Process Regression (GPR), to develop high-accuracy models, with a metaheuristic optimisation algorithm, Particle Swarm Optimisation (PSO), for the multi-objective mixture design and optimisation of UHPC reinforced with steel fibres. A reliable experimental dataset is used to develop the models and to justify the final results.
The comparison of the obtained results with the experimental results validates the capability of the proposed procedure for the multi-objective mixture design and optimisation of steel-fibre-reinforced UHPC. The proposed procedure not only reduces the effort in the experimental design of UHPC but also leads to optimal mixtures when the designer faces strength-flowability-cost trade-offs.

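A minimal PSO loop of the kind used in such an optimisation stage can be sketched as follows. The two-variable quadratic "cost" function stands in for a fitted ANN/GPR surrogate and is purely hypothetical, as are the bounds and swarm parameters:

```python
import random

def pso_minimize(f, bounds, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimisation over box constraints.

    `f` stands in for a fitted ML surrogate that scores a candidate
    mixture; `bounds` is a list of (lo, hi) pairs per variable.
    """
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f

# Hypothetical 2-variable "mixture cost" surrogate, optimum at (0.3, 0.7)
cost = lambda x: (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2
best, best_f = pso_minimize(cost, bounds=[(0.0, 1.0), (0.0, 1.0)])
print(best_f < 1e-3)
```

For a true multi-objective design (cost, strength, flowability), the scalar `f` would be replaced by a weighted combination of surrogates or a Pareto-based variant of PSO.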
106.
Recently, medical image compression has become essential to effectively handle large amounts of medical data for storage and communication purposes. Vector quantization (VQ) is a popular image compression technique, and the commonly used VQ model is Linde-Buzo-Gray (LBG), which constructs a locally optimal codebook to compress images. Codebook construction can be treated as an optimization problem and solved with a bio-inspired algorithm. This article proposes a VQ codebook construction approach called the L2-LBG method, utilizing the Lion optimization algorithm (LOA) and the Lempel-Ziv-Markov chain algorithm (LZMA). Once LOA has constructed the codebook, LZMA is applied to compress the index table and further increase the compression performance. A set of experiments was carried out on benchmark medical images, and a comparative analysis was conducted with Cuckoo Search-based LBG (CS-LBG), Firefly-based LBG (FF-LBG) and JPEG2000. The compression efficiency of the presented model was validated in terms of compression ratio (CR), compression factor (CF), bit rate, and peak signal-to-noise ratio (PSNR). The proposed L2-LBG method obtained a higher CR of 0.3425375 and a PSNR value of 52.62459 compared to the CS-LBG, FF-LBG, and JPEG2000 methods. The experimental values revealed that L2-LBG yields effective compression performance with a better-quality reconstructed image.
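The LBG baseline that these bio-inspired variants try to improve can be sketched as the classic splitting-plus-Lloyd procedure below. The 2-D toy blocks are illustrative (real inputs would be image blocks); in L2-LBG the refinement would instead be driven by LOA, and the resulting index table would then be compressed with LZMA:

```python
def dist2(a, b):
    """Squared Euclidean distance between two vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def lbg_codebook(vectors, k, iters=20, eps=0.01):
    """Classic LBG: grow the codebook by splitting, refine with Lloyd steps.

    The codebook doubles at each split, so k should be a power of two
    in this sketch. Each Lloyd step assigns vectors to their nearest
    codeword, then moves each codeword to its cell's centroid.
    """
    codebook = [[sum(col) / len(vectors) for col in zip(*vectors)]]
    while len(codebook) < k:
        # split each codeword into a perturbed +/- pair
        codebook = [[c * (1 + s) for c in cw] for cw in codebook for s in (eps, -eps)]
        for _ in range(iters):
            cells = [[] for _ in codebook]
            for v in vectors:
                j = min(range(len(codebook)), key=lambda c: dist2(v, codebook[c]))
                cells[j].append(v)
            codebook = [[sum(col) / len(cell) for col in zip(*cell)] if cell else cw
                        for cw, cell in zip(codebook, cells)]
    return codebook

def encode(vectors, codebook):
    """Index table: nearest-codeword index per input block."""
    return [min(range(len(codebook)), key=lambda c: dist2(v, codebook[c]))
            for v in vectors]

# Two well-separated clusters of 2-D "image blocks"
blocks = [[0.0, 0.1], [0.1, 0.0], [0.05, 0.05],
          [0.9, 1.0], [1.0, 0.9], [0.95, 0.95]]
cb = lbg_codebook(blocks, k=2)
idx = encode(blocks, cb)
print(idx)   # prints: [1, 1, 1, 0, 0, 0] -- one codeword per cluster
```

The index table `idx` is what a second-stage entropy coder (LZMA in L2-LBG) would compress.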
107.
The Journal of Supercomputing - Wireless sensor networks (WSNs) are typically deployed in unattended, often very hostile environments. A certain level of security must therefore be provided....
108.
Neural Computing and Applications - Security is one of the primary concerns when designing wireless networks. Along with detecting user identity, it is also important to detect the devices at the...
109.
Neural Computing and Applications - This paper presents an adaptive fuzzy fault-tolerant tracking control for a class of unknown multi-variable nonlinear systems, with external disturbances,...
110.

In the era of Industry 4.0, easy access to precise real-time measurements and the availability of machine-learning (ML) techniques will play a vital role in building practical tools to isolate inefficiencies in energy-intensive processes. This paper develops an abnormal event diagnosis (AED) tool based on ML techniques for monitoring the operation of industrial processes. The tool makes it easier for operators to accomplish their tasks and to make quick, accurate decisions that keep processes highly efficient. One of the most popular ML techniques for AED is the multivariate statistical control (MSC) method; it requires only a dataset of normal operating conditions (NOC) to detect abnormal events (AEs) and identify the variables that contribute to them. Despite the popularity of MSC, it is challenging to select an appropriate method for detecting and isolating all possible abnormalities that a complex industrial process can experience. To address this limitation and improve efficiency, we have developed a generic methodology that integrates different ML techniques into a unified multiagent-based approach, where each selected ML technique is built using only normal operating conditions. For the sake of demonstration, we chose a combination of two ML methods: principal component analysis and k-nearest neighbors (k-NN). The k-NN method was integrated into the proposed multiagent approach to account for the nonlinearity and multimodality that frequently occur in industrial processes. In addition, we modified a k-NN method proposed in the literature to reduce computation time during real-time detection and isolation. Finally, the proposed methodology was successfully validated for monitoring the energy efficiency of a reboiler located in a thermomechanical pulp mill.

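A minimal version of the k-NN detection idea, trained on NOC data only, scores a sample by its distance to the k-th nearest NOC sample and flags it when the score exceeds a threshold calibrated on the NOC set. The data, the choice of k, and the threshold rule below are illustrative assumptions, not the modified k-NN from the paper:

```python
import math

def knn_scores(train, points, k=3):
    """Anomaly score: distance to the k-th nearest training (NOC) sample."""
    out = []
    for p in points:
        d = sorted(math.dist(p, t) for t in train)
        out.append(d[k - 1])
    return out

# NOC data: a tight grid around (0, 0); calibrate the threshold on it.
noc = [(0.1 * i, 0.1 * j) for i in range(-2, 3) for j in range(-2, 3)]
# k=4 because each NOC point's nearest neighbor in `noc` is itself (distance 0)
threshold = max(knn_scores(noc, noc, k=4))
is_abnormal = lambda p: knn_scores(noc, [p], k=4)[0] > threshold
print(is_abnormal((0.0, 0.1)), is_abnormal((3.0, 3.0)))   # prints: False True
```

In the multiagent setting, a score like this from the k-NN agent would sit alongside the PCA agent's statistics, each flagging the abnormal events it is best suited to detect.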