41.

The main purpose of this work is to develop a spectrally accurate collocation method for solving weakly singular integral equations of the second kind with nonsmooth solutions in high dimensions. The proposed spectral collocation method is based on a multivariate Jacobi approximation in the frequency space. The essential idea is to adopt a smoothing transformation in the spectral collocation method to circumvent the singularity of the solution at the initial time, so that the singularity of the numerical approximation can be tailored to that of the exact solution. A rigorous convergence analysis is provided and confirmed by numerical tests with nonsmooth solutions in two dimensions. The results in this paper appear to constitute the first spectral approach with a theoretical justification for high-dimensional nonlinear weakly singular Volterra-type equations with nonsmooth solutions.
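The abstract does not state the transformation explicitly; a minimal numerical sketch of the idea, assuming a model solution u(t) = √t with the typical square-root singularity at t = 0 and a hypothetical smoothing map t = τ², is:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def u(t):
    # model nonsmooth solution with a square-root singularity at t = 0
    return np.sqrt(t)

# Chebyshev points mapped from [-1, 1] to [0, 1]
t = (np.cos(np.linspace(0.0, np.pi, 17)) + 1.0) / 2.0

# Direct polynomial approximation of the singular u: accuracy is limited.
c_direct = C.chebfit(2.0 * t - 1.0, u(t), 16)

# Smoothing map t = tau**2: the composed function v(tau) = u(tau**2) = tau
# is smooth, so its polynomial approximation is essentially exact.
tau = np.sqrt(t)
c_smooth = C.chebfit(2.0 * tau - 1.0, u(tau**2), 16)

grid = np.linspace(0.0, 1.0, 1001)
err_direct = np.max(np.abs(C.chebval(2.0 * grid - 1.0, c_direct) - u(grid)))
err_smooth = np.max(np.abs(C.chebval(2.0 * np.sqrt(grid) - 1.0, c_smooth) - u(grid)))
```

After the change of variable, the approximation error drops to near machine precision, which is the mechanism behind recovering spectral accuracy for nonsmooth solutions.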

42.
Medical image compression has recently become essential for handling large amounts of medical data for storage and communication. Vector quantization (VQ) is a popular image compression technique, and the commonly used VQ model is Linde–Buzo–Gray (LBG), which constructs a locally optimal codebook to compress images. Codebook construction is treated as an optimization problem and solved with a bioinspired algorithm. This article proposes a VQ codebook construction approach called the L2-LBG method, utilizing the Lion optimization algorithm (LOA) and the Lempel–Ziv–Markov chain algorithm (LZMA). Once LOA has constructed the codebook, LZMA is applied to compress the index table and further increase the compression performance. A set of experiments was carried out on benchmark medical images, and a comparative analysis was conducted against Cuckoo Search-based LBG (CS-LBG), Firefly-based LBG (FF-LBG) and JPEG2000. The compression efficiency of the presented model was validated in terms of compression ratio (CR), compression factor (CF), bit rate, and peak signal-to-noise ratio (PSNR). The proposed L2-LBG method obtained a higher CR of 0.3425375 and a PSNR of 52.62459 compared to the CS-LBG, FF-LBG, and JPEG2000 methods. The experimental values revealed that L2-LBG yields effective compression performance with a better-quality reconstructed image.
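For context, a minimal sketch of the classical LBG (generalized Lloyd) iteration that the paper builds on — without the LOA search or the LZMA index-table compression, which are the paper's contributions — might look like this (names are illustrative):

```python
import numpy as np

def lbg_codebook(vectors, codebook_size, iters=20, seed=0):
    """Classical LBG / generalized Lloyd iteration: alternate nearest-codeword
    assignment and centroid updates until the codebook stabilizes."""
    vectors = np.asarray(vectors, dtype=float)
    rng = np.random.default_rng(seed)
    # initialize the codebook with randomly chosen training vectors
    codebook = vectors[rng.choice(len(vectors), codebook_size, replace=False)]
    labels = np.zeros(len(vectors), dtype=int)
    for _ in range(iters):
        # assign each training vector to its nearest codeword (squared L2)
        d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # move each codeword to the centroid of its cell
        for k in range(codebook_size):
            members = vectors[labels == k]
            if len(members):
                codebook[k] = members.mean(axis=0)
    return codebook, labels
```

A bioinspired optimizer such as LOA would replace the random initialization and local centroid updates with a global search over candidate codebooks, which is what lets it escape the locally optimal codebooks that plain LBG converges to.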
43.
Geologists interpret seismic data to understand subsurface properties and subsequently to locate underground hydrocarbon resources. Channels are among the most important geological features interpreters analyze to locate petroleum reservoirs. However, manual channel picking is both time-consuming and tedious. Moreover, like any other process dependent on human intervention, manual channel picking is error-prone and inconsistent. To address these issues, automatic channel detection is both necessary and important for efficient and accurate seismic interpretation. Modern systems make use of real-time image processing techniques for a variety of tasks; automatic channel detection combines several mathematical methods from digital image processing to identify streaks within the images, called channels, that are important to oil companies. In this paper, we propose an automatic channel detection algorithm based on machine learning techniques. The new algorithm identifies channels in seismic data/images fully automatically and substantially increases the efficiency and accuracy of the interpretation process. The algorithm uses a deep neural network to train a classifier on both channel and non-channel patches. We provide a field data example to demonstrate the performance of the new algorithm. The training phase gave a maximum accuracy of 84.6% for the classifier, and it performed even better in the testing phase, giving a maximum accuracy of 90%.
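The abstract does not detail the network architecture; as an illustrative stand-in for the patch-classification setup (not the paper's actual deep network), a logistic classifier trained on synthetic 8×8 "channel" patches containing a bright streak can be sketched as:

```python
import numpy as np

def make_patches(n, rng):
    """Synthetic stand-in data: 'channel' patches contain a bright streak;
    'non-channel' patches are pure noise. Purely illustrative."""
    X = rng.normal(0.0, 1.0, (n, 8, 8))
    y = rng.integers(0, 2, n)
    for i in range(n):
        if y[i]:
            X[i, rng.integers(0, 8), :] += 4.0  # horizontal streak
    return X.reshape(n, -1), y

def train_logistic(X, y, lr=0.1, steps=300):
    """Minimal logistic patch classifier (a deep network would replace this)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probabilities
        g = p - y                                # gradient of cross-entropy
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

rng = np.random.default_rng(0)
X, y = make_patches(400, rng)
w, b = train_logistic(X, y)
acc = (((X @ w + b) > 0).astype(int) == y).mean()
```

The paper's deep network plays the role of `train_logistic` here, learning streak-sensitive features directly from labeled channel and non-channel patches instead of relying on a linear decision rule.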
44.
The Journal of Supercomputing - The Internet of Things is a rapidly evolving technology in which interconnected computing devices and sensors share data over the network to decipher different...
45.
Neural Computing and Applications - Preserving red-chili quality is of utmost importance in which the authorities demand quality techniques to detect, classify, and prevent it from impurities. For...
46.
One important aspect of achieving better performance in transient stability assessment (TSA) of power systems using computational intelligence (CI) techniques is the incorporation of feature reduction. For a small power system the number of features may be small, but it grows as the size of the system increases. Apart from employing faster CI techniques for fast and accurate TSA, feature reduction is needed to reduce the input features while preserving the necessary information, so as to speed up training of the CI technique. This paper presents the feature reduction techniques used, namely correlation analysis and principal component analysis, to reduce the number of input features presented to two CI techniques for TSA: the probabilistic neural network (PNN) and least squares support vector machines (LS-SVM). The proposed feature reduction techniques are implemented and tested on the IEEE 39-bus test system and Malaysia's 87-bus power system. Numerical results demonstrate the performance of the feature reduction techniques and their effects on the accuracy and training time of the two CI techniques.
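As a minimal sketch of the PCA half of the feature-reduction step (the correlation analysis and the PNN/LS-SVM classifiers are omitted; the data and names are illustrative):

```python
import numpy as np

def pca_reduce(X, k):
    """Project samples onto the top-k principal components via SVD of the
    centered data matrix; also return the variance fraction each explains."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = (s ** 2) / (s ** 2).sum()
    return Xc @ Vt[:k].T, explained[:k]

# Synthetic stand-in for TSA features: 30 measured features driven by
# 3 latent factors, so 3 components capture almost all the variance.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 3))
mix = rng.normal(size=(3, 30))
X = latent @ mix + 0.01 * rng.normal(size=(200, 30))

Z, ev = pca_reduce(X, 3)   # reduced features fed to the classifier
```

The classifier then trains on `Z` (here 3 columns) instead of the full 30-dimensional feature vector, which is what shortens training time while preserving the information needed for accurate TSA.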
47.
In a world in which millions of people express their opinions about commercial products in blogs, wikis, forums, chats and social networks, the distillation of knowledge from this huge amount of unstructured information can be a key factor for marketers who want to create an image or identity for their product, brand or organization in the minds of their customers. Opinion mining for product positioning is, in fact, becoming an increasingly popular research field, but the extraction of useful information from social media is not a simple task. In this work we merge AI and Semantic Web techniques to extract, encode and represent this unstructured information. In particular, we use Sentic Computing, a multi-disciplinary approach to opinion mining and sentiment analysis, to semantically and affectively analyze text and encode the results in a semantics-aware format according to different web ontologies. Finally, we represent this information as an interconnected knowledge base that is browsable through a multi-faceted classification website.
48.
Autonomous mapping of HL7 RIM and relational database schema
Healthcare systems need to share information within and across organizational boundaries in order to provide better care to patients. For this purpose, they take advantage of the current state of the art in healthcare standards, which provide interoperable solutions. The HL7 V3 specification is an international message exchange and interoperability standard. HL7 V3 messages exchanged between healthcare applications are ultimately recorded in local healthcare databases, mostly relational databases. To bring these relational databases into compliance with HL7, mappings between the HL7 RIM (Reference Information Model) and the relational database schema are required. Currently, RIM-to-database mapping is largely performed manually, making it a tedious, time-consuming, error-prone and expensive process. Determining all correspondences between the RIM and a schema automatically is challenging because of the extreme heterogeneity of healthcare databases. To reduce the manual effort as much as possible, autonomous mapping approaches are required. This paper proposes a technique that addresses this mapping issue and aligns healthcare databases with the HL7 V3 RIM specification. The proposed technique has been implemented as a working application and tested on real-world healthcare systems. The application loads the target healthcare schema and then identifies the most appropriate match for tables and their associated fields by using domain knowledge and the matching rules defined in a Mapping Knowledge Repository. These rules are designed to handle the complexity of the semantics found in healthcare databases. A GUI allows users to view and edit/re-map the correspondences. Once all the mappings are defined, the application generates a Mapping Specification, which contains all the mapping information, i.e. database tables and fields with their associated RIM classes and attributes.
To enable transactions, the application also generates code autonomously from the Mapping Specification. The Code Generator component focuses primarily on generating custom classes and Hibernate mapping files against the runtime system to retrieve and parse the data from the data source, thus allowing bi-directional HL7-to-database communication with minimal programming. Our experimental results show 35–65% accuracy on real laboratory systems, demonstrating the promise of the approach. The proposed scheme is an effective step toward bringing clinical databases into compliance with the RIM, providing ease and flexibility.
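The abstract describes rule-based matching against a Mapping Knowledge Repository; a highly simplified sketch of the idea (the rule table and attribute names below are hypothetical illustrations, not actual HL7 RIM content) might be:

```python
import difflib

# Hypothetical rule table: candidate RIM attribute -> column-name synonyms
# that a Mapping Knowledge Repository might record for local schemas.
RULES = {
    "Patient.name": ["pat_name", "patient_nm", "name"],
    "Patient.birthTime": ["dob", "birth_date", "date_of_birth"],
}

def match_column(column, rules=RULES, cutoff=0.6):
    """Return the best-matching RIM attribute for a database column name:
    exact synonym rules first, fuzzy string similarity as a fallback."""
    col = column.lower()
    # rule-based exact match
    for rim_attr, synonyms in rules.items():
        if col in synonyms:
            return rim_attr
    # fuzzy fallback over synonyms and the attribute's own short name
    best, score = None, cutoff
    for rim_attr, synonyms in rules.items():
        for cand in synonyms + [rim_attr.split(".")[-1].lower()]:
            r = difflib.SequenceMatcher(None, col, cand).ratio()
            if r > score:
                best, score = rim_attr, r
    return best
```

The real system's rules also use domain knowledge about tables, data types and relationships; string similarity alone, as above, is only the simplest ingredient of such a matcher.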
49.
A series of (Cd1−xZnx)S powder phosphors and thin layers prepared by thermal evaporation of the solid solution were studied. The phosphor used was 41% ZnS : 59% CdS with a cobalt concentration from 0 to 0.325%. Analysis of the structure of films of different thicknesses using X-ray diffraction confirms that the calculated relative intensities of the planes differ considerably from the experimental results. For film thicknesses of 70 nm the a axis is normal to the substrate, while at greater thicknesses (273 nm) the c axis is practically normal to the substrate. The effect of the electron beam on the solid solution indicates that the layers decompose, leaving the grain boundaries decorated with metallic cadmium and zinc particles.
50.
The productivity of agricultural produce depends heavily on the availability of nutrients and their efficient use. Magnesium (Mg2+) is an essential macronutrient of living cells and the second most prevalent free divalent cation in plants. Mg2+ plays a role in several physiological processes that support plant growth and development. However, it has been largely neglected in fertilization management strategies to increase crop production, which leads to severe reductions in plant growth and yield. In this review, we discuss how Mg2+ shortage induces responses in plants at the morphological, physiological, biochemical and molecular levels. Additionally, the Mg2+ uptake and transport mechanisms in different cellular organelles and the role of Mg2+ transporters in regulating Mg2+ homeostasis are discussed. Overall, we critically summarize the available information about the effects of Mg2+ deficiency on plant growth and development, which should help plant scientists create Mg2+-deficiency-resilient crops through agronomic and genetic biofortification.