A total of 125 results were found.
1.
The cost of maintaining a software system over a long period of time far exceeds its initial development cost. Much of the maintenance cost is attributed to the time required by new developers to understand legacy systems. High-level structural information helps maintainers navigate through the numerous low-level components and relations present in the source code. Modularization tools can be used to produce subsystem decompositions from the source code but do not typically produce high-level architectural relations between the newly found subsystems. Controlling subsystem interactions is one important way in which the overall complexity of software maintenance can be reduced. We have developed a tool, called ARIS (Architecture Relation Inference System), that enables software engineers to define rules and relations for regulating subsystem interactions. These rules and relations are called Interconnection Styles and are defined using a visual notation. The style definition is used by our tool to infer subsystem-level relations in designs being reverse engineered from source code. In this paper we describe our tool and its underlying techniques and algorithms. Using a case study, we describe how ARIS is used to reverse engineer high-level structural information from a real application.
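The abstract gives no code, but the core idea, lifting module-level relations recovered from source code to subsystem-level relations and checking them against architect-defined interconnection rules, can be sketched as follows. All names, the rule format, and the data are hypothetical illustrations, not ARIS's actual (visual) notation.

```python
# Hypothetical sketch of rule-based subsystem relation checking, in the
# spirit of ARIS (names and rule format are illustrative, not the tool's own).

# Low-level relations recovered from source code: (caller module, callee module)
module_relations = [
    ("parser/lexer.c", "util/string.c"),
    ("gui/window.c", "db/query.c"),      # suspicious: GUI talks to storage directly
]

# Mapping of modules to subsystems (e.g., produced by a modularization tool)
subsystem_of = {
    "parser/lexer.c": "Parsing",
    "util/string.c": "Utilities",
    "gui/window.c": "GUI",
    "db/query.c": "Storage",
}

# Architect-defined interconnection style: allowed subsystem-level relations
allowed = {("Parsing", "Utilities"), ("GUI", "Application"), ("Application", "Storage")}

def infer_subsystem_relations(relations, subsystem_of):
    """Lift module-level relations to subsystem-level relations."""
    return {(subsystem_of[a], subsystem_of[b]) for a, b in relations
            if subsystem_of[a] != subsystem_of[b]}

def check_style(inferred, allowed):
    """Report subsystem relations that violate the interconnection style."""
    return sorted(inferred - allowed)

inferred = infer_subsystem_relations(module_relations, subsystem_of)
for src, dst in check_style(inferred, allowed):
    print(f"Style violation: {src} -> {dst} is not an allowed interconnection")
```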
2.
Extraction-Transformation-Loading (ETL) tools are pieces of software responsible for the extraction of data from several sources, their cleansing, customization and insertion into a data warehouse. Literature and personal experience have led us to conclude that the problems concerning ETL tools are primarily problems of complexity, usability and price. To deal with these problems we provide a uniform metamodel for ETL processes, covering the aspects of data warehouse architecture, activity modeling, contingency treatment and quality management. The ETL tool we have developed is capable of modeling and executing practical ETL scenarios by providing explicit primitives for the capturing of common tasks. The tool provides three ways to describe an ETL scenario: a graphical point-and-click front end and two declarative languages: XADL (an XML variant), which is more verbose and easy to read, and SADL (an SQL-like language), which has a quite compact syntax and is thus easier for authoring.
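As a rough illustration of what "explicit primitives for common tasks" means in an ETL scenario, here is a minimal extract-cleanse-load sketch in Python. It is not XADL, SADL, or the authors' tool; all table, field, and function names are invented for the example.

```python
# Minimal, generic ETL scenario expressed as explicit primitives
# (extract -> cleanse/transform -> load). Hypothetical names throughout.

import sqlite3

def extract(rows):
    """Source records as they arrive from an operational system."""
    yield from rows

def cleanse(records):
    """Drop records with a missing key and normalize a text field."""
    for r in records:
        if r.get("customer_id") is not None:
            r["name"] = r["name"].strip().title()
            yield r

def load(records, conn):
    """Insert the transformed records into a warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS dim_customer (customer_id INTEGER, name TEXT)")
    conn.executemany("INSERT INTO dim_customer VALUES (:customer_id, :name)", list(records))
    conn.commit()

source = [{"customer_id": 1, "name": "  alice  "}, {"customer_id": None, "name": "bob"}]
conn = sqlite3.connect(":memory:")
load(cleanse(extract(source)), conn)
print(conn.execute("SELECT * FROM dim_customer").fetchall())  # [(1, 'Alice')]
```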
3.
Estimating the average throughput and packet transmission delay for the worst-case scenario (cell-edge users) is crucial for LTE cell planners in order to preserve strict QoS for delay-sensitive applications. Cell planning techniques emphasize mostly cell range (coverage) and throughput predictions, but not delay. Cell-edge users mostly suffer from throughput reduction due to poor coverage and, consequently, unexpected uplink transmission delays. To estimate cell-edge throughput, a common practice in the international literature is the use of simulation results. However, simulations are never fully accurate, since the MAC scheduler is a vendor-specific software implementation that is not explicitly specified by 3GPP. This paper skips simulations and proposes an analytical estimation of IP transmission delay and average throughput using mathematical modeling based on probabilistic delay analysis, thus offering cell planners a useful tool for the analytical estimation of uplink average IP transmission delay.
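The paper's analytical model is not reproduced in the abstract, so the following sketch only illustrates the general shape of such an estimate, using a simple M/M/1 queueing approximation as a stand-in for the probabilistic delay analysis; all parameter values are hypothetical.

```python
# Illustrative analytical estimate of uplink IP packet delay for a cell-edge
# user. The paper's exact model is not reproduced here; this sketch uses a
# plain M/M/1 queueing approximation with hypothetical numbers.

def edge_uplink_delay_ms(edge_throughput_bps, mean_packet_bits, offered_load_pps):
    """Mean sojourn time (queueing + transmission) of an uplink IP packet, in ms."""
    service_rate_pps = edge_throughput_bps / mean_packet_bits   # packets served per second
    utilization = offered_load_pps / service_rate_pps
    if utilization >= 1.0:
        raise ValueError("offered load exceeds cell-edge capacity")
    # M/M/1 mean delay: 1 / (mu - lambda)
    return 1000.0 / (service_rate_pps - offered_load_pps)

# Example: 2 Mbit/s cell-edge uplink throughput, 12 000-bit packets, 100 packets/s
print(f"{edge_uplink_delay_ms(2e6, 12_000, 100):.2f} ms")   # about 15 ms
```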
4.
The microbiological and physicochemical changes of industrially fermented Halkidiki and Conservolea green table olives were determined. Samples were analysed to monitor the population of lactic acid bacteria (LAB), yeasts and Enterobacteriaceae, together with changes in pH, acidity, salinity, colour, lactic acid, acetic acid and ethanol. LAB and yeast species diversity was evaluated at the beginning (1 day), middle (75 days) and final (135 days) stages of fermentation by RAPD-PCR genomic fingerprinting. Results revealed vigorous lactic acid processes as indicated by the dominance of LAB over yeasts. No Enterobacteriaceae could be detected after 30 days. Lactiplantibacillus plantarum (formerly Lactobacillus plantarum) dominated in the beginning of fermentation in both varieties. In the end, Lactiplantibacillus pentosus (formerly Lactobacillus pentosus) and Pediococcus ethanolidurans prevailed in Halkidiki and Conservolea varieties, respectively. As for yeasts, Kluyveromyces lactis/marxianus and Pichia manshurica prevailed at the onset of fermentation in Halkidiki and Conservolea varieties, whereas in the end Pichia membranifaciens dominated in both varieties.
5.
A sediment column study was carried out to demonstrate the bioremediation of chloroethene- and nickel-contaminated sediment in a single anaerobic step under sulfate-reducing conditions. Four columns (one untreated control column and three experimental columns) with sediment from a chloroethene- and nickel-contaminated site were investigated for 1 year applying different treatments. By stimulating the activity of sulfate-reducing bacteria by the addition of sulfate as supplementary electron acceptor, complex anaerobic communities were maintained with lactate as electron donor (with or without methanol), which achieved complete dehalogenation of tetra- and trichloroethenes (PCE and TCE) to ethene and ethane. A few weeks after sulfate addition, production of sulfide increased, indicating an increasing activity of sulfate-reducing bacteria. The nickel concentration in the effluent of one nickel-spiked column was greatly reduced, likely due to the enhanced sulfide production, causing precipitation of nickel sulfide. At the end of the study, 94% of the initial amount of nickel added to that column was recovered in the sediment. As compared to the untreated (nonspiked) control column, all chloroethene-spiked columns (additions of PCE and TCE) showed a permanent release of small chloride ion quantities (approximately 0.5-0.7 mM chloride), which were detected in the effluents a few weeks after sulfide production was observed for the first time. The formation of ethene and ethane as final products after dechlorination of PCE and TCE was detected in some effluents and in some gas phases of the columns. Other metabolites or intermediates (such as DCE isomers) were only detected sporadically in negligible quantities. The results of this study demonstrated that microbial activity stimulated under sulfate-reducing conditions can have a beneficial effect on both the precipitation of heavy metals and the complete dechlorination of organochlorines. The strongly negative redox potential created by the activity of sulfate-reducing bacteria may be one factor responsible for stimulating the activity of the dehalogenating bacteria in the test columns.
6.
7.
Architect Theoni Xanthi has won first prize in the international architectural competition for the new Cyprus Museum. The competition was held in two stages: 129 entries were received in the first stage, and 7 were shortlisted for the second. The project, costing EUR 75 million, will see the new archaeological museum house all of the important collections of Cyprus's archaeological institutions; at the same time, the museum will serve as a multifunctional cultural facility for Nicosia, the "island of culture", with functions including offices for the Department of Antiquities, conference rooms, a library, a café and restaurant, platforms for cultural events, and an open ground-floor landscaped platform of up to 42,000 square metres. The archaeological museum is located not far from Nicosia's medieval city walls, in the transitional zone between the urban green space and the city centre. Therefore, in response,
8.
An efficient novel strategy for color-based image retrieval is introduced. It is a hybrid approach combining a data compression scheme based on self-organizing neural networks with a nonparametric statistical test for comparing vectorial distributions. First, the color content in each image is summarized by representative RGB-vectors extracted using the Neural-Gas network. The similarity between two images is then assessed as commonality between the corresponding representative color distributions and quantified using the multivariate Wald–Wolfowitz test. Experimental results drawn from the application to a diverse collection of color images show a significantly improved performance (approximately 10–15% higher) relative to both the popular, simplistic approach of color histogram and the sophisticated, computationally demanding technique of Earth Mover’s Distance.
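A rough sketch of the two ingredients named in the abstract is shown below: summarizing an image by representative RGB vectors and comparing two such sets with the multivariate Wald–Wolfowitz (Friedman–Rafsky) test based on a minimum spanning tree. As a simplification, k-means stands in for the Neural-Gas network, and the cluster count and data are illustrative.

```python
# Sketch of the retrieval similarity measure: representative RGB vectors per
# image, compared via the MST-based multivariate Wald-Wolfowitz run count.
# k-means replaces the Neural-Gas network; all parameters are illustrative.

import numpy as np
from scipy.cluster.vq import kmeans2
from scipy.spatial.distance import cdist
from scipy.sparse.csgraph import minimum_spanning_tree

def representative_colors(pixels_rgb, k=16):
    """Return k representative RGB vectors for an image's pixels (N x 3 array)."""
    centroids, _ = kmeans2(pixels_rgb.astype(float), k, minit="++", seed=0)
    return centroids

def ww_runs(a, b):
    """Count MST edges joining points of different samples (fewer => less similar)."""
    pooled = np.vstack([a, b])
    labels = np.r_[np.zeros(len(a)), np.ones(len(b))]
    mst = minimum_spanning_tree(cdist(pooled, pooled)).tocoo()
    return int(np.sum(labels[mst.row] != labels[mst.col]))

rng = np.random.default_rng(0)
img1 = rng.integers(0, 256, size=(5000, 3))                     # random pixels
img2 = np.clip(img1 + rng.normal(0, 10, img1.shape), 0, 255)    # a similar image
print(ww_runs(representative_colors(img1), representative_colors(img2)))
```

In the actual test the run count would be compared against its null distribution to obtain a significance value; the sketch stops at the raw statistic.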
9.
In numerically controlled systems (machine tools, plotters, flamecutters, etc.), interpolation is defined as the process of synthesizing a prescribed curve from a large number of small orthogonal steps. This paper investigates the evolution of interpolation algorithms from the early days of numerical control to the present. The algorithms presented are the most attractive computationally, since they rely on addition and subtraction alone for generating the next step. The DDA method, which dominated the pre-microprocessor era, is still fully competitive when properly implemented on a general purpose computer. Now, however, its use is mostly confined to generating straight lines, where the degradation problem is absent. For higher degree curves, the pattern recognition approach provides superior accuracy.
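To make the "addition and subtraction alone" property concrete, here is a small Bresenham-style line-interpolation sketch that chooses each unit step by maintaining an incremental decision variable; it illustrates the class of algorithms discussed, and is not code from the paper.

```python
# Step-generating line interpolator: the next step is chosen using additions
# and subtractions only (Bresenham-style sketch; names are illustrative).

def line_steps(dx, dy):
    """Yield unit steps ('X' or 'XY') tracing a line from (0,0) to (dx,dy), dx >= dy >= 0."""
    assert dx >= dy >= 0
    error = 2 * dy - dx          # decision variable, updated incrementally
    for _ in range(dx):
        if error > 0:
            yield "XY"           # step in both axes
            error -= 2 * dx
        else:
            yield "X"            # step along the major axis only
        error += 2 * dy

print(list(line_steps(8, 3)))    # ['X', 'XY', 'X', 'X', 'XY', 'X', 'XY', 'X']
```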
10.
Rarely have two independently conceived devices turned out to have so perfectly matching characteristics as the digital computer and the stepping motor. The nature of this basic compatibility, however, must be thoroughly understood if pitfalls are to be avoided in the selection and use of stepping motors in digital control systems. This paper attempts to consolidate in one place enough information to allow intelligent decisions to be made in this regard. Topics covered are the principle of operation of variable reluctance and permanent magnet motors, electrohydraulic stepping motors and the design of low and high performance stepping motor drives.
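As a minimal illustration of how a digital computer commands a stepping motor, the sketch below generates a full-step (wave-drive) phase-excitation sequence for a hypothetical four-phase unipolar motor; the sequence and names are illustrative and not taken from the paper.

```python
# Illustrative full-step (wave-drive) excitation of a hypothetical four-phase
# unipolar stepping motor: one winding energized per step, direction set by
# walking the sequence forwards or backwards.

FULL_STEP_SEQUENCE = [
    (1, 0, 0, 0),
    (0, 1, 0, 0),
    (0, 0, 1, 0),
    (0, 0, 0, 1),
]

def step_pattern(n_steps, direction=+1):
    """Yield the phase-energization pattern for each of n_steps steps."""
    idx = 0
    for _ in range(n_steps):
        yield FULL_STEP_SEQUENCE[idx % 4]
        idx += direction

for pattern in step_pattern(6):
    print(pattern)               # would be written to the motor drive outputs
```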