A total of 159 results were found (search time: 15 ms).
51.
The objective of this research was to characterize the performance of granulated activated carbon (GAC) as a carrier for Pseudomonas ADP in a non-sterile continuous fluidized bed reactor for atrazine degradation under anoxic conditions. The GAC was compared with two non-adsorbing carriers: non-adsorbing carbon particles (‘Baker product’) having the same surface area available for biofilm growth as the GAC, and sintered glass beads. The initial atrazine degradation efficiency was higher than 90% in the reactors with the non-adsorbing carriers, but deteriorated to 20% over time owing to contamination by foreign denitrifying bacteria. In contrast, no deterioration was observed in the biological granulated activated carbon (BGAC) reactor. Maximal volumetric and specific atrazine degradation rates of 0.820 ± 0.052 g atrazine dm⁻³ day⁻¹ and 1.7 ± 0.4 g atrazine g⁻¹ protein day⁻¹, respectively, were observed in the BGAC reactor. Concurrent atrazine biodegradation and desorption from the carrier were demonstrated, and an effluent concentration of 0.002 mg dm⁻³ (below the EPA standard) was achieved in the BGAC reactor. The advantages of the BGAC reactor over the non-adsorbing carrier reactors can probably be explained by the adsorption–desorption mechanism providing favorable microenvironmental conditions for atrazine-degrading bacteria. Copyright © 2004 Society of Chemical Industry
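As a quick consistency check, the sketch below back-calculates the biofilm protein concentration implied by the two reported maxima, using the standard relation specific rate = volumetric rate / protein concentration; the variable names, and the assumption that both maxima were measured under the same conditions, are ours rather than the authors'.

```python
# Back-of-the-envelope check relating the reported degradation rates.
# specific_rate (g atrazine / g protein / day)
#     = volumetric_rate (g atrazine / dm^3 / day) / protein_conc (g protein / dm^3)
volumetric_rate = 0.820   # g atrazine dm^-3 day^-1 (reported maximum)
specific_rate = 1.7       # g atrazine g^-1 protein day^-1 (reported maximum)

# Implied biofilm protein concentration in the reactor (hypothetical derived value,
# valid only if both maxima refer to the same operating conditions):
protein_conc = volumetric_rate / specific_rate
print(f"implied protein concentration ≈ {protein_conc:.2f} g protein dm^-3")
```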
52.
Deadlock detection is an important service that the run-time system of a parallel environment should provide. In parallel programs, deadlock can occur when the different processes are waiting for various events, as opposed to concurrent systems, where deadlock occurs when processes wait for resources held by other processes. Classical deadlock detection techniques, such as checking for cycles in the wait-for graph, are therefore inapplicable. An alternative algorithm that checks whether all the processes are blocked is presented. This algorithm deals with situations in which the state transition from blocked to unblocked is indirect, as may happen when busy-waiting is used.
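A minimal sketch of the "all processes blocked" criterion is given below. It is our illustration, not the paper's algorithm: the State enum, the Proc record with a progress counter, and the double-snapshot check used to tolerate indirect (busy-waiting) transitions are all assumptions introduced for the example.

```python
from dataclasses import dataclass
from enum import Enum, auto

class State(Enum):
    RUNNING = auto()
    BLOCKED = auto()      # waiting on an event (message, barrier, ...)
    BUSY_WAIT = auto()    # spinning on a flag; may become unblocked indirectly

@dataclass
class Proc:
    pid: int
    state: State
    progress: int  # counts useful work completed; does not advance while spinning

def all_blocked(snapshot_a, snapshot_b):
    """Declare global deadlock only if every process is blocked or spinning,
    and none made progress, in two consecutive snapshots.  The double check
    guards against a process that was unblocked indirectly between scans."""
    for a, b in zip(snapshot_a, snapshot_b):
        if a.state is State.RUNNING or b.state is State.RUNNING:
            return False
        if b.progress > a.progress:
            return False  # useful work happened between scans: not deadlocked
    return True

# Tiny usage example with two frozen processes:
snap1 = [Proc(0, State.BLOCKED, 5), Proc(1, State.BUSY_WAIT, 9)]
snap2 = [Proc(0, State.BLOCKED, 5), Proc(1, State.BUSY_WAIT, 9)]
print(all_blocked(snap1, snap2))  # True -> report deadlock
```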
53.
In this paper we examine the issue of tool management in a flexible manufacturing cell. The type of system considered here is typical of mechanical manufacturing, in which large metallic parts are loaded on the machines and are not moved until processing completion. The architecture of the cell is characterized by the absence of on-board tool magazines on the machines. Although this permits the continuous maintenance and inspection of the tools and typically results in cost and workspace savings, it calls for more complex tool handling procedures. We present a heuristic to address the overall problem of assigning parts to machines, sequencing parts on each machine, and synchronizing tool movements. The results indicate that our method provides near-optimal solutions in terms of makespan and mean flow time. Further, we observe that the solution procedure is at least one order of magnitude faster than the approach currently used and also results in a much better mean flow time.
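For illustration only, the sketch below shows a generic longest-processing-time-first assignment of parts to machines to reduce makespan; it is not the heuristic of the paper and ignores part sequencing and tool-movement synchronization entirely.

```python
def greedy_assign(part_times, n_machines):
    """Longest-processing-time-first assignment of parts to identical machines.
    A generic makespan heuristic, not the cell-specific procedure in the paper:
    tool movements and sequencing constraints are ignored."""
    loads = [0.0] * n_machines
    schedule = [[] for _ in range(n_machines)]
    # Consider the longest parts first, always placing on the least-loaded machine.
    for part, t in sorted(enumerate(part_times), key=lambda x: -x[1]):
        m = min(range(n_machines), key=loads.__getitem__)
        loads[m] += t
        schedule[m].append(part)
    return schedule, max(loads)  # assignment and resulting makespan

schedule, makespan = greedy_assign([7, 3, 5, 9, 2, 6], n_machines=2)
print(schedule, makespan)
```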
54.
Traffic offences present danger to the offender and to others. This study examines differences in decision making and personality between traffic offenders and non-offenders. Fifty-one traffic offenders participating in penalty courses were compared to a control group of 36 drivers who had not been penalized for traffic offences in the 5 years prior to the study. All participants performed the Iowa Gambling Task (IGT), a popular decision task employed for assessing cognitive impulsivity, and completed the "big five" personality questionnaire. The results showed that traffic offenders made fewer advantageous choices on the IGT, and an analysis with a formal cognitive model, the Expectancy Valence (EV) model, suggests that this results from offenders' high weighting of gains compared to losses. An examination of personality factors reveals that traffic offenders were more extraverted. The predictive power of IGT performance was comparable to that of the personality factor. These results demonstrate that the IGT can be useful for studying individual differences in risk taking in a real-world task and, combined with the EV model, can identify the sources of these differences.
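The sketch below illustrates the core of the Expectancy Valence model as it is commonly described: a gain/loss valence fed into a delta-rule expectancy update, with softmax choice probabilities over the decks. The parameter values and the toy deck are invented for illustration and are not fitted to this study's data.

```python
import math
import random

def ev_choice_probs(expectancies, sensitivity):
    """Softmax choice probabilities over the four IGT decks."""
    exps = [math.exp(sensitivity * e) for e in expectancies]
    z = sum(exps)
    return [x / z for x in exps]

def ev_update(expectancy, win, loss, attention_to_gains, learning_rate):
    """One Expectancy Valence update for the chosen deck: the valence weighs
    gains against losses, and the delta rule moves the deck's expectancy
    toward that valence."""
    valence = attention_to_gains * win - (1 - attention_to_gains) * loss
    return expectancy + learning_rate * (valence - expectancy)

# Toy illustration: a gain-focused chooser (attention_to_gains = 0.9, mimicking
# the high weighting of gains attributed to offenders) keeps a high expectancy
# for a "bad" deck that pays 100 but occasionally loses 1250.
E = 0.0
for trial in range(50):
    loss = 1250 if random.random() < 0.1 else 0
    E = ev_update(E, win=100, loss=loss, attention_to_gains=0.9, learning_rate=0.2)

probs = ev_choice_probs([E, 0.0, 0.0, 0.0], sensitivity=0.05)
print(f"expectancy for the disadvantageous deck after 50 draws: {E:.1f}")
print(f"probability of choosing it again: {probs[0]:.2f}")
```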
55.
This paper describes a new and efficient method for low bit-rate image coding which is based on recent developments in the theory of multivariate nonlinear piecewise polynomial approximation. It combines a binary space partition scheme with geometric wavelet (GW) tree approximation so as to efficiently capture curve singularities and provide a sparse representation of the image. The GW method successfully competes with state-of-the-art wavelet methods such as the EZW, SPIHT, and EBCOT algorithms. We report a gain of about 0.4 dB over the SPIHT and EBCOT algorithms at a bit rate of 0.0625 bits per pixel (bpp). It also outperforms other recent methods that are based on "sparse geometric representation"; for example, we report a gain of 0.27 dB over the Bandelets algorithm at 0.1 bpp. Although the algorithm is computationally intensive, its time complexity can be significantly reduced by collecting a "global" GW n-term approximation to the image from a collection of GW trees, each constructed separately over tiles of the image.
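As a rough illustration of the tree-construction idea, the sketch below recursively partitions an image while a split reduces the fitting error. It uses only axis-aligned splits and piecewise-constant fits, whereas the paper's geometric wavelets use general binary space partitions with polynomial fits, so this is a toy stand-in rather than the GW algorithm; the function names and the synthetic test image are ours.

```python
import numpy as np

def fit_error(block):
    """Squared error of the best constant fit (the block mean)."""
    return float(((block - block.mean()) ** 2).sum()) if block.size else 0.0

def best_split(block):
    """Try every axis-aligned split and return (error, axis, position) of the
    best one; returns axis=None when no split strictly reduces the error."""
    best = (fit_error(block), None, None)
    for axis in (0, 1):
        for pos in range(1, block.shape[axis]):
            a, b = np.split(block, [pos], axis=axis)
            err = fit_error(a) + fit_error(b)
            if err < best[0]:
                best = (err, axis, pos)
    return best

def build_tree(block, depth, max_depth=3):
    """Recursively split while a split reduces error and the depth budget allows."""
    err, axis, pos = best_split(block)
    if axis is None or depth >= max_depth:
        return {"mean": float(block.mean()), "error": fit_error(block)}
    a, b = np.split(block, [pos], axis=axis)
    return {"axis": axis, "pos": pos,
            "children": [build_tree(a, depth + 1, max_depth),
                         build_tree(b, depth + 1, max_depth)]}

# Usage on a tiny synthetic image containing a diagonal edge:
img = np.fromfunction(lambda y, x: (x + y > 8).astype(float), (8, 8))
tree = build_tree(img, depth=0)
print(tree["axis"], tree["pos"])
```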
56.
Injectable hydrogels are often preferred when designing carriers for cell therapy or developing new bio-ink formulations. Biosynthetic hydrogels, which are a class of materials made with a hybrid design strategy, can be advantageous for endowing injectability while maintaining the biological activity of the material. The chemical modification required to make these gels injectable by specific crosslinking pathways can be challenging and can also make the hydrogels inhospitable to cells. Therefore, most efforts to functionalize biosynthetic hydrogel precursors toward injectability in the presence of cells try to balance chemical and biological functionality, in order to preserve cell compatibility while addressing the injectability design challenges. Accordingly, hydrogel crosslinking strategies have evolved to include the use of photoinitiated "click" chemistry or bio-orthogonal reactions with the rapid gelation kinetics and minimal cytotoxicity required when working with cell-compatible hydrogel systems. With many new injectable biosynthetic materials emerging, their impact in cell-based regenerative medicine and bioprinting is also becoming more apparent. This review covers the main strategies used to endow biosynthetic polymers with injectability through rapid, cytocompatible physical or covalent crosslinking, and the main considerations for using the resulting injectable hydrogels in cell therapy, tissue regeneration, and bioprinting.
57.
We apply well-known quality engineering matrix techniques such as quality function deployment; Teoriya Resheniya Izobretatelskikh Zadach (TRIZ); and failure mode, effects, and criticality analysis for characterizing, mapping, and preventing human error (or, at least, reducing the damage caused by errors). Human errors ('WHATs', in the language of quality function deployment) are classified according to 10 characteristics, while 20 typical types of protective layers ('HOWs') in quality assurance systems are proposed for preventing, stopping, or at least minimizing the damage caused by an error. During the analysis of a specific system, each error is rated according to its likelihood and severity, and every protective layer receives a score according to its effectiveness in preventing errors. Synergy or antagonism between protective layers may also be taken into account when calculating the effectiveness. The approach facilitates evaluation and comparison of the effectiveness of different quality assurance systems dealing with human errors. The authors emphasize the need to create a 'recipe book' based on a historical database, which will enable, after characterizing the potential human errors according to the 10 criteria mentioned earlier, application of the optimal prevention efforts. The proposed approach is illustrated by an example of product delivery error analysis. Copyright © 2015 John Wiley & Sons, Ltd.
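A hypothetical scoring sketch follows, showing how an error's likelihood × severity can be combined with protective-layer effectiveness into a residual risk figure; the errors, layers, scales, and numbers are invented for illustration and do not come from the article.

```python
# Illustrative scoring only: the criteria, scales, and values below are hypothetical.
errors = {
    # error: (likelihood 1-10, severity 1-10)
    "wrong delivery address entered": (6, 7),
    "wrong product picked":           (4, 5),
}
layers = {
    # protective layer: {error: effectiveness 0-1 (fraction of damage prevented)}
    "barcode scan at picking": {"wrong product picked": 0.8},
    "address auto-validation": {"wrong delivery address entered": 0.7},
}

for error, (likelihood, severity) in errors.items():
    raw_risk = likelihood * severity
    # Combine layers multiplicatively; a synergy/antagonism factor could adjust this.
    remaining = 1.0
    for layer, coverage in layers.items():
        remaining *= 1.0 - coverage.get(error, 0.0)
    print(f"{error}: raw risk {raw_risk}, residual risk {raw_risk * remaining:.1f}")
```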
58.
The performance of modern microprocessors is greatly affected by cache behavior, instruction scheduling, register allocation and loop overhead. High-level loop transformations such as fission, fusion, tiling, interchanging and outer loop unrolling (e.g., unroll and jam) are well known to be capable of improving all these aspects of performance. Difficulties arise because these machine characteristics and these optimizations are highly interdependent. Interchanging two loops might, for example, improve cache behavior but make it impossible to allocate registers in the inner loop. Similarly, unrolling or interchanging a loop might individually hurt performance but doing both simultaneously might help performance. Little work has been published on how to combine these transformations into an efficient and effective compiler algorithm. In this paper, we present a model that estimates total machine cycle time taking into account cache misses, software pipelining, register pressure and loop overhead. We then develop an algorithm to intelligently search through the various possible transformations, using our machine model to select the set of transformations leading to the best overall performance. We have implemented this algorithm as part of the MIPSPro commercial compiler system. We give experimental results showing that our approach is both effective and efficient in optimizing numerical programs.
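The sketch below shows the overall shape of such a search: enumerate a small space of transformation choices and keep the variant with the lowest estimated cycle count. The cost model here is a toy with invented constants, not the MIPSPro machine model described in the paper.

```python
from itertools import product

def estimated_cycles(unroll, interchange, tile):
    """Toy stand-in for a machine model: penalize cache misses, loop overhead,
    and register pressure.  The constants are invented for illustration and
    bear no relation to the MIPSPro model."""
    cache_miss_cost = 100 if (tile is None and not interchange) else 40
    loop_overhead = 200 // unroll
    register_pressure = 8 * unroll + (16 if tile else 0)
    spill_cost = max(0, register_pressure - 32) * 12  # spills once registers run out
    return cache_miss_cost + loop_overhead + spill_cost

def search_transformations():
    """Enumerate the (small) transformation space and keep the cheapest variant."""
    space = product([1, 2, 4, 8], [False, True], [None, 32, 64])
    return min(space, key=lambda cfg: estimated_cycles(*cfg))

best = search_transformations()
print("unroll=%d interchange=%s tile=%s -> %d cycles"
      % (best[0], best[1], best[2], estimated_cycles(*best)))
```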
59.
This article examines the problem of uncertainty and its effects upon the decision-making processes of top-level politicians. I use the Middle East Peace Process as a case to explore the handling of uncertainty in statecraft, and present a concept of statecraft as prudent risk taking, that is, carefully contemplated decisions taken with full awareness of the dangers involved as well as the opportunities. I conclude by offering a set of professional approaches that may be used to improve statecraft as prudent risk taking.
60.
For single-period inventory models with normally distributed, correlated individual demands, we examine the problem of minimizing the cost of inventory centralization as a function of the covariance matrix. In a stable centralized setting there are no incentives for any party to break away; this is referred to as the nonempty core condition. For the allocated benefits in inventory centralization, nonempty core conditions are always satisfied. In this paper we discuss a step-by-step greedy optimization procedure which computes an optimal centralization solution. The procedure manipulates the correlations without changing the mean or the variance at each store. We do not merely accept that the parties are better off in the centralized setting; for the first time, we provide an analysis of how to maximize their collective benefits.
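A small sketch of the quantity at stake: with individual demands pooled, the standard deviation of total demand is sqrt(1'Σ1), and in the single-period (newsvendor) setting the centralized expected cost scales with it. The covariance numbers and the single correlation adjustment below are hypothetical and only hint at the kind of greedy step the paper describes.

```python
import numpy as np

def pooled_std(sigmas, corr):
    """Standard deviation of total demand, sqrt(1' Sigma 1), where Sigma is built
    from the individual standard deviations and the correlation matrix.  In the
    newsvendor model the centralized expected cost is proportional to this, so
    reducing it reduces the cost to be allocated among the stores."""
    cov = np.outer(sigmas, sigmas) * corr
    ones = np.ones(len(sigmas))
    return float(np.sqrt(ones @ cov @ ones))

sigmas = np.array([10.0, 12.0, 8.0])
corr_high = np.array([[1.0, 0.6, 0.5],
                      [0.6, 1.0, 0.4],
                      [0.5, 0.4, 1.0]])

# One illustrative "greedy" move: lower a single pairwise correlation while the
# mean and variance at each store stay fixed (hypothetical numbers).
corr_low = corr_high.copy()
corr_low[0, 1] = corr_low[1, 0] = -0.2

print(pooled_std(sigmas, corr_high), pooled_std(sigmas, corr_low))
```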