Fee-based full text: 43 articles
Free: 0 articles
Industrial technology: 43 articles
2018: 1 article
2015: 2 articles
2014: 1 article
2013: 1 article
2012: 2 articles
2011: 2 articles
2010: 2 articles
2007: 1 article
2006: 5 articles
2005: 3 articles
2004: 3 articles
2003: 2 articles
2002: 2 articles
1998: 2 articles
1997: 2 articles
1996: 3 articles
1993: 1 article
1991: 1 article
1989: 1 article
1985: 1 article
1977: 1 article
1972: 1 article
1971: 1 article
1968: 2 articles
Sort order: 43 query results in total; search time 500 ms
1.
This paper presents the first efficient approximation algorithms for the single-machine weighted flow-time minimization problem in which jobs have different release dates, weights equal to their processing times, and one job is fixed (i.e., the machine is unavailable during a fixed interval corresponding to the fixed job). Our work is motivated by an algorithmic application to the generation of valid inequalities in a branch-and-cut method. Our analysis shows that the trivial FIFO sequence can lead to an arbitrarily large worst-case performance ratio. Hence, we modify this sequence so that a 2-approximation solution is obtained for every instance, and we prove that this bound is tight. Then, we propose a fully polynomial-time approximation algorithm with efficient running time for the considered problem; in particular, its complexity is strongly polynomial.
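A minimal sketch of the setting (not the paper's modified sequence or its approximation scheme; the unavailability interval [s, e) and the job encoding as (release, processing) pairs are assumptions for illustration): it evaluates the plain FIFO-by-release-date baseline around a fixed unavailability interval, the sequence whose worst-case ratio the abstract states can be arbitrarily large.

```python
# Hypothetical illustration: total weighted flow time of a FIFO (by release date)
# schedule on one machine that is unavailable during the fixed interval [s, e).
# Weights equal processing times, as in the problem described above.

def fifo_weighted_flow_time(jobs, s, e):
    """jobs: list of (release, processing); returns sum of p_j * (C_j - r_j)."""
    total = 0.0
    t = 0.0
    for r, p in sorted(jobs, key=lambda j: j[0]):  # FIFO: order by release date
        t = max(t, r)
        if t < e and t + p > s:        # job would overlap the fixed interval:
            t = e                      # start it only after the machine returns
        completion = t + p
        total += p * (completion - r)  # weight w_j = p_j
        t = completion
    return total

if __name__ == "__main__":
    jobs = [(0, 4), (1, 2), (5, 3)]    # assumed (release, processing) values
    print(fifo_weighted_flow_time(jobs, s=3, e=6))
```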
2.
H. Kellerer. Computing, 1991, 46(3): 183-191
The well-known, NP-complete problem of scheduling a set of n independent jobs nonpreemptively on m identical parallel processors to minimize the maximum finish time is considered. Let ω0 be the finish time of an optimal schedule and ω the finish time of a schedule found by the Longest Processing Time (LPT) heuristic. We will improve the Graham bound for the LPT heuristic, ω/ω0 ≤ 4/3 − 1/(3m), which is tight in general, by considering only jobs with similar processing times.
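For reference, a short sketch of the LPT heuristic discussed above, run on the standard tight instance for m = 4 machines (the instance is an illustration, not taken from the paper): LPT's makespan is 15 against an optimum of 12, exactly matching Graham's ratio 4/3 − 1/(3m) = 5/4.

```python
# Sketch of the LPT (Longest Processing Time) heuristic and Graham's ratio bound.
import heapq

def lpt_makespan(processing_times, m):
    """Assign jobs in non-increasing order to the currently least loaded machine."""
    loads = [0.0] * m
    heapq.heapify(loads)
    for p in sorted(processing_times, reverse=True):
        least = heapq.heappop(loads)
        heapq.heappush(loads, least + p)
    return max(loads)

if __name__ == "__main__":
    jobs, m = [7, 7, 6, 6, 5, 5, 4, 4, 4], 4
    # Optimal makespan here is 12 (7+5, 7+5, 6+6, 4+4+4); LPT yields 15,
    # so the ratio 15/12 = 1.25 meets Graham's bound 4/3 - 1/(3m) = 1.25 exactly.
    print(lpt_makespan(jobs, m), 4 / 3 - 1 / (3 * m))
```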
3.
Sir, Response to ‘Are all photon radiations similar in large absorbers?—A comparison of electron spectra’ by A. M. Kellerer and H. Roos. When the ICRP adopted a quality factor—and subsequently a radiation weighting factor—that gives equal weight to different photon radiations, it did not necessarily imply that equal
4.
5.
Telecommunications technologies are undergoing a major paradigm shift. Standards-based, off-the-shelf components and the Internet are gaining wide acceptance. The success of this move is strongly dependent upon the quality and availability of these technologies.

Practical quality assurance in this environment can take advantage of the tools and methods developed when carrier-grade systems for the telecommunications market were being deployed. Besides standard test methods, availability-related methods for redundant hardware and software components are applied. Statistics are available that prove the success of this approach. The statistical data are derived from the deployment of the commercial product RTP4 Continuous Services, a standards-based high-availability middleware.

Additional momentum has been gained in the Service Availability Forum (www.saforum.org), where the interface standards are validated and certified in independent test processes.
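As a generic illustration of the availability arithmetic behind such redundant configurations (the MTBF/MTTR figures below are assumed, not measurements from RTP4 Continuous Services):

```python
# Generic illustration (assumed figures): steady-state availability of a single
# unit and of an active/standby redundant pair.

def availability(mtbf_h, mttr_h):
    """Steady-state availability from mean time between failures and mean time to repair."""
    return mtbf_h / (mtbf_h + mttr_h)

def redundant_pair(a_single):
    # The service is down only if both units are down at once
    # (independent failures assumed, failover time neglected).
    return 1 - (1 - a_single) ** 2

if __name__ == "__main__":
    a = availability(mtbf_h=5000, mttr_h=4)   # hypothetical unit: about 99.92 %
    print(a, redundant_pair(a))               # the pair reaches roughly six nines
```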
6.
Childhood leukemia (ICD 204-208 [1]) incidence rates in the different regions of Belarus are reported for a period before and after the Chernobyl accident (1982-1994). There are, at this point, no recognizable trends towards higher rates.
7.
Conventional X rays, i.e. X rays produced at generating voltages between roughly 150 and 300 kV, are used in many radio-diagnostic procedures and also in radiobiological experiments. They release less energetic and, therefore, more densely ionising electrons than the high-energy gamma rays from 60Co or from the A bombs. Accordingly, they are considered to be somewhat more effective, especially at low doses. Various radiobiological studies, especially studies on chromosome aberrations, have confirmed this assumption, but epidemiological investigations, e.g. the comparison of the excess relative risk for mammary cancer in X-ray-exposed patients and in the gamma-ray-exposed A bomb survivors, have not demonstrated a similar difference. In view of the missing epidemiological evidence, and largely for reasons of practicality in radiation protection, the ICRP has recommended a radiation weighting factor of unity for all photon radiations. However, in the discussion preceding the 2005 Recommendations of the ICRP, the issue has remained controversial. In a recent paper, Harder et al. argue, with reference to an assessment by the German Radiation Protection Commission (SSK), that the use of the same weighting factor for different photon energies can be justified more directly. For high-energy incident photons, they present the degraded photon spectra at different depths in a phantom, and they conclude that much of the difference between high-energy gamma rays and conventional X rays disappears in a large phantom. The present assessment, which is more direct, compares the spectra of electrons released (through pair production, the Compton effect and the photoelectric effect) in a small and in a very large receptor for incident photons of 150 keV, 1 MeV and 6 MeV. For the 1 MeV and 6 MeV photons, there is a substantial shift towards smaller electron energies in the large receptor, but the electron spectra remain much harder than those from the 150 keV incident photons. Furthermore, it is seen, in agreement with earlier conclusions by Straume, that for the broad gamma-ray spectrum from the A bombs there is no shift at all to lower energies within the body, but rather some degree of hardening of the radiation. The assumption that distinct differences between high-energy gamma rays and conventional X rays are restricted to small samples must thus be rejected. The attribution of the same effective quality factor or radiation weighting factor to all photon energies remains, therefore, an issue based on considerations beyond dosimetry.
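A back-of-the-envelope check using standard Compton kinematics (not taken from the paper, and covering only the Compton channel) illustrates why higher-energy photons release harder electrons: the maximum energy an incident photon of energy E can transfer to a Compton electron is

```latex
% Compton edge: maximum electron energy for an incident photon of energy E,
% with m_e c^2 = 0.511 MeV
\[
  T_{\max} = \frac{2E^{2}}{m_e c^{2} + 2E}
\]
% E = 150 keV  ->  T_max ~ 55 keV
% E = 1 MeV    ->  T_max ~ 0.80 MeV
% E = 6 MeV    ->  T_max ~ 5.76 MeV
```

So the electrons set in motion by 150 keV X rays are confined to far lower energies than those from 1 MeV or 6 MeV photons, consistent with the electron spectra compared above.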
8.
The length changes caused by oxidation in air of Ti-6Al-4V were investigated at temperatures between 800 °C and 1040 °C. In 3.1 mm-thick specimens a 60 min exposure at 950 °C results in a net expansion of 0.7 pct. If oxidation and the corresponding expansion are restricted to one surface of a sheet-metal specimen, a bimetallic-strip effect is obtained and the specimens deform into the shape of an arc. Several mechanisms can contribute to deformation during oxidation. The increase of the “c” lattice parameter with increasing oxygen content accounts for most of the observed volume expansion. Because oxygen stabilizes α, the surface layers contain a higher-than-equilibrium α concentration. The higher thermal expansion of α and its larger volume per unit cell cause additional deformation. These mechanisms apparently can introduce surface stresses of up to several kg per sq mm, which result in extensive creep deformation.
9.
Algorithms for multiprocessor scheduling with machine release times   Total citations: 6; self-citations: 0; citations by others: 0
In this paper we present algorithms for the problem of scheduling n independent jobs on m identical machines. As a generalization of the classical multiprocessor scheduling problem, each machine is available only from a machine-dependent release time onwards. Two objective functions are considered. To minimize the makespan, we develop a dual approximation algorithm with a worst-case bound of 5/4. For the problem of maximizing the minimum completion time, we develop an algorithm such that the minimum completion time in the schedule it produces is at least 2/3 of the minimum completion time in an optimal schedule. The paper closes with some numerical results.
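To make the problem setting concrete, the following sketch runs a simple greedy (LPT-style) list scheduler that respects machine release times; it is an illustration only, not the paper's 5/4 dual approximation algorithm or its max-min counterpart, and the instance values are assumed.

```python
# Illustration only: LPT-style greedy for makespan with machine-dependent
# release times (NOT the paper's 5/4 dual approximation algorithm).
import heapq

def greedy_makespan(jobs, machine_release):
    """jobs: processing times; machine_release: time each machine becomes available."""
    # Heap of (time the machine becomes free, machine index)
    heap = [(r, i) for i, r in enumerate(machine_release)]
    heapq.heapify(heap)
    makespan = 0
    for p in sorted(jobs, reverse=True):      # longest processing time first
        free_at, i = heapq.heappop(heap)      # earliest-available machine
        finish = free_at + p
        makespan = max(makespan, finish)
        heapq.heappush(heap, (finish, i))
    return makespan

if __name__ == "__main__":
    print(greedy_makespan(jobs=[5, 4, 3, 3, 2], machine_release=[0, 2]))
```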
10.
Coordinated Multi-Point (CoMP) transmission and reception is a promising solution for managing interference and increasing performance in future wireless cellular systems. Due to its strict requirements in terms of capacity, latency, and synchronization among cooperating Base Stations (BSs), its successful deployment depends on the capability of the mobile backhaul network infrastructure.

We deal with the feasibility of CoMP transmission/reception, in particular of Joint Transmission (JT). For this, we first evaluate which cluster sizes are reasonable from the wireless point of view to achieve the desired performance gains. Thereafter, we analyze how different backhaul topologies (e.g., mesh and tree structures) and backhaul network technologies (e.g., layer-2 switching and single-copy multicast capabilities) can support these desired clusters. We study, for different traffic scenarios and backhaul connectivity levels, which parts of the desired BS clusters are actually feasible given the backhaul characteristics. We found that a significant mismatch exists between the desired and feasible clusters. Neglecting this mismatch causes overhead in real JT implementations, which complicates or even prevents their deployment.

Based on our findings, we propose a clustering system architecture that not only includes wireless information, as done in the state of the art, but also combines wireless and backhaul network feasibility information in a smart way. This avoids unnecessary signaling and User Equipment (UE) data exchange among BSs that are not eligible to take part in the cooperative cluster. Evaluations show that our scheme reduces the signaling and UE data exchange overhead by up to 85% compared to conventional clustering approaches, which do not take the backhaul network’s status into account.
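A minimal sketch of the backhaul-aware clustering idea (BS names, the latency budget, and the pairwise-latency model are assumptions for illustration, not the paper's system design): the cluster actually used for JT keeps only those BSs whose backhaul connection to the anchor BS meets the budget, so no signaling or UE data is exchanged with ineligible BSs.

```python
# Hypothetical sketch of backhaul-aware CoMP clustering: start from the cluster the
# wireless side would like (e.g., strongest interferers) and keep only those BSs
# whose backhaul path to the anchor BS satisfies the latency budget.

def feasible_cluster(desired_cluster, anchor, backhaul_latency_ms, max_latency_ms=1.0):
    """backhaul_latency_ms[(a, b)]: one-way latency between BS a and BS b."""
    cluster = {anchor}
    for bs in desired_cluster:
        if bs == anchor:
            continue
        latency = backhaul_latency_ms.get((anchor, bs), float("inf"))
        if latency <= max_latency_ms:   # backhaul can support joint transmission
            cluster.add(bs)             # otherwise the BS is left out to avoid
    return cluster                      # wasted signaling and UE data exchange

if __name__ == "__main__":
    desired = {"BS1", "BS2", "BS3"}
    latencies = {("BS1", "BS2"): 0.4, ("BS1", "BS3"): 2.5}  # assumed values
    print(feasible_cluster(desired, anchor="BS1", backhaul_latency_ms=latencies))
```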