Similar Documents
20 similar documents found (search time: 31 ms)
1.
Task composition in cloud manufacturing involves the selection of appropriate services from the cloud manufacturing platform and combining them to process the task with the purpose of achieving its expected performance. Calculation methods for achieving the performance expected by customers when the task has two or more composition patterns (e.g. sequential and switching patterns) are necessary because most tasks have multiple composition patterns in cloud manufacturing. Previous studies, however, have focused only on a single composition pattern. In this paper, we regard a task as a directed acyclic graph, and propose graph-based algorithms to obtain the cost, execution time, quality and reliability of a task having multiple composition patterns. In addition, we model the task composition problem by introducing cost and execution time as performance attributes, and quality and reliability as basic attributes in the Kano model. Finally, an experiment comparing the performances of three metaheuristic algorithms (namely, variable neighbourhood search, genetic algorithm, and simulated annealing) is conducted to solve the problem. The experimental results show that the variable neighbourhood search algorithm yields better and more stable solutions than the genetic and simulated annealing algorithms.
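As a rough, hypothetical illustration of the graph-based aggregation the abstract describes, the Python sketch below treats a task as a DAG whose nodes carry (cost, time) attributes; it assumes costs sum over all selected services while execution time follows the longest path (sequential steps add, parallel branches take the maximum). The node names and numbers are made up; this is not the paper's algorithm, which also covers quality, reliability and switching patterns.

```python
# A minimal sketch, assuming cost sums over all selected services and
# execution time follows the longest path through the task DAG.
from graphlib import TopologicalSorter

def aggregate(dag, attrs):
    """dag: {node: set_of_predecessors}; attrs: {node: (cost, time)}."""
    total_cost = sum(c for c, _ in attrs.values())
    finish = {}
    for node in TopologicalSorter(dag).static_order():
        start = max((finish[p] for p in dag.get(node, ())), default=0.0)
        finish[node] = start + attrs[node][1]
    return total_cost, max(finish.values())

# Two parallel branches (b, c) after a, joined by d (hypothetical task).
dag = {"a": set(), "b": {"a"}, "c": {"a"}, "d": {"b", "c"}}
attrs = {"a": (5, 2.0), "b": (3, 4.0), "c": (2, 1.0), "d": (4, 3.0)}
print(aggregate(dag, attrs))  # (14, 9.0): cost sums, time = longest path
```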

2.
This paper focuses on manufacturing environments where job processing times are uncertain. In these settings, scheduling decision makers are exposed to the risk that an optimal schedule with respect to a deterministic or stochastic model will perform poorly when evaluated relative to actual processing times. Since the quality of scheduling decisions is frequently judged as if processing times were known a priori, robust scheduling, i.e., determining a schedule whose performance (compared to the associated optimal schedule) is relatively insensitive to the potential realizations of job processing times, provides a reasonable mechanism for hedging against the prevailing processing time uncertainty. In this paper we focus on a two-machine flow shop environment in which the processing times of jobs are uncertain and the performance measure of interest is system makespan. We present a measure of schedule robustness that explicitly considers the risk of poor system performance over all potential realizations of job processing times. We discuss two alternative frameworks for structuring processing time uncertainty. For each case, we define the robust scheduling problem, establish problem complexity, discuss properties of robust schedules, and develop exact and heuristic solution approaches. Computational results indicate that robust schedules provide effective hedges against processing time uncertainty while maintaining excellent expected makespan performance.
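A minimal sketch of the regret idea: for a two-machine flow shop, evaluate a fixed job sequence against the per-scenario optimal makespan over a discrete set of processing-time scenarios. The brute-force optimum and the toy scenarios are illustrative assumptions (Johnson's rule would give the per-scenario optimum directly); the paper's exact robustness measure may differ.

```python
from itertools import permutations

def makespan(seq, p):  # p[j] = (machine-1 time, machine-2 time)
    t1 = t2 = 0.0
    for j in seq:
        t1 += p[j][0]
        t2 = max(t2, t1) + p[j][1]
    return t2

def worst_case_regret(seq, scenarios):
    """Max regret of `seq` versus the per-scenario optimal sequence
    (brute force here; Johnson's rule would give the optimum directly)."""
    regret = 0.0
    for p in scenarios:
        best = min(makespan(s, p) for s in permutations(p))
        regret = max(regret, makespan(seq, p) - best)
    return regret

# Hypothetical processing-time scenarios for three jobs.
scenarios = [{0: (3, 2), 1: (1, 4), 2: (2, 2)},
             {0: (3, 5), 1: (4, 1), 2: (2, 3)}]
print(worst_case_regret((1, 2, 0), scenarios))
```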

3.
Lead-free Solders in Microelectronics
Practically all microelectronic assemblies in use today utilize Pb–Sn solders for interconnection. With the advent of chip scale packaging technologies, the usage of solder connections has increased. The most widely used Pb–Sn solder has the eutectic composition. Emerging environmental regulations worldwide, most notably in Europe and Japan, have targeted the elimination of Pb usage in electronic assemblies, due to the inherent toxicity of Pb. This has made the search for suitable “Pb-free” solders an important issue for microelectronics assembly. Approximately 70 Pb-free solder alloy compositions have been proposed thus far. There is a general lack of engineering information, and there is also significant disparity in the information available on these alloys. The issues involved can be divided into two broad categories: manufacturing and reliability/performance. A major factor affecting alloy selection is the melting point of the alloy, since this will have a major impact on the other polymeric materials used in microelectronic assembly and encapsulation. Other important manufacturing issues are cost, availability, and wetting characteristics. Reliability related properties include mechanical strength, fatigue resistance, coefficient of thermal expansion and intermetallic compound formation. The data available in the open literature have been reviewed and are summarized in this paper. Where data were not available, such as for corrosion and oxidation resistance, chemical thermodynamics was used to develop this information. While a formal alloy selection decision analysis methodology has not been developed, less formal approaches indicate that Sn-rich alloys will be the Pb-free solder alloys of choice, with three to four alloys being identified for each of the different applications. Research on this topic continues at the present time at a vigorous pace, in view of the imminence of the issue.

4.
Scheduling outpatients and medical operation rooms has the following structure: N users are given appointment times to use a facility, and the duration required by the facility to service each user is stochastic. The system incurs a “user idle cost” if a user arriving at the appointed time finds the facility still engaged by preceding users, while a “facility idle cost” is incurred if the facility becomes free before the next user arrives. We develop an accurate procedure to compute the expected total system costs for any given appointment schedule. Compared to earlier related procedures, ours is much faster and can handle larger problems as well as very general service-time distributions. We then show that this fast computation procedure enables one to determine easily the “lowest-cost appointment schedule” for any given “job” (i.e., “user”) sequence. This in turn will enable one to search for the optimal job sequence that has the best “lowest-cost appointment schedule”.
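The expected-cost evaluation the abstract describes can be illustrated with a crude Monte Carlo stand-in (the paper's procedure is analytical and far faster). The cost coefficients, service-time distribution, and schedule below are made-up inputs.

```python
import random

def expected_cost(appts, sample_service, c_user=1.0, c_fac=2.0, n_rep=20000):
    """Monte Carlo estimate of E[user idle + facility idle cost] for a
    given appointment schedule (a sketch; the paper computes this exactly)."""
    total = 0.0
    for _ in range(n_rep):
        free = 0.0  # time at which the facility becomes free
        cost = 0.0
        for t in appts:
            if free > t:        # user waits for the facility
                cost += c_user * (free - t)
            else:               # facility idles until the user arrives
                cost += c_fac * (t - free)
            free = max(free, t) + sample_service()
        total += cost
    return total / n_rep

# Hypothetical instance: exponential service times with mean 10,
# appointments spaced 12 minutes apart.
print(expected_cost([0, 12, 24, 36], lambda: random.expovariate(1 / 10)))
```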

5.
Many practical problems of quality control involve the use of ordinal scales. Questionnaires planned to collect judgments on qualitative or linguistic scales, whose levels are terms such as “good,” “bad,” “medium,” etc., are extensively used both in evaluating service quality and in visual controls for manufacturing industry. In an ordinal environment, the concept of distance between two generic levels of the same scale is not defined. Therefore, a population (universe) of judgments cannot be described using “traditional” statistical distributions since they are based on the notion of distance. The concept of “distribution shape” cannot be defined as well. In this article, we introduce a new statistical entity, the so-called ordinal distribution, to describe a population of judgments expressed on an ordinal scale. We also discuss which of the traditional location and dispersion measures can be used in this context and we briefly analyze some of their properties. A new dispersion measure, the ordinal range, as an extension of the cardinal range to ordinal scales, is then proposed. A practical application in the field of quality is developed throughout the article.
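A possible reading of the proposed ordinal range, sketched below: the pair of extreme scale levels observed, together with the number of levels spanned. The exact formal definition in the article may differ in detail.

```python
def ordinal_range(judgments, scale):
    """Ordinal analogue of the cardinal range: the lowest and highest scale
    levels observed and the number of levels they span (a sketch; the
    article's formal definition may differ)."""
    rank = {level: i for i, level in enumerate(scale)}
    lo = min(judgments, key=rank.__getitem__)
    hi = max(judgments, key=rank.__getitem__)
    return lo, hi, rank[hi] - rank[lo] + 1  # levels spanned

# Hypothetical five-level linguistic scale and a small sample of judgments.
scale = ["bad", "poor", "medium", "good", "excellent"]
print(ordinal_range(["good", "medium", "good", "poor"], scale))
# ('poor', 'good', 3)
```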

6.
Due to natural or man-made disasters, the evacuation of a whole region or city may become necessary. Apart from private traffic, the emergency services also need to consider transit-dependent evacuees who have to be transported from collection points to secure shelters outside the endangered region with the help of a bus fleet. We consider a simplified version of the arising bus evacuation problem (BEP), which is a vehicle scheduling problem that aims at minimizing the network clearance time, i.e., the time needed until the last person is brought to safety. In this paper, we consider an adjustable robust formulation without recourse for the BEP, the robust bus evacuation problem (RBEP), in which the exact numbers of evacuees are not known in advance. Instead, a set of likely scenarios is known. After some reckoning time, this uncertainty is eliminated and planners are given exact figures. The problem is to decide for each bus whether it is better to send it right away—using uncertain information on the evacuees—or to wait until the scenario becomes known. We present a mixed-integer linear programming formulation for the RBEP and discuss solution approaches; in particular, we present a tabu search framework for finding heuristic solutions of acceptable quality within short computation time. In computational experiments using both randomly generated instances and the real-world scenario of evacuating the city of Kaiserslautern, Germany, we compare our solution approaches.
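The abstract mentions a tabu search framework; below is a generic tabu-search skeleton for orientation only. The move encoding, aspiration rule and toy usage are hypothetical, and the RBEP-specific neighbourhoods are omitted.

```python
def tabu_search(init, neighbours, cost, tenure=7, iters=200):
    """Generic tabu-search skeleton of the kind the paper embeds for the
    RBEP (move definitions and aspiration rules are problem-specific).
    `neighbours(sol)` yields (move_key, new_solution) pairs."""
    best = cur = init
    tabu = {}
    for it in range(iters):
        candidates = [(cost(s), k, s) for k, s in neighbours(cur)
                      if tabu.get(k, -1) < it or cost(s) < cost(best)]
        if not candidates:
            break
        c, k, s = min(candidates)
        tabu[k] = it + tenure   # forbid repeating this move for a while
        cur = s
        if c < cost(best):
            best = s
    return best

# Toy usage: minimise (x - 10)^2 over the integers with moves x -> x +/- 1.
f = lambda x: (x - 10) ** 2
nbrs = lambda x: [((x, x + 1), x + 1), ((x, x - 1), x - 1)]
print(tabu_search(0, nbrs, f))  # walks to 10
```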

7.
The principle of subdomain methods was introduced many years ago [1] and such methods have been used with success in recent years. In this paper, we discuss some methods used to solve the condensed system resulting from the Schur complement method. We present the hybrid method and a mechanical application: the contact problem between two deformable solids. For such problems, we propose two methods: one for contact with friction and one for contact without friction. Finally, the results of validation tests for both proposed approaches are given.
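For readers unfamiliar with the condensation step, here is a generic Schur-complement reduction of a small symmetric system onto interface unknowns, sketched with NumPy. It illustrates only the condensed system S u_b = g, not the paper's hybrid method or the contact formulations.

```python
import numpy as np

def schur_condense(K, f, i, b):
    """Condense interior unknowns (i) onto interface unknowns (b):
    S = K_bb - K_bi K_ii^{-1} K_ib,  g = f_b - K_bi K_ii^{-1} f_i.
    A generic illustration of the condensation step, not the paper's
    hybrid contact solver."""
    Kii, Kib = K[np.ix_(i, i)], K[np.ix_(i, b)]
    Kbi, Kbb = K[np.ix_(b, i)], K[np.ix_(b, b)]
    X = np.linalg.solve(Kii, np.column_stack([Kib, f[i]]))
    S = Kbb - Kbi @ X[:, :-1]
    g = f[b] - Kbi @ X[:, -1]
    return S, g  # solve S u_b = g, then back-substitute for u_i

# Tiny made-up symmetric system with two interior DOFs and one interface DOF.
K = np.array([[4., 1, 0], [1, 3, 1], [0, 1, 2]])
f = np.array([1., 2, 3])
S, g = schur_condense(K, f, i=[0, 1], b=[2])
print(S, g, np.linalg.solve(S, g))  # interface solution u_b
```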

8.
This study addresses the uncapacitated single allocation p-hub median problem (USApHMP), which is known to be NP-hard. The problem is concerned with locating hub facilities in a network and allocating each non-hub node to exactly one hub in order to minimize total transportation costs in the network. A hybrid variable neighborhood search (VNS) algorithm is proposed, with three local search structures used as a combination of nested VNS and sequential VNS. To reduce the dimensions in the nested part, social network analysis centrality measures are used to choose elite points instead of all existing points in the local search structures. The obtained results demonstrate that this not only retains the quality of the solutions but also reduces the run time of the algorithm significantly. Three standard data sets (AP, CAB, and URAND) were used for numerical analysis. Computational results show that the quality of the obtained solutions is good and able to compete with other heuristics addressed in the literature. In terms of execution time, it considerably outperforms all other algorithms. The intelligent search embedded in this algorithm makes it robust and efficient on networks with up to 400 nodes.
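A small sketch of the USApHMP objective being minimised: each unit of flow is routed origin, origin's hub, destination's hub, destination, with the inter-hub leg discounted. The coefficient values follow the common AP-data convention (chi = 3, alpha = 0.75, delta = 2) and are assumptions here, as are the random instance data.

```python
import numpy as np

def usaphmp_cost(d, w, hub_of, alpha=0.75, chi=3.0, delta=2.0):
    """Total flow cost for a single-allocation solution: flow w[i, j] routed
    i -> hub(i) -> hub(j) -> j, with collection (chi), discounted inter-hub
    transfer (alpha) and distribution (delta) coefficients (assumed values
    following the usual AP-data convention)."""
    n = len(d)
    total = 0.0
    for i in range(n):
        for j in range(n):
            hi, hj = hub_of[i], hub_of[j]
            total += w[i, j] * (chi * d[i, hi] + alpha * d[hi, hj] + delta * d[hj, j])
    return total

# Random 6-node instance; nodes 0 and 2 act as hubs (self-allocated).
rng = np.random.default_rng(0)
pts = rng.random((6, 2))
d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
w = rng.random((6, 6))
print(usaphmp_cost(d, w, hub_of=[0, 0, 2, 2, 0, 2]))
```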

9.
We have carried out a study of the particle size distribution and aqueous dissolution rate of two commercially available qualities of orthoboric acid, labeled “crystal” (ABC) and “powder” (ABP). In a previous work, we showed that the two commercial qualities of orthoboric acid chosen as model compound (“powder” and “crystal”) correspond to the same crystal network in spite of their different names. However, these two qualities have very different particle size distributions, as previously determined by sieving and confirmed by the present laser light scattering study. Dissolution testing was performed under sink conditions and shows that the bulk ABC quality dissolves far more rapidly than the bulk ABP quality. For each quality, dissolution rates of four sieved particle size fractions (0-90 μm; 90-125 μm; 125-180 μm; 180-250 μm) were compared. For the ABC quality, comparisons were also made with three other particle size fractions: 250-355 μm, 355-500 μm, and 500-710 μm. This study used the dQ/dt versus t profile. Dissolution profiles of the fractions containing particles larger than 125 μm are very close. On the other hand, fractions containing particles smaller than 90 μm present a different profile and a slower rate of dissolution.

10.
We introduce the following problem in this paper. There are n points on the plane that are to be observed from some point on a circle of given radius that encloses all of the points. We wish to find the observation point that has the best possible view of the n points in the sense that if we draw lines of sight from the observation point to the given points, the smallest angle between the lines is maximized. Applications could include the planning of photographs or displays. This is a “maximin problem” in which the function to be maximized has many local optima. We present two methods for solving the problem, one more efficient in computer time, and the other in storage. We also present a simplified procedure for the case where the observation point is “infinitely” distant from the given points.
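The objective is easy to state in code: for a candidate observation point on the circle, compute the smallest angle between adjacent sight-line bearings, then maximise it over the circle. The coarse grid search below is a brute-force stand-in for the paper's more efficient methods; the point set is invented.

```python
import math

def min_sight_angle(theta, pts, R):
    """Smallest angle between lines of sight from the point at angle
    `theta` on the enclosing circle of radius R to the given points.
    Bearings are measured relative to the direction toward the centre,
    which avoids wrap-around since all points lie on that side."""
    ox, oy = R * math.cos(theta), R * math.sin(theta)
    toward_centre = math.atan2(-oy, -ox)
    bearings = sorted(
        math.remainder(math.atan2(y - oy, x - ox) - toward_centre, math.tau)
        for x, y in pts)
    return min(b2 - b1 for b1, b2 in zip(bearings, bearings[1:]))

def best_observation_point(pts, R, n_grid=3600):
    """Brute-force grid search over the circle; the paper's methods are
    more efficient in time or in storage."""
    return max((2 * math.pi * k / n_grid for k in range(n_grid)),
               key=lambda t: min_sight_angle(t, pts, R))

pts = [(0, 0), (1, 0.5), (-0.5, 1), (0.3, -0.8)]  # hypothetical points
theta = best_observation_point(pts, R=5.0)
print(theta, min_sight_angle(theta, pts, R=5.0))
```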

11.
We study the problem of sequencing mixed-model assembly lines operating with a heterogeneous workforce. The practical motivation for this study comes from the context of managing assembly lines in sheltered work centres for the disabled. We propose a general framework in which task execution times are both worker and model dependent. Within this framework, the problem is defined and mathematical mixed-integer models and heuristic procedures are proposed. These include a set of fast constructive heuristics, two local search procedures based on approximate measures using either a solution upper bound or the solution of a linear program, and a GRASP metaheuristic. Computational tests with instances adapted from commonly used literature databases are used to validate the proposed approaches. These tests give insight into the quality of the different techniques, which prove to be very efficient in terms of both computational effort and solution quality when compared to other strategies such as random sampling or the solution of the MIP models using a commercial solver.
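Among the proposed procedures is a GRASP metaheuristic; the skeleton below shows the generic GRASP pattern (randomised greedy construction from a restricted candidate list, then local search). The toy cost function and the trivial "local search" are placeholders, not the paper's problem-specific components.

```python
import random

def grasp(candidates, greedy_cost, local_search, alpha=0.3, iters=100):
    """Generic GRASP skeleton: build a solution from a restricted candidate
    list (RCL), improve it by local search, keep the best over all
    iterations. A sketch; the paper's components are problem-specific."""
    best, best_cost = None, float("inf")
    for _ in range(iters):
        pool, sol = list(candidates), []
        while pool:
            costs = {c: greedy_cost(sol, c) for c in pool}
            lo, hi = min(costs.values()), max(costs.values())
            rcl = [c for c in pool if costs[c] <= lo + alpha * (hi - lo)]
            pick = random.choice(rcl)
            sol.append(pick)
            pool.remove(pick)
        sol, cost = local_search(sol)
        if cost < best_cost:
            best, best_cost = sol, cost
    return best, best_cost

# Toy usage: order jobs so the running total stays close to its average rate.
times = {"A": 4, "B": 2, "C": 7, "D": 5}
rate = sum(times.values()) / len(times)
gc = lambda sol, c: abs(sum(times[x] for x in sol) + times[c] - rate * (len(sol) + 1))
ls = lambda s: (s, sum(gc(s[:k], s[k]) for k in range(len(s))))  # placeholder "search"
print(grasp(times, gc, ls, iters=50))
```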

12.
In order to expedite the process of introducing a product to market, organisations have shifted their paradigm towards concurrent engineering. This involves the simultaneous execution of successive activities on the basis of information available in rudimentary form. For this, cross-functional teams sporadically communicate to exchange available updated information, at the cost of additional time and money. Therefore, the aim of this paper is to present a model-based methodology to estimate the optimal amount of overlapping and the communication policy with a view to minimising the product development cycle time at the lowest additional cost. In the first step of the methodology, an objective function comprising the cycle time and the cost of the complete project is formulated mathematically. To reach the optimal solution, a novel meta-heuristic, non-discrete ant colony optimisation, is proposed. The algorithm derives its governing traits from traditional ant algorithms over a discrete domain, but has been modified to search in a continuous space. The salient feature of the proposed meta-heuristic is that it utilises the weighted sum of numerous probability distribution functions (PDFs) to represent the long-term pheromone information. This paper utilises a novel approach to pheromone maintenance to adequately update the PDFs after each tour by the ants. The performance of the proposed algorithm has been tested on a hypothetical illustrative example of mobile phones and its robustness has been authenticated against variants of particle swarm optimisation.
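The "weighted sum of PDFs" pheromone resembles the continuous ant colony of Socha and Dorigo (ACO_R), so a one-dimensional sketch in that spirit is given below. The rank-selection rule, parameter values and test function are assumptions; the authors' algorithm and pheromone-maintenance scheme differ in detail.

```python
import random

def aco_continuous(f, lo, hi, n_ants=10, archive=20, q=0.2, xi=0.85, iters=200):
    """Continuous ant colony in the spirit of ACO_R: an archive of good
    solutions defines a Gaussian mixture from which new solutions are
    sampled. A one-dimensional sketch, not the authors' exact algorithm."""
    sols = sorted((random.uniform(lo, hi) for _ in range(archive)), key=f)
    for _ in range(iters):
        elite = sols[:archive]
        for _ in range(n_ants):
            k = min(int(random.expovariate(1 / (q * archive))), archive - 1)
            mu = elite[k]                                  # rank-biased pick
            sigma = xi * sum(abs(s - mu) for s in elite) / (archive - 1)
            x = min(max(random.gauss(mu, sigma), lo), hi)
            sols.append(x)
        sols.sort(key=f)
        del sols[archive:]                                 # keep the best
    return sols[0]

# Toy objective with minimum at x = 3.7.
print(aco_continuous(lambda x: (x - 3.7) ** 2 + 1, lo=0.0, hi=10.0))
```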

13.
This paper presents a hybrid Pareto-based local search (PLS) algorithm for solving the multi-objective flexible job shop scheduling problem. Three minimisation objectives are considered simultaneously, i.e. the maximum completion time (makespan), the total workload of all machines, and the workload of the critical machine. In this study, several well-designed neighbouring approaches are proposed, which consider the problem characteristics and thus can achieve fast convergence while keeping the population at a certain level of quality and diversity. Moreover, a variable neighbourhood search (VNS) based self-adaptive strategy is embedded in the hybrid algorithm to utilise the neighbouring approaches efficiently. Then, an external Pareto archive is developed to record the non-dominated solutions found so far. In addition, a speed-up method is devised to update the Pareto archive set. Experimental results on several well-known benchmarks show the efficiency of the proposed hybrid algorithm. It is concluded that the PLS algorithm is superior to very recent algorithms in terms of both search quality and computational efficiency.
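The external Pareto archive can be illustrated by a basic non-dominated-archive update (without the paper's speed-up method). The three-component objective vectors mimic makespan, total workload and critical-machine workload; the numbers are invented.

```python
def dominates(a, b):
    """a dominates b for minimisation objectives (e.g. makespan,
    total workload, critical-machine workload)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, cand):
    """Insert `cand` into a non-dominated archive: reject it if dominated,
    otherwise add it and evict every member it dominates. A basic O(|A|)
    update, without the paper's speed-up structure."""
    if any(dominates(a, cand) for a in archive):
        return archive
    return [a for a in archive if not dominates(cand, a)] + [cand]

archive = []
for point in [(10, 50, 9), (12, 45, 8), (10, 45, 9), (9, 60, 10)]:
    archive = update_archive(archive, point)
print(archive)  # only mutually non-dominated objective vectors remain
```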

14.
Desirability functions (DFs) are commonly used in the optimization of design parameters with multiple quality characteristics to obtain a good compromise among predicted response models obtained from experimental designs. Besides discussing multi-objective approaches for the optimization of DFs, we present a brief review of the literature on the most commonly used Derringer and Suich type of DFs and others, as well as their capabilities and limitations. Optimization of the DFs of Derringer and Suich is a challenging problem. Although they have an advantageous shape over other DFs, their nonsmooth nature is a drawback. Commercially available software products used by quality engineers usually optimize these functions by derivative-free search methods on the design domain (such as Design-Expert), which involves the risk of not finding the global optimum in a reasonable time. Use of gradient-based methods (as in MINITAB) after smoothing nondifferentiable points has also been proposed, as have different metaheuristics and interactive multi-objective approaches, which have their own drawbacks. In this study, by utilizing a reformulation of DFs, it is shown that the nonsmooth optimization problem becomes a nonconvex mixed-integer nonlinear problem. Then, a continuous relaxation of this problem can be solved with nonconvex and global optimization approaches supported by widely available software programs. We demonstrate our findings on two well-known examples from the quality engineering literature and their extensions.
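For concreteness, a Derringer–Suich one-sided desirability and the usual geometric-mean composite are sketched below. The response values, bounds and weights are made-up inputs; the paper's reformulation and global-optimization treatment are not shown.

```python
def desirability_larger(y, lo, target, r=1.0):
    """Derringer-Suich one-sided 'larger-is-better' desirability:
    0 below `lo`, 1 above `target`, a power ramp in between."""
    if y <= lo:
        return 0.0
    if y >= target:
        return 1.0
    return ((y - lo) / (target - lo)) ** r

def overall(ds):
    """Composite desirability: geometric mean of individual DFs
    (zero whenever any response is unacceptable)."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical responses: predicted strength and yield from fitted models.
d1 = desirability_larger(62.0, lo=50.0, target=70.0)
d2 = desirability_larger(0.8, lo=0.5, target=1.0, r=2.0)
print(d1, d2, overall([d1, d2]))  # 0.6, 0.36, ~0.465
```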

15.
Sensitivity analysis practices: Strategies for model-based inference
Fourteen years after Science's review of sensitivity analysis (SA) methods in 1989 (System analysis at molecular scale, by H. Rabitz), we search Science Online to identify and then review all recent articles having “sensitivity analysis” as a keyword. In spite of the considerable developments that have taken place in this discipline, of the good practices that have emerged, and of existing guidelines for SA issued on both sides of the Atlantic, our review found little beyond very primitive SA tools, based on “one-factor-at-a-time” (OAT) approaches. In the context of model corroboration or falsification, we demonstrate that this use of OAT methods is illicit and unjustified, unless the model under analysis is proved to be linear. We show that available good practices, such as variance-based measures and others, are able to overcome OAT shortcomings and are easy to implement. These methods also allow the concept of factor importance to be defined rigorously, thus making the factor-importance ranking univocal. We analyse the requirements of SA in the context of modelling, and present best available practices on the basis of an elementary model. We also point the reader to available recipes for a rigorous SA.
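As a contrast to OAT, a textbook variance-based measure is easy to compute: the sketch below estimates a first-order Sobol index S_i = V(E[Y|X_i]) / V(Y) with a pick-and-freeze Monte Carlo estimator on a toy linear model. It illustrates the class of methods the authors advocate, not their specific recipes.

```python
import random

def sobol_first_order(model, n_inputs, i, n=100_000):
    """Pick-and-freeze Monte Carlo estimate of the first-order Sobol index
    S_i = V(E[Y|X_i]) / V(Y): evaluate the model twice with X_i held fixed
    and all other inputs re-drawn. A textbook estimator, not the paper's
    specific recipes; inputs assumed uniform on [0, 1)."""
    ys, prods = [], []
    for _ in range(n):
        x = [random.random() for _ in range(n_inputs)]
        x2 = [random.random() for _ in range(n_inputs)]
        x2[i] = x[i]                       # freeze the factor of interest
        y, y2 = model(x), model(x2)
        ys.append(y)
        prods.append(y * y2)
    m = sum(ys) / n
    var = sum(y * y for y in ys) / n - m * m
    return (sum(prods) / n - m * m) / var

model = lambda x: x[0] + 0.1 * x[1]        # x[0] should dominate
print(sobol_first_order(model, 2, 0), sobol_first_order(model, 2, 1))
```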

16.
Recently, Gao and Tu presented an efficient algorithm for robust low bit-rate video transmission using a partial backward decodable bit stream (PBDBS) approach. In this paper, we first present a multiple-PBDBS (MPBDBS) approach to improve on the previous PBDBS approach. Next, a mathematical theory is provided to minimize the error propagation length in each group of blocks (GOB). Further, a novel MPBDBS-based algorithm is presented for robust video transmission. Experimental results demonstrate that our proposed MPBDBS-based algorithm yields better image quality than the previous PBDBS-based algorithm, but at some cost in bit rate and execution time. In our experiments, both single-bit and two-bit error models are investigated.

17.
A common procedure in budget allocation is to let the different entities of an organization determine their optimal budgets. Once the individual requests are received, they are then cut by a common factor, as necessary, so that a global constraint is satisfied. We refer to this procedure as the “cut across the board” rule. In general, this method will not result in a globally optimal solution. In this paper we identify conditions that assure the global optimality of the “cut (or expand) across the board” rule. We specifically focus on a constrained multi-item inventory model and generalize results of Rosenblatt [10] and Plossl and Wight [8]. In addition, we briefly discuss the applicability of the results to other areas.
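The rule itself is one line of arithmetic, sketched below: scale every request by the common factor that makes the total meet the budget. The department names and figures are invented; the paper's contribution is characterising when this is globally optimal, which the sketch does not address.

```python
def cut_across_the_board(requests, budget):
    """Scale every entity's optimal request by the common factor that makes
    the total meet the global budget constraint (the rule whose optimality
    conditions the paper characterises)."""
    factor = min(1.0, budget / sum(requests.values()))
    return {k: factor * v for k, v in requests.items()}

# Hypothetical requests totalling 400 against a budget of 300.
requests = {"dept_A": 120.0, "dept_B": 80.0, "dept_C": 200.0}
print(cut_across_the_board(requests, budget=300.0))
# factor = 0.75 -> {'dept_A': 90.0, 'dept_B': 60.0, 'dept_C': 150.0}
```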

18.
The disparities between the quasi-induced exposure (QIE) method and a standard case–control approach with crash responsibility as the disease of interest are studied. The 10,748 drivers who had been given compulsory cannabis and alcohol tests subsequent to involvement in a fatal crash in France between 2001 and 2003 were used to compare the two approaches. Odds ratios were assessed using conditional and unconditional logistic regressions. While both approaches found that driving under the influence of alcohol or cannabis increased the risk of causing a fatal crash, the two approaches are not equivalent. They differ mainly with regard to the driver sample selected. The QIE method results in splitting the overall road safety issue into two sub-studies: a matched case–control study dealing with two-vehicle crashes and a case–control study dealing with single-vehicle crashes but with a specific control group. Using a specific generic term such as “QIE method” should not hide the real underlying epidemiological design. In contrast, the standard case–control approach studies drivers involved in all types of crashes whatever the distribution of responsibility in each crash. This method, also known as “responsibility analysis”, is the most relevant for assessing the overall road safety implications of a driver characteristic.
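Both designs ultimately estimate odds ratios; for orientation, the sketch below computes a crude odds ratio with a Wald 95% confidence interval from a 2×2 table. The counts are hypothetical, and the study itself uses conditional and unconditional logistic regression rather than this crude calculation.

```python
import math

def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Crude odds ratio with a 95% Wald confidence interval, the quantity
    both designs estimate (a sketch; the study fits logistic regressions)."""
    or_ = (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)
    se = math.sqrt(sum(1 / x for x in (exposed_cases, exposed_controls,
                                       unexposed_cases, unexposed_controls)))
    lo, hi = (math.exp(math.log(or_) + s * 1.96 * se) for s in (-1, 1))
    return or_, lo, hi

print(odds_ratio(120, 30, 800, 600))  # hypothetical counts -> OR = 3.0
```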

19.
To efficiently execute a finite element program on a hypercube, we need to map nodes of the corresponding finite element graph to processors of a hypercube such that each processor has approximately the same amount of computational load and the communication among processors is minimized. If the number of nodes of a finite element graph will not be increased during the execution of a program, the mapping only needs to be performed once. However, if a finite element graph is solution‐adaptive, that is, the number of nodes will be increased discretely due to the refinement of some finite elements during the execution of a program, a run‐time load balancing algorithm has to be performed many times in order to balance the computational load of processors while keeping the communication cost as low as possible. In this paper, we propose a parallel iterative load balancing algorithm (ILB) to deal with the load imbalancing problem of a solution‐adaptive finite element program. The proposed algorithm has three properties. First, the algorithm is simple and easy to be implemented. Second, the execution of the algorithm is fast. Third, it guarantees that the computational load will be balanced after the execution of the algorithm. We have implemented the proposed algorithm along with two parallel mapping algorithms, parallel orthogonal recursive bisection (ORB) [19] and parallel recursive mincut bipartitioning (MC) [8], on a 16‐node NCUBE‐2. Three criteria, the execution time of load balancing algorithms, the computation time of an application program under different load balancing algorithms, and the total execution time of an application program (under several refinement phases) are used for performance evaluation. Experimental results show that (1) the execution time of ILB is very short compared to those of MC and ORB; (2) the mappings produced by ILB are better than those of ORB and MC; and (3) the speedups produced by ILB are better than those of ORB and MC.
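For intuition, a generic diffusion-style balancing sweep on a hypercube is sketched below: each processor repeatedly exchanges a fraction of the load difference with each of its neighbours. This is a standard diffusion scheme, assumed here for illustration; ILB's actual exchange rule and its balance guarantee differ.

```python
def diffusion_balance(load, dim, alpha=None, sweeps=50):
    """Diffusion-style iterative load balancing on a `dim`-dimensional
    hypercube: each sweep, every processor exchanges a fraction `alpha`
    of the load difference with each of its `dim` neighbours. A generic
    diffusion scheme for illustration; ILB's exchange rule differs."""
    n = 1 << dim
    alpha = alpha if alpha is not None else 1.0 / (dim + 1)
    load = list(load)
    for _ in range(sweeps):
        new = load[:]
        for p in range(n):
            for d in range(dim):
                q = p ^ (1 << d)               # neighbour across dimension d
                new[p] += alpha * (load[q] - load[p])
        load = new
    return load

# Hypothetical loads on a 3-cube (8 processors), total 64, average 8.
print(diffusion_balance([40, 0, 0, 8, 0, 16, 0, 0], dim=3))
# converges toward the average load of 8 on every processor
```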

20.
How should a video rental chain replenish its stock of new movies over time? Any such policy should consist of two key dimensions: (i) the number of copies purchased; and (ii) when to remove a movie from the front shelves and replace it with a newly released one. We first analyze this bi-variate problem for an integrated chain. As for decentralized chains, we show that a (wholesale) price-only contract cannot coordinate such a chain. We then consider a price-and-revenue-sharing contract. Such a contract can achieve coordination, but the unique price and share which are needed may not provide one of the parties with its desired profit (i.e., it will violate individual rationality). This situation has been reported in the case of Blockbuster Video and has led to litigation between Blockbuster and Disney Studios. We thus propose adding a third lever: a license fee (or subsidy) associated with each new movie. Such a contract can coordinate the channel and satisfy the individual rationality requirements. In fact, all our results hold true irrespective of whether or not the rental store is allowed to sell surplus copies of movies. We are able to compare the optimal decision variable and coordinating lever values, as well as the optimal profits, for the “rental only” and “sales + rental” models. Our numerical examples, which utilize empirical demand data, have significant managerial implications in terms of increasing the effectiveness of the video rental industry.
