1.
Joint halftoning and watermarking (cited by 2: 0 self-citations, 2 by others)
A framework to jointly halftone and watermark a grayscale image is presented. The framework requires the definition of three components: a human visual system (HVS)-based error metric between the continuous-tone image and a halftone, a watermarking scheme with a corresponding watermark detection measure, and a search strategy to traverse the space of halftones. We employ the HVS-based error metric used in the direct binary search (DBS) halftoning algorithm, a block-based spread spectrum watermarking scheme, and the toggle-and-swap search strategy of DBS. The halftone is printed on a desktop printer and scanned using a flatbed scanner. The watermark is detected from the scanned image and a number of post-processed versions of it, including one restored in Adobe Photoshop. The results show that the watermark is extremely resilient to printing, scanning, and post-processing; for a given baseline image quality, joint optimization is better than watermarking and halftoning independently. For this particular algorithm, the original continuous-tone image is required to detect the watermark.
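The toggle step of a DBS-style search can be sketched as follows — a minimal, illustrative Python version on a 1-D signal, with a simple low-pass filter standing in for the HVS model. The filter weights, function names, and greedy loop are assumptions for illustration, not the paper's actual algorithm; the swap move and the watermark term in the objective are omitted.

```python
def blur(x, k=(0.25, 0.5, 0.25)):
    """Simple low-pass filter standing in for an HVS model (edge-clamped)."""
    n, r = len(x), len(k) // 2
    return [sum(k[t] * x[min(max(i + t - r, 0), n - 1)] for t in range(len(k)))
            for i in range(n)]

def hvs_error(halftone, gray):
    """Squared error between the perceived (blurred) halftone and original."""
    return sum((a - b) ** 2 for a, b in zip(blur(halftone), blur(gray)))

def dbs_toggle(gray, iters=10):
    """Greedy toggle-only search: flip a pixel, keep the flip if it helps."""
    half = [1.0 if g >= 0.5 else 0.0 for g in gray]  # initial threshold halftone
    best = hvs_error(half, gray)
    for _ in range(iters):
        changed = False
        for i in range(len(half)):
            half[i] = 1.0 - half[i]           # trial toggle
            e = hvs_error(half, gray)
            if e < best:
                best, changed = e, True       # keep the toggle
            else:
                half[i] = 1.0 - half[i]       # revert
        if not changed:                        # converged: no toggle improves
            break
    return half, best
```

A full implementation would operate on 2-D images, include the swap move, and fold the watermark detection measure into the objective.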
2.
3.
Structural parts used in boilers, turbines, ships, and many household applications are manufactured through sheet metal forming processes. During manufacturing, the microstructure of the material is deformed, and microcracks along with anisotropic properties are induced. The present research deals with a thin sheet metal plate containing a central crack subjected to mixed-mode (I+II) loading. With special reference to Lankford's coefficient and the degree of anisotropy, the effect of anisotropic triaxiality on the crack initiation angle has been investigated. The results reveal the combinations of Lankford's coefficient and degree of anisotropy for which the crack initiation angle does not change.
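For reference, the standard isotropic definitions the abstract builds on — stress triaxiality as the ratio of mean to equivalent stress, and Lankford's coefficient as the ratio of width to thickness plastic strain — can be written as follows (the paper's anisotropic extension is not reproduced here):

```latex
% Standard definitions only; the paper's anisotropic triaxiality model differs.
\[
  \eta = \frac{\sigma_m}{\sigma_{eq}}, \qquad
  \sigma_m = \tfrac{1}{3}\left(\sigma_1 + \sigma_2 + \sigma_3\right), \qquad
  r = \frac{\varepsilon_w}{\varepsilon_t}
\]
```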
4.
Michael Forbes Jim Lawrence Yu Lei Raghu N. Kacker D. Richard Kuhn 《Journal of research of the National Institute of Standards and Technology》2008,113(5):287-297
Covering arrays are structures for compactly representing extremely large input spaces and are used to efficiently implement black-box testing for software and hardware. This paper proposes refinements of the In-Parameter-Order strategy (for arbitrary t). When constructing homogeneous-alphabet covering arrays, these refinements reduce runtime in nearly all cases by a factor of more than 5, and in some cases by factors as large as 280. This trend increases with the number of columns in the covering array. Moreover, the resulting covering arrays are about 5% smaller. Consequently, this new algorithm has constructed many covering arrays that are the smallest in the literature. A heuristic variant of the algorithm sometimes produces comparably sized covering arrays while running significantly faster.
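The defining property that any construction, including the refined In-Parameter-Order strategy, must preserve is t-way coverage: every combination of values over every set of t columns appears in some row. A minimal Python checker (the function name and interface are assumptions for illustration):

```python
from itertools import combinations, product

def is_covering_array(rows, t, v):
    """Check the defining covering-array property: every t-tuple of values
    over a v-ary alphabet appears in at least one row, for every choice
    of t columns."""
    k = len(rows[0])
    for cols in combinations(range(k), t):
        seen = {tuple(row[c] for c in cols) for row in rows}
        if len(seen) < v ** t:        # some t-way value combination is missing
            return False
    return True
```

For example, the 4-row array `[[0,0,0],[0,1,1],[1,0,1],[1,1,0]]` is a strength-2 covering array over k = 3 binary columns, half the size of the full factorial.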
5.
Anand M. Joglekar Raghu N. Kacker 《Quality and Reliability Engineering International》1989,5(2):113-123
Design of experiments is a quality technology for achieving product excellence, that is, high quality at low cost. It is a tool to optimize product and process designs, accelerate the development cycle, reduce development costs, improve the transition of products from R & D to manufacturing, and troubleshoot manufacturing problems effectively. It has been successfully, but sporadically, used in the United States. More recently, it has been identified as a major technological reason for the success of Japan in producing high-quality products at low cost. In the United States, the need for increased competitiveness and the emphasis on quality improvement demand the widespread use of design of experiments by engineers, scientists and quality professionals. In the past, such widespread use has been hampered by a lack of proper training and a lack of tools to easily implement design of experiments in industry. Three steps are essential, and are being taken, to change this situation dramatically. First, simple graphical methods to design and analyse experiments need to be developed, particularly for cases where the necessary microcomputer resources are not available. Secondly, engineers, scientists and quality professionals must have access to microcomputer-based software for the design and analysis of experiments.1 Availability of such software would allow users to concentrate on the important scientific and engineering aspects of the problem by computerizing the necessary statistical expertise. Finally, since a majority of the current workforce is expected to still be working in the year 2000, a massive training effort, based upon simple graphical methods and appropriate computer software, is necessary.2 The purpose of this paper is to describe a methodology, based upon a new graphical method called interaction graphs and other previously known techniques, to simplify the correct design of practically important fractional factorial experiments.
The essential problem in designing a fractional factorial experiment is first stated. The interaction graph for a 16-trial fractional factorial design is given to illustrate how the graphical procedure can be used to design a two-level fractional factorial experiment. Other previously known techniques are described to easily modify two-level fractional factorial designs into mixed multi-level designs. Interaction graphs for other practically useful fractional factorial designs are provided. A computer package called CADE (computer-aided design of experiments), which automatically generates appropriate fractional factorial designs from user specifications of factors, levels and interactions and conducts complete analyses of the designed experiments, is briefly described.1 Finally, the graphical method is compared with other available methods for designing fractional factorial experiments.
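The mechanics of building a two-level fractional factorial from defining generators can be sketched in a few lines of Python — a hedged illustration of the underlying construction (not CADE, and not the interaction-graph selection step; the function name and generator encoding are assumptions):

```python
from itertools import product

def fractional_factorial(n_basic, generators):
    """Build a two-level fractional factorial in -1/+1 coding.
    `generators` maps each added factor to the indices of the basic
    factors whose product defines it, e.g. {'D': (0, 1, 2)} for D = ABC."""
    runs = []
    for basic in product((-1, 1), repeat=n_basic):   # full factorial in basics
        added = []
        for cols in generators.values():
            level = 1
            for c in cols:
                level *= basic[c]                    # added column = product
            added.append(level)
        runs.append(list(basic) + added)
    return runs
```

With `fractional_factorial(3, {'D': (0, 1, 2)})` this yields the familiar 8-run 2^(4-1) design with D = ABC; which interactions remain estimable is exactly what the interaction graphs in the paper help the user decide.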
6.
Application of inline imaging for monitoring crystallization process in a continuous oscillatory baffled crystallizer
Rohit Kacker Sebastian Maaß Jörn Emmerich Herman Kramer 《American Institute of Chemical Engineers》2018,64(7):2450-2461
In this study, an in situ imaging system has been analysed to characterize the crystal size, shape, and number of particles during a continuous crystallization process in a Continuous Oscillatory Baffled Crystallizer (COBC). Two image analysis approaches were examined for particle characterization in a suspension containing both small nuclei and larger grown crystals (nonspherical and irregular in shape). The pattern matching approach, in which the particles are approximated as spherical, resulted in an overestimation of size. In contrast, a segmentation-based algorithm yielded reliable crystal size and shape characteristics. Compared with the image analysis, laser diffraction overestimated the particle sizes due to agglomeration of particles upon filtration and drying. The trend in particle counts during the start of the crystallization process, including nucleation, determined by the image analysis probe was comparable with that measured by FBRM, highlighting the potential of in situ imaging for process monitoring. © 2018 American Institute of Chemical Engineers AIChE J, 64: 2450–2461, 2018
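The core of a segmentation-based counting and sizing step can be illustrated with a toy flood-fill labeler over a binary mask — a stand-in sketch, not the commercial image-analysis software used in the study (the function name, 4-connectivity choice, and area-based sizing are assumptions):

```python
from collections import deque

def label_particles(mask):
    """Count 4-connected foreground blobs in a binary grid and return
    their pixel areas -- a toy stand-in for segmentation-based sizing."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    areas = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                area, q = 0, deque([(y, x)])         # start a new blob
                seen[y][x] = True
                while q:                             # breadth-first flood fill
                    cy, cx = q.popleft()
                    area += 1
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                areas.append(area)
    return areas
```

A real pipeline would first threshold or edge-detect the raw probe images and would report equivalent diameters and shape descriptors, not raw pixel areas.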
7.
Raghu Kacker Ingram Olkin 《Journal of research of the National Institute of Standards and Technology》2005,110(1):67-77
This article is a survey of the tables of probability distributions published around or after the 1964 publication of the Handbook of Mathematical Functions, edited by Abramowitz and Stegun.
8.
Ruchin Kacker Shailendra Singh Bhadauria 《Fatigue & Fracture of Engineering Materials & Structures》2020,43(2):250-264
A new model of stress triaxiality, subjected to plane strain conditions under mixed-mode (I + II) loading at the yield loci of the crack tip, has been formulated using unified strength theory. Unlike the existing model, it evaluates critical values of triaxiality for various convex and non-convex failure criteria. It shows the effects of Poisson's ratio and intermediate principal stress for materials whose strengths in tension and compression are either equal or unequal. Further, on this basis, the crack initiation angles are predicted for various crack inclinations and compared with those obtained from other fracture criteria. The plastic zone shapes supplement the results. The critical yield stress factor, a significant parameter at the crack tip, was lowered as the difference among the three principal stresses reduced to a minimum. The crack initiation angles obtained from the model showed good agreement with those obtained from the G-, S-, and T-criteria.
9.
The synthetic-perturbation screening (SPS) methodology is based on an empirical approach: SPS introduces artificial perturbations into the MIMD program and captures the effects of such perturbations using the modern branch of statistics called design of experiments. SPS can provide the basis of a powerful tool for screening MIMD programs for performance bottlenecks. The technique is portable across machines and architectures, and scales extremely well on massively parallel processors. The purpose of this paper is to explain the general approach and to extend it to address specific features that are the main sources of poor performance in the shared-memory programming model. These include performance degradation due to load imbalance and insufficient parallelism, and overhead introduced by synchronization and by accessing shared data structures. We illustrate the practicality of SPS by demonstrating its use on two very different case studies: a large image understanding benchmark and a parallel quicksort.
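The basic SPS idea — inject an artificial delay at a candidate site and measure how much total runtime responds — can be sketched serially in Python. This is a hedged toy: the workload, site names, and single-factor effect estimate are illustrative assumptions; real SPS perturbs many sites of a parallel MIMD program at once and analyzes them with a designed experiment.

```python
import time

def run_workload(delays):
    """Toy two-phase workload; `delays` injects a synthetic perturbation
    (in seconds) at each named site, mimicking SPS instrumentation."""
    t0 = time.perf_counter()
    total = 0
    for i in range(1000):                 # phase A: some computation
        total += i * i
    time.sleep(delays.get('phase_a', 0))  # perturbation site 'phase_a'
    for i in range(1000):                 # phase B: some computation
        total += i
    time.sleep(delays.get('phase_b', 0))  # perturbation site 'phase_b'
    return time.perf_counter() - t0

def sensitivity(site, delta=0.02):
    """One-factor effect of perturbing a site: runtime(high) - runtime(low).
    In a parallel program, only sites on the critical path show an effect
    close to `delta`; off-path sites absorb the delay."""
    low = run_workload({})
    high = run_workload({site: delta})
    return high - low
```

In this serial toy every site is on the critical path, so each effect is close to the injected delay; the interesting screening behavior emerges only in genuinely parallel code.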
10.
Compliant free-standing structures can be used as chip-to-substrate interconnects. Such "compliant interconnects" are a potential solution to the requirements that will be imposed on chip-to-substrate interconnects over the next decade. However, cost of implementation and electrical performance limit compliant interconnects. In our previous work, we proposed a new compliant interconnect technology called FlexConnect to address these concerns. An innovative, cost-effective MEMS-based fabrication process is used to fabricate these compliant interconnects: sequential lithography and electroplating processes with up to two masking steps are utilized. Using the proposed fabrication process, in this paper, FlexConnects are realized at a 100-μm pitch. High-frequency modeling of the electrical parasitics of the interconnect is performed. Through finite-element-based models, the advantage of using multiple electrical paths as part of the interconnect design is shown from a thermomechanical reliability perspective. Finally, taking advantage of the MEMS-based photolithographic fabrication process, a heterogeneous combination of FlexConnects and column interconnects is proposed. This approach is shown to be an additional avenue to improved electrical performance without compromising mechanical performance.