Participating media with an inhomogeneous index of refraction bend light along curved paths. Simulating this effect in a global illumination environment has usually been neglected because of the complexity of the calculations involved, with renderers sacrificing physical accuracy for efficient visual results.
This paper aims to simulate such non-linear media in a more reasonable time than previous works without losing physical correctness.
Accuracy is achieved by solving the Eikonal equation of geometrical optics, which describes the path followed by a light beam
that traverses a non-linear medium. This equation is used in the context of a photon mapping extension.
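The abstract does not include the authors' implementation. Purely as an illustrative sketch of the underlying physics, the Eikonal equation yields the ray equation of geometrical optics, d/ds(n dx/ds) = ∇n, which can be integrated numerically to trace a curved light path through a medium with spatially varying index n. The index field, integrator, and step sizes below are assumptions for illustration, not the paper's method:

```python
import numpy as np

def grad_n(n, x, eps=1e-4):
    """Central-difference gradient of the refractive-index field n at point x."""
    g = np.zeros(3)
    for i in range(3):
        d = np.zeros(3); d[i] = eps
        g[i] = (n(x + d) - n(x - d)) / (2 * eps)
    return g

def trace_ray(n, x0, dir0, ds=0.01, steps=1000):
    """Integrate the ray equation d/ds(n dx/ds) = grad n with forward Euler.
    Substituting v = n * dx/ds gives dx/ds = v/n and dv/ds = grad n."""
    x = np.asarray(x0, float)
    d0 = np.asarray(dir0, float)
    v = n(x) * d0 / np.linalg.norm(d0)   # v = n * (unit direction)
    path = [x.copy()]
    for _ in range(steps):
        x = x + ds * v / n(x)            # advance position: dx/ds = v / n
        v = v + ds * grad_n(n, x)        # bend the ray:     dv/ds = grad n
        path.append(x.copy())
    return np.array(path)

# Example: a mirage-like medium whose index increases with height (z).
n_field = lambda x: 1.0 + 0.05 * x[2]
path = trace_ray(n_field, x0=[0, 0, 0], dir0=[1, 0, 0.05])
```

In a constant-index medium grad_n vanishes and the path degenerates to the familiar straight ray, which is a quick sanity check on the integrator.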
Some supervised tasks are presented with a numerical output, but decisions have to be made in a discrete, binarised way, according to a particular cutoff. This binarised regression task is a very common situation that requires its own analysis, distinct from regression, classification and ordinal regression. We first investigate the application cases in terms of the information about the distribution and range of the cutoffs and distinguish six possible scenarios, some of which are more common than others. Next, we study two basic approaches: the retraining approach, which discretises the training set whenever the cutoff is available and learns a new classifier from it, and the reframing approach, which learns a regression model and sets the cutoff when this is available during deployment. In order to assess the binarised regression task, we introduce context plots featuring error against cutoff. Two special cases are of interest, the \( UCE \) and \( OCE \) curves: the area under the former is the mean absolute error, while the latter yields a new metric that lies between a ranking measure and a residual-based measure. A comprehensive evaluation of the retraining and reframing approaches is performed using a repository of binarised regression problems created for this purpose, concluding that neither method is clearly better than the other, except when the size of the training data is small.
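As a minimal sketch of the two baseline approaches contrasted above, reframing trains a single regressor and binarises its prediction at whatever cutoff arrives at deployment, while retraining fits a fresh classifier per cutoff. The model choices and synthetic data here are stand-ins, not the paper's experimental setup:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

def reframe_predict(X_train, y_train, X_test, cutoff):
    """Reframing: learn one regression model, binarise its output at
    deployment time with whatever cutoff is supplied."""
    reg = LinearRegression().fit(X_train, y_train)
    return (reg.predict(X_test) >= cutoff).astype(int)

def retrain_predict(X_train, y_train, X_test, cutoff):
    """Retraining: discretise the training labels at the cutoff and fit a
    new classifier specifically for this cutoff."""
    clf = LogisticRegression().fit(X_train, (y_train >= cutoff).astype(int))
    return clf.predict(X_test)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.3, size=200)
for c in (-1.0, 0.0, 1.0):            # three hypothetical deployment cutoffs
    same = np.mean(reframe_predict(X[:150], y[:150], X[150:], c)
                   == retrain_predict(X[:150], y[:150], X[150:], c))
    print(f"cutoff {c:+.1f}: agreement {same:.2f}")
```

Note the asymmetry in cost: reframing trains once and serves every cutoff, whereas retraining pays one training run per cutoff, which is why the distribution of cutoffs across deployment scenarios matters.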
One of the applications of workflow systems is the management of administrative processes characterized by the transmission
of information elements among users of an organization. Tasks contained in these processes are carried out by users responsible
for confirming, modifying or adding information throughout. These processes need to be defined in workflow management systems
in which all the elements are perfectly identified and are easily adaptable to changes that may arise in the sequences of
tasks, in the users involved, or in the data transmitted from one task to another. For this kind of process, it is easier to reuse definitions represented in ontologies. On the one hand, existing ontologies that represent domain elements can be reused; on the other, ontologies have an excellent expressive capacity to define tasks, their relationships and the flow control among them with precision. This paper proposes a complete model, together with the necessary software tools, for tackling this issue.
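The abstract does not detail the proposed model. Purely as a toy illustration of how an ontology can capture tasks and the flow control among them, the following sketch uses rdflib; the namespace, class and task names are hypothetical, not the paper's vocabulary:

```python
from rdflib import Graph, Namespace, RDF, RDFS, Literal

WF = Namespace("http://example.org/workflow#")   # hypothetical namespace
g = Graph()
g.bind("wf", WF)

# A Task class and a nextTask property encode the flow control.
g.add((WF.Task, RDF.type, RDFS.Class))
g.add((WF.nextTask, RDF.type, RDF.Property))
g.add((WF.nextTask, RDFS.domain, WF.Task))
g.add((WF.nextTask, RDFS.range, WF.Task))

# Two administrative tasks and their ordering.
g.add((WF.FillForm, RDF.type, WF.Task))
g.add((WF.FillForm, RDFS.label, Literal("Fill in request form")))
g.add((WF.ApproveForm, RDF.type, WF.Task))
g.add((WF.ApproveForm, RDFS.label, Literal("Approve request")))
g.add((WF.FillForm, WF.nextTask, WF.ApproveForm))

print(g.serialize(format="turtle"))
```

Representing the task sequence as ordinary triples is what makes this kind of definition adaptable: adding, removing or reordering tasks is a matter of editing assertions rather than recompiling a process definition.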
Álvaro E. Prieto
is a teaching/research assistant professor of Computer Science at the University of Extremadura, Spain. He has an MSc in Computer
Science from the University of Extremadura (2000). His Ph.D. research addresses the use of ontologies in workflows. He is
currently involved in various national and regional R&D&I projects.
Adolfo Lozano-Tello
is a teaching/research assistant professor in the Computer Science Department at the University of Extremadura, Spain. He received his Ph.D. in 2002, earning an extraordinary thesis award for his work on the selection of ontologies for software applications. He has published more than 50 papers on these topics in Software Engineering and Knowledge Engineering.
The positive and negative influences of violent/action games, henceforth called "action games", remain controversial in the scholarly literature. Although debate continues over whether action games influence aggressive behavior, little research has examined their influence on civic engagement. The current study addresses this gap by examining the correlation of exposure to action games with civic engagement and online prosocial behavior in a sample of 873 teenagers. Results indicated that girls, as well as teens whose parents were more technologically savvy, tended to engage in more civic behaviors. Exposure to action games predicted more prosocial behavior online but did not predict civic engagement either positively or negatively. However, exposure to action games and parental involvement interacted to promote youth civic engagement: action-game-playing youth whose parents were involved in game play and supervision were the most civically involved, compared to youth who did not play action games or whose parents were less involved. These results indicated little support for the belief that exposure to violence in video games decreases prosocial behavior and/or civic engagement. Conversely, some support was found for the possibility that playing action games is associated with small increases in prosocial behavior and civic engagement in the real world, possibly owing to the team-oriented multiplayer options in many of these games.
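The reported exposure-by-involvement effect is a statistical interaction (moderation). A hypothetical analysis of that shape, with synthetic data and invented variable names rather than the study's actual measures, might look like:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data; the study's variables and coding are not
# given in the abstract.
rng = np.random.default_rng(1)
n = 873
df = pd.DataFrame({
    "action_exposure": rng.normal(size=n),
    "parent_involve": rng.normal(size=n),
})
df["civic"] = (0.1 * df["action_exposure"] * df["parent_involve"]
               + rng.normal(size=n))

# The '*' term fits both main effects and their product, the standard way
# to test whether parental involvement moderates the exposure effect.
model = smf.ols("civic ~ action_exposure * parent_involve", data=df).fit()
print(model.summary().tables[1])
```

A significant coefficient on the product term, with a weak main effect of exposure, is exactly the pattern the abstract describes: exposure alone predicts little, but exposure combined with involved parents does.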
Electrocoagulation (EC) is a wastewater treatment process in which aqueous pollutants are removed by adsorption, entrapment, precipitation or coalescence during a coagulation step produced by electrochemically generated metallic species. When Fe is used as the sacrificial electrode, Fe²⁺ and Fe³⁺ ions are formed. Because Fe³⁺ species are paramagnetic, this property can in principle be exploited to facilitate their removal through the application of a magnetic field. In the present work we provide a proof-of-concept for a combined electrochemical-magnetic method of pollutant removal. The amounts of Fe²⁺ and Fe³⁺ produced in an EC cell at various voltages were measured spectroscopically, confirming that Fe³⁺ species predominate (up to 84%). The effectiveness of a magnetic field in promoting the precipitation of coagulants from a suspension was confirmed by monitoring the turbidity change over time with and without exposure to the field, yielding up to a 30% improvement.
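The two headline figures in this summary are simple ratios. A small sketch of how they could be computed from measurements follows; all numbers are illustrative placeholders, not the paper's data:

```python
import numpy as np

def ferric_fraction(fe2_conc, fe3_conc):
    """Fraction of dissolved iron present as paramagnetic Fe3+,
    from spectroscopic concentration estimates (same units)."""
    return fe3_conc / (fe2_conc + fe3_conc)

def settling_improvement(turb_no_field, turb_field):
    """Relative turbidity reduction attributable to the magnetic field,
    compared point-by-point over the same sampling times."""
    t0 = np.asarray(turb_no_field, float)
    t1 = np.asarray(turb_field, float)
    return (t0 - t1) / t0

# Illustrative readings only:
print(f"Fe3+ fraction: {ferric_fraction(0.4, 2.1):.0%}")
print(settling_improvement([100, 80, 60, 45], [100, 70, 45, 31]))
```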
Decomposing an input image into its intrinsic shading and reflectance components is a long-standing ill-posed problem. We present a novel algorithm that requires no user strokes and works on a single image. Based on simple assumptions about its reflectance and luminance, we first find clusters of similar reflectance in the image, and build a linear system describing the connections and relations between them. Our assumptions are less restrictive than widely-adopted Retinex-based approaches, and can be further relaxed in conflicting situations. The resulting system is robust even in the presence of areas where our assumptions do not hold. We show a wide variety of results, including natural images, objects from the MIT dataset and texture images, along with several applications, proving the versatility of our method.
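The abstract only outlines the algorithm; a heavily simplified, hypothetical variant of the cluster-then-solve idea (chromaticity clustering plus a least-squares system linking clusters under a smooth-shading assumption) could look like the sketch below. It is not the authors' system, just an instance of the same structure:

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import lsqr

def decompose(img, k=8):
    """Toy intrinsic decomposition. img: HxWx3 floats in (0, 1].
    Clusters pixels by chromaticity, then solves for one log-reflectance
    value per cluster, assuming shading varies smoothly between
    horizontally adjacent pixels."""
    h, w, _ = img.shape
    intensity = img.mean(axis=2).clip(1e-4)
    chroma = img / (img.sum(axis=2, keepdims=True) + 1e-8)
    labels = KMeans(n_clusters=k, n_init=4, random_state=0) \
        .fit_predict(chroma.reshape(-1, 3)).reshape(h, w)

    # One equation per adjacent pixel pair in different clusters:
    # r[a] - r[b] = log I_p - log I_q (smooth shading cancels out),
    # plus one anchor row fixing the overall scale.
    log_i = np.log(intensity)
    eqs = [(labels[y, x], labels[y, x + 1], log_i[y, x] - log_i[y, x + 1])
           for y in range(h) for x in range(w - 1)
           if labels[y, x] != labels[y, x + 1]]
    A = lil_matrix((len(eqs) + 1, k))
    b = np.zeros(len(eqs) + 1)
    for i, (ca, cb, d) in enumerate(eqs):
        A[i, ca], A[i, cb], b[i] = 1.0, -1.0, d
    A[len(eqs), 0] = 1.0                      # anchor: r[0] = 0
    r = lsqr(A.tocsr(), b)[0]

    reflect = np.exp(r)[labels]               # per-pixel reflectance
    shading = intensity / reflect
    return reflect, shading

# Quick smoke test on a synthetic image.
rng = np.random.default_rng(0)
R, S = decompose(rng.uniform(0.1, 1.0, size=(32, 32, 3)))
```

Because the unknowns live on clusters rather than pixels, the system stays tiny and over-determined, which is what makes it robust in regions where individual equations are violated.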
In this paper, we present several important details of the process of parallelizing legacy code, mostly related to the problem of maintaining the numerical output of the legacy code while obtaining a balanced workload for parallel processing. Since we maintained the non-uniform mesh imposed by the original finite element code, we had to design a special data distribution among processors so that the data restrictions of the finite element method are met. In particular, we introduce a data distribution method initially used in shared-memory parallel processing that obtains better performance than the previous version of the parallel program. Moreover, this method can be extended to other parallel platforms such as distributed-memory parallel computers. We present results covering several problems related to performance profiling on different (development and production) parallel platforms. The use of new and old parallel computing architectures leads to different behavior of the same code, which in all cases performs better on multiprocessor hardware.
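The paper's actual data distribution is not specified in this summary. As one hedged illustration of the general problem it addresses, non-uniform per-element workloads can be split into contiguous, roughly balanced chunks while preserving the mesh ordering, so that neighbouring elements stay on the same processor:

```python
def distribute(weights, nprocs):
    """Split a sequence of per-element workloads into nprocs contiguous
    chunks with roughly equal total weight, preserving mesh order."""
    total = sum(weights)
    target = total / nprocs
    chunks, cur, acc = [], [], 0.0
    for i, w in enumerate(weights):
        cur.append(i)
        acc += w
        # Close the chunk once it reaches its fair share, but keep enough
        # elements in reserve for the remaining processors.
        if (acc >= target and len(chunks) < nprocs - 1
                and len(weights) - i - 1 >= nprocs - len(chunks) - 1):
            chunks.append(cur)
            cur, acc = [], 0.0
    chunks.append(cur)
    return chunks

# Non-uniform workloads, e.g. elements in a refined mesh region cost more.
work = [1, 1, 1, 5, 5, 5, 1, 1, 2, 2, 8, 1]
for p, c in enumerate(distribute(work, 3)):
    print(f"proc {p}: elements {c}, load {sum(work[i] for i in c)}")
```

Keeping chunks contiguous is the key constraint borrowed from the finite element setting: it limits inter-processor communication to chunk boundaries, at the cost of a slightly less perfect load balance than an unconstrained partition.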
High per capita milk consumption in Mexico indicated a strong need for documentation of aflatoxin M1 (AFM1) levels in milk. A survey of 580 2-liter samples (n = 290) was conducted to quantify AFM1 using high-performance liquid chromatography, considering two maximum tolerance levels (0.05 and 0.5 µg/liter). We related AFM1 levels to the seven most consumed brands from different regions, two processes (pasteurized and ultrapasteurized), different expiration dates, and different fat contents: whole (28, 30, and 33 g), half-skimmed (10, 16, and 20 g), light (1, 2, and 4 g), and with vegetable oil. Pasteurization and ultrapasteurization did not diminish AFM1 contamination, which was present at levels of 0 to 8.35 µg/liter: 40% of the milk samples contained AFM1 at concentrations ≥ 0.05 µg/liter and 10% at ≥ 0.5 µg/liter. Statistically significant relationships were found between AFM1 contamination and brand (P = 0.002 at the ≥ 0.05 µg/liter level and P = 0.034 at the ≥ 0.5 µg/liter level) and between higher AFM1 levels and mild or warm seasons of the year (P = 0.0003). Samples with greater fat content had a slightly higher probability (P = 0.067) of being contaminated with AFM1 at the ≥ 0.5 µg/liter level. The milk with the lowest AFM1 contamination was a brand imported as powder and rehydrated in Mexico.
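The two tolerance levels amount to a simple exceedance computation over the measured concentrations. A sketch with made-up readings, not the survey's data:

```python
import numpy as np

def exceedance(levels, tolerance):
    """Share of samples at or above a maximum tolerance level (µg/liter)."""
    return np.mean(np.asarray(levels, float) >= tolerance)

# Illustrative AFM1 readings only.
afm1 = [0.0, 0.02, 0.06, 0.3, 0.7, 0.04, 1.2, 0.0, 0.05, 0.6]
for tol in (0.05, 0.5):
    print(f">= {tol} µg/liter: {exceedance(afm1, tol):.0%} of samples")
```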