This paper proposes an improved version of a recently proposed modified simulated annealing algorithm (MSAA), named the improved MSAA (I-MSAA), to tackle the size optimization of truss structures with frequency constraints. This class of problem is challenging because its feasible region is non-convex and its boundaries are highly non-linear. The main motivation is to improve the exploitative behavior of MSAA by borrowing a concept from the water wave optimization (WWO) metaheuristic, namely its breaking operation. Thirty functions from the CEC2014 test suite and four benchmark truss optimization problems with frequency constraints are used to validate the proposed algorithm. Numerical results indicate that I-MSAA is more reliable, stable, and efficient than other existing metaheuristics reported in the literature.
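The breaking operation borrowed from WWO is essentially a local exploitation step around the current best solution. The abstract gives no implementation details, so the following is a minimal Python sketch of simulated annealing augmented with such a step; the sphere objective, parameter values, and function names are illustrative stand-ins, not the authors' algorithm.

```python
import math
import random

def objective(x):
    # Sphere function as a stand-in for the truss mass objective.
    return sum(v * v for v in x)

def breaking(best, step=0.1, n_waves=4, rng=random):
    """WWO-style breaking: spawn a few solitary waves (small perturbations)
    around the current best solution and keep the best of them."""
    candidates = [[v + rng.gauss(0.0, step) for v in best]
                  for _ in range(n_waves)]
    return min(candidates + [best], key=objective)

def sa_with_breaking(dim=3, iters=2000, t0=1.0, cooling=0.995, seed=1):
    """Standard simulated annealing, plus a breaking step whenever a
    new best solution is found (the exploitation enhancement)."""
    rng = random.Random(seed)
    cur = [rng.uniform(-5, 5) for _ in range(dim)]
    best, temp = cur[:], t0
    for _ in range(iters):
        cand = [v + rng.gauss(0.0, 0.5) for v in cur]
        delta = objective(cand) - objective(cur)
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            cur = cand
        if objective(cur) < objective(best):
            best = breaking(cur, rng=rng)  # exploit around the new best
        temp *= cooling
    return best
```

The breaking step only fires on improvement, so it sharpens convergence without disturbing the global acceptance rule that lets SA escape local optima.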
Predicting whether the intended audience will be able to recognize the meaning of an icon or pictograph is not an easy task. Many icon recognition studies have been conducted in the past. However, their findings cannot be generalized to other icons that were not included in the study, which, we argue, is their main limitation. In this paper, we propose a comprehensive taxonomy of icons that is intended to enable the generalization of the findings of recognition studies. To accomplish this, we analyzed a sample of more than eight hundred icons according to three axes: lexical category, semantic category, and representation strategy. Three basic representation strategies were identified: visual similarity; semantic association; and arbitrary convention. These representation strategies are in agreement with the strategies identified in previous taxonomies. However, a greater number of subcategories of these strategies were identified. Our results also indicate that the lexical and semantic attributes of a concept influence the choice of representation strategy.
In this paper, we examine how far electric disturbance signals can be compressed without compromising the analysis of the encoded fault records. A recently proposed compression algorithm, referred to as Damped Sinusoidal Matching Pursuit (DSMP), has the remarkable feature of producing representations that are both compact and physically interpretable. For fault analysis applications, however, one is primarily interested in how accurate the analysis performed on compressed signals can be, rather than in mean-squared-error figures. Unlike previous work on digital fault record compression, the performance of the DSMP method is evaluated using a protocol based on fault analysis procedures commonly performed by expert engineers. This protocol is applied to compare the results of analyzing uncompressed records and their compressed versions at different compression ratios. The results show that DSMP is a reliable compression system, since it achieves high compression ratios (6.4:1) without causing fault analysis misinterpretation.
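Matching pursuit over a dictionary of damped sinusoids can be sketched as a greedy loop: pick the atom best correlated with the residual, record its coefficient, subtract its projection, and repeat. The pure-Python toy below illustrates that general idea only; it is not the DSMP codec itself (dictionary design, quantization, and entropy coding are omitted, and all parameter values are assumptions).

```python
import math

def damped_sinusoid(n, freq, damp, phase=0.0):
    """Unit-norm damped sinusoid atom: exp(-damp*t) * cos(2*pi*freq*t)."""
    atom = [math.exp(-damp * t) * math.cos(2 * math.pi * freq * t + phase)
            for t in range(n)]
    norm = math.sqrt(sum(a * a for a in atom)) or 1.0
    return [a / norm for a in atom]

def mp_encode(signal, dictionary, n_atoms=3):
    """Greedy matching pursuit: repeatedly select the dictionary atom with
    the largest (absolute) correlation to the residual and subtract it."""
    residual = list(signal)
    code = []
    for _ in range(n_atoms):
        best_i, best_c = max(
            ((i, sum(r * a for r, a in zip(residual, atom)))
             for i, atom in enumerate(dictionary)),
            key=lambda pair: abs(pair[1]))
        code.append((best_i, best_c))
        atom = dictionary[best_i]
        residual = [r - best_c * a for r, a in zip(residual, atom)]
    return code, residual
```

Because each atom is itself a physically meaningful waveform (a decaying oscillation), the `(index, coefficient)` pairs stay interpretable, which is the property the abstract highlights.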
In this paper, we study the sensitivity of centrality metrics, key measures in social networks, to support visual reasoning. As centrality represents the prestige or importance of a node in a network, its sensitivity represents the importance of the relationships between that node and all other nodes in the network. We derive an analytical solution that extracts the sensitivity as the derivative of centrality with respect to degree for two centrality metrics based on feedback and random walks. We show that these sensitivities are good indicators of the distribution of centrality in the network, and of how changes are expected to propagate if we modify the network. These metrics also help us simplify a complex network in a way that retains the main structural properties and yields trustworthy, readable diagrams. Sensitivity is also a key concept for uncertainty analysis of social networks, and we show how our approach may help analysts gain insight into the robustness of key network metrics. Through a number of examples, we illustrate the need for measuring sensitivity, and the impact it has on the visualization of, and interaction with, social and other scale-free networks.
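The paper derives the sensitivity analytically; purely as an illustration of the quantity involved, the sketch below approximates the derivative of Katz (feedback) centrality with respect to an edge weight by finite differences. The graph, parameters, and function names are assumptions for the example, not the authors' formulation.

```python
def katz_centrality(adj, alpha=0.1, iters=200):
    """Katz (feedback) centrality by fixed-point iteration x = 1 + alpha*A*x,
    which converges when alpha < 1/spectral_radius(A)."""
    n = len(adj)
    x = [1.0] * n
    for _ in range(iters):
        x = [1.0 + alpha * sum(adj[i][j] * x[j] for j in range(n))
             for i in range(n)]
    return x

def centrality_sensitivity(adj, i, j, alpha=0.1, eps=1e-6):
    """Finite-difference stand-in for d(centrality)/d(w_ij): how much each
    node's Katz score moves when the (undirected) edge (i, j) is
    strengthened slightly."""
    base = katz_centrality(adj, alpha)
    pert = [row[:] for row in adj]
    pert[i][j] += eps
    pert[j][i] += eps  # symmetric update for an undirected graph
    bumped = katz_centrality(pert, alpha)
    return [(b - a) / eps for a, b in zip(base, bumped)]
```

On a path graph 0-1-2, strengthening edge (0, 1) lifts every node's score (feedback propagates the change), with the endpoints of the edited edge moving most, which matches the intuition in the abstract that sensitivity reveals how changes propagate.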
Recent advances in Wireless Mesh Networks (WMNs) have overcome the drawbacks of traditional wired and ad-hoc networks, and they are now seen as a means of allowing last-mile communications with quality-level assurance in Future Multimedia Systems. However, new routing schemes are needed to provide end-to-end Quality of Service (QoS) and Quality of Experience (QoE) support for delay-, loss-, and jitter-sensitive multimedia applications. The well-known OLSR (Optimized Link State Routing) protocol with the ETX (Expected Transmission Count) metric brings many benefits to the path selection process, but has a drawback with regard to queue availability management, which reduces system performance. This problem arises when OLSR-ETX control messages are exchanged and the queues of mesh routers along the end-to-end communication path are overloaded. As a result, multimedia packets suffer from loss, delay, and jitter, and overall system performance decreases. This paper proposes the Optimized Link State Routing-Fuzzy ETX Queue (OLSR-FEQ) protocol to overcome the limitations of OLSR-ETX regarding queue availability and QoS/QoE assurance. OLSR-FEQ optimizes network- and user-based parameters by coordinating queue availability, QoS, and fuzzy issues in the routing decision process as a way of allocating the best paths for multimedia applications. Performance evaluations were carried out with the Network Simulator (NS-2.34) to show the benefits of the proposed solution compared with existing routing schemes, namely OLSR-ETX, OLSR-FLC, OLSR-MD, and HWMP (IEEE 802.11s standard), with regard to QoS (unsuccessful packet delivery and throughput) and QoE (PSNR, SSIM, VQM and MOS) parameters.
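The abstract does not specify the fuzzy rule base, but the general idea of fuzzifying ETX and queue availability into a single routing cost can be sketched as follows; the membership functions, thresholds, and weights here are invented for illustration only and are not OLSR-FEQ's actual rules.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_link_cost(etx, queue_occupancy):
    """Combine ETX and queue occupancy (0..1) into one scalar link cost.
    Illustrative rule base: a link is 'bad' to the degree that its ETX is
    high OR its queue is nearly full, and 'good' otherwise."""
    high_etx = min(1.0, max(0.0, (etx - 1.0) / 3.0))  # ETX 1 -> 0, ETX >= 4 -> 1
    full_queue = tri(queue_occupancy, 0.5, 1.0, 1.5)  # ramps up past 50% occupancy
    bad = max(high_etx, full_queue)                   # fuzzy OR
    good = 1.0 - bad
    # Weighted (Sugeno-style) defuzzification to a scalar cost.
    return good * 1.0 + bad * 10.0
```

A routing protocol in this style would feed such per-link costs into shortest-path selection, so congested-but-low-ETX links no longer look artificially attractive.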
This work presents a study of RTP multiplexing schemes, which are compared with the normal use of RTP in terms of experienced quality. Bandwidth saving, latency, and packet loss are studied for the different options, and tests of Voice over IP (VoIP) traffic are carried out in order to compare the quality obtained using different implementations of the router buffer. Voice quality is estimated using the ITU-T R-factor, a widely accepted quality estimator. The tests show the bandwidth savings of multiplexing, and also the importance of packet size for certain buffers, as latency and packet loss may be affected. The improvement in the customer's experience is measured, showing that multiplexing can be attractive in some scenarios, such as an enterprise with different offices connected via the Internet. The system is also tested using different numbers of samples per packet, and the distribution of the flows into different tunnels is found to be an important factor in achieving optimal perceived quality for each kind of buffer. Grouping all the flows into a single tunnel is not always the best solution, as increasing the number of flows does not improve bandwidth efficiency indefinitely. If the buffer penalizes big packets, it is better to group the flows into a number of tunnels. The router's processing capacity also has to be taken into account, as the limit of packets per second it can manage must not be exceeded. The results show that multiplexing is a good way to improve the customer's experience of VoIP in scenarios where many RTP flows share the same path.
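The R-factor mentioned above comes from the ITU-T G.107 E-model. A simplified version, reduced to the delay and packet-loss impairments, can be computed as follows; the codec parameters `ie`/`bpl` default to values commonly quoted for G.711, and this is a sketch, not a full G.107 implementation.

```python
def r_factor(delay_ms, loss_pct, ie=0.0, bpl=4.3):
    """Simplified E-model: R = R0 - Id - Ie_eff, keeping only the
    one-way-delay impairment Id and the effective equipment impairment
    Ie_eff for random packet loss (ie/bpl are codec-dependent)."""
    r0 = 93.2
    id_ = 0.024 * delay_ms
    if delay_ms > 177.3:
        id_ += 0.11 * (delay_ms - 177.3)
    ie_eff = ie + (95 - ie) * loss_pct / (loss_pct + bpl)
    return r0 - id_ - ie_eff

def mos(r):
    """Map an R-factor to an estimated Mean Opinion Score."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)
```

Because multiplexing changes packet sizes and queueing behavior, it shifts `delay_ms` and `loss_pct`, which is how the trade-offs discussed in the abstract show up directly in the R-factor.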
Model-based testing focuses on testing techniques that rely on the use of models. The diversity of systems and software to be tested implies the need for research on a variety of models and methods for test automation. We briefly review this research area and introduce several papers selected from the 22nd International Conference on Testing Software and Systems (ICTSS).
Spam has become a major issue in computer security because it is a channel for threats such as computer viruses, worms, and phishing. More than 86% of received e-mails are spam. Historical approaches to combating these messages, including simple techniques such as sender blacklisting or the use of e-mail signatures, are no longer completely reliable. Many current solutions feature machine-learning algorithms trained using statistical representations of the terms that most commonly appear in such e-mails. However, these methods are merely syntactic and are unable to account for the underlying semantics of terms within messages. In this paper, we explore the use of semantics in spam filtering by introducing a pre-processing step of Word Sense Disambiguation (WSD). Based upon this disambiguated representation, we apply several well-known machine-learning models and show that the proposed method can detect the internal semantics of spam messages. 相似文献
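The WSD pre-processing step can be illustrated with a simplified Lesk algorithm: each ambiguous term is replaced by a sense-tagged token chosen by gloss overlap with its context, so a downstream bag-of-words classifier sees `bank#finance` rather than the ambiguous `bank`. The toy sense inventory and helper names below are assumptions for illustration, not the paper's pipeline, which presumably uses a full WSD system.

```python
def lesk(word, context, senses):
    """Simplified Lesk WSD: pick the sense whose gloss shares the most
    words with the surrounding context."""
    ctx = set(w.lower() for w in context)
    glosses = senses[word]
    return max(glosses,
               key=lambda s: len(ctx & set(glosses[s].lower().split())))

def disambiguate(tokens, senses):
    """Replace each ambiguous term with a sense-tagged token, using a
    small window of neighboring words as the disambiguation context."""
    out = []
    for i, tok in enumerate(tokens):
        if tok in senses:
            context = tokens[max(0, i - 3):i] + tokens[i + 1:i + 4]
            out.append(tok + "#" + lesk(tok, context, senses))
        else:
            out.append(tok)
    return out
```

After this step, the statistical term representations the abstract describes are built over sense-tagged tokens, so semantically different uses of the same surface word no longer collapse into one feature.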
In response to K. Danzinger's (see record 1986-00068-001) suggestion that the first use of the term subject in the English-language psychological literature occurred in 1886 in the context of experiments involving the hypnotic state, the present author points out that there are examples of the use of the term in discussions of experiments on thought transference published by the Society for Psychical Research in the 1880s. (8 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)
Iranian Polymer Journal - The present work focuses on the assessment of the ability of porcine plasma protein (PPP) to be electrospun satisfactorily to form fibre mats, and their rheological and... 相似文献