11.
Over the past decade process mining has emerged as a new analytical discipline able to answer a variety of questions based on event data. Event logs have a very particular structure: events have timestamps, refer to activities and resources, and need to be correlated to form process instances. Process mining results tend to be very different from classical data mining results; for example, process discovery may yield end-to-end process models capturing different perspectives rather than decision trees or frequent patterns. A process mining tool like ProM provides hundreds of different techniques, ranging from discovery and conformance checking to filtering and prediction. Typically, a combination of techniques is needed and, for every step, there are different techniques that may be very sensitive to parameter settings. Moreover, event logs may be huge and may need to be decomposed and distributed for analysis. These aspects make it very cumbersome to analyze event logs manually; process mining should be repeatable and automated. We therefore propose a framework to support the analysis of process mining workflows. Existing scientific workflow systems and data mining tools are not tailored towards process mining and the artifacts used for analysis (process models and event logs). This paper structures the basic building blocks needed for process mining and describes various analysis scenarios. Based on these requirements we implemented RapidProM, a tool supporting scientific workflows for process mining. Examples illustrating the different scenarios are provided to show the feasibility of the approach.
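The particular event-log structure the abstract describes (timestamped events referring to activities and resources, correlated into process instances) can be illustrated with a minimal sketch; the log below is invented toy data, not from the paper:

```python
from collections import defaultdict
from datetime import datetime

# Toy event log: each event carries a case id, an activity, a resource,
# and a timestamp (the three ingredients the abstract names).
events = [
    {"case": "c1", "activity": "register", "resource": "alice", "ts": "2024-01-01T09:00"},
    {"case": "c2", "activity": "register", "resource": "bob",   "ts": "2024-01-01T09:05"},
    {"case": "c1", "activity": "check",    "resource": "bob",   "ts": "2024-01-01T10:00"},
    {"case": "c1", "activity": "approve",  "resource": "carol", "ts": "2024-01-01T11:00"},
    {"case": "c2", "activity": "check",    "resource": "alice", "ts": "2024-01-01T09:30"},
]

def to_traces(events):
    """Correlate events into process instances (traces), ordered by timestamp."""
    cases = defaultdict(list)
    for e in events:
        cases[e["case"]].append(e)
    return {
        cid: [e["activity"]
              for e in sorted(evs, key=lambda e: datetime.fromisoformat(e["ts"]))]
        for cid, evs in cases.items()
    }

traces = to_traces(events)
print(traces)  # {'c1': ['register', 'check', 'approve'], 'c2': ['register', 'check']}
```

Such traces are the input shape from which discovery techniques then derive end-to-end process models.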
12.
Distributed Online Social Networks (DOSN) are a valid alternative to OSN, based on peer-to-peer communications. Without centralised data management, DOSN must provide users with a higher level of control over their personal information and privacy. Thus, users may wish to restrict their personal network to a limited set of peers, depending on their level of trust in them. This means that the effective social network (used for information exchange) may be a subset of the complete social network and may present different structural patterns, which could limit information diffusion. In this paper, we estimate the capability of DOSN to diffuse content based on trust between social peers. To have a realistic representation of an OSN friendship graph, we consider a large-scale Facebook network, from which we estimate the trust level between friends. We then consider only social links above a certain threshold of trust, and we analyse the potential capability of the resulting graph to spread information through several structural indices. We test four possible thresholds, coinciding with the definition of personal social circles derived from sociology and anthropology. The results show that limiting the network to “active social contacts” leads to a graph with high network connectivity, where the nodes are still well connected to each other; information can thus potentially cover a large number of nodes with respect to the original graph. On the other hand, the coverage drops for more restrictive assumptions. Nevertheless, the re-insertion of a single excluded friend for each user is sufficient to obtain good coverage (i.e., always higher than 40%) even in the most restricted graphs. We also analyse the potential capability of the network to spread information (i.e., network spreadability), studying the properties of the social paths between any pair of users in the graph, which represent the effective channels traversed by information. The contact frequency between pairs of users determines a decay of trust along the path (the higher the contact frequency, the lower the decay), and a consequent decay in the trustworthiness of information traversing the path. We show that selecting the link to re-insert with probability proportional to its level of trust is the best re-insertion strategy, as it leads to the best connectivity/spreadability combination.
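The core operation (dropping links below a trust threshold and measuring how much of the graph information can still reach) can be sketched as follows; the five-node graph and trust values are illustrative, not the paper's Facebook data:

```python
from collections import deque

def coverage_after_threshold(edges, trust, threshold, n_nodes):
    """Fraction of nodes reachable from node 0 once links below the
    trust threshold are dropped (a simple proxy for information coverage)."""
    adj = {v: [] for v in range(n_nodes)}
    for (u, v) in edges:
        if trust[(u, v)] >= threshold:      # keep only trusted links
            adj[u].append(v)
            adj[v].append(u)
    seen, queue = {0}, deque([0])           # BFS from an arbitrary seed
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return len(seen) / n_nodes

edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
trust = {(0, 1): 0.9, (1, 2): 0.6, (2, 3): 0.2, (3, 4): 0.8}
print(coverage_after_threshold(edges, trust, 0.5, 5))  # 0.6: only {0, 1, 2} reachable
```

Raising the threshold disconnects node 3 and 4 here, which mirrors the coverage drop the abstract reports for more restrictive trust assumptions; re-inserting the dropped (2, 3) link would restore full coverage.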
13.
The ever accelerating pace of technology has driven increasing interest in heat transfer solutions and process engineering innovations in the microfluidics domain. Reliable heat transfer diagnostic techniques are necessary to carry out such developments. Thermo-liquid crystal (TLC) thermography, in combination with particle image velocimetry, has for several decades been a widely accepted and commonly used technique for the simultaneous measurement and characterization of temperature and velocity fields in macroscopic fluid flows. However, low seeding density, volume illumination, and low TLC particle image quality at high magnifications present substantial challenges to its application to three-dimensional flows with microscopic dimensions. In this work, a measurement technique to evaluate the color response of individual non-encapsulated TLC particles is presented. A Shirasu porous glass membrane emulsification approach was used to produce non-encapsulated TLC particles with a narrow size distribution. A multi-variable calibration procedure, making use of all three RGB and HSI color components as well as the proper orthogonally decomposed RGB components, was used to achieve unprecedentedly low uncertainty levels in the temperature estimation of individual particles, opening the door to simultaneous temperature and velocity tracking using 3D velocimetry techniques.
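The principle behind TLC calibration (the particle's color shifts with temperature, so a regression maps color to temperature) can be sketched with a single-variable stand-in for the paper's multi-variable procedure. The RGB/temperature calibration points below are invented for illustration:

```python
import colorsys

# Hypothetical calibration points: per-particle mean RGB (components in
# [0, 1]) recorded at known plate temperatures in °C. TLC hue shifts
# monotonically from red through green to blue as temperature rises.
calib = [
    ((0.80, 0.20, 0.15), 30.0),   # reddish  -> low temperature
    ((0.25, 0.75, 0.20), 32.0),   # greenish -> mid temperature
    ((0.15, 0.25, 0.80), 34.0),   # bluish   -> high temperature
]

def hue(rgb):
    h, _, _ = colorsys.rgb_to_hsv(*rgb)
    return h

# Least-squares line T ≈ a*hue + b; the paper instead regresses on all
# RGB/HSI components plus POD-decomposed RGB components.
hs = [hue(rgb) for rgb, _ in calib]
ts = [t for _, t in calib]
n = len(calib)
mh, mt = sum(hs) / n, sum(ts) / n
a = (sum((h - mh) * (t - mt) for h, t in zip(hs, ts))
     / sum((h - mh) ** 2 for h in hs))
b = mt - a * mh

def temperature(rgb):
    """Estimate the temperature of a particle from its measured color."""
    return a * hue(rgb) + b
```

Using all color components instead of hue alone is what drives the reported reduction in per-particle uncertainty.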
14.
RFID technology is becoming ever more popular in the development of ubiquitous computing applications. Fully exploiting the potential of RFID requires the study and implementation of human–computer interaction (HCI) modalities that support wide usability by the target audience. This implies the need for programming methodologies specifically dedicated to easy and efficient prototyping of applications, so that feedback can be gathered from early tests with users. On the basis of our field-working experience, we have designed oDect, a high-level, language- and platform-independent application programming interface (API) designed specifically to meet the needs of typical applications for mobile devices (smartphones and PDAs). oDect aims to let application developers create prototypes focusing on the needs of the final users, without having to care about the low-level software that interacts with the RFID hardware. Further, following an end-user development (EUD) approach, oDect provides specific support for the application end-user herself to cope with typical problems of RFID applications in detecting objects. We describe the features of the API in detail and discuss the findings of a test with four programmers, in which we analyse and evaluate the use of the API in four sample applications. We also present the results of an end-user test, which investigated the strengths and weaknesses of the territorial agenda (TA) concept. The TA is an RFID-based citizen guide that aids users in their daily activities in a city through time- and location-based reminders. The TA directly exploits the EUD features of oDect, in particular the possibility of linking detected objects with custom actions. Copyright © 2008 John Wiley & Sons, Ltd.
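The EUD idea of letting a user link a detected tagged object to a custom action can be sketched as a callback registry. This is a hypothetical illustration of the concept only; it is not oDect's actual API, whose class and method names are not given in the abstract:

```python
# Hypothetical sketch (NOT oDect's real interface) of a high-level
# detection API that hides the low-level RFID reader behind callbacks.
class ObjectDetector:
    def __init__(self):
        self._actions = {}          # tag id -> user-defined action

    def on_detect(self, tag_id, action):
        """Let the end user link a tagged object to a custom action (EUD)."""
        self._actions[tag_id] = action

    def feed(self, tag_id):
        """Called by the (here simulated) reader layer when a tag is sensed."""
        handler = self._actions.get(tag_id)
        return handler(tag_id) if handler else None

detector = ObjectDetector()
detector.on_detect("tag:keys", lambda t: f"reminder: you picked up {t}")
print(detector.feed("tag:keys"))  # reminder: you picked up tag:keys
```

The point of such an interface is the one the abstract makes: the prototype author writes only the action, never the reader-polling code.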
15.
Stereo Light Probe
In this paper we present a practical, simple and robust method to acquire the spatially-varying illumination of a real-world scene. The basic idea of the proposed method is to acquire the radiance distribution of the scene using high-dynamic-range images of two reflective balls. The use of two light probes instead of a single one makes it possible to estimate not only the direction and intensity of the light sources but also their actual position in space. To achieve this robustly, we first rectify the two input spherical images; then, using a region-based stereo matching algorithm, we establish correspondences and compute the position of each light. The radiance distribution so obtained can be used for augmented reality applications, photo-realistic rendering and accurate estimation of reflectance properties. The accuracy and effectiveness of the method have been tested by measuring the computed light positions and rendering a synthetic version of a real object in the same scene. A comparison with the standard method that uses a simple spherical lighting environment is also shown.
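Once a correspondence between the two probes yields two rays toward the same light, its 3D position follows by triangulation. A minimal sketch (assuming known probe centres and recovered ray directions, with invented numbers) is the standard midpoint-of-closest-approach construction:

```python
# Estimate a light's 3D position as the midpoint of closest approach of
# the two rays recovered from the two reflective spheres.
def _sub(a, b): return tuple(x - y for x, y in zip(a, b))
def _add(a, b): return tuple(x + y for x, y in zip(a, b))
def _mul(a, s): return tuple(x * s for x in a)
def _dot(a, b): return sum(x * y for x, y in zip(a, b))

def triangulate(p1, d1, p2, d2):
    """Rays: p1 + t*d1 and p2 + s*d2 (directions need not be unit length)."""
    w0 = _sub(p1, p2)
    a, b, c = _dot(d1, d1), _dot(d1, d2), _dot(d2, d2)
    d, e = _dot(d1, w0), _dot(d2, w0)
    denom = a * c - b * b                  # zero iff the rays are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = _add(p1, _mul(d1, t))             # closest point on ray 1
    q2 = _add(p2, _mul(d2, s))             # closest point on ray 2
    return _mul(_add(q1, q2), 0.5)

# Two probes 1 m apart, light actually at (0.5, 1.0, 2.0):
print(triangulate((0, 0, 0), (0.5, 1.0, 2.0), (1, 0, 0), (-0.5, 1.0, 2.0)))
# (0.5, 1.0, 2.0)
```

With noisy correspondences the two rays do not quite intersect, which is why the midpoint (rather than an exact intersection) is the robust choice.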
16.
Artificial intelligent actuators are being extensively explored for emerging applications such as soft robots, human-machine interfaces, and biomedical devices. However, intelligent actuating systems based on synthesized polymers face challenges in renewability, sustainability, and safety, while natural polymer-based actuators show limited capabilities and performance due to the presence of abundant hydrogen-bond lockers. Here, this study reports a new hydrogen bond-mediated strategy to develop mimosa-inspired starch actuators (SA). By harnessing the unique features of gelatinization and abundant hydrogen bonds, these SA enable high-sensitivity, multi-responsive actuation in various scenarios. The non-gelatinized SA can be irreversibly programmed into diverse shapes, such as artificial flowers, bowl shapes, and helix structures, using near-infrared light. Furthermore, the gelatinized SA exhibit reversible multi-responsive actuation when exposed to low humidity (10.2%), low temperature (37 °C), or low-energy light (0.42 W cm−2). More importantly, the SA demonstrate robust applications in smart living, including an artificial mimosa, an intelligent lampshade, and morphing food. By overcoming the hydrogen-bond lockers inherent in natural polymers, SA open new avenues for next-generation recyclable materials and actuators, bringing them closer to practical applications.
17.
CAD-based shape optimization for gas turbine component design
In order to improve product characteristics, engineering design makes increasing use of Robust Design and Multidisciplinary Design Optimisation. Common to both methodologies is the need to vary the object’s shape and to assess the resulting change in performance, both executed within an automatic loop. This shape change can be realised by modifying the parameter values of a suitably parameterised Computer Aided Design (CAD) model. This paper presents the adopted methodology and the results achieved when performing optimisation of a gas turbine disk. Our approach to hierarchical modelling employing design tables is presented, with methods to ensure satisfactory geometry variation by commercial CAD systems. The conducted studies included stochastic and probabilistic design optimisation. To solve the multi-objective optimisation problem, a Pareto optimum criterion was used. The results demonstrate that a CAD-centric approach enables significant progress towards automating the entire process while achieving a higher-quality product with reduced susceptibility to manufacturing imperfections.
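The Pareto optimum criterion mentioned above keeps only the candidate designs not dominated in every objective. A minimal sketch for two minimized objectives (the design names and numbers are invented, not from the paper):

```python
# Pareto criterion for two objectives, both minimized: a design survives
# if no other design is at least as good in both and strictly better in one.
def pareto_front(designs):
    """designs: list of (name, mass, stress); returns non-dominated names."""
    front = []
    for name, m, s in designs:
        dominated = any(
            (m2 <= m and s2 <= s) and (m2 < m or s2 < s)
            for _, m2, s2 in designs
        )
        if not dominated:
            front.append(name)
    return front

candidates = [
    ("disk_A", 10.0, 300.0),
    ("disk_B", 12.0, 250.0),   # heavier, but lower stress: a real trade-off
    ("disk_C", 11.0, 320.0),   # worse than disk_A in both objectives
]
print(pareto_front(candidates))  # ['disk_A', 'disk_B']
```

The optimisation loop then presents the surviving trade-off set to the designer rather than a single "best" disk.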
18.
Climate change is likely to have a profound effect on many agricultural variables, although the extent of its influence will vary over the course of the annual farm management cycle. Consequently, the effects of different and interconnected physical, technical and economic factors must be modeled in order to estimate the effects of climate change on agricultural productivity. Such modeling commonly makes use of indicators that summarize the main environmental factors considered when farmers plan their activities. This study uses net evapotranspiration (ETN), estimated using EPIC, as a proxy index for the physical factors considered by farmers when managing irrigation. Recent trends suggest that the probability distribution function of ETN may continue to change in the near future due to changes in the irrigation needs of crops. Water availability may also continue to vary due to changes in the rainfall regime. The impacts on costs of the uncertainties related to these changes are evaluated using a Discrete Stochastic Programming model representing an irrigable Mediterranean area where limited water is supplied from a reservoir. In this context, adaptation to climate change can best be supported by improvements to the collective irrigation systems, rather than by measures aimed at individual farms such as those contained within the rural development policy.
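The structure of a discrete stochastic program of this kind can be sketched in miniature: a first-stage decision (how much area to irrigate) is taken before the water-availability scenario is known, and the objective is the expected cost over the discrete scenarios. All numbers below are illustrative, not the paper's data:

```python
# Toy two-stage discrete stochastic program: choose irrigated area before
# knowing water availability; minimize expected cost over scenarios.
scenarios = [            # (probability, available water in Mm3)
    (0.3, 120.0),        # wet year
    (0.5, 90.0),         # normal year
    (0.2, 60.0),         # dry year
]
WATER_NEED = 1.0         # Mm3 of irrigation water per unit area (ETN proxy)
SHORTAGE_COST = 5.0      # penalty per Mm3 of unmet demand
PROFIT = 2.0             # profit per irrigated unit area

def expected_cost(area):
    cost = 0.0
    for p, water in scenarios:
        demand = area * WATER_NEED
        shortage = max(0.0, demand - water)          # second-stage recourse
        cost += p * (shortage * SHORTAGE_COST - PROFIT * area)
    return cost

# Enumerate the discrete first-stage decisions (DSP solvers do this exactly).
best = min(range(0, 151, 10), key=expected_cost)
print(best)  # 90: irrigate up to the normal-year supply, hedging the dry year
```

The optimum stops at the normal-year supply because, beyond it, the probability-weighted shortage penalty (0.7 × 5 = 3.5 per unit) exceeds the marginal profit of 2.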
19.
In this work we present a new algorithm for accelerating the colour bilateral filter based on a subsampling strategy working in the spatial domain. The basic idea is to use a suitable subset of samples of the entire kernel in order to obtain a good estimate of the exact filter values. The main advantages of the proposed approach are that it offers an excellent trade-off between visual quality and speed-up, requires very low memory overhead, and is straightforward to implement on the GPU, allowing real-time filtering. We show different applications of the proposed filter, in particular efficient cross-bilateral filtering, real-time edge-aware image editing and fast video denoising. We compare our method against the state of the art in terms of image quality, time performance and memory usage.
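The subsampling idea can be sketched in one dimension on a grayscale signal: evaluate the bilateral weights on a random subset of the spatial kernel taps instead of all of them. This is a simplified stand-in (random subset, 1-D, grayscale) for the paper's colour-image scheme:

```python
import math, random

def bilateral_subsampled(signal, radius, sigma_s, sigma_r, n_samples, seed=0):
    """1-D bilateral filter evaluated on a random subset of kernel taps."""
    rng = random.Random(seed)
    offsets = list(range(-radius, radius + 1))
    out = []
    for i, center in enumerate(signal):
        taps = rng.sample(offsets, min(n_samples, len(offsets)))
        num = den = 0.0
        for o in taps:
            j = min(max(i + o, 0), len(signal) - 1)      # clamp at borders
            # spatial weight * range weight, as in the full bilateral filter
            w = (math.exp(-o * o / (2 * sigma_s ** 2)) *
                 math.exp(-(signal[j] - center) ** 2 / (2 * sigma_r ** 2)))
            num += w * signal[j]
            den += w
        out.append(num / den)
    return out

# A noisy step edge: subsampled filtering smooths each side yet keeps the
# edge sharp, because the range weight suppresses cross-edge taps.
edge = [0, 1, 0, 1, 0, 100, 101, 100, 101, 100]
smoothed = bilateral_subsampled(edge, 3, 2.0, 10.0, 4)
```

Using 4 of the 7 taps per pixel nearly halves the work here; the paper's contribution is choosing the subset so the estimate stays close to the exact filter output.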
20.
In this work, an innovative procedure for real-time heat transfer modelling and dimensional change estimation during meat cooking is presented. Using a multipoint temperature probe, the temperature of the slowest heating point (SHP) was calculated at each instant from the radial temperature distribution inside the product. Experimental temperature data from the multipoint probe were combined with a previously validated mathematical algorithm to perform a real-time estimation of the SHP temperature and residual cooking time on the basis of the data stored at each instant. The procedure and algorithm were validated by cooking pork loin and roast beef samples at 180 and 200 °C under both natural and forced convection regimes. Real-time predicted cooking time and SHP endpoint temperature values were very close to those obtained experimentally: at 85% of the cooking process, the maximum percentage errors for SHP endpoint temperature and cooking time prediction were 1.72% and 1.67%, respectively. In addition, the SHP location inside the meat samples was obtained at each time instant and used to estimate dimensional changes during cooking: the calculated final characteristic dimensions were very similar to those obtained experimentally for all cooking trials. The developed approach could be useful for automatic planning of cooking operations in food service with microbial safety assurance.
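The real-time idea can be sketched very simply: at each instant, take the coolest junction along the probe as the SHP and extrapolate its recent heating rate to the target core temperature. The readings below are invented, and the linear extrapolation is a stand-in for the paper's validated model:

```python
def residual_time(radial_temps_now, radial_temps_prev, dt, target):
    """Locate the SHP (coolest probe junction) and estimate the remaining
    cooking time by linearly extrapolating its recent heating rate.

    radial_temps_*: temperatures (°C) along the probe; dt: seconds between
    the two readings; target: desired SHP endpoint temperature (°C).
    """
    shp_idx = min(range(len(radial_temps_now)),
                  key=radial_temps_now.__getitem__)
    t_now = radial_temps_now[shp_idx]
    rate = (t_now - radial_temps_prev[shp_idx]) / dt    # °C per second
    if rate <= 0:
        return None                                      # not heating yet
    return shp_idx, (target - t_now) / rate

prev = [55.0, 48.0, 45.0, 49.0, 56.0]    # probe readings 60 s ago
now  = [58.0, 51.0, 47.0, 52.0, 59.0]    # current readings
idx, secs = residual_time(now, prev, 60.0, 72.0)
print(idx, round(secs))  # 2 750
```

Tracking which junction index holds the minimum over time is also what lets the procedure follow the SHP location as the sample shrinks during cooking.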