Article Search
Subscription full text: 290 articles
Free access: 13 articles
Free access (domestic): 1 article
Subject — Industrial Technology: 304 articles
By year: 2023 (4), 2022 (2), 2021 (13), 2020 (4), 2019 (8), 2018 (6), 2017 (11), 2016 (9), 2015 (16), 2014 (9), 2013 (13), 2012 (15), 2011 (19), 2010 (18), 2009 (16), 2008 (25), 2007 (14), 2006 (18), 2005 (9), 2004 (7), 2003 (9), 2002 (11), 2001 (2), 2000 (2), 1999 (6), 1998 (4), 1997 (2), 1996 (5), 1995 (1), 1994 (5), 1993 (1), 1992 (4), 1991 (1), 1990 (3), 1986 (2), 1985 (2), 1983 (1), 1982 (1), 1975 (2), 1974 (1), 1972 (2), 1966 (1)
304 results found (search time: 20 ms)
1.
Gamut mapping deals with the need to adjust a color image to fit into the constrained color gamut of a given rendering medium. A typical use for this tool is the reproduction of a color image prior to printing, such that it best exploits the given printer/medium color gamut, namely the colors the printer can produce on the given medium. Most classical gamut mapping methods involve a pixel-by-pixel mapping and ignore the spatial color configuration. Recently proposed space-dependent approaches to gamut mapping are either based on heuristic assumptions or involve a high computational cost. In this paper, we present a new variational approach for space-dependent gamut mapping. Our treatment starts with the presentation of a new measure for the problem, closely related to a measure recently proposed for Retinex. We also link our method to recent measures that attempt to couple spectral and spatial perceptual measures. It is shown that the gamut mapping problem leads to a quadratic programming formulation, which is guaranteed to have a unique solution if the gamut of the target device is convex. An efficient numerical solution is proposed, with promising results.
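The abstract above does not state the exact variational functional, so the following is only a hedged sketch of the quadratic-programming view it describes: a per-pixel fidelity term is coupled with a crude spatial (gradient) term and minimized by projected gradient descent onto a convex, box-shaped gamut. The box bounds `lo`/`hi`, the weight `lam`, and the step size are illustrative assumptions, not the paper's formulation.

```python
# A minimal, illustrative sketch (not the paper's exact functional): spatially
# coupled gamut mapping as a box-constrained quadratic program, solved by
# projected gradient descent. The box gamut and all parameters are hypothetical.
import numpy as np

def gamut_map(channel, lo=0.1, hi=0.9, lam=0.5, iters=500, step=0.05):
    """Map one color channel into [lo, hi] while penalizing changes to
    local differences (a crude stand-in for a spatial fidelity term)."""
    x = np.clip(channel, lo, hi)              # feasible starting point
    gx, gy = np.gradient(channel)             # target spatial structure
    for _ in range(iters):
        dx, dy = np.gradient(x)
        # Gradient of 0.5*||x - c||^2 + 0.5*lam*||grad x - grad c||^2;
        # the divergence approximates the adjoint of the gradient operator.
        div = np.gradient(dx - gx, axis=0) + np.gradient(dy - gy, axis=1)
        grad = (x - channel) - lam * div
        x = np.clip(x - step * grad, lo, hi)  # project onto the convex box gamut
    return x

if __name__ == "__main__":
    img = np.random.rand(64, 64)              # toy single-channel "image"
    mapped = gamut_map(img)
    print(mapped.min(), mapped.max())         # stays inside the box gamut
```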
2.
Conventional access methods cannot be used effectively in large scientific/statistical database (SSDB) applications. A file structure, called the bit transposed file (BTF), is proposed which offers several attractive features better suited to the special characteristics that SSDBs exhibit. This file structure is an extreme version of the (attribute) transposed file: the data are stored as vertical bit partitions. The bit patterns of attributes are assigned using one of several data encoding methods, each appropriate for different query types. The bit partitions can also be compressed using a version of run-length encoding. Efficient operators on compressed bit vectors have been developed and form the basis of a query language. Because of the simplicity of the file structure and query language, optimization problems for database design, query evaluation, and common-subexpression removal can be formalized, and efficient exact or near-optimal solutions can be achieved. In addition to its selective power with low overhead for SSDBs, the BTF is also amenable to special parallel hardware. Results from experiments with the file structure suggest that this approach may be a reasonable alternative file structure for large SSDBs.
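As a hedged illustration of the bit-transposition idea described above (not the paper's exact design or encodings), the sketch below stores one attribute as vertical bit planes and answers an equality query purely with bitwise operations on those planes; the plain binary encoding and all names are assumptions.

```python
# A hedged, minimal sketch of the bit-transposed idea: each attribute value is
# split into bit planes stored column-wise, and an equality query is answered
# by AND/NOT-combining those planes.
import numpy as np

def transpose_bits(values, n_bits):
    """Return a list of bit planes; plane b holds bit b of every record."""
    v = np.asarray(values, dtype=np.uint64)
    return [((v >> b) & 1).astype(bool) for b in range(n_bits)]

def equals(planes, target, n_bits):
    """Bitmap of records whose value equals `target`, using only bit planes."""
    mask = np.ones_like(planes[0])
    for b in range(n_bits):
        bit = (target >> b) & 1
        mask &= planes[b] if bit else ~planes[b]
    return mask

if __name__ == "__main__":
    ages = [3, 7, 3, 1, 7, 2]
    planes = transpose_bits(ages, n_bits=3)
    print(np.nonzero(equals(planes, 7, n_bits=3))[0])  # -> [1 4]
```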
3.
This paper describes a technique for runtime monitoring (RM) and runtime verification (RV) of systems with invisible events and data artifacts. Our approach combines well-known hidden Markov model (HMM) techniques for learning and subsequent identification of hidden artifacts with runtime monitoring of probabilistic formal specifications. The proposed approach entails a process in which the end user first develops and validates deterministic formal specification assertions and then identifies the hidden artifacts in those assertions. Those artifacts induce the state set of the identifying HMM, whose parameters are learned using standard frequency-analysis techniques. In the verification or monitoring phase, the system emits visible events and data symbols, which the HMM uses to deduce invisible events and data symbols, and sequences thereof; both types of symbols are then used by a probabilistic formal specification assertion to monitor or verify the system.
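A hedged sketch of the monitoring-phase flow described above, under simplifying assumptions: HMM parameters are taken as given rather than learned, hidden artifacts are decoded from visible symbols with the standard Viterbi algorithm, and a toy deterministic assertion stands in for a probabilistic formal specification. States, symbols, probabilities, and the assertion are all illustrative.

```python
# Hedged sketch: Viterbi-decode hidden artifacts from visible symbols, then
# evaluate a simple assertion over the decoded hidden sequence.
import numpy as np

states = ["idle", "critical"]          # hidden artifacts
symbols = {"ping": 0, "alarm": 1}      # visible events

start = np.array([0.8, 0.2])
trans = np.array([[0.9, 0.1],          # P(next hidden | current hidden)
                  [0.3, 0.7]])
emit = np.array([[0.95, 0.05],         # P(visible | hidden)
                 [0.40, 0.60]])

def viterbi(obs):
    """Most likely hidden-state sequence for the observed symbol indices."""
    n, m = len(obs), len(states)
    logp = np.full((n, m), -np.inf)
    back = np.zeros((n, m), dtype=int)
    logp[0] = np.log(start) + np.log(emit[:, obs[0]])
    for t in range(1, n):
        for j in range(m):
            cand = logp[t - 1] + np.log(trans[:, j])
            back[t, j] = np.argmax(cand)
            logp[t, j] = cand[back[t, j]] + np.log(emit[j, obs[t]])
    path = [int(np.argmax(logp[-1]))]
    for t in range(n - 1, 0, -1):
        path.append(back[t, path[-1]])
    return [states[i] for i in reversed(path)]

def assertion_holds(hidden):
    """Toy stand-in for a spec assertion: never two consecutive 'critical' states."""
    return all(not (a == b == "critical") for a, b in zip(hidden, hidden[1:]))

if __name__ == "__main__":
    visible = [symbols[s] for s in ["ping", "alarm", "alarm", "ping"]]
    hidden = viterbi(visible)
    print(hidden, "assertion holds:", assertion_holds(hidden))
```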
4.
High-performance polymers are an important class of materials used in challenging conditions, such as aerospace applications. Until now, 3D printing of such polymers by stereolithography processes has not been possible due to a lack of suitable materials. Here, new materials and printing compositions are reported that enable 3D printing of objects with extremely high thermal resistance, a Tg of 283 °C, and excellent mechanical properties. The printing is performed with a low-cost digital light processing printer, and the formulation is based on a dual-cure mechanism combining a photo and a thermal process. The main components are a molecule bearing both epoxy and acrylate groups, an alkylated melamine that enables a high degree of crosslinking, and a soluble precursor of silica. The resulting objects are made of hybrid materials, in which the silicon is present partly in the polymeric backbone and partly as silica reinforcement particles.
5.
6.
Certain behavioral properties of distributed systems are difficult to express in interleaving semantics, whereas they are naturally expressed in terms of partial orders of events or, equivalently, Mazurkiewicz traces. Two examples of such properties are serializability of a database and global snapshots of concurrent systems. Recently, a modest extension of LTL with an operator that expresses snapshots has been proposed. It combines the ease of linear (interleaving) specification with this useful partial-order concept. The new construct allows one to assert that a global snapshot appeared in the past, perhaps not in the observed execution sequence, but possibly in an equivalent one.
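A small worked illustration of the underlying idea (generic notation; the paper's concrete operator syntax is not given in the abstract): with two independent events, a global snapshot can belong to the trace even though it never occurs in the observed interleaving.

```latex
% Illustration only; notation is generic, not the paper's operator.
% Let $a$ be an event of process $P_1$ and $b$ an event of process $P_2$,
% with $(a,b)$ independent, so the interleavings $ab$ and $ba$ form one
% Mazurkiewicz trace:
\[
  ab \;\equiv_{\mathrm{tr}}\; ba .
\]
% The observed run $ab$ never passes through the global state
% $s$ = ($P_1$ not started, $P_2$ done), yet the equivalent interleaving
% $ba$ does; hence
\[
  s \in \mathrm{Snapshots}\bigl([ab]_{\equiv}\bigr)
  \quad\mbox{although}\quad
  s \notin \mathrm{States}(ab),
\]
% which is the kind of assertion a snapshot construct makes expressible
% within a linear-time formula.
```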
7.
This paper presents a framework for augmenting independent verification and validation (IV&V) of software systems with computer-based IV&V techniques. The framework allows an IV&V team to capture its own understanding of the application, as well as the expected behavior of any proposed system for solving the underlying problem, by means of an executable system reference model that uses formal assertions to specify mission- and safety-critical behaviors. The framework uses execution-based model checking to validate the correctness of the assertions and to verify the correctness and adequacy of the system under test.
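As a hedged, toy illustration of execution-based checking against an executable reference model (not the framework itself), the sketch below exhaustively executes all bounded command sequences of a tiny model and evaluates one safety-critical assertion on every reached state; the model, commands, depth bound, and assertion are invented for illustration.

```python
# Hedged toy sketch: executable reference model + assertion, checked by
# executing every bounded command sequence and testing the assertion on
# each reached state.
from itertools import product

COMMANDS = ["arm", "fire", "disarm"]

def step(state, cmd):
    """Executable reference model: (armed, fired) transition function."""
    armed, fired = state
    if cmd == "arm":
        return (True, fired)
    if cmd == "disarm":
        return (False, fired)
    if cmd == "fire":
        return (armed, fired or armed)   # firing only takes effect when armed
    return state

def assertion(state):
    """Toy mission/safety assertion: never 'fired' while disarmed."""
    armed, fired = state
    return armed or not fired

def check(depth=3):
    """Execution-based check: run all command sequences up to `depth`."""
    for seq in product(COMMANDS, repeat=depth):
        state = (False, False)
        trace = []
        for cmd in seq:
            state = step(state, cmd)
            trace.append((cmd, state))
            if not assertion(state):
                return ("violated", trace)
    return ("holds up to depth", depth)

if __name__ == "__main__":
    print(check())   # reports the trace arm, fire, disarm as a violation
```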
8.
Personalization technologies offer powerful tools for enhancing the user experience in a wide variety of systems, but at the same time raise new privacy concerns. For example, systems that personalize advertisements according to the physical location of the user, or according to the user's friends' search history, introduce new privacy risks that may discourage wide adoption of personalization technologies. This article analyzes the privacy risks associated with several current and prominent personalization trends, namely social-based personalization, behavioral profiling, and location-based personalization. We survey user attitudes towards privacy and personalization, as well as technologies that can help reduce privacy risks. We conclude with a discussion that frames risks and technical solutions at the intersection of personalization and privacy, along with areas for further investigation. This framework can help designers and researchers contextualize the privacy challenges of proposed solutions when designing personalization systems.
9.
This paper presents RM-BCS, a novel run-time monitoring (RM) technique that uses periodic bounded constraint solving (BCS) to monitor streaming systems with undetermined artifacts, such as agent roles and message classification. BCS is used to discover maps of interest, mapping a streaming collection of messages to a domain of discourse that consists of a set of known agents, roles, and message meanings; the desired map is one in which an underlying first-order logic constraint is satisfied. We demonstrate the technique on the problem of detecting an Internet-based terrorist-attack coordination session within frames of streaming data, such as a Twitter stream. The proposed technique constitutes a third class of RM techniques, the first two being RM of deterministic observation streams and RM of probabilistic observation streams.
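A hedged, toy sketch of the bounded-constraint-solving step described above (not RM-BCS itself): over a bounded window of streamed messages, enumerate all mappings from agents to roles and return one under which a first-order-style constraint holds. The agents, roles, message classes, and constraint are illustrative assumptions.

```python
# Hedged toy sketch: search for a role assignment over a bounded domain that
# satisfies a constraint over a window of streamed, classified messages.
from itertools import product

AGENTS = ["a1", "a2", "a3"]
ROLES = ["coordinator", "follower"]

# Bounded window of (sender, message-class) observations.
WINDOW = [("a1", "order"), ("a2", "ack"), ("a3", "ack"), ("a1", "order")]

def constraint(assignment):
    """Every 'order' is sent by a coordinator, every 'ack' by a follower,
    and there is exactly one coordinator."""
    ok_msgs = all(
        (assignment[sender] == "coordinator") == (kind == "order")
        for sender, kind in WINDOW
    )
    one_coord = sum(r == "coordinator" for r in assignment.values()) == 1
    return ok_msgs and one_coord

def solve():
    """Enumerate all role maps over the bounded domain; return a satisfying one."""
    for roles in product(ROLES, repeat=len(AGENTS)):
        assignment = dict(zip(AGENTS, roles))
        if constraint(assignment):
            return assignment
    return None

if __name__ == "__main__":
    print(solve())  # -> {'a1': 'coordinator', 'a2': 'follower', 'a3': 'follower'}
```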
10.
The paper focuses on mining clusters that are characterized by a lagged relationship between the data objects. We call such clusters lagged co-clusters. A lagged co-cluster of a matrix is a submatrix determined by a subset of rows and their corresponding lags over a subset of columns. Extracting such subsets may reveal an underlying governing regulatory mechanism, which is quite common in real-life settings and appears in a variety of fields: meteorology, seismic activity, stock-market behavior, neuronal brain activity, river flow, and navigation, to name only a few. Mining such lagged co-clusters not only helps in understanding the relationships between objects in the domain, but also assists in forecasting their future behavior. For most interesting variants of this problem, finding an optimal lagged co-cluster is an NP-complete problem. We present a polynomial-time Monte-Carlo algorithm for mining lagged co-clusters and prove that, with fixed probability, the algorithm mines a lagged co-cluster which encompasses the optimal lagged co-cluster with at most a factor-of-two overhead in columns and no overhead in rows. Moreover, the algorithm handles noise, anti-correlations, missing values, and overlapping patterns. The algorithm is extensively evaluated using both artificial and real-world test environments: the former enables the evaluation of specific, isolated properties of the algorithm, while the latter (river flow and topographic data) enables evaluation of the algorithm's ability to efficiently mine relevant and coherent lagged co-clusters in temporal (i.e., time-reading) and non-temporal environments.
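A hedged, highly simplified sketch in the spirit of the Monte-Carlo approach described above (it does not implement the paper's algorithm or carry its approximation guarantee): repeatedly sample a seed row, align every other row to it by the lag that minimizes squared error over a fixed column window, and keep the largest set of well-aligned rows. Window width, lag range, threshold, and the agreement score are assumptions.

```python
# Hedged sketch of a Monte-Carlo search for a lagged co-cluster: sample seed
# rows, align other rows by their best lag, keep well-agreeing rows.
import numpy as np

rng = np.random.default_rng(0)

def best_lag(seed, row, max_lag, width):
    """Lag in [-max_lag, max_lag] minimizing mean squared distance to the seed."""
    scores = []
    for lag in range(-max_lag, max_lag + 1):
        a = seed[max_lag:max_lag + width]
        b = row[max_lag + lag:max_lag + lag + width]
        scores.append((np.mean((a - b) ** 2), lag))
    return min(scores)

def sample_lagged_cocluster(data, width=20, max_lag=5, thresh=0.05, trials=50):
    """Monte-Carlo search: return (seed row index, {row: lag}) of the best trial."""
    best = None
    for _ in range(trials):
        seed_idx = rng.integers(len(data))
        seed = data[seed_idx]
        members = {}
        for i, row in enumerate(data):
            err, lag = best_lag(seed, row, max_lag, width)
            if err < thresh:
                members[i] = lag
        if best is None or len(members) > len(best[1]):
            best = (seed_idx, members)
    return best

if __name__ == "__main__":
    base = np.sin(np.linspace(0, 6 * np.pi, 60))
    data = np.stack([np.roll(base, lag) + 0.01 * rng.standard_normal(60)
                     for lag in (0, 2, 4, 0, 2)] +
                    [rng.standard_normal(60) for _ in range(5)])
    print(sample_lagged_cocluster(data))
```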