Full-text access type
Paid full text | 3404 articles |
Free | 291 articles |
Free (domestic) | 11 articles |
Subject classification
Industrial technology | 3706 articles |
Publication year
2024 | 6 articles |
2023 | 45 articles |
2022 | 92 articles |
2021 | 262 articles |
2020 | 126 articles |
2019 | 128 articles |
2018 | 147 articles |
2017 | 126 articles |
2016 | 178 articles |
2015 | 144 articles |
2014 | 193 articles |
2013 | 288 articles |
2012 | 231 articles |
2011 | 268 articles |
2010 | 186 articles |
2009 | 212 articles |
2008 | 183 articles |
2007 | 121 articles |
2006 | 136 articles |
2005 | 111 articles |
2004 | 80 articles |
2003 | 69 articles |
2002 | 59 articles |
2001 | 43 articles |
2000 | 31 articles |
1999 | 28 articles |
1998 | 31 articles |
1997 | 21 articles |
1996 | 32 articles |
1995 | 18 articles |
1994 | 11 articles |
1993 | 11 articles |
1992 | 13 articles |
1991 | 9 articles |
1990 | 3 articles |
1989 | 6 articles |
1988 | 2 articles |
1987 | 8 articles |
1986 | 5 articles |
1985 | 4 articles |
1984 | 7 articles |
1983 | 6 articles |
1982 | 4 articles |
1981 | 3 articles |
1980 | 3 articles |
1979 | 2 articles |
1978 | 2 articles |
1977 | 5 articles |
1976 | 2 articles |
1975 | 3 articles |
Sorted by: 3706 results in total; search took 671 ms
63.
Fabio Poiesi, Riccardo Mazzon, Andrea Cavallaro 《Computer Vision and Image Understanding》2013,117(10):1257-1272
We propose a generic online multi-target track-before-detect (MT-TBD) method applicable to confidence maps used as observations. The proposed tracker is based on particle filtering and automatically initializes tracks. The main novelty is the inclusion of the target ID in the particle state, enabling the algorithm to deal with an unknown and large number of targets. To overcome the problem of mixing the IDs of targets close to each other, we propose a probabilistic model of target birth and death based on a Markov Random Field (MRF) applied to the particle IDs. Each particle ID is managed using the information carried by neighboring particles. The assignment of IDs to targets is performed using Mean-Shift clustering and supported by a Gaussian Mixture Model. We also show that the computational complexity of MT-TBD is proportional only to the number of particles. To compare our method with recent state-of-the-art works, we include a post-processing stage suited for multi-person tracking. We validate the method on real-world and crowded scenarios, and demonstrate its robustness in scenes presenting different perspective views and targets very close to each other.
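The abstract's key idea, carrying a target ID inside each particle's state and then recovering per-target support by grouping particles by ID, can be sketched as follows. This is an illustrative simplification, not the authors' implementation: the class and field names are hypothetical, and the simple ID count stands in for the Mean-Shift/GMM assignment the paper describes.

```python
import random
from collections import Counter

# Each particle carries a position, a weight, and a target ID (the paper's
# main novelty: the ID is part of the particle state itself).
class Particle:
    def __init__(self, x, y, target_id, weight=1.0):
        self.x, self.y = x, y
        self.target_id = target_id
        self.weight = weight

def dominant_ids(particles):
    """Group particles by ID and report the support per target; a crude
    stand-in for the Mean-Shift clustering supported by a GMM."""
    return dict(Counter(p.target_id for p in particles))

# Two targets: ID 1 backed by 6 particles, ID 2 by 3 particles.
particles = [Particle(random.random(), random.random(), tid)
             for tid in (1, 1, 2) for _ in range(3)]
print(dominant_ids(particles))  # {1: 6, 2: 3}
```

Because each particle owns its ID, birth and death of targets reduce to creating or retiring IDs over the particle population, which is what the MRF model in the abstract regularizes.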
64.
Anja Le Blanc, John Brooke, Donal Fellows, Marco Soldati, David Pérez-Suárez, Alessandro Marassi, Andrej Santin 《Journal of Grid Computing》2013,11(3):481-503
In this paper we describe how we have introduced workflows into the working practices of a community for whom the concept of workflows is very new, namely the heliophysics community. Heliophysics is a branch of astrophysics which studies the Sun and the interactions between the Sun and the planets by tracking solar events as they travel through the Solar system. Heliophysics poses two major challenges for workflow technology. Firstly, it is a systems science in which research is currently developed by many different communities who need reliable data models and metadata to be able to work together; thus it poses major challenges in the semantics of workflows. Secondly, the problem of time is critical in heliophysics: the workflows must take account of the propagation of events outwards from the Sun, and they have to address the four-dimensional nature of space and time in the indexing of data. We discuss how we have built an environment for heliophysics workflows, building on and extending the Taverna workflow system and utilising the myExperiment site for sharing workflows. We also describe how we have integrated the workflows into the existing practices of the communities involved by developing a web portal which hides the technical details from the users, who can concentrate on the data from their scientific point of view rather than on the methods used to integrate and process the data. This work has been developed in the EU Framework 7 project HELIO and is being disseminated to the worldwide heliophysics community, since heliophysics requires integration of effort on a global scale.
65.
Francesco Bellotti, Riccardo Berta, Massimiliano Margarone, Alessandro De Gloria 《Software》2008,38(12):1241-1259
RFID technology is becoming ever more popular in the development of ubiquitous computing applications. A full exploitation of the RFID potential requires the study and implementation of human–computer interaction (HCI) modalities able to support wide usability by the target audience. This implies the need for programming methodologies specifically dedicated to supporting the easy and efficient prototyping of applications, so as to obtain feedback from early tests with users. On the basis of our field-working experience, we have designed oDect, a high-level, language- and platform-independent application programming interface (API) designed ad hoc to meet the needs of typical applications for mobile devices (smart phones and PDAs). oDect aims at allowing application developers to create their prototypes focusing on the needs of the final users, without having to care about the low-level software that interacts with the RFID hardware. Further, in an end-user development (EUD) approach, oDect provides specific support for the application end-user herself to cope with typical problems of RFID applications in detecting objects. We describe in detail the features of the API and discuss the findings of a test with four programmers, where we analyse and evaluate the use of the API in four sample applications. We also present results of an end-user test, which investigated strengths and weaknesses of the territorial agenda (TA) concept. The TA is an RFID-based citizen guide that aids users in their daily activities in a city through time- and location-based reminders. The TA directly exploits the EUD features of oDect, in particular the possibility of linking detected objects with custom actions. Copyright © 2008 John Wiley & Sons, Ltd.
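The abstract does not give oDect's actual interface, but the general shape of such a high-level detection API, where the application registers a custom action per object class and never touches the reader, can be sketched as follows. Every name here is hypothetical and purely illustrative of the EUD idea of "linking detected objects with custom actions".

```python
# Hypothetical sketch of an oDect-style high-level detection API: the app
# registers a handler per object class; the low-level RFID layer dispatches
# tag reads to it. None of these names come from the paper.
class ObjectDetector:
    def __init__(self):
        self._handlers = {}

    def on_detect(self, object_class, handler):
        """Link a detected-object class to a custom action (the EUD feature
        the abstract mentions for the territorial agenda)."""
        self._handlers[object_class] = handler

    def _tag_read(self, object_class, tag_id):
        # Invoked by the low-level RFID layer on a tag read; the application
        # code above never sees the hardware directly.
        handler = self._handlers.get(object_class)
        return handler(tag_id) if handler else None

d = ObjectDetector()
d.on_detect("bus_stop", lambda tag: f"reminder: line schedule for {tag}")
print(d._tag_read("bus_stop", "tag-42"))  # reminder: line schedule for tag-42
```

The point of the indirection is the one the abstract makes: prototype authors work against object classes and actions, while reader-specific code stays below the API boundary.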
66.
Boosting text segmentation via progressive classification
Eugenio Cesario, Francesco Folino, Antonio Locane, Giuseppe Manco, Riccardo Ortale 《Knowledge and Information Systems》2008,15(3):285-320
A novel approach for reconciling tuples stored as free text into an existing attribute schema is proposed. The basic idea is to subject the available text to progressive classification, i.e., a multi-stage classification scheme where, at each intermediate stage, a classifier is learnt that analyzes the textual fragments not reconciled at the end of the previous steps. Classification is accomplished by an ad hoc exploitation of traditional association mining algorithms, and is supported by a data transformation scheme which takes advantage of domain-specific dictionaries/ontologies. A key feature is the capability of progressively enriching the available ontology with the results of the previous stages of classification, thus significantly improving the overall classification accuracy. An extensive experimental evaluation shows the effectiveness of our approach.
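The progressive-classification loop described above can be sketched as follows: each stage trains or applies a classifier and only the fragments it leaves unreconciled flow to the next stage. This is a minimal illustration; the per-stage classifiers here are trivial stand-ins for the paper's association-rule classifiers.

```python
# Sketch of progressive classification: fragments not reconciled at stage i
# are the only input to stage i+1.
def progressive_classify(fragments, stages):
    """stages: list of callables mapping a fragment to an attribute label,
    or None when the fragment cannot be reconciled at that stage."""
    assigned = {}
    remaining = list(fragments)
    for classify in stages:
        still_open = []
        for frag in remaining:
            label = classify(frag)
            if label is not None:
                assigned[frag] = label       # reconciled at this stage
            else:
                still_open.append(frag)      # deferred to the next stage
        remaining = still_open
    return assigned, remaining

# Two toy stages over address fragments (illustrative rules only).
stage1 = lambda f: "city" if f.istitle() else None
stage2 = lambda f: "zip" if f.isdigit() else None
assigned, left = progressive_classify(["Rome", "00184", "via appia"], [stage1, stage2])
print(assigned, left)  # {'Rome': 'city', '00184': 'zip'} ['via appia']
```

The ontology-enrichment step the abstract highlights would feed `assigned` back into the rules used by later stages; that feedback is omitted here for brevity.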
67.
In this paper we consider the p-ary transitive reduction (TRp) problem, where p>0 is an integer; for p=2 this problem arises in inferring a sparsest possible (biological) signal transduction network consistent with a set of experimental observations, with a goal to minimize false-positive inferences even at the risk of false negatives. Special cases of TRp have been investigated before in different contexts; the best previous results are as follows:
(1) | The minimum equivalent digraph problem, which corresponds to a special case of TR1 with no critical edges, is known to be MAX-SNP-hard, admits a polynomial-time algorithm with an approximation ratio of 1.617+ε for any constant ε>0 (Chiu and Liu in Sci. Sin. 4:1396–1400, 1965), and can be solved in linear time for directed acyclic graphs (Aho et al. in SIAM J. Comput. 1(2):131–137, 1972). |
(2) | A 2-approximation algorithm exists for TR1 (Frederickson and JàJà in SIAM J. Comput. 10(2):270–283, 1981; Khuller et al. in 19th Annual ACM-SIAM Symposium on Discrete Algorithms, pp. 937–938, 1999). |
In this paper, our contributions are as follows:
• | We observe that TRp, for any integer p>0, can be solved in linear time for directed acyclic graphs using the ideas in Aho et al. (SIAM J. Comput. 1(2):131–137, 1972). |
• | We provide a 1.78-approximation for TR1 that improves the 2-approximation mentioned in (2) above. |
• | We provide a 2+o(1)-approximation for TRp on general graphs for any fixed prime p>1. |
R. Albert’s research was partly supported by a Sloan Research Fellowship in Science and Technology. B. DasGupta’s research was partly supported by NSF grants DBI-0543365, IIS-0612044 and IIS-0346973. E. Sontag’s research was partly supported by NSF grants EIA 0205116 and DMS-0504557.
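To make the DAG case concrete: in a DAG, an edge (u, v) is redundant exactly when v is reachable from u through some other path, so the transitive reduction is unique and easy to compute. The sketch below illustrates this characterization with a simple O(V·E) reachability check; it is not the optimized linear-time variant cited above (Aho et al.), but it produces the same reduction.

```python
# Transitive reduction of a DAG, given as an adjacency dict u -> [v, ...]:
# keep edge (u, v) only if v is NOT reachable from u when that edge is skipped.
def transitive_reduction(adj):
    def reachable(src, dst, skip_edge):
        stack, seen = [src], {src}
        while stack:
            u = stack.pop()
            if u == dst:
                return True
            for w in adj.get(u, []):
                if (u, w) != skip_edge and w not in seen:
                    seen.add(w)
                    stack.append(w)
        return False

    return {u: [v for v in vs if not reachable(u, v, (u, v))]
            for u, vs in adj.items()}

# a->c is redundant because a->b->c already exists.
dag = {"a": ["b", "c"], "b": ["c"], "c": []}
print(transitive_reduction(dag))  # {'a': ['b'], 'b': ['c'], 'c': []}
```

On general (cyclic) graphs the reduction is no longer unique, which is precisely why the TR1 and TRp variants above need approximation algorithms.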
68.
A vital part of a modern economy is an information market. In this market, information products are being traded in countless ways. Information is bought, modified, integrated, incorporated into other products, and then sold again. Often, the manufacturing of an information product requires the collaboration of several participants. A virtual enterprise is a community of business entities that collaborate on the manufacturing of complex products. This collaboration is often ad hoc, for a specific product only, after which the virtual enterprise may dismantle. The virtual enterprise paradigm is particularly appealing for modeling collaborations for manufacturing information products, and in this paper we present a new model, called VirtuE, for modeling such activities. VirtuE has three principal components. First, it defines a distributed infrastructure with concepts such as members, products, inventories, and production plans. Second, it defines transactions among members, to enable collaborative production of complex products. Finally, it provides means for the instrumentation of enterprises, to measure their performance and to govern their behavior.
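The first two components, members with product inventories and transactions among them, can be illustrated with a minimal data-model sketch. All names are hypothetical; VirtuE's actual model is surely richer than this.

```python
from dataclasses import dataclass, field

# Illustrative stand-ins for VirtuE's "members" and "inventories".
@dataclass
class Member:
    name: str
    inventory: dict = field(default_factory=dict)  # product -> quantity

def transact(seller, buyer, product, qty=1):
    """A transaction between members: the primitive that enables
    collaborative production of composite information products."""
    if seller.inventory.get(product, 0) < qty:
        raise ValueError("seller lacks product")
    seller.inventory[product] -= qty
    buyer.inventory[product] = buyer.inventory.get(product, 0) + qty

a = Member("data-provider", {"raw-feed": 2})
b = Member("integrator")
transact(a, b, "raw-feed")
print(a.inventory, b.inventory)  # {'raw-feed': 1} {'raw-feed': 1}
```

The third component, instrumentation, would amount to logging calls like `transact` so the enterprise's performance can be measured and governed.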
69.
Roberto Bruttomesso, Alessandro Cimatti, Anders Franzen, Alberto Griggio, Roberto Sebastiani 《Annals of Mathematics and Artificial Intelligence》2009,55(1-2):63-99
Most state-of-the-art approaches for Satisfiability Modulo Theories $(SMT(\mathcal{T}))$ rely on the integration between a SAT solver and a decision procedure for sets of literals in the background theory $\mathcal{T} (\mathcal{T}{\text {-}}solver)$ . Often $\mathcal{T}$ is the combination $\mathcal{T}_1 \cup \mathcal{T}_2$ of two (or more) simpler theories $(SMT(\mathcal{T}_1 \cup \mathcal{T}_2))$ , s.t. the specific ${\mathcal{T}_i}{\text {-}}solvers$ must be combined. Up to a few years ago, the standard approach to $SMT(\mathcal{T}_1 \cup \mathcal{T}_2)$ was to integrate the SAT solver with one combined $\mathcal{T}_1 \cup \mathcal{T}_2{\text {-}}solver$ , obtained from two distinct ${\mathcal{T}_i}{\text {-}}solvers$ by means of evolutions of Nelson and Oppen’s (NO) combination procedure, in which the ${\mathcal{T}_i}{\text {-}}solvers$ deduce and exchange interface equalities. Nowadays many state-of-the-art SMT solvers use evolutions of a more recent $SMT(\mathcal{T}_1 \cup \mathcal{T}_2)$ procedure called Delayed Theory Combination (DTC), in which each ${\mathcal{T}_i}{\text {-}}solver$ interacts directly and only with the SAT solver, in such a way that part or all of the (possibly very expensive) reasoning effort on interface equalities is delegated to the SAT solver itself. In this paper we present a comparative analysis of DTC vs. NO for $SMT(\mathcal{T}_1 \cup \mathcal{T}_2)$ . On the one hand, we explain the advantages of DTC in exploiting the power of modern SAT solvers to reduce the search. On the other hand, we show that the extra amount of Boolean search required of the SAT solver can be controlled.
In fact, we prove two novel theoretical results, for both convex and non-convex theories and for different deduction capabilities of the ${\mathcal{T}_i}{\text {-}}solvers$ , which relate the amount of extra Boolean search required of the SAT solver by DTC with the number of deductions and case-splits required of the ${\mathcal{T}_i}{\text {-}}solvers$ by NO in order to perform the same tasks: (i) under the same hypotheses of deduction capabilities of the ${\mathcal{T}_i}{\text {-}}solvers$ required by NO, DTC causes no extra Boolean search; (ii) using ${\mathcal{T}_i}{\text {-}}solvers$ with limited or no deduction capabilities, the extra Boolean search required can be reduced down to a negligible amount by controlling the quality of the $\mathcal{T}$ -conflict sets returned by the ${\mathcal{T}_i}{\text {-}}solvers$ .
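The core of the DTC idea, treating interface equalities as Boolean atoms that the SAT solver decides while each theory solver only checks consistency, can be caricatured as follows. This is a toy under heavy simplification: real DTC interleaves these checks inside a CDCL loop with conflict-set learning, whereas here the "SAT search" is brute-force enumeration and the theory solvers are plain predicates.

```python
from itertools import product

# Toy DTC: the Boolean engine enumerates truth values for the interface
# equalities; each theory solver is consulted only as a consistency check.
def dtc_search(interface_eqs, t1_consistent, t2_consistent):
    """Return an assignment to the interface equalities accepted by both
    theory solvers, or None if every assignment is rejected (unsat)."""
    for values in product([True, False], repeat=len(interface_eqs)):
        assignment = dict(zip(interface_eqs, values))
        if t1_consistent(assignment) and t2_consistent(assignment):
            return assignment
    return None

# Toy "theories" over equalities e1 = (x=y), e2 = (y=z):
# T1 entails x=y; T2 forbids x=y and y=z holding together.
t1 = lambda a: a["e1"]
t2 = lambda a: not (a["e1"] and a["e2"])
print(dtc_search(["e1", "e2"], t1, t2))  # {'e1': True, 'e2': False}
```

In Nelson-Oppen, by contrast, the theory solvers themselves would deduce and exchange `e1` and `e2`; DTC shifts that case-splitting onto the Boolean engine, which is exactly the trade-off the two theorems above quantify.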
70.
Real-time inverse dynamics control of parallel manipulators using general-purpose multibody software
This work deals with the problem of computing the inverse dynamics of complex constrained mechanical systems for real-time control applications. The main goal is the control of robotic systems using model-based schemes in which the inverse model itself is obtained using general-purpose multibody software, exploiting the redundant-coordinate formalism. The resulting control scheme is essentially equivalent to the classical computed torque control commonly used in robotics applications. This work proposes to use modern general-purpose multibody software to compute the inverse dynamics of complex rigid mechanisms efficiently, so that it also suits the requirements of realistic real-time applications. This task can be very difficult, since it involves a higher number of equations than the relative-coordinates approach, which in turn is believed to be less general and may suffer from topology limitations. The use of specialized linear algebra solvers makes this kind of control algorithm usable in real time for mechanism models of realistic complexity. Numerical results from the simulation of practical applications are presented, consisting of a “delta” robot and a bio-mimetic 11-degrees-of-freedom manipulator controlled using the same software and the same algorithm.
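The computed-torque scheme the abstract refers to has the standard form tau = M(q)(q̈_d + Kd·ė + Kp·e) + h(q, q̇), where the inverse model M and the bias term h would come from the multibody software. A single-joint sketch (gains and model values purely illustrative):

```python
# Computed torque control for one joint: feedback-linearize with the inverse
# model, then impose linear error dynamics via PD gains.
def computed_torque(q, qd, q_des, qd_des, qdd_des, M, h, Kp=100.0, Kd=20.0):
    e = q_des - q        # position tracking error
    de = qd_des - qd     # velocity tracking error
    return M(q) * (qdd_des + Kd * de + Kp * e) + h(q, qd)

# 1-DOF example with constant inertia and a gravity-like bias term; in the
# paper's setting these would be evaluated by the multibody code instead.
M = lambda q: 2.0
h = lambda q, qd: 9.81
tau = computed_torque(q=0.0, qd=0.0, q_des=0.1, qd_des=0.0, qdd_des=0.0, M=M, h=h)
print(tau)  # 2.0 * (100.0 * 0.1) + 9.81 = 29.81
```

For a parallel manipulator in redundant coordinates, `M` and `h` become large constrained systems, which is why the abstract stresses specialized linear algebra solvers to keep the evaluation within a real-time budget.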