Search results: 3695 matches found (search time: 15 ms)
51.
Increasingly powerful computers and a growing emphasis on evidence-based decision making are creating demand for merging and integrating data from different sources into a single data set. The demand for data is outstripping our ability to ensure data integrity, and analysis is sometimes performed on data that are ill-suited to the purpose at hand. Here we describe problems that arise when data from different sources are merged, and we suggest that ontology-based metadata is one way to add context to data so that users can make informed decisions about their ontological context. Examples of the problem are taken from health data, with emphasis on difficulties in standardizing Emergency Room wait times. We describe eight fields that can be used to capture contextual metadata. These fields are captured using ethnographic methods from users and database stewards, who frequently understand precisely how context and institutional usage have shaped the interpretation of semantic fields. We argue that attaching a portable archive of ontological context to travel with data, based on information from users and developers, is a means of ensuring that data are integrated and compared in multiple contexts with greater integrity and more robust results.
Nadine Schuurman
52.
Faith Ellen Panagiota Fatourou Eleftherios Kosmas Alessia Milani Corentin Travers 《Distributed Computing》2016,29(4):251-277
A universal construction is a general mechanism for obtaining a concurrent implementation of an object from its sequential code. We show that there is no universal construction that is both disjoint-access parallel (guaranteeing that processes operating on different parts of an implemented object do not interfere with one another) and wait-free (guaranteeing progress for each nonfaulty process when accessing an object). In contrast, we present a universal construction which results in disjoint-access parallel, wait-free implementations of any object, provided there is a bound on the number of data items accessed by each operation supported by the object.
53.
Marie‐Laure Bougnol Jose H. Dulá 《International Transactions in Operational Research》2016,23(4):655-668
This paper treats the problem of how to determine weights in a ranking that will cause a selected entity to attain the highest possible position. We establish that there are two types of entities in a ranking scheme: those that can be ranked number one and those that cannot. These two types can be identified using the “ranking hull” of the data: a polyhedral set that envelops the data. Only entities with data points on the boundary of this hull can attain the number one position; no weights will ever make an entity whose data point lies in the interior of the hull attain it. We deal with these two types of entities separately. In the first case, we propose an approach for finding a set of weights that, under special conditions, will result in a selected entity reaching the top of the ranking without ties and without ignoring any of the attributes. For the second category of entities, we devise a procedure that guarantees these entities will attain their highest possible position in the ranking. The first case requires solving a linear program (LP) with interior point methods; the second involves a binary mixed-integer formulation. Both mathematical programs were tested on data from a well-known university ranking.
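The feasibility question behind the ranking hull can be illustrated with a minimal sketch: an entity can be ranked first if and only if some non-negative weight vector makes its weighted score at least as large as every other entity's. The brute-force grid search below is a naive stand-in for the paper's LP-based test, restricted to two attributes; the function name and the sample data are illustrative, not from the paper.

```python
def can_rank_first(data, i, steps=1000):
    """Check, by brute-force search over normalised non-negative weights,
    whether entity i can top a two-attribute ranking.  Entity i can be
    ranked first iff some weight vector makes its score >= all others."""
    for k in range(steps + 1):
        w1 = k / steps          # weight on attribute 1
        w2 = 1.0 - w1           # weights normalised to sum to 1
        scores = [w1 * a + w2 * b for a, b in data]
        if scores[i] >= max(scores):
            return True         # found weights putting entity i on top
    return False                # interior point: no weights suffice

# Three entities; the third lies strictly inside the "ranking hull",
# so no choice of weights can ever rank it number one.
entities = [(1.0, 0.0), (0.0, 1.0), (0.4, 0.4)]
results = [can_rank_first(entities, i) for i in range(3)]
print(results)  # [True, True, False]
```

Only the two boundary entities can reach the top, matching the paper's dichotomy between boundary and interior data points.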
54.
In the cold, Purdue Pegboard (PP) performance declines. The purpose of this study was to determine whether this cold-induced impairment is consistent across days (i.e. test-retest reliability) in 5°C air. In thermoneutral air (25°C), 14 men were familiarised with the dominant-hand (PPa) and bimanual (PPb) PP tasks. They then underwent two 90-min cold exposures (Day 1, Day 2) while wearing ~1 clo; their hands remained bare throughout. Performance on both tasks showed high day-to-day reliability (intraclass correlations >0.700) in both thermoneutral and cold conditions. For both tasks, however, room-temperature performance did not predict performance in the cold (intraclass correlations <0.450). When screening applicants for manual labour in the cold, one must consider that room-temperature dexterity does not correlate with dexterity in the cold. It is recommended that a 60-min period of cold exposure be employed to assess manual dexterity in these workers. STATEMENT OF RELEVANCE: This study shows that PP performance at room temperature does not predict performance in the cold, but performance in the cold is consistent from day to day. When screening applicants for manual labour in the cold, it is recommended that dexterity tests be conducted in the same ambient conditions.
55.
Optimization of Geoscience Laser Altimeter System waveform metrics to support vegetation measurements
The Geoscience Laser Altimeter System (GLAS) has collected over 250 million measurements of vegetation height over forests globally. Accurate vegetation heights can be determined using waveform metrics that include vertical extent and the extents of the waveform's trailing and leading edges. All three indices are highly dependent upon the signal strength, background noise and signal-to-noise ratio of the waveform, as the background-noise contribution must be removed from the waveforms before they are calculated. Over the last six years, GLAS has collected data during thirteen observation periods using illumination from three different lasers. The power levels of these lasers have changed over time, resulting in variable signal power and noise characteristics; atmospheric conditions also vary continuously, further influencing signal power and noise. To minimize these effects, we optimized a noise coefficient, which could be constant or vary according to observation period or noise metric. This coefficient is used with the mean and standard deviation of the background noise to determine a noise-level threshold that is removed from each waveform. An optimization analysis was applied to a global dataset of waveforms that are near-coincident with waveforms from other observation periods; the goal was to minimize the difference in vertical extent between spatially overlapping GLAS observations. Optimizations based on absolute difference in height led to situations in which the total extent was minimized as well; further optimizations reduced a normalized difference in height extent. The simplest optimizations applied a single constant value to all observations; noise coefficients of 2.7, 3.2, 3.4 and 4.0 were determined for datasets consisting of global forests, global vegetation, forest in the legal Amazon basin and boreal forests, respectively. Optimizations based on the power level or the signal-to-noise ratio of the waveforms best minimized differences in waveform extent, decreasing the percent root-mean-squared height difference by 25-54% over the constant-value approach. Further development of methods to ensure temporal consistency of waveform indices will be necessary to support long-term satellite lidar missions and will yield more accurate and precise estimates of canopy height.
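The thresholding step described above can be sketched as follows: estimate the background noise from signal-free bins, set a threshold of mean plus k times the standard deviation (using a noise coefficient k such as the 3.4 reported for Amazon forest), and take the waveform extent as the span of bins that exceed the threshold. The bin values and function name below are illustrative, not data from the paper.

```python
from statistics import mean, pstdev

def bins_above_noise(waveform, noise_bins, k):
    """Apply a noise threshold of mean + k * stddev (estimated from
    signal-free bins) and return indices of bins above it."""
    threshold = mean(noise_bins) + k * pstdev(noise_bins)
    return [i for i, v in enumerate(waveform) if v > threshold]

# Synthetic waveform: a flat noise floor around 1.0 plus one return pulse
noise = [1.0, 1.2, 0.8, 1.0, 1.1, 0.9, 1.0, 1.0]
waveform = noise + [1.0, 3.0, 6.0, 3.0, 1.0, 1.0, 1.0]
signal = bins_above_noise(waveform, noise, k=3.4)
print(signal)                       # indices of the surviving signal bins
print(signal[-1] - signal[0] + 1)   # waveform extent, in bins
```

With a larger k the threshold rises and the extent shrinks, which is why the optimal coefficient in the paper varies with laser power and vegetation type.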
56.
Long-lived renaming allows processes to repeatedly get distinct names from a small name space and release these names. This paper presents two long-lived renaming algorithms in which the name a process gets is bounded above by the number of processes currently occupying a name or performing one of these operations. The first algorithm is asynchronous, uses LL/SC objects, and has step complexity that is linear in the number of processes, c, currently getting or releasing a name. The second is synchronous, uses registers and counters, and has step complexity that is polylogarithmic in c. Both tolerate any number of process crashes.
57.
Marie A. Wright 《Computer Fraud & Security》2001,2001(8):8-10
The need to protect US critical infrastructures has been the primary motivation behind two of the US Federal Government’s security education initiatives: the National Security Agency’s Centers of Academic Excellence in Information Assurance Education (CAE/IAE) programme, and the National Science Foundation’s Federal Cyber Service: Scholarship for Service (SFS) programme. This article assesses the strength of these education programmes with respect to improving the level of critical infrastructure security.
58.
59.
Marcia A. Mardis Ellen S. Hoffman Todd E. Marshall 《International Journal on Digital Libraries》2008,9(1):19-27
Despite a decade of deployment, educational digital libraries have not achieved sustained use in elementary and secondary schools in the United States. Barriers to accessing the Internet and computers have been widely targeted by myriad initiatives, but efforts aimed at bridging this first-level “digital divide” have not led to increased use of the Internet and digital library resources in U.S. classrooms. In fact, such programs have revealed additional divides that affect educators’ use. This paper examines these additional digital-divide levels and proposes a new framework for understanding technology innovation in schools that can improve development and outreach approaches by digital library developers.
60.
Clara Mata Ellen K. Longmire David H. McKenna Katie K. Glass Allison Hubel 《Microfluidics and nanofluidics》2008,5(4):529-540
A recently proposed application of microfluidics is the post-thaw processing of biological cells. Numerical simulations suggest that diffusion-based extraction of the cryoprotective agent dimethyl sulfoxide (DMSO) from blood cells is viable and more efficient than centrifugation, the conventional method of DMSO removal. To validate the theoretical model used in these simulations, a prototype was built and the flow of two parallel streams, a suspension of Jurkat cells containing DMSO and a wash stream containing neither cells nor DMSO, was characterized experimentally. DMSO transport in a rectangular channel (depth 500 μm, width 25 mm, overall length 125 mm) was studied as a function of three dimensionless parameters: the depth ratio of the streams, the cell volume fraction in the cell solution, and the Peclet number (Pe) based on channel depth, average flow rate and the diffusion coefficient of DMSO in water. In our studies, values of Pe ranged from O(10^3) to O(10^4); laminar flow was ensured by keeping the Reynolds number between O(1) and O(10). Experimental results based on visual and quantitative data demonstrate conclusively that a microfluidic device can effectively remove DMSO from liquid and cell-laden streams without compromising cell recovery. Flow conditions in the microfluidic device also appear to have no adverse effect on cell viability at the outlet. Further, the results demonstrate that the amount of DMSO removed from a given device can be predicted with the theoretical model mentioned previously.
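As a rough check on the dimensionless numbers quoted above, Pe = U·h/D and Re = U·h/ν can be computed for the 500 μm channel. The mean velocity, DMSO diffusivity and water viscosity below are assumed textbook-order values, not figures from the paper; with them, Pe and Re land inside the ranges the abstract reports.

```python
def peclet(velocity, depth, diffusivity):
    """Pe = U * h / D: ratio of advective to diffusive DMSO transport."""
    return velocity * depth / diffusivity

def reynolds(velocity, depth, kinematic_viscosity):
    """Re = U * h / nu: ratio of inertial to viscous forces."""
    return velocity * depth / kinematic_viscosity

h = 500e-6   # channel depth, m (from the abstract)
U = 4e-3     # assumed mean flow velocity, m/s
D = 1e-9     # assumed DMSO diffusivity in water, m^2/s (order of magnitude)
nu = 1e-6    # kinematic viscosity of water near 20 degC, m^2/s

Pe = peclet(U, h, D)       # falls in the reported O(10^3)-O(10^4) range
Re = reynolds(U, h, nu)    # falls in the reported O(1)-O(10) laminar range
print(Pe, Re)
```

Because Pe is large while Re stays of order one, the two streams flow side by side without mixing by inertia, and DMSO crosses between them by diffusion alone, which is exactly the regime the device exploits.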