1.
Modeling spatially distributed phenomena in terms of their controlling factors is a recurring problem in geoscience. Most efforts concentrate on predicting the value of a response variable in terms of controlling variables, either through a physical model or a regression model. However, many geospatial systems comprise complex, nonlinear, and spatially non-uniform relationships, making it difficult to even formulate a viable model. This paper focuses on spatial partitioning of controlling variables that are attributed to a particular range of a response variable. Thus, the presented method surveys spatially distributed relationships between predictors and response. The method is based on the association analysis technique of identifying emerging patterns, which is extended in order to be applied more effectively to geospatial data sets. The outcome of the method is a list of spatial footprints, each characterized by a unique "controlling pattern": a list of specific values of predictors that locally correlate with a specified value of the response variable. Mapping the controlling footprints reveals geographic regionalization of the relationship between predictors and response. The data mining underpinnings of the method are given, and its application to a real-world problem is demonstrated using an expository example focusing on determining the variety of environmental associations of high vegetation density across the continental United States.
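The emerging-pattern idea underlying the method can be sketched in a few lines: find combinations of discretized predictor values whose support is much higher among locations with the target response value than elsewhere. The records, attribute names, and thresholds below are illustrative assumptions, not data from the paper:

```python
from itertools import combinations

# Toy discretized records: predictor values plus a binary response label
# (1 = high vegetation density, 0 = low). All field names are invented.
records = [
    ({"precip": "high", "temp": "mild", "soil": "loam"}, 1),
    ({"precip": "high", "temp": "mild", "soil": "clay"}, 1),
    ({"precip": "low",  "temp": "hot",  "soil": "sand"}, 0),
    ({"precip": "low",  "temp": "mild", "soil": "sand"}, 0),
    ({"precip": "high", "temp": "hot",  "soil": "loam"}, 1),
]

def support(pattern, cls):
    """Fraction of class-`cls` records containing every (attr, value) pair."""
    members = [r for r, label in records if label == cls]
    hits = [r for r in members if all(r.get(a) == v for a, v in pattern)]
    return len(hits) / len(members) if members else 0.0

def emerging_patterns(min_growth=2.0):
    """Return predictor patterns far more frequent in class 1 than class 0."""
    items = {(a, v) for r, _ in records for a, v in r.items()}
    found = []
    for size in (1, 2):
        for pattern in combinations(sorted(items), size):
            s1, s0 = support(pattern, 1), support(pattern, 0)
            growth = s1 / s0 if s0 else float("inf")
            if s1 > 0 and growth >= min_growth:
                found.append((pattern, s1, growth))
    return found

for pattern, s1, growth in emerging_patterns():
    print(pattern, f"support={s1:.2f}", f"growth={growth}")
```

Mapping where each surviving pattern holds would then give the spatial footprints the abstract describes.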
2.
Does code decay? Assessing the evidence from change management data
A central feature of the evolution of large software systems is that change, which is necessary to add new functionality, accommodate new hardware, and repair faults, becomes increasingly difficult over time. We approach this phenomenon, which we term code decay, scientifically and statistically. We define code decay and propose a number of measurements (code decay indices), on software and on the organizations that produce it, that serve as symptoms, risk factors, and predictors of decay. Using an unusually rich data set (the fifteen-plus-year change history of the millions of lines of software for a telephone switching system), we find mixed, but on the whole persuasive, statistical evidence of code decay, which is corroborated by developers of the code. Suggestive indications that perfective maintenance can retard code decay are also discussed.
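Simple code decay indices of the kind the abstract mentions can be computed from version-control history. The sketch below tracks two plausible per-year symptoms, mean change span (files touched per change) and the fraction of changes that are fault fixes; the change records are invented for illustration, not drawn from the study:

```python
from statistics import mean

# Hypothetical change history: (year, files_touched, is_fault_fix).
changes = [
    (1990, 2, False), (1990, 1, False), (1992, 3, True),
    (1994, 4, True), (1996, 5, True), (1998, 6, True),
]

def decay_indices(history):
    """Per-year decay symptoms: mean change span and fault-fix fraction.

    Rising values of either index over time would be a symptom of decay:
    changes touch ever more files, and ever more of them are repairs.
    """
    years = sorted({y for y, _, _ in history})
    out = {}
    for y in years:
        rows = [(files, fix) for yy, files, fix in history if yy == y]
        out[y] = (mean(files for files, _ in rows),
                  sum(fix for _, fix in rows) / len(rows))
    return out

indices = decay_indices(changes)
for year, (span, fix_frac) in indices.items():
    print(year, f"span={span:.1f}", f"fault_fix={fix_frac:.0%}")
```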
3.
The GIAO-SCF method for calculating isotropic nuclear magnetic shielding values has been utilized to explain certain features in the 1H-NMR spectrum of 2-methylene-8,8-dimethyl-1,4,6,10-tetraoxaspiro[4.5]decane. Population distributions of the low-energy conformers, based on their ab initio energies, were used to produce weighting factors for the individual calculated shielding values, giving a weighted average of the shielding values over a complete set of conformers. The differences in 1H chemical shifts between the hydrogens of the two methyl groups, and between the axial and equatorial hydrogens, in 2-methylene-8,8-dimethyl-1,4,6,10-tetraoxaspiro[4.5]decane were shown to be due to energy differences between the chair and boat orientations of the six-membered ring and a contribution from a twist-boat conformation. The results suggest the hypothesis that intramolecular differences in chemical shift might be calculated to a greater degree of accuracy than chemical shifts calculated relative to a standard.
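The population-weighted averaging step is a standard Boltzmann weighting over conformer energies. The sketch below shows the arithmetic; the conformer energies and shielding values are invented placeholders, not results from the paper:

```python
import math

# Hypothetical conformers: (relative ab initio energy in kcal/mol,
# calculated isotropic shielding in ppm for one proton).
conformers = {
    "chair":      (0.0, 28.95),
    "boat":       (1.8, 29.40),
    "twist-boat": (1.2, 29.10),
}

R = 0.0019872  # gas constant, kcal/(mol*K)
T = 298.15     # temperature, K

def boltzmann_weighted_shielding(data, temp=T):
    """Weight each conformer's shielding by its population exp(-E/RT)."""
    weights = {name: math.exp(-e / (R * temp)) for name, (e, _) in data.items()}
    z = sum(weights.values())  # partition function over the conformer set
    return sum(weights[n] * data[n][1] for n in data) / z

avg = boltzmann_weighted_shielding(conformers)
print(f"weighted average shielding: {avg:.3f} ppm")
```

Because the lowest-energy conformer dominates the populations at room temperature, the weighted average lands close to its shielding value.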
4.
Eick, S.G. Computer, 1998, 31(10): 63-69
To facilitate Y2K conversions, Bell Laboratories has developed a Y2K visualization tool. The input to the tool is the output of commercially available Cobol parsing tools that identify lines potentially affected by Y2K. While these tools are extremely useful, their output is daunting. Presenting this output visually increases assessment productivity by as much as 80 percent. Visualization also improves conversion quality by suggesting more-informed and efficient repair strategies. The visualization tool is based on the idea that no one view is sufficient to answer the important questions concerning Y2K. It therefore provides a suite of tightly coupled, linked views. Each view is engineered for a particular task, is interactive, and is used for both display and analysis. Linking between views causes interactive operations to propagate instantly in each view. The authors report that the results to date have been promising: in one case, the time required for assessment and conversion strategy development dropped from three weeks to three days.
5.
Analysis and in vitro Reconstitution of Triglycerides of the Siberian Marmot Oil from Marmota bobac. Oils from marmot have been considered antiphlogistic; they are highly unsaturated and reveal a characteristic 2:1 ratio of linolenic to linoleic acid. Here, the triglycerides of Marmota bobac oil were analyzed by reversed-phase HPLC under isocratic conditions. For identification of the peaks, standards were synthesized by lipase-catalyzed interesterification of uniform triglycerides. The four main peaks were identified as linolenoyldioleoyl- and dilinoleoyloleoylglycerols (11.2%), linolenoyloleoylpalmitoyl- and dilinoleoylpalmitoylglycerols (8.3%), trioleoylglycerol (11.2%), and dioleoylpalmitoylglycerol (11.1%). By enzymic interesterification of uniform triglycerides of the four major fatty acids, a synthetic oil with a triglyceride pattern similar to that of the natural oil was prepared.
6.
During 2001 to 2004, a study was conducted to assess the indoor environmental and health impact of installing allergen-reducing interventions in the homes of asthmatic children. Based on the results of a pilot study, conducted to determine an intervention that would provide improved symptom scores and a reduction in house dust mite allergen (Der p 1), mechanical ventilation and heat recovery (MVHR) systems were installed in 16 homes. Environmental and respiratory health assessments were conducted before and after the installation of the MVHR systems. The results indicated that the installation of MVHR systems reduced Der p 1 concentrations in living room carpets and mattresses. There were significant reductions in symptom scores for breathlessness during exercise, wheezing, and coughing during the day and night. Although there was no parallel control group for the main study, the lack of change in the pilot study's control group (which did not receive an intervention) indicated that the changes in symptom scores were at least in part due to the intervention. Larger-scale trials are needed to determine the efficacy of MVHR systems in homes to improve indoor air quality and reduce asthma symptoms.
7.
The objective was to study the photocationic polymerization of an expanding monomer, 1,5,7,11‐tetraoxaspiro[5.5]undecane (TOSU), and an aromatic dioxirane, bisphenol A diglycidyl ether (BADGE). Both homopolymerizations and binary polymerizations were conducted. The homopolymer, poly(TOSU), was found to be a linear poly(carbonate), which was soluble in acetone. Poly(BADGE) products contained ether linkages in addition to primary and secondary alcohol functionalities. Binary polymerization products varied depending on the irradiation time and length of dark cure. 13C‐NMR analysis of binary polymerizate products revealed peaks not seen in homopolymer spectra consistent with the formation of copolymer linkages. Mass spectrometry data revealed peaks consistent with oligomers that contained both TOSU and BADGE mer units. The structures of key reaction products were proposed. © 2004 Wiley Periodicals, Inc. J Appl Polym Sci 92: 62–71, 2004
8.
D-600, the methoxy derivative of verapamil, is said to affect the force of cardiac contraction and the slow inward current (Isi) specifically by reducing the membrane conductance for Ca2+ (gsi). However, it is apparent that many effects of D-600 cannot be adequately explained solely by an effect on gsi. We studied the effects of D-600 on membrane current and tension of cat papillary muscle, using a conventional single sucrose gap voltage clamp technique. The results indicate that D-600 not only reduces the maximal Ca conductance but also, depending on concentration and duration of exposure, alters both the kinetics of the Ca-carrying system and the amplitude of the steady-state outward current. No changes in the steady-state activation and inactivation variables or in the rate of Isi inactivation were found. However, a substantial increase in the time to peak Isi, as much as 7 times normal, was observed after exposure to D-600 (0.5 x 10^-6 to 2.0 x 10^-6 M) for at least 20 minutes. Because only approximately 75% of the reduction in Isi induced by D-600 could be attributed to a change in the maximum value of gsi, we conclude that the change in time to peak and about 25% of the reduction in Isi must be due to a change in the activation kinetics of the Ca-carrying system. Calculations suggest that the time to 70% activation of gsi can be prolonged to as much as 10 times normal by prolonged exposure to negatively inotropic concentrations of D-600.
9.
Visualizing network data
Networks are critical to modern society, and a thorough understanding of how they behave is crucial to their efficient operation. Fortunately, data on networks is plentiful; by visualizing this data, it is possible to greatly improve our understanding. Our focus is on visualizing the data associated with a network and not on simply visualizing the structure of the network itself. We begin with three static network displays; two of these use geographical relationships, while the third is a matrix arrangement that gives equal emphasis to all network links. Static displays can be swamped by large amounts of data; hence we introduce direct manipulation techniques that permit the graphs to continue to reveal relationships in the context of much more data. In effect, the static displays are parameterized so that interesting views may easily be discovered interactively. The software that carries out this network visualization is called SeeNet.
10.
Traditionally, rule-based forward-chaining systems are considered to be standalone, working on a volatile memory. This paper focuses on the integration of forward-chaining rules with command-driven programming paradigms in the context of permanent, integrated knowledge bases. A system architecture is proposed that integrates the data management functions of large computerized knowledge bases into a module called a knowledge base management system (KBMS). Experiences we had in integrating rules with operations in a prototype KBMS called DALI are surveyed. For this integration, a new form of production rule, called the activation pattern controlled rule, is introduced; it augments traditional forward-chaining rules with a second, additional left-hand side, which makes rules sensitive to calls of particular operations. Activation pattern controlled rules play an important role in DALI's system architecture because they facilitate the storage of knowledge that has been specified using mixed programming, a combination of data-driven, command-driven, and preventive programming. The general problems of implementing permanent knowledge bases that contain rules and operations are discussed, and an algorithm for implementing activation pattern controlled rules, called IPTREAT, a generalization of the TREAT algorithm, is provided. Furthermore, the paper clarifies the differences between traditional, volatile rule-based systems and rule-based systems that are geared toward knowledge integration by supporting a permanent knowledge base. This paper is an extended and significantly revised version of a paper entitled "Integrating Rules into a Knowledge Base Management System," which was presented at the First International Conference on Systems Integration, April 1990 [1].
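The notion of a rule with two left-hand sides, one matching an operation call and one matching the state of the knowledge base, can be sketched minimally as follows. This is not DALI's actual API; all class and function names are invented for illustration:

```python
# Minimal sketch of an "activation pattern controlled rule": besides the
# usual data-condition LHS, the rule carries a second LHS that matches
# calls of a particular operation.

class Rule:
    def __init__(self, operation, condition, action):
        self.operation = operation   # second LHS: operation-call pattern
        self.condition = condition   # first LHS: predicate on the KB state
        self.action = action         # RHS: executed when both sides match

class KnowledgeBase:
    def __init__(self):
        self.facts = {}
        self.rules = []

    def call(self, operation, *args):
        """Command-driven entry point: run the operation, then fire any
        rule whose activation pattern names this operation and whose data
        condition holds afterwards."""
        operation(self, *args)
        for rule in self.rules:
            if rule.operation is operation and rule.condition(self):
                rule.action(self)

def set_salary(kb, person, amount):
    kb.facts[person] = amount

kb = KnowledgeBase()
kb.rules.append(Rule(
    operation=set_salary,
    condition=lambda kb: any(v > 100_000 for v in kb.facts.values()),
    action=lambda kb: kb.facts.setdefault("audit_flag", True),
))

kb.call(set_salary, "alice", 50_000)   # condition false: rule stays silent
kb.call(set_salary, "bob", 120_000)    # both LHSs match: rule fires
```

Making the rule sensitive to the operation call, rather than to data changes alone, is what lets data-driven and command-driven code coexist in one permanent knowledge base.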