951.
There is increasing use of computer media for negotiations. However, computer-mediated channels increase hostile expressions of emotion, termed flaming. Although researchers agree that flaming has important effects on negotiation, predictions concerning these effects are inconsistent, suggesting a need for further investigation. We address this need by extending current flaming and negotiation research in two ways. First, we identify two different types of flaming: that which is motivated by perceptions concerning the negotiating opponent (e.g., he/she is unfair) and that which is motivated by perceptions concerning the negotiating context (e.g., the communication channel is too slow). Second, we differentiate between the effects of flaming on the concession behaviors of the flame sender and the flame recipient, and the effects of these behaviors on negotiated agreement. Via a laboratory study, we demonstrate that flames directed at the negotiation opponent slightly decrease the likelihood of reaching an agreement and, when an agreement is reached, result in outcomes significantly favoring the flame recipient rather than the flame sender. In contrast, flames directed at the negotiation context significantly increase the likelihood of agreement, although outcomes still favor the flame recipient over the flame sender. These results suggest that flame senders are generally worse off than flame recipients, which provides an important basis for the strategic use of flaming in negotiations.
952.
About 20 years ago, Markus and Robey noted that most research on IT impacts had been guided by deterministic perspectives and had neglected to use an emergent perspective, which could account for contradictory findings. They further observed that most research in this area had been carried out using variance theories at the expense of process theories. Finally, they suggested that more emphasis on multilevel theory building would likely improve empirical reliability. In this paper, we reiterate the observations and suggestions made by Markus and Robey on the causal structure of IT impact theories and carry out an analysis of empirical research published in four major IS journals, Management Information Systems Quarterly (MISQ), Information Systems Research (ISR), the European Journal of Information Systems (EJIS), and Information and Organization (I&O), to assess compliance with those recommendations. Our final sample consisted of 161 theory-driven articles, accounting for approximately 21% of all the empirical articles published in these journals. Our results first reveal that 91% of the studies in MISQ, ISR, and EJIS focused on deterministic theories, while 63% of those in I&O adopted an emergent perspective. Furthermore, 91% of the articles in MISQ, ISR, and EJIS adopted a variance model; this compares with 71% from I&O that applied a process model. Lastly, mixed levels of analysis were found in 14% of all the surveyed articles. Implications of these findings for future research are discussed.
953.
Since DeLone and McLean (D&M) developed their model of IS success, there has been much research on the topic of success as well as extensions and tests of their model. Using a qualitative literature review, this research examines 180 papers found in the academic literature for the period 1992–2007 dealing with some aspect of IS success. Using the six dimensions of the D&M model – system quality, information quality, service quality, use, user satisfaction, and net benefits – 90 empirical studies were examined and the results summarized. Measures for the six success constructs are described and 15 pairwise associations between the success constructs are analyzed. This work builds on prior research related to IS success by summarizing the measures applied to the evaluation of IS success and by examining the relationships that comprise the D&M IS success model in both individual and organizational contexts.
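The 15 pairwise associations follow directly from the six constructs: C(6, 2) = 15. A minimal sketch (not from the paper) enumerating them, with the construct names taken from the abstract above:

```python
# Enumerate the 15 pairwise associations among the six D&M success
# constructs; C(6, 2) = 15.
from itertools import combinations

constructs = [
    "system quality", "information quality", "service quality",
    "use", "user satisfaction", "net benefits",
]

pairs = list(combinations(constructs, 2))
assert len(pairs) == 15  # the 15 associations analyzed in the review

for a, b in pairs:
    print(f"{a} <-> {b}")
```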
954.
In the 1990s, enrollments grew rapidly in information systems (IS) and computer science. Then, beginning in 2000 and 2001, enrollments declined precipitously. This paper looks at the enrollment bubble and the dotcom bubble that drove IT enrollments. Although the enrollment bubble occurred worldwide, this paper focuses primarily on U.S. data, which is widely available, and secondarily on Western European data. The paper notes that the dotcom bubble was an investment disaster but that U.S. IT employment fell surprisingly little and soon surpassed the bubble's peak IT employment. In addition, U.S. IT unemployment rose to almost the level of total unemployment in 2003, then fell to traditionally low levels by 2005. Job prospects in the U.S. and most other countries are good for the short term, and the U.S. Bureau of Labor Statistics employment projections for 2006–2016 indicate that job prospects in the U.S. will continue to be good for most IT jobs. However, offshoring is a persistent concern for students in Western Europe and the United States. The data on offshoring are of poor quality, but several studies indicate that IT job losses from offshoring are small and may be counterbalanced by gains in IT inshoring jobs. At the same time, offshoring and productivity gains appear to be making low-level jobs such as programming and user support less attractive. This means that IS and computer science programs will have to focus on producing higher-level job skills among graduates. In addition, students may have to stop considering the undergraduate degree a terminal degree in IS and computer science.
955.
956.
The Category Partition Method (CPM) is a general approach to specification-based program testing, in which test frame reduction and refinement are two important issues. Test frame reduction is necessary because too many test frames may be produced, and test frame refinement is important because new information about test frame generation may become available during CPM testing and should be considered incrementally. Besides the information provided by testers or users, implementation-related knowledge offers an alternative source of information for reducing and refining CPM test frames. This paper explores the idea by proposing a call-patterns-semantics-based test frame updating method for Prolog programs, in which a call patterns analysis is used to collect information about the way procedures are used in a program. The updated test frames are represented as constraints. The effect of our test frame updating is two-fold. On one hand, it removes "uncared" data from the original set of test frames; on the other hand, it refines the test frames to which we should pay more attention. The first effect makes the input domain on which a procedure must be tested a subset of the procedure's input domain, and the latter gives testers a better chance of finding the faults most likely to manifest in the use of the program under consideration. Our test frame updating method preserves the effectiveness of CPM testing with respect to the detection of the faults we care about. Test case generation from the updated set of test frames is also discussed. To show the applicability of our method, an approximation call patterns semantics is proposed, and the test frame updating on this semantics is illustrated by an example.
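An illustrative sketch of the general CPM workflow the abstract builds on (not the paper's Prolog analysis): test frames are enumerated as cross-products of category choices, then pruned by constraints; the categories and the call-pattern constraint below are hypothetical.

```python
# Category Partition sketch: generate frames, then drop "uncared" ones.
from itertools import product

# Hypothetical categories for a procedure under test.
categories = {
    "list_arg": ["empty", "singleton", "longer"],
    "mode_arg": ["bound", "unbound"],
}

def satisfies_constraints(frame):
    # Hypothetical call-pattern constraint: suppose analysis showed the
    # procedure is never called with an unbound mode and an empty list,
    # so such frames are removed from the test obligation.
    return not (frame["mode_arg"] == "unbound" and frame["list_arg"] == "empty")

keys = list(categories)
frames = [dict(zip(keys, choice)) for choice in product(*categories.values())]
reduced = [f for f in frames if satisfies_constraints(f)]
print(f"{len(frames)} frames generated, {len(reduced)} kept after updating")
```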
957.
Statistical process control (SPC) is a conventional means of monitoring software processes and detecting related problems, where the causes of detected problems can be identified using causal analysis. Determining the actual causes of reported problems requires significant effort due to the large number of possible causes. This study presents an approach that detects problems and identifies their causes using multivariate SPC. The proposed method can monitor multiple software process measures simultaneously. The measures detected as the major contributors to out-of-control signals can be used to identify the causes; partial least squares (PLS) and statistical hypothesis testing are used to validate the identified causes. The main advantage of the proposed approach is that correlated indices can be monitored simultaneously, facilitating causal analysis of a software process.
Ching-Pao Chang is a PhD candidate in Computer Science & Information Engineering at National Cheng-Kung University, Taiwan. He received his MA in Computer Science from the University of Southern California in 1998. His current work deals with software process improvement and defect prevention using machine learning techniques. Chih-Ping Chu is Professor of Software Engineering in the Department of Computer Science & Information Engineering at National Cheng-Kung University (NCKU), Taiwan. He received his MA in Computer Science from the University of California, Riverside in 1987, and his Doctorate in Computer Science from Louisiana State University in 1991. He is especially interested in parallel computing and software engineering.
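Hotelling's T² is the standard chart for this kind of multivariate monitoring. A minimal sketch follows (illustrative only: the measure names, baseline data, and control limit are hypothetical, and the paper's PLS-based cause validation is not reproduced):

```python
# Multivariate SPC via Hotelling's T^2 over correlated process measures.
import numpy as np

rng = np.random.default_rng(0)
# Baseline: 50 in-control observations of 3 correlated measures, e.g.
# defect density, review effort, rework ratio (hypothetical names).
baseline = rng.multivariate_normal(
    mean=[5.0, 2.0, 0.3],
    cov=[[1.0, 0.4, 0.1], [0.4, 0.5, 0.05], [0.1, 0.05, 0.02]],
    size=50,
)
mu = baseline.mean(axis=0)
S_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

def t_squared(x):
    # Mahalanobis-type distance of an observation from the baseline mean.
    d = x - mu
    return float(d @ S_inv @ d)

new_obs = np.array([7.5, 3.2, 0.45])  # a suspicious new observation
ucl = 12.0                            # illustrative upper control limit
if t_squared(new_obs) > ucl:
    print("out-of-control signal: run causal analysis on the measures")
```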
958.
The identification of the part families and machine groups that form cells is a major step in the development of a cellular manufacturing system, and consequently a large number of concepts, theories, and algorithms have been proposed. One common assumption of most of these cell formation algorithms is that the product mix remains stable over a period of time. Today, market demand is shaped by consumers, resulting in a highly volatile market. This has given rise to a new class of products characterized by low volume and high variety. Incorporating product mix changes into an existing cellular manufacturing system requires tackling many important issues. This paper presents a methodology for incorporating new parts and machines into an existing cellular manufacturing system. The objective is to fit the new parts and machines into the existing system, thereby increasing machine utilization and reducing investment in new equipment.
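One simple way to see the trade-off is a greedy assignment: place each new part in the cell that already owns the most machines the part requires, so utilization of existing machines rises and duplicate purchases fall. A hedged sketch (not the paper's methodology; the cell contents and scoring rule are hypothetical):

```python
# Greedy assignment of a new part to an existing manufacturing cell.
cells = {
    "cell1": {"M1", "M2", "M3"},
    "cell2": {"M4", "M5"},
    "cell3": {"M2", "M6", "M7"},
}

def best_cell_for(part_machines):
    # Score each cell by how many required machines it already contains;
    # the shortfall represents investment in new or duplicated machines.
    scores = {c: len(ms & part_machines) for c, ms in cells.items()}
    cell = max(scores, key=scores.get)
    missing = part_machines - cells[cell]
    return cell, missing

cell, missing = best_cell_for({"M2", "M3", "M6"})
print(f"assign to {cell}; machines to add or reroute: {sorted(missing)}")
```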
959.
Hard turning with cubic boron nitride (CBN) tools has proven to be more effective and efficient than traditional grinding operations in machining hardened steels. However, rapid tool wear is still one of the major hurdles preventing the wide implementation of hard turning in industry. Better prediction of CBN tool wear progression helps to optimize cutting conditions and/or tool geometry to reduce tool wear, which in turn helps to make hard turning a viable technology. The objective of this study is to design a novel but simple neural network-based generalized optimal estimator for CBN tool wear prediction in hard turning. The proposed estimator is based on a fully forward connected neural network with cutting conditions and machining time as the inputs and tool flank wear as the output. An extended Kalman filter algorithm is used as the network training algorithm to speed up learning convergence. Network neuron connections are optimized using a destructive optimization algorithm. Besides performance comparisons with CBN tool wear measurements in hard turning, the proposed tool wear estimator is also evaluated against a multilayer perceptron neural network modeling approach and/or an analytical modeling approach, and it has proven to be faster, more accurate, and more robust. Although this neural network-based estimator is designed for CBN tool wear modeling in this study, it is expected to be applicable to other tool wear modeling applications.
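The estimator's interface is easy to picture: cutting conditions plus machining time in, flank wear out. A toy sketch of that mapping follows (illustrative only: this is a plain one-hidden-layer net, not the paper's fully forward connected network or its EKF training; weights and scaling constants are placeholders):

```python
# Toy wear estimator: (speed, feed, depth, time) -> flank wear VB (mm).
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(8, 4)) * 0.1, np.zeros(8)  # untrained weights
W2, b2 = rng.normal(size=(1, 8)) * 0.1, np.zeros(1)

def predict_flank_wear(speed, feed, depth, time_min):
    # Crude input scaling with hypothetical reference values.
    x = np.array([speed / 200.0, feed / 0.2, depth / 0.5, time_min / 30.0])
    h = np.tanh(W1 @ x + b1)            # hidden layer
    return float((W2 @ h + b2)[0])      # arbitrary output until trained

print(predict_flank_wear(speed=150, feed=0.1, depth=0.2, time_min=12))
```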
960.
Developing trusted software has become an important trend and a natural choice in the development of software technology and applications, and software trustworthiness modeling has become a prerequisite and a necessary means. To discuss and explain the basic scientific problems in software trustworthiness and to establish theoretical foundations for software trustworthiness measurement, this paper draws on ideas from dynamical systems to study the evolutionary laws of software trustworthiness and the dynamical mechanisms at work under various internal and external factors, and proposes dynamical models for software trustworthiness; software trustworthiness can thus be viewed as a statistical characteristic of the behavior of software systems in a dynamic, open environment. By analyzing two simple examples, the paper explains the relationship between the limit evolutionary behaviors of software trustworthiness attributes and dynamical system characteristics, and interprets the dynamical characteristics of software trustworthiness and their evolutionary complexity. Supported partially by the National Basic Research Program of China (Grant No. 2005CB321900) and the National Natural Science Foundation of China (Grant No. 60473091)
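A toy example of the kind of dynamical model and "limit evolutionary behavior" the abstract refers to (entirely hypothetical, not one of the paper's two examples): a single trustworthiness attribute T(t) that grows logistically under corrective maintenance and decays under environmental disturbance, integrated with explicit Euler steps.

```python
# dT/dt = a*T*(1 - T) - b*T; equilibrium at T* = 1 - b/a = 0.6 here.
a, b, dt = 0.30, 0.12, 0.1   # hypothetical growth/disturbance rates, step
T = 0.5                      # initial trustworthiness level in [0, 1]

for step in range(200):
    dT = a * T * (1.0 - T) - b * T   # growth minus disturbance loss
    T += dt * dT                     # explicit Euler update

# The trajectory settles toward T* = 0.6, illustrating a limit
# evolutionary behavior of the attribute.
print(round(T, 3))
```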