101.
The TREX is a recently introduced method for performing sparse high-dimensional regression. Despite its statistical promise as an alternative to the lasso, square-root lasso, and scaled lasso, the TREX is computationally challenging in that it requires solving a nonconvex optimization problem. This article shows a remarkable result: despite the nonconvexity of the TREX problem, there exists a polynomial-time algorithm that is guaranteed to find the global minimum. This result adds the TREX to a very short list of nonconvex optimization problems that can be globally optimized (principal components analysis being a famous example). After deriving and developing this new approach, we demonstrate that (i) the ability of the preexisting TREX heuristic to reach the global minimum is strongly dependent on the difficulty of the underlying statistical problem, (ii) the new polynomial-time algorithm for the TREX permits a novel variable ranking and selection scheme, and (iii) this scheme can be incorporated into a rule that controls the false discovery rate (FDR) of features included in the model. To achieve this last aim, we provide an extension of the results of Barber and Candès to establish that the knockoff filter framework can be applied to the TREX. This investigation thus provides both a rare case study of a heuristic for nonconvex optimization and a novel way of exploiting nonconvexity for statistical inference.
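For orientation, the TREX objective referred to above is commonly stated as the nonconvex program below; this formula, including the tuning constant c (often fixed at 1/2), is supplied from the general literature and is not part of the abstract:

\[
\hat{\beta}_{\mathrm{TREX}} \in \arg\min_{\beta \in \mathbb{R}^p}\; \frac{\lVert y - X\beta \rVert_2^2}{c\,\lVert X^{\top}(y - X\beta) \rVert_\infty} \;+\; \lVert \beta \rVert_1 .
\]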
102.
To simplify the model and improve its prediction accuracy, variable selection was carried out with the latent projective graph (LPG). The original spectra were first transformed by the continuous wavelet transform (CWT), and principal component analysis (PCA) was then applied to obtain the LPG. Assuming that collinear spectral variables in the LPG contribute equally to the model, a small number of characteristic spectral variables were selected to build the prediction model. The resulting model achieved a root mean square error of prediction (RMSEP) of 0.3454, better than the other modeling methods considered. The results show that LPG-based variable selection can effectively simplify near-infrared spectral models and improve their prediction accuracy.
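A minimal sketch of the CWT-plus-PCA-loading variable-selection idea described above. The data, the wavelet, the scale, and the number of selected variables are all hypothetical, and reading loadings programmatically is only a crude stand-in for the paper's LPG construction.

```python
# Sketch: CWT preprocessing + PCA-loading-based variable selection for NIR
# spectra. All data and parameter choices here are hypothetical.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 700))      # 80 spectra x 700 wavelength channels (placeholder data)
y = rng.normal(size=80)             # reference property values (placeholder)

# 1. Continuous wavelet transform of each spectrum at a single scale
scale = 20                          # hypothetical scale choice
Xcwt = np.array([pywt.cwt(s, [scale], "mexh")[0][0] for s in X])

# 2. PCA on the transformed spectra; the loadings play the role of the
#    projection plot used to group collinear variables
pca = PCA(n_components=2).fit(Xcwt)
loadings = pca.components_          # shape (2, n_wavelengths)

# 3. Keep a few representative variables with the largest absolute loading
#    on the first component
n_selected = 10
selected = np.argsort(np.abs(loadings[0]))[-n_selected:]

# 4. Regression on the selected variables and RMSEP on held-out spectra
train, test = np.arange(60), np.arange(60, 80)
model = LinearRegression().fit(Xcwt[train][:, selected], y[train])
pred = model.predict(Xcwt[test][:, selected])
rmsep = np.sqrt(np.mean((y[test] - pred) ** 2))
print(f"RMSEP on the synthetic hold-out set: {rmsep:.4f}")
```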
103.
A general portfolio of survivorship life insurance contracts is studied in a stochastic rate of return environment with a dependent mortality model. Two methods are used to derive the first two moments of the prospective loss random variable. The first one is based on the individual loss random variables while the second one studies annual stochastic cash flows. The distribution function of the present value of future losses at a given valuation time is derived. For illustrative purposes, an AR(1) process is used to model the stochastic rates of return, and the future lifetimes of a couple are assumed to follow a copula model. The effects of the mortality dependence, the portfolio size and the policy type, as well as the impact of investment strategies on the riskiness of portfolios of survivorship life insurance policies are analyzed by means of moments and probability distributions.
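A minimal Monte Carlo sketch of the two ingredients that are spelled out in the abstract: AR(1) rates of return and the distribution of the present value of future losses. All parameters and cash flows are placeholders, and the copula-based dependent mortality model is omitted for brevity.

```python
# Sketch: AR(1) rates of return and the simulated distribution of the
# present value of future losses. Parameters and cash flows are hypothetical;
# the dependent mortality (copula) component is not modeled here.
import numpy as np

rng = np.random.default_rng(1)
mu, phi, sigma = 0.05, 0.6, 0.02    # AR(1): i_t = mu + phi*(i_{t-1} - mu) + sigma*eps_t
horizon, n_sims = 30, 10_000

# Net cash flows at the end of each year (benefits minus premiums); placeholders.
net_cash_flow = np.full(horizon, 100.0)
net_cash_flow[:10] = -150.0          # premium-paying years contribute negative losses

pv = np.empty(n_sims)
for k in range(n_sims):
    rates = np.empty(horizon)
    prev = mu
    for t in range(horizon):
        prev = mu + phi * (prev - mu) + sigma * rng.normal()
        rates[t] = prev
    discount = np.cumprod(1.0 / (1.0 + rates))   # discount factors along this path
    pv[k] = np.sum(discount * net_cash_flow)

print("mean PV of losses:", pv.mean())
print("std dev of PV:    ", pv.std())
print("95th percentile:  ", np.quantile(pv, 0.95))
```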
104.
Amita Sharma & Aparna Mehra, Optimization, 2013, 62(11): 1473-1500
In this paper, we attempt to design a portfolio optimization model for investors who desire to minimize the variation around the mean return and, at the same time, wish to achieve a better return than the worst possible return realization at every time point in a single-period portfolio investment. The portfolio is to be selected from the risky assets in the equity market. Since the minimax portfolio optimization model provides the portfolio that maximizes (minimizes) the worst return (worst loss) realization over the investment horizon, in order to safeguard the interest of investors, the optimal value of the minimax optimization model is used to design a constraint in the mean-absolute semideviation model. This constraint can be viewed as a safety strategy adopted by an investor. Thus, our proposed bi-objective linear programming model involves mean return as a reward and mean-absolute semideviation as a risk in the objective function, and minimax as a safety constraint, which enables a trade-off between return and risk at a fixed safety value. The efficient frontier of the model is generated using the augmented ε-constraint method in GAMS. We simultaneously solve the ratio optimization problem that maximizes the ratio of mean return over mean-absolute semideviation with the same minimax value in the safety constraint. Subsequently, we choose two portfolios on the generated efficient frontier such that the risk of one is lower, and the mean return of the other is higher, than the respective quantities of the optimal portfolio from the ratio optimization model. Extensive computational results and in-sample and out-of-sample analyses are provided to compare the financial performance of the optimal portfolios selected by our proposed model with that of the optimal portfolios from the existing minimax and mean-absolute semideviation portfolio optimization models on real data from the S&P CNX Nifty index.
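A plausible rendering of the model sketched above, in assumed notation (r_{jt}: return of asset j at time point t, t = 1,…,T; μ_j: its mean; x_j: portfolio weights; M*: the optimal value of the minimax model); the paper's exact formulation (e.g. additional bounds or constraints) may differ:

\[
\begin{aligned}
\max_{x}\;& \sum_{j}\mu_j x_j, \qquad
\min_{x}\; \frac{1}{T}\sum_{t=1}^{T}\max\Bigl(0,\ \sum_{j}(\mu_j-r_{jt})\,x_j\Bigr)\\
\text{s.t.}\;& \sum_{j} r_{jt}\,x_j \ \ge\ M^{*}\quad (t=1,\dots,T),\qquad \sum_{j}x_j=1,\qquad x_j\ge 0 .
\end{aligned}
\]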
105.
This paper studies the multi-period portfolio selection problem. Using non-normal stable distributions and parameter estimation methods, a multi-period portfolio model is constructed for a market containing one risk-free security and multiple risky securities. The model helps describe the non-normal features of risky securities, namely skewness and excess kurtosis, and supports applications in the stock market.
106.
This paper studies the insurer's optimal investment and optimal reinsurance problem under the mean-variance optimization criterion. The insurer's risk process is modeled by a compound Poisson process, and the insurer can invest in a risk-free asset and in a risky asset whose price follows a jump-diffusion process. In addition, the insurer may acquire new business (such as reinsurance). The constraints are that both the investment and the reinsurance strategies must be nonnegative, i.e., short-selling of the risky asset is not allowed and the reinsurance proportion is nonnegative. Furthermore, the new Basel Accord is introduced to regulate the risky asset, and stochastic linear-quadratic (LQ) control theory is used to derive the optimal value and the optimal strategy. The corresponding Hamilton-Jacobi-Bellman (HJB) equation no longer admits a classical solution. Within the framework of viscosity solutions, we establish a new verification theorem and obtain explicit expressions for the efficient strategy (the optimal investment and reinsurance strategies) and the efficient frontier.
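For orientation only, a compound Poisson risk process and a jump-diffusion risky asset are commonly written in the textbook forms below; these are standard expressions, not necessarily the paper's exact specification:

\[
R_t = x + ct - \sum_{i=1}^{N_t} Y_i, \qquad
\frac{dS_t}{S_{t^-}} = \mu\,dt + \sigma\,dW_t + d\!\left(\sum_{i=1}^{\bar N_t} Z_i\right),
\]

where c is the premium rate, N_t and \bar N_t are Poisson processes, the Y_i are claim sizes, and the Z_i are jump sizes of the asset price.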
107.
This paper introduces a class of unit-linked annuities that extends existing annuities by allowing portfolio shocks to be gradually absorbed into the annuity payouts. Consequently, our new class enables insurers to offer an affordable and adequate annuity with a stable payout stream. We show how to price and adequately hedge the annuity payouts in a general financial environment. In particular, our model accounts for various stylized facts of stock returns such as asymmetry and heavy-tailedness. Furthermore, the generality of our framework makes it possible to explore the impact of a parameter misspecification on the annuity price and the hedging performance.
108.
109.
LeBeau et al. (2003) [4] introduced the ‘virtual-subcell’ (VSC) method of finding a collision partner for a given DSMC particle in a cell; all potential collision partners in the cell are examined to find the nearest neighbor, which becomes the collision partner. Here I propose a modification of the VSC method, the ‘pseudo-subcell’ (PSC) method, whereby the search for a collision partner stops whenever a ‘near-enough’ particle is found, i.e. whenever another particle is found within the ‘pseudo-subcell’ of radius δ centered on the first particle. The radius of the pseudo-subcell is given by δ = F·d_n, where d_n is the expected distance to the nearest neighbor and F is a constant which can be adjusted to give a desired trade-off between CPU time and accuracy as measured by a small mean collision separation (MCS). For 3D orthogonal cells, of various aspect ratios, d_n/L ≈ 0.746/N^0.383, where N is the number of particles in the cell and L is the cube root of the cell volume. There is a good chance that a particle will be found in the pseudo-subcell and there is a good chance that such a particle is in fact the nearest neighbor. If no particle is found within the pseudo-subcell the closest particle becomes the collision partner.
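A short sketch of the pseudo-subcell partner search as described above: accept the first particle found within δ = F·d_n, otherwise fall back to the closest particle examined. The cell geometry (a unit cube), the particle storage, and the value of F are placeholder choices.

```python
# Sketch of the pseudo-subcell (PSC) collision-partner search.
# Cell geometry, particle storage and the value of F are placeholder choices.
import numpy as np

def expected_nearest_distance(n_particles: int, cell_volume: float) -> float:
    """d_n ~ 0.746 * L / N**0.383, with L the cube root of the cell volume."""
    L = cell_volume ** (1.0 / 3.0)
    return 0.746 * L / n_particles ** 0.383

def psc_partner(positions: np.ndarray, i: int, F: float = 1.0) -> int:
    """Return a collision partner for particle i.

    Scan the other particles in the cell; stop as soon as one lies within the
    pseudo-subcell radius delta = F * d_n, otherwise fall back to the closest
    particle found (the VSC nearest-neighbor choice).
    """
    n = len(positions)
    delta = F * expected_nearest_distance(n, cell_volume=1.0)  # unit cell assumed
    best_j, best_d = -1, np.inf
    for j in range(n):
        if j == i:
            continue
        d = np.linalg.norm(positions[j] - positions[i])
        if d <= delta:
            return j                # near enough: accept immediately
        if d < best_d:
            best_j, best_d = j, d
    return best_j                   # no particle inside the pseudo-subcell

# Example: 20 particles in a unit cube cell
rng = np.random.default_rng(2)
pts = rng.random((20, 3))
print("partner of particle 0:", psc_partner(pts, 0, F=1.0))
```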
110.
We propose a computational methodology to compute and extract circadian rhythmic patterns from an individual animal’s activity-event time series. This lengthy dataset, composed of a sequential event history, contains an unknown number of latent rhythmic cycles of varying duration and missing waveform information. Our computations aim at identifying the onset signature phase which individually indicates a sharp event intensity surge, where a subject-night ends and a brand new cycle’s subject-day begins, and collectively induces a linearity manifesting the individual circadian rhythmicity and information about the average period. Based on the induced linearity, the least squares criterion is employed to choose an optimal sequence of computed onset signature phases among a finite collection derived from the hierarchical factor segmentation (HFS) algorithm. The multiple levels of coding schemes in the HFS algorithm are designed to extract contrasting patterns of aggregation against sparsity of activity events along the entire temporal axis. This optimal sequence dissects the whole time series into a sequence of rhythmic cycles without model assumptions or ad hoc behavioral definitions regarding the missing waveform information. The performance of our methodology is favorably compared with two popular approaches based on the periodogram in a simulation study and in real data analyses. The computer code and data used in this article are available on the JCGS webpage.  相似文献   
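A small sketch of the least-squares step mentioned above: given a candidate sequence of onset times (one per cycle), fit onset_k ≈ t0 + k·period and use the residual sum of squares to compare candidate sequences and read off an average period. The HFS algorithm that generates the candidates is not reproduced here, and the data are synthetic.

```python
# Sketch: score candidate onset sequences by a least-squares linear fit and
# extract an estimated average period. Candidate sequences are hypothetical.
import numpy as np

def fit_onsets(onsets: np.ndarray):
    """Least-squares line through (k, onset_k); returns (period, rss)."""
    k = np.arange(len(onsets), dtype=float)
    A = np.column_stack([k, np.ones_like(k)])
    sol, *_ = np.linalg.lstsq(A, onsets, rcond=None)
    period, t0 = sol
    rss = float(np.sum((A @ sol - onsets) ** 2))
    return period, rss

# Two hypothetical candidate onset sequences (hours since recording start)
cand_a = np.array([1.0, 25.3, 49.1, 73.4, 97.0])      # roughly 24 h apart
cand_b = np.array([1.0, 20.0, 49.1, 60.0, 97.0])      # a poorer segmentation

for name, cand in [("A", cand_a), ("B", cand_b)]:
    period, rss = fit_onsets(cand)
    print(f"candidate {name}: estimated period ~ {period:.2f} h, RSS = {rss:.2f}")
```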