Similar Literature
1.
Using ERS-1/2 tandem radar satellite data separated by only one day, interferometric synthetic aperture radar (InSAR) was applied to extract the digital elevation model (DEM) and ice-flow velocity of the Dongkemadi Glacier in the Tanggula Mountains on the Tibetan Plateau. The interferometric DEM was validated against a 1:50,000 DEM of the area, and the characteristics of surface ice flow on the Dongkemadi Glacier were analyzed from the interferometrically derived velocities. The study shows that radar interferometry is an effective means of accurately extracting the topography and flow velocity of valley glaciers in western China, and that SAR interferometric pairs with short temporal baselines are better at avoiding temporal decorrelation.
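A minimal sketch of the differential-interferometry step implied above: forming an interferogram from two coregistered complex images and converting phase to line-of-sight displacement and a daily flow speed. The ERS C-band wavelength, the sign convention, and the assumption that flat-earth and topographic phase have already been removed are illustrative choices, not taken from the paper.

```python
import numpy as np

def los_velocity(slc1, slc2, wavelength_m=0.0566, dt_days=1.0):
    """Toy D-InSAR step: interferometric phase -> line-of-sight speed.

    Assumes the two complex images are coregistered and that flat-earth
    and topographic phase contributions have already been removed, so the
    residual phase is due to surface motion alone (illustrative only).
    """
    interferogram = slc1 * np.conj(slc2)           # complex interferogram
    phase = np.angle(interferogram)                # wrapped phase, radians
    # Repeat-pass convention: one phase cycle is half a wavelength of
    # line-of-sight motion, d_los = -lambda * phi / (4 * pi).
    d_los = -wavelength_m * phase / (4.0 * np.pi)  # metres
    return d_los / dt_days                         # metres per day

# Synthetic example: two 64x64 complex images with a uniform motion phase.
rng = np.random.default_rng(0)
slc_a = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
slc_b = slc_a * np.exp(-1j * 0.5)
print(los_velocity(slc_a, slc_b).mean())           # about -0.0023 m/day
```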

2.
When computing the sea-surface coherence time with along-track interferometric SAR (ATI-SAR, Along-Track Interferometric Synthetic Aperture Radar), the drop in the correlation coefficient between the fore and aft channels caused by noise must be determined. The usual practice is to use land targets, or zero-baseline interferometric data from a dual-baseline interferometric SAR, as a reference. However, existing ATI-SAR systems are mostly single-baseline, and the acquired sea-surface images often contain no land targets. To address this problem, this paper proposes a reference-free method for computing the sea-surface coherence time with ATI-SAR. The noise power is first estimated from the power spectrum of the SAR image to obtain the image signal-to-noise ratio; the noise-induced drop in the fore-aft channel correlation is then computed from the SNR, from which the sea-surface coherence time is derived. The method was applied to real airborne ATI-SAR data to obtain the sea-surface coherence time, and comparison of the experimental results with simulations shows that the method yields fairly accurate coherence times, verifying its effectiveness.
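The noise-correction and coherence-time steps described above can be sketched as follows. The noise-loss factor SNR/(1+SNR), the Gaussian decorrelation model, and all numbers are common modelling assumptions for illustration; in practice the SNR would come from the noise floor of the image power spectrum, as the abstract describes.

```python
import numpy as np

def coherence_time(gamma_measured, snr, lag_s):
    """Estimate sea-surface coherence time from ATI-SAR coherence.

    gamma_measured : fore/aft channel coherence magnitude (0..1)
    snr            : image signal-to-noise ratio (linear)
    lag_s          : along-track interferometric time lag in seconds

    Assumes (i) a noise-induced coherence loss factor snr / (1 + snr) and
    (ii) a Gaussian temporal decorrelation model
    gamma(tau) = exp(-(tau / tau_c)**2); both are common modelling
    choices, not necessarily those of the paper.
    """
    gamma_noise = snr / (1.0 + snr)             # loss due to thermal noise
    gamma_scene = np.clip(gamma_measured / gamma_noise, 1e-6, 0.999999)
    return lag_s / np.sqrt(-np.log(gamma_scene))

# Example: measured coherence 0.85, SNR of 10 (10 dB), 2 ms time lag.
print(coherence_time(0.85, 10.0, 2e-3))         # about 8 ms
```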

3.
Digital Elevation Model Extraction by SAR Interferometry and Height Error Correction
The principles of baseline geometry estimation, interferogram generation, flat-earth phase removal, baseline geometry optimization, and height computation are introduced. Interferometric data processing was studied using a pair of L-band SIR-C/X SAR interferometric data acquired over the Inner Mongolia Autonomous Region of China; a full SAR scene was processed and a digital elevation model of the corresponding ground area was generated. With the aid of topographic maps of the test area, the sources of height error arising from coarse baseline geometry were analyzed, and a GCP-based baseline geometry optimization algorithm was implemented. After correction with the ground control points, the digital elevation…
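A hedged sketch of the GCP-based baseline optimization idea: refine the perpendicular baseline (and a constant phase offset) so that phase-derived heights match known ground-control heights in a least-squares sense. The simplified repeat-pass phase-to-height model, the SIR-C-like constants, and the synthetic data are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np
from scipy.optimize import least_squares

# Simplified repeat-pass InSAR model: topographic phase for height h is
# phi = 4*pi*B_perp*h / (lambda * R * sin(theta)), so
# h = lambda * R * sin(theta) * (phi - phi0) / (4*pi*B_perp).
LAM, R, THETA = 0.236, 300e3, np.deg2rad(35)      # illustrative L-band values

def heights(phi, b_perp, phi0):
    return LAM * R * np.sin(THETA) * (phi - phi0) / (4 * np.pi * b_perp)

def refine_baseline(phi_gcp, h_gcp, b0=150.0, phi0=0.0):
    """Least-squares refinement of (B_perp, phase offset) from GCPs."""
    resid = lambda p: heights(phi_gcp, p[0], p[1]) - h_gcp
    return least_squares(resid, x0=[b0, phi0]).x

# Synthetic GCPs generated with a "true" baseline of 162 m and offset 0.3 rad.
rng = np.random.default_rng(1)
phi = rng.uniform(0, 30, size=12)                 # unwrapped phase at GCPs
h_true = heights(phi, 162.0, 0.3) + rng.normal(0, 1.0, size=12)
print(refine_baseline(phi, h_true))               # approximately [162, 0.3]
```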

4.
Applications of Interferometric SAR in Remote Sensing Measurement
Objective: To discuss the applications of interferometric SAR in remote sensing measurement. Methods: After reviewing the development of interferometric SAR, its working principles and signal-processing techniques are explained in detail, and its current applications and prospects are discussed. Results: Interferometric SAR can be widely applied to DEM measurement and many other aspects of Earth mapping. Conclusion: The technique will see further development, and China should pursue applied research on interferometric SAR on a broad scale.

5.
Because the baselines of distributed small-satellite SAR systems can exceed several hundred meters, velocity ambiguity is severe when along-track interferometry is used for velocity measurement. This paper analyzes how a linearly arranged small-satellite formation can use multiple baselines and multiple operating frequencies to resolve the velocity ambiguity, discusses the velocity-direction ambiguity problem, and briefly analyzes ambiguity resolution for distributed small-satellite SAR in a three-dimensional formation.
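A rough illustration of the multi-baseline idea: with one long baseline the along-track interferometric phase wraps many times for realistic radial velocities, but a second, different baseline lets a simple integer search pick the consistent velocity. The phase convention (single transmitter, two receivers, effective lag B/(2*v_p)), the wavelength, and all numbers are assumptions for illustration, not the paper's formulation.

```python
import numpy as np

LAM, VP = 0.03, 7500.0          # X-band wavelength (m), platform speed (m/s)

def ati_phase(v_r, baseline):
    """Wrapped ATI phase, assuming effective lag tau = B / (2 * v_p)."""
    phi = 4 * np.pi / LAM * v_r * baseline / (2 * VP)
    return np.angle(np.exp(1j * phi))

def resolve_velocity(phi1, phi2, b1, b2, k_max=20):
    """Search wrap count on baseline 1 for consistency with baseline 2."""
    v_amb1 = LAM * VP / b1                       # velocity per 2*pi wrap on B1
    best_v, best_err = None, np.inf
    for k in range(-k_max, k_max + 1):
        v = (phi1 / (2 * np.pi) + k) * v_amb1    # candidate radial velocity
        err = abs(np.angle(np.exp(1j * (ati_phase(v, b2) - phi2))))
        if err < best_err:
            best_v, best_err = v, err
    return best_v

v_true, b1, b2 = 3.7, 300.0, 430.0               # m/s and metres (illustrative)
phi1, phi2 = ati_phase(v_true, b1), ati_phase(v_true, b2)
print(resolve_velocity(phi1, phi2, b1, b2))      # close to 3.7
```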

6.
Two-Dimensional Phase Unwrapping in Interferometric SAR and Its Algorithms
Interferometric synthetic aperture radar (InSAR) is mainly used to obtain high-accuracy digital elevation maps and to detect surface change with high accuracy. Phase ambiguity is one of the fundamental problems in InSAR measurement. This paper systematically analyzes the basic concepts and principles of InSAR phase ambiguity and two-dimensional phase unwrapping, formulates two-dimensional phase unwrapping as a constrained optimization problem, analyzes and compares the characteristics of the various classes of unwrapping algorithms, and discusses research directions for the phase unwrapping problem.
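For readers who want to see two-dimensional unwrapping in action, a minimal sketch using the 2-D unwrapper shipped with scikit-image on a synthetic phase ramp; this is a convenient off-the-shelf choice for illustration, not one of the algorithm classes surveyed in the paper.

```python
import numpy as np
from skimage.restoration import unwrap_phase

# Synthetic smooth phase surface spanning several 2*pi cycles.
y, x = np.mgrid[0:128, 0:128]
true_phase = 0.2 * x + 0.1 * y                       # radians, up to ~38 rad

# Wrapping maps the phase into (-pi, pi], as an interferogram would deliver.
wrapped = np.angle(np.exp(1j * true_phase))

# 2-D phase unwrapping; recovers the surface up to a constant 2*pi*k offset.
unwrapped = unwrap_phase(wrapped)
offset = np.round((true_phase - unwrapped).mean() / (2 * np.pi)) * 2 * np.pi
print(np.allclose(unwrapped + offset, true_phase, atol=1e-6))   # True
```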

7.
To relax the overly strict displacement-accuracy requirement that conventional white-light interferometry places on the linear translation stage, this paper proposes a full-field heterodyne white-light interferometry technique. By using a white-light interference signal containing a difference frequency as the source, the position of the coherence peak can be detected with high accuracy even at large scan steps and low scanning accuracy. A mathematical model of heterodyne white-light interference is first established; an overall system design is then proposed based on the intensity-signal characteristics given by the model, and the feasibility of the measurement scheme is verified experimentally. Finally, the influence of various error sources on the computational accuracy of the algorithm is analyzed theoretically and compared against data. The error analysis shows that heterodyne white-light interferometry offers higher measurement accuracy and better immunity to disturbance, effectively reducing the strict dependence of conventional white-light interferometry on the precision of the translation stage and providing an additional option for optical freeform surface metrology.
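The core task, locating the coherence peak of a white-light correlogram, can be illustrated with a generic envelope-detection sketch (analytic-signal envelope via the Hilbert transform). This is not the paper's heterodyne scheme; it only shows what "coherence-peak position detection" refers to, and all signal parameters are assumed.

```python
import numpy as np
from scipy.signal import hilbert

# Simulated white-light correlogram along the scan axis z (micrometres):
# Gaussian coherence envelope centred at z0, modulated by a fringe carrier.
z = np.linspace(-10, 10, 4001)                     # scan positions, um
z0, coh_len, lam_c = 1.75, 1.2, 0.6                # peak, coherence length, um
signal = np.exp(-((z - z0) / coh_len) ** 2) * np.cos(4 * np.pi * z / lam_c)

# Envelope detection with the analytic signal; the envelope maximum marks
# the coherence-peak (zero optical path difference) position.
envelope = np.abs(hilbert(signal))
z_peak = z[np.argmax(envelope)]
print(round(z_peak, 3))                            # close to 1.75
```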

9.
鲁强, 曾绍群. 《光电工程》, 1997, 24(1): 46-50
For materials with parallel polished front and back surfaces, interferometric thermometry is a very effective non-invasive measurement method. This paper improves the conventional interferometric thermometry approach and proposes a new method, the weak-coherence method, which exploits the low coherence of the light source to suppress background reflections and select only the desired signal light for interferometric measurement, thereby improving measurement accuracy. The basic principle of the weak-coherence method and the corresponding experiments are described.

10.
A Fast Algorithm for Coherence Estimation in Interferometric SAR
In SAR interferometry, the coherence coefficient measures the phase stability of an interferometric image pair. Both the interferometric baseline and the surface topography affect its computation: the usual coherence estimator requires compensating the interferometric phase induced by the baseline and topography, which greatly increases the computational load. This paper applies a coherence estimation method that requires no phase compensation and compares it with the conventional estimator. Experiments show that, while maintaining reasonable accuracy, the algorithm greatly improves the efficiency of coherence computation, so it can be used for quickly screening interferometric image pairs to assess their quality for coherence-based measurement.
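The conventional estimator referred to above can be sketched as below: a moving-window magnitude of the normalized complex cross-product, optionally removing a known phase ramp (standing in for the flat-earth/topographic phase) before averaging. The paper's fast algorithm itself is not reproduced; the window size and the synthetic ramp are assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def coherence(s1, s2, comp_phase=None, win=5):
    """Windowed InSAR coherence magnitude.

    If comp_phase is given, the interferometric phase is compensated
    (e.g. flat-earth / topographic phase) before spatial averaging,
    which is the step the paper's fast algorithm avoids.
    """
    ifg = s1 * np.conj(s2)
    if comp_phase is not None:
        ifg = ifg * np.exp(-1j * comp_phase)
    # Moving-window averages of the complex product and the powers.
    num = uniform_filter(ifg.real, win) + 1j * uniform_filter(ifg.imag, win)
    den = np.sqrt(uniform_filter(np.abs(s1) ** 2, win) *
                  uniform_filter(np.abs(s2) ** 2, win))
    return np.abs(num) / np.maximum(den, 1e-12)

# Synthetic pair with a linear fringe ramp: without compensation the ramp
# biases the windowed coherence downward; with compensation it is ~1.
rng = np.random.default_rng(2)
s1 = rng.standard_normal((128, 128)) + 1j * rng.standard_normal((128, 128))
ramp = 0.6 * np.arange(128)[None, :] * np.ones((128, 1))
s2 = s1 * np.exp(-1j * ramp)
print(coherence(s1, s2, win=5).mean(),        # about 0.67 (ramp bias)
      coherence(s1, s2, ramp, win=5).mean())  # about 1.0 (compensated)
```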

11.
Vibrational spectra often require baseline removal before further data analysis can be performed. Manual (i.e., user) baseline determination and removal is a common technique used to perform this operation. Currently, little data exists that details the accuracy and precision that can be expected with manual baseline removal techniques. This study addresses this current lack of data. One hundred spectra of varying signal-to-noise ratio (SNR), signal-to-baseline ratio (SBR), baseline slope, and spectral congestion were constructed and baselines were subtracted by 16 volunteers who were categorized as being either experienced or inexperienced in baseline determination. In total, 285 baseline determinations were performed. The general level of accuracy and precision that can be expected for manually determined baselines from spectra of varying SNR, SBR, baseline slope, and spectral congestion is established. Furthermore, the effects of user experience on the accuracy and precision of baseline determination are estimated. The interactions between the above factors in affecting the accuracy and precision of baseline determination are highlighted. Where possible, the functional relationships between accuracy, precision, and the given spectral characteristic are detailed. The results provide users of manual baseline determination with useful guidelines for establishing limits of accuracy and precision, as well as highlighting conditions that confound the accuracy and precision of manual baseline determination.

12.
The wavelet transform has been proven to be a high-performance signal-processing technique. In this paper, a novel algorithm which is more suitable for processing analytical signals is proposed, and application of the algorithm to denoising, baseline correction, determination of the number of components in overlapping chromatograms, and preprocessing the data matrix for window factor analysis (WFA) in resolving multi-component overlapping chromatograms is reported. The main characteristic of the wavelet transform is that it decomposes a signal into localized contributions, each of which represents information at a different frequency contained in the original signal. Therefore, the noise can be filtered by separating the high-frequency contributions from the chromatogram, the baseline correction can be obtained by removing the low-frequency contributions from the chromatogram, and the resolved chromatogram can be retrieved from the overlapping signal by choosing a discrete detail at certain scales from the decomposed contributions. The number of components in overlapping chromatograms can be determined by simply counting the number of peaks in the resolved chromatogram.
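A compact PyWavelets sketch of the two operations the abstract emphasizes: suppressing high-frequency detail coefficients for denoising, and removing the low-frequency approximation for baseline correction. The wavelet, decomposition level, and threshold rule are assumptions, not those of the paper.

```python
import numpy as np
import pywt

def wavelet_denoise_and_detrend(y, wavelet="db4", level=6):
    """Return (denoised, baseline_corrected) versions of a 1-D signal."""
    coeffs = pywt.wavedec(y, wavelet, level=level)   # [cA_n, cD_n, ..., cD_1]

    # Denoise: soft-threshold the detail (high-frequency) coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # robust noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(y)))
    den = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    denoised = pywt.waverec(den, wavelet)[: len(y)]

    # Baseline correction: drop the approximation (low-frequency) part.
    base = [np.zeros_like(coeffs[0])] + list(coeffs[1:])
    corrected = pywt.waverec(base, wavelet)[: len(y)]
    return denoised, corrected

# Example: two Gaussian peaks on a curved baseline with noise.
t = np.linspace(0, 1, 1024)
peaks = np.exp(-((t - 0.4) / 0.01) ** 2) + 0.6 * np.exp(-((t - 0.6) / 0.015) ** 2)
y = peaks + 2.0 * t ** 2 + 0.02 * np.random.default_rng(3).standard_normal(t.size)
denoised, corrected = wavelet_denoise_and_detrend(y)
print(denoised.shape, corrected.shape)
```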

13.
Structural health monitoring (SHM) can be defined as a statistical pattern recognition problem which necessitates establishing a decision boundary for damage identification. In general, data points associated with damage manifest themselves near the tail of a baseline data distribution, which is obtained from a healthy state of a structure. Because damage diagnosis is concerned with outliers potentially associated with damage, improper modeling of the tail distribution may impair the performance of SHM by misclassifying a condition state of the structure. This paper attempts to address the issue of establishing a decision boundary based on extreme value statistics (EVS) so that the extreme values associated with the tail distribution can be properly modeled. The generalized extreme value distribution (GEV) is adopted to model the extreme values. A theoretical framework and a parameter estimation technique are developed to automatically estimate model parameters of the GEV. The validity of the proposed method is demonstrated through numerically simulated data, previously published real sample data sets, and experimental data obtained from the damage detection study in a composite plate.
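A minimal SciPy sketch of the decision-boundary idea: fit a GEV to block maxima of a baseline (healthy-state) damage-sensitive feature and set the threshold at an upper quantile. The block size, the quantile, and the synthetic feature are assumptions; the paper's own parameter-estimation framework is not reproduced.

```python
import numpy as np
from scipy.stats import genextreme

# Baseline (healthy-state) residuals of some damage-sensitive feature.
rng = np.random.default_rng(4)
baseline = rng.normal(0.0, 1.0, size=20000)

# Block maxima: split the baseline record into blocks and keep each maximum.
block = 100
maxima = baseline[: baseline.size // block * block].reshape(-1, block).max(axis=1)

# Fit the generalized extreme value distribution to the block maxima.
# (SciPy's shape parameter c follows the convention c = -xi.)
c, loc, scale = genextreme.fit(maxima)

# Decision boundary: e.g. the 99th percentile of the fitted tail model.
threshold = genextreme.ppf(0.99, c, loc=loc, scale=scale)

new_feature_max = 4.8                      # block maximum from a new inspection
print(threshold, new_feature_max > threshold)   # flag as potential damage
```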

14.
Control dosemeters are routinely provided to customers to monitor the background radiation so that it can be subtracted from the gross response of the dosemeter to arrive at the occupational dose. Landauer, the largest dosimetry processor in the world with subsidiaries in Australia, Brazil, China, France, Japan, Mexico and the UK, has clients in approximately 130 countries. The Glenwood facility processes over 1.1 million controls per year. This network of clients around the world provides a unique ability to monitor the world's ambient background radiation. Control data can be mined to provide useful historical information regarding ambient background rates and provide a historical baseline for geographical areas. Historical baseline can be used to provide site or region-specific background subtraction values, document the variation in ambient background radiation around a client's site or provide a baseline for measuring the efficiency of clean-up efforts in urban areas after a dirty bomb detonation.

15.
A baseline correction method that uses basis set projection to estimate spectral backgrounds has been developed and applied to gas chromatography/mass spectrometry (GC/MS) data. An orthogonal basis was constructed using singular value decomposition (SVD) for each GC/MS two-way data object from a set of baseline mass spectra. A novel aspect of this baseline correction method is the regularization parameter that prevents overfitting that may produce negative peaks in the corrected mass spectra or ion chromatograms. The number of components in the basis, the regularization parameter, and the mass spectral range from which the spectra were sampled to construct the basis were optimized so that the projected difference resolution (PDR) or signal-to-noise ratio (SNR) was maximized. PDR is a metric similar to chromatographic resolution that indicates the separation of classes in a multivariate data space. This new baseline correction method was evaluated with two synthetic data sets and a real GC/MS data set. The prediction accuracies obtained by using the fuzzy rule-building expert system (FuRES) and partial least-squares-discriminant analysis (PLS-DA) as classifiers were compared and validated through bootstrapped Latin partition (BLP) between data before and after baseline correction. The results indicate that baseline correction of the two-way GC/MS data using the proposed methods resulted in a significant increase in average PDR values and prediction accuracies.
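The basis-projection idea can be sketched as follows: build an orthogonal basis from a set of baseline (background-only) spectra with SVD, project each new spectrum onto the truncated basis with a simple shrinkage factor standing in for the paper's regularization parameter, and subtract the projection. The rank, the shrinkage form, and the toy data are assumptions.

```python
import numpy as np

def svd_baseline_correct(spectra, baseline_spectra, rank=5, reg=0.1):
    """Subtract an SVD-estimated background from each row of `spectra`.

    baseline_spectra : (n_baseline, n_channels) background-only spectra
    rank             : number of right-singular vectors kept as the basis
    reg              : shrinkage factor (a simplified stand-in for the
                       paper's regularization parameter, which limits
                       overfitting and negative corrected intensities)
    """
    # Orthonormal basis for the background subspace.
    _, _, vt = np.linalg.svd(baseline_spectra, full_matrices=False)
    basis = vt[:rank]                                  # (rank, n_channels)

    coef = spectra @ basis.T                           # projection coefficients
    background = (coef / (1.0 + reg)) @ basis          # shrunken background
    return spectra - background

# Toy example: smooth backgrounds plus one narrow "analyte" peak.
rng = np.random.default_rng(5)
x = np.linspace(0, 1, 400)
bg = np.vstack([(1 + 0.3 * rng.standard_normal()) * np.exp(-x / 0.7)
                for _ in range(20)])
sample = 1.1 * np.exp(-x / 0.7) + np.exp(-((x - 0.5) / 0.01) ** 2)
corrected = svd_baseline_correct(sample[None, :], bg, rank=3, reg=0.05)
print(corrected.max())        # peak retained, smooth background largely removed
```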

16.
The measurement of specific heat by heat flux differential scanning calorimetry using the "ratio method" requires three consecutive runs (baseline, reference and sample) to be performed using identical temperature programs. Traditionally, measurements are taken as the sample of interest is subjected to a controlled heating rate, but it is also possible to perform measurements on down-ramps, and this is often worthwhile as it provides a consistency check on the up-ramp data. In recent work a Netzsch DSC-404 has been used to investigate discrepancies observed between up- and down-ramp specific heat results. Based upon this experience, several different "baseline correction" techniques have been developed and applied to specific heat measurements on reference sapphire samples. The performance of these "baseline correction" techniques is shown to be extremely good, even up to 1400°C. An important conclusion is that, even in cases where agreement between up-ramp and down-ramp data is reasonable, a "baseline correction" method can still usefully be applied to improve the precision of the final results.
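The "ratio method" referred to above has a standard form that may help readers: the sample specific heat follows from the three heat-flow curves (baseline, reference, sample) and the known specific heat of the reference. The sketch below is that textbook relation with illustrative numbers; the correction techniques developed in the paper are not reproduced.

```python
def cp_ratio_method(dsc_sample, dsc_ref, dsc_baseline,
                    cp_ref, m_sample, m_ref):
    """Specific heat by the heat-flux DSC ratio method.

    dsc_*  : heat-flow signals at the same temperature point (same units)
    cp_ref : known specific heat of the reference (e.g. sapphire), J/(g*K)
    m_*    : sample and reference masses, g
    """
    return (cp_ref * (m_ref / m_sample)
            * (dsc_sample - dsc_baseline) / (dsc_ref - dsc_baseline))

# Illustrative numbers at a single temperature point.
print(cp_ratio_method(dsc_sample=0.82, dsc_ref=0.95, dsc_baseline=0.10,
                      cp_ref=1.05, m_sample=0.030, m_ref=0.025))
```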

17.
We present here a fully automated spectral baseline-removal procedure. The method uses a large-window moving average to estimate the baseline; thus, it is a model-free approach with a peak-stripping method to remove spectral peaks. After processing, the baseline-corrected spectrum should yield a flat baseline and this endpoint can be verified with the χ²-statistic. The approach provides for multiple passes or iterations, based on a given χ²-statistic for convergence. If the baseline is acceptably flat given the χ²-statistic after the first pass at correction, the problem is solved. If not, the non-flat baseline (i.e., after the first effort or first pass at correction) should provide an indication of where the first pass caused too much or too little baseline to be subtracted. The second pass thus permits one to compensate for the errors incurred on the first pass. Thus, one can use a very large window so as to avoid affecting spectral peaks--even if the window is so large that the baseline is inaccurately removed--because baseline-correction errors can be assessed and compensated for on subsequent passes. We start with the largest possible window and gradually reduce it until acceptable baseline correction based on the χ² statistic is achieved. Results, obtained on both simulated and measured Raman data, are presented and discussed.
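A condensed sketch of the general procedure described above: a large-window moving average estimates the baseline, peaks are stripped by clipping the signal to that estimate, and the window is reduced until the corrected result looks flat under a chi-square-style test against the noise variance. The window schedule, noise estimate, and stopping constant are assumptions, not the paper's exact recipe.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

def strip_baseline(y, windows=(501, 301, 201, 101), n_iter=30, chi2_tol=1.1):
    """Model-free baseline estimate by moving-average peak stripping."""
    # Robust noise scale from first differences (peaks affect few points).
    sigma = np.median(np.abs(np.diff(y))) / (0.6745 * np.sqrt(2))
    work = y.copy()
    for win in windows:                       # start wide, then narrow
        for _ in range(n_iter):
            smooth = uniform_filter1d(work, win, mode="nearest")
            work = np.minimum(work, smooth)   # strip points above the estimate
        # Chi-square-style flatness check on the baseline-like residuals:
        # stop reducing the window once their variance ~ noise variance.
        resid = y - work
        flat = resid[resid < 3 * sigma]
        if flat.size and np.mean((flat - flat.mean()) ** 2) / sigma**2 < chi2_tol:
            break
    return work                               # baseline estimate; subtract from y

# Example: Raman-like peaks on a broad curved baseline.
x = np.linspace(0, 1, 2000)
y = (5 * np.exp(-((x - 0.3) / 0.004) ** 2) + 3 * np.exp(-((x - 0.7) / 0.006) ** 2)
     + 2 * np.sin(2 * x) + 0.05 * np.random.default_rng(6).standard_normal(x.size))
baseline = strip_baseline(y)
print((y - baseline).max())                   # peak heights roughly preserved
```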

18.
When building a multivariate SPC model, it is commonly assumed that there is only one operational mode in the baseline data. However, multiple operational modes may exist. It is important to know the number of modes in the data in order to construct an effective process control system. Each operational mode appears as a cluster in the baseline data. This paper proposes a method to identify the correct number of clusters in baseline data. None of the existing methods for finding the number of clusters has all three of the following critical features: (i) the proposed method can determine if there is only one cluster, the most common case in baseline data; (ii) it can identify clusters that are convex or non-convex; and (iii) it is insensitive to user-specified parameters. The paper includes a comparison of the existing and proposed methods on four datasets. The proposed method consistently gives the correct number of clusters whereas the existing methods are unable to do so.
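The paper's own method is not specified in the abstract, so the sketch below only illustrates the problem setting with an off-the-shelf density-based approach (DBSCAN). It satisfies two of the three requirements listed above (it can return a single cluster and handles non-convex clusters) but is sensitive to its eps parameter; treat it as a baseline for comparison, not the proposed method.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def count_clusters(X, eps=0.5, min_samples=10):
    """Count clusters in baseline SPC data with DBSCAN (noise label = -1)."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit(X).labels_
    return len(set(labels)) - (1 if -1 in labels else 0)

rng = np.random.default_rng(7)
one_mode = rng.normal(0, 1, size=(500, 2))                 # single operating mode
two_modes = np.vstack([rng.normal(0, 1, size=(250, 2)),
                       rng.normal(6, 1, size=(250, 2))])   # two operating modes
print(count_clusters(one_mode), count_clusters(two_modes)) # typically 1 and 2
```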

19.
This investigation aimed to adapt the total focusing method (TFM) algorithm (originating from the synthetic aperture focusing technique in digital signal processing) to accommodate a circular array of piezoelectric sensors (PZT) and characterise defects using guided wave signals for the development of a structural health monitoring system. This research presents the initial results of a broader study focusing on the development of a structural health monitoring (SHM) guided wave system for advanced carbon fibre reinforced plastic (CFRP) composite materials. The current material investigated was an isotropic (aluminium) square plate with 16 transducers operating successively as emitter or sensor in a pitch-catch configuration, enabling the collection of 240 signals per assessment. The Lamb wave signals collected were tuned to the fundamental symmetric mode with a wavelength of 17 mm by setting the excitation frequency to 300 kHz. The initial conditions for the imaging system, such as wave speed and transducer positions, were determined by post-processing the baseline signals through a method involving the identification of the waves reflected from the free edge of the plate. The imaging algorithm was adapted to accommodate multiple transmitting transducers in arbitrary positions. A circular defect of 10 mm in diameter was drilled in the plate, which is similar to the delamination size introduced by a low-velocity impact event in a composite plate. Images were obtained by applying the TFM to the baseline signals, to the Test 1 data (the signals obtained after introduction of the defect), and to the data derived from subtracting the baseline from the Test 1 signals. The results show that, despite the damage diameter being 40% smaller than the wavelength, the image of the subtracted baseline data demonstrated that the system can locate where the waves were reflected from the defect boundary. In other words, the contour of the damaged area was highlighted, enabling its size and position to be determined.
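A compact delay-and-sum sketch of the total focusing method applied to baseline-subtracted signals: for every image pixel, each transmit-receive pair's trace is sampled at the time of flight emitter-to-pixel-to-sensor and summed. The array layout, wave speed, and sampling values are placeholders; the mode tuning, edge-reflection calibration, and signal conditioning of the paper are not reproduced.

```python
import numpy as np

def tfm_image(signals, tx_pos, rx_pos, grid_x, grid_y, c, fs):
    """Total focusing method (delay-and-sum) on pitch-catch guided-wave data.

    signals : (n_pairs, n_samples) baseline-subtracted time traces
    tx_pos, rx_pos : (n_pairs, 2) emitter / sensor coordinates per trace, m
    c  : group velocity of the tuned Lamb mode, m/s
    fs : sampling frequency, Hz
    """
    t = np.arange(signals.shape[1]) / fs
    image = np.zeros((grid_y.size, grid_x.size))
    for sig, tx, rx in zip(signals, tx_pos, rx_pos):
        env = np.abs(sig)                            # crude envelope stand-in
        for iy, y in enumerate(grid_y):
            d_tx = np.hypot(grid_x - tx[0], y - tx[1])       # emitter -> pixel
            d_rx = np.hypot(grid_x - rx[0], y - rx[1])       # pixel -> sensor
            tof = (d_tx + d_rx) / c                          # time of flight, s
            image[iy] += np.interp(tof, t, env, left=0.0, right=0.0)
    return image

# Placeholder geometry: 16 transducers on a circle, 240 pitch-catch pairs.
n, radius, c, fs = 16, 0.25, 5300.0, 2e6
ang = 2 * np.pi * np.arange(n) / n
xy = np.c_[radius * np.cos(ang), radius * np.sin(ang)]
pairs = [(i, j) for i in range(n) for j in range(n) if i != j]   # 240 pairs
tx_pos = xy[[i for i, _ in pairs]]
rx_pos = xy[[j for _, j in pairs]]
signals = np.zeros((len(pairs), 2000))               # substitute real data here
gx = gy = np.linspace(-0.25, 0.25, 101)
img = tfm_image(signals, tx_pos, rx_pos, gx, gy, c, fs)
print(img.shape)                                      # (101, 101)
```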
