Similar Literature
20 similar records found
1.
Acoustic impedance is one of the best attributes for seismic interpretation and reservoir characterisation. We present an approach for estimating acoustic impedance accurately from band-limited and noisy seismic data. The approach is composed of two stages: inverting for reflectivity from seismic data, and then estimating impedance from the reflectivity inverted in the first stage. For the first stage, we perform a two-step spectral inversion that locates the positions of reflection coefficients in the first step and determines their amplitudes in the second step, under the constraints of the positions located in the first step. For the second stage, we construct an iterative impedance estimation algorithm based on reflectivity. In each iteration, the algorithm estimates the absolute acoustic impedance from an initial model obtained by summing the high-frequency component of the acoustic impedance estimated in the previous iteration and a low-frequency component determined in advance from other data. The known low-frequency component is used to constrain the trend of the acoustic impedance in each iteration. Examples using one- and two-dimensional synthetic and field seismic data show that the approach is flexible and superior to conventional spectral inversion and recursive inversion methods, generating more accurate acoustic impedance models.
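The second stage above rebuilds absolute impedance from a reflectivity series. A minimal sketch of the underlying layer-by-layer recursion (the relation that recursive inversion is built on; the authors' iterative low-frequency merging is omitted, and the numbers are illustrative):

```python
import numpy as np

def impedance_from_reflectivity(r, z0):
    """Recursively convert a reflectivity series r into absolute
    acoustic impedance, given the impedance z0 of the first layer.
    Uses the standard relation z[k+1] = z[k] * (1 + r[k]) / (1 - r[k])."""
    z = np.empty(len(r) + 1)
    z[0] = z0
    for k, rk in enumerate(r):
        z[k + 1] = z[k] * (1.0 + rk) / (1.0 - rk)
    return z

# round-trip check: reflectivity derived from a known impedance profile
z_true = np.array([2000.0, 2500.0, 2200.0, 3000.0])
r = (z_true[1:] - z_true[:-1]) / (z_true[1:] + z_true[:-1])
z_rec = impedance_from_reflectivity(r, z_true[0])
```

Because the recursion compounds errors in r downwards, band-limited reflectivity alone drifts at low frequencies, which is why the externally determined low-frequency component mentioned in the abstract is needed.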

2.
In this paper, we propose a workflow based on SalSi for the detection and delineation of geological structures such as salt domes. SalSi is a seismic attribute, designed based on modelling of the human visual system, that detects salient features and captures the spatial correlation within seismic volumes for delineating seismic structures. Using this attribute, we can not only highlight the neighbouring regions of salt domes to assist a seismic interpreter but also delineate such structures using a region-growing method and post-processing. The proposed delineation workflow detects the salt-dome boundary with very good precision and accuracy. Experimental results show the effectiveness of the proposed workflow on a real seismic dataset acquired from the North Sea, F3 block. For the subjective evaluation of the results of different salt-dome delineation algorithms, we used a reference salt-dome boundary interpreted by a geophysicist. For the objective evaluation, we used five different metrics based on pixels, shape, and curvedness to establish the effectiveness of the proposed workflow. The proposed workflow is not only fast but also yields better results than other salt-dome delineation algorithms, showing promising potential in seismic interpretation.
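The delineation step pairs the saliency attribute with region growing. A generic region-growing sketch (not the SalSi implementation; the attribute map, seed, and threshold below are made up for illustration):

```python
import numpy as np

def region_grow(attr, seed, thresh):
    """Grow a region from `seed` over a 2D attribute map `attr`,
    adding 4-connected neighbours whose attribute exceeds `thresh`."""
    h, w = attr.shape
    mask = np.zeros((h, w), dtype=bool)
    stack = [seed]
    while stack:
        i, j = stack.pop()
        if not (0 <= i < h and 0 <= j < w) or mask[i, j] or attr[i, j] < thresh:
            continue
        mask[i, j] = True
        stack.extend([(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)])
    return mask

salience = np.zeros((5, 5))
salience[1:4, 1:4] = 1.0              # a toy 3x3 "salt body" in the map
body = region_grow(salience, (2, 2), thresh=0.5)
```

The post-processing mentioned in the abstract (boundary smoothing, small-region removal) would operate on the resulting mask.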

3.
This article addresses the question of whether time-lapse seismic reflection techniques can be used to follow and quantify the effects of solution salt mining, specifically the production of magnesium salts as mined in the north of the Netherlands. The use of seismic time-lapse techniques to follow such production has not previously been investigated. For hydrocarbon production and CO2 storage, time-lapse seismics are used to look at reservoir changes mainly caused by pressure and saturation changes in large reservoirs, whereas for solution mining, salt is produced from caverns with a limited lateral extent, with much smaller production volumes and with a fluid (brine) replacing a solid (magnesium salt). In our approach we start from the present situation of the mine and then study three different production scenarios, representing salt production in both the vertical and lateral directions of the mine. The present situation and future scenarios were transformed into subsurface models that served as input to an elastic finite-difference scheme to create synthetic seismic data. These data were analysed and processed up to migrated seismic images, such that time-lapse analyses of intermediate and final results could be carried out. From the analyses, it is found that both vertical and lateral production is visible well above the detection threshold in difference data, at both pre-imaging and post-imaging stages. In quantitative terms, an additional production of 6 m causes time-shifts of the order of 2 ms (pre-imaging) and 4 ms (post-imaging) and amplitude changes of above 20% in the imaged sections. A laterally oriented production causes even larger amplitude changes at the edge of the cavern, because replacing solid magnesium salt with brine introduces a large seismic contrast. Overall, our pre-imaging and post-imaging time-lapse analysis indicates that the effects of solution salt mining can be observed and quantified on seismic data, and the effects seem large enough to be observable in real seismic data containing noise.

4.
Prestack depth imaging of seismic data in complex areas such as salt structures requires extensive velocity model updating. In many cases, salt boundaries can be difficult to identify due to a lack of seismic reflectivity. Traditional amplitude-based segmentation methods do not properly tackle this problem, resulting in extensive manual editing. This paper presents a selection of seismic attributes that can reveal texture differences between salt diapirs and the surrounding geology, as opposed to the amplitude-sensitive attributes used in the case of well-defined boundaries. The approach consists of first extracting selected texture attributes, then using these attributes to train a classifier to estimate the probability that each pixel in the data set belongs to one of the following classes: near-horizontal layering, highly dipping areas, and the inside of the salt, which appears as a low-amplitude area with small variations in texture. To find the border between the inside of the salt and the highly dipping surroundings, the posterior probability of the salt class is input to a graph-cut algorithm that produces a smooth, continuous border. An in-line seismic section and a timeslice from a 3D North Sea data set were employed to test the proposed approach. Comparisons between the automatically segmented salt contours and the corresponding contours provided by an experienced interpreter showed a high degree of similarity.

5.
Optimization of sub-band coding method for seismic data compression
Seismic data volumes, which require huge transmission capacities and massive storage media, continue to increase rapidly due to acquisition of 3D and 4D multiple streamer surveys, multicomponent data sets, reprocessing of prestack seismic data, calculation of post-stack seismic data attributes, etc. We consider lossy compression as an important tool for efficient handling of large seismic data sets. We present a 2D lossy seismic data compression algorithm, based on sub-band coding, and we focus on adaptation and optimization of the method for common-offset gathers. The sub-band coding algorithm consists of five stages: first, a preprocessing phase using an automatic gain control to decrease the non-stationary behaviour of seismic data; second, a decorrelation stage using a uniform analysis filter bank to concentrate the energy of seismic data into a minimum number of sub-bands; third, an iterative classification algorithm, based on an estimation of variances of blocks of sub-band samples, to classify the sub-band samples into a fixed number of classes with approximately the same statistics; fourth, a quantization step using a uniform scalar quantizer, which gives an approximation of the sub-band samples to allow for high compression ratios; and fifth, an entropy coding stage using a fixed number of arithmetic encoders matched to the corresponding statistics of the classified and quantized sub-band samples to achieve compression. Decompression basically performs the opposite operations in reverse order. We compare the proposed algorithm with three other seismic data compression algorithms. The high performance of our optimized sub-band coding method is supported by objective and subjective results.
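The fourth and fifth stages can be illustrated with a uniform scalar quantizer and a first-order entropy estimate of the resulting indices (the step size and the Gaussian stand-in samples are illustrative, not the paper's tuned settings):

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(size=1000)           # stand-in sub-band samples

step = 0.1                                # quantizer step (illustrative)
q = np.round(samples / step).astype(int)  # uniform scalar quantization
rec = q * step                            # dequantized approximation
max_err = np.max(np.abs(rec - samples))   # bounded by step / 2

# first-order entropy of the indices: a lower bound on the bits/sample
# an ideal arithmetic coder would spend in the fifth stage
_, counts = np.unique(q, return_counts=True)
p = counts / counts.sum()
bits_per_sample = -(p * np.log2(p)).sum()
```

A larger quantizer step trades reconstruction error for a lower bit rate; matching separate coders to classes of similar statistics, as in the third stage, lowers the rate further.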

6.
The local cosine/sine basis is a localized version of the cosine/sine basis with a window function that can have arbitrary smoothness. It has orthogonality and good time and frequency localization properties. The adaptive local cosine/sine basis is a best basis obtained from an overabundant library of cosine/sine packets based on a cost functional. We propose a 2D semi-adaptive (time-adaptive or space-adaptive) local cosine transform (referred to as a 2D semi-ALCT) and apply it to the SEG–EAEG salt model synthetic data set for compression. From the numerical results, we see that most of the important features of the data set are well preserved even in the high compression ratio (CR = 40:1) case. Using data reconstructed from the highly compressed ALCT coefficients (CR = 40:1) for migration, we can still obtain a high-quality image including subsalt structures. Furthermore, we find that the window partition generated by the 2D semi-ALCT is well adapted to the characteristics of the seismic data set, and that the compression capability of the 2D semi-ALCT is greater than that of the 2D uniform local cosine transform (2D ULCT). We also find that a (32, 32) or (32, 64) minimum (time, space) window size generates the best compression results for the SEG–EAEG salt data set.
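A single fixed window of a local cosine transform behaves like an orthonormal DCT-II block; the following sketch (window size, signal, and retained-coefficient count all illustrative) shows the transform-and-threshold compression idea on one smooth segment:

```python
import numpy as np

def dct2_matrix(n):
    """Orthonormal DCT-II matrix; rows are cosine basis vectors,
    a stand-in for one window of a local cosine transform."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    M = np.sqrt(2.0 / n) * np.cos(np.pi * (i + 0.5) * k / n)
    M[0] /= np.sqrt(2.0)
    return M

n = 32
M = dct2_matrix(n)
x = np.cos(np.linspace(0, np.pi, n))        # smooth "trace" segment
c = M @ x                                   # transform coefficients
keep = np.argsort(np.abs(c))[-4:]           # keep 4 of 32 coeffs (CR = 8:1)
c_kept = np.zeros(n)
c_kept[keep] = c[keep]
x_rec = M.T @ c_kept                        # inverse transform
rel_err = np.linalg.norm(x_rec - x) / np.linalg.norm(x)
```

The adaptive transform in the abstract additionally searches over window partitions with a cost functional; the energy-compaction principle per window is the same.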

7.
A method is presented to provide an improved time-lapse seismic attribute for dynamic interpretation. It is based on the causal link between the time-lapse seismic response and well production activity over time. The resultant image is obtained by computing correlation coefficients between sequences of time-lapse seismic changes, extracted over different time intervals from repeatedly acquired seismic surveys, and identical time sequences of cumulative fluid volumes produced or injected at the wells. Maps of these cross-correlations show localized, spatially contiguous signals surrounding individual wells or a specific well group, which may be associated with connected regions around the selected well or well group. Application first to a synthetic data set reveals that hydraulic compartments may be delineated using this method. A second application to a field data set provides empirical evidence that a connected well-centric fault block and active geobody can be detected. It is concluded that uniting well data and time-lapse seismic data using the proposed method delivers a new attribute for dynamic interpretation and potential updating of the model for the producing reservoir.
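The core computation is a per-pixel correlation coefficient between the 4D change sequence and the cumulative volume sequence; connected pixels score high, unconnected ones near zero. A toy sketch with hypothetical numbers:

```python
import numpy as np

def well_correlation(seismic_changes, cum_volumes):
    """Correlation coefficient between the time sequence of 4D seismic
    changes at one map location and cumulative produced volumes."""
    return np.corrcoef(seismic_changes, cum_volumes)[0, 1]

# hypothetical cumulative production at a well over six monitor times
cum_volume = np.array([0.0, 10.0, 25.0, 45.0, 60.0, 80.0])
# a hydraulically connected pixel tracks production; a distant one does not
connected = 0.02 * cum_volume + 0.1
distant = np.array([0.3, -0.2, 0.1, 0.0, -0.1, 0.2])

r_connected = well_correlation(connected, cum_volume)
r_distant = well_correlation(distant, cum_volume)
```

Mapping `well_correlation` over every pixel of the 4D change volumes produces the cross-correlation maps described in the abstract.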

8.
An approximation is developed that allows mapped 4D seismic amplitudes and time-shifts to be related directly to a weighted linear sum of pore pressure and saturation changes. The weights in this relation are identified as key groups of parameters from a petroelastic model and include the reservoir porosity. This dependence on groups of parameters explains the inherent non-uniqueness of the problem experienced by previous researchers. The proposed relation is of use in 4D seismic feasibility studies and in the inversion and interpretation of the 4D seismic response in terms of pore pressure and water saturation changes. A further result is drawn from analysis of data from the North Sea and West Africa, which reveals that the relative interplay between the effects of pore pressure and saturation changes on the seismic data can be reduced to the control of a single, spatially variant parameter CS/CP. Combining these results with those from the published literature, we find that CS/CP = 8 appears to hold generally across a range of clastic reservoirs with a similar mean porosity. Using this CS/CP value, an in situ seismic-scale constraint for the rock stress sensitivity component of the petroelastic model is constructed, as this component carries the largest uncertainty.
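With two mapped 4D observables (amplitude change and time-shift), the weighted linear relation becomes a 2×2 system per map location that can be solved for the pressure and saturation changes. In this sketch the sensitivity weights are hypothetical placeholders for the petroelastic parameter groups:

```python
import numpy as np

# hypothetical sensitivity weights (the "key groups of parameters"):
# rows: [4D amplitude change, 4D time-shift]; cols: [dP, dS]
W = np.array([[0.8, -2.0],
              [0.3,  1.5]])

true_dp, true_ds = 2.0, 0.1                 # assumed pressure/saturation changes
obs = W @ np.array([true_dp, true_ds])      # synthetic 4D observables

dp, ds = np.linalg.solve(W, obs)            # invert the linear relation
```

The non-uniqueness the abstract mentions appears here when the rows of W are nearly parallel: the system becomes ill-conditioned and dP and dS trade off against each other.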

9.
Due to the complicated geophysical character of tight gas sands in the Sulige gasfield of China, conventional surface seismic has faced great challenges in reservoir delineation. To improve this situation, a large-scale 3D-3C vertical seismic profiling (VSP) survey (more than 15 000 shots) was conducted simultaneously with 3D-3C surface seismic data acquisition in this area in 2005. This paper presents a case study on the delineation of tight gas sands using multi-component 3D VSP technology. Two imaging volumes (PP compressional wave; PSv converted wave) were generated from the 3D-3C VSP data processing. By comparison, the dominant frequencies of the 3D VSP images were 10–15 Hz higher than those of the surface seismic images. Delineation of the tight gas sands is achieved by using the multi-component information in the VSP data, reducing uncertainties in data interpretation. We performed a routine data interpretation on these images and developed a new attribute, the 'Centroid Frequency Ratio of PSv and PP Waves', as an indicator of the tight gas sands. The results demonstrated that the new attribute is sensitive to this type of reservoir. By combining geologic, drilling, and log data, a comprehensive evaluation based on the 3D VSP data was conducted and a new well location for drilling was proposed. The major results in this paper show that successful application of 3D-3C VSP technology is only accomplished through a synthesis of many disciplines: detailed analysis is needed at each step of planning, acquisition, processing, and interpretation to achieve the objectives. High resolution, successful processing of multi-component information, combination of PP and PSv volumes to extract useful attributes, receiver depth information, and offset/azimuth-dependent anisotropy in the 3D VSP data are the major accomplishments derived from this attention to detail.
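The centroid (spectral-mean) frequency underlying the new attribute can be computed per trace as an amplitude-weighted average of the spectrum; here two single-frequency stand-in traces replace real PP and PSv data, so the numbers are purely illustrative:

```python
import numpy as np

def centroid_frequency(trace, dt):
    """Amplitude-weighted mean frequency of a trace's spectrum."""
    spec = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(len(trace), d=dt)
    return (freqs * spec).sum() / spec.sum()

dt = 0.002                           # 2 ms sampling
t = np.arange(0, 1, dt)
pp = np.sin(2 * np.pi * 40 * t)      # stand-in PP trace, ~40 Hz
psv = np.sin(2 * np.pi * 20 * t)     # stand-in PSv trace, ~20 Hz

# the attribute: ratio of PSv and PP centroid frequencies
ratio = centroid_frequency(psv, dt) / centroid_frequency(pp, dt)
```

In practice the centroids would be computed in windows around the target horizon of the PSv and PP image volumes, and the ratio mapped spatially.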

10.
11.
Imaging pre-salt reflections for data acquired from the coastal region of the Red Sea is a task that requires prestack migration velocity analysis. Conventional post-stack time processing cannot handle the lateral inhomogeneity that such a problem presents. Prestack migration velocity analysis in the vertical time domain reduces the velocity–depth ambiguity that usually hampers the performance of prestack depth-migration velocity analysis. In prestack τ-migration velocity analysis, the interval velocity model and the output images are defined in τ (i.e. vertical time). As a result, we avoid placing reflectors at erroneous depths during the velocity analysis process, and thus avoid inaccurately altering the shape of the velocity model, which in turn speeds up convergence to the true model. Using a 1D velocity update scheme, prestack τ-migration velocity analysis produces good images of data from the Midyan region of the Red Sea. For the first seismic line from this region, only three prestack τ-migration velocity analysis iterations were required to focus the pre-salt reflections in τ. The second line, which crosses the first, is slightly more complicated and required five iterations to reach the final, reasonably focused τ-image. After mapping the images for the two crossing lines to depth using the final velocity models, the placements of reflectors in the two 2D lines were consistent at their crossing point. Some errors occurred due to the influence of out-of-plane reflections on 2D imaging; however, such errors are identifiable and generally small.

12.
Seismic conditioning of static reservoir model properties such as porosity and lithology has traditionally been posed as an inverse problem, while dynamic reservoir model properties have been constrained by time-lapse seismic data. Here, we propose a methodology to jointly estimate rock properties (such as porosity) and dynamic property changes (such as pressure and saturation changes) from time-lapse seismic data. The methodology is based on a fully Bayesian approach to seismic inversion and can be divided into two steps: first we estimate the conditional probability of the elastic properties and their relative changes; then we estimate the posterior probability of the rock properties and dynamic property changes. We apply the proposed methodology to a synthetic reservoir study in which we created a synthetic seismic survey for a real dynamic reservoir model, including pre-production and production scenarios. The final result is a set of point-wise probability distributions that allow us to predict the most probable reservoir model at each time step and to evaluate the associated uncertainty. Finally, we show an application to real field data from the Norwegian Sea, where we estimate changes in gas saturation and pressure from time-lapse seismic amplitude differences. The inverted results show the hydrocarbon displacement at the times of the two repeated seismic surveys.

13.
Attenuating random noise while preserving events and weak features, thereby improving the signal-to-noise ratio and resolution of seismic data, is one of the most important issues in geophysics. To achieve this objective, we propose a novel seismic random noise attenuation method built as a compound algorithm. The proposed method combines a sparsity prior regularization based on the shearlet transform with an anisotropic variational regularization. The anisotropic variational regularization, based on a linear combination of weighted anisotropic total variation and anisotropic second-order total variation, attenuates noise while preserving the events of the seismic data, and it effectively removes the fine-scale artefacts that shearlets introduce into the restored data. The proposed method is formulated as a convex optimization problem, and the split Bregman iteration is applied to solve it. To verify the effectiveness of the proposed method, we test it on several synthetic and real seismic datasets. Compared with three other methods (the linear combination of weighted anisotropic total variation and anisotropic second-order total variation, shearlets alone, and shearlet-based weighted anisotropic total variation), the numerical experiments indicate that the proposed method attenuates random noise while alleviating artefacts and preserving the events and features of the seismic data. The results also confirm that the proposed method improves the signal-to-noise ratio.
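A one-dimensional analogue of the variational part can be sketched with split Bregman applied to plain first-order total variation; the weighting, the second-order term, and the shearlet prior of the proposed method are omitted, and the signal and parameters are made up:

```python
import numpy as np

def tv_denoise_split_bregman(f, lam=0.3, mu=2.0, n_iter=50):
    """Minimal 1D TV denoising via split Bregman:
    min_u 0.5*||u - f||^2 + lam*||D u||_1, D a forward difference.
    Splits d = D u and alternates a linear solve, soft-thresholding
    of d, and a Bregman update of b."""
    n = len(f)
    D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]     # (n-1, n) difference operator
    A = np.eye(n) + mu * D.T @ D                 # normal equations for the u-step
    d = np.zeros(n - 1)
    b = np.zeros(n - 1)
    u = f.copy()
    for _ in range(n_iter):
        u = np.linalg.solve(A, f + mu * D.T @ (d - b))
        Du = D @ u
        d = np.sign(Du + b) * np.maximum(np.abs(Du + b) - lam / mu, 0.0)
        b = b + Du - d
    return u

rng = np.random.default_rng(1)
clean = np.repeat([0.0, 1.0, 0.0], 40)           # blocky "events"
noisy = clean + 0.2 * rng.normal(size=clean.size)
denoised = tv_denoise_split_bregman(noisy)
```

TV preserves the sharp edges of the blocky signal while suppressing the random noise, which is the property the abstract exploits for seismic events.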

14.
At present, stacking velocities are mainly obtained by manually picking velocity spectra, which is inefficient, time-consuming, and susceptible to human bias. This paper proposes an unsupervised clustering method with adaptive threshold constraints for intelligent velocity picking, which automates the picking of stacking velocities and improves efficiency while preserving picking accuracy. An adaptive threshold is computed in the velocity spectrum using a time-window method, identifying the energy clusters of primary reflections as candidate regions for velocity picking. The K-means method then clusters the different velocity energy groups, and the final cluster centres are taken as the picked stacking velocities. Finally, drawing on the experience of manual velocity picking, a post-processing step removes outlier velocity points to obtain a smoother velocity field. Tests on both synthetic models and field seismic data show that the proposed method matches the accuracy of manual picking overall while markedly improving picking efficiency and greatly reducing the manual workload.
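The clustering step can be sketched with a plain k-means over thresholded (time, velocity) points; the synthetic energy blobs below stand in for the primary-reflection energy clusters of a real semblance panel, and all numbers are illustrative:

```python
import numpy as np

def kmeans_picks(points, k, n_iter=20, seed=0):
    """Plain k-means over (time, velocity) points; the final cluster
    centres play the role of the picked stacking-velocity pairs."""
    rng = np.random.default_rng(seed)
    centres = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(n_iter):
        d2 = ((points[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        centres = np.array([points[labels == j].mean(axis=0)
                            if np.any(labels == j) else centres[j]
                            for j in range(k)])
    return centres

# two synthetic energy clusters on a velocity spectrum, already
# reduced by the adaptive threshold to candidate (t, v) points
rng = np.random.default_rng(2)
blob1 = rng.normal([1.0, 2000.0], [0.02, 20.0], size=(50, 2))
blob2 = rng.normal([2.0, 2500.0], [0.02, 20.0], size=(50, 2))
picks = kmeans_picks(np.vstack([blob1, blob2]), k=2)
picks = picks[np.argsort(picks[:, 0])]   # order the picks by time
```

The outlier-removal post-processing described above would then smooth the picked (t, v) pairs before building the velocity field.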

15.
In this case study we consider the seismic processing of a challenging land data set from the Arabian Peninsula. It suffers from rough top-surface topography, a strongly varying weathering layer, and complex near-surface geology. We aim at establishing a new seismic imaging workflow, well-suited to these specific problems of land data processing. This workflow is based on the common-reflection-surface stack for topography, a generalized high-density velocity analysis and stacking process. It is applied in a non-interactive manner and provides an entire set of physically interpretable stacking parameters that include and complement the conventional stacking velocity. The implementation introduced combines two different approaches to topography handling to minimize the computational effort: after initial values of the stacking parameters are determined for a smoothly curved floating datum using conventional elevation statics, the final stack and also the related residual static correction are applied to the original prestack data, considering the true source and receiver elevations without the assumption of nearly vertical rays. Finally, we extrapolate all results to a chosen planar reference level using the stacking parameters. This redatuming procedure removes the influence of the rough measurement surface and provides standardized input for interpretation, tomographic velocity model determination, and post-stack depth migration. The methodology of the residual static correction employed and the details of its application to this data example are discussed in a separate paper in this issue. In view of the complex near-surface conditions, the imaging workflow that is conducted, i.e. stack – residual static correction – redatuming – tomographic inversion – prestack and post-stack depth migration, leads to a significant improvement in resolution, signal-to-noise ratio and reflector continuity.

16.
Seismic facies analysis is a well-established technique in the workflow followed by seismic interpreters. Typically, huge volumes of seismic data are scanned to derive maps of interesting features and find particular patterns, correlating them with the subsurface lithology and lateral changes in the reservoir. In this paper, we show how seismic facies analysis can be accomplished in a way that is effective and complementary to the usual approach. Our idea is to translate the seismic data into the musical domain through a process called sonification, based mainly on a very accurate time–frequency analysis of the original seismic signals. From these sonified seismic data, we extract several original musical attributes for seismic facies analysis, and we show that they can capture and explain underlying stratigraphic and structural features. Moreover, we introduce a complete workflow for seismic facies analysis starting exclusively from musical attributes, based on state-of-the-art machine learning techniques applied to the classification of these attributes. We apply this workflow to two case studies: a sub-salt two-dimensional seismic section and a three-dimensional seismic cube. Seismic facies analysis through musical attributes proves to be very useful in enhancing the interpretation of complicated structural features and in anticipating the presence of hydrocarbon-bearing layers.

17.
Conventional seismic data processing methods based on post-stack time migration have played an important role in coal exploration for decades. However, post-stack time migration often results in low-quality images in complex geological environments. To obtain high-quality images, we present a strategy that applies the Kirchhoff prestack time migration (PSTM) method to coal seismic data. In this paper, we describe the implementation of Kirchhoff PSTM for a 3D coal seam and derive the workflow of 3D Kirchhoff PSTM processing for coal seismic data. The processing sequence includes two major steps: 1) estimation of the 3D root-mean-square (RMS) velocity field; 2) Kirchhoff prestack time migration processing. During the construction of the 3D velocity model, the dip-moveout velocity serves as the initial migration velocity field. We combine 3D Kirchhoff PSTM with continuous adjustment of the 3D RMS velocity field using the criterion of flattened common-reflection-point gathers. Compared with post-stack time migration, the application of 3D Kirchhoff PSTM to coal seismic data produces better images of the coal seam reflections.
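Kirchhoff PSTM spreads each input sample along the double-square-root (DSR) traveltime surface computed from the RMS velocity field. A minimal check of the DSR expression (the velocity and geometry are illustrative):

```python
import numpy as np

def dsr_time(t0, xs, xr, v_rms):
    """Double-square-root traveltime used in Kirchhoff prestack time
    migration: two-way image time t0 at the image point, horizontal
    distances xs/xr from the image point to source/receiver, and the
    RMS velocity at the image location."""
    return np.sqrt((t0 / 2) ** 2 + (xs / v_rms) ** 2) + \
           np.sqrt((t0 / 2) ** 2 + (xr / v_rms) ** 2)

t0, v = 1.0, 2500.0
t_zero_offset = dsr_time(t0, 0.0, 0.0, v)   # collapses to t0 at zero offset

h = 600.0                                    # half-offset
t_midpoint = dsr_time(t0, h, h, v)           # image point at the midpoint
t_nmo = np.sqrt(t0 ** 2 + (2 * h / v) ** 2)  # NMO hyperbola, full offset 2h
```

At zero offset the DSR time collapses to the image time t0, and with the image point at the midpoint it reduces to the familiar NMO hyperbola, which is why flattened common-reflection-point gathers indicate a correct RMS velocity.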

18.
We present an approach that creates the possibility of reservoir monitoring on a quasi-continuous basis using surface seismic data. Current strategies and logistics for seismic data acquisition impose restrictions on the calendar-time temporal resolution obtainable for a given surface-seismic time-lapse monitoring program; one factor restricting the implementation of a quasi-continuous monitoring program using conventional strategies is the time it takes to acquire a complete survey. Here, quasi-continuous monitoring describes the process of reservoir monitoring at short time intervals. Our approach circumvents the restriction by requiring only a subset of a complete survey each time an image of the reservoir is needed. Ideally, the time interval between survey-subset acquisitions should be short, so that changes in the reservoir properties are small. The accumulated data are used to estimate the data unavailable at the monitor survey time, and the combined recorded and estimated data are used to produce an image of the subsurface for monitoring. We illustrate the effectiveness of our approach using 2D and 3D synthetic seismic data and 3D field seismic data, and explain the benefits and drawbacks of the proposed approach.

19.
Surface-related multiples are attenuated for one sail line and one streamer of a 3D data set (courtesy of Compagnie Générale de Géophysique). The survey was carried out in the Gulf of Mexico in the Green Canyon area, where salt intrusions close to the water-bottom are present. Because of the complexity of the subsurface, a wavefield method incorporating the full 3D volume of the data for multiple removal is necessary. This method comprises modelling of the multiples, where the data are used as a prediction operator, and a subtraction step, where the model of the multiples is adaptively removed from the data with matching filters. The accuracy of the multiple model depends on the source/receiver coverage at the surface. When this coverage is not dense enough, the multiple model contains errors that make successful subtraction more difficult. In these circumstances, one can either (1) improve the modelling step by interpolating the missing traces, (2) improve the subtraction step by designing methods that are less sensitive to modelling errors, or (3) both. For this data set, the second option is investigated by predicting the multiples in a 2D sense (as opposed to 3D) and performing the subtraction with a pattern-based approach. Because some traces and shots are missing for the 2D prediction, the data are interpolated in the in-line direction using a hyperbolic Radon transform with and without sparseness constraints; the interpolation with a sparseness constraint yields the best multiple model. For the subtraction, the pattern-based technique is compared with a more standard adaptive-subtraction scheme. The pattern-based approach is based on the estimation of 3D prediction-error filters for the primaries and the multiples, followed by a least-squares estimation of the primaries. Both methods are compared before and after prestack depth migration. These results suggest that, when the multiple model is not accurate, the pattern-based method is more effective than adaptive subtraction at removing surface-related multiples while preserving the primaries.
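The adaptive-subtraction baseline that the pattern-based method is compared against can be sketched as a least-squares matching filter: estimate a short filter that best shapes the multiple model to the data, then subtract the matched model. Trace length, spike positions, and filter length below are made up for illustration:

```python
import numpy as np

def matching_filter_subtract(data, model, nf=11):
    """Least-squares (Wiener) matching filter for adaptive subtraction:
    find filter f minimising ||data - model * f||_2 (convolution),
    then subtract the matched multiple model to estimate primaries."""
    n = len(data)
    # convolution matrix: column j is the model delayed by j samples
    M = np.zeros((n, nf))
    for j in range(nf):
        M[j:, j] = model[:n - j]
    f, *_ = np.linalg.lstsq(M, data, rcond=None)
    return data - M @ f, f

primaries = np.zeros(200); primaries[50] = 1.0   # a "primary" spike
true_mult = np.zeros(200); true_mult[120] = 0.8  # the actual multiple
model = np.zeros(200); model[117] = 1.0          # model mistimed and unscaled
trace = primaries + true_mult

est_primaries, f = matching_filter_subtract(trace, model, nf=11)
```

The filter absorbs the timing and amplitude errors of the multiple model; when primaries and multiples overlap, however, it also attacks the primaries, which motivates the pattern-based alternative in the abstract.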

20.
The modeling of tethering elements of seabed-anchored floating structures is addressed, with particular reference to the so-called Archimedes Bridge (submerged floating tunnel, SFT) solution for deep-water crossings; attention is devoted to the design solution encompassing slender bars as anchor elements. Two numerical tools are proposed: first, a geometrically nonlinear finite element (NWB model), developed in previous work, has been refined to capture the effect of higher flexural modes of the anchor bars; second, a 3D beam element based on the classical corotational formulation (CR model) has been developed and coded. Both elements are implemented in a numerical procedure for dynamic time-domain step-by-step analysis of nonlinear discretized systems; seismic loading is introduced by generating artificial time histories of spatially variable seismic motion. An example application of the NWB element is shown for the dynamic model of a complete SFT subjected to extreme multiple-support seismic loading. The seismic behavior is illustrated and discussed, especially in light of the effect of higher local vibration modes of the anchor bars. Finally, a comparison between the performance of the two modeling approaches is presented. Both harmonic and seismic excitations are considered in the test; the results justify the use of the simpler NWB approach, especially in the SFT design phase. Copyright © 2008 John Wiley & Sons, Ltd.


Copyright©北京勤云科技发展有限公司    京ICP备09084417号-23
