Similar documents
20 similar documents found (search time: 78 ms)
1.
This paper describes the general characteristics of raw data from fiber-fed spectrographs in general and fiber-fed IFUs in particular. The different steps of the data reduction are presented, and the techniques used to address the unusual characteristics of these data are described in detail. These techniques have been implemented in a specialized software package, R3D, developed to reduce fiber-based integral field spectroscopy (IFS) data. The package comprises a set of command-line routines adapted for each of these steps, suitable for creating pipelines. The routines have been tested against simulations, and against real data from various integral field spectrographs (PMAS, PPAK, GMOS, VIMOS and INTEGRAL). Particular attention is paid to the treatment of cross-talk. (© 2006 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
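Cross-talk between adjacent fibres is typically handled by fitting the profiles of neighbouring fibres simultaneously rather than summing fixed apertures. The following is a minimal, generic sketch of that idea, not the actual R3D implementation: for one CCD column, the fluxes of all fibres are obtained from a single linear least-squares fit in which each fibre contributes a Gaussian profile of assumed centre and width (all names and values here are illustrative assumptions).

    import numpy as np

    def extract_column(column, centres, sigma=1.5):
        """Simultaneously fit Gaussian fibre profiles to one CCD column.

        column  : 1-D array of counts along the cross-dispersion axis
        centres : fibre trace positions (pixels) in this column
        sigma   : assumed common Gaussian profile width (pixels)

        Solving for all fibres at once distributes blended counts between
        neighbours, which is what models the cross-talk.
        """
        y = np.arange(column.size)
        # Design matrix: one unit-sum Gaussian profile per fibre.
        A = np.exp(-0.5 * ((y[:, None] - centres[None, :]) / sigma) ** 2)
        A /= A.sum(axis=0)
        fluxes, *_ = np.linalg.lstsq(A, column, rcond=None)
        return fluxes

    # Toy example: three fibres separated by 5 pixels, the middle one brighter.
    centres = np.array([10.0, 15.0, 20.0])
    y = np.arange(30)
    truth = np.array([100.0, 300.0, 120.0])
    col = sum(f * np.exp(-0.5 * ((y - c) / 1.5) ** 2) / (1.5 * np.sqrt(2 * np.pi))
              for f, c in zip(truth, centres))
    print(extract_column(col, centres))   # ~ [100, 300, 120]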

2.
A probabilistic technique for the joint estimation of background and sources with the aim of detecting faint and extended celestial objects is described. Bayesian probability theory is applied to gain insight into the co-existence of background and sources through a probabilistic two-component mixture model, which provides consistent uncertainties of background and sources. A multiresolution analysis is used for revealing faint and extended objects in the frame of the Bayesian mixture model. All the revealed sources are parametrized automatically providing source position, net counts, morphological parameters and their errors.
We demonstrate the capability of our method by applying it to three simulated data sets characterized by different background and source intensities. The results of employing two different priors on the source signal distribution are shown. The probabilistic method allows for the detection of bright and faint sources independently of their morphology and of the kind of background. The results from our analysis of the three simulated data sets are compared with other source detection methods. Additionally, the technique is applied to ROSAT All-Sky Survey data.
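As a rough illustration of the two-component mixture idea (a sketch, not the authors' implementation), the posterior probability that a pixel's counts come from "background plus source" rather than "background only" can be written with Poisson likelihoods. The prior probability and the rate parameters below are illustrative assumptions.

    from scipy.stats import poisson

    def source_probability(counts, bg_rate, src_rate, p_src=0.1):
        """Posterior probability that a pixel contains source signal.

        Two-component mixture: background-only (rate bg_rate) versus
        background + source (rate bg_rate + src_rate), with prior p_src.
        """
        l_bg = poisson.pmf(counts, bg_rate)
        l_src = poisson.pmf(counts, bg_rate + src_rate)
        return p_src * l_src / (p_src * l_src + (1.0 - p_src) * l_bg)

    print(source_probability(15, bg_rate=5.0, src_rate=10.0))  # ~0.99
    print(source_probability(5, bg_rate=5.0, src_rate=10.0))   # ~0.001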

3.
A speedy pixon algorithm for image reconstruction is described. Two applications of the method to simulated astronomical data sets are also reported. In one case, galaxy clusters are extracted from multiwavelength microwave sky maps using the spectral dependence of the Sunyaev–Zel'dovich effect to distinguish them from the microwave background fluctuations and the instrumental noise. The second example involves the recovery of a sharply peaked emission profile, such as might be produced by a galaxy cluster observed in X-rays. These simulations show the ability of the technique both to detect sources in low signal-to-noise ratio data and to deconvolve a telescope beam in order to recover the internal structure of a source.

4.
We derive a generalized van Cittert-Zernike (vC-Z) theorem for radio astronomy that is valid for partially polarized sources over an arbitrarily wide field of view (FoV). The classical vC-Z theorem is the theoretical foundation of radio astronomical interferometry, and its application is the basis of interferometric imaging. Existing generalized vC-Z theorems in radio astronomy assume, however, either paraxiality (narrow FoV) or scalar (unpolarized) sources. Our theorem uses neither of these assumptions, which are seldom fulfilled in practice in radio astronomy, and treats the full electromagnetic field. To handle wide, partially polarized fields, we extend the two-dimensional (2D) electric field (Jones vector) formalism of the standard 'Measurement Equation' (ME) of radio astronomical interferometry to the full three-dimensional (3D) formalism developed in optical coherence theory. The resulting vC-Z theorem enables full-sky imaging in a single telescope pointing, and imaging based not only on standard dual-polarized interferometers (that measure 2D electric fields) but also electric tripoles and electromagnetic vector-sensor interferometers. We show that the standard 2D ME is easily obtained from our formalism in the case of dual-polarized antenna element interferometers. We also exploit an extended 2D ME to determine that dual-polarized interferometers can have polarimetric aberrations at the edges of a wide FoV. Our vC-Z theorem is particularly relevant to proposed, and recently developed, wide FoV interferometers such as the Low Frequency Array (LOFAR) and the Square Kilometre Array (SKA), for which direction-dependent effects will be important.
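For reference, the standard dual-polarized 'Measurement Equation' that the abstract generalizes is conventionally written in the 2D Jones formalism as below; this is the usual textbook form of the radio-interferometric measurement equation, not a reproduction of the paper's 3D result. V_pq is the 2x2 visibility matrix of baseline (p, q), J_p and J_q are the antenna Jones matrices, and B is the source brightness (coherency) matrix:

    V_{pq} = J_p \, B \, J_q^{\dagger},
    \qquad
    B = \langle \mathbf{e}\,\mathbf{e}^{\dagger} \rangle =
    \begin{pmatrix}
      \langle e_x e_x^* \rangle & \langle e_x e_y^* \rangle \\
      \langle e_y e_x^* \rangle & \langle e_y e_y^* \rangle
    \end{pmatrix}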

5.
We describe the integral field unit (IFU) which converts the Gemini Multiobject Spectrograph (GMOS) installed on the Gemini-North telescope to an integral field spectrograph, which produces spectra over a contiguous field of view of 7 × 5 arcsec with spatial sampling of 0.2 arcsec over the wavelength range 0.4-1.0 μm. GMOS is converted to this mode by the remote insertion of the IFU into the beam in place of the masks used for the multiobject mode. A separate field of half the area of the main field, but otherwise identical, is also provided to improve background subtraction. The IFU contains 1500 lenslet-coupled fibres and was the first facility of any type for integral field spectroscopy employed on an 8/10 m telescope. We describe the design, construction and testing of the GMOS IFU and present measurements of the throughput both in the laboratory and at the telescope. We compare these with a theoretical prediction made before construction started. All are in good agreement with each other, with the on-telescope throughput exceeding 60% (averaged over wavelength). Finally, we show an example of data obtained during commissioning to illustrate the power of the device.

6.
We develop a radio-astronomical approach to 3D reconstruction in few-projections tomography. It is based on the 2-CLEAN DSA method, which consists of two CLEAN algorithms using a synthesized beam. In complex cases two extreme solutions are used for the analysis of the image structure. These solutions determine the limits of permissible energy redistribution in the image among the components of small and large scales. Two variants of 3D reconstruction, proceeding from a set of two-dimensional projections (3D2D) and from a set of one-dimensional ones (3D1D), are considered. It is shown that the quality of 3D2D reconstruction should be similar to the quality of 2D1D reconstruction if the same number of equally spaced scans is used, whereas a doubled number of projections is required for 3D1D reconstruction. We have simulated the 3D reconstruction of an optically thin emitting object. The present technique is a component of astrotomography and has good prospects for a wide range of remote-sensing applications. (© 2005 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

7.
We report on the results of the spectroscopy of 10 objects previously classified as brown dwarf candidates via RIJHK colors by Eisenbeiss et al. (2009), who performed deep imaging observations of a ∼0.4 sq. deg. field at the edge of the Pleiades. We describe and assess classification techniques in the regime of M-type stars. To classify and characterise the objects, visual and near-infrared spectra were obtained with VLT FORS and ISAAC. The spectral classification was performed using the shape of the spectra as well as spectral indices that are sensitive to the spectral type and luminosity class of M-type stars and late M-type brown dwarfs. Furthermore, a spectrophotometric distance was calculated and compared with the distance of the Pleiades to investigate the membership probability. As a second argument we analyzed the proper motions. The brown dwarf candidates were found not to be brown dwarfs, but late-K to mid-M-type dwarf stars. Based on the obtained distances and tabulated proper motions we conclude that all objects are background dwarf stars. (© 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
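A spectrophotometric distance of the kind mentioned above follows from the distance modulus once the spectral type has fixed the absolute magnitude. The sketch below is a generic illustration with made-up example values (the apparent magnitude, absolute-magnitude calibration and extinction are assumptions, not the paper's numbers).

    def spectrophotometric_distance(m_app, M_abs, A=0.0):
        """Distance in parsec from the distance modulus m - M = 5 log10(d / 10 pc) + A."""
        return 10.0 ** ((m_app - M_abs - A + 5.0) / 5.0)

    # Example: a mid-M dwarf with an assumed absolute magnitude M_J = 8.5 seen at
    # J = 16.0 with negligible extinction would lie at ~320 pc, i.e. well behind
    # the Pleiades (~135 pc).
    print(spectrophotometric_distance(16.0, 8.5))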

8.
We have been undertaking a programme on the Gemini 8-m telescopes to demonstrate the power of integral field spectroscopy, using the optical GMOS spectrograph and the new CIRPASS instrument in the near-infrared. Here we present some preliminary results from 3D spectroscopy of extragalactic objects: mapping the emission lines in a 3CR radio galaxy and in a gravitationally lensed arc, exploring dark-matter substructure through observations of an Einstein Cross gravitational lens, and probing the star-formation time-scales of young massive clusters in the starburst galaxy NGC 1140. (© 2004 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

9.
From a historical point of view, it was only through the advent of the CCD as a linear, high-dynamic-range panoramic detector that it became possible to overcome the source confusion problem in stellar photometry, e.g., in star clusters or nearby galaxies. The ability to accurately sample the point-spread function (PSF) in two dimensions and to use it as a template for fitting severely overlapping stellar images is of fundamental importance for crowded-field photometry, and has thus become the foundation for the determination of accurate color-magnitude diagrams of globular clusters and the study of resolved stellar populations in nearby galaxies. Analogous to CCDs, the introduction of integral field spectrographs has opened a new avenue for crowded-field 3D spectroscopy, which benefits from PSF-fitting techniques in the same way that CCD photometry does. This paper presents first experience with sampling the PSF in 3D spectroscopy, reviews the effects of atmospheric refraction, discusses background-subtraction problems, and presents several science applications obtained from observations with the PMAS instrument at Calar Alto Observatory.
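The core of crowded-field PSF fitting, in CCD photometry and in its 3D analogue alike, is to fit one common PSF template simultaneously to overlapping sources. The minimal sketch below fits only the fluxes of stars at known positions with an assumed Gaussian PSF; the positions, PSF shape and all values are illustrative assumptions, not the PMAS pipeline.

    import numpy as np

    def fit_blended_fluxes(image, positions, sigma=2.0):
        """Least-squares fluxes of overlapping stars sharing one Gaussian PSF.

        image     : 2-D array
        positions : list of (x, y) star centres in pixels
        """
        ny, nx = image.shape
        yy, xx = np.mgrid[0:ny, 0:nx]
        # One PSF template per star, flattened into the columns of the design matrix.
        cols = [np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2 * sigma ** 2)).ravel()
                for x, y in positions]
        A = np.column_stack(cols)
        A /= A.sum(axis=0)                      # unit-sum templates -> fluxes in counts
        fluxes, *_ = np.linalg.lstsq(A, image.ravel(), rcond=None)
        return fluxes

    # Two heavily blended stars separated by ~1.5 sigma:
    yy, xx = np.mgrid[0:32, 0:32]
    psf = lambda x0, y0: np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * 2.0 ** 2))
    img = 500 * psf(14, 16) / psf(14, 16).sum() + 200 * psf(17, 16) / psf(17, 16).sum()
    print(fit_blended_fluxes(img, [(14, 16), (17, 16)]))   # ~ [500, 200]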

10.
We present the first version of E3D, the Euro3D visualization tool for data from integral field spectroscopy. We describe its major characteristics, based on the proposed requirements, the current state of the project, and some planned future upgrades. We show examples of its use and capabilities. (© 2004 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

11.
We performed extensive data simulations for the planned ultra-wide-field, high-precision photometric telescope ICE-T (International Concordia Explorer Telescope). ICE-T consists of two 60 cm-aperture Schmidt telescopes with a joint field of view, observing simultaneously in two photometric bandpasses. Two CCD cameras, each with a single 10.3k × 10.3k thinned back-illuminated device, would image a sky field of 65 square degrees. Given the telescope's location at Dome C on the East Antarctic Plateau, we searched for the star fields that best exploit the technical capabilities of the instrument and the site. We considered the effects of diurnal air-mass and refraction variations, solar and lunar interference, interstellar absorption, overexposure of bright stars and ghost images, crowding by background stars, and the ratio of dwarf to giant stars in the field. Using NOMAD, SSA, Tycho-2 and 2MASS-based stellar positions and BVIJH magnitudes for these fields, we simulated the effects of the telescope's point-spread function and of the integration and co-addition times. Simulations of transit light curves are presented for the selected star fields and convolved with the expected instrumental characteristics. For the brightest stars, we showed that ICE-T should be capable of detecting a 2 R_Earth super-Earth around a G2 solar-type star, as well as an Earth around an M0 star – if these targets were as abundant as hot Jupiters. Simultaneously, the telescope would monitor the host star's surface activity in an astrophysically interpretable manner. (© 2009 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
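The detectability claim for a 2 R_Earth super-Earth can be illustrated with the simple transit-depth relation depth ≈ (R_p / R_*)^2. The numbers below are generic round values for a G2 dwarf and an assumed M0-dwarf radius, not the paper's simulation output.

    R_EARTH = 6.371e6      # m
    R_SUN   = 6.957e8      # m

    def transit_depth(r_planet, r_star):
        """Fractional flux drop of a central transit: (Rp / R*)**2."""
        return (r_planet / r_star) ** 2

    # 2 R_Earth super-Earth in front of a G2 star (~1 R_Sun):
    print(transit_depth(2 * R_EARTH, 1.0 * R_SUN))    # ~3.4e-4, roughly 0.4 mmag
    # Earth-sized planet in front of an M0 dwarf (assumed ~0.6 R_Sun):
    print(transit_depth(1 * R_EARTH, 0.6 * R_SUN))    # ~2.3e-4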

12.
An algorithm for cosmic-ray rejection from single images is presented. The algorithm is based on modeling human perception using fuzzy logic. It is specifically designed to reject multiple-pixel cosmic-ray hits that are larger than the point spread functions of some of the true astronomical sources. Experiments show that the algorithm correctly rejects ∼97.5% of cosmic-ray hits while mistakenly rejecting only 0.02% of the true astronomical sources. The major advantage of the presented algorithm is its computational efficiency. (© 2005 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

13.
Our new algorithm for differential photometry solves the problem of identifying suitable comparison stars without a prior detailed study of the field of view. The comparison stars' variability is determined in a self-consistent way, and their weighted average is used as the reference level. The maximum error in differential photometry using target objects and reference stars of different spectral types is estimated. The results of these calculations show that the chosen photometric band largely determines the level of accuracy achieved. Finally, planetary transits are an important application of high-precision differential photometry. (© 2005 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
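The weighted-average reference level described above can be sketched schematically as follows; the weighting scheme (inverse variance of each comparison star's own light curve) and all numbers are illustrative assumptions rather than the authors' exact prescription.

    import numpy as np

    def differential_lightcurve(target_mag, comp_mags):
        """Differential magnitudes of a target against a weighted ensemble reference.

        target_mag : (n_epochs,) instrumental magnitudes of the target
        comp_mags  : (n_epochs, n_comp) instrumental magnitudes of comparison stars

        Each comparison star is weighted by the inverse variance of its own
        light curve, so intrinsically variable comparisons are down-weighted.
        """
        w = 1.0 / np.var(comp_mags, axis=0)
        reference = (comp_mags * w).sum(axis=1) / w.sum()
        return target_mag - reference

    rng = np.random.default_rng(1)
    comps = 12.0 + rng.normal(0.0, 0.01, size=(100, 5))   # five comparison stars
    comps[:, 0] += 0.3 * np.sin(np.linspace(0, 6, 100))   # ...one of them variable
    target = 13.0 + rng.normal(0.0, 0.01, 100)
    print(differential_lightcurve(target, comps).std())   # scatter ~ 0.01 mag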

14.
Difference imaging is a technique for obtaining precise relative photometry of variable sources in crowded stellar fields and, as such, constitutes a crucial part of the data-reduction pipeline in surveys for microlensing events or transiting extrasolar planets. The Optimal Image Subtraction (OIS) algorithm of Alard & Lupton (1998) permits the accurate differencing of images by determining convolution kernels which, when applied to reference images with particularly good seeing and signal-to-noise ratio (S/N), provide excellent matches to the point-spread functions (PSF) in the other images of the time series to be analysed. The convolution kernels are built as linear combinations of a set of basis functions, conventionally bivariate Gaussians modulated by polynomials. The kernel parameters, mainly the widths and maximal degrees of the basis-function model, must be supplied by the user. Ideally, these parameters should be matched to the PSF, pixel sampling, and S/N of the data set or of the individual images to be analysed. We have studied the dependence of the reduction outcome on the kernel parameters using our new implementation of OIS within the IDL-based TRIPP package. From the analysis of noise-free PSF simulations of both single objects and crowded fields, as well as of the test images in the ISIS OIS software package, we derive qualitative and quantitative relations between the kernel parameters and the success of the subtraction as a function of the PSF widths and sampling in the reference and data images, and we compare the results to those of other implementations found in the literature. On the basis of these simulations, we provide recommended parameters for data sets with different S/N and sampling. (© 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
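The kernel model described above (a basis of Gaussians modulated by polynomials, in the style of Alard & Lupton) can be sketched as below. The widths, polynomial degrees and kernel size are exactly the user-supplied parameters the paper studies; the specific values here are arbitrary illustrations, not recommendations from the paper.

    import numpy as np

    def kernel_basis(half_size=7, sigmas=(0.7, 1.5, 3.0), degrees=(6, 4, 2)):
        """Alard & Lupton style kernel basis: Gaussians modulated by polynomials.

        Returns a list of 2-D basis images; the fitted convolution kernel is a
        linear combination of these. Each Gaussian width sigma is paired with a
        maximal polynomial degree, and the basis contains all monomials
        x**i * y**j with i + j <= degree.
        """
        y, x = np.mgrid[-half_size:half_size + 1, -half_size:half_size + 1]
        basis = []
        for sigma, deg in zip(sigmas, degrees):
            g = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
            for i in range(deg + 1):
                for j in range(deg + 1 - i):
                    basis.append(g * x ** i * y ** j)
        return basis

    B = kernel_basis()
    print(len(B), B[0].shape)   # 49 basis images of 15 x 15 pixels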

15.
A maximum entropy method (MEM) is presented for separating the emission resulting from different foreground components from simulated satellite observations of the cosmic microwave background radiation (CMBR). In particular, the method is applied to simulated observations by the proposed Planck Surveyor satellite. The simulations, performed by Bouchet & Gispert, include emission from the CMBR and the kinetic and thermal Sunyaev–Zel'dovich (SZ) effects from galaxy clusters, as well as Galactic dust, free–free and synchrotron emission. We find that the MEM technique performs well and produces faithful reconstructions of the main input components. The method is also compared with traditional Wiener filtering and is shown to produce consistently better results, particularly in the recovery of the thermal SZ effect.
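As a point of reference for the Wiener-filtering comparison mentioned above, the simplest (single-component, diagonal) Wiener filter weights each Fourier mode of a map by the ratio of signal power to signal-plus-noise power. The sketch below is that textbook version with made-up power spectra and a stand-in data map, not the multi-frequency matrix filter used for Planck-type component separation.

    import numpy as np

    def wiener_filter(data_map, signal_power, noise_power):
        """Diagonal Wiener filter in Fourier space: each mode is weighted by S / (S + N)."""
        d_k = np.fft.fft2(data_map)
        W = signal_power / (signal_power + noise_power)
        return np.real(np.fft.ifft2(W * d_k))

    # Illustrative red signal spectrum and white noise spectrum on a 64x64 grid.
    n = 64
    k = np.hypot(np.fft.fftfreq(n)[:, None], np.fft.fftfreq(n)[None, :]) + 1e-3
    S = k ** -3                       # assumed signal power spectrum (falls with k)
    N = np.full_like(S, 50.0)         # assumed flat (white) noise power

    rng = np.random.default_rng(0)
    noisy = rng.normal(size=(n, n))          # stand-in data map
    recon = wiener_filter(noisy, S, N)       # large-scale modes retained, small scales suppressed
    print(recon.shape)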

16.
We present the results of TRIFFID simultaneous V- and B-band observations of the cores of the globular clusters M15, M92 and NGC 6712. A variability search of their dense centres was made feasible by performing post-exposure image sharpening, increasing the image resolution by a factor of ∼2. The ISIS implementation of the image-subtraction technique developed by Alard & Lupton was then used to detect flux variations in our image sets. We have obtained periods for all observable variables in our field of view in NGC 6712 and have found two new RR Lyrae variables (an RRab and an RRc). We have confirmed three variables in our field of view of M92. For M15, we detect 48 variables in our field of view, 23 of which are new discoveries. We obtain periods and amplitudes for all variables and classify the new ones based on the light-curve shape, the most significant period and the mean magnitude in the V band. Among the detected RR Lyrae we find 19 RRc, 12 RRab and two RRd types. In the subsequent analysis we find a marked excess of RRc over RRab variables in the core. In a refined procedure to search for fainter objects we find no dwarf novae in our field of view of M15. Simulations performed on the data set to quantify our sensitivity to such objects indicate an upper limit of 10 dwarf novae (at 92 per cent probability) in our field of view. The implications of this result for globular clusters are discussed.
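Period determination of the kind reported above is commonly done with a Lomb-Scargle periodogram. The sketch below uses astropy's implementation on a synthetic RRc-like light curve; the period, amplitude and sampling are invented for illustration, and the paper does not state that this particular period-search method was used.

    import numpy as np
    from astropy.timeseries import LombScargle

    rng = np.random.default_rng(2)
    t = np.sort(rng.uniform(0.0, 10.0, 200))     # days, irregular sampling
    true_period = 0.31                            # assumed RRc-like period (days)
    mag = 15.0 + 0.25 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.02, t.size)

    freq, power = LombScargle(t, mag).autopower(minimum_frequency=1.0,
                                                maximum_frequency=10.0)
    best_period = 1.0 / freq[np.argmax(power)]
    print(f"recovered period: {best_period:.3f} d")   # ~0.31 d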

17.
We present a study of the dynamic-range limitations in images produced with the proposed Square Kilometre Array (SKA) using the Cotton-Schwab CLEAN algorithm for data processing. The study is limited to the case of a small field of view and a snapshot observation. A new modification of the Cotton-Schwab algorithm involving optimization of the positions of the CLEAN components is suggested. This algorithm can reach a dynamic range as high as 10^6 even if the point source lies between image grid points, in contrast to about 10^3 for existing CLEAN-based algorithms in the same circumstances. It is shown that the positional accuracy of the CLEAN components, floating-point precision and the w-term are extremely important at high dynamic range. The influence of these factors can be reduced if the variance of the gradient of the point spread function is minimized during the array design.
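For orientation, the basic CLEAN minor cycle that Cotton-Schwab-type algorithms build upon is sketched below. This is a plain Högbom-style loop on grid positions; the subpixel component-position optimization proposed in the paper is not included, and the gain and threshold values are arbitrary.

    import numpy as np
    from scipy.signal import fftconvolve

    def hogbom_clean(dirty, psf, gain=0.1, threshold=1e-3, max_iter=1000):
        """Basic Hogbom CLEAN minor cycle with components on grid positions.

        dirty : dirty image (residual image to be deconvolved)
        psf   : point spread function, same shape as dirty, peak at the centre
        Returns (clean component image, final residual).
        """
        residual = dirty.copy()
        components = np.zeros_like(dirty)
        cy, cx = np.array(psf.shape) // 2
        for _ in range(max_iter):
            y, x = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
            peak = residual[y, x]
            if abs(peak) < threshold:
                break
            components[y, x] += gain * peak
            # Subtract a shifted, scaled copy of the PSF from the residual.
            shifted = np.roll(np.roll(psf, y - cy, axis=0), x - cx, axis=1)
            residual -= gain * peak * shifted
        return components, residual

    # Toy test: two point sources convolved with a Gaussian beam.
    yy, xx = np.mgrid[0:65, 0:65]
    psf = np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / (2 * 2.0 ** 2))
    sky = np.zeros((65, 65)); sky[20, 20] = 1.0; sky[40, 45] = 0.5
    dirty = fftconvolve(sky, psf, mode='same')
    comps, res = hogbom_clean(dirty, psf)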

18.
We have developed a source-detection algorithm based on the Minimal Spanning Tree (MST), a graph-theoretical method useful for finding clusters in a given set of points. This algorithm is applied to two-dimensional γ-ray images in which the points correspond to the arrival directions of photons, and possible sources are associated with the regions where they cluster. Filters to select these clusters and to reduce spurious detections are introduced. An empirical study of the statistical properties of the MST on random fields is carried out in order to derive criteria for estimating the best filter values. We also introduce two parameters useful for verifying the reliability of candidate sources. To show how the MST algorithm works in practice, we present an application to an EGRET observation of the Virgo field, at high Galactic latitude and with a low and rather uniform background, in which several sources are detected.
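The MST clustering step can be sketched generically with scipy: build the minimum spanning tree of the photon arrival directions, cut every edge longer than a chosen threshold, and treat the surviving connected groups as cluster candidates. The edge-length threshold and minimum cluster size are the kind of filter parameters the paper calibrates; the values below are arbitrary, and this is not the authors' code.

    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

    def mst_clusters(points, max_edge, min_size=5):
        """Cluster candidates from a minimum spanning tree of photon positions.

        points   : (n, 2) array of arrival directions (e.g. degrees on the sky)
        max_edge : edges longer than this are removed from the tree
        min_size : clusters with fewer points than this are discarded
        Returns a list of index arrays, one per surviving cluster.
        """
        dist = squareform(pdist(points))
        mst = minimum_spanning_tree(dist).toarray()
        mst[mst > max_edge] = 0.0                        # cut long edges
        n_comp, labels = connected_components(mst, directed=False)
        return [np.flatnonzero(labels == k)
                for k in range(n_comp)
                if np.count_nonzero(labels == k) >= min_size]

    # Toy field: a uniform photon background plus one compact clump.
    rng = np.random.default_rng(3)
    background = rng.uniform(0.0, 10.0, size=(300, 2))
    clump = rng.normal(loc=[4.0, 6.0], scale=0.05, size=(30, 2))
    photons = np.vstack([background, clump])
    print([len(c) for c in mst_clusters(photons, max_edge=0.15)])   # the ~30-photon clump is recovered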

19.
We discuss the technique of wide-field imaging as it applies to Very Long Baseline Interferometry (VLBI). In the past, VLBI data sets were usually averaged so severely that the field of view was typically restricted to regions extending a few hundred milliarcseconds from the phase centre of the field. Recent advances in data-analysis techniques, together with increasing data-storage capabilities and enhanced computer processing power, now permit VLBI images to be made whose angular size represents a significant fraction of an individual antenna's primary beam. This technique has recently been applied successfully to several large-separation gravitational lens systems, compact supernova remnants in the starburst galaxy M82, and two faint radio sources located within the same VLA FIRST field. It seems likely that other VLBI observing programmes would benefit from this wide-field approach to VLBI data analysis. With the raw sensitivity of global VLBI set to improve by a factor of 4–5 over the coming few years, the number of sources that can be detected in a given field will rise considerably. In addition, a continued improvement in VLBI's ability to image relatively faint and extended low-brightness-temperature features (such as hot-spots in large-scale astrophysical jets) is also to be expected. As VLBI sensitivity approaches the μJy level, a wide-field approach to data analysis becomes inevitable.

20.
We assess the performance of the FRODOSpec integral-field spectrograph for observations of faint sources in crowded fields. Because the standard L2 processing pipeline yields overly noisy fibre spectra, we present new processing software (L2LENS) that produces accurate spectra for the two images of the gravitationally lensed quasar Q0957+561. Among other things, the L2LENS reduction tool accounts for the presence of cosmic-ray events, scattered-light backgrounds, blended sources, and chromatic source displacements due to differential atmospheric refraction. Our non-standard reduction of the Q0957+561 data demonstrates the ability of FRODOSpec to provide useful information on a wide variety of targets, and thus the considerable potential of integral-field spectrographs on current and future robotic telescopes. (© 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
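The chromatic displacement mentioned above (differential atmospheric refraction) is usually estimated, for zenith distance z and refractive indices n(λ) of air at the observing site, with the standard plane-parallel approximation below; this is the textbook expression, not the specific correction implemented in L2LENS:

    \Delta R(\lambda_1, \lambda_2) \;\approx\; 206265'' \,\bigl[\, n(\lambda_1) - n(\lambda_2) \,\bigr] \tan z

Across the optical band at z ≈ 45° the resulting offset is of order an arcsecond at sea level (less at high-altitude sites), i.e. comparable to typical fibre diameters, which is why the displacement must be tracked as a function of wavelength in fibre spectroscopy.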
