My main interest is time-series photometry of faint objects, and to measure stellar fluxes in my images I use a process called optimal extraction. Optimal extraction was originally developed in the 1980s for measuring CCD spectra. In the 1990s it was adapted by Tim Naylor - then of Keele University, and now Norman Lockyer Professor of Astrophysics at the University of Exeter - for photometric measurement of CCD images. A full description may be found in the original paper, Naylor (1997), which can be accessed via the author's web pages.

Optimal extraction is a method for extracting, with the highest available precision, the total flux of a star measured against a sky background. It assumes the errors are dominated by photon statistics, so it is particularly suitable for faint objects. It is not aperture photometry, nor is the flux obtained by direct fitting of a point spread function (PSF). For a star image superimposed on a significant sky background, it can be shown that optimal photometry offers an increase of up to 10% in signal-to-noise ratio compared with aperture photometry. In the limiting case of zero sky background, optimal extraction offers no advantage. Despite working well with faint targets, the method seems to have remained almost unknown in the amateur community.

Here I will try to explain how optimal extraction works, with the aid of an Excel spreadsheet that models the extraction process. This demonstrates several important features of both aperture and optimal photometry. I then investigate the performance of both methods when applied to synthetic images, where the stellar PSFs and fluxes are known, and also to real data from my own observatory.
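To illustrate the idea, here is a minimal Python sketch - not the author's spreadsheet or code - that builds a synthetic Gaussian star on a flat sky and compares a sky-subtracted aperture estimate with an optimally weighted estimate. The PSF width, star flux, sky level, and aperture radius are invented for the demonstration; the weighting uses the standard optimal-extraction form, with each pixel weighted by PSF value over pixel variance and the weights normalised so the estimator is unbiased.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Synthetic star: 2-D Gaussian PSF on a flat sky (hypothetical values) ---
size, sigma = 21, 2.0          # stamp size (pixels), PSF width (pixels)
flux, sky = 500.0, 100.0       # total star counts, sky counts per pixel

y, x = np.mgrid[:size, :size]
c = (size - 1) / 2
psf = np.exp(-((x - c) ** 2 + (y - c) ** 2) / (2 * sigma ** 2))
psf /= psf.sum()               # normalised profile: sum(P) = 1

# Optimal weights: w_i proportional to P_i / V_i, scaled so that
# sum(w * P) = 1, making F = sum(w * (D - sky)) an unbiased flux estimate.
var = sky + flux * psf         # photon-noise variance per pixel
w = (psf / var) / np.sum(psf ** 2 / var)

# Aperture photometry: circular mask of radius ~0.68 * FWHM,
# close to the best choice for a faint star on a bright sky.
fwhm = 2.355 * sigma
r_ap = 0.68 * fwhm
ap = ((x - c) ** 2 + (y - c) ** 2) <= r_ap ** 2
ap_frac = psf[ap].sum()        # fraction of the stellar flux inside the aperture

# Monte-Carlo comparison of the two estimators on noisy realisations
n_trials = 2000
opt_est, ap_est = [], []
for _ in range(n_trials):
    img = rng.poisson(flux * psf + sky).astype(float)
    opt_est.append(np.sum(w * (img - sky)))
    ap_est.append(np.sum(img[ap] - sky) / ap_frac)  # corrected to total flux

opt_est, ap_est = np.array(opt_est), np.array(ap_est)
print(f"optimal : mean={opt_est.mean():7.1f}  snr={opt_est.mean() / opt_est.std():.2f}")
print(f"aperture: mean={ap_est.mean():7.1f}  snr={ap_est.mean() / ap_est.std():.2f}")
```

Both estimators recover the input flux on average, but the optimally weighted sum shows a modestly higher signal-to-noise ratio, of the order of the up-to-10% gain quoted above; shrink the sky level towards zero and the advantage disappears.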