Photometry Pipeline

Read the following publications for a more thorough description of the photometry pipeline:

High-Level Overview

astrometry

We start with Emmanuel Bertin's SExtractor program to extract star images. Next, we run Astrometry.net, which determines the plate centers of our images with a 99.5% success rate. Our first test correctly identified the center of plate i52601, shown at left. This identification solved a long-standing mystery created when the operator of the 8" Draper Doublet pointed the telescope 1 hour in Right Ascension away from the position he wrote in the logbook. Astrometry.net also reports the rotation needed to bring North up and whether a mosaic must be flipped because the plate was inserted emulsion-down in the telescope plate holder or the scanner.
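
For concreteness, here is a minimal sketch of how these first two steps can be driven from the standard command-line tools; the file names and configuration file are illustrative placeholders, not the DASCH scripts.

    # Hypothetical driver for the first two steps: SExtractor, then Astrometry.net.
    import subprocess

    plate_image = "i52601_mosaic.fits"   # placeholder scanned-plate file name

    # Extract star images with SExtractor ("source-extractor" on newer installs);
    # "default.sex" is an assumed configuration file.
    subprocess.run(
        ["sex", plate_image, "-c", "default.sex", "-CATALOG_NAME", "i52601.cat"],
        check=True,
    )

    # Blind astrometric solve with Astrometry.net, which reports the plate center,
    # rotation, and parity (flip) of the solution.
    subprocess.run(
        ["solve-field", plate_image, "--overwrite", "--no-plots"],
        check=True,
    )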

After Astrometry.net, we use Jessica Mink's WCSTools to match the plate to the Tycho-2 catalog and account for stellar proper motions. We then use IRAF's ccmap to fit a 6th-order polynomial to the distortions of the original telescope. We divide the plate into a 50 x 50 grid and make a final astrometric correction based on the median residual positional error of all of the stars in each grid rectangle. We use the same grid to calculate an estimated extinction correction for all of the stars in each grid rectangle.
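
The grid-median step can be sketched as follows, assuming the matched-star pixel positions and their fitted-minus-catalog residuals are already available as NumPy arrays (an illustration, not the pipeline's own code).

    import numpy as np

    N_CELLS = 50

    def grid_median_residuals(x, y, dx, dy, width, height):
        """Median positional residual (dx, dy) in each cell of a 50 x 50 grid."""
        ix = np.clip((x / width * N_CELLS).astype(int), 0, N_CELLS - 1)
        iy = np.clip((y / height * N_CELLS).astype(int), 0, N_CELLS - 1)
        med_dx = np.zeros((N_CELLS, N_CELLS))
        med_dy = np.zeros((N_CELLS, N_CELLS))
        for i in range(N_CELLS):
            for j in range(N_CELLS):
                sel = (ix == i) & (iy == j)
                if sel.any():
                    med_dx[i, j] = np.median(dx[sel])
                    med_dy[i, j] = np.median(dy[sel])
        return med_dx, med_dy

    # A star's final correction is the median residual of its grid cell,
    # subtracted from its fitted position.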

i31090

The image above shows the center of M44 taken in 1903 and illustrates a special analysis problem presented by old photometric techniques. Note that the triangle on the right has a ghost image on the left generated by a "Pickering Wedge". This wedge was placed on the telescope objective in an attempt to extend the dynamic range of film. When the primary star image is saturated, a researcher can measure the ghost images and add a known wedge magnitude offset. We recently discovered that the modern Damon Yellow series has about 50 plates with similar ghost images, probably caused by internal reflections in the filter. The solution to the ghost image problem is to superimpose the brightest stars on the plate and then search for the ghost image at the known offset of each Pickering Wedge. The search annulus shown in the following graph has a distinct peak at an angle of 180 degrees. With this information we can flag the ghost images so that they are not mistaken for novae.
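
A hedged sketch of that search: for each bright star, collect the position angles of detections found in an annulus at the known wedge offset, locate the peak of the stacked angle histogram, and flag the detections at that radius and angle. The offset and tolerances below are made-up values, not the real wedge geometry.

    import numpy as np

    def flag_ghosts(bright_xy, det_xy, offset_px, tol_px=5.0, angle_tol_deg=5.0):
        """Return a boolean mask marking detections that look like wedge ghosts."""
        det_xy = np.asarray(det_xy, dtype=float)
        ghosts = np.zeros(len(det_xy), dtype=bool)
        all_angles, per_star = [], []
        for bx, by in bright_xy:
            dx = det_xy[:, 0] - bx
            dy = det_xy[:, 1] - by
            r = np.hypot(dx, dy)
            in_annulus = np.abs(r - offset_px) < tol_px          # search annulus
            theta = np.degrees(np.arctan2(dy[in_annulus], dx[in_annulus])) % 360.0
            all_angles.extend(theta)
            per_star.append((in_annulus, theta))
        if not all_angles:
            return ghosts
        # The stacked histogram of angles peaks at the common ghost direction
        # (about 180 degrees in the example described above).
        hist, edges = np.histogram(all_angles, bins=72, range=(0.0, 360.0))
        peak = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])
        for in_annulus, theta in per_star:
            near = np.abs((theta - peak + 180.0) % 360.0 - 180.0) < angle_tol_deg
            ghosts[np.where(in_annulus)[0][near]] = True
        return ghosts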

wedge

After this step, we perform a final match with the GSC2.3.2 catalog for every plate, and with the Kepler Input Catalog for plates in the Orion Arm of the Milky Way. In April 2011, the DASCH team began to use the APASS catalog, which offers accurate magnitudes in a brightness range well matched to our plate collection. Both the Kepler Input Catalog and the APASS catalog are supplemented with proper motions from the UCAC4 catalog.
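
The matching step can be pictured with a short astropy sketch; the proper-motion column convention (mu_alpha*cos(delta) and mu_delta in mas/yr at epoch J2000) and the matching radius are assumptions, not the DASCH matcher's actual settings.

    import numpy as np
    import astropy.units as u
    from astropy.coordinates import SkyCoord

    MAS_PER_DEG = 3.6e6

    def match_detections(det_ra, det_dec, cat_ra, cat_dec, pmra_cosdec, pmdec,
                         plate_epoch, max_sep_arcsec=3.0):
        """Cross-match plate detections (degrees) against a catalog given at J2000."""
        dt = plate_epoch - 2000.0                                  # years since J2000
        ra_ep = cat_ra + dt * pmra_cosdec / MAS_PER_DEG / np.cos(np.radians(cat_dec))
        dec_ep = cat_dec + dt * pmdec / MAS_PER_DEG
        cat = SkyCoord(ra=ra_ep * u.deg, dec=dec_ep * u.deg)       # catalog at plate epoch
        det = SkyCoord(ra=det_ra * u.deg, dec=det_dec * u.deg)
        idx, sep, _ = det.match_to_catalog_sky(cat)
        return idx, sep, sep < max_sep_arcsec * u.arcsec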

defectfilter annularbin

We next run a defect filter developed by Sumin Tang. This filter examines four sets of SExtractor shape parameters for images matched to the GSC2.3.2 catalog and rejects images that do not fit this profile. The image above shows that slightly trailed images make it easier to separate genuine star images from dust, contamination, plate defects, and development defects.
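
The idea behind such a filter can be sketched as follows; the particular shape parameters, the robust (MAD-based) scatter estimate, and the cut level are illustrative assumptions rather than the actual criteria.

    import numpy as np

    SHAPE_COLS = ["A_IMAGE", "B_IMAGE", "ELONGATION", "FWHM_IMAGE"]  # assumed set

    def shape_filter(catalog, matched, n_sigma=4.0):
        """Keep detections whose SExtractor shapes resemble the matched stars.

        catalog : dict of column name -> NumPy array from the SExtractor output
        matched : boolean mask of detections matched to GSC2.3.2
        """
        keep = np.ones(len(matched), dtype=bool)
        for col in SHAPE_COLS:
            vals = catalog[col]
            center = np.median(vals[matched])
            scatter = 1.4826 * np.median(np.abs(vals[matched] - center))  # robust sigma
            keep &= np.abs(vals - center) < n_sigma * scatter
        return keep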

We next divide the images into nine annular bins in recognition of off-axis vignetting, coma, and other image distortions. We normally reject images in the outermost bin, bin 9, when searching for variables and novae because this bin usually shows severe problems with distortion and defects. For each annular bin, Sumin Tang developed an algorithm to estimate the color term necessary to transform the GSC2.3.2, Kepler, or APASS color system into the color system of the plate emulsion and any external color filters. Compare the color-term correction of -0.24 for the Damon Blue plate on the left with the -0.78 correction for a Damon Red plate on the right.

colorterm
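
A simplified sketch of a per-annulus color-term estimate: assign each matched star to one of nine annuli by its distance from the plate center, then fit the slope of the plate-minus-catalog magnitude difference against an assumed catalog color such as B - R. The equal-width annuli and straight-line fit are assumptions for illustration.

    import numpy as np

    N_BINS = 9

    def annular_bin(x, y, x0, y0, r_max):
        """Annular bin index 1..9 from a star's distance to the plate center."""
        r = np.hypot(x - x0, y - y0)
        return np.clip((r / r_max * N_BINS).astype(int) + 1, 1, N_BINS)

    def color_term(plate_mag, cat_mag, cat_color):
        """Least-squares slope of the magnitude difference versus catalog color."""
        dmag = plate_mag - cat_mag
        slope, _intercept = np.polyfit(cat_color, dmag, 1)
        return slope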

Sumin Tang also developed an algorithm to convert the instrumental SExtractor magnitudes to GSC2.3.2, Kepler Input Catalog, or APASS catalog magnitudes using the "locally weighted scatterplot smoothing", or "lowess", curve-fitting algorithm. This piecewise-linear technique makes no assumptions concerning the non-linear behavior of old emulsions.
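
A minimal sketch of a lowess calibration step, assuming the statsmodels implementation and an illustrative smoothing fraction (the pipeline's own fitting choices may differ):

    import numpy as np
    from statsmodels.nonparametric.smoothers_lowess import lowess

    def calibrate_lowess(inst_mag_matched, cat_mag_matched, inst_mag_all, frac=0.2):
        """Map instrumental magnitudes to catalog magnitudes with a lowess curve."""
        fit = lowess(cat_mag_matched, inst_mag_matched, frac=frac, return_sorted=True)
        # fit[:, 0] holds instrumental magnitudes, fit[:, 1] the smoothed catalog
        # magnitudes; evaluate the curve at every detection by interpolation.
        return np.interp(inst_mag_all, fit[:, 0], fit[:, 1])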

lowess

The next correction uses a local smoothing algorithm developed by Silas Laycock. We divide the plate into a 50 x 50 rectangular grid and make a final magnitude adjustment based on the clipped median difference between the magnitudes of all of the stars in each rectangular bin and their catalog magnitudes. This correction accounts for uneven emulsion, development, and original sky conditions in these large-area plates.
In December 2011, the DASCH team implemented a second stage of smoothing, designed by Sumin Tang, which uses variable-size bins in both magnitude and plate location.
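
An approximate sketch of the first-stage grid correction, assuming 3-sigma clipping and equal-size cells (the variable-size second stage is not shown):

    import numpy as np
    from astropy.stats import sigma_clip

    N_CELLS = 50

    def local_correction(x, y, dmag, width, height):
        """Clipped-median magnitude offset in each cell of a 50 x 50 grid.

        dmag is the calibrated-minus-catalog magnitude of each matched star.
        """
        ix = np.clip((x / width * N_CELLS).astype(int), 0, N_CELLS - 1)
        iy = np.clip((y / height * N_CELLS).astype(int), 0, N_CELLS - 1)
        corr = np.zeros((N_CELLS, N_CELLS))
        for i in range(N_CELLS):
            for j in range(N_CELLS):
                sel = (ix == i) & (iy == j)
                if sel.any():
                    corr[i, j] = np.ma.median(sigma_clip(dmag[sel], sigma=3.0))
        return corr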

After the local smoothing correction, we reverse the extinction correction and return magnitudes to the color system of the original catalog. At this stage we calculate magnitude estimates for all images that were not matched to the original catalog. The early astronomers, however, often exposed multiple star fields on a single plate in order to avoid errors caused by differences in plate emulsions and development. We therefore take the unmatched star images and rerun the pipeline as many times as necessary to isolate all of the separate exposures. Our current record is the identification of 6 exposures each for plates mc04979, mc05077, and mc12668.
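
Schematically, that loop can be pictured as follows; solve_astrometry and match_catalog are hypothetical placeholders standing in for the full pipeline stages described above.

    def separate_exposures(detections, solve_astrometry, match_catalog,
                           max_exposures=10):
        """Peel off one exposure per pass by re-solving the leftover detections."""
        exposures = []
        remaining = detections
        for _ in range(max_exposures):
            if len(remaining) == 0:
                break
            solution = solve_astrometry(remaining)     # e.g. an Astrometry.net solve
            if solution is None:
                break
            matched, unmatched = match_catalog(remaining, solution)
            if len(matched) == 0:
                break
            exposures.append((solution, matched))
            remaining = unmatched                      # feed the leftovers back through
        return exposures, remaining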

local