Implementation of multi_image_photometry() code in photometry.py #145

Merged

Changes from all commits (20 commits):
7a5db3f
Quick photometry.py code reorganization moving soon to be removed fun…
JuanCab Jul 27, 2023
030b791
First, untested version of multi_image_photometry
JuanCab Jul 27, 2023
befd98a
Better integrated single_image_photometry with multi_image_photometry.
Jul 28, 2023
ec75600
Fixed logic in single_image_photometry to obtain x/y position from WC…
JuanCab Jul 28, 2023
a3f5661
Added ability to pass a fwhm_estimate to the new photometry routines.
JuanCab Jul 28, 2023
364885d
Improved FakeCCDImage and added function to shift center in images.
JuanCab Jul 28, 2023
a4a8fbf
A fully fleshed out test for the multi_image_photometry() function ad…
JuanCab Jul 28, 2023
a11de97
Updated shifted fake CCDImage test function to generate a new image i…
JuanCab Jul 28, 2023
d591830
Updated CHANGES.rst to reflect changes.
Jul 30, 2023
0243a9d
Tweaked multi_image_photometry() to pass use_coordinates to single_im…
Jul 30, 2023
bc49fb0
Fixed single_image_photometry() to not use Errors to exit unless inte…
JuanCab Jul 31, 2023
66dafe5
Adjust FWHM fitting to allow NaN and cleaned up not using Errors when…
JuanCab Jul 31, 2023
e7873c1
Added tests for missing ra/dec in source list and for no WCS in image…
JuanCab Jul 31, 2023
cc3015b
Moved checking of image headers to earlier in single_image_photometry…
JuanCab Aug 1, 2023
7cec31f
Fixed typo.
JuanCab Aug 1, 2023
a596fbd
Added output of bad FWHM count.
JuanCab Aug 1, 2023
95d79b4
Removed debug line
JuanCab Aug 1, 2023
1103f35
Fixed test_photometry so tests can run under Windows (with a stupid 3…
JuanCab Aug 1, 2023
311a7a3
Added ignore_cleanup_errors=True to TemporaryDirectory creation to av…
Aug 2, 2023
f95d9dd
Addressing all issues raised by Matt in PR #145
JuanCab Aug 3, 2023
8 changes: 5 additions & 3 deletions CHANGES.rst
@@ -1,14 +1,16 @@
1.3.10 (unreleased)
2.0.0 (unreleased)
------------------

New Features
^^^^^^^^^^^^
+ Creation of new data classes for handling aperture, photometry, and catalog data in a more consistent way by enforcing validation and certain column names. [#125]
+ Development of new data classes for handling source list, photometry, and catalog data which include data format validation. [#125]
+ Aperture photometry streamlined into ``single_image_photometry`` and ``multi_image_photometry`` functions that use the new data classes. [#141]
+ ``multi_image_photometry`` is now a wrapper around ``single_image_photometry`` instead of a completely separate function.

Other Changes and Additions
^^^^^^^^^^^^^^^^^^^^^^^^^^^
+ Major reorganization of code including moving functions to new modules. [#130, #133]

+ Now requires python 3.10 or later. [#147]

Bug Fixes
^^^^^^^^^
4 changes: 3 additions & 1 deletion stellarphot/core.py
@@ -1,9 +1,10 @@
import numpy as np
from astropy import units as u
from astropy.coordinates import EarthLocation, SkyCoord
from astropy.table import Column, QTable, Table
from astropy.time import Time

import numpy as np

__all__ = ['Camera', 'BaseEnhancedTable', 'PhotometryData', 'CatalogData',
'SourceListData']

@@ -537,6 +538,7 @@ def observatory(self):
return EarthLocation(lat=self.meta['lat'], lon=self.meta['lon'],
height=self.meta['height'])


class CatalogData(BaseEnhancedTable):
"""
A class to hold astronomical catalog data while performing validation
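For context on the ``observatory`` property shown in the hunk above: it builds an astropy ``EarthLocation`` from the table's metadata. The snippet below is only a standalone illustration of that call; the latitude, longitude, and height values are made up for the example, not taken from stellarphot.

```python
# Illustration of building an EarthLocation from metadata, as the
# observatory property above does. The lat/lon/height values are
# arbitrary example numbers, not stellarphot defaults.
from astropy import units as u
from astropy.coordinates import EarthLocation

meta = {'lat': 46.87 * u.deg, 'lon': -96.45 * u.deg, 'height': 311 * u.m}
location = EarthLocation(lat=meta['lat'], lon=meta['lon'], height=meta['height'])
print(location.geodetic)  # longitude, latitude, and height of the observatory
```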
703 changes: 472 additions & 231 deletions stellarphot/photometry/photometry.py

Large diffs are not rendered by default.
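Since the photometry.py diff is not rendered here, the snippet below is only a minimal, generic sketch of the wrapper pattern described in the CHANGES entry above (``multi_image_photometry`` wrapping ``single_image_photometry``). The ``_sketch`` name, the ``phot_func`` stand-in parameter, and the error handling are illustrative assumptions, not the actual stellarphot implementation.

```python
# Generic sketch of a "wrapper" multi-image photometry function.
# `phot_func` stands in for single_image_photometry, whose real signature
# lives in stellarphot/photometry/photometry.py (diff not rendered above).
from astropy.table import vstack

def multi_image_photometry_sketch(images, source_list, phot_func, **options):
    """Run per-image photometry on each image and stack the results."""
    per_image_results = []
    for image in images:
        phot = phot_func(image, source_list, **options)  # one image at a time
        if phot is not None and len(phot) > 0:
            per_image_results.append(phot)
    if not per_image_results:
        raise RuntimeError("Photometry failed for every input image.")
    # Combine the per-image tables into a single multi-image table.
    return vstack(per_image_results)
```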

12 changes: 5 additions & 7 deletions stellarphot/photometry/source_detection.py
@@ -1,16 +1,13 @@
import numpy as np

from astropy import units as u
from astropy.modeling.models import Const2D, Gaussian2D
from astropy.modeling.fitting import LevMarLSQFitter
from astropy.modeling.models import Const2D, Gaussian2D
from astropy.nddata import CCDData
from astropy.stats import sigma_clipped_stats
from astropy.nddata.utils import Cutout2D
from astropy.stats import gaussian_sigma_to_fwhm, sigma_clipped_stats
from astropy.table import Table

from photutils.detection import DAOStarFinder
from photutils.morphology import data_properties
from astropy.nddata.utils import Cutout2D
from astropy.stats import gaussian_sigma_to_fwhm

from stellarphot.core import SourceListData

@@ -54,7 +51,8 @@ def _fit_2dgaussian(data):

fitter = LevMarLSQFitter()
y, x = np.indices(data.shape)
gfit = fitter(g_init, x, y, data)
# Call fitter, enabling filtering of NaN/inf values
gfit = fitter(g_init, x, y, data, filter_non_finite=True)

return gfit

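The functional change in this file is the ``filter_non_finite=True`` argument, which tells astropy's fitter to drop NaN/inf data points before fitting instead of failing on them. A small self-contained illustration of that keyword, assuming an astropy version recent enough to support it (the array sizes and Gaussian parameters are arbitrary example values):

```python
# Fit a 2D Gaussian to data containing a NaN pixel; filter_non_finite=True
# removes the non-finite points before the least-squares fit.
import numpy as np
from astropy.modeling.fitting import LevMarLSQFitter
from astropy.modeling.models import Gaussian2D

y, x = np.indices((25, 25))
data = Gaussian2D(amplitude=100, x_mean=12, y_mean=12,
                  x_stddev=3, y_stddev=3)(x, y)
data[5, 5] = np.nan  # simulate a bad pixel

g_init = Gaussian2D(amplitude=np.nanmax(data), x_mean=12, y_mean=12,
                    x_stddev=2, y_stddev=2)
fitter = LevMarLSQFitter()
gfit = fitter(g_init, x, y, data, filter_non_finite=True)
print(gfit.x_stddev.value, gfit.y_stddev.value)  # both should recover ~3
```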
2 changes: 1 addition & 1 deletion stellarphot/photometry/tests/data/test_sources.csv
@@ -1,5 +1,5 @@
amplitude,x_mean,y_mean,x_stddev,y_stddev,theta,aperture
693,372.5,286.5,3,3,0,12
476,224.5,94.5,3,3,0,12
827,43.5,152.5,3,3,0,12
827,143.5,152.5,3,3,0,12
415,120.5,112.5,3,3,0,12
130 changes: 115 additions & 15 deletions stellarphot/photometry/tests/fake_image.py
@@ -1,9 +1,10 @@

import astropy.io.fits as fits
import numpy as np
from astropy.nddata import CCDData
from astropy.table import Table
from astropy.utils.data import get_pkg_data_filename

from astropy.wcs import WCS
from photutils.datasets import make_gaussian_sources_image, make_noise_image


@@ -49,17 +50,116 @@ def image(self):

class FakeCCDImage(CCDData):
# Generates a fake CCDData object for testing purposes.
def __init__(self):
base_data = FakeImage()
super().__init__(base_data.image.copy(), unit='adu')
# Add some additional features to the CCDData object, like
# a header and the sources used to create the image.abs
self.header = fits.Header()
self.header['EXPOSURE'] = 1.0
self.header['DATE-OBS'] = '2018-01-01T00:00:00.0'
self.header['AIRMASS'] = 1.2
self.header['FILTER'] = 'V'
self.sources = base_data.sources.copy()
self.noise_dev = base_data.noise_dev
self.mean_noise = base_data.mean_noise
self.image_shape = base_data.image_shape.copy()
def __init__(self, *args, **kwargs):
# If no arguments are passed, use the default FakeImage.
if (len(args) == 0) and (len(kwargs) == 0):
base_data = FakeImage()
super().__init__(base_data.image.copy(), unit='adu')

# Append attributes from the base data object.
self.sources = base_data.sources.copy()
self.noise_dev = base_data.noise_dev
self.mean_noise = base_data.mean_noise
self.image_shape = base_data.image_shape.copy()

# Add some additional features to the CCDData object, like
# a header and the sources used to create the image.
self.header = fits.Header()
self.header['OBJECT'] = 'Test Object'
self.header['EXPOSURE'] = 1.0
self.header['DATE-OBS'] = '2018-01-01T00:00:00.0'
self.header['AIRMASS'] = 1.2
self.header['FILTER'] = 'V'

# Set up a WCS header for the CCDData object.
(size_y, size_x) = base_data.image_shape
pixel_scale = 0.75 # arcseconds per pixel
ra_center = 283.6165
dec_center = 33.05857
w = WCS(naxis=2)
w.wcs.crpix = [size_x / 2, size_y / 2] # Reference pixel (center of the image)
w.wcs.cdelt = [-pixel_scale / 3600, pixel_scale / 3600] # Pixel scale in degrees per pixel
w.wcs.crval = [ra_center, dec_center] # RA and Dec of the reference pixel in degrees
w.wcs.ctype = ['RA---TAN', 'DEC--TAN'] # Coordinate type (TAN projection)
# Rotate image to be slightly off of horizontal
north_angle_deg = 8.4
w.wcs.pc = [[np.cos(np.radians(north_angle_deg)), -np.sin(np.radians(north_angle_deg))],
[np.sin(np.radians(north_angle_deg)), np.cos(np.radians(north_angle_deg))]]

self.wcs = w
self.header.update(w.to_header())

def drop_wcs(self):
# Convenience function to remove WCS information from the CCDData object
# for testing purposes.
self.wcs = None
wcs_keywords = ['CTYPE', 'CRPIX', 'CRVAL', 'CDELT','CUNIT',
'CD1_', 'CD2_', 'PC1_', 'PC2_']
for keyword in wcs_keywords:
matching_keys = [key for key in self.header.keys() if keyword in key]
for key in matching_keys:
del self.header[key]


def shift_FakeCCDImage(ccd_data, x_shift, y_shift):
"""
Create a test CCD image based on the input CCD image, with the source
positions shifted by the given amount. An error is raised if any source
would be shifted off the edge of the image.

WCS information in the CCDData is also updated to reflect the shift.

Parameters:
ccd_data : `FakeCCDImage`
The original CCDData object.

x_shift : int
The amount of shift in x direction (in pixels)

y_shift : int
The amount of shift in y direction (in pixels)

Returns:
`FakeCCDImage`: A new CCDData object.
"""
# Copy WCS from original CCDData object
shifted_ccd_data = ccd_data.copy()
for key, val in ccd_data.__dict__.items():
try:
shifted_ccd_data.__dict__[key] = val.copy()
except AttributeError:
shifted_ccd_data.__dict__[key] = val
shifted_wcs = ccd_data.wcs.deepcopy()

# Calculate the new RA and Dec center after shifting
x_shift = int(x_shift)
y_shift = int(y_shift)
ra_center_shifted, dec_center_shifted = \
shifted_wcs.all_pix2world((shifted_ccd_data.data.shape[1]) / 2 + x_shift,
(shifted_ccd_data.data.shape[0]) / 2 + y_shift, 0)

# Shift source positions
shifted_ccd_data.sources['x_mean'] -= x_shift
shifted_ccd_data.sources['y_mean'] -= y_shift

# Check shifted sources are still on the image
if ( (np.any(shifted_ccd_data.sources['x_mean'] < 0)) |
(np.any(shifted_ccd_data.sources['x_mean'] > shifted_ccd_data.data.shape[1])) |
(np.any(shifted_ccd_data.sources['y_mean'] < 0)) |
(np.any(shifted_ccd_data.sources['y_mean'] > shifted_ccd_data.data.shape[0])) ):
raise ValueError('Sources shifted off the edge of the image.')

# Update the new RA and Dec center in the shifted WCS
shifted_wcs.wcs.crval = [ra_center_shifted, dec_center_shifted]
shifted_ccd_data.wcs = shifted_wcs

# Make image
srcs = make_gaussian_sources_image(shifted_ccd_data.image_shape,
shifted_ccd_data.sources)
background = make_noise_image(srcs.shape,
mean=shifted_ccd_data.mean_noise,
stddev=shifted_ccd_data.noise_dev)
shifted_ccd_data.data = srcs + background

return shifted_ccd_data
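A short usage sketch of the test helpers defined above, assuming the tests package is importable in the development environment; the shift amounts are arbitrary example values.

```python
# Usage sketch for the fake-image helpers above; assumes the tests package
# is importable (stellarphot/photometry/tests/fake_image.py).
from stellarphot.photometry.tests.fake_image import (FakeCCDImage,
                                                     shift_FakeCCDImage)

ccd = FakeCCDImage()                       # synthetic image with header + WCS
shifted = shift_FakeCCDImage(ccd, 10, 5)   # shift sources by 10 px in x, 5 px in y

# The WCS reference value is recomputed for the shift, so the sources'
# sky coordinates should stay roughly the same in the shifted image.
print(ccd.wcs.wcs.crval, shifted.wcs.wcs.crval)

no_wcs = FakeCCDImage()
no_wcs.drop_wcs()                          # strip WCS for the "no WCS" test cases
assert no_wcs.wcs is None
```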
109 changes: 4 additions & 105 deletions stellarphot/photometry/tests/test_detection.py
@@ -66,115 +66,14 @@ def test_detect_source_with_padding():
fake_image = FakeImage()
sources = QTable(fake_image.sources, units={'x_mean':u.pixel, 'y_mean':u.pixel,
'x_stddev':u.pixel, 'y_stddev':u.pixel})
# print(sources)
# Pass only one value for the sky background for source detection
sky_per_pix = sources['sky_per_pix_avg'].mean()
# Padding was chosen to be large enough to ensure that one of the sources in
# test_sources.csv would land too close to the edge of the image.
found_sources = source_detection(fake_image.image,
fwhm=2 * sources['x_stddev'].mean(),
threshold=10,
sky_per_pix_avg=sky_per_pix, padding=50)
sky_per_pix_avg=sky_per_pix, padding=95)

# Did we drop one source because it was too close to the edge?
assert len(sources) - 1 == len(found_sources)


##
## Disabled BOTH FINAL TESTS until new single_image_photometry
## is implemented. Also moved tests to test_photometry.py
##
# def test_aperture_photometry_no_outlier_rejection():
# fake_image = FakeImage()
# sources = fake_image.sources
# aperture = sources['aperture'][0]
# found_sources = source_detection(fake_image.image,
# fwhm=sources['x_stddev'].mean(),
# threshold=10)
# phot = photutils_stellar_photometry(fake_image.image,
# found_sources, aperture,
# 2 * aperture, 3 * aperture,
# reject_background_outliers=False)
# phot.sort('aperture_sum')
# sources.sort('amplitude')
# found_sources.sort('flux')

# for inp, out in zip(sources, phot):
# stdev = inp['x_stddev']
# expected_flux = (inp['amplitude'] * 2 * np.pi *
# stdev**2 *
# (1 - np.exp(-aperture**2 / (2 * stdev**2))))
# This expected flux is correct IF there were no noise. With noise, the
# standard deviation in the sum of the noise within in the aperture is
# n_pix_in_aperture times the single-pixel standard deviation.

# We could require that the result be within some reasonable
# number of those expected variations or we could count up the
# actual number of background counts at each of the source
# positions.

# Here we just check whether any difference is consistent with
# less than the expected one sigma deviation.
# assert (np.abs(expected_flux - out['net_flux']) <
# np.pi * aperture**2 * fake_image.noise_dev)


# @pytest.mark.parametrize('reject', [True, False])
# def test_aperture_photometry_with_outlier_rejection(reject):
# """
# Insert some really large pixel values in the annulus and check that
# the photometry is correct when outliers are rejected and is
# incorrect when outliers are not rejected.
# """
# fake_image = FakeImage()
# sources = fake_image.sources
# aperture = sources['aperture'][0]
# image = fake_image.image

# found_sources = source_detection(image,
# fwhm=sources['x_stddev'].mean(),
# threshold=10)

# inner_annulus = 2 * aperture
# outer_annulus = 3 * aperture
# # Add some large pixel values to the annulus for each source.
# # adding these moves the average pixel value by quite a bit,
# # so we'll only get the correct net flux if these are removed.
# for source in fake_image.sources:
# center_px = (int(source['x_mean']), int(source['y_mean']))
# begin = center_px[0] + inner_annulus + 1
# end = begin + (outer_annulus - inner_annulus - 1)
# # Yes, x and y are deliberately reversed below.
# image[center_px[1], begin:end] = 100 * fake_image.mean_noise

# phot = photutils_stellar_photometry(image,
# found_sources, aperture,
# 2 * aperture, 3 * aperture,
# reject_background_outliers=reject)
# phot.sort('aperture_sum')
# sources.sort('amplitude')
# found_sources.sort('flux')

# for inp, out in zip(sources, phot):
# stdev = inp['x_stddev']
# expected_flux = (inp['amplitude'] * 2 * np.pi *
# stdev**2 *
# (1 - np.exp(-aperture**2 / (2 * stdev**2))))
# # This expected flux is correct IF there were no noise. With noise, the
# # standard deviation in the sum of the noise within in the aperture is
# # n_pix_in_aperture times the single-pixel standard deviation.
# #

# expected_deviation = np.pi * aperture**2 * fake_image.noise_dev
# # We could require that the result be within some reasonable
# # number of those expected variations or we could count up the
# # actual number of background counts at each of the source
# # positions.

# # Here we just check whether any difference is consistent with
# # less than the expected one sigma deviation.
# if reject:
# assert (np.abs(expected_flux - out['net_flux']) <
# expected_deviation)
# else:
# with pytest.raises(AssertionError):
# assert (np.abs(expected_flux - out['net_flux']) <
# expected_deviation)
assert len(sources) - 1 == len(found_sources)