Debug measurements test #307

Closed · wants to merge 34 commits

Commits (34):
a71660f  merge pipeline and fixtures (guynir42, May 28, 2024)
016a8a8  working on datastore, fixed get_image (guynir42, May 29, 2024)
8263607  datastore getters are updated (guynir42, May 29, 2024)
18b7adb  fix way provenances are fetched in datastore (guynir42, May 30, 2024)
f4d5e03  merge main (guynir42, May 30, 2024)
3d998ff  add siblings to get_downstreams, refactor fixtures (guynir42, Jun 3, 2024)
6eb982a  fixing tests (guynir42, Jun 4, 2024)
2c11fae  fix tests (guynir42, Jun 4, 2024)
4d2b6d0  fix more tests (guynir42, Jun 5, 2024)
aaf68d6  merge main (guynir42, Jun 5, 2024)
2dfac95  fix merger issue (guynir42, Jun 5, 2024)
34cf9fe  fix more tests, split up pipeline tests (guynir42, Jun 5, 2024)
2c89c15  fix all nan slice warning (guynir42, Jun 5, 2024)
f734dfa  chase down some warnings (guynir42, Jun 5, 2024)
2da30fe  few more warnings (guynir42, Jun 5, 2024)
8a1ad8d  try this (guynir42, Jun 5, 2024)
1ae8b26  fix workflow (guynir42, Jun 5, 2024)
af6fabc  fix more (guynir42, Jun 6, 2024)
0ee0c71  fix typo in workflow (guynir42, Jun 6, 2024)
90757dc  fix deletion of PTF exposure and image (guynir42, Jun 6, 2024)
4150ec4  turn off warnings as errors (guynir42, Jun 6, 2024)
70e32dd  try this (guynir42, Jun 7, 2024)
e842a8c  remove error (guynir42, Jun 7, 2024)
e5e3e79  comment out first test (guynir42, Jun 7, 2024)
162a220  trying to debug (guynir42, Jun 7, 2024)
bee23f1  trying to debug (guynir42, Jun 7, 2024)
8a173fa  more debugging (guynir42, Jun 7, 2024)
f7042a1  address reviewer comments (guynir42, Jun 9, 2024)
4d5651d  add more debug outputs (guynir42, Jun 9, 2024)
c873a3f  add more debug outputs (guynir42, Jun 9, 2024)
3e2ae69  reinstate tests (guynir42, Jun 10, 2024)
5229d96  add temporary test file (guynir42, Jun 10, 2024)
bb4ffa3  remove triggering on negative zogy scores (guynir42, Jun 12, 2024)
32ba499  fix test (guynir42, Jun 13, 2024)
@@ -1,4 +1,4 @@
name: Run Pipeline Tests
name: Run Model Tests X

on:
push:
@@ -59,4 +59,5 @@ jobs:

- name: run test
run: |
TEST_SUBFOLDER=tests/pipeline docker compose run runtests
shopt -s nullglob
TEST_SUBFOLDER=$(ls tests/models/test_{x..z}*.py) docker compose run runtests
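The edited step above switches this workflow from running everything under tests/pipeline to running only the model tests whose file names start with x, y, or z, selected with bash brace expansion (`shopt -s nullglob` makes letters with no matching files expand to nothing). A rough Python equivalent of that selection logic, for illustration only (the helper below is hypothetical, not part of the repository):

```python
# Illustration of what `ls tests/models/test_{x..z}*.py` selects; the real
# workflow does this in bash with brace expansion plus nullglob.
import glob
import string


def select_tests(folder: str, first: str, last: str) -> list[str]:
    """Collect test files named test_<letter>*.py for letters in [first, last]."""
    letters = string.ascii_lowercase
    selected = []
    for letter in letters[letters.index(first): letters.index(last) + 1]:
        # nullglob behaviour: a letter with no matches contributes nothing
        selected.extend(sorted(glob.glob(f"{folder}/test_{letter}*.py")))
    return selected


# The workflow step above roughly amounts to: select_tests("tests/models", "x", "z")
```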
68 changes: 35 additions & 33 deletions default_config.yaml
@@ -82,23 +82,24 @@ preprocessing:
use_sky_subtraction: False

extraction:
measure_psf: true
threshold: 3.0
method: sextractor

astro_cal:
cross_match_catalog: gaia_dr3
solution_method: scamp
max_catalog_mag: [20.0]
mag_range_catalog: 4.0
min_catalog_stars: 50
max_sources_to_use: [2000, 1000, 500, 200]

photo_cal:
cross_match_catalog: gaia_dr3
max_catalog_mag: [20.0]
mag_range_catalog: 4.0
min_catalog_stars: 50
sources:
measure_psf: true
threshold: 3.0
method: sextractor

wcs:
cross_match_catalog: gaia_dr3
solution_method: scamp
max_catalog_mag: [20.0]
mag_range_catalog: 4.0
min_catalog_stars: 50
max_sources_to_use: [2000, 1000, 500, 200]

zp:
cross_match_catalog: gaia_dr3
max_catalog_mag: [20.0]
mag_range_catalog: 4.0
min_catalog_stars: 50

subtraction:
method: zogy
@@ -170,22 +171,23 @@ coaddition:
ignore_flags: 0
# The following are used to override the regular "extraction" parameters
extraction:
measure_psf: true
threshold: 3.0
method: sextractor
# The following are used to override the regular "astro_cal" parameters
astro_cal:
cross_match_catalog: gaia_dr3
solution_method: scamp
max_catalog_mag: [22.0]
mag_range_catalog: 6.0
min_catalog_stars: 50
# The following are used to override the regular "photo_cal" parameters
photo_cal:
cross_match_catalog: gaia_dr3
max_catalog_mag: [22.0]
mag_range_catalog: 6.0
min_catalog_stars: 50
sources:
measure_psf: true
threshold: 3.0
method: sextractor
# The following are used to override the regular astrometric calibration parameters
wcs:
cross_match_catalog: gaia_dr3
solution_method: scamp
max_catalog_mag: [22.0]
mag_range_catalog: 6.0
min_catalog_stars: 50
# The following are used to override the regular photometric calibration parameters
zp:
cross_match_catalog: gaia_dr3
max_catalog_mag: [22.0]
mag_range_catalog: 6.0
min_catalog_stars: 50


# DECam
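The default_config.yaml change folds the old top-level astro_cal and photo_cal sections into the extraction block, as sub-sections named sources, wcs, and zp (the coaddition overrides get the same treatment). A minimal sketch of reading the new layout, assuming the file is plain YAML loaded into nested dicts (load_config is hypothetical; the key names come from the diff above):

```python
# Minimal sketch of the new nested layout; `load_config` is hypothetical.
import yaml


def load_config(path: str) -> dict:
    with open(path) as f:
        return yaml.safe_load(f)


config = load_config("default_config.yaml")
extraction = config["extraction"]

threshold = extraction["sources"]["threshold"]            # was extraction.threshold
wcs_catalog = extraction["wcs"]["cross_match_catalog"]    # was astro_cal.cross_match_catalog
zp_max_mag = extraction["zp"]["max_catalog_mag"]          # was photo_cal.max_catalog_mag
```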
99 changes: 55 additions & 44 deletions docs/overview.md

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion docs/setup.md
@@ -49,7 +49,7 @@ By default, the volumes with archived files and the database files will still be
docker compose down -v
```

If all is well, the `-v` will delete the volumnes that stored the database and archive files.
If all is well, the `-v` will delete the volumes that stored the database and archive files.

You can see what volumes docker knows about with
```
64 changes: 64 additions & 0 deletions github_temp/run-pipeline-tests-1.yml
@@ -0,0 +1,64 @@
name: Run Pipeline Tests 1

on:
push:
branches:
- main
pull_request:
workflow_dispatch:

jobs:
tests:
name: run tests in docker image
runs-on: ubuntu-latest
env:
REGISTRY: ghcr.io
COMPOSE_FILE: tests/docker-compose.yaml

steps:
- name: Dump docker logs on failure
if: failure()
uses: jwalton/gh-docker-logs@v2

- name: checkout code
uses: actions/checkout@v3
with:
submodules: recursive

- name: log into github container registry
uses: docker/login-action@v2
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}

- name: setup docker buildx
uses: docker/setup-buildx-action@v2
with:
driver: docker-container

- name: bake
uses: docker/[email protected]
with:
workdir: tests
load: true
files: docker-compose.yaml
set: |
seechange_postgres.tags=ghcr.io/${{ github.repository_owner }}/seechange-postgres
seechange_postgres.cache-from=type=gha,scope=cached-seechange-postgres
seechange_postgres.cache-to=type=gha,scope=cached-seechange-postgres,mode=max
setuptables.tags=ghcr.io/${{ github.repository_owner }}/runtests
setuptables.cache-from=type=gha,scope=cached-seechange
setuptables.cache-to=type=gha,scope=cached-seechange,mode=max
runtests.tags=ghcr.io/${{ github.repository_owner }}/runtests
runtests.cache-from=type=gha,scope=cached-seechange
runtests.cache-to=type=gha,scope=cached-seechange,mode=max
shell.tags=ghcr.io/${{ github.repository_owner }}/runtests
shell.cache-from=type=gha,scope=cached-seechange
shell.cache-to=type=gha,scope=cached-seechange,mode=max

- name: run test
run: |
df -h
shopt -s nullglob
TEST_SUBFOLDER=$(ls tests/pipeline/test_{a..o}*.py) docker compose run runtests
63 changes: 63 additions & 0 deletions github_temp/run-pipeline-tests-2.yml
@@ -0,0 +1,63 @@
name: Run Pipeline Tests 2

on:
push:
branches:
- main
pull_request:
workflow_dispatch:

jobs:
tests:
name: run tests in docker image
runs-on: ubuntu-latest
env:
REGISTRY: ghcr.io
COMPOSE_FILE: tests/docker-compose.yaml

steps:
- name: Dump docker logs on failure
if: failure()
uses: jwalton/gh-docker-logs@v2

- name: checkout code
uses: actions/checkout@v3
with:
submodules: recursive

- name: log into github container registry
uses: docker/login-action@v2
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}

- name: setup docker buildx
uses: docker/setup-buildx-action@v2
with:
driver: docker-container

- name: bake
uses: docker/[email protected]
with:
workdir: tests
load: true
files: docker-compose.yaml
set: |
seechange_postgres.tags=ghcr.io/${{ github.repository_owner }}/seechange-postgres
seechange_postgres.cache-from=type=gha,scope=cached-seechange-postgres
seechange_postgres.cache-to=type=gha,scope=cached-seechange-postgres,mode=max
setuptables.tags=ghcr.io/${{ github.repository_owner }}/runtests
setuptables.cache-from=type=gha,scope=cached-seechange
setuptables.cache-to=type=gha,scope=cached-seechange,mode=max
runtests.tags=ghcr.io/${{ github.repository_owner }}/runtests
runtests.cache-from=type=gha,scope=cached-seechange
runtests.cache-to=type=gha,scope=cached-seechange,mode=max
shell.tags=ghcr.io/${{ github.repository_owner }}/runtests
shell.cache-from=type=gha,scope=cached-seechange
shell.cache-to=type=gha,scope=cached-seechange,mode=max

- name: run test
run: |
shopt -s nullglob
TEST_SUBFOLDER=$(ls tests/pipeline/test_{p..z}*.py) docker compose run runtests
File renamed without changes.
2 changes: 1 addition & 1 deletion improc/alignment.py
@@ -414,7 +414,7 @@ def _align_swarp( self, image, target, sources, target_sources ):

# re-calculate the source list and PSF for the warped image
extractor = Detector()
extractor.pars.override(sources.provenance.parameters, ignore_addons=True)
extractor.pars.override(sources.provenance.parameters['sources'], ignore_addons=True)
warpedsrc, warpedpsf, _, _ = extractor.extract_sources(warpedim)
warpedim.sources = warpedsrc
warpedim.psf = warpedpsf
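This one-line change follows the config restructuring: the extraction provenance now stores its parameters in per-step sub-dictionaries, so the Detector override should only see the 'sources' block rather than the whole dict. A stand-in sketch of the idea (the dict below mirrors the YAML layout above; it is not the real Provenance object):

```python
# Stand-in sketch; mirrors the nested YAML layout, not the real Provenance class.
provenance_parameters = {
    "sources": {"measure_psf": True, "threshold": 3.0, "method": "sextractor"},
    "wcs": {"solution_method": "scamp", "cross_match_catalog": "gaia_dr3"},
    "zp": {"cross_match_catalog": "gaia_dr3"},
}

# Before: the whole dict was passed, so unrelated "wcs"/"zp" keys leaked into the override.
# After: only the source-extraction parameters are handed to the extractor.
extractor_overrides = provenance_parameters["sources"]
```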
5 changes: 4 additions & 1 deletion improc/photometry.py
@@ -419,6 +419,8 @@ def calc_at_position(data, radius, annulus, xgrid, ygrid, cx, cy, local_bg=True,
the iterative process.
"""
flux = area = background = variance = norm = cxx = cyy = cxy = 0
if np.all(np.isnan(data)):
return flux, area, background, variance, norm, cx, cy, cxx, cyy, cxy, True

# make a circle-mask based on the centroid position
if not np.isfinite(cx) or not np.isfinite(cy):
@@ -447,7 +449,8 @@ def calc_at_position(data, radius, annulus, xgrid, ygrid, cx, cy, local_bg=True,
return flux, area, background, variance, norm, cx, cy, cxx, cyy, cxy, True

annulus_map_sum = np.nansum(annulus_map)
if annulus_map_sum == 0: # this should only happen in tests or if the annulus is way too large
if annulus_map_sum == 0 or np.all(np.isnan(annulus_map)):
# this should only happen in tests or if the annulus is way too large or if all pixels are NaN
background = 0
variance = 0
norm = 0
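The two guards added to calc_at_position relate to the all-NaN-slice warnings chased in the commit history: NaN-aware reducers such as np.nanmedian and np.nanstd emit an "All-NaN slice encountered" RuntimeWarning when every pixel is NaN, so the function now returns early (or zeroes the annulus terms) before reaching them. A minimal, self-contained illustration of the pattern; robust_background is hypothetical, not the real calc_at_position:

```python
# Minimal illustration of the all-NaN guard; not the real calc_at_position.
import numpy as np


def robust_background(data: np.ndarray) -> tuple[float, bool]:
    """Return (background, failure_flag), skipping NaN reductions when everything is NaN."""
    if np.all(np.isnan(data)):
        return 0.0, True  # bail out instead of triggering "All-NaN slice encountered"
    return float(np.nanmedian(data)), False


print(robust_background(np.full((5, 5), np.nan)))       # (0.0, True), no RuntimeWarning
print(robust_background(np.arange(4.0).reshape(2, 2)))  # (1.5, False)
```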
3 changes: 3 additions & 0 deletions improc/sextrsky.py
@@ -9,6 +9,7 @@

from util.logger import SCLogger


def single_sextrsky( imagedata, maskdata=None, sigcut=3 ):
"""Estimate sky and sky sigma of imagedata (ignoreing nonzero maskdata pixels)

@@ -66,6 +67,7 @@ def single_sextrsky( imagedata, maskdata=None, sigcut=3 ):
skysig = 1.4826 * ( np.median( np.abs( imagedata[w] - sky ) ) )
return sky, skysig


def sextrsky( imagedata, maskdata=None, sigcut=3, boxsize=200, filtsize=3 ):
"""Estimate sky using an approximation of the SExtractor algorithm.

@@ -178,6 +180,7 @@ def sextrsky( imagedata, maskdata=None, sigcut=3, boxsize=200, filtsize=3 ):

# ======================================================================


def main():
parser = argparse.ArgumentParser( description="Estimate image sky using sextractor algorithm" )
parser.add_argument( "image", help="Image filename" )
10 changes: 6 additions & 4 deletions improc/tools.py
@@ -52,13 +52,15 @@ def sigma_clipping(values, nsigma=3.0, iterations=5, axis=None, median=False):
raise ValueError("values must be a vector, image, or cube")

values = values.copy()

# first iteration:
mean = np.nanmedian(values, axis=axis)
rms = np.nanstd(values, axis=axis)

# how many nan values?
nans = np.isnan(values).sum()
if nans == values.size:
return np.nan, np.nan

# first iteration:
mean = np.nanmedian(values, axis=axis)
rms = np.nanstd(values, axis=axis)

for i in range(iterations):
# remove pixels that are more than nsigma from the median
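In sigma_clipping the NaN count is now checked before the first nanmedian/nanstd call, so an all-NaN input returns (nan, nan) immediately rather than warning inside the loop. A compact, scalar-only sketch of the same early-return idea (simplified; the real function also supports axis-wise clipping and a mean/median switch):

```python
# Simplified, scalar-only sketch of the early return; not the library's full sigma_clipping.
import numpy as np


def sigma_clipped_stats(values, nsigma: float = 3.0, iterations: int = 5):
    values = np.asarray(values, dtype=float).copy()
    if np.isnan(values).all():
        return np.nan, np.nan  # nothing to clip; avoids nanmedian/nanstd warnings
    mean, rms = np.nanmedian(values), np.nanstd(values)
    for _ in range(iterations):
        values[np.abs(values - mean) > nsigma * rms] = np.nan  # clip outliers
        mean, rms = np.nanmedian(values), np.nanstd(values)
    return mean, rms


print(sigma_clipped_stats(np.r_[np.ones(20), 100.0]))  # mean 1.0, rms 0.0: the 100.0 outlier is clipped
```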