Validation of Image simulations #34

Closed
katrinheitmann opened this issue Nov 7, 2017 · 13 comments

@katrinheitmann
Contributor

In this epic we collect all the tasks that are connected to the validation of the image generation tools.

@yymao
Member

yymao commented Dec 15, 2017

@katrinheitmann @cwwalter (hopefully this is the right place to ask this)

What's the status of the validation tools for the image simulations and the DM catalogs? Is there something similar to DESCQA that people can contribute their validation tests to? If not, would some of the components of DESCQA be helpful for preparing the validation tools for the images and DM catalogs?

@salmanhabib

salmanhabib commented Dec 16, 2017

@yymao We are starting to work on setting up a formal testing framework for PhoSim. Some of it will be automated, along the lines of nightly (and cross-platform) regression testing, and some will be along the lines of the validation you are thinking about (right now against CCD test data, and eventually camera data). It's too early to say whether something along the lines of DESCQA will be needed, but if so we'll be in touch (it might be).

Also, PhoSim has been extensively validated, including a recent suite of tests for DECam. The validation tests are detailed in a number of reports.

@yymao
Member

yymao commented Dec 16, 2017

@salmanhabib thanks! Good to know someone is taking care of testing the image generation.

As you pointed out, my use of the term "validation" was probably too general. Part of what I was wondering is how users (e.g., the Analysis WGs) can make sure that the image output or the DM catalogs fit their science needs. I guess the answer can range from "they don't need to worry about the images at all" to "we need validation tests (on images/DM catalogs) from the Analysis WGs", so I was curious where we stand on this front.

@salmanhabib

@yymao You are right that we need to be more formal about this process with the SWGs as time goes on. Currently we are very resource-limited, but hopefully we will get there (BTW, the answer is certainly not that "they don't have to worry"!). We should worry about everything there is to worry about ;-).

@fjaviersanchez

I'll mention #69 here to keep track of its existence, since it's related to this epic. @sethdigel @TomGlanzman should we start a repository with software to produce image QA plots like the ones in #69 (background levels, single-visit depth, Moon altitude, etc.), or is there something already set up?
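
Not an existing tool, but for concreteness here is a minimal sketch of the kind of QA plot meant above, assuming the per-visit metadata comes from an OpSim sqlite output; the database file name and the `Summary`-table column names are typical of OpSim v3 runs and would need to be checked against the actual DC2 run:

```python
import sqlite3

import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical OpSim database; substitute the run actually used for DC2.
OPSIM_DB = "minion_1016_sqlite.db"

# Pull one filter's visits with the Moon altitude and 5-sigma depth.
with sqlite3.connect(OPSIM_DB) as conn:
    visits = pd.read_sql_query(
        "SELECT obsHistID, filter, moonAlt, fiveSigmaDepth "
        "FROM Summary WHERE filter = 'r'",
        conn,
    )

# moonAlt is stored in radians in OpSim v3 Summary tables.
plt.scatter(visits["moonAlt"], visits["fiveSigmaDepth"], s=2, alpha=0.3)
plt.xlabel("Moon altitude [rad]")
plt.ylabel("single-visit 5-sigma depth [mag]")
plt.title("r-band visits")
plt.savefig("moonalt_vs_depth.png")
```

The same query pattern extends to the sky-brightness and seeing columns, so a small set of scripts like this could cover most of the plots in #69.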

@sethdigel

@fjaviersanchez, I'm not aware of anything already set up. I can see a use for something like that, as distinct from the formal testing framework for PhoSim that @salmanhabib described and the (I think) related integration tests that @johnrpeterson describes in Chapter 10 of the PhoSim Reference Document (linked here; the provided URL for the test results is currently unresponsive).

What there's room for is more along the lines of sanity checking our use of PhoSim, and as we have seen, there's reason to redo this when PhoSim versions change. On the other hand, some sanity checks (like 'are there stars and galaxies', 'do the galaxies look like galaxies', or 'are the stars speckly') might be more work to implement as automated tests than would be worthwhile. Put another way, they'd come for free if we had end-to-end tests, including L2 processing, which would also validate that the stars and galaxies have the right positions, spectra, and shapes.

Others, like investigations of what the sky brightness depends on, require a large number of sensor visits and gathering visit metadata from the OpSim database or the PhoSim command files. But keeping track of the scripts we are using, for reference during DC2 generation and eventually DC3, certainly makes sense.
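
For the metadata-gathering part, a minimal sketch of harvesting visit parameters from the PhoSim command files; it assumes the instance-catalog convention of two-token `<keyword> <value>` header lines (e.g. `obshistid`, `mjd`, `moonalt`), and the file name is hypothetical:

```python
def read_phosim_header(path):
    """Collect the keyword/value header entries of a PhoSim instance catalog.

    Header lines are assumed to have exactly two tokens, e.g.
    "obshistid 219976" or "moonalt -11.1"; "object" lines have more
    tokens and are skipped.
    """
    header = {}
    with open(path) as catalog:
        for line in catalog:
            tokens = line.split()
            if len(tokens) == 2:
                keyword, value = tokens
                try:
                    header[keyword] = float(value)
                except ValueError:
                    header[keyword] = value
    return header

# Hypothetical file name; loop over a run's catalogs to build a visit table.
meta = read_phosim_header("instcat_219976.txt")
print(meta.get("mjd"), meta.get("moonalt"))
```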

@cwwalter
Member

cwwalter commented Jan 3, 2018

I think what we are trying to build here are image validation tools for all of DC2, including for PhoSim and imSim, so we can check that things work as we expect when we use them in our environment with our particular inputs. This should also extend to the L2 outputs like the coadds. As Seth says, this also means comparing the metadata with the OpSim runs, etc.

So, @fjaviersanchez, I think for right now we can build tools in a directory in this repository. That will keep all of the DC2 items together and we can use the same issues, etc. In the future, if it expands into something more formal (or if we leverage something like DESCQA), we could move the pieces to another repo. Why don't you start with just a "Validation" directory here? Thanks!

@fjaviersanchez

fjaviersanchez commented Feb 20, 2018

Copying @yymao's style, I am going to create a table to track progress on the image validation tasks:

| Test | Validation criteria | Status | Implemented |
| --- | --- | --- | --- |
| Visual inspection | --- | #desc-dc2-eyeballs | @kadrlica |
| Implementation of image reader for DESCQA | --- | Working e-image reader here | --- |
| Background levels | Compatibility with OpSim | Created tool for DESCQA here | @fjaviersanchez |
| Speckling | No speckles | #105 | --- |
| Noise correlations | ? | Tested in #142 #140 | @adrianpope @bleeml |
| Rough size/shape measurements | Compatibility between input and output | Will use DM stack | --- |
| PSF FWHM | Compatibility between input and output | Created interface with GIR; @yymao working on porting to DESCQA | @fjaviersanchez |
| Approximate BKG vs full BKG | Compatible | #113 #142 #140 | Do we need to add tests? |
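
As a rough illustration of two of the quantitative rows above (background levels and noise correlations), a minimal sketch operating on a single e-image; the file name is hypothetical, and the cutout region and clipping threshold are arbitrary choices:

```python
import numpy as np
from astropy.io import fits
from astropy.stats import sigma_clipped_stats

# Hypothetical e-image file (PhoSim writes gzipped single-HDU FITS).
with fits.open("lsst_e_219976_R22_S11.fits.gz") as hdus:
    image = hdus[0].data.astype(float)

# Background level and noise rms: sigma clipping makes the estimate
# robust against the stars and galaxies in the frame.
mean, median, std = sigma_clipped_stats(image, sigma=3.0)
print(f"background = {median:.2f} counts, rms = {std:.2f} counts")

# Nearest-neighbor pixel correlation in a background-subtracted cutout;
# uncorrelated noise should give a coefficient near zero. Bright sources
# in the cutout would bias this, so pick a blank region in practice.
cutout = image[:500, :500] - median
rho = np.corrcoef(cutout[:, :-1].ravel(), cutout[:, 1:].ravel())[0, 1]
print(f"nearest-neighbor pixel correlation = {rho:.3f}")
```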

@katrinheitmann
Contributor Author

@fjaviersanchez Hi Javier, is this issue still up to date, or has it been superseded by another one? It would be good if we could collect/rationalize the image validation work that is going on. What do you think? Thanks!

@fjaviersanchez

@katrinheitmann sorry for the delay, I just updated this. I am also preparing a document with extended information and will post a link to it here as soon as I finish it. Thanks for the reminder!

@fjaviersanchez

@katrinheitmann, @rmandelb, @cwwalter, @jchiang87 I created this document to try to organize and follow the image validation process.

I know that several more tests have been performed on imSim to validate various features, but I couldn't find the relevant information. Could you please fill it in? Also, I know that the PhoSim repository has several validation tests as well; should we point to them in the document, or ping John?

@jchiang87
Contributor

@fjaviersanchez I updated the imSim table in that document with the relevant entries. I'm not sure that's entirely what you had in mind in terms of validation, but the tracking info on the development is there for all of the items now, and some of the testing of the various features is discussed in the PR and issue threads that are linked in.

@katrinheitmann
Contributor Author

This issue is summarized in a Google doc (https://docs.google.com/document/d/1dP0OugTMgNAuKCdjXHldqqtX2_b4kEkmomjbx97exu0/edit) and at some point will become part of a DC2 validation paper. I am closing this now since I don't think the issue itself will be updated anymore. Feel free to reopen it if needed.
