Large Synoptic Survey Telescope (LSST)
Data Management Applications UML Use Case Model
Mario Juric, Robyn Allsman, Jeff Kantor
LDM-134
Latest Revision: October 10, 2013
This LSST document has been approved as a Content-Controlled Document by the LSST DM Technical Control Team. If this document is changed or superseded, the new document will retain the Handle designation shown above. The control is on the most recent digital document with this Handle in the LSST digital archive and not printed versions. Additional information may be found in the LSST DM TCT minutes.
The contents of this document are subject to configuration control by the LSST DM Technical Control Team.
Change Record
Version / Date / Description / Owner name
1 / 1/28/2011 / Update Document to reflect Model based on Data Challenge 3 / J. Kantor
2 / 7/12/2011 / Update Document to reflect Model based on Data Challenge 3B PT1 / R. Allsman
3 / 8/18/2011 / General updates / R. Allsman
4 / 9/11/2013 / Final Design updates (revision 1.125) / R. Allsman
5 / 9/26/2013 / Formatting updates and WBS inclusion / R. Allsman
6 / 10/10/2013 / Returned SDQA to the Apps model (revision 1.144); TCT approved / R. Allsman
Table of Contents
Change Record i
DMS Use Cases 1
Actors 2
Science Data Calibration and Quality Assessment 4
Science Data Quality Assessment Pipeline 4
Assess Data Quality 6
Assess Data Quality for Calibration Products 7
Assess Data Quality for Nightly Processing at Archive 7
Assess Data Quality for Data Release 8
Assess Data Quality for Nightly Processing 8
Science Data Quality Analyst Toolkit 8
Analyze SDQA Metrics 9
Correlate SDQA metric with other data 9
Correlate SDQA metrics 10
Display SDQA Metrics 10
Science Pipeline Toolkit 10
Science Pipeline Toolkit 11
Configure Pipeline Execution 12
Execute Pipeline 12
Incorporate User Code into Pipeline 12
Monitor Pipeline Execution 13
Select Data to be Processed 13
Select Data to be Stored 13
Upload User Codes 13
Calibration Processing 13
Periodic Calibration Products Production 14
Produce Calibration Data Products 14
Acquire Raw Calibration Exposures 15
Calculate System Bandpasses 15
Calculate Telescope Bandpasses 16
Construct Defect Map 16
Produce Crosstalk Correction Matrix 16
Produce Master Bias Exposure 17
Produce Master Dark Exposure 17
Produce Master Fringe Exposures 18
Produce Master Pupil Ghost Exposure 18
Produce Optical Ghost Catalog 18
Produce Synthetic Flat Exposures 19
Determine Illumination Correction 20
Produce Master Flat-Spectrum Flat Exposures 20
Correct Monochromatic Flats 21
Create Master Flat-Spectrum Flat 21
Create Master Illumination Correction 21
Determine CCOB-derived Illumination Correction 21
Determine Optical Model-derived Illumination Correction 22
Determine Self-calibration Correction-Derived Illumination Correction 22
Determine Star Raster Photometry-derived Illumination Correction 22
Nightly Calibration Products 23
Calculate Atmospheric Models from Calibration Telescope Spectra 23
Prepare Nightly Flat Exposures 24
Reduce Spectrum Exposure 24
Common Image Processing 24
Low-level Image Operations 25
Raw Exposure Processing 25
Calibrate Exposure 26
Combine Raw Exposures 27
Process Raw Exposures to Calibrated Exposure 27
Remove Instrument Signature 27
Assemble CCD 28
Detect Sources 28
Determine Aperture Correction 28
Determine Photometric Zeropoint 29
Determine PSF 29
Determine Sky Background Model 29
Determine WCS 29
Remove Exposure Artifacts 30
Sum Exposures 30
Nightly Processing 31
Prepare for Observing 33
Process Nightly Observing Run 34
Association 35
Perform DIA Source Association 36
Perform DIA Object Association 36
Create Instance Catalog for Visit 36
Associate with Instance Catalog 37
Alert Generation and Distribution 37
Generate and Distribute Alerts 38
Generate Alerts 38
Distribute to Subscribed Brokers 39
Distribute to Subscribed Users 39
DIA Source Detection and Characterization 39
Detect and Characterize DIA Sources 40
Estimate Detection Efficiency 40
Subtract Calibrated Exposure from Template Exposure 41
Detect DIA Sources in Difference Exposure 41
Measure DIA Sources 42
Measure Snap Difference Flux 42
Identify DIA Sources caused by Artifacts 43
Perform Difference Image Forced Photometry 43
Perform Precovery Forced Photometry 43
DIA Object Characterization 44
Update DIA Object Properties 44
Calculate DIA Object Flux Variability Metrics 45
Fit DIA Object Position and Motion 45
Moving Objects Processing 45
Process Moving Objects 46
Find Tracklets 46
Link Tracklets into Tracks 47
Fit Orbit 47
Prune Moving Object Catalog 47
Perform Precovery 48
Recalculate Solar System Object Properties 48
Data Release Processing 49
Perform Global Self-Calibration 50
Produce a Data Release 51
Cross-match Previous Release AstroObject IDs 51
Global Photometric Calibration 51
Perform Global Photometric Calibration 51
Global Astrometric Calibration 52
Perform Global Astrometric Calibration 52
Single Visit Processing 52
Perform Single Visit Processing 53
Measure Single Visit Sources 53
PSF Estimation 54
Perform Full Focal Plane PSF Estimation 54
Difference Image Characterization 54
Detect and Characterize DIA Objects 55
Deep Detection 55
Detect and Characterize AstroObjects 56
Detect Sources on Coadds 57
Image Coaddition 57
Create Template Exposures 58
Create Coadd Exposures 58
Coadd Calibrated Exposures 59
Create Deep Coadd Exposures 59
Create Short Period Coadd Exposures 60
Create Best Seeing Coadd Exposures 60
Create PSF-matched Coadd Exposures 60
Object Characterization 61
Characterize AstroObject Flux Variability 61
Create Sky Coverage Maps 61
Measure AstroObjects 61
Perform Deblending and Association 62
Perform Forced Photometry 62
Model Documentation
DMS Use Cases
WBS:: 02C
The DMS Use Case Model captures the conceptual definition and relationships of the DMS processing elements. It is semantically very close in level to the OSS (Observatory System Specifications) and the DMSR (Data Management System Requirements).
The Use Case Model is expressed in Unified Modeling Language (UML) 2.0. Two types of diagrams are included:
Package diagrams - Show the overall grouping of model elements into topical areas or modeling packages.
Use Case Diagrams - Show the human users and external systems (actors) that interact with the DMS, and the main processes (use cases) that occur within the DMS during operation of the system in response to those interactions.
The elements on the diagrams are each further defined in structured text. This text describes how the processing creates, updates, uses, and/or destroys Domain Classes. In certain cases, a Use Case may "invoke" (perform in-line) another Use Case. Sequencing of the structured text allows branching from, and rejoining, the basic path based on specific criteria.
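The following Python sketch is purely illustrative and not part of the model: it shows one way the scenario semantics used throughout this document (a numbered basic path, "invoke" steps that perform another Use Case in-line, and alternative paths that rejoin the basic path at a numbered step) could be captured as data. The class and field names are assumptions made for this example.

from dataclasses import dataclass, field
from typing import List

@dataclass
class AltPath:
    name: str                 # e.g. "User Selects Data Release"
    steps: List[str]          # step descriptions, e.g. "invoke: <Use Case>"
    rejoins_at: int           # basic-path step number at which this path rejoins

@dataclass
class Scenario:
    use_case: str
    basic_path: List[str] = field(default_factory=list)
    alt_paths: List[AltPath] = field(default_factory=list)

# Example: the "Assess Data Quality" scenario defined later in this section.
assess_dq = Scenario(
    use_case="Assess Data Quality",
    basic_path=[
        "When (user selects Calibration Products)",
        "invoke: Assess Data Quality for Calibration Products",
        "fin:",
    ],
    alt_paths=[
        AltPath(
            name="User Selects Data Release",
            steps=["invoke: Assess Data Quality for Data Release"],
            rejoins_at=3,
        ),
    ],
)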
Figure 1 : DMS Use Cases Packages
This diagram depicts the packages contained in the DMS Use Case Model.
Actors
WBS:: 02C
This package contains the DMS Actors, which are human users of the DMS and external systems with which the DMS interacts.
Figure 2 : Actors
Actor / Description
Public Interface User / This actor represents all users/systems that access LSST public interfaces.
Public Resource Locator / This is an external system that contains locations and/or access information to public astronomical resources, such as surveys, tools, and services.
Pipeline Creator / This actor is any user that has the access necessary to create a new component or pipeline type or a new instance of an existing component or pipeline type and to cause that instance to be available for execution.
Pipeline Operator / This actor is any user with access to cause pipelines to execute, to terminate, or to be stopped and started.
Science User / This actor is any user who has access to LSST Data Products, Pipelines, or both.
Simulator / This actor represents any source of simulated LSST science data, including images, metadata, catalog data, alerts, etc.
Telescope / This is the main LSST Observatory Telescope.
Observatory Operations / This actor has authority to permit LSST Data Products to be released external to the project.
Camera / This actor represents the Camera subsystem of the LSST, including the Science Data Subsystem (SDS) which is the primary Camera interface to the DMS.
Catalog Creator / This actor is any user that has the access necessary to create a new catalog type or a new instance of an existing catalog type and to cause that instance to be populated with data.
Alert Category Author / This is a user that sets up LSST Alert Categories, allowing for later Subscriptions to these Categories.
Auxiliary Telescope / This is the auxiliary telescope used for calibration.
Data Management System Administrator / This actor is any user that has the access necessary to invoke system administration operations (e.g. configure security, equipment, system parameters, etc.) in the LSST Data Management Control System.
LSST Operations / This actor is any user that performs an operational role in the LSST Observatory, including operators and administrators.
Observatory Control System / This actor represents the overall master control system that coordinates the operation of all LSST subsystems.
DMS User / This actor is any user that can access the DMS in any manner. It is the most general class of user, and therefore the least privileged.
DMS-External System / This is any system not part of the DMS with which the DMS has an interface.
Science Data Calibration and Quality Assessment
WBS:: 02C.01.02
Science Data Calibration and Quality Assessment includes
- Science Data Quality Assessment pipelines and toolkits for use by data analysts and scientists to assess the quality of the DMS-generated data;
- Science Data Pipeline Toolkit; and
- Calibration Products Pipeline.
Science Data Quality Assessment Pipeline
WBS:: 02C.01.02.01
Science Data Quality Assessment Pipeline implements the SDQA Pipeline capabilities.
Figure 3 : Science Data Quality Assessment Pipeline
Figure 4 : Assess Data Quality
Assess Data Quality
WBS: 02C.01.02.01
Assess Data Quality allows the user to choose the data type to be examined and assessed.
Scenario / Steps Summary / Rejoins at
Basic Path / 1. When (user selects Calibration Products): (see AltPath: User Selects Data Release) (see AltPath: User selects Nightly Processing at Base) (see AltPath: User selects Nightly Processing at Archive)
2. invoke: Assess Data Quality for Calibration Products
3. fin:
AltPath: User Selects Data Release / 1. invoke: Assess Data Quality for Data Release / Basic Path step:3
AltPath: User selects Nightly Processing at Archive / 1. invoke: Assess Data Quality for Nightly Processing at Archive / Basic Path step:3
AltPath: User selects Nightly Processing at Base / 1. invoke: Assess Data Quality for Nightly Processing at Base / Basic Path step:3
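As a purely illustrative sketch (the function names below are placeholders for this example, not DMS interfaces), the branching in the scenario above amounts to dispatching on the user's data-type selection, invoking the matching assessment use case, and then finishing at step 3.

def assess_dq_for_calibration_products():
    print("Assess Data Quality for Calibration Products")

def assess_dq_for_data_release():
    print("Assess Data Quality for Data Release")

def assess_dq_for_nightly_at_archive():
    print("Assess Data Quality for Nightly Processing at Archive")

def assess_dq_for_nightly_at_base():
    print("Assess Data Quality for Nightly Processing at Base")

def assess_data_quality(selection: str) -> None:
    # Basic path: branch on the user's selection, invoke the matching
    # assessment use case (step 2 or an AltPath step 1), then finish ("fin:").
    dispatch = {
        "Calibration Products": assess_dq_for_calibration_products,
        "Data Release": assess_dq_for_data_release,
        "Nightly Processing at Archive": assess_dq_for_nightly_at_archive,
        "Nightly Processing at Base": assess_dq_for_nightly_at_base,
    }
    if selection not in dispatch:
        raise ValueError(f"Unknown data type selection: {selection!r}")
    dispatch[selection]()

assess_data_quality("Calibration Products")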
Assess Data Quality for Calibration Products
WBS: 02C.01.02.01
Assess Data Quality for Calibration Products
Scenario / Steps Summary / Rejoins at
Basic Path / 1. TBD
Assess Data Quality for Nightly Processing at Archive
WBS: 02C.01.02.01
Assess Data Quality for Nightly Processing at Archive - On completion of a pre-defined number of observing nights, or on command from Observatory Operations, the DMS performs a complete assessment of the overall state of the LSST Data Products and produces Data Product Quality Reports. This assessment examines the SRD-required observatory and mission satisfaction metrics, such as the fields visited in each filter and the percentage of raw images within photometric/astrometric specifications.
Scenario / Steps Summary / Rejoins at
Basic Path / 1. Do Analyze Image Quality:
2. ....Quality of calibration steps (flatfield, debias, defringe, etc.); Flag Outliers
3. ....Analyze artifacts: cosmic rays, CCD traps, bad columns, satellite trails, stray light, etc.; Flag Outliers
4. ....Analyze telescope optical performance: PSF shape over the field and associated wavefront parameters; Flag Outliers
5. ....Determine atmospheric seeing parameters - including spatial correlation; Flag Outliers
6. done:
7. Do Analyze Photometric Quality using several methods:
8. ....Lightcurve analysis
9. ....CMD analysis
10. ....Global consistency of standards; Flag Outliers
11. done:
12. Do Analyze Astrometric Quality:
13. ....Analyze astrometric solutions in the image WCS; Flag outliers
14. ....Analyze proper motion/parallax solutions in the object database; Flag outliers
15. done:
16. Do Analyze Orbit Quality:
17. ....Analyze quality of fit of orbits to observations; Flag Outliers
18. done:
19. Do Analyze Object Properties' Quality:
20. ....Shape; Flag Outliers
21. ....Type classification; Flag Outliers
22. ....Deblending; Flag Outliers
23. ....Photo Z; Flag Outliers
24. done:
25. Analyze Outliers
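Purely as an illustration of the recurring "Flag Outliers" step in the scenario above (the statistic, threshold, and metric name are assumptions for this example, not DMS requirements), one minimal approach is to flag metric values that deviate from the sample median by more than a few robust standard deviations.

import statistics

def flag_outliers(values, n_sigma=3.0):
    """Return indices of values more than n_sigma robust sigmas from the median."""
    med = statistics.median(values)
    # Median absolute deviation, scaled to approximate a Gaussian sigma.
    mad = statistics.median([abs(v - med) for v in values])
    sigma = 1.4826 * mad if mad > 0 else float("inf")
    return [i for i, v in enumerate(values) if abs(v - med) > n_sigma * sigma]

# Example: flag visits whose measured PSF FWHM (arcsec) is anomalous.
psf_fwhm = [0.71, 0.69, 0.73, 0.70, 1.45, 0.72]
print(flag_outliers(psf_fwhm))  # -> [4]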
Assess Data Quality for Data Release
WBS: 02C.01.02.01
Assess Data Quality for Data Release
Scenario / Steps Summary / Rejoins at
Basic Path / 1. TBD
Assess Data Quality for Nightly Processing
WBS: 02C.01.02.01
Assess Data Quality for Nightly Processing
Scenario / Steps Summary / Rejoins at
Basic Path / 1. TBD
Science Data Quality Analyst Toolkit
WBS:: 02C.01.02.02
Figure 5 : Science Data Quality Analysis
Analyze SDQA Metrics
WBS: 02C.01.02.02
Analyze SDQA Metrics - When provided with selection and output criteria, the system acquires and formats the data according to the user's output preference.
GIVEN:
The analyst is using a web-based tool to perform these tasks.
Scenario / Steps Summary / Rejoins at
Basic Path / 1. System displays the Task Selection Page.
2. SDQA Analyst selects the "Analyze SDQA Metrics" task.
3. When (SDQA Metrics selections exist): (see AltPath: SDQA Metric not populated yet)
4. System displays the SDQA Metric Set Up Page.
5. SDQA Analyst selects the SDQA Metrics, Sky Region, Focal Plane Region, Time Range, and output format.
6. System queries the SDQA Data Archive.
7. System generates SDQA Results.
8. System formats them for output and displays them on the SDQA Results Page.
9. fin:
AltPath: SDQA Metric not populated yet / 1. System generates partial SDQA Results and displays a warning that not all SDQA Results are available. / Basic Path step:1
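As a hypothetical sketch of the basic path above (the record layout, field names, and filtering logic are assumptions for illustration, not the SDQA toolkit's actual interface), selecting metrics with sky, focal-plane, and time constraints and formatting the results for display might look like the following.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MetricSelection:
    metric_names: List[str]                          # e.g. ["psf_fwhm", "zeropoint_rms"]
    sky_region: Tuple[float, float, float, float]    # (ra_min, ra_max, dec_min, dec_max) in degrees
    focal_plane_region: str                          # e.g. a raft name, or "all"
    time_range: Tuple[float, float]                  # (mjd_start, mjd_end)
    output_format: str = "table"

def analyze_sdqa_metrics(selection, archive_rows):
    """Filter archived SDQA metric records against the selection and format them."""
    ra0, ra1, dec0, dec1 = selection.sky_region
    t0, t1 = selection.time_range
    results = [
        r for r in archive_rows
        if r["metric"] in selection.metric_names
        and ra0 <= r["ra"] <= ra1 and dec0 <= r["dec"] <= dec1
        and t0 <= r["mjd"] <= t1
        and selection.focal_plane_region in ("all", r["raft"])
    ]
    if selection.output_format == "table":
        # Simple tabular formatting for the SDQA Results Page.
        return [(r["metric"], r["mjd"], r["value"]) for r in results]
    return results

# Example usage with a toy archive record:
rows = [{"metric": "psf_fwhm", "ra": 10.2, "dec": -5.1, "mjd": 60001.1,
         "raft": "R22", "value": 0.72}]
sel = MetricSelection(["psf_fwhm"], (0, 20, -10, 0), "all", (60000, 60010))
print(analyze_sdqa_metrics(sel, rows))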
Correlate SDQA metric with other data
WBS: 02C.01.02.02