Setting up the SGNL online analysis

This document describes how to set up an online CBC analysis with SGNL.

Prepare container

Follow the installation guide and build a Singularity container.

Install pastro

To run an online analysis with the sgnl-ll-pastro-uploader, the pastro repo needs to be installed. A common practice is to install git repos under CONTAINER_NAME/src.

cd CONTAINER_NAME/src
git clone -b sgnl git@git.ligo.org:gstlal/pastro.git
cd pastro
pip install -e .

Build sif container

singularity build CONTAINER_NAME.sif CONTAINER_NAME

Prepare working directory

In your working directory, copy over the following files from the repo:

  1. config/online_dag.yml:

    This is the config file for generating the online analysis workflow. Modify the options to match your setup. Note that the container: field should point to the .sif container.

  2. config/cbc_db.yaml:

    This is the config file for creating trigger databases.
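For orientation, a minimal excerpt of online_dag.yml might look like the following. This is an illustrative sketch only: the container: field and the dcc: section are described in this document, the path shown is a placeholder, and the full set of options is defined by the config file shipped in the repo.

```yaml
# Illustrative excerpt -- copy config/online_dag.yml from the repo
# for the authoritative set of options.
container: /path/to/CONTAINER_NAME.sif   # must point to the built .sif image
```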

Download DCC files

Input files such as the template bank, reference PSD, and mass model can be downloaded from the LIGO DCC by adding a dcc section to your config file. The entries under files use dot notation to reference paths elsewhere in the config:

dcc:
  number: T2200343-v3
  public: True
  files:
    - paths.reference-psd
    - paths.template-bank
    - prior.mass-model
    - pastro.FGMC.mass-model

Then run:

singularity exec CONTAINER_NAME sgnl-ll-dagger --dcc -c <online config file>

This will automatically download the specified files, move them to the directories indicated in the config, and create symlinks back to the DCC archive directory.

Initialize

After preparing the config file and downloading the necessary DCC files, run the initialization step to split the template bank and generate SVD bin options:

singularity exec CONTAINER_NAME sgnl-ll-dagger --init -c <online config file>

Create Workflow

Workflows can be created by running:

singularity exec CONTAINER_NAME sgnl-ll-dagger -c <online config file> -w <workflow>

Currently the supported online workflows are:

  1. setup
    1. setup-prior
  2. inspiral
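Putting the steps together, a typical sequence creates and submits the setup workflow first, then the inspiral workflow once setup has finished. The config filename online_dag.yml below is an assumption; substitute your own, and use the actual dag filename each workflow produces.

```shell
# Assumed config filename (online_dag.yml) -- substitute your own.
singularity exec CONTAINER_NAME sgnl-ll-dagger -c online_dag.yml -w setup
condor_submit_dag <dag_name>.dag   # the dag file produced by the setup workflow

# After the setup dag completes:
singularity exec CONTAINER_NAME sgnl-ll-dagger -c online_dag.yml -w inspiral
condor_submit_dag <dag_name>.dag   # the dag file produced by the inspiral workflow
```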

After creating a workflow, launch the corresponding dag:

condor_submit_dag <dag_name>.dag
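To check on a submitted dag, the standard HTCondor tools apply. These commands are generic HTCondor usage, not SGNL-specific:

```shell
condor_q                            # summary of your jobs in the queue
condor_q -dag                       # group jobs under their parent DAGMan job
tail -f <dag_name>.dag.dagman.out   # follow DAGMan's progress log
```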

Setup workflow

The setup workflow prepares all data products needed before running the online inspiral analysis. It consists of two stages:

  1. SVD bank generation: Using the split bank files from the init step and the reference PSD, generates SVD banks for each interferometer. The SVD decomposition provides a compact representation of the template bank that reduces computational cost during filtering.

  2. Prior likelihood ratio: Creates the initial prior likelihood ratio files and empty zerolag PDF files that the inspiral workflow will populate during the analysis.

Setup-prior workflow

The setup-prior workflow runs only the prior likelihood ratio stage of setup. Use this when SVD banks are already available from a previous run and only the prior files need to be regenerated.

Inspiral workflow

The inspiral workflow is the main online analysis. It creates a long-running dag with the following components:

  • Matched filtering: Reads detector data from shared memory and performs matched filtering using the SVD banks to identify CBC candidates in real time. If injections are enabled, a parallel injection filtering job runs alongside.

  • Likelihood ratio marginalization: Continuously updates the background noise model by marginalizing likelihood ratio statistics across SVD bins.

  • Noise tracking: Monitors PSD evolution and detector data quality metrics.

  • Event handling (when Kafka is configured):

    • Event counting: Tracks candidate events from the zerolag analysis.
    • Event uploading: Uploads significant candidates to GraceDB.
    • SNR optimization: Optionally searches for a better-matched template with higher SNR for uploaded candidates.
    • P-astro uploading: Computes and uploads source classification probabilities.
    • Event plotting: Generates diagnostic plots for uploaded candidates.
    • Metrics collection: Aggregates pipeline performance metrics for monitoring.