.. _data-full:

Full data release
=================

You can download the full SNR time series from the GstLAL search and the
sky-localization data here (download the file called ``online.tar.gz``):
`online.tar.gz`_

.. _`online.tar.gz`: ftp://data1.commons.psu.edu/pub/commons/physics/jfs6271/

.. code-block:: shell

   $ tar -xvzf online.tar.gz

Description of the data
-----------------------

Inside the folder ``data``, there are 6 directories corresponding to the
6 high-frequency analyses:

29, 32, 38, 49, 56, 1024

Inside each of these directories, there is:

1. a directory for the **SNR time series** of the signals: ``gstlal-snr-series``
2. a directory for the **FITS files** from BAYESTAR: ``process_dag/bayestar-localize-coincs``
3. a directory for the **statistics** of the sky maps: ``process_dag/ligo-skymap-stats``
4. a directory containing the **sky-map plots**: ``process_dag/ligo-skymap-plot``

These 4 directories all contain sub-directories named after the first 5
digits of the GPS time of the injection (from 10000 to 10025). Each
injection has one file in each of these 4 directories, with the unique
GPS time of that injection in the file name:

| gstlal-snr-series/${five gps digits}/$*_${gpstime}_event.xml.gz
| process_dag/bayestar-localize-coincs/${five gps digits}/${gpstime}.fits
| process_dag/ligo-skymap-stats/${five gps digits}/${gpstime}.dat
| process_dag/ligo-skymap-plot/${five gps digits}/${gpstime}.pdf

Note that the ``$*`` in the SNR time-series file name corresponds to a
simulation id, but the file can still be uniquely identified by its GPS
time.

Reproduce the results
---------------------

Note that we have made use of `HTCondor`_ for our workflows.

.. _`HTCondor`: https://research.cs.wisc.edu/htcondor/dagman/dagman.html

However, the sky maps can be made without Condor. At the moment, the
other workflows require the use of HTCondor. `Contact us`_ for more
information.

.. _`Contact us`: _static/contact.html

Remake the sky maps
-------------------

We used ligo.skymap 0.4.
To use later versions, you will need to convert the format of the SNR
time series to use python-ligo-lw instead of glue.

.. code-block:: shell

   $ python3.6 -m venv skimps_0p4
   $ source skimps_0p4/bin/activate
   $ pip install --upgrade pip
   $ pip install 'astropy<4'
   $ pip install 'ligo.skymap<0.5.0'

If you used the method described above to install ligo.skymap and want to
make use of the ``--tmp-space`` option, edit the file
``skimps_0p4/lib/python3.6/site-packages/glue/ligolw/dbtables.py``
according to
https://git.ligo.org/kipp.cannon/python-ligo-lw/-/commit/bdc247db729dac346a3ea691973dddeb141a6487?merge_request_iid=17

**NOTE**: If you install ligo.skymap using pip, it will also install
glue. If you want to make use of the ``--tmp-space`` option for the
``ligolw_sqlite`` jobs inside ``process_input.dag``, you will need to
patch the glue installation with
https://git.ligo.org/kipp.cannon/python-ligo-lw/-/merge_requests/17/diffs?commit_id=bdc247db729dac346a3ea691973dddeb141a6487,
or you can choose to remove the ``--tmp-space`` options from the
``ligolw_sqlite`` jobs.

.. code-block:: shell

   $ cd data/
   $ sh make_dags.sh

This will make a dag for each frequency directory that produces the sky
map, the sky-map statistics, and the sky-map plot for each recovered
injection at that frequency. The output files will be in the format
mentioned above. For each directory inside ``${f_high}/process_dag/``,
run:

.. code-block:: shell

   $ condor_submit_dag process_input.dag

Once all the dags finish, run:

.. code-block:: shell

   $ sh post_process.sh

This will produce the paper plots inside the ``post_process`` directory.
The BAYESTAR output ``txt`` files are also inside the ``post_process``
directory, in the form ``post_process/29.dat``, etc. These contain the
aggregated statistics for all the recovered injections at the
corresponding frequency.

Remake the SNR time series from GstLAL
--------------------------------------

To make the SNR time series from GstLAL, there are two steps.
The first is performing the six searches; the second is making the SNR
time-series files for each recovered injection for the six runs.

1. GstLAL runs

   * Download the `GW frames`_ (download the files H1.tar.gz, L1.tar.gz,
     and V1.tar.gz):

     .. _`GW frames`: ftp://data1.commons.psu.edu/pub/commons/physics/jfs6271/

     .. code-block:: shell

        $ tar -xvzf H1.tar.gz
        $ tar -xvzf L1.tar.gz
        $ tar -xvzf V1.tar.gz

     Make a ``frame.cache`` to provide to the analysis:

     .. code-block:: shell

        $ ls H1/H-H1_FAKE-100*/* L1/L-L1_FAKE-100*/* V1/V-V1_FAKE-100*/* | lalapps_path2cache > frame.cache

   * Clone the data-release repository for easy access to the files:

     .. code-block:: shell

        $ git clone https://git.ligo.org/gstlal/ewgw-data-release.git

   * Install the necessary libraries. The libraries we used for our runs
     are:

     .. code-block:: shell

        gds-2.18.17
        ldas-tools-framecpp-2.6.5
        ldas-tools-al-2.6.2
        swig-3.0.7
        metaio-8.5.1
        libframe-8.30
        gst-python-1.14.3
        gst-plugins-good-1.14.3
        gst-plugins-base-1.14.3
        gstreamer-1.14.3
        gsl-1.16
        orc-0.4.26
        fftw-3.3.8
        ligo-scald-0.6.0
        lalapps-6.24.0
        lalinference-1.11.0
        lalpulsar-1.18.0
        lalinspiral-1.9.0
        lalburst-1.5.2
        lalsimulation-1.9.0
        lalmetaio-1.5.1
        lalframe-1.4.5
        lal-6.20.1
        gstlal from git, O3b_early_warning branch, git hash 374821e315774368a0d6568ec3da5201c4b5833d

   * The GstLAL analysis Makefiles for the different frequencies are here
     (in the ewgw-data-release repository):
     ``ewgw-data-release/analysis-files/Makefiles/${freq}/Makefile.early_warning_offline``

     Open ``Makefile.early_warning_offline`` in an editor and fill in the
     paths to the following directories:

     | GSTLAL_GIT_DIR= # gstlal git repo
     | FRAMES_DIR= # path to directory with GW frames
     | EWGW_DATA_RELEASE_DIR= # path to the ew-gw data release repo
     | WEBDIR= # path to web page for results

     After sourcing the environment with all the libraries:

     .. code-block:: shell

        $ export GSTLAL_FIR_WHITEN=0
        $ make -f Makefile.early_warning_offline
        $ condor_submit_dag trigger_pipe.dag

2. Producing the SNR time series
   Once all the dags finish, download this patch for gstlal and install
   it: :download:`Download patch <_static/0001-extractor_dag_patch.patch>`

   Create a new directory inside each of the run directories and run the
   following commands:

   .. code-block:: shell

      $ gstlal_inspiral_coinc_extractor_dag --injection-database=../H1L1V1-ALL_LLOID_bns_astrophysical_1000000000_1002592000-1000000000-2591000.sqlite --far-threshold 3.8580246913580245e-07 --tmp-space=_CONDOR_SCRATCH_DIR
      $ condor_submit_dag trigger_pipe.dag

   This dag will create the SNR time series inside a directory called
   ``bayestar_input``.
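The per-injection file layout described in the data-release section above
can be sketched in Python. This is only an illustration of the naming
convention (first five GPS digits as sub-directory, full GPS time in the
file name); the helper ``injection_paths`` is hypothetical and not part of
the release:

```python
# Sketch (assumed convention): build the four per-injection output paths
# for one GPS time, following the layout quoted in "Description of the data".
from pathlib import Path


def injection_paths(base, gpstime):
    """Return the per-injection output paths under one frequency directory."""
    base = Path(base)
    prefix = str(int(gpstime))[:5]  # first five digits of the GPS time
    return {
        # the SNR-series file also carries a simulation id, hence a glob pattern
        "snr_series_glob": str(base / "gstlal-snr-series" / prefix
                               / f"*_{gpstime}_event.xml.gz"),
        "fits": base / "process_dag" / "bayestar-localize-coincs" / prefix
                / f"{gpstime}.fits",
        "stats": base / "process_dag" / "ligo-skymap-stats" / prefix
                 / f"{gpstime}.dat",
        "plot": base / "process_dag" / "ligo-skymap-plot" / prefix
                / f"{gpstime}.pdf",
    }


paths = injection_paths("data/29", 1000012345)
print(paths["fits"])
```

Here ``data/29`` stands for one of the six frequency directories; a GPS
time such as ``1000012345`` falls under the ``10000`` sub-directory.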