Develop on an LDAS instance

To develop either of the two main components (GraceDB and GWCelery) of the LLAI (Low Latency Alert Infrastructure) you need a working test/deployment infrastructure and a pipeline emulator that mimics the uploads from the analysis pipelines (meg, the mock-event-generator). In this part of the guide we give a quick tour of how to get such an environment ready and working using Minikube. We refer to the instructions for each single component to install them on a different K8s system.

Projects involved in LLAI development

The first step is to clone the different project repositories. The general procedure for proposing code changes requires that you create private forks. For testing purposes only, you may clone the master repositories of the projects directly:

git clone https://git.ligo.org/computing/gracedb/server.git      server-gracedb
git clone https://git.ligo.org/emfollow/gwcelery.git             gwcelery
git clone https://git.ligo.org/emfollow/mock-event-generator.git meg

The suggested way to do development is to create private forks of these projects in your own namespace. Once you have your private forks, you can clone them with:

git clone git@git.ligo.org:roberto.depietri/gracedb.git               server-gracedb
git clone git@git.ligo.org:roberto.depietri/gwcelery.git              gwcelery
git clone git@git.ligo.org:roberto.depietri/mock-event-generator.git  meg
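The fork URLs above use one developer's namespace; the following small sketch (the `clone_forks` helper and the `albert.einstein` namespace are hypothetical, not part of the repositories) generates the same clone commands for your own namespace:

```shell
#!/bin/sh
# Hypothetical helper: print the clone commands for forks living in a
# given git.ligo.org namespace, mirroring the upstream repository ->
# local-directory layout used above.
clone_forks() {
    ns=$1
    for pair in "gracedb:server-gracedb" \
                "gwcelery:gwcelery" \
                "mock-event-generator:meg"; do
        repo=${pair%%:*}   # repository name on git.ligo.org
        dir=${pair#*:}     # local checkout directory
        echo "git clone git@git.ligo.org:${ns}/${repo}.git ${dir}"
    done
}

# Replace the namespace with your own GitLab user name.
clone_forks "albert.einstein"
```

Piping the printed commands to `sh` would perform the actual clones.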

Projects needed for LLAI deployment

Here is a description of all the components and projects needed for a fast, cutting-edge installation of the whole LLAI in a Minikube K8s cluster.

git clone https://git.ligo.org/computing/gracedb/k8s/helm.git         gracedb-helm
git clone https://git.ligo.org/emfollow/k8s/helm.git                  gwcelery-helm
git clone https://git.ligo.org/emfollow/k8s/llai-deploy-sandboxed.git llai-deploy
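Before attempting the deployment it can save time to check the local toolchain. The tool list below is an educated guess based on the repositories involved (Git, kubectl, Helm, Minikube), not an official requirements list:

```shell
#!/bin/sh
# Sketch: verify that the tools a Minikube + Helm deployment relies on
# are available on the PATH. The list is an assumption, not official.
check_tools() {
    for tool in git kubectl helm minikube; do
        if command -v "$tool" >/dev/null 2>&1; then
            echo "found:   $tool"
        else
            echo "missing: $tool"
        fi
    done
}
check_tools
```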

Fast deployment instructions for Minikube

This gives a fast way to obtain a functional sandboxed installation of the LLAI, including GWCelery, GraceDB, and a Hopskotch server (SCIMMA).
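As a rough sketch of the first step (the cluster sizing here is a guess, and the authoritative chart invocations and values live in the llai-deploy repository), the sandbox starts with a local Kubernetes cluster:

```shell
#!/bin/sh
# Hedged sketch: start a local Kubernetes cluster for the sandbox.
# Resource sizes are guesses; tune them to your machine and to the
# requirements documented in llai-deploy.
start_sandbox() {
    if ! command -v minikube >/dev/null 2>&1; then
        echo "minikube not installed -- see https://minikube.sigs.k8s.io"
        return 1
    fi
    minikube start --cpus 4 --memory 8192   # single-node local cluster
    kubectl get nodes                       # sanity check: node should be Ready
}
start_sandbox || true
```

From there, the Helm charts cloned above (gracedb-helm, gwcelery-helm) are installed following the llai-deploy instructions.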