ATDB-LDV
ATDB-LDV is the Django backend application for LDV. Its main purpose is to hold the state of all tasks that are in progress for LDV. ATDB does not control the flow itself; that is handled by a cloud of microservices. Communication is done through the REST API.
Microservices (in a separate repo)
Project Documentation
Confluence Page:
Specifications:
These diagrams roughly serve as the specifications for adapting ATDB for LDV.
- workflow: https://support.astron.nl/confluence/display/LDV/WORKFLOW
- datamodel: https://dbdiagram.io/d/5ffc5fb180d742080a35d560
Datamodel:
Workflow/Status Diagram:
GUI implementation
Deployed Instances
Main GUI:
Admin interface:
REST API (prod)
serializers:
get_size:
Returns the sum of the sizes of all tasks with a given list of statuses:
- https://sdc.astron.nl:5554/atdb/tasks/get_size
- https://sdc.astron.nl:5554/atdb/tasks/get_size?status__in=staged,processing,processed,validating,validated,ingesting,removing,removed
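As an illustration, the query above can be assembled and called from Python. This is a sketch only: the host, path, and status names are taken from the URLs above, but the shape of the JSON response and any authentication requirements are not documented here.

```python
from urllib.parse import urlencode

# Production instance, as listed in the URLs above.
ATDB_BASE = "https://sdc.astron.nl:5554/atdb"

def get_size_url(statuses):
    """Build the get_size URL for a list of task statuses."""
    query = urlencode({"status__in": ",".join(statuses)})
    return f"{ATDB_BASE}/tasks/get_size?{query}"

url = get_size_url(["staged", "processing", "processed"])
print(url)
# The URL can then be fetched with e.g. urllib.request.urlopen(url)
# or requests.get(url).
```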
Build & Deploy
Deployment Diagram:
CI/CD (semi) automatic deploy in Docker
A GitLab CI/CD pipeline automatically builds after every commit. Branches can be deployed to sdc-dev (test); master can be deployed to sdc-dev (test) or sdc (production).
Manual deploy in Docker (alternative to CI/CD)
initial:
> cd ~/my_docker
> mkdir atdb-ldv
> cd atdb-ldv
> git clone https://git.astron.nl/astron-sdc/atdb-ldv.git
update:
> export DOCKER_BUILD_DIR=$HOME/my_docker/atdb-ldv/atdb-ldv/atdb
> export DOCKER_COMPOSE_DIR=$DOCKER_BUILD_DIR/docker
> cd $DOCKER_BUILD_DIR
> git pull
> docker build -t atdb-ldv:latest .
> cd $DOCKER_COMPOSE_DIR
> docker-compose -p atdb up -d
Database changes and migrations
When models.py is changed, the database must be migrated. The procedure is as follows.
on the CI/CD page: https://git.astron.nl/astron-sdc/atdb-ldv/-/pipelines
- when the automatic build is finished, push '>>' to deploy
on the 'sdc-dev.astron.nl' (sdc@dop814) and 'sdc.astron.nl' (sdco@dop821) machines:
> docker exec -it atdb-ldv python manage.py makemigrations --settings atdb.settings.docker_sdc
(this should say 'No changes detected', but do this step anyway as a check)
> docker exec -it atdb-ldv python manage.py migrate --settings atdb.settings.docker_sdc
CI tests
Tests are executed in the CI pipeline through the test stage in the .gitlab-ci.yml file.
For running the tests in the CI pipeline, the settings file atdb/settings/test_ci.py is used.
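For orientation, such a CI settings module usually just overrides the database so Django can reach the Postgres service of the CI runner. The following is a hypothetical sketch, not the actual contents of the file; every name and credential below is an assumption.

```python
# Hypothetical sketch of a CI-only Django settings override.
# The database name, host, and credentials are assumptions for
# illustration; the real file in the repo is authoritative.
SECRET_KEY = "ci-only-dummy-key"  # tests do not need a real secret
DEBUG = False

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "atdb_test",
        "USER": "postgres",
        "PASSWORD": "postgres",
        "HOST": "postgres",  # typical service host name in GitLab CI
        "PORT": 5432,
    }
}
```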
Local development
Set up
- Clone the repo
- Install Postgres and create a database (the names and credentials below match atdb/atdb/settings/dev.py):
  > sudo apt install postgresql postgresql-contrib libpq-dev python3-dev
  > sudo -u postgres psql
  In the psql shell:
  CREATE DATABASE atdb_ldv_12jan2024;
  CREATE USER atdb_admin WITH PASSWORD 'atdb123';
  ALTER ROLE atdb_admin SET client_encoding TO 'utf8';
  ALTER ROLE atdb_admin SET default_transaction_isolation TO 'read committed';
  ALTER ROLE atdb_admin SET timezone TO 'UTC';
  GRANT ALL PRIVILEGES ON DATABASE atdb_ldv_12jan2024 TO atdb_admin;
  CREATE ROLE ldvstats;
  \q
- Restore a database dump atdbdb.sql from its parent directory:
  > sudo pg_restore -h localhost -U atdb_admin -d atdb_ldv_12jan2024 < atdbdb.sql
  The password it asks for is atdb123, not your sudo password.
- Make sure you have Python installed. Then go to the atdb directory (not the second atdb/atdb) - it's where manage.py lives. Create a virtual environment and install dependencies:
  > python3 -m venv venv
  > source venv/bin/activate
  > python -m pip install -r requirements/dev.txt
  > python manage.py migrate --settings=atdb.settings.dev
Start the server
To start the server, run:
python manage.py runserver --settings=atdb.settings.dev
However, if you need to log in to the local server, you will need some environment variables for Keycloak.
In the atdb directory, where the file run.sh.example lives, duplicate it and rename it to run.sh.
Fill in the Keycloak client secret; you can find it in the Keycloak dashboard. Do not commit this file.
Then you can start the server by running the file: source ./run.sh
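run.sh works by exporting environment variables before starting the server, and the Django settings read them at startup. A minimal illustration of that pattern follows; the variable name here is hypothetical, so use the names listed in run.sh.example.

```python
import os

# Hypothetical variable name for illustration only; the real names
# are listed in run.sh.example.
KEYCLOAK_CLIENT_SECRET = os.environ.get("KEYCLOAK_CLIENT_SECRET", "")

if not KEYCLOAK_CLIENT_SECRET:
    # Without the secret, logging in to the local server will fail.
    print("KEYCLOAK_CLIENT_SECRET is not set")
```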
Migrations
When models.py is changed, the database must be migrated.
In the atdb directory, run:
python manage.py makemigrations --settings=atdb.settings.dev
python manage.py migrate --settings=atdb.settings.dev
Then add and commit the resulting migration file.
Running tests
To run tests, you can spin up a dedicated test database locally with docker. This test database will not interfere with your local development database.
docker compose -f docker/docker-compose-test-local.yml up -d
python manage.py test --settings atdb.settings.test_local
Dedicated settings for running the tests are provided in the atdb/settings/test_local.py file. For convenience, test.bat is provided to run the above commands (Windows only).