
Import tool from GeoNature to a PostgreSQL database through Export module API (client side)

Project description


This project provides a tool to import data from GeoNature instances into a PostgreSQL database (client side). It is largely inspired by ClientApiVN.


Project Setup

GN2PG Client can be installed with pip. It requires Python 3.7.4 or above.

pip install gn2pg-client
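To confirm the installation succeeded, you can query Python's packaging metadata (a minimal sketch; the distribution name matches the pip command above, and `importlib.metadata` needs Python 3.8+):

```python
# Check whether the package is installed, using only the standard library
# (importlib.metadata requires Python 3.8+).
from importlib.metadata import PackageNotFoundError, version

def installed_version(dist="gn2pg-client"):
    """Return the installed version string, or None when the package is absent."""
    try:
        return version(dist)
    except PackageNotFoundError:
        return None

print(installed_version() or "gn2pg-client is not installed")
```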

Issues

Please report any bugs or feature requests using the GitHub issue tracker!

HowTo

Help

gn2pg_cli --help

Init config file

This command initializes a TOML config file in the hidden ~/.gn2pg directory (in your HOME directory), named as you wish. PLEASE DO NOT SPECIFY A PATH, only the file name!

gn2pg_cli --init <myconfigfile>

The config file is structured as shown below. The [[source]] block can be duplicated as many times as needed (one block per source).

The data_type value of each source characterizes the type of data and determines which triggers fire when data is inserted, updated, or deleted. The trigger configs currently provided are:

  • synthese_with_cd_nomenclature, which provides triggers that insert basic data into synthese and generate basic metadata (acquisition frameworks and datasets). A sample source query is provided in the file geonature_export_sinp_with_cd_nomenclature.sql

  • synthese_with_metadata, which provides triggers that insert data into synthese and populate most of the metadata (acquisition frameworks, datasets, actors such as organisms and roles, territories, etc.). A sample source query is provided in the file geonature_export_sinp_with_metadata.sql

# GN2PG configuration file

# Local db configuration
[db]
db_host = "localhost"
db_port = 5432
db_user = "<dbUser>"
db_password = "<dbPassword>"
db_name = "<dbName>"
db_schema_import = "schema"
    # Additional connection options (optional)
    [db.db_querystring]
    sslmode = "prefer"


# Source configuration,
# Duplicate this block for each source (1 source = 1 export)
[[source]]
# Source name, used to tag stored data in the import table
name = "Source1"
# GeoNature source login
user_name = "<monuser>"
# GeoNature source password
user_password = "<monPwd>"
# GeoNature source URL
url = "<http://geonature1/>"
# GeoNature source Export id
export_id = 1
# Data type is optional. The default value is 'synthese', for which triggers from to_gnsynthese.sql are not activated.
# If you want to insert your data into a GeoNature database please choose either 'synthese_with_cd_nomenclature' or 'synthese_with_metadata'.
# If not, delete the line.
data_type = "synthese_with_cd_nomenclature"


[[source]]
# Source configuration
name = "Source2"
user_name = "<monuser>"
user_password = "<monPwd>"
url = "<http://geonature2/>"
export_id = 1
data_type = "synthese_with_cd_nomenclature"

InitDB Schema and tables

To create the JSON tables where data will be stored, run:

gn2pg_cli --json-tables-create <myconfigfile>

Full download

To download all data from the API, run:

gn2pg_cli --full <myconfigfile>

Incremental download

To update data changed since the last download, run:

gn2pg_cli --update <myconfigfile>

To automate updates, you can schedule a cron task such as the following, which runs every 30 minutes:

*/30 * * * * /usr/bin/env bash -c "source <path to python environment>/bin/activate && gn2pg_cli --update <myconfigfile>" > /dev/null 2>&1
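If an update can take longer than the cron interval, two runs may overlap. One way to guard against that (a sketch, not part of GN2PG itself; the lock-file path is an arbitrary choice) is to wrap the call in an exclusive, non-blocking file lock:

```python
# Run a command only if no other instance holds the lock (POSIX only, uses fcntl).
import fcntl
import subprocess
from pathlib import Path

def run_exclusively(cmd, lock_path):
    """Return the command's exit code, or None when another run holds the lock."""
    lock_path = Path(lock_path)
    lock_path.parent.mkdir(parents=True, exist_ok=True)
    with open(lock_path, "w") as fh:
        try:
            fcntl.flock(fh, fcntl.LOCK_EX | fcntl.LOCK_NB)
        except BlockingIOError:
            return None  # a previous update is still running, skip this one
        return subprocess.run(cmd).returncode

# From cron, one would call e.g.:
# run_exclusively(["gn2pg_cli", "--update", "myconfigfile"],
#                 Path.home() / ".gn2pg" / "update.lock")
```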

Debug mode

Debug mode can be activated with the --verbose CLI argument.

Logs

Log files are stored in $HOME/.gn2pg/log directory.
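When troubleshooting, it can help to look at the most recent log file first. A minimal sketch (the directory is the one named above; individual file names depend on your configuration):

```python
# List the most recently modified GN2PG log files (newest first).
from pathlib import Path

def latest_logs(log_dir, count=5):
    """Return up to `count` files from log_dir, most recently modified first;
    an empty list when the directory does not exist yet."""
    log_dir = Path(log_dir)
    if not log_dir.is_dir():
        return []
    files = [p for p in log_dir.iterdir() if p.is_file()]
    return sorted(files, key=lambda p: p.stat().st_mtime, reverse=True)[:count]

for path in latest_logs(Path.home() / ".gn2pg" / "log"):
    print(path)
```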

Import data into a GeoNature database

The default script used to populate a GeoNature database automatically is called “to_gnsynthese”.

gn2pg_cli --custom-script to_gnsynthese <myconfigfile>

Contributing

All development must be done in forks (see the GitHub documentation).

Pull requests must target the dev branch.

Install project and development requirements (require poetry):

poetry install

Make your changes and open pull requests.

Test gn2pg_cli in dev mode by running this command:

poetry run gn2pg_cli <options>

Regenerate the requirements file for developers not using poetry:

poetry export -f requirements.txt > requirements.txt

Licence

GNU AGPLv3

Team

  • @ophdlv (Natural Solution), contributor

  • @mvergez (Natural Solution), contributor

  • @andriacap (Natural Solution), contributor

  • @Adrien-Pajot (Natural Solution), contributor


With the financial support of the DREAL Auvergne-Rhône-Alpes and the Office français de la biodiversité.


Project details


Download files

Download the file for your platform.

Source Distribution

gn2pg_client-1.4.0.tar.gz (44.9 kB)

Uploaded Source

Built Distribution

gn2pg_client-1.4.0-py3-none-any.whl (47.0 kB)

Uploaded Python 3

File details

Details for the file gn2pg_client-1.4.0.tar.gz.

File metadata

  • Download URL: gn2pg_client-1.4.0.tar.gz
  • Size: 44.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.2 CPython/3.10.9 Linux/5.15.0-1031-azure

File hashes

Hashes for gn2pg_client-1.4.0.tar.gz:

  • SHA256: dac48b31b2336d6b4b2ce43e401cac8d9afb80642801bf1a5c5ab9695c9aeae0

  • MD5: e1a2e85c4a04b37d5b0a28c82bab2aa6

  • BLAKE2b-256: bbf5fa2955505babf6bd4beba51a4e7604aba52740dbd551c5324bb609c83ff7


File details

Details for the file gn2pg_client-1.4.0-py3-none-any.whl.

File metadata

  • Download URL: gn2pg_client-1.4.0-py3-none-any.whl
  • Size: 47.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.2 CPython/3.10.9 Linux/5.15.0-1031-azure

File hashes

Hashes for gn2pg_client-1.4.0-py3-none-any.whl:

  • SHA256: 31f6046da85d4f797d57fc89cffecef1c96af25d79b36ef834013dd37bdc6c6a

  • MD5: 33b391331160258ce8576a3cd6960484

  • BLAKE2b-256: 239cba59ca8c39e5b0df161daf5cf76ea10f33980630d50e41b60620ef04fc1e

