
Import tool from GeoNature to a PostgreSQL database through the Export module API (client side)

Project description

Python 3.7+ · PostgreSQL 10+ · packaging tool: Poetry · code style: Black · licence: AGPL-3.0

This project provides a tool to import data from GeoNature instances into a PostgreSQL database (client side). It is largely inspired by ClientApiVN.


Project Setup

GN2PG Client can be installed with pip. It requires Python 3.7.4 or above.

pip install gn2pg-client

Issues

Please report any bugs or feature requests using the GitHub issue tracker!

HowTo

Help

gn2pg_cli --help

Init config file

This command initializes a TOML config file, named as you wish, in the hidden ~/.gn2pg directory (in your HOME directory). PLEASE DO NOT SPECIFY A PATH!

gn2pg_cli --init <myconfigfile>

The config file is structured as shown below. The [[source]] block can be duplicated as many times as needed (one block for each source).

The data_type value on each source characterizes the type of data. It determines which triggers fire when data is inserted, updated, or deleted. The trigger configs currently provided are:

  • synthese_with_cd_nomenclature, which provides triggers to insert raw data into synthese and generate basic metadata (acquisition frameworks and datasets). A sample source query is provided in the file geonature_export_sinp_with_cd_nomenclature.sql

  • synthese_with_metadata, which provides triggers to insert data into synthese and populate most of the metadata (acquisition frameworks, datasets, actors such as organisms and roles, territories, etc.). A sample source query is provided in the file geonature_export_sinp_with_metadata.sql

# GN2PG configuration file

# Local db configuration
[db]
db_host = "localhost"
db_port = 5432
db_user = "<dbUser>"
db_password = "<dbPassword>"
db_name = "<dbName>"
db_schema_import = "schema"
    # Additional connection options (optional)
    [db.db_querystring]
    sslmode = "prefer"


# Source configuration.
# Duplicate this block for each source (1 source = 1 export)
[[source]]
# Source name, will be used to tag stored data in the import table
name = "Source1"
# GeoNature source login
user_name = "<monuser>"
# GeoNature source password
user_password = "<monPwd>"
# GeoNature source URL
url = "<http://geonature1/>"
# GeoNature source Export id
export_id = 1
# Data type is optional. By default the value is 'synthese', and the triggers
# from to_gnsynthese.sql are not activated.
# To insert your data into a GeoNature database, choose either
# 'synthese_with_cd_nomenclature' or 'synthese_with_metadata'.
# Otherwise, delete this line.
data_type = "synthese_with_cd_nomenclature"


[[source]]
# Source configuration
name = "Source2"
user_name = "<monuser>"
user_password = "<monPwd>"
url = "<http://geonature2/>"
export_id = 1
data_type = "synthese_with_cd_nomenclature"

Init DB schema and tables

To create the JSON tables where data will be stored, run:

gn2pg_cli --json-tables-create <myconfigfile>

Full download

To download all data from the API, run:

gn2pg_cli --full <myconfigfile>

Incremental download

To update data since the last download, run:

gn2pg_cli --update <myconfigfile>

To automate updates, you can create a cron task using the following command, for example running every 30 minutes:

*/30 * * * * /usr/bin/env bash -c "source <path to python environment>/bin/activate && gn2pg_cli --update <myconfigfile>" > /dev/null 2>&1

Debug mode

Debug mode can be activated using the --verbose CLI argument.

Logs

Log files are stored in $HOME/.gn2pg/log directory.

Import data into a GeoNature database

The default script to auto-populate GeoNature is called “to_gnsynthese”.

gn2pg_cli --custom-script to_gnsynthese <myconfigfile>

Contributing

All development must be done in forks (see the GitHub documentation).

Pull requests must target the dev branch.

Install the project and development requirements (requires Poetry):

poetry install

Make your changes and open pull requests.

Test gn2pg_cli in dev mode by running this command:

poetry run gn2pg_cli <options>

Regenerate the requirements file for non-Poetry developers:

poetry export -f requirements.txt > requirements.txt

Licence

GNU AGPLv3

Team

LPO Auvergne-Rhône-Alpes (LPO AuRA)

With the financial support of the DREAL Auvergne-Rhône-Alpes.


Project details


Download files

Download the file for your platform.

Source Distribution

gn2pg_client-1.1.0.tar.gz (42.3 kB)


Built Distribution

gn2pg_client-1.1.0-py3-none-any.whl (43.2 kB)


File details

Details for the file gn2pg_client-1.1.0.tar.gz.

File metadata

  • Download URL: gn2pg_client-1.1.0.tar.gz
  • Size: 42.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.12 CPython/3.8.10 Linux/5.13.0-39-generic

File hashes

Hashes for gn2pg_client-1.1.0.tar.gz

  • SHA256: bf3c482fba1782098d90944053ae49dae21d012bc6ca56f42fb79275df0ab3b6
  • MD5: a2c910a02f008bc8715e38c8b1aea7de
  • BLAKE2b-256: 8402d455a2c1cd04815621f3b7640edd3f3c070a7fefefa66f7d7bd9a99b5872


File details

Details for the file gn2pg_client-1.1.0-py3-none-any.whl.

File metadata

  • Download URL: gn2pg_client-1.1.0-py3-none-any.whl
  • Size: 43.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.12 CPython/3.8.10 Linux/5.13.0-39-generic

File hashes

Hashes for gn2pg_client-1.1.0-py3-none-any.whl

  • SHA256: 7cd5e917c4d9722c391e0322e512d93548ff9dad9516b711b1950f7ff5b7d291
  • MD5: 9d278f9a85fc56fc95a699dcbf9c46e1
  • BLAKE2b-256: db73452f862ad14cd6acd189baeab3217446c908d4a11579fdec94d150e4ba7e

