`target-db2` is a Singer target for IBM Db2, built with the Meltano Singer SDK.
target-db2
`target-db2` is a Singer target for IBM Db2, built with the Meltano Target SDK by the Infostrux team.
Installation
Install from PyPI:
pipx install target-db2
Install from GitHub:
pipx install git+https://github.com/Infostrux-Solutions/target-db2.git@main
Install via Meltano Configuration (meltano.yml)
loaders:
- name: target-db2
namespace: target_db2
# uncomment one of the following
# pip_url: git+https://github.com/Infostrux-Solutions/target-db2.git@main
# pip_url: target-db2
Complete the Meltano installation with the appropriate configuration settings described in the Settings section below.
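For example, a minimal sketch of a configured loader entry in meltano.yml (connection values are placeholders; the setting names come from the Settings table below):

loaders:
- name: target-db2
  namespace: target_db2
  pip_url: target-db2
  config:
    host: db2.example.com   # placeholder host
    port: 50000             # placeholder port; adjust to your Db2 instance
    user: etl_user          # placeholder user name
    database: TESTDB        # placeholder database name
    # keep the password out of meltano.yml; supply it via an environment
    # variable (e.g. TARGET_DB2_PASSWORD) instead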
Configuration
Capabilities
- about
- stream-maps
- schema-flattening
- validate-records
Settings
Setting | Required | Default | Description |
---|---|---|---|
host | True | None | IBM Db2 Database Host |
port | True | None | IBM Db2 Database Port |
user | True | None | IBM Db2 Database User Name |
password | True | None | IBM Db2 Database User Password |
database | True | None | IBM Db2 Database Name |
varchar_size | False | None | Field size for Varchar type. Default 10000. Since JSON values are serialized to varchar, it may be necessary to increase this value. Max possible value 32764 |
add_record_metadata | False | None | Add metadata to records. |
load_method | False | TargetLoadMethods.APPEND_ONLY | The method to use when loading data into the destination. append-only will always write all input records, whether or not the record already exists. upsert will update existing records and insert new records. overwrite will delete all existing records and insert all input records. |
batch_size_rows | False | None | Maximum number of rows in each batch. |
validate_records | False | 1 | Whether to validate the schema of the incoming streams. |
stream_maps | False | None | Config object for stream maps capability. For more information check out Stream Maps. |
stream_map_config | False | None | User-defined config values to be used within map expressions. |
faker_config | False | None | Config for the Faker instance variable fake used within map expressions. Only applicable if the plugin specifies faker as an additional dependency (through the singer-sdk faker extra or directly). |
faker_config.seed | False | None | Value to seed the Faker generator for deterministic output: https://faker.readthedocs.io/en/master/#seeding-the-generator |
faker_config.locale | False | None | One or more LCID locale strings to produce localized output for: https://faker.readthedocs.io/en/master/#localization |
flattening_enabled | False | None | 'True' to enable schema flattening and automatically expand nested properties. |
flattening_max_depth | False | None | The max depth to flatten schemas. |
A full list of supported settings and capabilities is available by running: target-db2 --about
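When invoking target-db2 directly with --config /path/to/target-db2-config.json (see Usage below), the same settings go in a JSON config file. A minimal sketch with placeholder values only:

{
  "host": "db2.example.com",
  "port": 50000,
  "user": "etl_user",
  "password": "change-me",
  "database": "TESTDB"
}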
Supported Python Versions
- 3.8
- 3.9
- 3.10
- 3.11
- 3.12
Configure using environment variables
This Singer target will automatically import any environment variables defined in the working directory's `.env` file when `--config=ENV` is provided, so config values will be picked up from matching environment variables set either in the terminal context or in the `.env` file.
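The Singer SDK convention is to map each setting to an upper-cased environment variable prefixed with TARGET_DB2_ (verify the exact names with target-db2 --about). A sketch of a .env file using placeholder values:

TARGET_DB2_HOST=db2.example.com
TARGET_DB2_PORT=50000
TARGET_DB2_USER=etl_user
TARGET_DB2_PASSWORD=change-me
TARGET_DB2_DATABASE=TESTDB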
Db2 Authentication and Authorization
Currently, only username / password (UID / PWD) based authentication is supported. If you need support for additional authentication mechanisms, please open an issue.
The username and password can be provided through `meltano.yml` or through `target-db2`'s config.json. The user must have the following permissions in order to load data into Db2.
Minimal Permissions Required on DB2
This library will perform the following actions on DB2.
- CREATE TABLE
- DROP TABLE
- ALTER TABLE ADD COLUMN
- ALTER TABLE ALTER COLUMN
- INSERT INTO TABLE
- MERGE INTO TABLE USING
- [OPTIONALLY] CREATE SCHEMA
NOTE: CREATE SCHEMA is used to create a new schema where data will be loaded. If the target schema, specified via `default_target_schema`, already exists, this library will not issue a CREATE SCHEMA command.
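As an illustrative sketch only (not taken from the project documentation), grants along the following lines on Db2 LUW can cover the actions above for a hypothetical user etl_user loading into a hypothetical schema target_schema; confirm the exact privileges required with your DBA:

-- hypothetical user, schema, and tablespace names; adjust for your environment
GRANT CONNECT ON DATABASE TO USER etl_user;
GRANT CREATETAB ON DATABASE TO USER etl_user;
GRANT USE OF TABLESPACE USERSPACE1 TO USER etl_user;
GRANT CREATEIN, ALTERIN, DROPIN ON SCHEMA target_schema TO USER etl_user;
-- explicitly creating a new schema (CREATE SCHEMA) may require additional
-- authority such as DBADM; see the IBM Db2 documentation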
Known Limitations & Issues
Complex Data Structures (arrays & maps)
Complex values such as `dict` or `list` will be JSON-encoded and stored as VARCHAR. The VARCHAR column has a default size of 10000, which is user-configurable via the `varchar_size` setting.
IBM Db2 allows VARCHAR columns up to 32704 bytes.
This target currently does not write to CLOB fields; PRs welcome!
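If large JSON-encoded values are truncated or rejected, varchar_size can be raised in the loader configuration; a sketch in meltano.yml (the value is illustrative):

loaders:
- name: target-db2
  config:
    varchar_size: 30000   # illustrative; must stay within Db2's VARCHAR limit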
Usage
You can easily run `target-db2` by itself or in a pipeline using Meltano.
Executing the Target Directly
Note: the Carbon Intensity example below requires installing tap-carbon-intensity.
target-db2 --version
target-db2 --help
# Test using the "Carbon Intensity" sample:
tap-carbon-intensity | target-db2 --config /path/to/target-db2-config.json
Developer Resources
Follow these instructions to contribute to this project.
Initialize your Development Environment
pipx install poetry
poetry install
Create and Run Tests
Create tests within the `tests` subfolder and then run:
poetry run pytest
You can also test the `target-db2` CLI interface directly using `poetry run`:
poetry run target-db2 --help
Testing with Meltano
Note: This target will work in any Singer environment and does not require Meltano. Examples here are for convenience and to streamline end-to-end orchestration scenarios.
This project is supplied with a custom `meltano.yml` project file, as well as a docker-compose.yml file to set up source and target systems for end-to-end testing.
Next, install Meltano (if you haven't already) and any needed plugins:
# Install meltano
pipx install meltano
# Initialize meltano within this directory
cd target-db2
meltano install
Now you can test and orchestrate using Meltano:
# Test invocation:
meltano invoke target-db2 --version
# OR run a test `elt` pipeline with the Carbon Intensity sample tap:
docker compose up -d
meltano run tap-carbon-intensity target-db2
# after testing is complete, remember to shut down the container resources
docker compose down
# OR run a test `elt` pipeline with the supplied postgresql
# source & sample data generator
docker compose up -d
python generate_postgresql_data.py
meltano run tap-postgres target-db2
# after testing is complete, remember to shut down the container resources
docker compose down
SDK Dev Guide
See the dev guide for more instructions on how to use the Meltano Singer SDK to develop your own Singer taps and targets.
File details
Details for the file target_db2-0.1.5.tar.gz.
File metadata
- Download URL: target_db2-0.1.5.tar.gz
- Upload date:
- Size: 38.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 6b2b0097f1d8c4881b5f48ca01d2f7d030309079d5d939d6d62d8b1ee121bb2b
MD5 | eda29cdfb2d525feeeb5b6530567b1cb
BLAKE2b-256 | b702fc44b4b95c91eadaf44e67b058fde04cc575cd075d27e1997414c9b0c78d
File details
Details for the file target_db2-0.1.5-py3-none-any.whl.
File metadata
- Download URL: target_db2-0.1.5-py3-none-any.whl
- Upload date:
- Size: 40.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | cb35f00b95ee66e798cd2cc257fd692a5a0452ab7f466225be951eb21d24b10e
MD5 | 933b618f3c186886983339543a9707b2
BLAKE2b-256 | a06712acc40ef2dbab38a110b1e3e2208d2cded0631c033eb860b7be9ffd21d7