
A CLI client for exporting Elasticsearch data to CSV

Project description


This project provides a simple CLI command for exporting data from Elasticsearch, PostgreSQL, and similar sources, using multiple CPU cores in parallel. It's intended to be used in data workflows for extracting data.

Note

This project is still early in development and a bit rough around the edges. Bug reports, feature suggestions, etc. are greatly appreciated. :)

Installation and usage

Installation

Since this is a Python package available on PyPI, you can install it like any other Python package.

# on modern systems with Python you can install with pip
$ pip install bq-sqoop
# on older systems you can install using easy_install
$ easy_install bq-sqoop

Usage

The commands should be mostly self-documenting, and their definitions are made available through the help flag.

$ bq-sqoop
usage: bq-sqoop -h

arguments:
    -h, --help            show this help message and exit
    -v, --version         Show version and exit.
    -c CONFIG_FILE, --config_file CONFIG_FILE
                            TOML config file for the bq-sqoop job. Can be a local
                            file path, a public HTTP link, or a GCS file, e.g.
                            https://storage.googleapis.com/sample_config.toml or
                            gs://gcs_bucket/sample_config.toml or
                            /tmp/sample_config.toml
    -d, --debug           Debug mode on.
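As the help text notes, the config file can live in three kinds of locations: a local path, an HTTP(S) URL, or a GCS path. A minimal sketch of how a location could be classified by its scheme (the helper name and dispatch logic here are hypothetical illustrations, not the tool's actual implementation):

```python
from urllib.parse import urlparse

def config_source(path: str) -> str:
    """Classify a config-file location by URL scheme (hypothetical helper)."""
    scheme = urlparse(path).scheme
    if scheme == "gs":
        return "gcs"    # e.g. gs://gcs_bucket/sample_config.toml
    if scheme in ("http", "https"):
        return "http"   # e.g. https://storage.googleapis.com/sample_config.toml
    return "local"      # e.g. /tmp/sample_config.toml
```

All three example locations from the help text map cleanly: `gs://…` to GCS, `https://…` to HTTP, and a bare path to the local filesystem.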

Configuration files

You can find an example repository at https://github.com/therako/bqsqoop-examples.git

Configuration objects

  1. Bigquery
  2. Extractor

Bigquery

[bigquery]
project_id="destination-google-project-id"
dataset_name="destination-dataset"
table_name="destination-table-name"
gcs_tmp_path="gs://gcs-tmp-bucket/bqsqoop/"

Extractor

Elasticsearch

[extractor.elasticsearch]
url="localhost:9200,localhost:9201"
index="source-es-index-name"
timeout="60s"
scroll_size=500
fields=["_all"]
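The url value packs multiple Elasticsearch hosts into one comma-separated string. A sketch of how such a value could be split into individual host:port entries before building a client (assumed behavior for illustration, not taken from the tool's source):

```python
def parse_es_hosts(url: str) -> list[str]:
    """Split a comma-separated host string into individual host:port entries."""
    return [host.strip() for host in url.split(",") if host.strip()]

hosts = parse_es_hosts("localhost:9200,localhost:9201")
# hosts == ["localhost:9200", "localhost:9201"]
```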

SQL

[extractor.sql]
sql_bind="postgresql+psycopg2://username:password@127.0.0.1:5432/database"
query="select * from table_name"
