A simple package to let you Sqoop into HDFS/Hive/HBase with Python
Project description
pysqoop
A Python package that lets you Sqoop data from an RDBMS into HDFS/Hive/HBase.
To install the package via pip, run
pip install pysqoop
You can then use the package as follows:
from pysqoop.SqoopImport import Sqoop
sqoop = Sqoop(help=True)
code = sqoop.perform_import()
This will print the output of the command
sqoop --help
to your stdout, e.g.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.3.0-235/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.3.0-235/accumulo/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/08/13 20:25:13 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.3.0-235
usage: sqoop import [GENERIC-ARGS] [TOOL-ARGS]
Common arguments:
   --connect <jdbc-uri>                 Specify JDBC connect string
   --connection-manager <class-name>    Specify connection manager class name
...
Useful Resources
- HBase client for Python: happybase
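For example, once a table has been Sqooped into HBase, happybase can be used to inspect it from Python. A minimal sketch, assuming an HBase Thrift server is reachable; the hostname and table name are placeholders:
import happybase

# Placeholder host running the HBase Thrift server.
connection = happybase.Connection('hbase-thrift-host')
table = connection.table('MyTable')

# Print the first few imported rows.
for row_key, columns in table.scan(limit=5):
    print(row_key, columns)

connection.close()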
A more concrete example
The following code
sqoop = Sqoop(fs='hdfs://remote-cluster:8020', hive_drop_import_delims=True, fields_terminated_by='\;',
enclosed_by='\'"\'', escaped_by='\\\\', null_string='\'\'', null_non_string='\'\'',
table='sample_table', target_dir='hdfs://remote-cluster/user/hive/warehouse/db/sample_table',
delete_target_dir=True, connect='jdbc:oracle:thin:@//your_ip:your_port/your_schema',
username='user', password='pwd', num_mappers=2,
bindir='/path/to/bindir/folder')
sqoop.perform_import()
will execute the following command
sqoop import -fs hdfs://remote-cluster:8020 --hive-drop-import-delims --fields-terminated-by \; --enclosed-by \'\"\' --escaped-by \\\\ --null-string \'\' --null-non-string \'\' --table sample_table --target-dir hdfs://remote-cluster/user/hive/warehouse/db/sample_table --delete-target-dir --connect jdbc:oracle:thin:@//your_ip:your_port/your_schema --username user --password pwd --num-mappers 2 --bindir /path/to/bindir/folder
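Because a Sqoop object is built from plain keyword arguments, the same configuration is easy to reuse in a loop. A minimal sketch using only parameters from the example above; the table names and connection details are placeholders:
from pysqoop.SqoopImport import Sqoop

# Placeholder table names; connection details reused from the example above.
for table in ["sample_table", "another_table"]:
    sqoop = Sqoop(connect='jdbc:oracle:thin:@//your_ip:your_port/your_schema',
                  username='user', password='pwd', num_mappers=2,
                  table=table,
                  target_dir='hdfs://remote-cluster/user/hive/warehouse/db/' + table,
                  delete_target_dir=True)
    sqoop.perform_import()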
Conditional Building
Use the set_param and unset_param functions to build conditional Sqoop imports.
from pysqoop.SqoopImport import Sqoop

target_is_hbase = True  # example flag; set according to your destination

sqoop = Sqoop(table="MyTable")
sqoop.set_param(param="--connect", value="jdbc:a_valid_string")

# Only add the HBase parameters when the target is HBase.
if target_is_hbase:
    added_table = sqoop.set_param(param="--hbase-table", value="MyTable")
    added_key = sqoop.set_param(param="--hbase-row-key", value="Id_MyTable")
    if added_table and added_key:
        print("all params added :D")

sqoop.perform_import()
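The prose above also mentions unset_param; here is a minimal sketch of removing a previously set option with it. The exact call form is an assumption, mirroring set_param's keyword style:
sqoop = Sqoop(table="MyTable", hive_drop_import_delims=True)

# Assumed call form, mirroring set_param: drop the flag again by name.
sqoop.unset_param(param="--hive-drop-import-delims")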
Unit Testing
To run the unit tests, open a terminal and change the current directory to the unittests folder.
Then run python unintary_tests.py. Add your unit tests to this file.
Doing
- handle sqoop jobs
TODOs
- add missing parameters
- more test coverage
File details
Details for the file pysqoop-0.0.16.tar.gz.
File metadata
- Download URL: pysqoop-0.0.16.tar.gz
- Upload date:
- Size: 5.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/2.0.0 pkginfo/1.4.2 requests/2.25.1 setuptools/47.1.1.post20200604 requests-toolbelt/0.8.0 tqdm/4.24.0 CPython/3.6.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 1343561071b7b515b1f32141c0e956bc24fb03a669adbfe00f2962217ca22c59 |
| MD5 | 477b24692565f6e3596f34bbf0314a47 |
| BLAKE2b-256 | 27fc8edbc61a8d65e1e238c53aecb8cc168a59dd748d7dcc09ac99b7c406e127 |
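To verify a downloaded archive against the digests above, a minimal sketch using Python's hashlib (the path assumes the file sits in the current directory):
import hashlib

# Path to the downloaded sdist; adjust to wherever the file was saved.
path = "pysqoop-0.0.16.tar.gz"
expected_sha256 = "1343561071b7b515b1f32141c0e956bc24fb03a669adbfe00f2962217ca22c59"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected_sha256 else "hash mismatch")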
File details
Details for the file pysqoop-0.0.16-py3-none-any.whl.
File metadata
- Download URL: pysqoop-0.0.16-py3-none-any.whl
- Upload date:
- Size: 7.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/2.0.0 pkginfo/1.4.2 requests/2.25.1 setuptools/47.1.1.post20200604 requests-toolbelt/0.8.0 tqdm/4.24.0 CPython/3.6.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 62529f181997c078ed6fa0f5c08506e26ce586b6d57abcb3d8d2bceb40dca680 |
| MD5 | e68b088c03e50e0c001db084949c205b |
| BLAKE2b-256 | 8d91c8196f1f7c4e35cfbf929e3d8fe3f724c0d14ffc1ca84d09cbd2359691dd |