Hadoop HDFS CLI
Project description
dfs_tool
It is an HDFS CLI tool that you can use to manage your HDFS file system.
It calls the WebHDFS API.
Configuration
You need to provide a config file. By default, the config file is at ~/.dfs_tool/config.json; however, you can change its location by setting the environment variable DFS_TOOL_CFG.
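For example, you can point dfs_tool at a config file in a non-default location before running it (the path below is only an illustration):

# Use a config file outside the default ~/.dfs_tool/config.json for this shell session
export DFS_TOOL_CFG=/etc/dfs_tool/prod_config.json
dfs_tool ls /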
The configuration looks like this:
{
"api_base_uri": "https://my_hdfs_cluster.com/gateway/ui/webhdfs/v1/",
"username": "superman",
"password": "mypassword",
"io_chunk_size": 16777216
}
In some cases the server authenticates clients with a certificate; in that case, you can configure the tool like this:
{
"api_base_uri": "https://my_hdfs_cluster.com/gateway/ui/webhdfs/v1/",
"auth_cert": "/Users/shizhong/.dfs_tool/sso_client.cer",
"auth_key" : "/Users/shizhong/.dfs_tool/sso_client.key",
"ca_cert" : "/Users/shizhong/.dfs_tool/CombinedDigicertCA.cer"
}
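If certificate authentication fails, one generic check (not specific to dfs_tool, and assuming PEM-encoded files like those in the example above) is to confirm that the client certificate and private key actually match by comparing their moduli with OpenSSL:

# The two digests should be identical if the certificate and key belong together
openssl x509 -noout -modulus -in ~/.dfs_tool/sso_client.cer | openssl md5
openssl rsa -noout -modulus -in ~/.dfs_tool/sso_client.key | openssl md5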
- api_base_uri: the WebHDFS endpoint of your cluster.
- username: your HDFS account username.
- password: your HDFS account password.
- io_chunk_size: optional; if not set, the default value is 1048576. It is the chunk size for downloading data from HDFS or uploading data to HDFS; you may want to bump this value if your bandwidth is high.
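Because dfs_tool talks to the WebHDFS REST API, you can verify the endpoint and credentials independently with curl before configuring the tool (a sketch using the placeholder values from the config above, and assuming the gateway accepts HTTP basic authentication):

# A JSON FileStatuses response confirms that the endpoint and credentials work
curl -u superman:mypassword \
  "https://my_hdfs_cluster.com/gateway/ui/webhdfs/v1/tmp?op=LISTSTATUS"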
Commands supported
dfs_tool ls [-R] <remote_path> -- list directory or file
dfs_tool download [-R] <remote_filename> <local_path> -- download file
dfs_tool cat <remote_filename> -- cat a file
dfs_tool mkdir <remote_dir_name> -- make a directory
dfs_tool rm [-RM] <remote_path> -- remove a file or directory
dfs_tool upload [-R] [-F] <local_filename> <remote_path> -- upload file
dfs_tool mv <source_location> <destination_location> -- move file or directory
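A few example invocations following the synopsis above (all paths are illustrative):

dfs_tool mkdir /tmp/reports                  # create a remote directory
dfs_tool upload report.csv /tmp/reports      # upload a single file
dfs_tool ls -R /tmp/reports                  # list the directory recursively
dfs_tool cat /tmp/reports/report.csv         # print the file to stdout
dfs_tool download -R /tmp/reports ./reports  # download the whole directory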
Options
Some commands support options:
- -R: "recursive".
- -F: "force". In the upload command, when -F is specified, it overwrites a file that already exists at the destination.
- -M: in the rm command, lets you specify a pattern to match the files you want to delete, for example:

dfs_tool rm -M "/tmp/*.parquet"
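For example, to upload a local directory recursively and overwrite files that already exist on the remote side (hypothetical paths, combining the -R and -F options described above):

dfs_tool upload -R -F ./local_reports /tmp/reports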
Project details
Download files
Source Distribution
Built Distribution
File details
Details for the file dfs_tool-0.0.10.tar.gz.
File metadata
- Download URL: dfs_tool-0.0.10.tar.gz
- Upload date:
- Size: 6.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.6.8
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5ee8bdbbe3346939d99fd9591668096524839a2e04f1af344cb8ebac3d66888a
MD5 | 96a41008a43462313ebca5bf1480f7f8
BLAKE2b-256 | 16133e0cc46beb07ea3f386b1f4f0d72738e04a80bd0e20deedc3647f5cb9dae
File details
Details for the file dfs_tool-0.0.10-py2.py3-none-any.whl.
File metadata
- Download URL: dfs_tool-0.0.10-py2.py3-none-any.whl
- Upload date:
- Size: 8.1 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.6.8
File hashes
Algorithm | Hash digest
---|---
SHA256 | 8f131f4e944b7a0225d5e08282edb1a068efd25be741263f7725cb8def810b27
MD5 | 9354668c1562836464c3a41b2ab48250
BLAKE2b-256 | 872e1f6a7b71c8d266c68f8d2f8e50e653cbdb8b9e49394f4b907b731023d511