Hadoop HDFS CLI
dfs_tool
It is an HDFS CLI tool that you can use to manage your HDFS file system.
It calls the WebHDFS API.
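For example, listing a directory maps to the standard WebHDFS LISTSTATUS operation. Below is a minimal sketch of the request URL such a call would use; the helper function, base URI, and `user.name` query parameter are illustrative assumptions about how the tool talks to WebHDFS, not its actual code:

```python
from urllib.parse import quote, urlencode

def liststatus_url(api_base_uri: str, remote_path: str, username: str) -> str:
    """Build a WebHDFS LISTSTATUS URL for a remote path (illustrative helper)."""
    # WebHDFS addresses files as <base>/<path>?op=...; user.name names the account.
    path = quote(remote_path.lstrip("/"))
    query = urlencode({"op": "LISTSTATUS", "user.name": username})
    return f"{api_base_uri.rstrip('/')}/{path}?{query}"

print(liststatus_url("https://my_hdfs_cluster.com/gateway/ui/webhdfs/v1/",
                     "/user/superman", "superman"))
```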
Configuration
You need to create a config file. By default, the config file is at ~/.dfs_tool/config.json; however, you can change its location by setting the environment variable DFS_TOOL_CFG.
The configuration looks like this:
{
"api_base_uri": "https://my_hdfs_cluster.com/gateway/ui/webhdfs/v1/",
"username": "superman",
"password": "mypassword",
"io_chunk_size": 16777216
}
api_base_uri: You need to put your WebHDFS endpoint here.
username: You need to specify your HDFS account username.
password: You need to specify your HDFS account password.
io_chunk_size: Optional; if not set, the default value is 1048576. It is the chunk size (in bytes) used when downloading data from or uploading data to HDFS. You may want to raise this value if your bandwidth is high.
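Putting the pieces above together, here is a hedged sketch of how such a config could be loaded, honoring the DFS_TOOL_CFG override and the documented io_chunk_size default. This illustrates the documented behavior; the function name and structure are assumptions, not the tool's actual code:

```python
import json
import os

DEFAULT_CHUNK_SIZE = 1048576  # documented default when io_chunk_size is absent

def load_config() -> dict:
    """Load the dfs_tool config, honoring the DFS_TOOL_CFG override."""
    # DFS_TOOL_CFG wins; otherwise fall back to the default location.
    path = os.environ.get("DFS_TOOL_CFG",
                          os.path.expanduser("~/.dfs_tool/config.json"))
    with open(path) as f:
        cfg = json.load(f)
    # Apply the documented default chunk size if the key is missing.
    cfg.setdefault("io_chunk_size", DEFAULT_CHUNK_SIZE)
    return cfg
```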
Commands supported
dfs_tool ls <remote_path> -- list directory or file
dfs_tool download <remote_filename> <local_path> -- download file
dfs_tool cat <remote_filename> -- cat a file
dfs_tool mkdir <remote_dir_name> -- make a directory
dfs_tool rm -R <remote_path> -- remove a file or directory
dfs_tool upload <local_filename> <remote_path> -- upload file
dfs_tool mv <source_location> <destination_location> -- move file or directory
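The io_chunk_size setting matters mostly for the download and upload commands. Assuming downloads use the WebHDFS OPEN operation with its offset and length parameters (a plausible sketch, not the tool's confirmed implementation), a 40 MB file fetched with the 16 MB chunk size from the example config would take three requests:

```python
from urllib.parse import urlencode

def chunk_urls(api_base_uri, remote_file, file_size, chunk_size):
    """Yield one WebHDFS OPEN URL per chunk of the remote file (illustrative)."""
    base = api_base_uri.rstrip("/") + "/" + remote_file.lstrip("/")
    for offset in range(0, file_size, chunk_size):
        # The final chunk may be shorter than chunk_size.
        length = min(chunk_size, file_size - offset)
        yield base + "?" + urlencode(
            {"op": "OPEN", "offset": offset, "length": length})

# Hypothetical 40 MB file, 16 MB chunks: 16 MB + 16 MB + 8 MB.
urls = list(chunk_urls("https://my_hdfs_cluster.com/gateway/ui/webhdfs/v1/",
                       "/data/big.bin", 40 * 2**20, 16 * 2**20))
```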
Download files
Source Distribution: dfs_tool-0.0.1.tar.gz (5.1 kB)
Built Distribution: dfs_tool-0.0.1-py2.py3-none-any.whl
Hashes for dfs_tool-0.0.1-py2.py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 49a64e9a34c33bff11f8800dfef67c58175eaed5b6b6560685616ca4b6fdebe1
MD5 | ce746035d2ca45ce807fc0c1a24a76d9
BLAKE2b-256 | bd5332a4f2fef28e0773a9eeaa195ffcc18fa5e2d00b8e2509fbf56719dafc11