
Project description

Get identifiers, names, paths, URLs and words from the command output.
The xontrib-output-search extension for the xonsh shell uses this library.

If you like the idea, click ⭐ on the repo and stay tuned by watching releases.

Install

pip install -U tokenize-output

Usage

Words tokenizing

$ echo "Try https://github.com/xxh/xxh" | tokenize-output -p
Try
https://github.com/xxh/xxh

JSON, Python dict and JavaScript object tokenizing

$ echo '{"Try": "xonsh shell"}' | tokenize-output -p
Try
shell
xonsh
xonsh shell
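The dict tokenizing behavior can be sketched in plain Python (an illustrative reimplementation, not the library's actual code): parse the text as JSON, then emit every key, every value, and the individual words of multi-word string values.

```python
import json

def tokenize_dict(text):
    """Illustrative sketch of dict tokenizing (not the library's
    actual implementation): emit keys, values, and the words of
    multi-word string values."""
    try:
        data = json.loads(text)
    except ValueError:
        return set()
    tokens = set()

    def walk(obj):
        if isinstance(obj, dict):
            for key, value in obj.items():
                tokens.add(str(key))
                walk(value)
        elif isinstance(obj, list):
            for item in obj:
                walk(item)
        else:
            text_value = str(obj)
            tokens.add(text_value)
            tokens.update(text_value.split())  # words of multi-word values

    walk(data)
    return tokens

print(sorted(tokenize_dict('{"Try": "xonsh shell"}')))
# ['Try', 'shell', 'xonsh', 'xonsh shell']
```

This reproduces the four tokens shown in the example above.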

env tokenizing

$ echo 'PATH=/one/two:/three/four' | tokenize-output -p
/one/two
/one/two:/three/four
/three/four
PATH
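The env tokenizing above can be sketched as follows (an illustrative reimplementation, not the library's actual code): match `NAME=value` pairs, then emit the name, the full value, and the colon-separated parts of the value.

```python
import re

def tokenize_env(text):
    """Illustrative sketch of env tokenizing (not the library's
    actual implementation): split NAME=value pairs into the name,
    the full value, and the colon-separated parts of the value."""
    tokens = set()
    for match in re.finditer(r"([A-Za-z_][A-Za-z0-9_]*)=(\S+)", text):
        name, value = match.group(1), match.group(2)
        tokens.add(name)
        tokens.add(value)
        if ":" in value:
            tokens.update(value.split(":"))
    return tokens

print(sorted(tokenize_env("PATH=/one/two:/three/four")))
# ['/one/two', '/one/two:/three/four', '/three/four', 'PATH']
```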

Development

Tokenizers

A tokenizer is a function that extracts tokens from text.

Priority  Tokenizer  Text                   Tokens
1         dict       {"key": "val as str"}  ['key', 'val as str']
2         env        PATH=/bin:/etc         ['PATH', '/bin:/etc', '/bin', '/etc']
3         split      Split me \n now!       ['Split', 'me', 'now!']
4         strip      {Hello}                ['Hello']
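A custom tokenizer can be sketched like this (the function name and return shape here are hypothetical, for illustration only; check tokenize_output.py for the real signature). It follows the final/new convention described below: tokens that need no further processing are final, tokens that other tokenizers should see again are new.

```python
def tokenizer_braces(text):
    """Hypothetical custom tokenizer (illustrative, not the library's
    API): strip surrounding braces and hand back the inner text as a
    *new* token so the other tokenizers can process it further."""
    if len(text) > 2 and text.startswith("{") and text.endswith("}"):
        return {"final": set(), "new": {text[1:-1]}}
    return {"final": set(), "new": set()}

print(tokenizer_braces("{Hello}"))
# {'final': set(), 'new': {'Hello'}}
```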

You can create your own tokenizer and add it to tokenizers_all in tokenize_output.py.

Tokenizing is a recursive process in which every tokenizer returns final and new tokens. Final tokens go directly to the resulting list of tokens. New tokens are fed to all tokenizers again to search for further tokens. As a result, if the output contains a mix of JSON and env data, both will be found and tokenized in the appropriate way.
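The recursive process can be sketched as a driver loop (illustrative, not the library's implementation; the two toy tokenizers and their names are hypothetical): final tokens are collected, new tokens are fed back through every tokenizer, and a seen-set guards against re-tokenizing the same chunk.

```python
def tokenize(text, tokenizers):
    """Sketch of the recursive tokenizing process described above:
    collect final tokens, feed new tokens back through all tokenizers."""
    result = set()
    queue = [text]
    seen = set()
    while queue:
        chunk = queue.pop()
        if chunk in seen:          # guard against re-tokenizing a chunk
            continue
        seen.add(chunk)
        for tokenizer in tokenizers:
            out = tokenizer(chunk)
            result |= out["final"]
            queue.extend(out["new"])
    return result

# Two toy tokenizers with hypothetical names:
def split_words(text):
    """Multi-word text yields its words as new tokens; a single word is final."""
    words = text.split()
    if len(words) > 1:
        return {"final": set(), "new": set(words)}
    return {"final": set(words), "new": set()}

def strip_braces(text):
    """Braced text yields the inner text as a new token."""
    if len(text) > 2 and text.startswith("{") and text.endswith("}"):
        return {"final": set(), "new": {text[1:-1]}}
    return {"final": set(), "new": set()}

print(sorted(tokenize("{Hello} world", [split_words, strip_braces])))
# ['Hello', 'world', '{Hello}']
```

Note that the raw token '{Hello}' also survives alongside the stripped 'Hello': in this sketch every tokenizer sees every chunk, so both the raw and the cleaned form end up in the result.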

Test and debug

Run tests:

cd ~
git clone https://github.com/tokenizer/tokenize-output
cd tokenize-output
python -m pytest tests/

To debug the tokenizer:

echo "Hello world" | ./tokenize_output -p

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

tokenize-output-0.4.3.tar.gz (5.3 kB)

Uploaded Source

Built Distribution

tokenize_output-0.4.3-py3-none-any.whl (5.6 kB)

Uploaded Python 3

File details

Details for the file tokenize-output-0.4.3.tar.gz.

File metadata

  • Download URL: tokenize-output-0.4.3.tar.gz
  • Upload date:
  • Size: 5.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.1.0 requests-toolbelt/0.9.1 tqdm/4.48.2 CPython/3.8.5

File hashes

Hashes for tokenize-output-0.4.3.tar.gz
Algorithm Hash digest
SHA256 420564be4a58a48f8a266cfd396a186d1ced480f02d65cd69c85e800b1a51701
MD5 34783e4c0ea958a68cea90d5b7120d14
BLAKE2b-256 7672948560a180d68d619afcb8e8f18301e4c779d39d9313adef46e744fc1312

See more details on using hashes here.

File details

Details for the file tokenize_output-0.4.3-py3-none-any.whl.

File metadata

  • Download URL: tokenize_output-0.4.3-py3-none-any.whl
  • Upload date:
  • Size: 5.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.1.0 requests-toolbelt/0.9.1 tqdm/4.48.2 CPython/3.8.5

File hashes

Hashes for tokenize_output-0.4.3-py3-none-any.whl
Algorithm Hash digest
SHA256 944c9edda752e867d98aeb52b885b85327c785c729c80a28cba11924a80cae9a
MD5 6ad0effa004fca881202764cec538d61
BLAKE2b-256 3e993e5643193b88f4a818170f4c74d9f56ce9814973a9cce03bc1773239809d

