
Tiny DSL to generate training datasets for NLU engines

Project description

Installation

Install from PyPI:

$ pip install pychatl

Or install from source:

$ git clone
$ cd chatl/python
$ python setup.py install

For development, install in editable mode instead:

$ pip install -e .


From the terminal

usage: pychatl [-h] [--version] [-a ADAPTER] [-m MERGE] [--pretty]
               files [files ...]

Generates training dataset from a simple DSL.

positional arguments:
  files                 One or more DSL files to process

optional arguments:
  -h, --help            show this help message and exit
  --version             show program's version number and exit
  -a ADAPTER, --adapter ADAPTER
                        Name of the adapter to use
  -m MERGE, --merge MERGE
                        Options file to merge with the final result
  --pretty              Pretty output
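For example, to process a DSL file with the snips adapter and write a pretty-printed dataset to disk (the file name `weather.dsl` is illustrative):

```shell
$ pychatl weather.dsl --adapter snips --pretty > dataset.json
```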

From the code

from pychatl import parse

# DSL sample reconstructed from fragments; the intent name and some values are illustrative
result = parse("""
%[get_forecast]
  will it rain in @[city] @[dateStart]

@[city]
  ~[new york]

@[dateStart]
  at the end of the day

~[new york]
  new york city
""")

# Now you have a parsed dataset, which you may want to process for a specific NLU engine

from pychatl.adapters import snips

snips_dataset = snips(result)  # Or give options with `snips(result, language='en')`

# And now your dataset is ready to be fitted within snips-nlu!
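As context for the template syntax above: `@[...]` marks an entity and `~[...]` a synonym. A tiny illustrative sketch (not part of the pychatl API) of how such markers can be pulled out of a template line:

```python
import re

# Matches @[entity] and ~[synonym] markers in a chatl-style template line.
# Illustrative helper only; pychatl's real parser handles the full grammar.
MARKER = re.compile(r"([@~])\[([^\]]+)\]")

def extract_markers(line):
    """Return (kind, name) pairs, where kind is 'entity' or 'synonym'."""
    kinds = {"@": "entity", "~": "synonym"}
    return [(kinds[sigil], name) for sigil, name in MARKER.findall(line)]

print(extract_markers("will it rain in @[city] @[dateStart]"))
# → [('entity', 'city'), ('entity', 'dateStart')]
```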


Testing

$ pip install -e .[test]
$ python -m nose --with-doctest --with-coverage --cover-package=pychatl

Project details

Latest release: pychatl 2.0.4, source distribution pychatl-2.0.4.tar.gz (17.1 kB).
