
Tiny DSL to generate training datasets for NLU engines


Installation

pip

$ pip install pychatl

source

$ git clone https://github.com/atlassistant/chatl.git
$ cd chatl/python
$ python setup.py install

or

$ pip install -e .

Usage

From the terminal

$ pychatl example/forecast.dsl example/lights.dsl -a snips -o '{ "language": "en" }'
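
If the selected adapter prints the generated JSON to standard output (an assumption here, not stated above), plain shell redirection is enough to capture it in a file ready for training:

$ pychatl example/forecast.dsl example/lights.dsl -a snips -o '{ "language": "en" }' > snips_dataset.json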

From the code

from pychatl import parse

result = parse("""
# pychatl is really easy to understand.
#
# You can define:
#   - Intents
#   - Entities (with or without variants)
#   - Synonyms
#   - Comments (only at the top level)

# Inside an intent, you define training data.
# Training data can refer to one or more entities and/or synonyms, which will be used
# by generators to produce all possible permutations and training samples.

%[my_intent]
  ~[greet] some training data @[date]
  another training data that uses an @[entity] at @[date#with_variant]

~[greet]
  hi
  hello

# Entities contain available samples and can refer to a synonym.

@[entity]
  some value
  other value
  ~[a synonym]

# Synonyms contain only raw values.

~[a synonym]
  possible synonym
  another one

# Entities and intents can define arbitrary properties that will be made available
# to generators.
# For snips, for example, `type`, `extensible` and `strictness` are used.
# If the type value cannot be found in the entity declarations, it is assumed to be a builtin one,
# and for snips the 'snips/' prefix is prepended automatically.

@[date](type=datetime)
  tomorrow
  today

# Variants are used only to generate training samples with specific values that should
# map to the same entity name, here `date`. Props will be merged with the root entity.

@[date#with_variant]
  the end of the day
  nine o clock
  twenty past five
""")

# Now that you have a parsed dataset, you may want to process it for a specific NLU engine.

from pychatl.postprocess import snips

snips_dataset = snips(result) # Or give options with `snips(result, language='en')`

# And now your dataset is ready to be fitted with snips-nlu!
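
A minimal sketch of that last step, assuming the snips-nlu package is installed and that the generated dataset matches the format expected by SnipsNLUEngine.fit (the sample utterance below is purely illustrative):

import json

from snips_nlu import SnipsNLUEngine

# Train a fresh engine on the dataset produced by the snips adapter above.
engine = SnipsNLUEngine()
engine.fit(snips_dataset)

# Parse an utterance resembling the training data declared in the DSL.
parsing = engine.parse("hello some training data tomorrow")
print(json.dumps(parsing, indent=2))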

Testing

$ pip install -e .[test]
$ python -m nose --with-doctest -v --with-coverage --cover-package=pychatl

