Project description
Neon Transformers
About
Utterance Transformers
An utterance transformer takes utterances and a context as input, then returns the modified utterances and a new context.

- context is simply a Python dictionary; it can contain anything
- utterances is a list of transcription candidates, assumed to be alternatives for a single utterance, not a list of unrelated documents!
A transformer might change the utterances, or simply return them unmodified along with a new context, e.g.:

- the translator transformer will detect the language and translate as necessary; it returns modified utterances
- the NER transformer returns the utterances unmodified, while the context contains the extracted entities
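For illustration, here is a minimal sketch of the interface described above. Real plugins subclass the base class shipped with this package; this sketch only implements the transform(utterances, context) -> (utterances, context) contract, and the "lowercased" behaviour and context key are made-up examples, not part of any actual plugin.

```python
# Minimal sketch of the utterance transformer interface (illustrative only).
class LowercaseTransformer:
    def transform(self, utterances, context=None):
        # never mutate the caller's context dict in place
        context = dict(context or {})
        # modify the transcription candidates
        new_utts = [u.lower() for u in utterances]
        # record what we did in the shared context dictionary
        context["lowercased"] = True
        return new_utts, context


utts, context = LowercaseTransformer().transform(["Hello World"])
print(utts)     # ['hello world']
print(context)  # {'lowercased': True}
```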
Transformers may also depend on other transformers:
```python
from neon_utterance_KeyBERT_plugin import KeyBERTExtractor
from neon_utterance_wn_entailment_plugin import WordNetEntailments

kbert = KeyBERTExtractor()  # or RAKE or YAKE ...
# depends on keywords being tagged by a previous transformer
entail = WordNetEntailments()

utts = ["The man was snoring very loudly"]
_, context = kbert.transform(utts)
_, context = entail.transform(utts, context)
print(context)
# {'entailments': ['exhale', 'inhale', 'sleep']}
```
mycroft integration
Usage with mycroft-core is limited to skills; it is useful for fallback and common_qa skills.
You can import individual transformers directly in skills, as in the sketch below.
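As a rough, hedged example of importing a transformer directly, the fallback skill below runs the KeyBERT plugin itself. The skill and handler names, the priority value, and the assumption that the plugin populates context["keywords"] are illustrative; check the plugin's own documentation for the actual context keys it sets.

```python
# Sketch only: a mycroft fallback skill that loads a transformer directly.
from mycroft import FallbackSkill
from neon_utterance_KeyBERT_plugin import KeyBERTExtractor


class KeywordFallbackSkill(FallbackSkill):
    def initialize(self):
        self.kbert = KeyBERTExtractor()
        # 50 is an arbitrary middle-of-the-road fallback priority
        self.register_fallback(self.handle_fallback, 50)

    def handle_fallback(self, message):
        utt = message.data.get("utterance", "")
        # run the transformer ourselves, since vanilla mycroft-core will not
        _, context = self.kbert.transform([utt])
        keywords = context.get("keywords") or []  # assumed context key
        if not keywords:
            return False  # let other fallback skills try
        self.speak("I picked up on: " + ", ".join(str(k) for k in keywords))
        return True


def create_skill():
    return KeywordFallbackSkill()
```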
neon integration
neon-core integrates the neon_transformers service into the NLP pipeline transparently:
- neon_transformers are integrated into mycroft after STT but before Intent parsing
- all enabled transformer plugins (mycroft.conf) are loaded
- each plugin has a priority that the developer sets and the user can override (mycroft.conf)
- utterances are passed to each transformer sequentially
- utterances are replaced with the text returned by a transformer
- if utterances are transformed, the next transformer receives the transformed utterances
- the context returned by each transformer is merged with the context from previous transformers
- the transformed utterances are passed to the intent stage
- context is available in message.context for skills during intent handling
- skills can add transformers to their requirements.txt
- for compatibility with vanilla mycroft-core, skills should handle message.context as optional data
- if a certain transformer is absolutely necessary, load it directly whenever message.context is missing the expected data (see the sketch after this list)
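To make the last two points concrete, here is a hedged sketch of an intent handler that treats message.context as optional and only loads a transformer directly when the expected data is missing. The intent file name and the "keywords" context key are assumptions for illustration.

```python
# Sketch only: message.context is treated as optional, per the compatibility note above.
from mycroft import MycroftSkill, intent_file_handler


class KeywordAwareSkill(MycroftSkill):
    @intent_file_handler("keywords.intent")  # hypothetical intent file
    def handle_keywords(self, message):
        keywords = (message.context or {}).get("keywords")
        if keywords is None:
            # running on vanilla mycroft-core: no transformer service populated
            # the context, so load the plugin directly (and list it in requirements.txt)
            from neon_utterance_KeyBERT_plugin import KeyBERTExtractor
            utt = message.data.get("utterance", "")
            _, context = KeyBERTExtractor().transform([utt])
            keywords = context.get("keywords") or []
        self.log.info(f"keywords: {keywords}")
```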
ovos-core integration
WIP - not available
Audio Transformers
TODO
Download files
- Source Distribution: neon_transformers-0.2.1a4.tar.gz
- Built Distribution: neon_transformers-0.2.1a4-py3-none-any.whl
File details
Details for the file neon_transformers-0.2.1a4.tar.gz.
File metadata
- Download URL: neon_transformers-0.2.1a4.tar.gz
- Upload date:
- Size: 8.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.12.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | e6154f042dfd5823490a5436edaaabee528d2192b1525b46eac7e1624b1696dd
MD5 | eb09f72c207d9d4bfccfd9e707718092
BLAKE2b-256 | 259d2a4a9342c9a098ed77262a101d861421837510132b764b306868fb39dba9
File details
Details for the file neon_transformers-0.2.1a4-py3-none-any.whl.
File metadata
- Download URL: neon_transformers-0.2.1a4-py3-none-any.whl
- Upload date:
- Size: 15.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.12.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | 124246c4dc89359dfa922a2ce65002f0d6b80ceda0518875feac97394ea2e4bc
MD5 | 786215c9c82bb909c5a353d1a560c317
BLAKE2b-256 | 9b85736ac0aed4bedbd68ac07b7631a66821659817e327535a0acfb7173b38cf