

Swarmauri Tool · Sentence Complexity

A Swarmauri NLP tool that evaluates sentence complexity by measuring average sentence length and estimating clause counts. Use it to monitor writing style, enforce readability requirements, or trigger editorial suggestions in agents.

  • Tokenizes text with NLTK to compute sentence and word counts.
  • Approximates clause density via punctuation and coordinating/subordinating conjunctions.
  • Returns structured metrics suitable for analytics dashboards or conversational feedback.
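The clause-density heuristic described above can be sketched in plain Python. This is an illustrative approximation, not the tool's actual implementation: the regex sentence splitter and the conjunction list below are simplified assumptions (the real tool tokenizes with NLTK).

```python
import re

# Simplified stand-ins for the tool's heuristic inputs.
CONJUNCTIONS = {"and", "but", "or", "because", "although", "while", "if"}

def rough_metrics(text: str) -> dict:
    # Naive sentence split on terminal punctuation.
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    total_words = 0
    total_clauses = 0
    for sentence in sentences:
        words = sentence.split()
        total_words += len(words)
        # Every sentence counts as one clause; commas, semicolons, and
        # conjunctions each suggest an additional clause.
        clauses = 1 + sentence.count(",") + sentence.count(";")
        clauses += sum(1 for w in words if w.lower().strip(",;") in CONJUNCTIONS)
        total_clauses += clauses
    n = len(sentences) or 1
    return {
        "average_sentence_length": total_words / n,
        "average_clauses_per_sentence": total_clauses / n,
    }
```

The returned dict mirrors the shape of the tool's output, which makes it easy to swap the real tool in later.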

Requirements

  • Python 3.10 – 3.13.
  • nltk (downloads the punkt_tab tokenizer data on first import).
  • Core Swarmauri dependencies (swarmauri_base, swarmauri_standard, pydantic).

Installation

Choose the packaging workflow that matches your project; each command installs the tool and resolves its dependencies.

pip

pip install swarmauri_tool_sentencecomplexity

Poetry

poetry add swarmauri_tool_sentencecomplexity

uv

# Add to the current project and update uv.lock
uv add swarmauri_tool_sentencecomplexity

# or install into the active environment without modifying pyproject.toml
uv pip install swarmauri_tool_sentencecomplexity

Tip: Pre-download the NLTK tokenizer resources in deployment images (python -m nltk.downloader punkt_tab) to avoid runtime network calls.

Quick Start

from swarmauri_tool_sentencecomplexity import SentenceComplexityTool

text = "This is a simple sentence. This is another sentence, with a clause."

complexity_tool = SentenceComplexityTool()
result = complexity_tool(text)

print(result)
# {
#   'average_sentence_length': 7.5,
#   'average_clauses_per_sentence': 1.5
# }

The tool raises a ValueError when the input text is empty or whitespace-only.
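If you wrap the tool in a larger pipeline, you can apply the same contract up front. The helper below is a hypothetical sketch that mirrors the documented empty/whitespace check; it is not part of the package:

```python
def validate_text(text: str) -> str:
    # Reject empty or whitespace-only input, matching the tool's
    # documented ValueError behavior.
    if not text or not text.strip():
        raise ValueError("Input text must not be empty or whitespace.")
    return text
```

Validating early lets you return a friendly message to users instead of surfacing the exception from deep inside an agent run.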

Usage Scenarios

Flag Long Sentences During Editing

from pathlib import Path

from swarmauri_tool_sentencecomplexity import SentenceComplexityTool

complexity = SentenceComplexityTool()
article = Path("drafts/whitepaper.txt").read_text(encoding="utf-8")
metrics = complexity(article)

if metrics["average_sentence_length"] > 25:
    print("Consider splitting long sentences to improve readability.")

Integrate With a Swarmauri Agent for Style Coaching

from swarmauri_core.agent.Agent import Agent
from swarmauri_core.messages.HumanMessage import HumanMessage
from swarmauri_standard.tools.registry import ToolRegistry
from swarmauri_tool_sentencecomplexity import SentenceComplexityTool

registry = ToolRegistry()
registry.register(SentenceComplexityTool())
agent = Agent(tool_registry=registry)

message = HumanMessage(content="Analyze the complexity of: 'While the system scales, it may introduce latency delays.'")
response = agent.run(message)
print(response)

Compare Versions of a Document Over Time

from pathlib import Path

from swarmauri_tool_sentencecomplexity import SentenceComplexityTool

complexity = SentenceComplexityTool()
versions = {
    "draft": Path("draft.txt").read_text(encoding="utf-8"),
    "final": Path("final.txt").read_text(encoding="utf-8"),
}

for label, text in versions.items():
    metrics = complexity(text)
    print(f"{label}: {metrics['average_sentence_length']:.1f} words, {metrics['average_clauses_per_sentence']:.2f} clauses")

Track whether edits are making the writing clearer or more complex.
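One way to turn those per-version metrics into a trend signal is a simple delta between runs. The function and the sample numbers below are illustrative, not part of the package:

```python
def metrics_delta(before: dict, after: dict) -> dict:
    # Positive values mean the 'after' version is more complex.
    return {key: round(after[key] - before[key], 2) for key in before}

# Made-up example values for two document versions.
draft = {"average_sentence_length": 24.0, "average_clauses_per_sentence": 2.1}
final = {"average_sentence_length": 18.5, "average_clauses_per_sentence": 1.6}
print(metrics_delta(draft, final))
# {'average_sentence_length': -5.5, 'average_clauses_per_sentence': -0.5}
```

Negative deltas here indicate the final version is shorter and less clause-heavy, i.e. the edits moved the text toward simpler prose.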

Troubleshooting

  • LookupError: Resource punkt_tab not found – Run python -m nltk.downloader punkt_tab before executing the tool, especially in offline environments.
  • Low clause counts for technical prose – The heuristic relies on commas/semicolons and common conjunctions; adjust or extend the tool if you need domain-specific parsing.
  • Non-English text – Tokenization models are optimized for English. Supply language-appropriate tokenizers before using the tool for other languages.

License

swarmauri_tool_sentencecomplexity is released under the Apache 2.0 License. See LICENSE for details.
