
Command-line tool for LLMs to quickly search and understand OpenAPI endpoints

Project description

LLM API Scope (apiscope)

A command-line tool designed for Large Language Models (LLMs) and developers to index, search, and query structured API documentation (e.g., OpenAPI specifications). It assists LLMs in obtaining API information quickly and accurately within automated workflows.

Installation

apiscope is a command-line tool, not a Python library. For isolated installation without affecting your system Python, we recommend using pipx:

# don't use pip
pipx install llm-api-scope

Command Usage

apiscope init

Initialize the project by creating a configuration file (apiscope.ini) and cache directory (.apiscope/cache/). It automatically adds .apiscope/ to your project's .gitignore.
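The layout `init` produces can be sketched with standard tools (a simulation of the described behavior, not apiscope's actual code; only the file names come from the description above):

```shell
# Simulate what `apiscope init` sets up: a config file, a cache
# directory, and an idempotent .gitignore entry for the cache.
mkdir -p demo/.apiscope/cache
touch demo/apiscope.ini
# Append ".apiscope/" to .gitignore only if it is not already there.
grep -qxF '.apiscope/' demo/.gitignore 2>/dev/null \
    || echo '.apiscope/' >> demo/.gitignore
ls demo
cat demo/.gitignore
```

Because the `.gitignore` append is guarded by the `grep`, re-running the snippet (like re-running `init`) does not duplicate the entry.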

apiscope list

List all configured API specifications by displaying the <name> = <source> pairs from the configuration file.
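The `<name> = <source>` pairs are plain key/value lines; the configuration below is hypothetical (spec names and URLs are illustrative assumptions), and the `awk` line only approximates what `list` would print:

```shell
# Hypothetical apiscope.ini entries: one remote spec, one local file.
cat > apiscope.ini <<'EOF'
petstore = https://petstore3.swagger.io/api/v3/openapi.json
billing = ./specs/billing-openapi.yaml
EOF
# Roughly what `apiscope list` displays for this file.
awk -F' = ' 'NF == 2 { print $1 " = " $2 }' apiscope.ini
```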

apiscope search <name> <keywords> [--force]

Search within a specific API specification (<name>) for endpoints matching the given keywords. Returns the total count and displays up to 10 matching <path>:<method> identifiers.
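Result identifiers use the `<path>:<method>` shape. A simulated result set, filtered by keyword and capped at 10 entries with standard tools (the endpoint data is fabricated; this is not apiscope's implementation):

```shell
# Fabricated identifiers in apiscope's <path>:<method> format.
printf '%s\n' \
  '/pet:post' \
  '/pet/{petId}:get' \
  '/pet/{petId}:delete' \
  '/pet/findByStatus:get' > results.txt
# Case-insensitive keyword match, capped at 10 as `search` does.
grep -i 'petid' results.txt | head -n 10
```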

apiscope describe <name> <path:method> [--force]

Generate and output a concise Markdown guide for using the specified endpoint (<path:method>) from the API specification (<name>). The guide includes essential calling information such as parameters, request body, and response structure.
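A guide with those elements might look like the following skeleton (purely illustrative; the real output format of `describe` is not documented here):

```shell
# Write a hypothetical Markdown guide skeleton for one endpoint.
cat > pet_get.md <<'EOF'
# /pet/{petId}:get (petstore)

## Parameters
- petId (path, integer, required)

## Request body
None.

## Response
- 200: Pet object (application/json)
- 404: Pet not found
EOF
cat pet_get.md
```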

apiscope note

Manage reflective notes for agent reasoning and knowledge capture. This command provides a structured notebook system with six cognitive note types: Observation (OBS), Reasoning (REA), Action (ACT), Reflection (REF), Question (QUE), and Inspiration (INS).

Available subcommands:

  • auth: Establish and manage agent identity authentication through three philosophical dimensions (Name, Role, Story). Required before creating notes to ensure the agent has a defined sense of self.
  • write: Create a new note with a specified author and type (uses a two-phase write mechanism).
  • read: Display notes for a specific author, with pagination and size limits.
  • add: Append annotations (REFERENCE, NOTE, or TIP) to existing notes.
  • stats: Analyze note-taking patterns, temporal concentration, and thinking segments.
  • readme: Display comprehensive documentation for the note system.

Notes are stored in .apiscope/notes/ with automatic organization by author and timestamp. The system supports pattern recognition for classical thinking sequences like Empirical Induction, Hypothetico-Deductive reasoning, and Experimental Science. For complete documentation, run apiscope note readme.
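The storage scheme described above (notes under .apiscope/notes/, organized by author and timestamp, typed by the six codes) can be sketched as follows; the exact directory layout and file naming are assumptions, not the tool's documented format:

```shell
# Simulated note store; author name and file-naming scheme are hypothetical.
author="spec-reader"
ts=$(date -u +%Y%m%dT%H%M%SZ)
mkdir -p ".apiscope/notes/$author"
# One Observation (OBS) note and one Reflection (REF) note.
printf 'OBS: /pet:post rejects requests without a body\n' \
  > ".apiscope/notes/$author/${ts}-OBS.md"
printf 'REF: prefer describe over reading the raw spec\n' \
  > ".apiscope/notes/$author/${ts}-REF.md"
ls ".apiscope/notes/$author"
```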

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llm_api_scope-0.3.0.tar.gz (32.4 kB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

llm_api_scope-0.3.0-py3-none-any.whl (36.3 kB)


File details

Details for the file llm_api_scope-0.3.0.tar.gz.

File metadata

  • Download URL: llm_api_scope-0.3.0.tar.gz
  • Upload date:
  • Size: 32.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for llm_api_scope-0.3.0.tar.gz:

  • SHA256: d91d50af58c8338d9e7b7a8b014ae6a4dc0ffcbdb69b014a011882a76a3bf086
  • MD5: 03da06e76bd7565ef32b61b85501d6cf
  • BLAKE2b-256: ec4c7b98c55eae4f56c062915601bb4208a2afd2b87d5bc52d1bc64acef3fbf7

See more details on using hashes here.
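The published digests can be enforced at install time with pip's hash-checking mode. The requirements file below pins the SHA256 digests of the sdist and wheel shown on this page; the install line is commented out because it needs network access:

```shell
# Pin the published digests; with --require-hashes pip rejects any
# downloaded file whose SHA256 does not match one of these.
cat > requirements.txt <<'EOF'
llm-api-scope==0.3.0 \
    --hash=sha256:d91d50af58c8338d9e7b7a8b014ae6a4dc0ffcbdb69b014a011882a76a3bf086 \
    --hash=sha256:175322f0aa61e1cdeabf662b586c3f5120a67e395926e4264c53ffd269e4c76d
EOF
# pip install --require-hashes -r requirements.txt
cat requirements.txt
```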

Provenance

The following attestation bundles were made for llm_api_scope-0.3.0.tar.gz:

Publisher: publish.yml on D7x7z49/llm-api-scope

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file llm_api_scope-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: llm_api_scope-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 36.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for llm_api_scope-0.3.0-py3-none-any.whl:

  • SHA256: 175322f0aa61e1cdeabf662b586c3f5120a67e395926e4264c53ffd269e4c76d
  • MD5: 5174878920def2cf7b7833fcf4907b8e
  • BLAKE2b-256: 8650c0407417c85034b26bc29a77c1f9e7684cae69af7d6a6d4eed07650525b2


Provenance

The following attestation bundles were made for llm_api_scope-0.3.0-py3-none-any.whl:

Publisher: publish.yml on D7x7z49/llm-api-scope

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
