Active-inference-driven meta tree-of-thought planning engine. Public protocol surface (LLMProvider, TraceSink, WorldModelProvider, MemoryProvider, ThoughtseedPromoter, DecisionLearner). Engine implementation arrives in 0.2.0. Distinct from kyegomez/Meta-Tree-Of-Thoughts.

Project description

metatotai

Status: 0.1.0 ships the protocol surface. The engine implementation arrives in 0.2.0.

metatotai will be an active-inference-driven meta tree-of-thought planning engine — a deterministic, replay-safe decision layer that consumes elume's cognitive substrate and linoss-dynamics physics primitives.

What this will be

  • Active-inference engine — variational free energy + expected free energy computation, belief updates, policy scoring.
  • Meta tree-of-thought planner — search and expansion over reasoning trajectories, scored by free-energy minimization rather than LLM-graded floats.
  • Theory of mind — PartnerModel for inferring other agents' beliefs from observed behavior, built on elume.MentalModel.
  • Provider-injected — LLMProvider, MemoryProvider, WorldModelProvider, TraceSink protocols. Bring your own backend.
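To make "scored by free-energy minimization" concrete, here is a minimal sketch of expected-free-energy (EFE) policy scoring over a discrete state space. This is illustrative only: none of these names or signatures come from metatotai, whose engine ships in 0.2.0.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def expected_free_energy(q_s, preferred_obs, likelihood):
    """Score one policy's predicted state distribution q_s.

    risk      = KL(predicted observations || preferred observations)
    ambiguity = expected entropy of the likelihood p(o|s)
    """
    q_o = likelihood @ q_s                      # predicted observation distribution
    risk = np.sum(q_o * (np.log(q_o + 1e-12) - np.log(preferred_obs + 1e-12)))
    h_o_given_s = -np.sum(likelihood * np.log(likelihood + 1e-12), axis=0)
    ambiguity = q_s @ h_o_given_s               # expected conditional entropy
    return risk + ambiguity

# Two candidate policies, each predicting a distribution over 3 hidden states.
likelihood = np.array([[0.9, 0.1, 0.3],        # p(o|s); columns sum to 1
                       [0.1, 0.9, 0.7]])
preferred  = np.array([0.95, 0.05])            # the agent prefers observation 0
policies   = [np.array([0.8, 0.1, 0.1]),       # policy 0: mostly state 0
              np.array([0.1, 0.8, 0.1])]       # policy 1: mostly state 1

G = np.array([expected_free_energy(q, preferred, likelihood) for q in policies])
policy_probs = softmax(-G)                     # lower EFE -> higher probability
```

A planner in this style expands reasoning trajectories, computes G for each candidate branch, and samples or selects from softmax(-G), with no LLM-graded float anywhere in the scoring path.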
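The theory-of-mind item can likewise be sketched as Bayesian belief inference over a partner's goals. The class name PartnerModel comes from this page; the constructor, method names, and numbers below are assumptions for illustration.

```python
import numpy as np

class PartnerModel:
    """Infer a belief over a partner's goal from observed actions (Bayes' rule)."""

    def __init__(self, n_goals: int, action_likelihood: np.ndarray):
        # action_likelihood[g, a] = p(action a | partner pursues goal g)
        self.belief = np.full(n_goals, 1.0 / n_goals)   # uniform prior
        self.likelihood = action_likelihood

    def observe(self, action: int) -> np.ndarray:
        posterior = self.belief * self.likelihood[:, action]
        self.belief = posterior / posterior.sum()
        return self.belief

# Two possible goals, three observable actions.
lik = np.array([[0.7, 0.2, 0.1],    # goal 0 mostly emits action 0
                [0.1, 0.2, 0.7]])   # goal 1 mostly emits action 2
pm = PartnerModel(2, lik)
pm.observe(0)
pm.observe(0)   # two observations of action 0 make goal 0 far more likely
```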

What this is not

  • Not a fork of kyegomez/Meta-Tree-Of-Thoughts. That is a LangChain-based prompt-rewriting meta-agent over LLM scoring; this is a different system. Zero shared code. Distinct name to avoid confusion.
  • Not the original Tree of Thoughts (Yao et al. 2023). Different mechanism.

Layering

linoss-dynamics  ← physics primitive (NumPy)
       ↑
   elume         ← cognitive substrate (mental models, basins, evolution)
       ↑
   metatotai     ← active-inference + meta-ToT planning

Install

pip install metatotai
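Since the published surface is protocol-only, the intended usage pattern is structural: any object with the right methods satisfies a provider protocol. A hypothetical sketch, with method names and signatures assumed for illustration (the actual protocol definitions live in the 0.1.0 package):

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class LLMProvider(Protocol):
    def complete(self, prompt: str) -> str: ...

@runtime_checkable
class TraceSink(Protocol):
    def emit(self, event: dict) -> None: ...

class EchoLLM:
    """Toy backend satisfying the (assumed) LLMProvider shape."""
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

class ListSink:
    """Toy trace sink collecting events in memory, e.g. for replay."""
    def __init__(self) -> None:
        self.events: list[dict] = []
    def emit(self, event: dict) -> None:
        self.events.append(event)

llm, sink = EchoLLM(), ListSink()
sink.emit({"step": 0, "thought": llm.complete("expand root")})
```

Because these are typing.Protocol classes rather than base classes, backends need no import of metatotai at all; duck typing plus static checking is the whole contract.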

Citations

metatotai's active-inference implementation adopts the conceptual framing of Karl Friston's free-energy principle. The implementation is original Python code; only the conceptual framing is shared with upstream sources.

Foundational active inference:

Reference implementations consulted (no shared code):

Tree-of-thought lineage:

  • Yao, S., Yu, D., Zhao, J., Shafran, I., Griffiths, T. L., Cao, Y., & Narasimhan, K. (2023). Tree of Thoughts: Deliberate Problem Solving with Large Language Models. NeurIPS 2023. https://arxiv.org/abs/2305.10601

Substrate dependencies (used at runtime):

  • elume — cognitive substrate (mental models, basins, evolution)
  • linoss-dynamics — LinOSS physics primitive (Rusch & Rus, ICLR 2025)

Distinct from (zero shared code):

  • kyegomez/Meta-Tree-Of-Thoughts — LangChain-based prompt-rewriting meta-agent over LLM scoring. Different mechanism, different stack. Distinct name to avoid academic and licensing confusion.

License

MIT.
