A Python package for semi-formal modeling in neuro-symbolic systems.
Langda
Language-Driven Agent for Probabilistic Logic Programming
Automatically generate ProbLog code from natural language using LLM agents.
Installation
# From GitHub
pip install git+https://github.com/Symbolic-Intelligence-Org/langda-project.git
# Or clone and install locally
git clone https://github.com/Symbolic-Intelligence-Org/langda-project.git
cd langda-project
pip install -e .
To enable the LangChain logging integration, install langchain-logger manually:
pip install --no-deps langchain-logger==0.1.0
Note: --no-deps is used here to avoid pulling in incompatible dependencies.
To enable the retrieve function, you can optionally install faiss-cpu or faiss-gpu.
Update
pip install --upgrade git+https://github.com/Symbolic-Intelligence-Org/langda-project.git
If the update does not take effect, force a clean reinstall:
pip install --upgrade --force-reinstall --no-cache-dir git+https://github.com/Symbolic-Intelligence-Org/langda-project.git
Quick Start
from langda import langda_solve
rules = """
langda(LLM:"Define factorial predicate").
query(factorial(5, X)).
"""
result = langda_solve(
agent_type="double_dc",
rule_string=rules,
model_name="deepseek-chat"
)
print(result)
langda_solve — Unified Entry for LangDa Execution
This is the central API for executing LangDa workflows. It dynamically selects the appropriate agent architecture and runs the full generation–evaluation–refinement process.
Function Signature
langda_solve(
rule_string: str,
**overrides: Unpack[SolveOverrides]
) -> str
Parameters
rule_string (str, required)
: ProbLog or hybrid LangDa rules to process. Must be provided.
agent_type (Literal["single_simple","double_simple","single_dc","double_dc"], default="single_dc")
: Selects the agent architecture:
– single_*: generation only
– double_*: generate–evaluate–refine
– *_simple: simple agent chain
– *_dc: double-chain agent (recommended)
model_name (str, default="deepseek-chat")
: The model name used by your API key.
prefix (str, default="")
: Optional prefix to differentiate output files or database entries.
save_dir (str | Path, default=current directory)
: Folder for outputs and cached results.
load (bool, default=False)
: If True, directly load from database, skipping generation when available.
langda_ext (dict, default={})
: Dynamic content mapping for placeholders.
Example: langda(LLM:"/* City */ weather") → {"City": "Berlin"}
query_ext (str, default="")
: For DeepProbLog tasks, add extra facts or queries if needed.
log_path (str, default="langda.log")
: Log file name; combined with prefix if set.
config (dict, optional)
: Optional session configuration.
api_key (str, optional)
: Optional override for model API key.
Default Configuration Example
config = {
"configurable": {
"thread_id": str(uuid4()),
"checkpoint_ns": "langda",
"checkpoint_id": None,
},
"metadata": {}
}
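A small helper, sketched here for illustration only, can produce a fresh configuration of this shape per run; the uuid4 thread_id keeps concurrent sessions from sharing checkpoints.

```python
from uuid import uuid4

def default_config() -> dict:
    # Mirrors the default configuration shown above; each call gets a
    # unique thread_id so separate runs do not share checkpoint state.
    return {
        "configurable": {
            "thread_id": str(uuid4()),
            "checkpoint_ns": "langda",
            "checkpoint_id": None,
        },
        "metadata": {},
    }

a, b = default_config(), default_config()
print(a["configurable"]["checkpoint_ns"])
```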
Agent Map
| Key | Class |
|---|---|
| "single_simple" | LangdaAgentSingleSimple |
| "double_simple" | LangdaAgentDoubleSimple |
| "single_dc" | LangdaAgentSingleDC |
| "double_dc" | LangdaAgentDoubleDC |
Return
str — The final executable code or result from the LangDa workflow.
Example
rules = """
% Simple example
langda(LLM:"Describe today's weather in Paris", LOT:"search").
weather(paris, sunny, 25).
"""
result = langda_solve(
rule_string=rules,
agent_type="double_dc",
model_name="deepseek-chat",
prefix="weather_demo",
save_dir="./outputs",
log_path="demo.log"
)
print(result)
Notes
- langda_solve automatically sets up logging and prints start/finish markers for each run.
- The double-chain agent (double_dc) is the most capable and recommended mode.
Configuration
Create .env file:
# DeepSeek (recommended)
GNRT_DEEPSEEK_PROVIDER=deepseek
GNRT_DEEPSEEK_MODEL=deepseek-chat
GNRT_DEEPSEEK_API_KEY=your-api-key
GNRT_DEEPSEEK_API_TYP=Bearer
GNRT_DEEPSEEK_API_VER=2025-03-15
# Optional: for web search function
TAVILY_API_KEY=your-tavily-api-key
For OpenAI or Groq, replace DEEPSEEK with OPENAI or GROQ.
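Assuming the GNRT_&lt;PROVIDER&gt;_&lt;FIELD&gt; naming convention shown above, the variables for one provider can be collected as follows. This is a sketch of the convention, not the package's actual loader.

```python
import os

def provider_settings(provider: str) -> dict:
    # Collect GNRT_<PROVIDER>_* environment variables into a dict keyed
    # by the lowercase field name. Mirrors the .env convention above;
    # langda's real configuration loader may work differently.
    prefix = f"GNRT_{provider.upper()}_"
    return {
        key[len(prefix):].lower(): value
        for key, value in os.environ.items()
        if key.startswith(prefix)
    }

# Simulate a .env file already loaded into the process environment
os.environ["GNRT_DEEPSEEK_PROVIDER"] = "deepseek"
os.environ["GNRT_DEEPSEEK_MODEL"] = "deepseek-chat"
settings = provider_settings("deepseek")
print(settings["model"])
```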
Agent Types
- single_simple - Basic generation
- double_simple - Generation with evaluation
- single_dc - Dual-phase generation ⭐ (recommended)
- double_dc - Dual-phase with evaluation
Langda Syntax
LangDa introduces a unified predicate langda/3 as the central interface between natural language and probabilistic logic programming.
It allows users to describe rules, facts, or reasoning steps directly in English while maintaining full ProbLog compatibility.
Each langda predicate can include up to three parameters:
langda(LLM:"<instruction>", LOT:"<tool>", FUP:"<policy>").
- LLM – (required) natural language instruction or description.
- LOT – (optional) external tool specification ("search" for web, "retrieve" for local DB).
- FUP – (optional) forced update flag ("true" regenerates every run, "false" regenerates only when the instruction has changed).
Example:
langda(LLM:"What was the weather yesterday in Darmstadt?", LOT:"search", FUP:"true").
LangDa automatically interprets linguistic uncertainty (e.g., often, rarely) into probabilistic annotations
and integrates surrounding facts to infer variable bindings, generating executable ProbLog clauses.
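The three-field clause shape above can be pulled apart with a short sketch. This is an illustrative parser for the surface syntax only, not langda's internal parser.

```python
import re

# Matches the LLM/LOT/FUP fields of a langda(...) clause as shown above.
FIELD_RE = re.compile(r'(LLM|LOT|FUP):"([^"]*)"')

def parse_langda(clause: str) -> dict:
    """Extract the LLM/LOT/FUP fields from a langda(...) clause."""
    return dict(FIELD_RE.findall(clause))

clause = 'langda(LLM:"What was the weather yesterday in Darmstadt?", LOT:"search", FUP:"true").'
fields = parse_langda(clause)
print(fields["LOT"])
```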
Examples
Dynamic Content
rules = """
langda(LLM:"Define rules for /* City */").
"""
result = langda_solve(
rule_string=rules,
agent_type="double_dc",
model_name="deepseek-chat",
langda_ext={"City": "Tokyo"}
)
print(result)
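The placeholder replacement driven by langda_ext can be illustrated standalone. The sketch below mimics the substitution described above under the assumption that placeholders have the form /* Name */; it is not langda's actual implementation.

```python
import re

def fill_placeholders(rule: str, ext: dict) -> str:
    # Replace /* Name */ markers with values from the langda_ext mapping;
    # unknown placeholders are left untouched. Sketch only.
    return re.sub(
        r"/\*\s*(\w+)\s*\*/",
        lambda m: ext.get(m.group(1), m.group(0)),
        rule,
    )

rule = 'langda(LLM:"Define rules for /* City */").'
print(fill_placeholders(rule, {"City": "Tokyo"}))
```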
EXT Usage (langda_ext / query_ext)
from langda import langda_solve
# 1) Use langda_ext to inject dynamic placeholders into LLM prompts
rules_dynamic = r"""
% The placeholders /* City */ and /* Task */ will be replaced by langda_ext
langda(LLM:"Create /* Task */ rules for /* City */, include base facts and a query example.").
"""
result_dynamic = langda_solve(
rule_string=rules_dynamic,
agent_type="double_dc",
model_name="deepseek-chat",
prefix="dynamic_ext_demo",
save_dir="./outputs",
langda_ext={
"City": "Berlin",
"Task": "weather"
},
log_path="dynamic_ext.log"
)
print(result_dynamic)
# 2) For DeepProbLog, use query_ext to append extra facts/queries at the end
rules_dpl = r"""
% Generate a probabilistic model and leave space for external queries
langda(LLM:"Define a simple coin model with probabilities and an observation.").
"""
extra_queries = r"""
% --- query_ext appended content ---
evidence(coin, heads).
query(coin).
"""
result_dpl = langda_solve(
rule_string=rules_dpl,
agent_type="double_dc",
model_name="deepseek-chat",
prefix="deepproblog_ext_demo",
save_dir="./outputs",
query_ext=extra_queries,
log_path="deepproblog_ext.log"
)
print(result_dpl)
Knowledge Base (Optional)
To use the retriever tool, create langda/utils/problog_docs.json:
[
{
"id": "example_1",
"title": "Title",
"content": "Content here...",
"tags": ["tag1"],
"keywords": ["keyword1", "keyword2"]
}
]
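Entries of this shape can be searched by keyword, as the sketch below shows. Note that langda's own retriever (FAISS-based when faiss is installed) uses vector similarity rather than this simple keyword match.

```python
import json

# Load entries with the same shape as langda/utils/problog_docs.json
docs = json.loads("""
[
  {"id": "example_1", "title": "Title", "content": "Content here...",
   "tags": ["tag1"], "keywords": ["keyword1", "keyword2"]}
]
""")

def search(entries, term):
    # Naive keyword search over title, content, and keywords (sketch only).
    term = term.lower()
    return [e for e in entries
            if term in e["title"].lower()
            or term in e["content"].lower()
            or any(term == k.lower() for k in e["keywords"])]

hits = search(docs, "keyword1")
print([e["id"] for e in hits])
```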