

Project description

Omni-NLI Logo

Omni-NLI


A multi-interface (REST and MCP) server for natural language inference


Omni-NLI is a self-hostable server that provides natural language inference (NLI) capabilities through both a REST API and the Model Context Protocol (MCP). It can run as a scalable, stateless standalone microservice (via the REST API) or as an MCP server that gives AI agents a verification layer for AI-based applications.

Architecture Diagram

What is NLI?

Given two pieces of text, called the premise and the hypothesis, NLI (also known as textual entailment) is the task of determining the directional relationship between them as a human reader would perceive it. The relationship is assigned one of three labels:

  • "entailment": the hypothesis is supported by the premise
  • "contradiction": the hypothesis is contradicted by the premise
  • "neutral": the hypothesis is neither supported nor contradicted by the premise
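As an illustration, the three labels can be shown on one shared premise (the first two pairs reuse the quickstart example below; the pairs and labels here are illustrative, not taken from a dataset):

```python
# One premise, three hypotheses, one per NLI label.
examples = [
    {
        "premise": "A football player kicks a ball into the goal.",
        "hypothesis": "Someone is playing football.",
        "label": "entailment",  # a human reader would say this follows
    },
    {
        "premise": "A football player kicks a ball into the goal.",
        "hypothesis": "The football player is asleep on the field.",
        "label": "contradiction",  # incompatible with the premise
    },
    {
        "premise": "A football player kicks a ball into the goal.",
        "hypothesis": "The match is being played in Spain.",
        "label": "neutral",  # neither supported nor contradicted
    },
]
```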

[!IMPORTANT] NLI is not the same as logical entailment. Its goal is to determine whether a reasonable human would consider the hypothesis to follow from the premise; it checks for consistency rather than the absolute truth of the hypothesis.

Typical applications of NLI include:

  • Checking whether a new piece of text is consistent with earlier text, for example whether a chatbot's or AI assistant's latest response contradicts something said earlier in the conversation.
  • Checking whether a summary contradicts the original document.
  • Checking whether the documents in a ranked list of search results actually entail the query.
  • Checking whether a piece of text is supported by a set of known facts. Note that this is consistency as judged by a human reader, not formal logic.
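The first application above, catching self-contradictions in a conversation, reduces to running NLI between each earlier turn and the new response. A minimal sketch (`evaluate_nli` is a stand-in parameter for a call to an NLI service such as Omni-NLI's evaluate endpoint; the stub below exists only to demonstrate the logic):

```python
# Sketch of a chatbot self-consistency check built on an NLI service.
def find_contradictions(history, new_response, evaluate_nli):
    """Return the earlier turns that the new response contradicts."""
    return [
        turn for turn in history
        if evaluate_nli(premise=turn, hypothesis=new_response)["label"] == "contradiction"
    ]

# Usage with a trivial stub in place of the real service:
def stub(premise, hypothesis):
    if "closed" in premise and "open" in hypothesis:
        return {"label": "contradiction"}
    return {"label": "neutral"}

history = ["The store is closed on Sundays.", "We sell fresh bread."]
print(find_contradictions(history, "The store is open every Sunday.", stub))
# → ['The store is closed on Sundays.']
```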

[!IMPORTANT] The quality of the results depends heavily on the model (the LLM) that is used. A good strategy is to first fine-tune the model on a dataset of premise-hypothesis-label triples drawn from your application domain.
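Such a dataset is commonly stored as JSON Lines, one premise-hypothesis-label triple per line. A sketch of writing one out (the file name and field layout are assumptions; check what your fine-tuning pipeline expects):

```python
import json

# Hypothetical domain-specific triples for fine-tuning an NLI model.
triples = [
    {"premise": "The invoice was paid on March 3.",
     "hypothesis": "The invoice has been paid.",
     "label": "entailment"},
    {"premise": "The invoice was paid on March 3.",
     "hypothesis": "The invoice is overdue and unpaid.",
     "label": "contradiction"},
]

with open("nli_finetune.jsonl", "w") as f:
    for t in triples:
        f.write(json.dumps(t) + "\n")
```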

Main Features of Omni-NLI

  • Helps mitigate LLM hallucinations by verifying whether generated content is supported by facts
  • Supports models provided by different backends, including Ollama, HuggingFace (public and private/gated models), and OpenRouter
  • Supports REST API (for traditional applications) and MCP (for AI agents) interfaces
  • Fully configurable and very scalable, with built-in caching
  • Provides confidence scores and (optional) reasoning traces for explainability

See ROADMAP.md for the list of implemented and planned features.

[!IMPORTANT] Omni-NLI is in early development, so bugs and breaking changes are expected. Please use the issues page to report bugs or request features.


Quickstart

1. Installation

pip install "omni-nli[huggingface]"

2. Start the Server

omni-nli

3. Evaluate NLI (with REST API)

curl -X POST \
  -H "Content-Type: application/json" \
  -d '{
    "premise": "A football player kicks a ball into the goal.",
    "hypothesis": "The football player is asleep on the field."
  }' \
  http://127.0.0.1:8000/api/v1/nli/evaluate

Example response:

{
    "label": "contradiction",
    "confidence": 0.99,
    "model": "microsoft/Phi-3.5-mini-instruct",
    "backend": "huggingface"
}
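The same request can be made from Python with only the standard library. The endpoint, payload, and example response follow the curl example above; the `is_contradiction` helper and its 0.8 confidence threshold are my own illustrative additions, not part of the Omni-NLI API:

```python
import json
import urllib.request

API_URL = "http://127.0.0.1:8000/api/v1/nli/evaluate"

def evaluate(premise: str, hypothesis: str) -> dict:
    """POST a premise-hypothesis pair to a running Omni-NLI server."""
    payload = json.dumps({"premise": premise, "hypothesis": hypothesis}).encode()
    req = urllib.request.Request(
        API_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def is_contradiction(result: dict, min_confidence: float = 0.8) -> bool:
    """Interpret a response like the example above (arbitrary 0.8 threshold)."""
    return result["label"] == "contradiction" and result["confidence"] >= min_confidence

if __name__ == "__main__":
    result = evaluate(
        "A football player kicks a ball into the goal.",
        "The football player is asleep on the field.",
    )
    print(result["label"], result["confidence"])
```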

4. Evaluate NLI (with MCP Interface)

Screenshot: evaluating NLI through the MCP interface in LM Studio.


Documentation

Check out the Omni-NLI Documentation for more information, including configuration options, API reference, and examples.


Contributing

Contributions are always welcome! Please see CONTRIBUTING.md for details on how to get started.

License

Omni-NLI is licensed under the MIT License (see LICENSE).

Acknowledgements

  • The logo is from SVG Repo with some modifications.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

omni_nli-0.1.0.tar.gz (230.6 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

omni_nli-0.1.0-py3-none-any.whl (31.2 kB)

Uploaded Python 3

File details

Details for the file omni_nli-0.1.0.tar.gz.

File metadata

  • Download URL: omni_nli-0.1.0.tar.gz
  • Upload date:
  • Size: 230.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.28 {"installer":{"name":"uv","version":"0.9.28","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for omni_nli-0.1.0.tar.gz
Algorithm Hash digest
SHA256 8413b9c55db38eced0f628fb1466466dec16319979e46f931d32bebac1c6fdb8
MD5 d76ffd07a697dec7899a5e498dad2c02
BLAKE2b-256 b8b9675145bd5a9513ddfb2c49ef0443f6cd1e494068554e844a21043ec78e48


File details

Details for the file omni_nli-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: omni_nli-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 31.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.28 {"installer":{"name":"uv","version":"0.9.28","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

Hashes for omni_nli-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 4844447456ae12f6e2496b0f6c1f049294ac532d53ad54af2a1e5d645d02cdfb
MD5 781757b2ee4ab310b2acab3e7b97dcf6
BLAKE2b-256 b3cfddbd3a3c4a1431899996a41ab985925b100da79067a8ccd296ea78185056

