
Omni-NLI


A multi-interface (REST and MCP) server for natural language inference


Omni-NLI is a self-hostable server that provides natural language inference (NLI) capabilities through REST and Model Context Protocol (MCP) interfaces. It can run as a scalable, stateless standalone microservice, or as an MCP server that lets AI agents add a verification layer to AI-based applications such as chatbots and virtual assistants.

What is NLI?

Given two pieces of text, called the premise and the hypothesis, NLI is the task of determining the logical relationship a human reader would infer between them. The relationship is typically expressed as one of three labels:

  • "entailment": the hypothesis is supported by the premise
  • "contradiction": the hypothesis is contradicted by the premise
  • "neutral": the hypothesis is neither supported nor contradicted by the premise
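To make the labels concrete, here are a few premise/hypothesis pairs (invented for illustration; any reasonable NLI model would be expected to assign these labels):

```python
# Illustrative premise/hypothesis pairs with the NLI label a model
# would be expected to return. The pairs are made up for illustration.
examples = [
    ("A man is playing a guitar on stage.",
     "A musician is performing.", "entailment"),
    ("A man is playing a guitar on stage.",
     "The stage is completely empty.", "contradiction"),
    ("A man is playing a guitar on stage.",
     "The concert is sold out.", "neutral"),
]

for premise, hypothesis, label in examples:
    print(f"{label}: {hypothesis!r} given {premise!r}")
```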

NLI is useful in many applications, such as fact-checking the output of large language models (LLMs) and verifying the answers produced by question-answering systems.

Features

  • Supports models provided by different backends, including Ollama, HuggingFace, and OpenRouter
  • Supports REST API (for traditional applications) and MCP (for AI agents) interfaces
  • Fully configurable, and scales horizontally as a stateless service

See ROADMAP.md for the list of implemented and planned features.

[!IMPORTANT] Omni-NLI is in early development, so bugs and breaking changes are expected. Please use the issues page to report bugs or request features.


Quickstart

1. Installation

pip install omni-nli

2. Configure Backends

Copy the example config, then fill in your API keys and other settings in the resulting .env file.

cp .env.example .env

3. Start the Server

omni-nli

The server will be listening on http://127.0.0.1:8000 by default.

4. Evaluate NLI

curl -X POST \
  -H "Content-Type: application/json" \
  -d '{
    "premise": "A football player kicks a ball into the goal.",
    "hypothesis": "The football player is asleep on the field."
  }' \
  http://127.0.0.1:8000/api/v1/nli/evaluate

Example response:

{
    "label": "contradiction",
    "confidence": 0.99,
    "model": "microsoft/Phi-3.5-mini-instruct",
    "backend": "huggingface"
}
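The same request can be made from Python. This is a minimal sketch using only the standard library; the endpoint, default address, and JSON fields come from the Quickstart above, while the helper function names are illustrative, not part of the package:

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:8000"  # default address from the Quickstart


def build_payload(premise: str, hypothesis: str) -> dict:
    """Build the JSON body expected by /api/v1/nli/evaluate."""
    return {"premise": premise, "hypothesis": hypothesis}


def evaluate_nli(premise: str, hypothesis: str) -> dict:
    """POST a premise/hypothesis pair and return the parsed JSON response."""
    data = json.dumps(build_payload(premise, hypothesis)).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/api/v1/nli/evaluate",
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# With the server running:
# result = evaluate_nli("A football player kicks a ball into the goal.",
#                       "The football player is asleep on the field.")
# result["label"] and result["confidence"] match the example response above.
```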

Documentation

Check out the Omni-NLI Documentation for more information, including configuration options, API reference, and examples.


Contributing

Contributions are always welcome! Please see CONTRIBUTING.md for details on how to get started.

License

Omni-NLI is licensed under the MIT License (see LICENSE).

Acknowledgements

  • The logo is from SVG Repo with some modifications.

