
Averbis REST API client for Python.

Project description


Averbis is a leading text mining and machine learning company in Healthcare and Life Sciences. We extract information from texts, automate intellectual processes and make meaningful predictions.

The Averbis Python API allows convenient access to the REST API of Averbis products. This includes in particular the ability to interact with the text mining pipelines offered by these products, e.g. to use these in data science environments such as Jupyter notebooks or for integration of the Averbis products in other enterprise systems.

Supported products include Averbis Health Discovery and Averbis Information Discovery.

Status

The Averbis Python API is currently in an open alpha development stage. We try to keep breaking changes minimal, but they may happen on the way to the first stable release.

Features

Currently supported features are:

  • Managing projects

  • Managing pipelines

  • Managing terminologies

  • Managing collections of documents

  • Managing PEARs

  • Analysing text using a server-side text mining pipeline

  • Classifying texts using a server-side classifier

Installation

The library can be easily installed via pip:

pip install averbis-python-api

Documentation

For an overview of the methods provided by the client and their documentation, see our readthedocs API reference.

Moreover, an upcoming release will provide a number of example Jupyter notebooks that showcase how the client can be used to solve different use cases.

Usage examples for a selection of API endpoints are given below.

Usage

Connecting the client to a platform

from averbis import Client
# Use an existing API token
client = Client('http://localhost:8400/health-discovery', api_token='YOUR_API_TOKEN')
# or generate a new API token from your credentials (this invalidates the old token)
client = Client('http://localhost:8400/health-discovery', username='YOUR_USERNAME', password='YOUR_PASSWORD')
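Rather than embedding the token in a script, it can be read from the environment before constructing the client. The sketch below is a common pattern, not part of the library; `AVERBIS_API_TOKEN` is a hypothetical variable name chosen for this example.

```python
import os

def resolve_token(env=None, var="AVERBIS_API_TOKEN"):
    """Return the API token from the given mapping (default: os.environ), or None if unset."""
    env = os.environ if env is None else env
    return env.get(var) or None

# With an explicit mapping instead of the real environment:
print(resolve_token({"AVERBIS_API_TOKEN": "secret"}))  # → secret
print(resolve_token({}))  # → None
```

The result can then be passed as `api_token=resolve_token()` when constructing the `Client`.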

Connecting to a pipeline and ensuring that it is started

project = client.get_project('YOUR_PROJECT_NAME')
pipeline = project.get_pipeline('YOUR_PIPELINE_NAME')
pipeline.ensure_started()

Analysing a string

document = 'This is the string we want to analyse.'
annotations = pipeline.analyse_text(document, language='en')
for annotation in annotations:
    print(annotation)

Analysing a text file

with open('/path/to/text_file.txt', 'rb') as document:
    annotations = pipeline.analyse_text(document, language='en')
    for annotation in annotations:
        print(annotation)

Restricting returned annotation types

annotations = pipeline.analyse_text(document, language='en',
    annotation_types='*Diagnosis') # will return only annotations that end with 'Diagnosis'
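The `*Diagnosis` pattern is a glob-style wildcard. If you need to filter annotations further on the client side, Python's standard `fnmatch` module implements the same matching; the annotation dictionaries below are illustrative only, so consult the API reference for the actual fields returned by `analyse_text`.

```python
from fnmatch import fnmatch

# Illustrative annotations; real annotations returned by analyse_text
# may carry different fields -- see the API reference.
annotations = [
    {"type": "de.averbis.types.health.Diagnosis", "coveredText": "diabetes"},
    {"type": "de.averbis.types.health.Medication", "coveredText": "insulin"},
]

# Keep only annotations whose type name ends with 'Diagnosis',
# mirroring the server-side '*Diagnosis' filter.
diagnoses = [a for a in annotations if fnmatch(a["type"], "*Diagnosis")]
print([a["coveredText"] for a in diagnoses])  # → ['diabetes']
```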

Upload documents, process them using a pipeline, and export results

In contrast to the simple text analysis endpoint above, one can also upload documents into the product and create an analysis process there using experimental endpoints (these may change soon). This has some advantages: the results can be inspected in the product using the AnnotationViewer, and the same document collection can be re-processed several times.

import time

document_collection = project.create_document_collection("COLLECTION_NAME")

file_path = "path/to/text/file.txt"
with open(file_path, "r", encoding="UTF-8") as input_io:
    document_collection.import_documents(input_io)
print(f"Number of documents: {document_collection.get_number_of_documents()}")

pipeline = project.get_pipeline("MY_PIPELINE_NAME")

# Using experimental endpoints to run the analysis and monitor the process state
process = document_collection.create_and_run_process(process_name="MY_PROCESS", pipeline=pipeline)
while process.get_process_state().state == "PROCESSING":
    time.sleep(1)

results = process.export_text_analysis()
print(results)
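The polling loop above runs indefinitely if processing stalls. A small generic helper with a timeout makes such waits safer; `wait_until` is a hypothetical utility sketched here, not part of the client.

```python
import time

def wait_until(predicate, timeout=300.0, interval=1.0):
    """Poll predicate() until it returns True or timeout (seconds) elapses.

    Returns True on success, False on timeout.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(interval)
    return False

# With the process object above one could write, e.g.:
#   finished = wait_until(lambda: process.get_process_state().state != "PROCESSING")
# Here we demonstrate with a trivial predicate instead:
print(wait_until(lambda: True, timeout=1.0))  # → True
```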

Pear Management

A PEAR (Processing Engine ARchive) file is the UIMA standard packaging format for UIMA components such as analysis engines (annotators) or CAS consumers. We provide some experimental endpoints (which may change soon) to upload, delete, and list PEARs.

project.list_pears()
pear = project.install_pear("path/to/mypear.pear")
print(pear.get_default_configuration())
pear.delete()
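PEAR files are ZIP-based archives, so a quick local sanity check before calling `install_pear` can catch corrupted files early. The helper below is illustrative and not part of the client.

```python
import os
import tempfile
import zipfile

def looks_like_pear(path):
    """Return True if the file at `path` is a readable ZIP archive.

    This is only a coarse check: PEARs are ZIP-based archives, but a
    valid ZIP is not necessarily a valid PEAR.
    """
    return zipfile.is_zipfile(path)

# Demonstrate with a throwaway archive:
with tempfile.NamedTemporaryFile(suffix=".pear", delete=False) as f:
    with zipfile.ZipFile(f, "w") as zf:
        zf.writestr("metadata/install.xml", "<pear/>")
    pear_path = f.name
print(looks_like_pear(pear_path))  # → True
os.unlink(pear_path)
```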

Connection profiles

To avoid storing API keys in Python scripts or constantly re-generating them, keys for commonly used servers can be stored in a configuration file. This file must be called client-settings.json and must be located either in the working directory of the script or in the user's home folder under .averbis/client-settings.json.

Each profile has four settings:

  • url: the base URL of the server application

  • api-token: the API token

  • verify-ssl: the path to a PEM file used to validate the server certificate if SSL is used

  • timeout: An optional timeout parameter (in seconds)

Default settings which should be applied to all profiles can be stored in the special profile * (star).

{
  "profiles": {
    "*": {
      "verify-ssl": "caRoot.pem"
    },
    "localhost-hd": {
      "url": "https://localhost:8080/health-discovery",
      "api-token": "dummy-token"
    },
    "localhost-id": {
      "url": "https://localhost:8080/information-discovery",
      "api-token": "dummy-token",
      "verify-ssl": "id.pem"
    }
  }
}

An existing profile can then be loaded with

from averbis import Client
client = Client("localhost-id")
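The `*` profile supplies defaults that each named profile can override. Conceptually this resolution is a shallow dictionary merge; the sketch below illustrates that behaviour and is not the client's actual implementation.

```python
def resolve_profile(settings, name):
    """Merge the '*' defaults with the named profile (profile values win)."""
    profiles = settings["profiles"]
    merged = dict(profiles.get("*", {}))
    merged.update(profiles[name])
    return merged

# The settings structure mirrors the client-settings.json example above:
settings = {
    "profiles": {
        "*": {"verify-ssl": "caRoot.pem"},
        "localhost-hd": {
            "url": "https://localhost:8080/health-discovery",
            "api-token": "dummy-token",
        },
        "localhost-id": {
            "url": "https://localhost:8080/information-discovery",
            "api-token": "dummy-token",
            "verify-ssl": "id.pem",
        },
    }
}
# 'localhost-hd' inherits verify-ssl from '*'; 'localhost-id' overrides it:
print(resolve_profile(settings, "localhost-hd")["verify-ssl"])  # → caRoot.pem
print(resolve_profile(settings, "localhost-id")["verify-ssl"])  # → id.pem
```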

Development

To set up a local development environment, check out the repository and run uv sync. It is best to follow this by running the tests to verify that everything works.

uv sync
uv run task test

You can get a list of the typical development tasks (e.g. test, format) using

uv run task -l

To install the latest development version of the library directly from GitHub, you can use the following command:

$ pip install --force-reinstall --upgrade git+https://github.com/averbis/averbis-python-api.git@refs/heads/main
