Python module for the Tecnotree Sensa Platform
Project description
Python Module for the Cortex Cognitive Platform
The Cortex Python module provides an API client library to easily integrate with the Cortex Cognitive Platform. Refer to the Cortex documentation for details on how to use the library:
- Developer guide: https://cognitivescale.github.io/cortex-fabric/
- Cortex Python references: https://cognitivescale.github.io/cortex-python/master/
Installation
There are several installation options available:
- base library - for interacting with a Sensa cluster or developing skills
poetry add cortex-python
- model development - for feature engineering and model development within Jupyter notebooks
poetry add cortex-python[model_development]
- model development extras - for developing model training and inference skills
poetry add cortex-python[model_runtime]
- Certifai Evaluator plugin - this must be installed with one of the model* extras. NOTE: extra configuration is needed to access the Sensa Python repository:
poetry config http-basic.sensa <USER> <TOKEN>
poetry add cortex-python[model_development,certifai]
Install from source
git clone git@github.com:CognitiveScale/cortex-python.git
cd cortex-python
# Needed for certifai components
poetry config http-basic.sensa <USER> <TOKEN>
poetry install
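After installing (whether from a package or from source), a quick sanity check is to confirm that the package imports; this is a minimal sketch and assumes only the base library is installed:
# Verify the install: import the package and the client factory used later in this README.
import cortex
from cortex.client import Cortex

print(cortex.__name__)          # "cortex"
print(callable(Cortex.client))  # True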
Development
Setup
When developing, it's a best practice to work in a virtual environment. Create and activate a virtual environment:
poetry install
poetry shell
Install developer dependencies (clone the repository first, if you have not already done so):
git clone git@github.com:CognitiveScale/cortex-python.git
cd cortex-python
make dev.install
Run developer test and linting tasks. Three types of checks are configured:
- symilar - to detect code duplication
- pylint - for linting
- pytest - for running the unit tests. These checks are orchestrated through tox; the tox configuration is available in tox.ini.
There's a convenience Makefile with targets for common tasks, such as build, test, etc. Use it!
Testing
Unit Tests
Follow the setup instructions above (make sure you are working in the virtual environment and have the necessary dependencies installed), then run the test suite:
make test
To run an individual test file or class method, use pytest directly. Examples are shown below:
- file:
pytest test/unit/agent_test.py
- class method:
pytest test/unit/agent_test.py::TestAgent::test_get_agent
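The same selections can also be made through pytest's Python API; the following is a minimal sketch of the programmatic form, using the example paths above:
# Programmatic equivalent of the pytest commands shown above.
import pytest

pytest.main(["test/unit/agent_test.py"])                             # run one file
pytest.main(["test/unit/agent_test.py::TestAgent::test_get_agent"])  # run one test method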
Pre-release to staging
Note: this repository uses git tags for versioning.
- Create and push an alpha release:
git tag -a 6.5.0a<N> -m 'alpha tag'
git push --tags
make dev.push
This will build an alpha-tagged package.
- Merge develop to staging branch:
make stage
- In GitHub, create a pull request from staging to master.
- Create and push the release tag:
git tag -a 6.5.0 -m 'release tag'
git push --tags
Contributing
After contributing to the library, and before you submit changes as a PR, please do the following:
- Run unit tests via
make test
- Manually verify your changes (i.e. try them out in Cortex) to make sure everything works as expected. Not required, but highly encouraged.
- Bump the version in setup.py and update the CHANGELOG.md
Documentation
Activate your virtual environment:
poetry shell
Set up your environment, if you have not done so:
make dev.install
The package documentation is built with Sphinx and generates versioned documentation for all tags matching the release/X.Y.Z pattern and for the master branch. To build the documentation:
make docs.multi
The documentation is rendered in HTML format under the docs/_build/${VERSION} directory.
Configuring the client
There are four mechanisms for configuring a client connection.
- Use the $HOME/.cortex/config file set up via cortex configure.
from cortex.client import Cortex
client = Cortex.client(verify_ssl_cert=True|False)
- Use the environment variables CORTEX_URL, CORTEX_TOKEN, and CORTEX_PROJECT (a usage sketch follows this list).
from cortex.client import Cortex
client = Cortex.client(verify_ssl_cert=True|False)
The table below lists the supported environment variables. These variables are injected by the Sensa runtime for use during skill startup.
Environment variable | Description |
---|---|
CORTEX_PERSONAL_ACCESS_CONFIG | JSON string containing a Personal Access Token obtained from the console UI |
CORTEX_CONFIG_DIR | A folder containing the config created by the cortex configure command |
CORTEX_TOKEN | A JWT token generated by cortex configure token or provided during skill invokes |
CORTEX_PROJECT | Project used during API requests |
CORTEX_URL | URL for the Sensa APIs |
- Use method kwargs with Cortex.client().
from cortex.client import Cortex
client = Cortex.client(api_endpoint="<URL>", token="<token>", project="<project>", verify_ssl_cert=True|False)
- Use the skill's invoke message; this allows authentication using the caller's identity (a skill-entrypoint sketch follows this list).
from cortex.client import Cortex
client = Cortex.from_message({})
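The sketches below illustrate the environment-variable and invoke-message mechanisms. They are minimal examples rather than library code: the URL, token, and project values are placeholders, and the payload shape handled by the skill entrypoint is an assumption, not something defined by this README.
# Mechanism 2 (sketch): configure through environment variables, then create the client.
# The placeholder values below must be replaced with real ones before any request succeeds.
import os
from cortex.client import Cortex

os.environ["CORTEX_URL"] = "https://api.example.sensa.io"   # placeholder URL
os.environ["CORTEX_TOKEN"] = "<JWT>"                        # real JWT from `cortex configure token`
os.environ["CORTEX_PROJECT"] = "example-project"            # placeholder project name

client = Cortex.client(verify_ssl_cert=True)

# Mechanism 4 (sketch): a hypothetical skill entrypoint that authenticates as the caller.
def invoke(message: dict) -> dict:
    client = Cortex.from_message(message)   # client built from the caller's identity
    payload = message.get("payload", {})    # assumed field on the invoke message
    return {"payload": {"echo": payload}}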
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file cortex_python-7.0.0.tar.gz.
File metadata
- Download URL: cortex_python-7.0.0.tar.gz
- Upload date:
- Size: 45.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.10.15 Linux/4.18.0-477.15.1.el8_8.x86_64
File hashes
Algorithm | Hash digest |
---|---|
SHA256 | 54467aa71203b342822b54f0a491f966316d51843241c55b04d329619028dc0b |
MD5 | 6602ccd55d22580c893ab5168005e98e |
BLAKE2b-256 | b9f8b52aff36a2bb09317c5f98872989ed5adf9f397f8e14ef74f134230c1245 |
File details
Details for the file cortex_python-7.0.0-py3-none-any.whl.
File metadata
- Download URL: cortex_python-7.0.0-py3-none-any.whl
- Upload date:
- Size: 59.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.10.15 Linux/4.18.0-477.15.1.el8_8.x86_64
File hashes
Algorithm | Hash digest |
---|---|
SHA256 | f186081f989576a4c09794551da85092249c74237bc2070832881bd991ea5aff |
MD5 | 6c838ee163fe85e1ac13101d19040585 |
BLAKE2b-256 | 8b142a5d02d02b70cdc699aafad91f3b515f9463929e470a03ab57c80f93edc4 |