
Helper library for interacting with Landing AI LandingLens

Project description

LandingLens code sample repository

This repository contains the LandingLens development library and runnable examples showing how to integrate LandingLens into a variety of scenarios. The examples show different ways to acquire images from multiple sources and techniques to process the results. Jupyter notebooks focus on ease of use, while the Python apps include more robust and complete examples.

| Example | Description | Language |
| --- | --- | --- |
| Poker card suit identification | This notebook can run directly in Google Colab, using the web browser camera to detect suits on poker cards. | Jupyter Notebook (Colab) |
| Door monitoring for home automation | This notebook uses an object detection model to determine whether a door is open or closed. The notebook can acquire images directly from an RTSP camera. | Jupyter Notebook |
| Streaming capture service | This application shows how to do continuous acquisition from an image sensor using RTSP. | Python application |
| Pixel coverage post-processing | This notebook demonstrates how to use a Visual Prompting model to analyze the area coverage of different types of land or structures in satellite images. | Jupyter Notebook |

Install the library

pip install landingai

Quick Start

Prerequisites

This library needs to communicate with the LandingAI platform for various functionalities (for example, the Predictor API calls the HTTP endpoint of your deployed model to get prediction results). Thus, you need to have the following information at hand before using those functionalities:

  1. LandingAI user API credentials (API key and API secret). See here for how to get them.
  2. The endpoint ID of your deployed model on LandingAI (CloudInference). See here for how to get it.

Run inference using your deployed inference endpoint at LandingAI:

  • Install the library with the above command.
  • Create a Predictor with your inference endpoint ID, LandingAI API key, and API secret.
  • Call predict() with an image (in numpy array format).
from landingai.predict import Predictor
# Find your API key and API secret (see Prerequisites above)
endpoint_id = "FILL_YOUR_INFERENCE_ENDPOINT_ID"
api_key = "FILL_YOUR_API_KEY"
api_secret = "FILL_YOUR_API_SECRET"
# Load your image
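# e.g. (illustration only, assuming a PIL + numpy workflow):
#   image = np.asarray(Image.open("path/to/your/image.jpg"))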
image = ...
# Run inference
predictor = Predictor(endpoint_id, api_key, api_secret)
predictions = predictor.predict(image)

See a working example here.

Visualize your inference results by overlaying the predictions on the input image, and save the result to disk:

from landingai.visualize import overlay_predictions
# continue the above example
predictions = predictor.predict(image)
image_with_preds = overlay_predictions(predictions, image)
image_with_preds.save("image.jpg")

Storing API credentials

There are three ways to configure your user API credentials:

  1. Pass them as function parameters.

  2. Set them as environment variables, e.g. export LANDINGAI_API_KEY=..., export LANDINGAI_API_SECRET=...

  3. Store them in a .env file under your project root directory. For example, below is example credential data in a .env file:

   LANDINGAI_API_KEY=v7b0hdyfj6271xy2o9lmiwkkcb12345
   LANDINGAI_API_SECRET=ao6yjcju7q1e6u0udgwrgknhrx6m4n1o48z81jy6huc059gne047l4fq312345

The above ordering also indicates the priority of credential loading: function parameters take precedence over environment variables, which take precedence over the .env file.
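
For illustration, below is a minimal sketch of option 2. The exact constructor defaults are an assumption, but the credential-loading order above implies that the Predictor can pick credentials up from the environment (or a .env file) when they are not passed as parameters:

from landingai.predict import Predictor
# Assumption: with LANDINGAI_API_KEY and LANDINGAI_API_SECRET exported in the
# shell (or present in a .env file), the Predictor can be created without
# passing api_key/api_secret explicitly.
predictor = Predictor("FILL_YOUR_INFERENCE_ENDPOINT_ID")
predictions = predictor.predict(image)  # image: numpy array, as in the Quick Start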

Documentation

  1. LANDING AI Python Library API Reference

  2. LANDING AI Python Library User Guide (coming soon)

  3. LANDING AI Platform Support Center

  4. Quick LandingLens Video Walk-Through

Running examples locally

All the examples in this repo can be run locally.

Here is how to run the rtsp-capture example locally in a shell environment (a minimal sketch of such a capture loop is shown after the steps):

  1. Clone the repo: git clone https://github.com/landing-ai/landingai-python.git
  2. Install the library: poetry install --with examples (NOTE: see below for how to install poetry)
  3. Activate the virtual environment: poetry shell
  4. Run: python landingai-python/examples/capture-service/run.py
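
For reference, below is a minimal sketch of what such a continuous capture-and-predict loop could look like. It is an illustration using OpenCV rather than the actual capture-service code, and the RTSP URL is a placeholder:

import cv2
from landingai.predict import Predictor

predictor = Predictor("FILL_YOUR_INFERENCE_ENDPOINT_ID", "FILL_YOUR_API_KEY", "FILL_YOUR_API_SECRET")

# Placeholder RTSP URL; replace with your camera's address
cap = cv2.VideoCapture("rtsp://user:password@192.168.0.10/stream1")
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # OpenCV yields BGR frames; convert to RGB before running inference
        frame_rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        predictions = predictor.predict(frame_rgb)
        print(predictions)
finally:
    cap.release()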

Building the landingai library locally (for contributors)

Most of the time you won't need to build the library, since it is included in this repository and also published to PyPI.

But if you want to contribute to the repo, you can follow the steps below.

Prerequisite - Install poetry

landingai uses Poetry for packaging and dependency management. If you want to build it from source, you have to install Poetry first. Please follow the official guide to see all possible options.

For Linux, macOS, Windows (WSL):

curl -sSL https://install.python-poetry.org | python3 -

NOTE: you can switch to a different Python version by specifying it when installing:

curl -sSL https://install.python-poetry.org | python3.10 -

or run the command below after you have installed Poetry:

poetry env use 3.10

Install all the dependencies

poetry install --all-extras

Run tests

poetry run pytest tests/

Activate the virtualenv

poetry shell


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

landingai-0.0.17.tar.gz (15.6 kB, Source)

Built Distribution

landingai-0.0.17-py3-none-any.whl (15.7 kB, Python 3)

File details

Details for the file landingai-0.0.17.tar.gz.

File metadata

  • Download URL: landingai-0.0.17.tar.gz
  • Upload date:
  • Size: 15.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.4.2 CPython/3.10.11 Linux/5.15.0-1037-azure

File hashes

Hashes for landingai-0.0.17.tar.gz:

  • SHA256: 63ab5d3e87d537f203cbb82f0e7583ec668f1e4cee6b0c38024ae5f299505876
  • MD5: 78fe2e06b420819c04f6bf0aa5443e4f
  • BLAKE2b-256: 674523026983db4030183b8535d58c07949079f1c47fc3adac9b1c2149a71d47

See more details on using hashes here.

File details

Details for the file landingai-0.0.17-py3-none-any.whl.

File metadata

  • Download URL: landingai-0.0.17-py3-none-any.whl
  • Upload date:
  • Size: 15.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.4.2 CPython/3.10.11 Linux/5.15.0-1037-azure

File hashes

Hashes for landingai-0.0.17-py3-none-any.whl:

  • SHA256: 164ec8e9d5775dced7139dab292812c0db64d5e926c6d98ed47f49c0837f9acf
  • MD5: c0206c7c2e954931ebf02ec553eae68c
  • BLAKE2b-256: 39363df32fe1f6da6be579b6f1a4bdba771f34ef71d6ab0aef35fd42e6d980fb

See more details on using hashes here.
