
cognite-ai

A set of AI tools for working with CDF (Cognite Data Fusion) in Python, including vector stores and intelligent data manipulation features leveraging large language models (LLMs).

Installation

This package is intended to be used in Jupyter notebooks and Streamlit apps hosted in Cognite Data Fusion. To get started, install the package using:

%pip install cognite-ai

MemoryVectorStore

The MemoryVectorStore lets you store and query vector embeddings created from text. It keeps everything in memory, so it is suited to use cases where the number of vectors is relatively small.

Example

You can create vectors from text (either as individual strings or as a list) and query them:

from cognite.ai import MemoryVectorStore
from cognite.client import CogniteClient

client = CogniteClient()
# Create a MemoryVectorStore instance
vector_store = MemoryVectorStore(client)

# Store text as vectors
vector_store.store_text("The compressor in unit 7B requires maintenance next week.")
vector_store.store_text("Pump 5A has shown signs of decreased efficiency.")
vector_store.store_text("Unit 9 is operating at optimal capacity.")

# Query vector store
vector_store.query_text("Which units require maintenance?")
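
If your texts are already collected in a list, you can also loop over them before querying. The snippet below is a minimal sketch that reuses the vector_store instance from the example above; it loops over store_text rather than assuming a particular signature for list input:

# Store several texts from a list (sketch; loops over store_text)
documents = [
    "Valve 7C was replaced during the last maintenance window.",
    "Compressor 3A vibration levels are within normal limits."
]
for doc in documents:
    vector_store.store_text(doc)

# Query against the enlarged store
vector_store.query_text("What happened to valve 7C?")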

Smart Data Tools

With cognite-ai, you can enhance your data workflows by integrating LLMs for intuitive querying and manipulation of data frames. The module is built on top of PandasAI and adds Cognite-specific features.

The Smart Data Tools come in three components:

1. Pandas Smart DataFrame
2. Pandas Smart DataLake
3. Pandas AI Agent

1. Pandas Smart DataFrame

SmartDataframe enables you to chat with individual data frames, using LLMs to query, summarize, and analyze your data conversationally.

Example

from cognite.ai import load_pandasai
from cognite.client import CogniteClient
import pandas as pd

# Load the necessary classes
client = CogniteClient()
SmartDataframe, SmartDatalake, Agent = await load_pandasai()

# Create demo data
workorders_df = pd.DataFrame({
    "workorder_id": ["WO001", "WO002", "WO003", "WO004", "WO005"],
    "description": [
        "Replace filter in compressor unit 3A",
        "Inspect and lubricate pump 5B",
        "Check pressure valve in unit 7C",
        "Repair leak in pipeline 4D",
        "Test emergency shutdown system"
    ],
    "priority": ["High", "Medium", "High", "Low", "Medium"]
})

# Create a SmartDataframe object
s_workorders_df = SmartDataframe(workorders_df, cognite_client=client)

# Chat with the dataframe
s_workorders_df.chat('Which 5 work orders are the most critical based on priority?')

Customizing LLM Parameters

You can configure the LLM parameters to control aspects like model selection and temperature.

params = {
    "model": "gpt-35-turbo",
    "temperature": 0.5
}

s_workorders_df = SmartDataframe(workorders_df, cognite_client=client, params=params)
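
With the parameters applied, the chat interface is unchanged; subsequent questions are answered using the configured model and temperature:

# Chat as before; answers are generated with the configured model and temperature
s_workorders_df.chat('Summarize the high-priority work orders.')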

2. Pandas Smart DataLake

SmartDatalake allows you to combine and query multiple data frames simultaneously, treating them as a unified data lake.

Example

from cognite.ai import load_pandasai
from cognite.client import CogniteClient
import pandas as pd

# Load the necessary classes
client = CogniteClient()
SmartDataframe, SmartDatalake, Agent = await load_pandasai()

# Create demo data
workorders_df = pd.DataFrame({
    "workorder_id": ["WO001", "WO002", "WO003"],
    "asset_id": ["A1", "A2", "A3"],
    "description": ["Replace filter", "Inspect pump", "Check valve"]
})
workitems_df = pd.DataFrame({
    "workitem_id": ["WI001", "WI002", "WI003"],
    "workorder_id": ["WO001", "WO002", "WO003"],
    "task": ["Filter replacement", "Pump inspection", "Valve check"]
})
assets_df = pd.DataFrame({
    "asset_id": ["A1", "A2", "A3"],
    "name": ["Compressor 3A", "Pump 5B", "Valve 7C"]
})

# Combine them into a smart lake
smart_lake_df = SmartDatalake([workorders_df, workitems_df, assets_df], cognite_client=client)

# Chat with the unified data lake
smart_lake_df.chat("Which assets have the most work orders associated with them?")
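
You can keep asking questions that span the individual frames. For example, with the demo data above, a question that requires joining work orders, work items, and assets might look like this:

# A question that requires combining all three data frames
smart_lake_df.chat("List the work item tasks for the work order on Compressor 3A.")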

3. Pandas AI Agent

The Agent provides conversational querying over a single data frame, allowing you to ask follow-up questions that build on earlier answers.

Example

from cognite.ai import load_pandasai
from cognite.client import CogniteClient
import pandas as pd

# Load the necessary classes
client = CogniteClient()
SmartDataframe, SmartDatalake, Agent = await load_pandasai()

# Create example data
sensor_readings_df = pd.DataFrame({
    "sensor_id": ["A1", "A2", "A3", "A4", "A5"],
    "temperature": [75, 80, 72, 78, 69],
    "pressure": [30, 35, 33, 31, 29],
    "status": ["Normal", "Warning", "Normal", "Warning", "Normal"]
})

# Create an Agent for the dataframe
agent = Agent(sensor_readings_df, cognite_client=client)

# Ask a question
print(agent.chat("Which sensors are showing a warning status?"))
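
Because the Agent keeps conversational context, you can follow up on the previous answer without restating it:

# Follow-up question that relies on the context of the previous answer
print(agent.chat("What are their pressure readings?"))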

Contributing

This package exists mainly to provide an in-memory vector store and to work around the installation problems users hit in Pyodide when installing pandasai, caused by dependencies that are not pure Python 3 wheels.

The current development cycle is rudimentary: copy the source code of this package into e.g. a Jupyter notebook in Fusion and verify that everything works there.
