
Datus Snowflake Adapter

Snowflake database adapter for Datus Agent, providing native Snowflake connector support.

Features

  • Native Snowflake SDK: Uses snowflake-connector-python for optimal performance
  • Full Snowflake Support: Databases, schemas, tables, views, and materialized views
  • Efficient Metadata Retrieval: Uses SHOW commands for fast metadata queries
  • Arrow-based Execution: High-performance query execution with Apache Arrow
  • Multiple Result Formats: CSV, Pandas DataFrame, Arrow Table, and Python list
  • Complete CRUD Operations: INSERT, UPDATE, DELETE, and DDL support

Installation

pip install datus-snowflake

This will automatically install the required dependencies:

  • datus-agent>=0.3.0
  • snowflake-connector-python>=3.6.0

Usage

Basic Connection

from datus_snowflake import SnowflakeConnector

# Create connector
connector = SnowflakeConnector(
    account="myaccount",
    user="myuser",
    password="mypassword",
    warehouse="my_warehouse",
    database="my_database",
    schema="my_schema"
)

# Test connection
result = connector.test_connection()
print(result)  # {'success': True, 'message': 'Connection successful', 'databases': ''}
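Hardcoded credentials are shown above for brevity; in practice you may prefer to read them from the environment. A minimal sketch — the SNOWFLAKE_* variable names here are only a convention for this example, not something the adapter requires:

```python
import os

# Read Snowflake credentials from the environment rather than hardcoding
# them. The SNOWFLAKE_* names are an assumption for this sketch; the
# fallback values mirror the example above.
conn_params = {
    "account": os.environ.get("SNOWFLAKE_ACCOUNT", "myaccount"),
    "user": os.environ.get("SNOWFLAKE_USER", "myuser"),
    "password": os.environ.get("SNOWFLAKE_PASSWORD", ""),
    "warehouse": os.environ.get("SNOWFLAKE_WAREHOUSE", "my_warehouse"),
    "database": os.environ.get("SNOWFLAKE_DATABASE", "my_database"),
    "schema": os.environ.get("SNOWFLAKE_SCHEMA", "my_schema"),
}

# connector = SnowflakeConnector(**conn_params)
```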

Execute Queries

# Execute query and get CSV result
result = connector.execute_query("SELECT * FROM users LIMIT 10")
print(result.sql_return)  # CSV string

# Execute query and get pandas DataFrame
result = connector.execute_query("SELECT * FROM users LIMIT 10", result_format="pandas")
df = result.sql_return
print(df.head())

# Execute query and get Arrow table
result = connector.execute_query("SELECT * FROM users LIMIT 10", result_format="arrow")
arrow_table = result.sql_return
print(arrow_table.schema)

Metadata Operations

# Get databases
databases = connector.get_databases()
print(f"Databases: {databases}")

# Get schemas
schemas = connector.get_schemas(database_name="my_database")
print(f"Schemas: {schemas}")

# Get tables
tables = connector.get_tables(database_name="my_database", schema_name="public")
print(f"Tables: {tables}")

# Get views
views = connector.get_views(database_name="my_database", schema_name="public")
print(f"Views: {views}")

# Get materialized views
mvs = connector.get_materialized_views(database_name="my_database", schema_name="public")
print(f"Materialized Views: {mvs}")

Get Table Schema

# Get table structure
schema = connector.get_schema(
    database_name="my_database",
    schema_name="public",
    table_name="users"
)

for column in schema[:-1]:  # Last item is table metadata
    print(f"{column['name']}: {column['type']} (nullable: {column['nullable']})")

Get DDL Definitions

# Get tables with DDL
tables_with_ddl = connector.get_tables_with_ddl(
    database_name="my_database",
    schema_name="public"
)

for table in tables_with_ddl:
    print(f"\nTable: {table['table_name']}")
    print(f"DDL:\n{table['definition']}")

# Get views with DDL
views_with_ddl = connector.get_views_with_ddl(
    database_name="my_database",
    schema_name="public"
)

# Get materialized views with DDL
mvs_with_ddl = connector.get_materialized_views_with_ddl(
    database_name="my_database",
    schema_name="public"
)

Get Sample Data

# Get sample rows from specific tables
samples = connector.get_sample_rows(
    tables=["users", "orders"],
    top_n=5,
    database_name="my_database",
    schema_name="public"
)

for sample in samples:
    print(f"\nTable: {sample['table_name']}")
    print(sample['sample_rows'])  # CSV format
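Since sample_rows is a CSV string, the standard library's csv module can turn it into structured rows if needed. A sketch using an illustrative payload — the actual columns depend on your table:

```python
import csv
import io

# Illustrative CSV payload of the kind returned in sample['sample_rows'];
# the real column names and values come from your table.
sample_csv = "id,name\n1,Alice\n2,Bob\n"

# DictReader maps each data row to a dict keyed by the header row.
reader = csv.DictReader(io.StringIO(sample_csv))
rows = list(reader)
print(rows[0]["name"])  # Alice
```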

CRUD Operations

# INSERT
result = connector.execute_insert(
    "INSERT INTO users (name, email) VALUES ('John', 'john@example.com')"
)
print(f"Inserted rows: {result.row_count}")

# UPDATE
result = connector.execute_update(
    "UPDATE users SET email = 'newemail@example.com' WHERE name = 'John'"
)
print(f"Updated rows: {result.row_count}")

# DELETE
result = connector.execute_delete(
    "DELETE FROM users WHERE name = 'John'"
)
print(f"Deleted rows: {result.row_count}")

# DDL
result = connector.execute_ddl(
    "CREATE TABLE test_table (id INT, name VARCHAR(100))"
)
print(f"DDL executed: {result.success}")

Context Switching

# Switch database
connector.do_switch_context(database_name="another_database")

# Switch schema
connector.do_switch_context(
    database_name="my_database",
    schema_name="another_schema"
)

Configuration with Datus Agent

When using with Datus Agent, the adapter is automatically discovered via entry points:

# config.yaml
database:
  type: snowflake
  account: myaccount
  username: myuser
  password: mypassword
  warehouse: my_warehouse
  database: my_database
  schema: my_schema

The adapter is loaded automatically when the configuration specifies type: snowflake.
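Entry-point discovery works through package metadata. The fragment below sketches how such a registration typically looks in pyproject.toml; the group name and object path are assumptions for illustration, not datus-snowflake's actual metadata:

```toml
# Illustrative pyproject.toml fragment: registering a plugin via entry
# points so a host application can discover it by name. The group
# "datus.connectors" is an assumption for this sketch.
[project.entry-points."datus.connectors"]
snowflake = "datus_snowflake:SnowflakeConnector"
```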

Architecture

This adapter:

  • Inherits from BaseSqlConnector in datus-agent
  • Uses native Snowflake connector for optimal performance
  • Implements all required abstract methods
  • Provides Snowflake-specific optimizations (SHOW commands, Arrow format)
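The inheritance model can be illustrated with a self-contained sketch. The real BaseSqlConnector lives in datus-agent and its interface will differ; the base class and method names below are stand-ins chosen for this example only:

```python
from abc import ABC, abstractmethod

# Stand-in for the adapter pattern described above. The real
# BaseSqlConnector is defined in datus-agent; this base class and its
# method names are illustrative only.
class BaseSqlConnectorSketch(ABC):
    @abstractmethod
    def execute_query(self, sql: str) -> list:
        """Run a query and return rows."""

    @abstractmethod
    def get_tables(self) -> list:
        """List table names in the current schema."""

class ToySnowflakeConnector(BaseSqlConnectorSketch):
    """Toy adapter: implements every abstract method with canned data."""

    def execute_query(self, sql: str) -> list:
        return [("ok",)]

    def get_tables(self) -> list:
        return ["users", "orders"]

connector = ToySnowflakeConnector()
print(connector.get_tables())  # ['users', 'orders']
```

A concrete adapter only becomes instantiable once every abstract method is implemented, which is how the host application can rely on a uniform connector interface.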

Development

# Install in development mode
cd datus-snowflake
pip install -e .

# Run tests
pytest tests/

License

Apache License 2.0
