
Client for the e6data distributed SQL Engine.

Project description

e6data Python Connector


Introduction

The e6data Connector for Python provides an interface for writing Python applications that can connect to e6data and perform operations.

To install the Python package, use the command below:

pip install e6data-python-connector

Prerequisites

  • Open Inbound Port 9000 in the Engine Cluster.
  • Limit access to Port 9000 according to your organizational security policy. Public access is not encouraged.
  • Generated Access Token in the e6data console.

Creating connection

Use your e6data email ID as the username and your access token as the password.

import e6xdb.e6x as edb

username = '<username>'  # Your e6data email ID.
password = '<password>'  # Generated access token from the e6data console.

host = '<host>'  # Host name or IP address of your cluster.
database = '<database>'  # Database name where you want to run queries.

port = 9000  # Engine port.

conn = edb.connect(
    host=host,
    port=port,
    username=username,
    database=database,
    password=password
)
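
Rather than hard-coding credentials, they can be collected from environment variables before calling connect. A minimal sketch; the `E6_*` variable names and the `connection_params` helper are assumptions for illustration, not part of the connector:

```python
import os

def connection_params():
    """Collect e6data connection settings from environment variables.

    The E6_* variable names used here are illustrative, not mandated
    by the connector.
    """
    return dict(
        host=os.environ['E6_HOST'],
        port=int(os.environ.get('E6_PORT', 9000)),  # Engine port defaults to 9000.
        username=os.environ['E6_USERNAME'],
        password=os.environ['E6_PASSWORD'],
        database=os.environ.get('E6_DATABASE', 'default'),
    )

# conn = edb.connect(**connection_params())
```

This keeps the access token out of source control and lets the same script run against different clusters.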

Performing query

query = 'SELECT * FROM <TABLE_NAME>'  # Replace with the actual query.

cursor = conn.cursor()
query_id = cursor.execute(query)  # execute returns a query ID, which can be used to abort the query.
all_records = cursor.fetchall()
for row in all_records:
    print(row)

To fetch all the records.

records = cursor.fetchall()

To fetch one record.

record = cursor.fetchone()

To fetch limited records.

limit = 500
records = cursor.fetchmany(limit)
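
For large result sets, fetchmany can be called in a loop so that only one batch is held in memory at a time. The connector exposes a DB-API style cursor, so the pattern below is demonstrated with Python's built-in sqlite3 driver as a stand-in (it runs without a cluster); the same loop applies to an e6data cursor:

```python
import sqlite3

# In-memory SQLite database standing in for an e6data cluster.
demo_conn = sqlite3.connect(':memory:')
demo_cur = demo_conn.cursor()
demo_cur.execute('CREATE TABLE t (n INTEGER)')
demo_cur.executemany('INSERT INTO t VALUES (?)', [(i,) for i in range(1050)])

demo_cur.execute('SELECT n FROM t')
total = 0
while True:
    batch = demo_cur.fetchmany(500)  # At most 500 rows held per batch.
    if not batch:  # fetchmany returns an empty list when exhausted.
        break
    total += len(batch)  # Process each batch here instead of accumulating all rows.
print(total)  # 1050
demo_conn.close()
```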

To get execution plan after query execution.

import json

query_planner = json.loads(cursor.explain_analyse())

To abort a running query.

query_id = '<query_id>'  # Query ID returned by the execute function.
cursor.cancel(query_id)

To switch the database in an existing connection.

database = '<new_database_name>'  # Replace with the new database.
cursor = conn.cursor(database)

Get Query Time Metrics

import json
query = 'SELECT * FROM <TABLE_NAME>'

cursor = conn.cursor()
query_id = cursor.execute(query)  # execute returns a query ID, which can be used to abort the query.
all_records = cursor.fetchall()

query_planner = json.loads(cursor.explain_analyse())

execution_time = query_planner.get("total_query_time")  # In milliseconds
queue_time = query_planner.get("executionQueueingTime")  # In milliseconds
parsing_time = query_planner.get("parsingTime")  # In milliseconds
row_count = query_planner.get('row_count_out')
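
Since explain_analyse returns a JSON string, the metric lookups above can be wrapped in a small helper that tolerates missing keys. A sketch; the `query_metrics` helper and the sample payload are illustrative (the key names are the ones shown above, the values are made up):

```python
import json

def query_metrics(planner_json):
    """Extract timing metrics (milliseconds) and row count from an explain_analyse payload."""
    planner = json.loads(planner_json)
    return {
        'execution_ms': planner.get('total_query_time'),
        'queueing_ms': planner.get('executionQueueingTime'),
        'parsing_ms': planner.get('parsingTime'),
        'rows_out': planner.get('row_count_out'),
    }

# Fabricated sample payload using the keys documented above.
sample = '{"total_query_time": 1234, "executionQueueingTime": 56, "parsingTime": 7, "row_count_out": 42}'
print(query_metrics(sample))
```

Using .get rather than indexing means a metric absent from a particular plan simply comes back as None instead of raising KeyError.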

Get list of databases, tables or columns

The following code lists the databases, tables, and columns available on the cluster currently in use. To get the list of all databases, call get_schema_names without passing a database name.

databases = conn.get_schema_names()  # To get list of databases.
print(databases)

database = '<database_name>'  # Replace with actual database name.
tables = conn.get_tables(database=database)  # To get list of tables from a database.
print(tables)

table_name = '<table_name>'  # Replace with actual table name.
columns = conn.get_columns(database=database, table=table_name)  # To get the list of columns from a table.

# Collect each column's name and type.
columns_with_type = []
for column in columns:
    columns_with_type.append(dict(column_name=column.fieldName, column_type=column.fieldType))
print(columns_with_type)

Code Hygiene

As a best practice, clear the cursor, close the cursor, and close the connection after running a query. This frees memory held by old results and improves performance.

cursor.clear() # Not needed when aborting a query
cursor.close()
conn.close()
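
The clean-up steps above can be made exception-safe with try/finally, so the cursor and connection are released even if the query raises. Demonstrated with sqlite3 as a stand-in driver, since it runs without a cluster; the clear() call is e6data-specific and is shown as a comment:

```python
import sqlite3

conn = sqlite3.connect(':memory:')  # Stand-in for edb.connect(...).
try:
    cursor = conn.cursor()
    try:
        cursor.execute('SELECT 1')
        result = cursor.fetchone()[0]
    finally:
        # cursor.clear()  # e6data-specific: not needed when aborting a query.
        cursor.close()
finally:
    conn.close()
print(result)  # 1
```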

Code Example

The following code is a complete end-to-end example.

import e6xdb.e6x as edb
import json

username = '<username>'  # Your e6data email ID.
password = '<password>'  # Generated access token from the e6data console.

host = '<host>'  # Host name or IP address of your cluster.
database = '<database>'  # Database name where you want to run queries.

port = 9000  # Engine port.

sql_query = 'SELECT * FROM <TABLE_NAME>'  # Replace with the actual query.

conn = edb.connect(
    host=host,
    port=port,
    username=username,
    database=database,
    password=password
)

cursor = conn.cursor(db_name=database)
query_id = cursor.execute(sql_query)
all_records = cursor.fetchall()
planner_result = json.loads(cursor.explain_analyse())
execution_time = planner_result.get("total_query_time") / 1000  # Converting into seconds.
row_count = planner_result.get('row_count_out')
columns = [col[0] for col in cursor.description]  # Get the column names and merge with the records.
results = []
for row in all_records:
    row = dict(zip(columns, row))
    results.append(row)
    print(row)
print('Total row count {}, Execution Time (seconds): {}'.format(row_count, execution_time))
cursor.clear()
cursor.close()
conn.close()


Download files


Source Distribution

e6data-python-connector-1.0.2.tar.gz (32.4 kB, source)

Built Distribution

e6data_python_connector-1.0.2-py3-none-any.whl (35.4 kB, Python 3)

File details

Details for the file e6data-python-connector-1.0.2.tar.gz.

File metadata

File hashes

Hashes for e6data-python-connector-1.0.2.tar.gz:

  • SHA256: f11a4bc72d9ffe7d059623a9f275e0253e8a9c4ef6427a3101ebf51c356d559f
  • MD5: 76f031ebd1b1aef46bbbfef318e30eb5
  • BLAKE2b-256: 474f0a3698cd28dcbed1f13de5e0540f9be13cc70afafd286409b9a952f5730d

File details

Details for the file e6data_python_connector-1.0.2-py3-none-any.whl.

File metadata

File hashes

Hashes for e6data_python_connector-1.0.2-py3-none-any.whl:

  • SHA256: ca2385be7e3b644a97d7d758ed4f1945272ae46e05de1a2d8e6df560f27faeeb
  • MD5: 23986e7f5a763d19f70deffad3f69a10
  • BLAKE2b-256: 48b7c3005a9003df4f8c208819365a30b5111405ebcd8ac81fe4da4dd895687c
