
Python Driver for Macrometa Global Edge Fabric

pyC8

Welcome to pyC8, a Python driver for the Macrometa Digital Edge Fabric.

Features

  • Clean Pythonic interface.
  • Lightweight.

Compatibility

  • Python versions 3.4, 3.5 and 3.6 are supported.

Build & Install

To build,

 $ python setup.py build

To install locally,

 $ python setup.py install
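
The package is also published on PyPI, so it may alternatively be installed with pip (assuming network access to PyPI):

```shell
pip install pyC8
```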

Getting Started

Here is an overview example:

   from c8 import C8Client
   import time
   import warnings
   warnings.filterwarnings("ignore")

   region = "qa1-us-east-1.eng2.macrometa.io"
   demo_tenant = "demo"
   demo_fabric = "demofabric"
   demo_user = "demouser"
   demo_collection = "employees"
   demo_stream = "demostream"

   #--------------------------------------------------------------
   print("Create C8Client Connection...")
   client = C8Client(protocol='https', host=region, port=443)

   #--------------------------------------------------------------
   print("Under the demo tenant, create the demo fabric and demo user, and assign permissions...")
   demotenant = client.tenant(name=demo_tenant, fabricname='_system', username='root', password='demo')

   if not demotenant.has_user(demo_user):
     demotenant.create_user(username=demo_user, password='demouser', active=True)

   if not demotenant.has_fabric(demo_fabric):
     demotenant.create_fabric(name=demo_fabric, dclist=demotenant.dclist(detail=False))

   demotenant.update_permission(username=demo_user, permission='rw', fabric=demo_fabric)

   #--------------------------------------------------------------
   print("Create and populate employees collection in demofabric...")
   fabric = client.fabric(tenant=demo_tenant, name=demo_fabric, username=demo_user, password='demouser')
   # Print fabric details.
   print(fabric.fabrics_detail())
   employees = fabric.create_collection(demo_collection) # Create a new collection named "employees".
   employees.add_hash_index(fields=['email'], unique=True) # Add a hash index to the collection.

   employees.insert({'firstname': 'Jean', 'lastname':'Picard', 'email':'jean.picard@macrometa.io'})
   employees.insert({'firstname': 'James', 'lastname':'Kirk', 'email':'james.kirk@macrometa.io'})
   employees.insert({'firstname': 'Han', 'lastname':'Solo', 'email':'han.solo@macrometa.io'})
   employees.insert({'firstname': 'Bruce', 'lastname':'Wayne', 'email':'bruce.wayne@macrometa.io'})

   # Insert data from a CSV file.
   # The path to the CSV file should be an absolute path.
   employees.insert_from_file("/path/to/data.csv")

   #--------------------------------------------------------------
   print("query employees collection...")
   cursor = fabric.c8ql.execute('FOR employee IN employees RETURN employee') # Execute a C8QL query
   docs = [document for document in cursor]
   print(docs)

   #--------------------------------------------------------------
   print("Create global & local streams in demofabric...")
   fabric.create_stream(demo_stream, local=False) # global stream
   fabric.create_stream(demo_stream, local=True)  # local stream

   streams = fabric.streams()
   print("streams:", streams)

   #--------------------------------------------------------------
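
The `insert_from_file` call above expects the CSV header row to supply the document field names. As a self-contained sketch (standard library only, independent of pyC8), this shows the document shape such a file produces:

```python
import csv
import io

# Hypothetical CSV content matching the employees collection above.
csv_data = """firstname,lastname,email
Jean,Picard,jean.picard@macrometa.io
James,Kirk,james.kirk@macrometa.io
"""

# Each CSV row becomes one dict keyed by the header row -- the same
# shape that insert() takes for a single document.
docs = list(csv.DictReader(io.StringIO(csv_data)))
print(docs[0]['email'])  # prints "jean.picard@macrometa.io"
```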

Example to query a given fabric:

  from c8 import C8Client
  import json
  import warnings
  warnings.filterwarnings("ignore")

  region = "qa1-us-east-1.ops.aws.macrometa.io"

  #--------------------------------------------------------------
  print("query employees collection...")
  client = C8Client(protocol='https', host=region, port=443)
  fabric = client.fabric(tenant="demotenant", name="demofabric", username="demouser", password='poweruser')
  cursor = fabric.c8ql.execute('FOR employee IN employees RETURN employee') # Execute a C8QL query
  docs = [document for document in cursor]
  print(docs)
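
Cursor results are plain Python dictionaries, so they compose directly with the standard library. A small sketch of client-side post-processing (the sample documents below are assumptions for illustration, not server output):

```python
import json

# Sample documents of the shape the employees query returns (assumed).
docs = [
    {'firstname': 'Jean', 'lastname': 'Picard', 'email': 'jean.picard@macrometa.io'},
    {'firstname': 'James', 'lastname': 'Kirk', 'email': 'james.kirk@macrometa.io'},
]

# Filter client-side and serialize to JSON.
kirks = [d for d in docs if d['lastname'] == 'Kirk']
payload = json.dumps(kirks, indent=2)
print(payload)
```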

Example for real-time updates from a collection in a fabric:

  from c8 import C8Client
  import warnings
  warnings.filterwarnings("ignore")

  region = "qa1-us-east-1.ops.aws.macrometa.io"

  def callback_fn(event):
      print(event)

  #--------------------------------------------------------------
  print("Subscribe to employees collection...")
  client = C8Client(protocol='https', host=region, port=443)
  fabric = client.fabric(tenant="demotenant", name="demofabric", username="demouser", password='poweruser')
  fabric.on_change("employees", callback=callback_fn)
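
`on_change` invokes the callback once per change event. Assuming the event arrives as a dict-like payload (a hypothetical shape used only for illustration; check the driver documentation for the exact fields), a callback that collects events for later inspection might look like:

```python
# Collect change events for later inspection. The 'payload' key below is
# a hypothetical field name, not a documented part of the event.
events = []

def collecting_callback(event):
    events.append(event)
    print("change received:", event)

# Simulated invocation, as the driver would do on each collection change.
collecting_callback({'payload': {'firstname': 'Jean'}})
```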

Example to publish documents to a stream:

  from c8 import C8Client
  import time
  import warnings
  warnings.filterwarnings("ignore")

  region = "qa1-us-east-1.ops.aws.macrometa.io"

  #--------------------------------------------------------------
  print("publish messages to stream...")
  client = C8Client(protocol='https', host=region, port=443)
  fabric = client.fabric(tenant="demotenant", name="demofabric", username="demouser", password='poweruser')
  stream = fabric.stream()
  producer = stream.create_producer("demostream", local=False)
  for i in range(10):
      msg = "Hello from " + region + "("+ str(i) +")"
      producer.send(msg.encode('utf-8'))
      time.sleep(10) # 10 sec
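
`producer.send` takes bytes, so structured documents need to be serialized first. A standard-library sketch of the encode/decode round trip (JSON is a choice here, not a driver requirement):

```python
import json

doc = {'firstname': 'Han', 'lastname': 'Solo'}

# Serialize the document to UTF-8 bytes for producer.send(...).
wire = json.dumps(doc).encode('utf-8')

# A subscriber receiving msg.data() would reverse the steps.
received = json.loads(wire.decode('utf-8'))
print(received['lastname'])  # prints "Solo"
```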

Example to subscribe to documents from a stream:

   from c8 import C8Client
   import warnings
   warnings.filterwarnings("ignore")

   region = "qa1-us-east-1.ops.aws.macrometa.io"

   #--------------------------------------------------------------
   print("consume messages from stream...")
   client = C8Client(protocol='https', host=region, port=443)
   fabric = client.fabric(tenant="demotenant", name="demofabric", username="demouser", password='poweruser')
   stream = fabric.stream()
   # You can choose the consumer type via the consumer_type option.
   subscriber = stream.subscribe("demostream", local=False, subscription_name="demosub", consumer_type=stream.CONSUMER_TYPES.EXCLUSIVE)
   for i in range(10):
       msg = subscriber.receive()
       print("Received message '{}' id='{}'".format(msg.data(), msg.message_id()))
       subscriber.acknowledge(msg)

Example of stream management operations:

    stream_collection = fabric.stream()
    #get_stream_stats
    stream_collection.get_stream_stats('demostream', local=False) #for global persistent stream

    #Skip all messages on a stream subscription
    stream_collection.skip_all_messages_for_subscription('demostream', 'demosub')

    #Skip num messages on a stream subscription
    stream_collection.skip_messages_for_subscription('demostream', 'demosub', 10)

    #Expire messages for a given subscription of a stream.
    #expire time is in seconds
    stream_collection.expire_messages_for_subscription('demostream', 'demosub', 2)

    #Expire messages on all subscriptions of stream
    stream_collection.expire_messages_for_subscriptions('demostream',2)

    #Reset subscription to the message position closest to the given timestamp
    #time is in milliseconds
    stream_collection.reset_message_subscription_by_timestamp('demostream','demosub', 5)

    #Reset subscription to message position closest to given position
    #stream_collection.reset_message_for_subscription('demostream', 'demosub')

    #stream_collection.reset_message_subscription_by_position('demostream','demosub', 4)

    #Trigger compaction on a stream
    stream_collection.put_stream_compaction_status('demostream')

    #get stream compaction status
    stream_collection.get_stream_compaction_status('demostream')

    #Unsubscribe the given subscription on all streams in a stream fabric
    stream_collection.unsubscribe('demosub')

    #delete subscription of a stream
    #stream_collection.delete_stream_subscription('demostream', 'demosub' , local=False)
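
`reset_message_subscription_by_timestamp` takes a timestamp in milliseconds; the literal `5` above is only a placeholder. A current epoch-millisecond value can be computed with the standard library:

```python
import time

# Current Unix time in milliseconds, e.g. for
# reset_message_subscription_by_timestamp('demostream', 'demosub', timestamp_ms).
timestamp_ms = int(time.time() * 1000)
print(timestamp_ms)
```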

Workflow of Spot Collections

  from c8 import C8Client

  # Initialize the client for C8DB.
  client = C8Client(protocol='http', host='localhost', port=8529)

  # Step 1: Make one of the regions in the federation a spot region.
  # Connect as the system admin (replace the placeholder credentials).
  sys_tenant = client.tenant(name='macrometa-admin', fabricname='_system', username='root', password='macrometa-password')

  # Make REGION-1 a spot region.
  sys_tenant.assign_dc_spot('REGION-1', spot_region=True)

  # Make REGION-2 a spot region.
  sys_tenant.assign_dc_spot('REGION-2', spot_region=True)

  # Step 2: Create a geo-fabric and pass one of the spot regions. You can use
  # SPOT_CREATION_TYPES for this. If you use AUTOMATIC, the system assigns a
  # random spot region. If you specify None, the geo-fabric is created without
  # spot properties. If you specify a spot region, pass it in the spot_dc
  # parameter.
  dcl = sys_tenant.dclist(detail=False)
  fabric = client.fabric(tenant='guest', name='_system', username='root', password='guest')
  fabric.create_fabric('spot-geo-fabric', dclist=dcl, spot_creation_type=fabric.SPOT_CREATION_TYPES.SPOT_REGION, spot_dc='REGION-1')

  # Step 3: Create a spot collection in 'spot-geo-fabric'.
  spot_collection = fabric.create_collection('spot-collection', spot_collection=True)

  # Step 4: Update the spot primary region of the geo-fabric. Changing it
  # requires system admin credentials.
  sys_fabric = client.fabric(tenant='macrometa-admin', name='_system', username='root', password='macrometa-password')
  sys_fabric.update_spot_region('guest', 'spot-geo-fabric', 'REGION-2')

Example for restql operations:

  from c8 import C8Client
  import json
  import warnings
  warnings.filterwarnings("ignore")

  region = "qa1-us-east-1.ops.aws.macrometa.io"

  client = C8Client(protocol='https', host=region, port=443)
  demotenant = client.tenant(name="demo_tenant", fabricname='_system',
                             username='root', password='demo')
  #--------------------------------------------------------------
  print("save restql...")
  data = {
    "query": {
      "parameter": {},
      "name": "demo",
      "value": "FOR employee IN employees RETURN employee"
    }
  }
  response = demotenant.save_restql(data)
  #--------------------------------------------------------------
  print("execute restql without bindVars...")
  response = demotenant.execute_restql("demo")
  #--------------------------------------------------------------
  print("execute restql with bindVars...")
  response = demotenant.execute_restql("demo",
                                       {"bindVars": {"name": "guest.root"}})
  #--------------------------------------------------------------
  print("get all restql...")
  response = demotenant.get_all_restql()
  #--------------------------------------------------------------
  print("update restql...")
  data = {
    "query": {
      "parameter": {},
      "value": "FOR employee IN employees FILTER employee.name == @name RETURN employee"
    }
  }
  response = demotenant.update_restql("demo", data)
  #--------------------------------------------------------------
  print("delete restql...")
  response = demotenant.delete_restql("demo")
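
The `save_restql` payload is a plain dict, so it can be assembled and checked before the call. A minimal sketch, where the helper function name is hypothetical but the required keys mirror the example above:

```python
import json

def make_restql_payload(name, query, parameters=None):
    """Build the dict shape used by save_restql above (helper is illustrative)."""
    return {
        "query": {
            "parameter": parameters or {},
            "name": name,
            "value": query,
        }
    }

data = make_restql_payload("demo", "FOR employee IN employees RETURN employee")
print(json.dumps(data))
```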

