Turn complex GraphQL queries into optimized database queries.

Project description


pip install graphql-compiler

Quick Overview

GraphQL compiler is a library that simplifies data querying and exploration by exposing one simple query language, written using GraphQL syntax, to target multiple database backends. It currently supports OrientDB and multiple SQL database management systems, such as PostgreSQL, MSSQL, and MySQL.

For a detailed overview, see our blog post. To get started, see our Read the Docs documentation. To contribute, please see our contributing guide.

Examples

OrientDB

from graphql.utils.schema_printer import print_schema
from graphql_compiler import (
    get_graphql_schema_from_orientdb_schema_data, graphql_to_match
)
from graphql_compiler.schema.schema_info import CommonSchemaInfo
from graphql_compiler.schema_generation.orientdb.utils import ORIENTDB_SCHEMA_RECORDS_QUERY
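
# The client helper used in Step 1 below is not part of graphql-compiler. A minimal
# sketch, assuming a local OrientDB server and the third-party pyorient driver:
import pyorient

def your_function_that_returns_an_orientdb_client():
    # Connect to an assumed local server and open the hypothetical Animals database.
    client = pyorient.OrientDB('localhost', 2424)
    client.connect('root', 'root_password')
    client.db_open('Animals', 'root', 'root_password')
    return client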

# Step 1: Get schema metadata from hypothetical Animals database.
client = your_function_that_returns_an_orientdb_client()
schema_records = client.command(ORIENTDB_SCHEMA_RECORDS_QUERY)
schema_data = [record.oRecordData for record in schema_records]

# Step 2: Generate GraphQL schema from metadata.
schema, type_equivalence_hints = get_graphql_schema_from_orientdb_schema_data(schema_data)

print(print_schema(schema))
# schema {
#    query: RootSchemaQuery
# }
#
# directive @filter(op_name: String!, value: [String!]!) on FIELD | INLINE_FRAGMENT
#
# directive @tag(tag_name: String!) on FIELD
#
# directive @output(out_name: String!) on FIELD
#
# directive @output_source on FIELD
#
# directive @optional on FIELD
#
# directive @recurse(depth: Int!) on FIELD
#
# directive @fold on FIELD
#
# type Animal {
#     name: String
#     net_worth: Int
#     limbs: Int
# }
#
# type RootSchemaQuery {
#     Animal: [Animal]
# }

# Step 3: Write GraphQL query that returns the names of all animals with a certain net worth.
# Note that we prefix net_worth with '$' and surround it with quotes to indicate it's a parameter.
graphql_query = '''
{
    Animal {
        name @output(out_name: "animal_name")
        net_worth @filter(op_name: "=", value: ["$net_worth"])
    }
}
'''
parameters = {
    'net_worth': '100',
}

# Step 4: Use autogenerated GraphQL schema to compile query into the target database language.
common_schema_info = CommonSchemaInfo(schema, type_equivalence_hints)
compilation_result = graphql_to_match(common_schema_info, graphql_query, parameters)
print(compilation_result.query)
# SELECT Animal___1.name AS `animal_name`
# FROM  ( MATCH  { class: Animal, where: ((net_worth = decimal("100"))), as: Animal___1 }
# RETURN $matches)
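
The compiled MATCH query can be run with the same OrientDB client used in Step 1. A minimal sketch, assuming a pyorient-style client whose command() call returns records that expose their fields via oRecordData:

# Step 5 (optional): Execute the compiled query against the database.
rows = client.command(compilation_result.query)
query_results = [row.oRecordData for row in rows]
# Each result is a dict keyed by the @output name, e.g. {'animal_name': ...}.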

SQL

from graphql_compiler import get_sqlalchemy_schema_info, graphql_to_sql
from sqlalchemy import MetaData, create_engine

engine = create_engine('<connection string>')

# Reflect the default database schema. Each table must have a primary key. Otherwise see:
# https://graphql-compiler.readthedocs.io/en/latest/supported_databases/sql.html#including-tables-without-explicitly-enforced-primary-keys
metadata = MetaData(bind=engine)
metadata.reflect()

# Wrap the schema information into a SQLAlchemySchemaInfo object.
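# The second (empty) argument is where edges joining the reflected tables would be
# declared; none are needed for this single-table example.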
sql_schema_info = get_sqlalchemy_schema_info(metadata.tables, {}, engine.dialect)

# Write GraphQL query.
graphql_query = '''
{
    Animal {
        name @output(out_name: "animal_name")
    }
}
'''
parameters = {}

# Compile and execute query.
compilation_result = graphql_to_sql(sql_schema_info, graphql_query, parameters)
query_results = [dict(row) for row in engine.execute(compilation_result.query)]
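
The same @filter directives shown in the OrientDB example also compile to SQL. Below is a minimal sketch of a parameterized variant, assuming the reflected database exposes a net_worth column on the Animal table:

graphql_query = '''
{
    Animal {
        name @output(out_name: "animal_name")
        net_worth @filter(op_name: ">=", value: ["$min_net_worth"])
    }
}
'''
parameters = {'min_net_worth': 100}

compilation_result = graphql_to_sql(sql_schema_info, graphql_query, parameters)
query_results = [dict(row) for row in engine.execute(compilation_result.query)]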

License

Licensed under the Apache 2.0 License. Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an “AS IS” BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Copyright 2017-present Kensho Technologies, LLC. The present date is determined by the timestamp of the most recent commit in the repository.

Project details



Download files

Download the file for your platform.

Source Distribution

graphql-compiler-2.0.0.dev27.tar.gz (715.7 kB)


Built Distribution

graphql_compiler-2.0.0.dev27-py2.py3-none-any.whl (842.2 kB)


File details

Details for the file graphql-compiler-2.0.0.dev27.tar.gz.

File metadata

  • Download URL: graphql-compiler-2.0.0.dev27.tar.gz
  • Upload date:
  • Size: 715.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.48.0 CPython/3.8.2

File hashes

Hashes for graphql-compiler-2.0.0.dev27.tar.gz
Algorithm Hash digest
SHA256 d147e955c7e8e53376b082873d2aef38e4fff251d6c258264f585e648af52c2b
MD5 6f2d736d261010afd6117f54c65e5f31
BLAKE2b-256 3596047df77b24d6c8b5fb02ccd51d1349f4fd92e74a9070c80610bd8cb46371


File details

Details for the file graphql_compiler-2.0.0.dev27-py2.py3-none-any.whl.

File metadata

  • Download URL: graphql_compiler-2.0.0.dev27-py2.py3-none-any.whl
  • Upload date:
  • Size: 842.2 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.48.0 CPython/3.8.2

File hashes

Hashes for graphql_compiler-2.0.0.dev27-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 390411e3dd6e923e09b4a34a4fb0bb32e8699939fefe78746e9a7ccb430602c2
MD5 8bf537ba927475b940a11bd1af2554b7
BLAKE2b-256 da96159050f6376ffe3ef221dc43b114d913c11465fa70e76425ecda17df6fbf

