
Project description


Turn complex GraphQL queries into optimized database queries.

pip install graphql-compiler

Quick Overview

GraphQL compiler is a library that simplifies data querying and exploration by exposing one simple query language, written using GraphQL syntax, that targets multiple database backends. It currently supports OrientDB and several SQL database management systems, such as PostgreSQL, MSSQL, and MySQL.

For a detailed overview, see our blog post. To get started, see our Read the Docs documentation. To contribute, please see our contributing guide.

Examples

OrientDB

from graphql.utils.schema_printer import print_schema
from graphql_compiler import (
    get_graphql_schema_from_orientdb_schema_data, graphql_to_match
)
from graphql_compiler.schema.schema_info import CommonSchemaInfo
from graphql_compiler.schema_generation.orientdb.utils import ORIENTDB_SCHEMA_RECORDS_QUERY

# Step 1: Get schema metadata from hypothetical Animals database.
client = your_function_that_returns_an_orientdb_client()
schema_records = client.command(ORIENTDB_SCHEMA_RECORDS_QUERY)
schema_data = [record.oRecordData for record in schema_records]

# Step 2: Generate GraphQL schema from metadata.
schema, type_equivalence_hints = get_graphql_schema_from_orientdb_schema_data(schema_data)

print(print_schema(schema))
# schema {
#    query: RootSchemaQuery
# }
#
# directive @filter(op_name: String!, value: [String!]!) on FIELD | INLINE_FRAGMENT
#
# directive @tag(tag_name: String!) on FIELD
#
# directive @output(out_name: String!) on FIELD
#
# directive @output_source on FIELD
#
# directive @optional on FIELD
#
# directive @recurse(depth: Int!) on FIELD
#
# directive @fold on FIELD
#
# type Animal {
#     name: String
#     net_worth: Int
#     limbs: Int
# }
#
# type RootSchemaQuery {
#     Animal: [Animal]
# }

# Step 3: Write GraphQL query that returns the names of all animals with a certain net worth.
# Note that we prefix net_worth with '$' and surround it with quotes to indicate it's a parameter.
graphql_query = '''
{
    Animal {
        name @output(out_name: "animal_name")
        net_worth @filter(op_name: "=", value: ["$net_worth"])
    }
}
'''
parameters = {
    'net_worth': '100',
}

# Step 4: Use autogenerated GraphQL schema to compile query into the target database language.
common_schema_info = CommonSchemaInfo(schema, type_equivalence_hints)
compilation_result = graphql_to_match(common_schema_info, graphql_query, parameters)
print(compilation_result.query)
# SELECT Animal___1.name AS `animal_name`
# FROM  ( MATCH  { class: Animal, where: ((net_worth = decimal("100"))), as: Animal___1 }
# RETURN $matches)
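
The compiled MATCH query can then be run with the same OrientDB client used in Step 1. The sketch below is illustrative and not part of the original example: it assumes a pyorient-style client whose command() method returns records exposing oRecordData, exactly as in Step 1, and that the parameter values have already been inlined into the compiled query, as the printed output above suggests.

# Step 5 (illustrative): Execute the compiled query and collect the output columns.
match_records = client.command(compilation_result.query)
query_results = [record.oRecordData for record in match_records]
# Each entry is a dict keyed by the @output names, e.g. {'animal_name': ...}.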

SQL

from graphql_compiler import get_sqlalchemy_schema_info, graphql_to_sql
from sqlalchemy import MetaData, create_engine

engine = create_engine('<connection string>')

# Reflect the default database schema. Each table must have a primary key. Otherwise see:
# https://graphql-compiler.readthedocs.io/en/latest/supported_databases/sql.html#including-tables-without-explicitly-enforced-primary-keys
metadata = MetaData(bind=engine)
metadata.reflect()

# Wrap the schema information into a SQLAlchemySchemaInfo object.
sql_schema_info = get_sqlalchemy_schema_info(metadata.tables, {}, engine.dialect)

# Write GraphQL query.
graphql_query = '''
{
    Animal {
        name @output(out_name: "animal_name")
    }
}
'''
parameters = {}

# Compile and execute query.
compilation_result = graphql_to_sql(sql_schema_info, graphql_query, parameters)
query_results = [dict(row) for row in engine.execute(compilation_result.query)]
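
The @filter directive and parameter pattern from the OrientDB example carry over to the SQL backends. Below is a minimal sketch, not part of the original example: it assumes the reflected schema exposes an Animal type with an integer net_worth column, and reuses the compile-and-execute calls shown above.

# Illustrative only: return the names of animals with a given net worth,
# assuming the reflected Animal table has an integer net_worth column.
graphql_query = '''
{
    Animal {
        name @output(out_name: "animal_name")
        net_worth @filter(op_name: "=", value: ["$net_worth"])
    }
}
'''
parameters = {
    'net_worth': 100,  # parameter type should match the assumed column type
}

compilation_result = graphql_to_sql(sql_schema_info, graphql_query, parameters)
query_results = [dict(row) for row in engine.execute(compilation_result.query)]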

License

Licensed under the Apache 2.0 License. Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an “AS IS” BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Copyright 2017-present Kensho Technologies, LLC.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

graphql-compiler-2.0.0.dev29.tar.gz (720.5 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

graphql_compiler-2.0.0.dev29-py2.py3-none-any.whl (847.5 kB)

Uploaded Python 2, Python 3

File details

Details for the file graphql-compiler-2.0.0.dev29.tar.gz.

File metadata

  • Download URL: graphql-compiler-2.0.0.dev29.tar.gz
  • Upload date:
  • Size: 720.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.48.0 CPython/3.8.2

File hashes

Hashes for graphql-compiler-2.0.0.dev29.tar.gz
Algorithm Hash digest
SHA256 9b68dc4f61d8a8cf9b5b8e0211a63db18a684b34e7635838b484562aa123c1d3
MD5 c41d3ced73cd441a4901019cc8d0157c
BLAKE2b-256 cff96c9917106ca2f1e65778b9d090a76403b2eeecef88d33df49c9fd1c5c76d

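As an illustration, the SHA256 digest above can be checked against a locally downloaded copy of the sdist using only the Python standard library; the file path below is an assumption about where the archive was saved.

import hashlib

# Illustrative only: compare a downloaded archive against the published SHA256 digest.
expected_sha256 = "9b68dc4f61d8a8cf9b5b8e0211a63db18a684b34e7635838b484562aa123c1d3"
with open("graphql-compiler-2.0.0.dev29.tar.gz", "rb") as f:  # hypothetical local path
    actual_sha256 = hashlib.sha256(f.read()).hexdigest()
assert actual_sha256 == expected_sha256, "Hash mismatch: do not install this file."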

File details

Details for the file graphql_compiler-2.0.0.dev29-py2.py3-none-any.whl.

File metadata

  • Download URL: graphql_compiler-2.0.0.dev29-py2.py3-none-any.whl
  • Upload date:
  • Size: 847.5 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.48.0 CPython/3.8.2

File hashes

Hashes for graphql_compiler-2.0.0.dev29-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 83a49c6617d00fbe722d1203b9a9fb78860bc8a7cb0232f5b82fa1ca9726b890
MD5 a6c2fab29cae9ae8454364af07206783
BLAKE2b-256 df14defa205f228cac2f12f2911026bda225be98421b3e15a192c794228ed821

