
Project description

Turn complex GraphQL queries into optimized database queries.

pip install graphql-compiler

Quick Overview

GraphQL compiler is a library that simplifies data querying and exploration by exposing one simple query language, written using GraphQL syntax, that targets multiple database backends. It currently supports OrientDB and several SQL database management systems, such as PostgreSQL, MSSQL, and MySQL.

For a detailed overview, see our blog post. To get started, see our Read the Docs documentation. To contribute, please see our contributing guide.

Examples

OrientDB

from graphql.utils.schema_printer import print_schema
from graphql_compiler import (
    get_graphql_schema_from_orientdb_schema_data, graphql_to_match
)
from graphql_compiler.schema.schema_info import CommonSchemaInfo
from graphql_compiler.schema_generation.orientdb.utils import ORIENTDB_SCHEMA_RECORDS_QUERY

# Step 1: Get schema metadata from hypothetical Animals database.
client = your_function_that_returns_an_orientdb_client()
schema_records = client.command(ORIENTDB_SCHEMA_RECORDS_QUERY)
schema_data = [record.oRecordData for record in schema_records]
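# A possible implementation of the client function above, sketched with the
# pyorient driver; the host, port, credentials, and database name here are
# hypothetical placeholders:
#
#     import pyorient
#
#     def your_function_that_returns_an_orientdb_client():
#         client = pyorient.OrientDB('localhost', 2424)
#         client.connect('root', 'your_password')
#         client.db_open('Animals', 'root', 'your_password')
#         return client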

# Step 2: Generate GraphQL schema from metadata.
schema, type_equivalence_hints = get_graphql_schema_from_orientdb_schema_data(schema_data)

print(print_schema(schema))
# schema {
#    query: RootSchemaQuery
# }
#
# directive @filter(op_name: String!, value: [String!]!) on FIELD | INLINE_FRAGMENT
#
# directive @tag(tag_name: String!) on FIELD
#
# directive @output(out_name: String!) on FIELD
#
# directive @output_source on FIELD
#
# directive @optional on FIELD
#
# directive @recurse(depth: Int!) on FIELD
#
# directive @fold on FIELD
#
# type Animal {
#     name: String
#     net_worth: Int
#     limbs: Int
# }
#
# type RootSchemaQuery {
#     Animal: [Animal]
# }

# Step 3: Write GraphQL query that returns the names of all animals with a certain net worth.
# Note that we prefix net_worth with '$' and surround it with quotes to indicate it's a parameter.
graphql_query = '''
{
    Animal {
        name @output(out_name: "animal_name")
        net_worth @filter(op_name: "=", value: ["$net_worth"])
    }
}
'''
parameters = {
    'net_worth': '100',
}

# Step 4: Use autogenerated GraphQL schema to compile query into the target database language.
common_schema_info = CommonSchemaInfo(schema, type_equivalence_hints)
compilation_result = graphql_to_match(common_schema_info, graphql_query, parameters)
print(compilation_result.query)
# SELECT Animal___1.name AS `animal_name`
# FROM  ( MATCH  { class: Animal, where: ((net_worth = decimal("100"))), as: Animal___1 }
# RETURN $matches)
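
The compiled MATCH query can then be run directly against OrientDB. Below is a minimal sketch, assuming a pyorient-style client as in Step 1 (the driver API is an assumption, not part of graphql-compiler); note that the provided parameters are already inlined into the compiled query, as shown above.

# Step 5 (optional): Execute the compiled query and collect the output rows.
match_records = client.command(compilation_result.query)
rows = [record.oRecordData for record in match_records]
# Each row is a dict keyed by the @output names, e.g. 'animal_name'.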

SQL

from graphql_compiler import get_sqlalchemy_schema_info, graphql_to_sql
from sqlalchemy import MetaData, create_engine

engine = create_engine('<connection string>')

# Reflect the default database schema. Each table must have a primary key. Otherwise see:
# https://graphql-compiler.readthedocs.io/en/latest/supported_databases/sql.html#including-tables-without-explicitly-enforced-primary-keys
metadata = MetaData(bind=engine)
metadata.reflect()

# Wrap the schema information into a SQLAlchemySchemaInfo object.
sql_schema_info = get_sqlalchemy_schema_info(metadata.tables, {}, engine.dialect)

# Write GraphQL query.
graphql_query = '''
{
    Animal {
        name @output(out_name: "animal_name")
    }
}
'''
parameters = {}

# Compile and execute query.
compilation_result = graphql_to_sql(sql_schema_info, graphql_query, parameters)
query_results = [dict(row) for row in engine.execute(compilation_result.query)]
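
Parameterized filters work the same way against SQL backends. Below is a minimal sketch, assuming the reflected schema includes an Animal table with a net_worth column (mirroring the OrientDB example above):

# Write a filtered GraphQL query; 'net_worth' is assumed to be a column on the
# reflected Animal table.
filtered_graphql_query = '''
{
    Animal {
        name @output(out_name: "animal_name")
        net_worth @filter(op_name: ">=", value: ["$min_net_worth"])
    }
}
'''
filtered_parameters = {'min_net_worth': 100}

# Compile and execute the filtered query just like before.
compilation_result = graphql_to_sql(sql_schema_info, filtered_graphql_query, filtered_parameters)
query_results = [dict(row) for row in engine.execute(compilation_result.query)]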

License

Licensed under the Apache 2.0 License. Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an “AS IS” BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Copyright 2017-present Kensho Technologies, LLC. The present date is determined by the timestamp of the most recent commit in the repository.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

graphql-compiler-2.0.0.dev28.tar.gz (717.7 kB, Source)

Built Distribution

graphql_compiler-2.0.0.dev28-py2.py3-none-any.whl (844.7 kB, Python 2 / Python 3)

File details

Details for the file graphql-compiler-2.0.0.dev28.tar.gz.

File metadata

  • Download URL: graphql-compiler-2.0.0.dev28.tar.gz
  • Upload date:
  • Size: 717.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.48.0 CPython/3.8.2

File hashes

Hashes for graphql-compiler-2.0.0.dev28.tar.gz

  • SHA256: c1c41285e34e436217285298b2ccb6d4374f99bd9369f9c527d5ec34e0e8819a
  • MD5: dbed885e24af56780035210d9bb80b64
  • BLAKE2b-256: 171a8c4edef3f860c3bde15a4fb6458dfe45163db7324139ed721bb7862afe9f

See more details on using hashes here.
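
For example, a downloaded copy of the sdist can be verified against the SHA256 digest above with Python's standard hashlib module (the local file name below is assumed to match the download):

import hashlib

# Hypothetical local path to the downloaded source distribution.
sdist_path = 'graphql-compiler-2.0.0.dev28.tar.gz'

with open(sdist_path, 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# Compare against the SHA256 digest published above.
expected = 'c1c41285e34e436217285298b2ccb6d4374f99bd9369f9c527d5ec34e0e8819a'
assert digest == expected, 'SHA256 mismatch: the file may be corrupted or tampered with'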

File details

Details for the file graphql_compiler-2.0.0.dev28-py2.py3-none-any.whl.

File metadata

  • Download URL: graphql_compiler-2.0.0.dev28-py2.py3-none-any.whl
  • Upload date:
  • Size: 844.7 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.48.0 CPython/3.8.2

File hashes

Hashes for graphql_compiler-2.0.0.dev28-py2.py3-none-any.whl

  • SHA256: f3276ab3dcd4788b6c23e482acf9613c3f7e8823751b94d1c62075ca03e85b00
  • MD5: ef5c7b541dac72a1dd96a160e2ea82f0
  • BLAKE2b-256: afb21b4bf8d74ca75d955caa9502e71d2191eb229edef7b0f7c44be2211cce3d

See more details on using hashes here.
