
Project description


Turn complex GraphQL queries into optimized database queries.

pip install graphql-compiler

Quick Overview

GraphQL compiler is a library that simplifies data querying and exploration by exposing one simple query language, written in GraphQL syntax, that targets multiple database backends. It currently supports OrientDB and multiple SQL database management systems, such as PostgreSQL, MSSQL, and MySQL.

For a detailed overview, see our blog post. To get started, see our Read the Docs documentation. To contribute, please see our contributing guide.

Examples

OrientDB

from graphql.utils.schema_printer import print_schema
from graphql_compiler import (
    get_graphql_schema_from_orientdb_schema_data, graphql_to_match
)
from graphql_compiler.schema.schema_info import CommonSchemaInfo
from graphql_compiler.schema_generation.orientdb.utils import ORIENTDB_SCHEMA_RECORDS_QUERY

# Step 1: Get schema metadata from a hypothetical Animals database.
client = your_function_that_returns_an_orientdb_client()
schema_records = client.command(ORIENTDB_SCHEMA_RECORDS_QUERY)
schema_data = [record.oRecordData for record in schema_records]

# Step 2: Generate GraphQL schema from metadata.
schema, type_equivalence_hints = get_graphql_schema_from_orientdb_schema_data(schema_data)

print(print_schema(schema))
# schema {
#    query: RootSchemaQuery
# }
#
# directive @filter(op_name: String!, value: [String!]!) on FIELD | INLINE_FRAGMENT
#
# directive @tag(tag_name: String!) on FIELD
#
# directive @output(out_name: String!) on FIELD
#
# directive @output_source on FIELD
#
# directive @optional on FIELD
#
# directive @recurse(depth: Int!) on FIELD
#
# directive @fold on FIELD
#
# type Animal {
#     name: String
#     net_worth: Int
#     limbs: Int
# }
#
# type RootSchemaQuery {
#     Animal: [Animal]
# }

# Step 3: Write GraphQL query that returns the names of all animals with a certain net worth.
# Note that we prefix net_worth with '$' and surround it with quotes to indicate it's a parameter.
graphql_query = '''
{
    Animal {
        name @output(out_name: "animal_name")
        net_worth @filter(op_name: "=", value: ["$net_worth"])
    }
}
'''
parameters = {
    'net_worth': '100',
}

# Step 4: Use autogenerated GraphQL schema to compile query into the target database language.
common_schema_info = CommonSchemaInfo(schema, type_equivalence_hints)
compilation_result = graphql_to_match(common_schema_info, graphql_query, parameters)
print(compilation_result.query)
# SELECT Animal___1.name AS `animal_name`
# FROM  ( MATCH  { class: Animal, where: ((net_worth = decimal("100"))), as: Animal___1 }
# RETURN $matches)
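
The compiled MATCH query can then be run with the same OrientDB client used for schema reflection. The snippet below is a minimal sketch rather than part of the compiler's API: the exact shape of the returned records depends on your OrientDB driver (pyorient-style records exposing oRecordData, as in Step 1, are assumed here).

# Step 5 (optional): Execute the compiled query against the database.
match_results = client.command(compilation_result.query)
rows = [record.oRecordData for record in match_results]
# Each row is keyed by the @output names from the query, e.g. 'animal_name'.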

SQL

from graphql_compiler import get_sqlalchemy_schema_info, graphql_to_sql
from sqlalchemy import MetaData, create_engine

engine = create_engine('<connection string>')

# Reflect the default database schema. Each table must have a primary key. Otherwise see:
# https://graphql-compiler.readthedocs.io/en/latest/supported_databases/sql.html#including-tables-without-explicitly-enforced-primary-keys
metadata = MetaData(bind=engine)
metadata.reflect()

# Wrap the schema information into a SQLAlchemySchemaInfo object.
sql_schema_info = get_sqlalchemy_schema_info(metadata.tables, {}, engine.dialect)

# Write GraphQL query.
graphql_query = '''
{
    Animal {
        name @output(out_name: "animal_name")
    }
}
'''
parameters = {}

# Compile and execute query.
compilation_result = graphql_to_sql(sql_schema_info, graphql_query, parameters)
query_results = [dict(row) for row in engine.execute(compilation_result.query)]
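
Filters work the same way against the SQL backend. The snippet below is an illustrative sketch, not taken from the library's documentation: it assumes the reflected schema exposes an Animal type with a name column (mirroring the OrientDB example above), and the parameter value is purely hypothetical; adjust the type, field, and value to match your own tables.

# Compile a parameterized query against the same reflected schema.
filtered_query = '''
{
    Animal {
        name @filter(op_name: "=", value: ["$animal_name"])
             @output(out_name: "animal_name")
    }
}
'''
filtered_parameters = {
    'animal_name': 'Beethoven',  # hypothetical value
}
filtered_compilation_result = graphql_to_sql(sql_schema_info, filtered_query, filtered_parameters)
print(filtered_compilation_result.query)  # inspect the generated SQL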

License

Licensed under the Apache 2.0 License. Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an “AS IS” BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Copyright 2017-present Kensho Technologies, LLC. The present date is determined by the timestamp of the most recent commit in the repository.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

graphql-compiler-2.0.0.dev33.tar.gz (729.1 kB)

Uploaded: Source

Built Distribution

graphql_compiler-2.0.0.dev33-py2.py3-none-any.whl (856.7 kB)

Uploaded: Python 2, Python 3

File details

Details for the file graphql-compiler-2.0.0.dev33.tar.gz.

File metadata

  • Download URL: graphql-compiler-2.0.0.dev33.tar.gz
  • Upload date:
  • Size: 729.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.48.0 CPython/3.8.2

File hashes

Hashes for graphql-compiler-2.0.0.dev33.tar.gz
Algorithm Hash digest
SHA256 9d0092e3319f4d8ab45f55d9f58ca545eb023d0e1e70633643a146ec60978f8e
MD5 6956a3a8b10bab79806b3894fe37ed58
BLAKE2b-256 a606bf40cc08f6619d935d3099a515ae316fe2a8ff2d3a4b1e29fa278c9a2192

See more details on using hashes here.
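
For example, the SHA256 digest listed above can be checked against a downloaded copy of the archive with a few lines of Python (this assumes the file has already been downloaded into the current working directory):

import hashlib

with open('graphql-compiler-2.0.0.dev33.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# Compare against the SHA256 value from the table above.
assert digest == '9d0092e3319f4d8ab45f55d9f58ca545eb023d0e1e70633643a146ec60978f8e'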

File details

Details for the file graphql_compiler-2.0.0.dev33-py2.py3-none-any.whl.

File metadata

  • Download URL: graphql_compiler-2.0.0.dev33-py2.py3-none-any.whl
  • Upload date:
  • Size: 856.7 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.48.0 CPython/3.8.2

File hashes

Hashes for graphql_compiler-2.0.0.dev33-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 2abc63b1d293a7b2c071c4f06e4eb911e4e096ce4c64f798f9aeaf58fc2f3d26
MD5 3692471f37c4ba68ad1a294b32fe4bfe
BLAKE2b-256 a0ad7ad8b00c7ebe44b8aa8274823d7bd8c34c7b070015dea73c02f86a447f8b

See more details on using hashes here.
