Convert a GraphQL schema to Pydantic data models and build requests from them
GraphQL to Pydantic Converter & Pydantic Query Builder
Overview
The GraphQL to Pydantic Converter is a Python package designed to simplify the process of transforming GraphQL schemas in JSON format into Pydantic models. This tool is particularly useful for developers working with GraphQL APIs who want to generate Pydantic models from GraphQL types for efficient data validation and serialization/deserialization.
Features
- Converts GraphQL schemas in JSON format into Pydantic models.
- Builds GraphQL queries and mutations from Pydantic models.
Installation
You can install the GraphQL to Pydantic Converter package via pip:
pip install graphql-pydantic-converter
# or
poetry add git+https://github.com/FRINXio/frinx-services-python-api.git@main#subdirectory=utils/graphql-pydantic-converter
Usage
CLI tool to transform a GraphQL JSON schema into Pydantic models
graphql-pydantic-converter [-h] [-i INPUT_FILE] [-o OUTPUT_FILE] [--url URL] [--headers HEADERS [HEADERS ...]]
options:
-h, --help show this help message and exit
-i INPUT_FILE, --input-file INPUT_FILE
-o OUTPUT_FILE, --output-file OUTPUT_FILE
--url URL
--headers HEADERS [HEADERS ...] # --headers "HeaderName: HeaderValue" "HeaderName: HeaderValue"
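For example, the converter can read a local introspection dump or fetch the schema from a running endpoint. The filenames, endpoint URL, and token below are placeholders, not values mandated by the tool:

```shell
# Generate models from a local introspection JSON dump
graphql-pydantic-converter -i schema.json -o schedule_api.py

# Or fetch the schema directly from a running GraphQL endpoint,
# passing any headers the endpoint requires
graphql-pydantic-converter --url http://localhost:4000/graphql -o schedule_api.py \
    --headers "Authorization: Bearer <token>"
```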
Output from the CLI tool
import typing
from pydantic import Field
from pydantic import PrivateAttr
from graphql_pydantic_converter.graphql_types import Input
from graphql_pydantic_converter.graphql_types import Mutation
from graphql_pydantic_converter.graphql_types import Payload
Boolean: typing.TypeAlias = bool
DateTime: typing.TypeAlias = typing.Any
Float: typing.TypeAlias = float
ID: typing.TypeAlias = str
Int: typing.TypeAlias = int
JSON: typing.TypeAlias = typing.Any
String: typing.TypeAlias = str
class CreateScheduleInput(Input):
    name: String
    workflow_name: String = Field(default=None, alias='workflowName')
    workflow_version: String = Field(default=None, alias='workflowVersion')
    cron_string: String = Field(default=None, alias='cronString')
    enabled: typing.Optional[Boolean] = Field(default=None)
    parallel_runs: typing.Optional[Boolean] = Field(default=None, alias='parallelRuns')
    workflow_context: typing.Optional[String] = Field(default=None, alias='workflowContext')
    from_date: typing.Optional[DateTime] = Field(default=None, alias='fromDate')
    to_date: typing.Optional[DateTime] = Field(default=None, alias='toDate')
class Schedule(Payload):
    name: typing.Optional[bool] = Field(default=False)
    enabled: typing.Optional[bool] = Field(default=False)
    parallel_runs: typing.Optional[bool] = Field(alias='parallelRuns', default=False)
    workflow_name: typing.Optional[bool] = Field(alias='workflowName', default=False)
    workflow_version: typing.Optional[bool] = Field(alias='workflowVersion', default=False)
    cron_string: typing.Optional[bool] = Field(alias='cronString', default=False)
    workflow_context: typing.Optional[bool] = Field(alias='workflowContext', default=False)
    from_date: typing.Optional[bool] = Field(alias='fromDate', default=False)
    to_date: typing.Optional[bool] = Field(alias='toDate', default=False)
    status: typing.Optional[bool] = Field(default=False)
class CreateScheduleMutation(Mutation):
    _name: str = PrivateAttr('createSchedule')
    input: CreateScheduleInput
    payload: Schedule
CreateScheduleInput.model_rebuild()
CreateScheduleMutation.model_rebuild()
Schedule.model_rebuild()
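The pattern behind these generated classes — payload booleans selecting response fields, input values rendered as GraphQL argument syntax — can be illustrated with a plain-Python sketch. The `render_mutation` helper below is hypothetical and is not the library's internal implementation:

```python
import json

def render_mutation(name: str, input_args: dict, payload: dict) -> str:
    """Sketch of the rendering idea, not the library's actual code."""
    def fmt(value):
        if isinstance(value, bool):
            return 'true' if value else 'false'  # GraphQL booleans are lowercase
        if isinstance(value, str):
            return json.dumps(value)             # quote and escape strings
        return str(value)

    args = ' '.join(f'{key}: {fmt(value)}' for key, value in input_args.items())
    # Payload booleans act as a field selector: True means "include in the response"
    fields = ' '.join(key for key, selected in payload.items() if selected)
    return f'mutation {{ {name}(input: {{ {args} }}) {{ {fields} }} }}'

query = render_mutation(
    'createSchedule',
    {'name': 'name', 'cronString': '* * * * *', 'enabled': True},
    {'name': True, 'enabled': True, 'status': False},
)
```

The result is a single-line mutation string; the library's `render()` produces the equivalent request from the typed `Mutation`, `Input`, and `Payload` classes.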
Query & Mutation builder
from schedule_api import Schedule, CreateScheduleMutation, CreateScheduleInput
SCHEDULE: Schedule = Schedule(
    name=True,
    enabled=True,
    workflowName=True,
    workflowVersion=True,
    cronString=True
)
mutation = CreateScheduleMutation(
    payload=SCHEDULE,
    input=CreateScheduleInput(
        name='name',
        workflowName='workflowName',
        workflowVersion='workflowVersion',
        cronString='* * * * *',
        enabled=True,
        parallelRuns=False,
    )
)
mutation.render()
Rendered mutation
mutation {
  createSchedule(
    input: {
      name: "name"
      workflowName: "workflowName"
      workflowVersion: "workflowVersion"
      cronString: "* * * * *"
      enabled: true
      parallelRuns: false
    }
  ) {
    name
    enabled
    workflowName
    workflowVersion
    cronString
  }
}
Response parser
Example of generated model.py
import typing
from pydantic import BaseModel, Field
from graphql_pydantic_converter.graphql_types import ENUM
Boolean: typing.TypeAlias = bool
DateTime: typing.TypeAlias = typing.Any
Float: typing.TypeAlias = float
ID: typing.TypeAlias = str
Int: typing.TypeAlias = int
JSON: typing.TypeAlias = typing.Any
String: typing.TypeAlias = str
class Status(ENUM):
    UNKNOWN = 'UNKNOWN'
    COMPLETED = 'COMPLETED'
    FAILED = 'FAILED'
    PAUSED = 'PAUSED'
    RUNNING = 'RUNNING'
    TERMINATED = 'TERMINATED'
    TIMED_OUT = 'TIMED_OUT'
class SchedulePayload(BaseModel):
    name: typing.Optional[String] = Field(default=None)
    enabled: typing.Optional[Boolean] = Field(default=None)
    parallel_runs: typing.Optional[Boolean] = Field(default=None, alias='parallelRuns')
    workflow_name: typing.Optional[String] = Field(default=None, alias='workflowName')
    workflow_version: typing.Optional[String] = Field(default=None, alias='workflowVersion')
    cron_string: typing.Optional[String] = Field(default=None, alias='cronString')
    workflow_context: typing.Optional[String] = Field(default=None, alias='workflowContext')
    from_date: typing.Optional[DateTime] = Field(default=None, alias='fromDate')
    to_date: typing.Optional[DateTime] = Field(default=None, alias='toDate')
    status: typing.Optional[Status] = Field(default=None)
class CreateScheduleData(BaseModel):
    create_schedule: typing.Optional[SchedulePayload] = Field(default=None, alias='createSchedule')

class CreateScheduleResponse(BaseModel):
    data: typing.Optional[CreateScheduleData] = Field(default=None)
    errors: typing.Optional[typing.Any] = Field(default=None)
Example of response
import requests

from model import CreateScheduleResponse

# Send the previously created request to the backend service
# (SCHELLAR_URL is the URL of the GraphQL endpoint)
payload = {'query': mutation.render()}
resp = requests.post(SCHELLAR_URL, json=payload)
response = resp.json()
# Example of response
# {
# 'data': {
# 'createSchedule': {
# 'name': 'name',
# 'enabled': True,
# 'workflowName': 'workflowName',
# 'workflowVersion': 'workflowVersion',
# 'cronString': '* * * * *'
# }
# }
# }
schedule = CreateScheduleResponse(**response)
if schedule.errors is None:
    print(schedule.data.create_schedule.workflow_name)
else:
    print(schedule.errors)
0.0.3
- Add unit tests to the project
- Fix mutation input rendering for list items
- Generate the GraphQL schema request with a custom typeOf depth
0.1.0
- Changed default payload boolean value from True to False
- Response properties must now be defined explicitly in a query or mutation
- Generate Pydantic dataclasses for response parsing
0.1.2
- Skip generating private GraphQL schema objects
- Add graphql_pydantic_converter.graphql_types.concatenate_queries function
1.0.0
- Migration to pydantic v2
1.0.1
- Change Map type to dict (key, value)
1.0.2
- Stringify mutation input strings
- Add __typename to payload
- Create the test folder as a module