
Fast, correct Python JSON library supporting dataclasses, datetimes, and numpy

Project description

orjson-pydantic

This is a (maintained) fork of orjson that adds serialization of pydantic objects.

orjson is a fast, correct JSON library for Python. It benchmarks as the fastest Python library for JSON and is more correct than the standard json library or other third-party libraries. It serializes dataclass, datetime, numpy, and UUID instances natively.

Its features and drawbacks compared to other Python JSON libraries:

  • serializes dataclass instances 40-50x as fast as other libraries
  • serializes datetime, date, and time instances to RFC 3339 format, e.g., "1970-01-01T00:00:00+00:00"
  • serializes numpy.ndarray instances 4-12x as fast with 0.3x the memory usage of other libraries
  • pretty prints 10x to 20x as fast as the standard library
  • serializes to bytes rather than str, i.e., is not a drop-in replacement
  • serializes str without escaping unicode to ASCII, e.g., "好" rather than "\\u597d"
  • serializes float 10x as fast and deserializes twice as fast as other libraries
  • serializes subclasses of str, int, list, and dict natively, requiring default to specify how to serialize others
  • serializes arbitrary types using a default hook
  • has strict UTF-8 conformance, more correct than the standard library
  • has strict JSON conformance in not supporting NaN/Infinity/-Infinity
  • has an option for strict JSON conformance on 53-bit integers with default support for 64-bit
  • does not provide load() or dump() functions for reading from/writing to file-like objects

orjson supports CPython 3.7, 3.8, 3.9, and 3.10. It distributes x86_64/amd64, aarch64/armv8, and arm7 wheels for Linux, amd64 and aarch64 wheels for macOS, and amd64 wheels for Windows. orjson does not support PyPy. Releases follow semantic versioning and serializing a new object type without an opt-in flag is considered a breaking change.

orjson is licensed under both the Apache 2.0 and MIT licenses. The repository and issue tracker are github.com/ijl/orjson, and patches may be submitted there. There is a CHANGELOG available in the repository.

  1. Usage
    1. Install
    2. Quickstart
    3. Migrating
    4. Serialize
      1. default
      2. option
    5. Deserialize
  2. Types
    1. dataclass
    2. datetime
    3. enum
    4. float
    5. int
    6. numpy
    7. str
    8. uuid
    9. pydantic
  3. Testing
  4. Performance
    1. Latency
    2. Memory
    3. Reproducing
  5. Questions
  6. Packaging
  7. License

Usage

Install

To install a wheel from PyPI:

pip install --upgrade "pip>=20.3" # manylinux_x_y, universal2 wheel support
pip install --upgrade orjson-pydantic

To build a wheel, see packaging.

Quickstart

This is an example of serializing, with options specified, and deserializing:

>>> import orjson, datetime, numpy
>>> data = {
    "type": "job",
    "created_at": datetime.datetime(1970, 1, 1),
    "status": "🆗",
    "payload": numpy.array([[1, 2], [3, 4]]),
}
>>> orjson.dumps(data, option=orjson.OPT_NAIVE_UTC | orjson.OPT_SERIALIZE_NUMPY)
b'{"type":"job","created_at":"1970-01-01T00:00:00+00:00","status":"\xf0\x9f\x86\x97","payload":[[1,2],[3,4]]}'
>>> orjson.loads(_)
{'type': 'job', 'created_at': '1970-01-01T00:00:00+00:00', 'status': '🆗', 'payload': [[1, 2], [3, 4]]}

Migrating

orjson version 3 serializes more types than version 2. Subclasses of str, int, dict, and list are now serialized. This is faster and more similar to the standard library. It can be disabled with orjson.OPT_PASSTHROUGH_SUBCLASS. dataclasses.dataclass instances are now serialized by default and cannot be customized in a default function unless option=orjson.OPT_PASSTHROUGH_DATACLASS is specified. uuid.UUID instances are serialized by default. For any type that is now serialized, implementations in a default function and options enabling them can be removed but do not need to be. There was no change in deserialization.

To migrate from the standard library, the largest difference is that orjson.dumps returns bytes and json.dumps returns a str. Users with dict objects using non-str keys should specify option=orjson.OPT_NON_STR_KEYS. sort_keys is replaced by option=orjson.OPT_SORT_KEYS. indent is replaced by option=orjson.OPT_INDENT_2 and other levels of indentation are not supported.
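
As a minimal illustration of these replacements (a sketch; the outputs assume the documented behavior above):

>>> import orjson, json
>>> data = {"b": 1, "a": 2}
>>> json.dumps(data, sort_keys=True, indent=2)
'{\n  "a": 2,\n  "b": 1\n}'
>>> orjson.dumps(data, option=orjson.OPT_SORT_KEYS | orjson.OPT_INDENT_2)
b'{\n  "a": 2,\n  "b": 1\n}'
>>> orjson.dumps(data, option=orjson.OPT_SORT_KEYS | orjson.OPT_INDENT_2).decode("utf-8")
'{\n  "a": 2,\n  "b": 1\n}'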

Serialize

def dumps(
    __obj: Any,
    default: Optional[Callable[[Any], Any]] = ...,
    option: Optional[int] = ...,
) -> bytes: ...

dumps() serializes Python objects to JSON.

It natively serializes str, dict, list, tuple, int, float, bool, dataclasses.dataclass, typing.TypedDict, datetime.datetime, datetime.date, datetime.time, uuid.UUID, numpy.ndarray, pydantic.BaseModel, and None instances. It supports arbitrary types through default. It serializes subclasses of str, int, dict, list, dataclasses.dataclass, and enum.Enum. It does not serialize subclasses of tuple to avoid serializing namedtuple objects as arrays. To avoid serializing subclasses, specify the option orjson.OPT_PASSTHROUGH_SUBCLASS.

The output is a bytes object containing UTF-8.

The global interpreter lock (GIL) is held for the duration of the call.

It raises JSONEncodeError on an unsupported type. The exception message describes the invalid object, e.g., Type is not JSON serializable: .... To fix this, specify default.

It raises JSONEncodeError on a str that contains invalid UTF-8.

It raises JSONEncodeError on an integer that exceeds 64 bits by default or, with OPT_STRICT_INTEGER, 53 bits.

It raises JSONEncodeError if a dict has a key of a type other than str, unless OPT_NON_STR_KEYS is specified.

It raises JSONEncodeError if the output of default recurses to handling by default more than 254 levels deep.

It raises JSONEncodeError on circular references.

It raises JSONEncodeError if a tzinfo on a datetime object is unsupported.

JSONEncodeError is a subclass of TypeError. This is for compatibility with the standard library.
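
Because of this subclassing, code written to catch TypeError from the standard library continues to work unchanged. A small illustration, assuming the behavior described above:

>>> import orjson
>>> issubclass(orjson.JSONEncodeError, TypeError)
True
>>> try:
        orjson.dumps({1, 2})
    except TypeError as exc:
        print(exc)

Type is not JSON serializable: set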

default

To serialize a subclass or arbitrary types, specify default as a callable that returns a supported type. default may be a function, lambda, or callable class instance. To specify that a type was not handled by default, raise an exception such as TypeError.

>>> import orjson, decimal
>>>
def default(obj):
    if isinstance(obj, decimal.Decimal):
        return str(obj)
    raise TypeError

>>> orjson.dumps(decimal.Decimal("0.0842389659712649442845"))
JSONEncodeError: Type is not JSON serializable: decimal.Decimal
>>> orjson.dumps(decimal.Decimal("0.0842389659712649442845"), default=default)
b'"0.0842389659712649442845"'
>>> orjson.dumps({1, 2}, default=default)
orjson.JSONEncodeError: Type is not JSON serializable: set

The default callable may return an object that itself must be handled by default up to 254 times before an exception is raised.

It is important that default raise an exception if a type cannot be handled. Python otherwise implicitly returns None, which appears to the caller like a legitimate value and is serialized:

>>> import orjson, json, rapidjson, decimal
>>>
def default(obj):
    if isinstance(obj, decimal.Decimal):
        return str(obj)

>>> orjson.dumps({"set":{1, 2}}, default=default)
b'{"set":null}'
>>> json.dumps({"set":{1, 2}}, default=default)
'{"set":null}'
>>> rapidjson.dumps({"set":{1, 2}}, default=default)
'{"set":null}'

option

To modify how data is serialized, specify option. Each option is an integer constant in orjson. To specify multiple options, mask them together, e.g., option=orjson.OPT_STRICT_INTEGER | orjson.OPT_NAIVE_UTC.

OPT_APPEND_NEWLINE

Append \n to the output. This is a convenience and optimization for the pattern of dumps(...) + "\n". bytes objects are immutable and this pattern copies the original contents.

>>> import orjson
>>> orjson.dumps([])
b"[]"
>>> orjson.dumps([], option=orjson.OPT_APPEND_NEWLINE)
b"[]\n"
OPT_INDENT_2

Pretty-print output with an indent of two spaces. This is equivalent to indent=2 in the standard library. Pretty printing is slower and the output larger. orjson is the fastest compared library at pretty printing and has much less of a slowdown to pretty print than the standard library does. This option is compatible with all other options.

>>> import orjson
>>> orjson.dumps({"a": "b", "c": {"d": True}, "e": [1, 2]})
b'{"a":"b","c":{"d":true},"e":[1,2]}'
>>> orjson.dumps(
    {"a": "b", "c": {"d": True}, "e": [1, 2]},
    option=orjson.OPT_INDENT_2
)
b'{\n  "a": "b",\n  "c": {\n    "d": true\n  },\n  "e": [\n    1,\n    2\n  ]\n}'

If displayed, the indentation and linebreaks appear like this:

{
  "a": "b",
  "c": {
    "d": true
  },
  "e": [
    1,
    2
  ]
}

This measures serializing the github.json fixture as compact (52KiB) or pretty (64KiB):

Library      compact (ms)   pretty (ms)   vs. orjson
orjson       0.06           0.07          1.0
ujson        0.18           0.19          2.8
rapidjson    0.22
simplejson   0.35           1.49          21.4
json         0.36           1.19          17.2

This measures serializing the citm_catalog.json fixture, more of a worst case due to the amount of nesting and newlines, as compact (489KiB) or pretty (1.1MiB):

Library      compact (ms)   pretty (ms)   vs. orjson
orjson       0.88           1.73          1.0
ujson        3.73           4.52          2.6
rapidjson    3.54
simplejson   11.77          72.06         41.6
json         6.71           55.22         31.9

rapidjson is blank because it does not support pretty printing. This can be reproduced using the pyindent script.

OPT_NAIVE_UTC

Serialize datetime.datetime objects without a tzinfo as UTC. This has no effect on datetime.datetime objects that have tzinfo set.

>>> import orjson, datetime
>>> orjson.dumps(
        datetime.datetime(1970, 1, 1, 0, 0, 0),
    )
b'"1970-01-01T00:00:00"'
>>> orjson.dumps(
        datetime.datetime(1970, 1, 1, 0, 0, 0),
        option=orjson.OPT_NAIVE_UTC,
    )
b'"1970-01-01T00:00:00+00:00"'
OPT_NON_STR_KEYS

Serialize dict keys of type other than str. This allows dict keys to be one of str, int, float, bool, None, datetime.datetime, datetime.date, datetime.time, enum.Enum, and uuid.UUID. For comparison, the standard library serializes str, int, float, bool or None by default. orjson benchmarks as being faster at serializing non-str keys than other libraries. This option is slower for str keys than the default.

>>> import orjson, datetime, uuid
>>> orjson.dumps(
        {uuid.UUID("7202d115-7ff3-4c81-a7c1-2a1f067b1ece"): [1, 2, 3]},
        option=orjson.OPT_NON_STR_KEYS,
    )
b'{"7202d115-7ff3-4c81-a7c1-2a1f067b1ece":[1,2,3]}'
>>> orjson.dumps(
        {datetime.datetime(1970, 1, 1, 0, 0, 0): [1, 2, 3]},
        option=orjson.OPT_NON_STR_KEYS | orjson.OPT_NAIVE_UTC,
    )
b'{"1970-01-01T00:00:00+00:00":[1,2,3]}'

These types are generally serialized how they would be as values, e.g., datetime.datetime is still an RFC 3339 string and respects options affecting it. The exception is that int serialization does not respect OPT_STRICT_INTEGER.

This option has the risk of creating duplicate keys. This is because non-str objects may serialize to the same str as an existing key, e.g., {"1": true, 1: false}. The last key to be inserted to the dict will be serialized last and a JSON deserializer will presumably take the last occurrence of a key (in the above, false). The first value will be lost.
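
A short illustration of the risk (the deserialized result depends on the JSON parser taking the last occurrence, as noted above):

>>> import orjson
>>> orjson.dumps({"1": True, 1: False}, option=orjson.OPT_NON_STR_KEYS)
b'{"1":true,"1":false}'
>>> orjson.loads(orjson.dumps({"1": True, 1: False}, option=orjson.OPT_NON_STR_KEYS))
{'1': False}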

This option is compatible with orjson.OPT_SORT_KEYS. If sorting is used, note the sort is unstable and will be unpredictable for duplicate keys.

>>> import orjson, datetime
>>> orjson.dumps(
    {"other": 1, datetime.date(1970, 1, 5): 2, datetime.date(1970, 1, 3): 3},
    option=orjson.OPT_NON_STR_KEYS | orjson.OPT_SORT_KEYS
)
b'{"1970-01-03":3,"1970-01-05":2,"other":1}'

This measures serializing 589KiB of JSON comprising a list of 100 dicts, each with 365 randomly-sorted int keys representing epoch timestamps as well as one str key, with a single int as the value for each key. In "str keys", the keys were converted to str before serialization, and orjson still specifies option=orjson.OPT_NON_STR_KEYS (which is always somewhat slower).

Library      str keys (ms)   int keys (ms)   int keys sorted (ms)
orjson       1.53            2.16            4.29
ujson        3.07            5.65
rapidjson    4.29
simplejson   11.24           14.50           21.86
json         7.17            8.49

ujson is blank for sorting because it segfaults. json is blank because it raises TypeError on attempting to sort before converting all keys to str. rapidjson is blank because it does not support non-str keys. This can be reproduced using the pynonstr script.

OPT_OMIT_MICROSECONDS

Do not serialize the microsecond field on datetime.datetime and datetime.time instances.

>>> import orjson, datetime
>>> orjson.dumps(
        datetime.datetime(1970, 1, 1, 0, 0, 0, 1),
    )
b'"1970-01-01T00:00:00.000001"'
>>> orjson.dumps(
        datetime.datetime(1970, 1, 1, 0, 0, 0, 1),
        option=orjson.OPT_OMIT_MICROSECONDS,
    )
b'"1970-01-01T00:00:00"'
OPT_PASSTHROUGH_DATACLASS

Passthrough dataclasses.dataclass instances to default. This allows customizing their output but is much slower.

>>> import orjson, dataclasses
>>>
@dataclasses.dataclass
class User:
    id: str
    name: str
    password: str

def default(obj):
    if isinstance(obj, User):
        return {"id": obj.id, "name": obj.name}
    raise TypeError

>>> orjson.dumps(User("3b1", "asd", "zxc"))
b'{"id":"3b1","name":"asd","password":"zxc"}'
>>> orjson.dumps(User("3b1", "asd", "zxc"), option=orjson.OPT_PASSTHROUGH_DATACLASS)
TypeError: Type is not JSON serializable: User
>>> orjson.dumps(
        User("3b1", "asd", "zxc"),
        option=orjson.OPT_PASSTHROUGH_DATACLASS,
        default=default,
    )
b'{"id":"3b1","name":"asd"}'
OPT_PASSTHROUGH_DATETIME

Passthrough datetime.datetime, datetime.date, and datetime.time instances to default. This allows serializing datetimes to a custom format, e.g., HTTP dates:

>>> import orjson, datetime
>>>
def default(obj):
    if isinstance(obj, datetime.datetime):
        return obj.strftime("%a, %d %b %Y %H:%M:%S GMT")
    raise TypeError

>>> orjson.dumps({"created_at": datetime.datetime(1970, 1, 1)})
b'{"created_at":"1970-01-01T00:00:00"}'
>>> orjson.dumps({"created_at": datetime.datetime(1970, 1, 1)}, option=orjson.OPT_PASSTHROUGH_DATETIME)
TypeError: Type is not JSON serializable: datetime.datetime
>>> orjson.dumps(
        {"created_at": datetime.datetime(1970, 1, 1)},
        option=orjson.OPT_PASSTHROUGH_DATETIME,
        default=default,
    )
b'{"created_at":"Thu, 01 Jan 1970 00:00:00 GMT"}'

This does not affect datetimes in dict keys if using OPT_NON_STR_KEYS.

OPT_PASSTHROUGH_SUBCLASS

Passthrough subclasses of builtin types to default.

>>> import orjson
>>>
class Secret(str):
    pass

def default(obj):
    if isinstance(obj, Secret):
        return "******"
    raise TypeError

>>> orjson.dumps(Secret("zxc"))
b'"zxc"'
>>> orjson.dumps(Secret("zxc"), option=orjson.OPT_PASSTHROUGH_SUBCLASS)
TypeError: Type is not JSON serializable: Secret
>>> orjson.dumps(Secret("zxc"), option=orjson.OPT_PASSTHROUGH_SUBCLASS, default=default)
b'"******"'

This does not affect serializing subclasses as dict keys if using OPT_NON_STR_KEYS.

OPT_SERIALIZE_DATACLASS

This is deprecated and has no effect in version 3. In version 2 this was required to serialize dataclasses.dataclass instances. For more, see dataclass.

OPT_SERIALIZE_NUMPY

Serialize numpy.ndarray instances. For more, see numpy.

OPT_SERIALIZE_UUID

This is deprecated and has no effect in version 3. In version 2 this was required to serialize uuid.UUID instances. For more, see UUID.

OPT_SORT_KEYS

Serialize dict keys in sorted order. The default is to serialize in an unspecified order. This is equivalent to sort_keys=True in the standard library.

This can be used to ensure the order is deterministic for hashing or tests. It has a substantial performance penalty and is not recommended in general.

>>> import orjson
>>> orjson.dumps({"b": 1, "c": 2, "a": 3})
b'{"b":1,"c":2,"a":3}'
>>> orjson.dumps({"b": 1, "c": 2, "a": 3}, option=orjson.OPT_SORT_KEYS)
b'{"a":3,"b":1,"c":2}'

This measures serializing the twitter.json fixture unsorted and sorted:

Library      unsorted (ms)   sorted (ms)   vs. orjson
orjson       0.5             0.92          1
ujson        1.61            2.48          2.7
rapidjson    2.17            2.89          3.2
simplejson   3.56            5.13          5.6
json         3.59            4.59          5

The benchmark can be reproduced using the pysort script.

The sorting is not collation/locale-aware:

>>> import orjson
>>> orjson.dumps({"a": 1, "ä": 2, "A": 3}, option=orjson.OPT_SORT_KEYS)
b'{"A":3,"a":1,"\xc3\xa4":2}'

This is the same sorting behavior as the standard library, rapidjson, simplejson, and ujson.

dataclass instances also serialize as maps, but this option has no effect on them.

OPT_STRICT_INTEGER

Enforce 53-bit limit on integers. The limit is otherwise 64 bits, the same as the Python standard library. For more, see int.

OPT_UTC_Z

Serialize a UTC timezone on datetime.datetime instances as Z instead of +00:00.

>>> import orjson, datetime, zoneinfo
>>> orjson.dumps(
        datetime.datetime(1970, 1, 1, 0, 0, 0, tzinfo=zoneinfo.ZoneInfo("UTC")),
    )
b'"1970-01-01T00:00:00+00:00"'
>>> orjson.dumps(
        datetime.datetime(1970, 1, 1, 0, 0, 0, tzinfo=zoneinfo.ZoneInfo("UTC")),
        option=orjson.OPT_UTC_Z
    )
b'"1970-01-01T00:00:00Z"'
OPT_SERIALIZE_PYDANTIC

Serialize pydantic.BaseModel instances. For more, see pydantic.

Deserialize

def loads(__obj: Union[bytes, bytearray, memoryview, str]) -> Any: ...

loads() deserializes JSON to Python objects. It deserializes to dict, list, int, float, str, bool, and None objects.

bytes, bytearray, memoryview, and str input are accepted. If the input exists as a memoryview, bytearray, or bytes object, it is recommended to pass these directly rather than creating an unnecessary str object. This has lower memory usage and lower latency.

The input must be valid UTF-8.

orjson maintains a cache of map keys for the duration of the process. This causes a net reduction in memory usage by avoiding duplicate strings. The keys must be at most 64 bytes to be cached and 512 entries are stored.

The global interpreter lock (GIL) is held for the duration of the call.

It raises JSONDecodeError if given an invalid type or invalid JSON. This includes if the input contains NaN, Infinity, or -Infinity, which the standard library allows, but is not valid JSON.

JSONDecodeError is a subclass of json.JSONDecodeError and ValueError. This is for compatibility with the standard library.
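
A brief sketch of passing bytes directly and handling errors (the except clause relies on the subclassing noted above):

>>> import orjson, json
>>> orjson.loads(b'{"a": 1, "b": [2.5, null, true]}')
{'a': 1, 'b': [2.5, None, True]}
>>> try:
        orjson.loads(b'{"a":')
    except json.JSONDecodeError:
        print("invalid JSON")

invalid JSON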

Types

dataclass

orjson serializes instances of dataclasses.dataclass natively. It serializes instances 40-50x as fast as other libraries and avoids a severe slowdown seen in other libraries compared to serializing dict.

It is supported to pass all variants of dataclasses, including dataclasses using __slots__, frozen dataclasses, those with optional or default attributes, and subclasses. There is a performance benefit to not using __slots__.

Library      dict (ms)   dataclass (ms)   vs. orjson
orjson       1.40        1.60             1
ujson
rapidjson    3.64        68.48            42
simplejson   14.21       92.18            57
json         13.28       94.90            59

This measures serializing 555KiB of JSON, orjson natively and other libraries using default to serialize the output of dataclasses.asdict(). This can be reproduced using the pydataclass script.

Dataclasses are serialized as maps, with every attribute serialized and in the order given on class definition:

>>> import dataclasses, orjson, typing

@dataclasses.dataclass
class Member:
    id: int
    active: bool = dataclasses.field(default=False)

@dataclasses.dataclass
class Object:
    id: int
    name: str
    members: typing.List[Member]

>>> orjson.dumps(Object(1, "a", [Member(1, True), Member(2)]))
b'{"id":1,"name":"a","members":[{"id":1,"active":true},{"id":2,"active":false}]}'

Users may wish to control how dataclass instances are serialized, e.g., to not serialize an attribute or to change the name of an attribute when serialized. orjson may implement support using the metadata mapping on field attributes, e.g., field(metadata={"json_serialize": False}), if use cases are clear.
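
Until then, similar behavior can be approximated with orjson.OPT_PASSTHROUGH_DATACLASS and a default that inspects the metadata mapping itself. This is only a sketch: the "json_serialize" key is an illustrative convention, not something orjson reads.

>>> import dataclasses, orjson
>>>
@dataclasses.dataclass
class User:
    id: str
    name: str
    password: str = dataclasses.field(default="", metadata={"json_serialize": False})

def default(obj):
    if dataclasses.is_dataclass(obj):
        # Keep only the fields not marked json_serialize=False
        return {
            f.name: getattr(obj, f.name)
            for f in dataclasses.fields(obj)
            if f.metadata.get("json_serialize", True)
        }
    raise TypeError

>>> orjson.dumps(
        User("3b1", "asd", "zxc"),
        option=orjson.OPT_PASSTHROUGH_DATACLASS,
        default=default,
    )
b'{"id":"3b1","name":"asd"}'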

datetime

orjson serializes datetime.datetime objects to RFC 3339 format, e.g., "1970-01-01T00:00:00+00:00". This is a subset of ISO 8601 and is compatible with isoformat() in the standard library.

>>> import orjson, datetime, zoneinfo
>>> orjson.dumps(
    datetime.datetime(2018, 12, 1, 2, 3, 4, 9, tzinfo=zoneinfo.ZoneInfo("Australia/Adelaide"))
)
b'"2018-12-01T02:03:04.000009+10:30"'
>>> orjson.dumps(
    datetime.datetime(2100, 9, 1, 21, 55, 2).replace(tzinfo=zoneinfo.ZoneInfo("UTC"))
)
b'"2100-09-01T21:55:02+00:00"'
>>> orjson.dumps(
    datetime.datetime(2100, 9, 1, 21, 55, 2)
)
b'"2100-09-01T21:55:02"'

datetime.datetime supports instances with a tzinfo that is None, datetime.timezone.utc, a timezone instance from the python3.9+ zoneinfo module, or a timezone instance from the third-party pendulum, pytz, or dateutil/arrow libraries.

It is fastest to use the standard library's zoneinfo.ZoneInfo for timezones.

datetime.time objects must not have a tzinfo.

>>> import orjson, datetime
>>> orjson.dumps(datetime.time(12, 0, 15, 290))
b'"12:00:15.000290"'

datetime.date objects will always serialize.

>>> import orjson, datetime
>>> orjson.dumps(datetime.date(1900, 1, 2))
b'"1900-01-02"'

Errors with tzinfo result in JSONEncodeError being raised.

It is faster to have orjson serialize datetime objects than to do so before calling dumps(). If using an unsupported type such as pendulum.datetime, use default.

To disable serialization of datetime objects specify the option orjson.OPT_PASSTHROUGH_DATETIME.

To use "Z" suffix instead of "+00:00" to indicate UTC ("Zulu") time, use the option orjson.OPT_UTC_Z.

To assume datetimes without timezone are UTC, use the option orjson.OPT_NAIVE_UTC.

enum

orjson serializes enums natively. Options apply to their values.

>>> import enum, datetime, orjson
>>>
class DatetimeEnum(enum.Enum):
    EPOCH = datetime.datetime(1970, 1, 1, 0, 0, 0)
>>> orjson.dumps(DatetimeEnum.EPOCH)
b'"1970-01-01T00:00:00"'
>>> orjson.dumps(DatetimeEnum.EPOCH, option=orjson.OPT_NAIVE_UTC)
b'"1970-01-01T00:00:00+00:00"'

Enums with members that are not supported types can be serialized using default:

>>> import enum, orjson
>>>
class Custom:
    def __init__(self, val):
        self.val = val

def default(obj):
    if isinstance(obj, Custom):
        return obj.val
    raise TypeError

class CustomEnum(enum.Enum):
    ONE = Custom(1)

>>> orjson.dumps(CustomEnum.ONE, default=default)
b'1'

float

orjson serializes and deserializes double precision floats with no loss of precision and consistent rounding. The same behavior is observed in rapidjson, simplejson, and json. ujson 1.35 was inaccurate in both serialization and deserialization, i.e., it modifies the data, and the recent 2.0 release is accurate.

orjson.dumps() serializes NaN, Infinity, and -Infinity, which are not compliant JSON, as null:

>>> import orjson, ujson, rapidjson, json
>>> orjson.dumps([float("NaN"), float("Infinity"), float("-Infinity")])
b'[null,null,null]'
>>> ujson.dumps([float("NaN"), float("Infinity"), float("-Infinity")])
OverflowError: Invalid Inf value when encoding double
>>> rapidjson.dumps([float("NaN"), float("Infinity"), float("-Infinity")])
'[NaN,Infinity,-Infinity]'
>>> json.dumps([float("NaN"), float("Infinity"), float("-Infinity")])
'[NaN, Infinity, -Infinity]'

int

orjson serializes and deserializes 64-bit integers by default. The range supported is a signed 64-bit integer's minimum (-9223372036854775807) to an unsigned 64-bit integer's maximum (18446744073709551615). This is widely compatible, but there are implementations that only support 53-bits for integers, e.g., web browsers. For those implementations, dumps() can be configured to raise a JSONEncodeError on values exceeding the 53-bit range.

>>> import orjson
>>> orjson.dumps(9007199254740992)
b'9007199254740992'
>>> orjson.dumps(9007199254740992, option=orjson.OPT_STRICT_INTEGER)
JSONEncodeError: Integer exceeds 53-bit range
>>> orjson.dumps(-9007199254740992, option=orjson.OPT_STRICT_INTEGER)
JSONEncodeError: Integer exceeds 53-bit range

numpy

orjson natively serializes numpy.ndarray instances as well as individual numpy.float64, numpy.float32, numpy.int64, numpy.int32, numpy.int8, numpy.uint64, numpy.uint32, numpy.uint8, numpy.uintp, numpy.intp, and numpy.datetime64 instances.

orjson is faster than all compared libraries at serializing numpy instances. Serializing numpy data requires specifying option=orjson.OPT_SERIALIZE_NUMPY.

>>> import orjson, numpy
>>> orjson.dumps(
        numpy.array([[1, 2, 3], [4, 5, 6]]),
        option=orjson.OPT_SERIALIZE_NUMPY,
)
b'[[1,2,3],[4,5,6]]'

The array must be a contiguous C array (C_CONTIGUOUS) and one of the supported datatypes.

numpy.datetime64 instances are serialized as RFC 3339 strings and datetime options affect them.

>>> import orjson, numpy
>>> orjson.dumps(
        numpy.datetime64("2021-01-01T00:00:00.172"),
        option=orjson.OPT_SERIALIZE_NUMPY,
)
b'"2021-01-01T00:00:00.172000"'
>>> orjson.dumps(
        numpy.datetime64("2021-01-01T00:00:00.172"),
        option=(
            orjson.OPT_SERIALIZE_NUMPY |
            orjson.OPT_NAIVE_UTC |
            orjson.OPT_OMIT_MICROSECONDS
        ),
)
b'"2021-01-01T00:00:00+00:00"'

If an array is not a contiguous C array, contains an unsupported datatype, or contains a numpy.datetime64 using an unsupported representation (e.g., picoseconds), orjson falls through to default. In default, obj.tolist() can be specified. If an array is malformed, which is not expected, orjson.JSONEncodeError is raised.
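
For example, a non-contiguous view can either be handled in default with tolist() or copied to a contiguous array first (a sketch using standard numpy calls):

>>> import orjson, numpy
>>> array = numpy.array([[1, 2, 3], [4, 5, 6]])[:, ::2]
>>> array.flags["C_CONTIGUOUS"]
False
>>> orjson.dumps(
        array,
        option=orjson.OPT_SERIALIZE_NUMPY,
        default=lambda obj: obj.tolist(),
    )
b'[[1,3],[4,6]]'
>>> orjson.dumps(
        numpy.ascontiguousarray(array),
        option=orjson.OPT_SERIALIZE_NUMPY,
    )
b'[[1,3],[4,6]]'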

This measures serializing 92MiB of JSON from a numpy.ndarray with dimensions of (50000, 100) and numpy.float64 values:

Library      Latency (ms)   RSS diff (MiB)   vs. orjson
orjson       194            99               1.0
ujson
rapidjson    3,048          309              15.7
simplejson   3,023          297              15.6
json         3,133          297              16.1

This measures serializing 100MiB of JSON from a numpy.ndarray with dimensions of (100000, 100) and numpy.int32 values:

Library      Latency (ms)   RSS diff (MiB)   vs. orjson
orjson       178            115              1.0
ujson
rapidjson    1,512          551              8.5
simplejson   1,606          504              9.0
json         1,506          503              8.4

This measures serializing 105MiB of JSON from a numpy.ndarray with dimensions of (100000, 200) and numpy.bool values:

Library      Latency (ms)   RSS diff (MiB)   vs. orjson
orjson       157            120              1.0
ujson
rapidjson    710            327              4.5
simplejson   931            398              5.9
json         996            400              6.3

In these benchmarks, orjson serializes natively, ujson is blank because it does not support a default parameter, and the other libraries serialize ndarray.tolist() via default. The RSS column measures peak memory usage during serialization. This can be reproduced using the pynumpy script.

orjson does not have an installation or compilation dependency on numpy. The implementation is independent, reading numpy.ndarray using PyArrayInterface.

str

orjson is strict about UTF-8 conformance. This is stricter than the standard library's json module, which will serialize and deserialize UTF-16 surrogates, e.g., "\ud800", that are invalid UTF-8.

If orjson.dumps() is given a str that does not contain valid UTF-8, orjson.JSONEncodeError is raised. If loads() receives invalid UTF-8, orjson.JSONDecodeError is raised.

orjson and rapidjson are the only compared JSON libraries to consistently error on bad input.

>>> import orjson, ujson, rapidjson, json
>>> orjson.dumps('\ud800')
JSONEncodeError: str is not valid UTF-8: surrogates not allowed
>>> ujson.dumps('\ud800')
UnicodeEncodeError: 'utf-8' codec ...
>>> rapidjson.dumps('\ud800')
UnicodeEncodeError: 'utf-8' codec ...
>>> json.dumps('\ud800')
'"\\ud800"'
>>> orjson.loads('"\\ud800"')
JSONDecodeError: unexpected end of hex escape at line 1 column 8: line 1 column 1 (char 0)
>>> ujson.loads('"\\ud800"')
''
>>> rapidjson.loads('"\\ud800"')
ValueError: Parse error at offset 1: The surrogate pair in string is invalid.
>>> json.loads('"\\ud800"')
'\ud800'

To make a best effort at deserializing bad input, first decode bytes using the replace or lossy argument for errors:

>>> import orjson
>>> orjson.loads(b'"\xed\xa0\x80"')
JSONDecodeError: str is not valid UTF-8: surrogates not allowed
>>> orjson.loads(b'"\xed\xa0\x80"'.decode("utf-8", "replace"))
'���'

uuid

orjson serializes uuid.UUID instances to RFC 4122 format, e.g., "f81d4fae-7dec-11d0-a765-00a0c91e6bf6".

>>> import orjson, uuid
>>> orjson.dumps(uuid.UUID('f81d4fae-7dec-11d0-a765-00a0c91e6bf6'))
b'"f81d4fae-7dec-11d0-a765-00a0c91e6bf6"'
>>> orjson.dumps(uuid.uuid5(uuid.NAMESPACE_DNS, "python.org"))
b'"886313e1-3b8a-5372-9b90-0c9aee199e5d"'

pydantic

orjson serializes pydantic.BaseModel instances based on the presence of a __fields__ attribute.

Warning: this serialization does not behave like pydantic.BaseModel.json():

  1. It does not respect any of the model's Config attributes.
  2. It does not support pydantic's additional features (json_encoders, exclusions, inclusions, etc.).
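
A minimal sketch (assuming a pydantic v1-style model; field values are serialized with orjson's own rules rather than pydantic's encoders):

>>> import orjson, pydantic
>>>
class User(pydantic.BaseModel):
    id: int
    name: str

>>> orjson.dumps(User(id=1, name="a"), option=orjson.OPT_SERIALIZE_PYDANTIC)
b'{"id":1,"name":"a"}'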

Testing

The library has comprehensive tests. There are tests against fixtures in the JSONTestSuite and nativejson-benchmark repositories. It is tested to not crash against the Big List of Naughty Strings. It is tested to not leak memory. It is tested to not crash against and not accept invalid UTF-8. There are integration tests exercising the library's use in web servers (gunicorn using multiprocess/forked workers) and when multithreaded. It also uses some tests from the ultrajson library.

orjson is the most correct of the compared libraries. This table shows how each library handles a combined 342 JSON fixtures from the JSONTestSuite and nativejson-benchmark tests:

Library      Invalid JSON documents not rejected   Valid JSON documents not deserialized
orjson       0                                     0
ujson        38                                    0
rapidjson    6                                     0
simplejson   13                                    0
json         17                                    0

This shows that all libraries deserialize valid JSON but only orjson correctly rejects the given invalid JSON fixtures. Errors are largely due to accepting invalid strings and numbers.

The table above can be reproduced using the pycorrectness script.

Performance

Serialization and deserialization performance of orjson is better than ultrajson, rapidjson, simplejson, or json. The benchmarks are done on fixtures of real data:

  • twitter.json, 631.5KiB, results of a search on Twitter for "一", containing CJK strings, dictionaries of strings and arrays of dictionaries, indented.

  • github.json, 55.8KiB, a GitHub activity feed, containing dictionaries of strings and arrays of dictionaries, not indented.

  • citm_catalog.json, 1.7MiB, concert data, containing nested dictionaries of strings and arrays of integers, indented.

  • canada.json, 2.2MiB, coordinates of the Canadian border in GeoJSON format, containing floats and arrays, indented.

Latency


twitter.json serialization

Library      Median latency (ms)   Operations per second   Relative (latency)
orjson       0.59                  1698.8                  1
ujson        2.14                  464.3                   3.64
rapidjson    2.39                  418.5                   4.06
simplejson   3.15                  316.9                   5.36
json         3.56                  281.2                   6.06

twitter.json deserialization

Library      Median latency (ms)   Operations per second   Relative (latency)
orjson       2.28                  439.3                   1
ujson        2.89                  345.9                   1.27
rapidjson    3.85                  259.6                   1.69
simplejson   3.66                  272.1                   1.61
json         4.05                  246.7                   1.78

github.json serialization

Library      Median latency (ms)   Operations per second   Relative (latency)
orjson       0.07                  15265.2                 1
ujson        0.22                  4556.7                  3.35
rapidjson    0.26                  3808.9                  4.02
simplejson   0.37                  2690.4                  5.68
json         0.35                  2847.8                  5.36

github.json deserialization

Library      Median latency (ms)   Operations per second   Relative (latency)
orjson       0.18                  5610.1                  1
ujson        0.28                  3540.7                  1.58
rapidjson    0.33                  3031.5                  1.85
simplejson   0.29                  3385.6                  1.65
json         0.29                  3402.1                  1.65

citm_catalog.json serialization

Library      Median latency (ms)   Operations per second   Relative (latency)
orjson       0.99                  1008.5                  1
ujson        3.69                  270.7                   3.72
rapidjson    3.55                  281.4                   3.58
simplejson   11.76                 85.1                    11.85
json         6.89                  145.1                   6.95

citm_catalog.json deserialization

Library      Median latency (ms)   Operations per second   Relative (latency)
orjson       4.53                  220.5                   1
ujson        5.67                  176.5                   1.25
rapidjson    7.51                  133.3                   1.66
simplejson   7.54                  132.7                   1.66
json         7.8                   128.2                   1.72

canada.json serialization

Library      Median latency (ms)   Operations per second   Relative (latency)
orjson       4.72                  198.9                   1
ujson        17.76                 56.3                    3.77
rapidjson    61.83                 16.2                    13.11
simplejson   80.6                  12.4                    17.09
json         52.38                 18.8                    11.11

canada.json deserialization

Library      Median latency (ms)   Operations per second   Relative (latency)
orjson       10.28                 97.4                    1
ujson        16.49                 60.5                    1.6
rapidjson    37.92                 26.4                    3.69
simplejson   37.7                  26.5                    3.67
json         37.87                 27.6                    3.68

Memory

orjson's memory usage when deserializing is similar to or lower than the standard library and other third-party libraries.

This measures, in the first column, RSS after importing a library and reading the fixture, and in the second column, increases in RSS after repeatedly calling loads() on the fixture.

twitter.json

Library      import, read() RSS (MiB)   loads() increase in RSS (MiB)
orjson       13.5                       2.5
ujson        14                         4.1
rapidjson    14.7                       6.5
simplejson   13.2                       2.5
json         12.9                       2.3

github.json

Library      import, read() RSS (MiB)   loads() increase in RSS (MiB)
orjson       13.1                       0.3
ujson        13.5                       0.3
rapidjson    14                         0.7
simplejson   12.6                       0.3
json         12.3                       0.1

citm_catalog.json

Library      import, read() RSS (MiB)   loads() increase in RSS (MiB)
orjson       14.6                       7.9
ujson        15.1                       11.1
rapidjson    15.8                       36
simplejson   14.3                       27.4
json         14                         27.2

canada.json

Library      import, read() RSS (MiB)   loads() increase in RSS (MiB)
orjson       17.1                       15.7
ujson        17.6                       17.4
rapidjson    18.3                       17.9
simplejson   16.9                       19.6
json         16.5                       19.4

Reproducing

The above was measured using Python 3.8.3 on Linux (x86_64) with orjson 3.3.0, ujson 3.0.0, python-rapidjson 0.9.1, and simplejson 3.17.2.

The latency results can be reproduced using the pybench and graph scripts. The memory results can be reproduced using the pymem script.

Questions

Why can't I install it from PyPI?

Probably pip needs to be upgraded to version 20.3 or later to support the latest manylinux_x_y or universal2 wheel formats.

Will it deserialize to dataclasses, UUIDs, decimals, etc or support object_hook?

No. This requires a schema specifying what types are expected and how to handle errors etc. This is addressed by data validation libraries a level above this.
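
For example, validation can be layered on top of loads() (a sketch assuming a pydantic v1-style model; the model and its fields are illustrative):

>>> import orjson, pydantic
>>>
class Job(pydantic.BaseModel):
    type: str
    status: str

>>> Job(**orjson.loads(b'{"type":"job","status":"ok"}'))
Job(type='job', status='ok')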

Will it serialize to str?

No. bytes is the correct type for a serialized blob.

Will it support PyPy?

If someone implements it well.

Packaging

Packaging orjson requires Rust and the maturin build tool.

This is an example for x86_64 on the Rust nightly channel:

export RUSTFLAGS="-C target-cpu=k8"
maturin build --no-sdist --release --strip --cargo-extra-args="--features=unstable-simd"

To build on the stable channel, do not specify --features=unstable-simd.

The project's own CI tests against nightly-2022-02-13 and stable 1.54. It is prudent to pin the nightly version because that channel can introduce breaking changes.

orjson is tested for amd64, aarch64, and arm7 on Linux. It is tested for amd64 on macOS and also ships an aarch64 wheel for macOS. For Windows, it is tested on amd64.

There are no runtime dependencies other than libc.

orjson's tests are included in the source distribution on PyPI. The requirements to run the tests are specified in test/requirements.txt. The tests should be run as part of the build. They can be run with pytest -q test.

License

orjson was written by ijl <ijl@mailbox.org>, copyright 2018 - 2022, licensed under both the Apache 2 and MIT licenses.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • orjson_pydantic-3.6.7.tar.gz (551.9 kB): Source

Built Distributions

  • orjson_pydantic-3.6.7-cp310-none-win_amd64.whl (186.6 kB): CPython 3.10, Windows x86-64
  • orjson_pydantic-3.6.7-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (254.5 kB): CPython 3.10, manylinux: glibc 2.17+ x86-64
  • orjson_pydantic-3.6.7-cp310-cp310-macosx_10_7_x86_64.whl (241.2 kB): CPython 3.10, macOS 10.7+ x86-64
  • orjson_pydantic-3.6.7-cp39-none-win_amd64.whl (186.6 kB): CPython 3.9, Windows x86-64
  • orjson_pydantic-3.6.7-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (254.5 kB): CPython 3.9, manylinux: glibc 2.17+ x86-64
  • orjson_pydantic-3.6.7-cp39-cp39-macosx_10_7_x86_64.whl (241.2 kB): CPython 3.9, macOS 10.7+ x86-64
  • orjson_pydantic-3.6.7-cp38-none-win_amd64.whl (186.4 kB): CPython 3.8, Windows x86-64
  • orjson_pydantic-3.6.7-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (254.5 kB): CPython 3.8, manylinux: glibc 2.17+ x86-64
  • orjson_pydantic-3.6.7-cp38-cp38-macosx_10_7_x86_64.whl (241.0 kB): CPython 3.8, macOS 10.7+ x86-64
  • orjson_pydantic-3.6.7-cp37-none-win_amd64.whl (186.5 kB): CPython 3.7, Windows x86-64
  • orjson_pydantic-3.6.7-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (254.6 kB): CPython 3.7m, manylinux: glibc 2.17+ x86-64
  • orjson_pydantic-3.6.7-cp37-cp37m-macosx_10_7_x86_64.whl (241.1 kB): CPython 3.7m, macOS 10.7+ x86-64

