
mongo-datatables


Server-side processing for jQuery DataTables with MongoDB.

Translates DataTables Ajax requests into MongoDB aggregation pipelines, handling pagination, sorting, filtering, search, SearchPanes, SearchBuilder, and Editor (full CRUD).

Installation

pip install mongo-datatables
# or
uv add mongo-datatables

Quick Start

from pymongo import MongoClient
from mongo_datatables import DataTables, DataField

db = MongoClient("mongodb://localhost:27017/")["mydb"]

data_fields = [
    DataField('title', 'string'),
    DataField('artist', 'string'),
    DataField('year', 'number'),
    DataField('genre', 'string'),
]

# args is the DataTables Ajax request body (a dict)
result = DataTables(db, 'albums', args, data_fields).get_rows()

db is any PyMongo Database object. args is the JSON body from a DataTables Ajax POST. get_rows() returns the standard server-side response:

{
    "draw": 1,
    "recordsTotal": 1000000,
    "recordsFiltered": 4821,
    "data": [...]
}
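For reference, the `args` dict passed in above is the standard DataTables server-side request. A minimal sketch of its shape (the exact structure depends on how your client serializes the Ajax request):

```python
# A minimal DataTables Ajax request body (the `args` dict above).
# Field names follow the standard DataTables server-side protocol.
args = {
    "draw": 1,       # echo counter, returned unchanged in the response
    "start": 0,      # offset of the first row to return
    "length": 10,    # page size
    "search": {"value": "", "regex": False},
    "order": [{"column": 2, "dir": "desc"}],
    "columns": [
        {"data": "title",  "searchable": True, "orderable": True,
         "search": {"value": "", "regex": False}},
        {"data": "artist", "searchable": True, "orderable": True,
         "search": {"value": "", "regex": False}},
        {"data": "year",   "searchable": True, "orderable": True,
         "search": {"value": "", "regex": False}},
    ],
}
```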

Framework Examples

Flask

@app.route('/api/data', methods=['POST'])
def data():
    return jsonify(DataTables(db, 'albums', request.get_json(), data_fields).get_rows())

FastAPI

@app.post('/api/data')
async def data(request: Request):
    return JSONResponse(DataTables(db, 'albums', await request.json(), data_fields).get_rows())

Django

class DataView(View):
    def post(self, request):
        return JsonResponse(DataTables(db, 'albums', json.loads(request.body), data_fields).get_rows())

Litestar

@post('/api/data')
async def data(request: Request) -> dict:
    return DataTables(db, 'albums', await request.json(), data_fields).get_rows()

Quart

@app.route('/api/data', methods=['POST'])
async def data():
    return jsonify(DataTables(db, 'albums', await request.get_json(), data_fields).get_rows())

DataField

DataField(name, data_type, alias=None) maps a MongoDB field to a DataTables column.

DataField('title', 'string')                          # basic field
DataField('release_date', 'date')                     # enables date comparison search
DataField('track_count', 'number')                    # enables numeric comparison search
DataField('PublisherInfo.label', 'string', 'label')   # nested field with UI alias
DataField('_id', 'objectid')                          # serialized as string in response

Valid types:

| Type | Search behaviour | Uses index? | Operators |
|------|------------------|-------------|-----------|
| `keyword` | Exact equality match | Yes (regular index) | |
| `string` | Case-insensitive regex (substring) | No | |
| `number` | Exact equality or numeric comparison | Yes (regular index) | `>` `>=` `<` `<=` `=` |
| `date` | Date comparison (ISO `YYYY-MM-DD`) | Yes (regular index) | `>` `>=` `<` `<=` `=` |
| `array` | Regex against array elements | No | |
| `objectid` | Serialized as string in response | | |
| `boolean`, `object`, `null` | Treated as string (regex) | No | |

Use keyword for categorical/code fields (country codes, status values, tags) where exact matching is always intended and index performance matters. Use string for free-text fields where substring and partial matching is useful.

DataField('country_code', 'keyword')   # country:US  →  {"country_code": "US"}  — uses index
DataField('name',         'string')    # name:york   →  regex, finds "New York", "Yorkshire"
DataField('year',         'number')    # year:>1990  →  {"year": {"$gt": 1990}}  — uses index
DataField('released',     'date')      # released:>=2020-01-01  — uses index
DataField('_id',          'objectid')  # serialized as string in response
DataField('PublisherInfo.label', 'string', 'label')  # nested field with UI alias

The alias is the name DataTables uses for the column (columns[i][data]). Defaults to the last segment of the field path (PublisherInfo.label → label).


Search

Search is where this library earns its keep. The global search box supports several modes:

Text index search (fast)

When a MongoDB text index exists, global search uses $text — fast even on multi-million-row collections:

db.albums.create_index([("title", "text"), ("artist", "text"), ("genre", "text")])

Without a text index, the library falls back to per-column regex (much slower on large collections).

Phrase search

Wrap in quotes for exact phrase matching:

"Dark Side of the Moon"

Multi-word AND search

With search[smart]=true (DataTables default), each word must match at least one searchable column:

pink floyd 1973   →  all three terms must appear across the row

Colon syntax — field-specific search

Target a specific field without needing a separate input:

artist:Bowie                 →  artist contains "Bowie" (string — regex)
artist:"David Bowie"         →  exact phrase in artist field
country_code:US              →  country_code equals "US" (keyword — exact, uses index)
year:1972                    →  year equals 1972 (number — exact, uses index)
year:>1990                   →  greater than
year:>=1990 year:<2000       →  combine multiple conditions (ANDed)
release_date:>2020-01-01     →  date comparison
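Colon-syntax terms travel in the ordinary global search value, so no extra server-side handling is needed. A sketch (the MongoDB criteria in the comment illustrate the behaviour described above; they are not captured library output):

```python
# The colon syntax arrives in the normal global search value.
args = {
    "draw": 1, "start": 0, "length": 10,
    "search": {"value": "year:>=1990 year:<2000", "regex": False},
    "order": [],
    "columns": [
        {"data": "year", "searchable": True, "orderable": True,
         "search": {"value": "", "regex": False}},
    ],
}
# Passed as-is: DataTables(db, 'albums', args, data_fields).get_rows()
# Per the rules above, the two terms combine (ANDed) into roughly:
#   {"year": {"$gte": 1990, "$lt": 2000}}
```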

Column search with ranges

Per-column search supports pipe-delimited min|max for numbers and dates:

1990|2000          →  1990 ≤ year ≤ 2000
2020-01-01|2020-12-31

Regex mode

Set search[regex]=true to treat the search value as a raw MongoDB regex:

^Dark             →  starts with "Dark"
(Floyd|Bowie)     →  matches either

Case sensitivity

Case-insensitive by default. Pass search[caseInsensitive]=false for case-sensitive matching. Per-column override via columns[i][search][caseInsensitive].
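As a sketch, the relevant request fragments might look like this (caseInsensitive is the extension this library reads from the standard DataTables search object):

```python
args = {
    # Case-sensitive global search.
    "search": {"value": "Bowie", "regex": False, "caseInsensitive": False},
    "columns": [
        {"data": "title", "search": {"value": "", "regex": False}},
        # Per-column override: case-sensitive matching on this column only.
        {"data": "artist", "search": {"value": "Floyd", "regex": False,
                                      "caseInsensitive": False}},
    ],
}
```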


SearchPanes

No server-side configuration needed — call get_searchpanes_options() to populate panes on page load:

@app.route('/searchpanes', methods=['POST'])
def searchpanes():
    dt = DataTables(db, 'albums', request.get_json(), data_fields)
    return jsonify(dt.get_searchpanes_options())

SearchBuilder

Full server-side support with nested AND/OR criteria trees. Works automatically — no extra configuration needed.


Sorting

Multi-column sorting, ColReorder (order[i][name] name-based ordering), and orderData column redirect are all supported.
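A sketch of the order fragment of a request combining both styles (assuming the client posts JSON in the nested DataTables shape):

```python
# Multi-column sort: primary by column index 2 (desc), secondary
# identified by name, as ColReorder sends it (asc).
order = [
    {"column": 2, "dir": "desc"},                   # index-based
    {"column": 1, "name": "artist", "dir": "asc"},  # name-based (ColReorder)
]
args = {"order": order}
```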


Custom Filters

Scope all queries to a subset of the collection by passing extra filter criteria as keyword arguments:

DataTables(db, 'albums', args, data_fields, status='active', label='Merge Records')

Editor

Full CRUD support for DataTables Editor.

from mongo_datatables import Editor

@app.route('/editor', methods=['POST'])
def editor():
    data = request.get_json()
    result = Editor(
        db, 'albums', data,
        doc_id=request.args.get('id'),
        data_fields=data_fields,
    ).process()
    return jsonify(result)

Editor also handles action=search for autocomplete and tags field types:

@app.route('/editor/search', methods=['POST'])
def editor_search():
    return jsonify(Editor(db, 'albums', request.get_json(), data_fields=data_fields).search())

Optional Editor parameters:

| Parameter | Description |
|-----------|-------------|
| `validators` | dict mapping field names to `callable(value) -> str \| None` |
| `hooks` | `pre_create`, `pre_edit`, `pre_remove` callables; return falsy to cancel |
| `options` | dict or zero-arg callable for select/radio/checkbox field options |
| `dependent_handlers` | dict mapping field names to callables for dependent field Ajax |
| `file_fields` + `storage_adapter` | file upload support (subclass `StorageAdapter`) |
| `row_class`, `row_data`, `row_attr` | per-row metadata (static value or callable) |
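A minimal sketch of wiring validators and hooks, based on the contracts in the table above. The hook argument shown here is an assumption; check the library docs for the exact signature:

```python
def validate_year(value):
    # Validator contract: return an error string to reject, None to accept.
    try:
        year = int(value)
    except (TypeError, ValueError):
        return "Year must be a number"
    if year < 1900 or year > 2100:
        return "Year must be between 1900 and 2100"
    return None

def block_locked_rows(data):
    # Hook contract: return a falsy value to cancel the operation.
    # (The `data` argument is an assumption about the hook signature.)
    return not data.get("locked")

editor_kwargs = {
    "validators": {"year": validate_year},
    "hooks": {"pre_remove": block_locked_rows},
}
# Editor(db, 'albums', request_body, doc_id=doc_id,
#        data_fields=data_fields, **editor_kwargs).process()
```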

Performance & Indexes

For large collections, indexes are critical — the library uses aggregation pipelines on every request.

Text index

db.albums.create_index([
    ("title", "text"),
    ("artist", "text"),
    ("genre", "text"),
])

With a text index, global search runs in ~100–300ms on multi-million-row collections. Without one, regex fallback can take 5–10+ seconds.

MongoDB allows only one text index per collection, but it can cover multiple fields.

To force regex search even when a text index exists (for substring matching):

DataTables(db, 'albums', args, data_fields, use_text_index=False)

Regular indexes

Create indexes for fields used in sorting, column search, or custom filters:

db.albums.create_index("year")
db.albums.create_index("artist")
db.albums.create_index([("artist", 1), ("year", -1)])  # compound

Advanced

pipeline_stages — inject aggregation stages ($lookup, $addFields, $unwind) before the $match, useful for computed or joined fields.

allow_disk_use=True — pass allowDiskUse to aggregation pipelines when complex filters exceed MongoDB's 100 MB in-memory limit.

get_export_data() — returns all matching rows without pagination for CSV/Excel export.

row_id, row_class, row_data, row_attr — per-row DT_Row* metadata, accepts a static value or a callable receiving the raw document.
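For example, pipeline_stages could join in a related document before the library's own stages run. The artists collection and artist_id field here are hypothetical:

```python
# Hypothetical join: pull the matching artist document into each album
# row before the library's own $match/sort/paginate stages run.
pipeline_stages = [
    {"$lookup": {"from": "artists", "localField": "artist_id",
                 "foreignField": "_id", "as": "artist_doc"}},
    {"$unwind": {"path": "$artist_doc",
                 "preserveNullAndEmptyArrays": True}},
    {"$addFields": {"artist_name": "$artist_doc.name"}},
]
# DataTables(db, 'albums', args, data_fields,
#            pipeline_stages=pipeline_stages).get_rows()
```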

See the full documentation for details.


Development

Run tests:

python -m pytest tests/

Run with coverage:

python -m pytest --cov=mongo_datatables tests/ --cov-report=term --cov-report=html

License

Released under the MIT License.
