GraphBridge is a lightweight Microsoft Graph client that uses app-only (Azure AD) authentication and streamlines SharePoint site/list operations: metadata retrieval, feature-based queries, CRUD, upsert, and field key encoding/decoding.
GraphBridge
A tiny Python helper to authenticate against Microsoft Graph (app-only) and work with SharePoint Online (sites & lists): read, create, update, delete, upsert, and batch.
Works with Python 3.10+ · Microsoft Graph v1.0 · App-only auth via client credentials (Entra ID / Azure AD)
Table of Contents
- Installation
- Microsoft 365 prerequisites
- Key concepts
- Quick start
- Usage examples
- API reference
- Errors & best practices
- Security
- Compatibility & notes
- Full example
Installation
```
pip install graphbridge azure-identity requests
```
If your package on PyPI uses a different name, replace `graphbridge` with the actual name. Runtime dependencies: `azure-identity` and `requests`.
Microsoft 365 prerequisites
- An app registration in Entra ID (Azure AD) with a Client ID, Tenant ID, and Client Secret.
- Application permissions for Microsoft Graph (SharePoint scope):
  - Read-only: `Sites.Read.All`
  - Read/write: `Sites.ReadWrite.All`
- Admin consent granted by a tenant admin.
- The app must have access to the target SharePoint site and list.
Key concepts
- `GbAuth`: builds app-only authentication (client credentials) and provides the Bearer token and headers.
- `GbSite`: resolves a SharePoint site from `hostname` (e.g., `contoso.sharepoint.com`) and `site_path` (e.g., `/sites/Marketing`) to obtain the `site_id` via Graph.
- `GbList`: operates on a SharePoint list: read items/rows, create, update, delete, upsert, and batch via Graph `$batch`.
Quick start
```python
from graphbridge import GbAuth, GbSite, GbList
import os

# 1) App-only authentication
auth = GbAuth(
    tenant_id=os.environ["AZURE_TENANT_ID"],
    client_id=os.environ["AZURE_CLIENT_ID"],
    client_secret=os.environ["AZURE_CLIENT_SECRET"],
)

# 2) Resolve the SharePoint site
site = GbSite(
    hostname="contoso.sharepoint.com",
    site_path="/sites/Marketing",
    gb_auth=auth,
)
print("Site ID:", site.site_id)

# 3) Bind the list
tasks = GbList(
    list_name="Tasks",  # display name of your list
    gb_site=site,
)

# 4) Read the first rows
rows = tasks.list_rows
print("Example row:", rows[0] if rows else "No items")
```
Usage examples
Read list rows
```python
rows = tasks.list_rows       # list of dicts under 'fields'
ids = tasks.list_ids         # list item IDs
columns = tasks.list_fields  # column names (keys from first 'fields')

for r in rows[:5]:
    print(r.get("Title"), r.get("Status"))
```
`list_rows` relies on `list_items_all`, which automatically paginates and returns all items in the list.
Create, update, delete
Create one or more items
```python
# Single
res_create = tasks.create({"Title": "New task", "Status": "Open"})
print(res_create)

# Multiple
res_create_many = tasks.create([
    {"Title": "Task 1", "Status": "Open"},
    {"Title": "Task 2", "Status": "InProgress"},
])
print(res_create_many)
```
Update by ID (PATCH fields)
```python
res_update = tasks.update(
    ids="25",
    rows={"Status": "Done", "PercentComplete": 1.0},
)
print(res_update)
```
Delete by ID
```python
res_delete = tasks.delete(ids=["25", "26", "27"])
print(res_delete)
```
Every method returns a result dict with `successes` and `failures` so you can inspect what happened.
Upsert with upload()
`upload(ids, rows, force=False, delete=False)` synchronizes the list with your local source:

- If an `id` already exists:
  - `force=True` ➜ delete & recreate the item (replace).
  - `force=False` ➜ patch-update the item.
- If an `id` does not exist ➜ create a new item.
- `delete=True` ➜ remove items not present in `ids` (cleanup).
```python
# Bring the list in sync with 3 source rows:
ids = ["101", "102", "103"]
rows = [
    {"Title": "A", "Status": "Open"},
    {"Title": "B", "Status": "InProgress"},
    {"Title": "C", "Status": "Done"},
]
res_upload = tasks.upload(ids=ids, rows=rows, force=False, delete=True)
print(res_upload)
```
Simplified return shape:
```json
{
  "delete_results": {"successes": [], "failures": []},
  "force_results": {
    "replaced": {"successes": [], "failures": []},
    "updated": {"successes": [{"id": "101", "row": {"Title": "A"}}], "failures": []},
    "created": {"successes": [{"id": "103", "new_id": "345"}], "failures": []}
  }
}
```
Batch: create_many() and delete_many()
These use the Graph $batch endpoint (default 20 sub-requests per batch).
```python
# CREATE in batch
bulk_rows = [{"Title": f"Bulk {i}"} for i in range(1, 51)]
bulk_create = tasks.create_many(bulk_rows, batch_size=20)
print(bulk_create)

# DELETE in batch
bulk_delete = tasks.delete_many(ids=["301", "302", "303"], batch_size=20, if_match="*")
print(bulk_delete)
```
`if_match="*"` disables concurrency checks for batch deletes (use carefully).
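GraphBridge assembles these payloads internally; as an illustration of how a Graph `$batch` body is typically chunked and built, the sketch below shows one plausible shape. The helper names and the exact URL template are assumptions for illustration, not the library's actual code:

```python
from itertools import islice

def chunk(seq, size=20):
    """Yield successive chunks of at most `size` items (Graph $batch caps at 20 sub-requests)."""
    it = iter(seq)
    while batch := list(islice(it, size)):
        yield batch

def build_delete_batch(site_id, list_id, ids, if_match=None):
    """Build one $batch body that deletes the given list-item IDs (illustrative)."""
    sub_requests = []
    for i, item_id in enumerate(ids, start=1):
        req = {
            "id": str(i),  # correlation id echoed back in the batch response
            "method": "DELETE",
            "url": f"/sites/{site_id}/lists/{list_id}/items/{item_id}",
        }
        if if_match:
            req["headers"] = {"If-Match": if_match}
        sub_requests.append(req)
    return {"requests": sub_requests}

# 45 IDs split into batches of 20, 20, and 5
batches = [build_delete_batch("site", "list", c, if_match="*")
           for c in chunk([str(i) for i in range(45)])]
print([len(b["requests"]) for b in batches])  # [20, 20, 5]
```

Each body would then be POSTed to `https://graph.microsoft.com/v1.0/$batch` with the auth headers.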
Filter with get_items_by_features()
Returns raw list items (not only `fields`) that match at least one criteria dict (OR across dicts; AND inside each dict).

Important: matching occurs on the top-level keys of each item. To filter by list fields, include the `"fields"` key explicitly.

```python
features = [
    {"fields": {"Status": "Open"}},  # match on fields.Status
    {"id": "123"},                   # match on top-level id
]
matched_items = tasks.get_items_by_features(features)
print(len(matched_items))
```
Handle fields with spaces/symbols
SharePoint often encodes internal column names (e.g., spaces) as `_x0020_`. Use the helpers to map keys both ways:

```python
row = {"Project Name": "Apollo", "Start-Date": "2025-08-24"}
encoded = tasks.encode_row(row)      # {'Project_x0020_Name': 'Apollo', 'Start_x002d_Date': '2025-08-24'}
decoded = tasks.decode_row(encoded)  # back to human-friendly keys
```
This is handy when an API/payload requires internal field names.
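For reference, the `_xNNNN_` scheme itself is straightforward to reproduce. The sketch below (not GraphBridge's actual implementation) encodes any character outside letters, digits, and underscores as its four-hex-digit code point:

```python
import re

def encode_key(key: str) -> str:
    """Encode non-alphanumeric characters SharePoint-style, e.g. ' ' -> '_x0020_' (illustrative)."""
    return "".join(
        ch if ch.isalnum() or ch == "_" else f"_x{ord(ch):04x}_"
        for ch in key
    )

def decode_key(key: str) -> str:
    """Reverse the encoding: '_x0020_' -> ' ', '_x002d_' -> '-'."""
    return re.sub(r"_x([0-9a-fA-F]{4})_", lambda m: chr(int(m.group(1), 16)), key)

print(encode_key("Project Name"))      # Project_x0020_Name
print(decode_key("Start_x002d_Date"))  # Start-Date
```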
API reference
Utility
`deduplicate_dicts(dict_list: list[dict]) -> list[dict]`: removes duplicates (by sorted JSON) from a list of dicts.
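The behavior can be illustrated with a minimal sketch (not the packaged implementation), using a sorted-key JSON dump as the identity key so that key order does not matter:

```python
import json

def deduplicate_dicts_sketch(dict_list):
    """Drop duplicate dicts, keeping the first occurrence; identity is the sorted-key JSON dump."""
    seen, out = set(), []
    for d in dict_list:
        key = json.dumps(d, sort_keys=True)
        if key not in seen:
            seen.add(key)
            out.append(d)
    return out

rows = [{"a": 1, "b": 2}, {"b": 2, "a": 1}, {"a": 3}]
print(deduplicate_dicts_sketch(rows))  # [{'a': 1, 'b': 2}, {'a': 3}]
```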
class GbAuth
`GbAuth(tenant_id: str, client_id: str, client_secret: str)`

Properties:

- `credential: ClientSecretCredential`: lazily cached `azure.identity.ClientSecretCredential`.
- `token: str`: Graph access token for `https://graph.microsoft.com/.default`.
- `headers: dict`: `{"Authorization": f"Bearer {token}"}`.

Validation: raises `TypeError`/`ValueError` for empty or non-string inputs; raises `RuntimeError` if token acquisition fails.
class GbSite(GbAuth)
`GbSite(hostname: str, site_path: str, gb_auth: GbAuth | None = None, **auth_kwargs)`

- Pass either an existing `GbAuth` via `gb_auth`, or auth keywords (`tenant_id`, `client_id`, `client_secret`).

Properties:

- `hostname: str`: e.g., `contoso.sharepoint.com`
- `site_path: str`: e.g., `/sites/Marketing` (or `/teams/...`)
- `site_url: str`: Graph URL for the site.
- `site_data: dict`: site metadata (cached).
- `site_id: str`: resolved from `site_data`.

Errors: raises `RuntimeError` if the site GET does not return 200.
class GbList(GbSite)
`GbList(list_name: str, gb_site: GbSite | None = None, **site_and_auth_kwargs)`

- Pass either a `GbSite` via `gb_site`, or site+auth keywords (`hostname`, `site_path`, `tenant_id`, …).

Core properties:

- `list_url: str`, `list_data: dict`, `list_id: str`
- `list_items_all -> list[dict]`: all list items with automatic pagination. Internally requests pages of 200 items; the page size is not configurable via the property.
- `list_items -> list[dict]`: first page of items (handy for quick checks).
- `list_rows -> list[dict]`: only the `fields` section of each item (derived from `list_items_all`).
- `list_ids -> list[str]`
- `list_fields -> list[str]`: keys from the first `fields`, if present.
- `encode_row(row: dict) -> dict`, `decode_row(row: dict) -> dict`: bidirectional mapping between friendly keys and encoded `_x00.._` internal names.

CRUD methods:

- `create(rows: dict | list[dict] | tuple[dict] | set[dict]) -> dict`
- `update(ids: str | int | list[str] | tuple[str] | set[str], rows: dict | list[dict] | tuple[dict]) -> dict`
- `delete(ids: str | list[str] | tuple[str] | set[str]) -> dict`

Upsert & batch:

- `upload(ids, rows, force=False, delete=False) -> dict`
- `create_many(rows: list[dict], batch_size: int = 20) -> dict`
- `delete_many(ids, batch_size: int = 20, if_match: str | None = None) -> dict`

Criteria-based retrieval:

- `get_items_by_features(features: list[dict]) -> list[dict]`: OR across dicts, AND within each dict. Note: comparisons are on top-level item keys; use `{"fields": {...}}` for list fields.
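The OR/AND semantics can be sketched in plain Python. This is an illustration of the documented rules, not the library's code, and the key-by-key handling of nested dicts such as `"fields"` is an assumption:

```python
def matches(item: dict, feature: dict) -> bool:
    """AND within one feature dict; nested dict values are matched key-by-key (assumed semantics)."""
    for key, expected in feature.items():
        actual = item.get(key)
        if isinstance(expected, dict):
            if not isinstance(actual, dict) or not matches(actual, expected):
                return False
        elif actual != expected:
            return False
    return True

def get_items_by_features_sketch(items, features):
    """OR across feature dicts: keep items matching at least one feature."""
    return [it for it in items if any(matches(it, f) for f in features)]

items = [
    {"id": "1", "fields": {"Status": "Open", "Title": "A"}},
    {"id": "123", "fields": {"Status": "Done"}},
    {"id": "2", "fields": {"Status": "Closed"}},
]
features = [{"fields": {"Status": "Open"}}, {"id": "123"}]
print([it["id"] for it in get_items_by_features_sketch(items, features)])  # ['1', '123']
```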
Errors & best practices
- Methods that call Graph return:
  - `successes`: items with details (`id`, `item`, `updated_row`, …)
  - `failures`: errors with `status`/`error` payload
- Wrap critical calls with `try/except`:

```python
try:
    out = tasks.create({"Title": "Check errors"})
    if out["failures"]:
        print("Failures:", out["failures"])
except Exception as e:
    print("Fatal error:", e)
```

- Permissions: `403 Forbidden` usually means a missing application permission (`Sites.ReadWrite.All`) or missing admin consent.
- Concurrency: batch deletes support `if_match`; standard updates use PATCH on `fields`.
- Pagination: prefer `list_rows`/`list_items_all` to fetch complete sets; `list_items` is a single page.
- Rate limits: for large workloads, prefer batch methods and keep `batch_size` reasonable (typically ≤ 20).
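GraphBridge does not advertise built-in retry handling, so throttled calls (HTTP 429) are worth wrapping yourself. A minimal sketch with exponential backoff follows; the exception type and the message check are assumptions to adapt to the errors your calls actually raise:

```python
import time

def with_retry(call, max_attempts=4, base_delay=1.0):
    """Retry a callable on throttling, with exponential backoff (illustrative sketch)."""
    for attempt in range(max_attempts):
        try:
            return call()
        except RuntimeError as exc:  # adapt to the real exception your Graph calls surface
            if "429" not in str(exc) or attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...

# Simulated flaky call: fails twice with 429, then succeeds
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] < 3:
        raise RuntimeError("429 Too Many Requests")
    return "ok"

print(with_retry(flaky, base_delay=0.01))  # ok
```

In production you would honor the `Retry-After` header when the response exposes it, rather than a fixed backoff schedule.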
Security
- Never hardcode the Client Secret. Use environment variables or a secret vault.
- `GbAuth` obtains tokens at runtime via `azure-identity`.
- Never commit credentials or tokens to source control.
Compatibility & notes
- Python 3.10+ (uses modern union types like `str | int`).
- `list_items_all` is exposed as a property; it internally requests pages of 200 items until completion.
- Column names: Graph's `fields` often use internal names; if your columns have spaces/symbols, use `encode_row`/`decode_row` to simplify payload handling.
Full example
```python
from graphbridge import GbAuth, GbSite, GbList

auth = GbAuth(
    tenant_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    client_id="yyyyyyyy-yyyy-yyyy-yyyy-yyyyyyyyyyyy",
    client_secret="********",
)

site = GbSite(
    hostname="contoso.sharepoint.com",
    site_path="/sites/Finance",
    gb_auth=auth,
)

invoices = GbList(list_name="Invoices", gb_site=site)

# 1) Read
for r in invoices.list_rows[:3]:
    print("Invoice:", r.get("Title"), r.get("Amount"))

# 2) Create
new_items = invoices.create([
    {"Title": "I-2025-001", "Amount": 1000},
    {"Title": "I-2025-002", "Amount": 2500},
])
print("Create:", new_items["successes"])

# 3) Update
upd = invoices.update(ids="42", rows={"Amount": 3000})
print("Update:", upd)

# 4) Upsert & cleanup
ids = ["1001", "1002"]
rows = [{"Title": "A"}, {"Title": "B"}]
print(invoices.upload(ids=ids, rows=rows, force=False, delete=True))

# 5) Batch delete
print(invoices.delete_many(ids=["1001", "1002"], if_match="*"))
```