Official Python Client for Unihra API
Unihra Python SDK
SEO and semantic analysis for your pages and competitors.
Compare content, surface semantic gaps, and get actionable recommendations using zone analysis and vector semantics.
Resources
| Resource | Where |
|---|---|
| Product | unihra.ru — web interface |
| API reference | unihra.ru/docs |
| API key | Telegram: @UniHRA_bot |
| Updates | @mncosine |
Features
- Semantic context (zones) — weights words by where they appear (title, H1–H6, body) and distance to your target queries, with concrete recommendations (for example, what to add to title or headings).
- Page structure — headings, meta tags, and content metrics for your URL and each competitor URL.
- Word comparison (TF‑IDF) — suggested actions per term (add, increase, decrease, ok).
- Phrases (n‑grams) — recurring phrases across competitor pages.
- Vector / LSI terms (DrMaxs) — semantically related vocabulary for the topic.
- Anchors (link texts) — identify missing internal and external link texts used by competitors to rank.
- Cookies — optional per‑URL cookie strings for pages behind login or gates.
- Streaming — the client handles the live analysis stream and waits for completion.
- Retries — optional HTTP retries with backoff for unstable networks.
- Reports — export multi‑sheet Excel reports with formatting (optional dependencies).
- Progress — optional progress bar in notebooks when `tqdm` is installed.
Installation
```shell
pip install unihra
```
Optional bundles (install what you need):
| Command | Includes |
|---|---|
| `pip install "unihra[report]"` | Excel export (pandas, openpyxl) |
| `pip install "unihra[full]"` | Report export + progress bar (tqdm) |
| `pip install "unihra[mcp]"` | MCP server for Cursor / Claude Code (requires Python 3.10+) |
Or install pieces manually, for example: `pip install pandas openpyxl tqdm`.
Quick start
1. Run an analysis
Pass queries — the search intents you care about — so zone recommendations and gap analysis are meaningful.
```python
from unihra import UnihraClient

client = UnihraClient(api_key="YOUR_API_KEY", max_retries=3)

result = client.analyze(
    own_page="https://example.com/my-product",
    competitors=[
        "https://competitor.com/top-product",
        "https://market-leader.com/item",
    ],
    queries=["buy widget", "best widgets 2025"],
    lang="en",
    url_cookies={
        "https://example.com/my-product": "session_id=abc123; auth=true",
    },
    verbose=True,
)

gaps = result.get("semantic_context_analysis", [])
pages = result.get("page_structure", [])

print(f"Semantic gap rows: {len(gaps)}")
for p in pages:
    print(p["url"], "—", p["meta_tags"]["title"])
```
2. Save an Excel report
Sheet names typically include Page Structure, Semantic Gaps, Word Analysis, N‑Grams, Anchors, and vector sections.
```python
client.save_report(result, "seo_report.xlsx")
```
What’s in the result
The SDK returns a Python dictionary aligned with the API. Keys are normalized to snake_case.
1. Page structure
A list of pages (yours first, then competitors). Each item includes:
- `url`
- `meta_tags` — `title`, `description`, etc.
- `content` — `h1_heading`, `heading_structure_raw` (heading outline as text)
- `metrics` — e.g. `char_count_no_spaces`, `uniqueness_percentage`
2. Semantic context analysis
Zone‑weighted comparison of lemmas vs your queries:
- `lemma` — base form
- `competitor_avg_score`, `own_score` — weighted scores (0.0 on your side often means missing or weak placement)
- `gap` — how far behind competitors you are (higher = higher priority)
- `coverage_percent` — share of competitors using the term in a strong context
- `context_snippet` — short example from competitors
- `recommendation` — suggested action (e.g. add to title/H1)
```json
{
  "lemma": "battery",
  "competitor_avg_score": 10.5,
  "own_score": 0.0,
  "gap": 10.5,
  "coverage_percent": 80.0,
  "context_snippet": "long lasting battery life",
  "recommendation": "Add to Title/H1"
}
```
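Rows with this shape can be triaged client-side before editing a page. A minimal sketch (using only the field names shown above, with made-up sample rows) that keeps terms most competitors cover and sorts them by gap score:

```python
# Sample rows shaped like semantic_context_analysis items (values are illustrative).
gaps = [
    {"lemma": "battery", "gap": 10.5, "coverage_percent": 80.0,
     "recommendation": "Add to Title/H1"},
    {"lemma": "charger", "gap": 3.2, "coverage_percent": 40.0,
     "recommendation": "Mention in body"},
]

# Keep terms used by at least half the competitors, highest gap first.
priority = sorted(
    (g for g in gaps if g["coverage_percent"] >= 50),
    key=lambda g: g["gap"],
    reverse=True,
)

for g in priority:
    print(f'{g["lemma"]}: gap={g["gap"]} -> {g["recommendation"]}')
```

The 50% coverage threshold is an arbitrary choice for the sketch; tune it to how aggressive you want the recommendations to be.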
3. Block comparison (lexical)
TF‑IDF style comparison:
- `frequency`, `frequency_own_page`, `pct_target_comp_avg`
- `action_needed` — `add`, `increase`, `decrease`, `ok` (after normalization for English)
```json
{
  "word": "price",
  "frequency": 12.5,
  "pct_target_comp_avg": 2.5,
  "action_needed": "increase",
  "present_on_own_page": true
}
```
4. N‑grams
Phrases (bigrams / trigrams) and how many competitor pages contain them.
- `ngram`, `pages_count`, etc.
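Because `pages_count` says how many competitor pages contain a phrase, it is a natural filter for recurring phrases versus one-offs. A sketch with illustrative sample rows:

```python
# Sample rows shaped like the n-gram items (values are illustrative).
ngrams = [
    {"ngram": "best widgets 2025", "pages_count": 4},
    {"ngram": "buy widget online", "pages_count": 1},
]

# Keep only phrases that recur on at least two competitor pages.
common = [n["ngram"] for n in ngrams if n["pages_count"] >= 2]
```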
5. DrMaxs (vector / LSI)
Semantic neighbours of the topic, grouped (e.g. `by_frequency`, `by_tfidf`), with `similarity_score` and whether the word appears on your page.
```json
{
  "word": "logistics",
  "similarity_score": 0.89,
  "present_on_own_page": false
}
```
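The `present_on_own_page` flag makes it easy to shortlist related vocabulary you have not used yet, ranked by how close it sits to the topic. A sketch with illustrative sample rows:

```python
# Sample rows shaped like the vector/LSI items (values are illustrative).
vectors = [
    {"word": "logistics", "similarity_score": 0.89, "present_on_own_page": False},
    {"word": "delivery", "similarity_score": 0.75, "present_on_own_page": True},
    {"word": "warehouse", "similarity_score": 0.62, "present_on_own_page": False},
]

# Terms absent from your page, most similar first.
to_add = sorted(
    (v for v in vectors if not v["present_on_own_page"]),
    key=lambda v: v["similarity_score"],
    reverse=True,
)
```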
6. Anchors analysis
Comparison of link texts (anchors) used across pages.
- `anchor` — the link text
- `frequency_own` — occurrences on your page
- `frequency_comp_avg` — average occurrences across competitors
- `pages_count` — number of competitor pages using this anchor
```json
{
  "anchor": "buy online",
  "frequency_own": 0,
  "frequency_comp_avg": 5.0,
  "pages_count": 3
}
```
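A row like this (own frequency zero, several competitors using the anchor) signals a missing link text. A sketch that pulls out such anchors, using the field names above with illustrative sample rows:

```python
# Sample rows shaped like the anchors items (values are illustrative).
anchors = [
    {"anchor": "buy online", "frequency_own": 0,
     "frequency_comp_avg": 5.0, "pages_count": 3},
    {"anchor": "free shipping", "frequency_own": 2,
     "frequency_comp_avg": 1.5, "pages_count": 2},
]

# Anchors you never use but at least two competitors do.
missing = [a["anchor"] for a in anchors
           if a["frequency_own"] == 0 and a["pages_count"] >= 2]
```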
Command line
```shell
python -m unihra \
  --key "YOUR_API_KEY" \
  --own "https://mysite.com" \
  --comp "https://comp1.com" \
  --comp "https://comp2.com" \
  --query "main keyword" \
  --cookies "session=secret_123" \
  --save report.xlsx \
  --verbose
```
| Option | Meaning |
|---|---|
| `--own` | Your page URL (required) |
| `--comp` | Competitor URL (repeat for multiple; at least one required) |
| `--query` | Target query (repeatable; recommended) |
| `--lang` | `ru` or `en` (default `ru`) |
| `--cookies` | Cookie string for your own page |
| `--save` | Write `.xlsx` or `.csv` report |
| `--retries` | HTTP retry count |
| `--verbose` | Show progress |
| `--no-style` | Plain Excel without extra styling |
You can omit `--key` if the environment variable `UNIHRA_API_KEY` is set.
Without `--save` and without `--verbose`, the JSON result is printed to the terminal.
Cursor, Claude, and other MCP clients
The optional MCP server lets compatible assistants call Unihra as tools instead of fetching pages themselves.
- Install: `pip install "unihra[mcp]"` (Python 3.10+).
- Set your API key: environment variable `UNIHRA_API_KEY`, or pass `--key` when starting the server.
- Start: `python -m unihra.mcp_server` or the command `unihra-mcp`.
- Point your client’s MCP settings at that Python and module (see below).
File-backed architecture: the `unihra_analyze` tool runs the full analysis and saves the large raw result (~200k chars) to a local JSON file, returning only a `result_id` and a compact summary to the LLM. The model then uses the `unihra_get_*` tools with the `result_id` to retrieve specific data slices on demand. This keeps the context small while ensuring no data is lost.
Available tools:
| Tool | Purpose |
|---|---|
| `unihra_health` | Check that the service is reachable |
| `unihra_analyze` | Primary tool: runs full analysis, saves to disk, returns `result_id` + summary |
| `unihra_list_results` | List all saved analysis results on disk |
| `unihra_delete_result` | Delete a saved analysis result by `result_id` |
| `unihra_get_page_structure` | Fetch heading/meta report for a `result_id` |
| `unihra_get_gaps` | Get semantic gaps and zone recommendations from a `result_id` |
| `unihra_get_anchors` | Get anchor text (link texts) analysis from a `result_id` |
| `unihra_get_vectors` | LSI / vector terms from a `result_id` |
| `unihra_get_word_actions` | TF‑IDF words grouped by action |
| `unihra_get_ngrams` | Phrase list from a `result_id` |
Example MCP configuration (adjust paths to your Python executable):
```json
{
  "mcpServers": {
    "unihra": {
      "command": "python",
      "args": ["-m", "unihra.mcp_server"],
      "env": {
        "UNIHRA_API_KEY": "YOUR_KEY_HERE"
      }
    }
  }
}
```
Unihra Team