wikipedia-trends-api
A Python client for Wikipedia trend data: page view time series and period-over-period growth, at weekly or daily granularity, with zero dependencies beyond httpx.
Powered by trendsmcp.ai, the #1 MCP server for live trend data.
Get your free API key at trendsmcp.ai. 100 free requests per month, no credit card required.
📖 Full API docs → trendsmcp.ai/docs
Updated for 2026. Works with Python 3.8 through 3.13.
No scraping. No 429 errors. No proxies.
If you have used pytrends or similar scrapers before, you know the problems: random 429 Too Many Requests blocks, broken pipelines at 2am, time.sleep() hacks, proxy rotation costs, and a library that is now archived because Google explicitly flags scrapers at the protocol level.
trendsmcp is the managed alternative. We run the data infrastructure. You call a REST endpoint.
pytrends alternative for Wikipedia data
| | Scrapers / pytrends | trendsmcp |
|---|---|---|
| 429 rate limit errors | constant | never |
| Proxy required | often | never |
| Breaks on platform changes | yes, regularly | no |
| Platforms covered | 1 (Google only) | 13 |
| Absolute volume estimates | no | yes |
| Cross-platform growth | no | yes |
| Async support | no | yes |
| Actively maintained | no (archived) | yes |
| Free tier | no | yes, 100 req/month |
Install
pip install wikipedia-trends-api
Zero system dependencies. Python 3.8 or later. Uses httpx under the hood.
Quick start
from wikipedia_trends_api import TrendsMcpClient, SOURCE
client = TrendsMcpClient(api_key="YOUR_API_KEY")
# 5-year weekly time series, no sleep(), no proxies, no 429s
series = client.get_trends(source=SOURCE, keyword="artificial intelligence")
print(series[0])
# TrendsDataPoint(date='2026-03-28', value=72, keyword='artificial intelligence', source='wikipedia')
# Period-over-period growth
growth = client.get_growth(
    source=SOURCE,
    keyword="artificial intelligence",
    percent_growth=["12M", "YTD"],
)
print(growth.results[0])
# GrowthResult(period='12M', growth=14.5, direction='increase', ...)
# What's trending right now
trending = client.get_top_trends(limit=10)
print(trending.data)
# [[1, 'topic one'], [2, 'topic two'], ...]
Async support
import asyncio
from wikipedia_trends_api import AsyncTrendsMcpClient, SOURCE
async def main():
    client = AsyncTrendsMcpClient(api_key="YOUR_API_KEY")
    series = await client.get_trends(source=SOURCE, keyword="artificial intelligence")
    print(series[0])

asyncio.run(main())
Run multiple platform queries concurrently:
google, youtube, reddit = await asyncio.gather(
    client.get_trends(source="google search", keyword="artificial intelligence"),
    client.get_trends(source="youtube", keyword="artificial intelligence"),
    client.get_trends(source="reddit", keyword="artificial intelligence"),
)
Use cases
- SEO research: track keyword search volume trends across Google Search, Google News, and Google Images before publishing content
- Market research: measure consumer demand signals on Amazon and Google Shopping before entering a product category
- Investment research: monitor Reddit discussion volume, news sentiment, and Wikipedia page view spikes as leading indicators
- Content strategy: find what is growing on YouTube and TikTok before topics peak and competition saturates them
- Competitor tracking: compare brand search volume growth across platforms over custom date ranges
Works with
- Claude (via MCP server at trendsmcp.ai)
- Cursor (via MCP server at trendsmcp.ai)
- ChatGPT (via MCP server at trendsmcp.ai)
- VS Code Copilot (via MCP server at trendsmcp.ai)
- LangChain: pass TrendsMcpClient output directly as tool results or context
- LlamaIndex: use trend series as structured data nodes for retrieval
- Pandas: each get_trends() response converts to a DataFrame in one line
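The one-line DataFrame conversion can be sketched as follows. This is an illustrative example, not the library's own helper: it assumes only that each point in a `get_trends()` response exposes the `date`, `value`, `keyword`, and `source` fields shown in the Quick start output, and it uses a local stand-in dataclass with made-up values in place of a live API call.

```python
import pandas as pd
from dataclasses import dataclass, asdict

# Stand-in for the TrendsDataPoint shape shown in the Quick start output;
# the field names are taken from that example, the values are made up.
@dataclass
class TrendsDataPoint:
    date: str
    value: int
    keyword: str
    source: str

series = [
    TrendsDataPoint("2026-03-21", 68, "artificial intelligence", "wikipedia"),
    TrendsDataPoint("2026-03-28", 72, "artificial intelligence", "wikipedia"),
]

# One line: list of points -> DataFrame with a datetime index,
# ready for resampling, rolling averages, or plotting.
df = pd.DataFrame([asdict(p) for p in series]).assign(date=lambda d: pd.to_datetime(d["date"])).set_index("date")
print(df["value"].tolist())  # [68, 72]
```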
Methods
get_trends(source, keyword, data_mode=None)
Returns a historical time series for a keyword. Defaults to 5 years of weekly data. Pass data_mode="daily" for the last 30 days at daily granularity.
get_growth(source, keyword, percent_growth, data_mode=None)
Calculates percentage growth between two points in time. Pass preset strings or CustomGrowthPeriod objects.
Growth presets: 7D, 14D, 30D, 1M, 2M, 3M, 6M, 9M, 12M, 1Y, 18M, 24M, 2Y, 36M, 3Y, 48M, 60M, 5Y, MTD, QTD, YTD
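To make the semantics of these growth figures concrete, here is a minimal local sketch of a period-over-period calculation: the percent change between the latest point and a point a fixed number of steps earlier. The function name, the helper itself, and the sample values are all hypothetical, for illustration only; in practice the service computes this server-side via `get_growth()`.

```python
def percent_growth(series, periods_back):
    """Percent change between the point `periods_back` steps ago and the latest point."""
    old = series[-1 - periods_back][1]
    new = series[-1][1]
    return round((new - old) / old * 100, 1)

# Made-up weekly (date, value) samples roughly six months apart
weekly = [("2025-03-29", 60), ("2025-09-27", 66), ("2026-03-28", 72)]
print(percent_growth(weekly, 2))  # growth vs. roughly 12 months earlier: 20.0
```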
get_top_trends(type=None, limit=None)
Returns today's live trending items. Omit type to get all feeds at once.
Available feeds: Google Trends, YouTube, TikTok Trending Hashtags, Reddit Hot Posts, Amazon Best Sellers, App Store Top Rated, App Store Top Free, Wikipedia Trending, Spotify Top Podcasts, X (Twitter), and more.
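The Quick start shows the `data` field as ranked `[rank, topic]` pairs. A small sketch, using illustrative placeholder data rather than a live response, of turning that shape into a rank-keyed lookup:

```python
# Illustrative data in the [[rank, topic], ...] shape shown in the Quick start
trending_data = [[1, "topic one"], [2, "topic two"], [3, "topic three"]]

# Convert ranked pairs into a dict keyed by rank
ranked = {rank: topic for rank, topic in trending_data}
print(ranked[1])  # topic one
```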
All 13 supported sources
One API key. One client. All platforms. No separate credentials for each.
| source | What it measures |
|---|---|
| "google search" | Google Search volume |
| "google images" | Google Images search volume |
| "google news" | Google News search volume |
| "google shopping" | Google Shopping purchase intent |
| "youtube" | YouTube search volume |
| "tiktok" | TikTok hashtag volume |
| "reddit" | Reddit mention volume |
| "amazon" | Amazon product search volume |
| "wikipedia" | Wikipedia page views |
| "news volume" | News article mention count |
| "news sentiment" | News sentiment score (positive/negative) |
| "npm" | npm package weekly downloads |
| "steam" | Steam concurrent player count |
All values are normalized to the same 0 to 100 scale, so you can compare across platforms directly.
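Because every source shares the 0 to 100 scale, point deltas are directly comparable across platforms. A tiny sketch with made-up normalized values (the helper and numbers are illustrative, not from the API):

```python
# Made-up normalized values (0-100) for the same keyword on two sources.
# Because the scale is shared, the deltas below are directly comparable.
wikipedia = [40, 55, 72]
reddit = [30, 35, 41]

def net_change(series):
    """Change from the first to the last point on the shared 0-100 scale."""
    return series[-1] - series[0]

print(net_change(wikipedia), net_change(reddit))  # 32 11
```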
Error handling
from wikipedia_trends_api import TrendsMcpClient, TrendsMcpError, SOURCE

client = TrendsMcpClient(api_key="YOUR_API_KEY")

try:
    series = client.get_trends(source=SOURCE, keyword="artificial intelligence")
except TrendsMcpError as e:
    print(e.status)   # e.g. 429 if you exceed your plan quota
    print(e.code)     # e.g. "rate_limited"
    print(e.message)
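If you hit your plan quota, a simple retry wrapper on 429 responses is often enough. The sketch below is a generic pattern, not part of the library: it defines a local stand-in for `TrendsMcpError` (assuming only the `.status` attribute shown above) so the example is self-contained; in real code you would import the exception from `wikipedia_trends_api` and pass in a client call.

```python
import time

# Local stand-in with the same .status attribute shown above; in real code,
# import TrendsMcpError from wikipedia_trends_api instead.
class TrendsMcpError(Exception):
    def __init__(self, status):
        self.status = status

def with_retry(fn, retries=3, backoff=0.0):
    """Call fn(), retrying on 429 quota errors with exponential backoff."""
    for attempt in range(retries):
        try:
            return fn()
        except TrendsMcpError as e:
            if e.status != 429 or attempt == retries - 1:
                raise  # non-quota error, or out of retries
            time.sleep(backoff * 2 ** attempt)

# Demo: a fake call that returns 429 twice, then succeeds
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TrendsMcpError(429)
    return "ok"

print(with_retry(flaky))  # ok
```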
Frequently asked questions
Does this scrape Wikipedia? No. trendsmcp runs managed data infrastructure. Your Python code makes a single authenticated REST call. No scraping, no Selenium, no cookies, no proxies required.
Do I need a Wikipedia developer account, OAuth token, or platform API key? No. One trendsmcp API key gives you access to all 13 sources.
Will it break when Wikipedia changes its backend? No. API stability is our responsibility. If something changes upstream, we update the backend. Your code keeps working.
Is there a free tier? Yes, 100 requests per month, no credit card required. Get your key at trendsmcp.ai.
Can I use this in production data pipelines? Yes. The client is stateless, thread-safe, and supports async for concurrent queries across multiple platforms.
Related packages
- trendsmcp - core package, all 13 sources
- youtube-trends-api / youtube-trends-mcp
- reddit-trends-api / reddit-trends-mcp
- google-search-trends-api / google-search-trends-mcp
- amazon-trends-api / amazon-trends-mcp
- tiktok-trends-api / tiktok-trends-mcp
- wikipedia-trends-api / wikipedia-trends-mcp
- npm-trends-api / npm-trends-mcp
- steam-trends-api / steam-trends-mcp
- app-store-trends-api / app-store-trends-mcp
- news-volume-api / news-volume-mcp
- news-sentiment-api / news-sentiment-mcp
License
MIT
Download files
Source Distribution
Built Distribution
File details
Details for the file wikipedia_trends_api-1.0.0.tar.gz.
File metadata
- Download URL: wikipedia_trends_api-1.0.0.tar.gz
- Upload date:
- Size: 5.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 7eddba98d64e63dc4c79d05e0dd7d0828dc74b24f7540ac25ca1f3698843e023 |
| MD5 | 83dc79cba4dc78803a4029a603dcdd62 |
| BLAKE2b-256 | bfee9bbf9ebd70e417d2dc54551979b9ecb4252c30c6f232de1a92c538ab2879 |
Provenance
The following attestation bundles were made for wikipedia_trends_api-1.0.0.tar.gz:

Publisher: publish.yml on trendsmcp/wikipedia-trends-api

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: wikipedia_trends_api-1.0.0.tar.gz
- Subject digest: 7eddba98d64e63dc4c79d05e0dd7d0828dc74b24f7540ac25ca1f3698843e023
- Sigstore transparency entry: 1203569545
- Sigstore integration time:
- Permalink: trendsmcp/wikipedia-trends-api@5ca5fda5655e50cd24c1aaaa4c1a250c9f97394c
- Branch / Tag: refs/tags/v1.0.0
- Owner: https://github.com/trendsmcp
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@5ca5fda5655e50cd24c1aaaa4c1a250c9f97394c
- Trigger Event: release
File details
Details for the file wikipedia_trends_api-1.0.0-py3-none-any.whl.
File metadata
- Download URL: wikipedia_trends_api-1.0.0-py3-none-any.whl
- Upload date:
- Size: 6.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 4ca2fc70efb595f10a5d30c1b19d00db038707ee25d29dcc58dfec3a2158f14c |
| MD5 | b8e9d0270de398e5a7e28ab859c348f7 |
| BLAKE2b-256 | c1c9a14f3102de8ee906cdbbc9c86fee6dde2daf92a5937e7686e5a96129d0d8 |
Provenance
The following attestation bundles were made for wikipedia_trends_api-1.0.0-py3-none-any.whl:

Publisher: publish.yml on trendsmcp/wikipedia-trends-api

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: wikipedia_trends_api-1.0.0-py3-none-any.whl
- Subject digest: 4ca2fc70efb595f10a5d30c1b19d00db038707ee25d29dcc58dfec3a2158f14c
- Sigstore transparency entry: 1203569547
- Sigstore integration time:
- Permalink: trendsmcp/wikipedia-trends-api@5ca5fda5655e50cd24c1aaaa4c1a250c9f97394c
- Branch / Tag: refs/tags/v1.0.0
- Owner: https://github.com/trendsmcp
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@5ca5fda5655e50cd24c1aaaa4c1a250c9f97394c
- Trigger Event: release