Codex Weather Agent 🌤️
A conversational weather agent powered by LangGraph with configurable LLM support and intelligent conversation memory.
🚀 Features
🤖 Configurable LLM Support
- Google Gemini (2.5 Flash, 1.5 Pro, 1.0 Pro)
- OpenAI (GPT-4, GPT-4 Turbo, GPT-3.5 Turbo)
- Anthropic Claude (Claude 3 Sonnet, Haiku, Claude 2)
- Custom LLM support for any LangChain-compatible model
🌍 Comprehensive Weather Data
- Current Weather - Real-time conditions for any city
- 5-Day Forecasts - Detailed predictions with 3-hour intervals
- Air Quality - Pollution levels and air quality indices
- Location Detection - Automatic IP-based location discovery
- Geocoding - Convert location names to coordinates
💬 Natural Conversation
- Memory Management - Remembers conversation context (configurable)
- Conversational Style - Natural responses without bullet points or lists
- Streaming Support - Real-time response generation
- Error Recovery - Graceful handling of API failures
📦 Installation
Basic Installation
```bash
pip install codex-weather-agent
```
With specific LLM providers
```bash
# For Google Gemini (recommended)
pip install codex-weather-agent[google]

# For OpenAI
pip install codex-weather-agent[openai]

# For Anthropic Claude
pip install codex-weather-agent[anthropic]

# For all providers
pip install codex-weather-agent[all]
```
🔧 Quick Start
Basic Usage with Google Gemini
```python
from codex_weather_agent import create_weather_agent

# Create an agent with Google Gemini (all parameters are required)
agent = create_weather_agent(
    llm_provider="google",                      # REQUIRED: choose your LLM provider
    llm_model="gemini-2.5-flash",               # REQUIRED: specify the model name
    llm_api_key="your-google-api-key",          # REQUIRED: your LLM API key
    openweather_api_key="your-openweather-key"  # REQUIRED: OpenWeather API key
)

# Have a natural conversation about the weather
response = agent.chat("What's the weather like right now?")
print(response)

response = agent.chat("How about tomorrow in Tokyo?")
print(response)
```
Using Different LLM Providers
```python
# OpenAI GPT-4 (all parameters required)
agent = create_weather_agent(
    llm_provider="openai",
    llm_model="gpt-4",
    llm_api_key="your-openai-key",          # REQUIRED
    openweather_api_key="your-weather-key"  # REQUIRED
)

# Anthropic Claude (all parameters required)
agent = create_weather_agent(
    llm_provider="anthropic",
    llm_model="claude-3-sonnet",
    llm_api_key="your-anthropic-key",       # REQUIRED
    openweather_api_key="your-weather-key"  # REQUIRED
)

# Custom LLM
from langchain_openai import ChatOpenAI

custom_llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0.2)
agent = create_weather_agent(
    llm_provider="custom",
    llm_model="custom",                      # Can be any string for a custom LLM
    llm_api_key="not-used-for-custom",       # Still required
    openweather_api_key="your-weather-key",  # REQUIRED
    custom_llm=custom_llm
)
```
Advanced Configuration
```python
from codex_weather_agent import WeatherAgent, LLMConfig, WeatherConfig

# Detailed LLM configuration
llm_config = LLMConfig(
    provider="google",
    model="gemini-2.5-flash",
    temperature=0.1,
    max_tokens=1000,
    api_key="your-api-key"
)

weather_config = WeatherConfig(
    openweather_api_key="your-weather-key",
    request_timeout=10,
    default_units="metric"
)

# Create the agent with custom configurations
agent = WeatherAgent(
    llm_config=llm_config,
    weather_config=weather_config,
    max_memory_conversations=10  # Remember the last 10 conversations
)
```
Streaming Responses
```python
# Get real-time streaming responses
for chunk in agent.stream_chat("Tell me about the weather in Paris"):
    print(chunk, end="", flush=True)
```
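If you need the complete text as well as live output, the chunks can be accumulated as they arrive. A minimal sketch (`collect_stream` is an illustrative helper, not part of the package), assuming `stream_chat` yields plain string chunks as in the loop above:

```python
def collect_stream(chunks):
    """Print streamed chunks as they arrive and return the assembled response."""
    parts = []
    for chunk in chunks:
        print(chunk, end="", flush=True)  # live output
        parts.append(chunk)
    print()  # final newline once the stream ends
    return "".join(parts)

# full_response = collect_stream(agent.stream_chat("Tell me about the weather in Paris"))
```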
Memory Management
```python
# Check memory usage
memory_info = agent.get_memory_info()
print(f"Conversations in memory: {memory_info['current_conversations']}")
print(f"LLM provider: {memory_info['llm_provider']}")

# Clear conversation memory
agent.clear_memory()
```
🔑 API Keys (REQUIRED)
⚠️ All API Keys Are Now Mandatory
Starting from version 1.0.4, ALL API keys are required for the weather agent to function. This ensures reliable operation and prevents rate limiting issues.
Required API Keys
1. LLM provider API key (mandatory; choose one):
   - Google: get one from Google AI Studio
   - OpenAI: get one from the OpenAI Platform
   - Anthropic: get one from the Anthropic Console
2. OpenWeather API key (mandatory):
   - Get one from OpenWeatherMap
   - Free tier available with 1,000 calls/day
   - No longer optional: you must provide your own key
Setting API Keys
Environment Variables (Recommended)
```bash
export GOOGLE_API_KEY="your-google-api-key"
export OPENWEATHER_API_KEY="your-openweather-key"
```
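A small helper (a sketch, not part of the package) can read these variables and fail fast when one is missing, rather than letting the agent fail mid-conversation:

```python
import os

def load_api_keys():
    """Collect the required API keys from the environment, raising if any are unset."""
    keys = {
        "llm_api_key": os.getenv("GOOGLE_API_KEY"),
        "openweather_api_key": os.getenv("OPENWEATHER_API_KEY"),
    }
    missing = [name for name, value in keys.items() if not value]
    if missing:
        raise RuntimeError(f"Missing API keys for: {', '.join(missing)}")
    return keys

# agent = create_weather_agent(llm_provider="google", llm_model="gemini-2.5-flash",
#                              **load_api_keys())
```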
Direct Parameter Passing
```python
agent = create_weather_agent(
    llm_api_key="your-llm-api-key",
    openweather_api_key="your-weather-key"
)
```
🌟 Example Conversations
The agent responds naturally without structured formatting:
```python
agent = create_weather_agent(
    llm_provider="google",
    llm_model="gemini-2.5-flash",
    llm_api_key="your-google-api-key",
    openweather_api_key="your-openweather-key"
)

# Natural weather queries
print(agent.chat("Hey, what's it like outside?"))
# "Hey there! Let me check your current location... It looks like you're in New York,
#  and it's a beautiful sunny day with 72°F and clear skies!"

print(agent.chat("Should I bring an umbrella tomorrow?"))
# "Based on tomorrow's forecast for New York, you should definitely grab an umbrella!
#  There's rain expected in the afternoon with about an 80% chance of precipitation..."

print(agent.chat("What about the air quality?"))
# "The air quality in your area is pretty good today with an AQI of 45, which means
#  it's safe for outdoor activities and everyone can enjoy being outside!"
```
🛠️ Available Tools
The agent has access to these weather tools:
- `current_location()` - Detect the user's location via IP
- `get_current_weather(city)` - Current weather conditions
- `get_weather_forecast(city)` - 5-day weather forecast
- `get_air_pollution(lat, lon)` - Air quality data
- `get_location_coordinates(location)` - Geocoding service
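As a point of reference, OpenWeather's air pollution endpoint reports its air quality index on a 1-5 scale (1 = Good, 5 = Very Poor). A tiny helper like the following (an illustrative sketch, not part of the package) can turn that index into a readable label when working with `get_air_pollution` results directly:

```python
# OpenWeather AQI scale: 1 = Good ... 5 = Very Poor
AQI_LABELS = {1: "Good", 2: "Fair", 3: "Moderate", 4: "Poor", 5: "Very Poor"}

def describe_air_quality(aqi: int) -> str:
    """Map OpenWeather's 1-5 air quality index to a human-readable label."""
    return AQI_LABELS.get(aqi, "Unknown")
```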
🔧 Configuration Options
LLM Configuration
```python
from codex_weather_agent import LLMConfig

config = LLMConfig(
    provider="google",        # "google", "openai", "anthropic", or "custom"
    model="gemini-2.5-flash",
    api_key="your-key",
    temperature=0.1,          # Response randomness (0.0-1.0)
    max_tokens=1000,          # Maximum response length
    top_p=0.8,                # Nucleus sampling (Google only)
    top_k=40,                 # Top-k sampling (Google only)
    additional_params={}      # Provider-specific parameters
)
```
Weather Configuration
```python
from codex_weather_agent import WeatherConfig

config = WeatherConfig(
    openweather_api_key="your-key",
    request_timeout=5,               # API request timeout in seconds
    max_retries=3,                   # Number of retry attempts
    enable_location_detection=True,  # Auto-detect the user's location
    default_units="metric"           # "metric", "imperial", or "kelvin"
)
```
🧪 Error Handling
The package includes comprehensive error handling:
```python
from codex_weather_agent import WeatherAgentError, LLMConfigError, APIKeyError

try:
    agent = create_weather_agent(llm_provider="invalid")
except LLMConfigError as e:
    print(f"LLM configuration error: {e}")

try:
    response = agent.chat("What's the weather?")
except WeatherAgentError as e:
    print(f"Weather agent error: {e}")
```
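Transient network failures can often be retried rather than surfaced. A generic wrapper (a sketch; `retries`, `delay`, and `transient` are illustrative parameters, not package options) might look like:

```python
import time

def chat_with_retry(agent, message, retries=3, delay=0.5, transient=(Exception,)):
    """Call agent.chat, retrying up to `retries` times on transient errors."""
    for attempt in range(1, retries + 1):
        try:
            return agent.chat(message)
        except transient:
            if attempt == retries:
                raise  # out of attempts: surface the error to the caller
            time.sleep(delay)
```

In practice you would pass `transient=(WeatherAgentError,)` so only the package's weather errors are retried while configuration errors fail immediately.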
📋 Requirements
- Python 3.8+
- Internet connection for weather data and LLM APIs
- Valid API keys for your chosen LLM provider
Core Dependencies
- requests>=2.31.0
- langchain-core>=0.3.0
- langgraph>=0.2.0
- typing-extensions>=4.7.0
Optional Dependencies
- langchain-google-genai>=2.0.0 (for Google Gemini)
- langchain-openai>=0.2.0 (for OpenAI models)
- langchain-anthropic>=0.2.0 (for Anthropic Claude)
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
🔗 Links
- PyPI: https://pypi.org/project/codex-weather-agent/
- GitHub: https://github.com/CodexJitin/codex-weather-agent
- Documentation: https://github.com/CodexJitin/codex-weather-agent#readme
- Issues: https://github.com/CodexJitin/codex-weather-agent/issues
Made with ❤️ by CodexJitin
File details
Details for the file codex_weather_agent-1.0.4.tar.gz.
File metadata
- Download URL: codex_weather_agent-1.0.4.tar.gz
- Upload date:
- Size: 19.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 0d8cc523c1df774cc15c3fd7c764925b9a5199d9d9eba02d9948b9f75eb9f9b5 |
| MD5 | 362c597ec7b48e907135cb4e730631ff |
| BLAKE2b-256 | 475a919ddc0c126baa43d1524c0c1a9a010d3cb73ca080fd6f85c774f94dbe14 |
File details
Details for the file codex_weather_agent-1.0.4-py3-none-any.whl.
File metadata
- Download URL: codex_weather_agent-1.0.4-py3-none-any.whl
- Upload date:
- Size: 19.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 924024de948d39f3129c4148093606f342a20a4d208d0393c89837fbabf1e373 |
| MD5 | 7edeea0944f8e81f8e683ac1070ff462 |
| BLAKE2b-256 | 55146245a887e2f4c51c18a4e29f95f1274b267d2bcfe7bcf900ecc2cb7cd15b |