
Cognition Core

Core integration package for building AI agents with CrewAI, providing configuration management, memory systems, tool integration, and API capabilities.

Architecture


cognition-core/
├── src/
│   └── cognition_core/
│       ├── api.py              # Core API implementation
│       ├── crew.py             # Enhanced CrewAI base
│       ├── agent.py            # Enhanced Agent class
│       ├── task.py             # Enhanced Task class
│       ├── llm.py              # Portkey LLM integration
│       ├── logger.py           # Logging system
│       ├── config.py           # Configuration management
│       ├── memory/             # Memory implementations
│       │   ├── entity.py       # Entity memory with Chroma
│       │   ├── long_term.py    # PostgreSQL long-term memory
│       │   ├── short_term.py   # Chroma short-term memory
│       │   ├── storage.py      # ChromaDB storage implementation
│       │   └── mem_svc.py      # Memory service orchestration
│       └── tools/              # Tool management
│           ├── custom_tool.py  # Base for custom tools
│           └── tool_svc.py     # Dynamic tool service

Core Features

1. Enhanced Crew Base

  • Automatic API capability through @CognitionCoreCrewBase decorator
  • Integrated tool service management
  • Memory system initialization
  • Configuration management
  • Portkey LLM integration

2. Memory Systems

  • Short-term Memory: ChromaDB-based implementation
  • Long-term Memory: PostgreSQL-based storage
  • Entity Memory: Relationship tracking with ChromaDB
  • Configurable storage backends
  • Embedder configuration support
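Each memory type can be switched on or off independently in memory.yaml. The following is a minimal sketch of that config-driven dispatch; the class name `MemoryService` and its methods are illustrative and do not reflect the actual `mem_svc.py` API, which wires in the ChromaDB and PostgreSQL backends.

```python
# Illustrative sketch of config-driven memory selection.
MEMORY_SECTIONS = ("short_term_memory", "long_term_memory", "entity_memory")

class MemoryService:
    """Enables each memory type only when its config section says so."""

    def __init__(self, config: dict):
        # Keep only sections that exist and are explicitly enabled.
        self.enabled = {
            name: config[name]
            for name in MEMORY_SECTIONS
            if config.get(name, {}).get("enabled", False)
        }

    def is_enabled(self, name: str) -> bool:
        return name in self.enabled

    def backend_host(self, name: str):
        # External Chroma-backed stores carry host/port settings.
        return self.enabled.get(name, {}).get("host")
```

Fed the memory.yaml shown below under Configuration Files, this would report all three stores as enabled and route the Chroma-backed ones to their configured hosts.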

3. Tool Integration

  • Dynamic tool loading from HTTP endpoints
  • Tool service with caching
  • Async tool operations
  • Tool refresh capability
  • Structured tool definitions with Pydantic
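The caching and refresh behavior can be sketched as a small service that fetches tool definitions and serves them from a TTL-bound cache. Everything here is illustrative: the real package uses Pydantic models for tool definitions and an HTTP client for fetching, while this sketch uses a plain dataclass and an injected `fetcher` callable to stay dependency-free.

```python
import time
from dataclasses import dataclass

@dataclass
class ToolDefinition:
    """Structured tool record; the package models these with Pydantic."""
    name: str
    description: str
    endpoint: str

class ToolService:
    """Caches tool definitions fetched from a tool endpoint.

    `fetcher` stands in for the HTTP call (e.g. GET /tools); the
    names and signature here are illustrative, not the package's API.
    """

    def __init__(self, fetcher, ttl: float = 3600.0):
        self._fetcher = fetcher
        self._ttl = ttl
        self._cache = []
        self._fetched_at = float("-inf")

    def get_tools(self):
        # Refresh transparently once the cache is older than the TTL.
        if time.monotonic() - self._fetched_at > self._ttl:
            self.refresh()
        return self._cache

    def refresh(self):
        self._cache = self._fetcher()
        self._fetched_at = time.monotonic()
```

Calling `refresh()` directly corresponds to the tool refresh capability: it bypasses the TTL and re-fetches immediately.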

4. API Integration

  • Built-in FastAPI implementation
  • Async task processing
  • Health check endpoints
  • Task status tracking
  • Background task execution
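The task status tracking behind those endpoints follows a familiar pattern: assign an ID, run the work in the background, record the outcome. The sketch below shows that pattern with plain threads; the package itself uses FastAPI background tasks, and the `TaskRegistry` name and interface are illustrative.

```python
import threading
import uuid

class TaskRegistry:
    """Tracks background task status, as a /tasks status endpoint might."""

    def __init__(self):
        self._statuses = {}
        self._lock = threading.Lock()

    def submit(self, fn, *args) -> str:
        # Hand back an ID immediately; the work runs in the background.
        task_id = str(uuid.uuid4())
        with self._lock:
            self._statuses[task_id] = "running"

        def run():
            try:
                fn(*args)
                outcome = "completed"
            except Exception:
                outcome = "failed"
            with self._lock:
                self._statuses[task_id] = outcome

        threading.Thread(target=run, daemon=True).start()
        return task_id

    def status(self, task_id: str) -> str:
        with self._lock:
            return self._statuses.get(task_id, "unknown")
```

A client would poll `status(task_id)` until it leaves the "running" state, mirroring how a task status endpoint is typically consumed.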

5. Configuration Management

  • Hot-reloading YAML configuration
  • Environment variable integration
  • Configurable paths
  • Fallback to CrewAI defaults
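Hot reloading of this kind is commonly built on mtime polling. The sketch below shows one way it could work, with a throttle mirroring CONFIG_RELOAD_TIMEOUT and `os.path.expandvars` resolving `${VAR}` placeholders; the class name and interface are illustrative, not the actual `config.py` API.

```python
import os
import time
import yaml  # PyYAML

class HotReloadConfig:
    """Re-reads a YAML file whenever its mtime changes (illustrative)."""

    def __init__(self, path: str, reload_timeout: float = 0.1):
        self._path = path
        self._reload_timeout = reload_timeout  # throttle on stat calls
        self._mtime = 0.0
        self._checked_at = float("-inf")
        self._data = {}
        self._maybe_reload()

    def _maybe_reload(self):
        now = time.monotonic()
        if now - self._checked_at < self._reload_timeout:
            return
        self._checked_at = now
        mtime = os.path.getmtime(self._path)
        if mtime != self._mtime:
            self._mtime = mtime
            with open(self._path) as f:
                # expandvars resolves ${VAR} placeholders, e.g.
                # ${LONG_TERM_DB_PASSWORD} in a connection string.
                self._data = yaml.safe_load(os.path.expandvars(f.read())) or {}

    def get(self, key, default=None):
        self._maybe_reload()
        return self._data.get(key, default)
```

The `default` argument plays the role of the fallback behavior: a key missing from the YAML file falls through to whatever default the caller supplies.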

Environment Variables

Required:

  • PORTKEY_API_KEY: API key for Portkey LLM routing
  • PORTKEY_VIRTUAL_KEY: Virtual key for Portkey

Optional:

  • COGNITION_CONFIG_DIR: Configuration directory path
  • CONFIG_RELOAD_TIMEOUT: Config reload timeout in seconds (default: 0.1)
  • LONG_TERM_DB_PASSWORD: PostgreSQL database password
  • CHROMA_PASSWORD: ChromaDB password
  • APP_LOG_LEVEL: Logging level (default: INFO)
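A fail-fast check for the two required variables can save a confusing error later; the package does not necessarily perform this exact check (it may simply fail at LLM initialization), so treat this as an optional startup guard.

```python
import os

# The two variables the README lists as required for Portkey routing.
REQUIRED_VARS = ("PORTKEY_API_KEY", "PORTKEY_VIRTUAL_KEY")

def check_environment() -> None:
    """Raise early if any required environment variable is unset or empty."""
    missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
    if missing:
        raise RuntimeError(
            "Missing required environment variables: " + ", ".join(missing)
        )
```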

Usage Example

from cognition_core import CognitionCoreCrewBase
from cognition_core.crew import CognitionCrew  # import paths may vary by version
from cognition_core.task import CognitionTask
from crewai import Agent, Crew, Process, Task
from crewai.project import agent, crew, task

@CognitionCoreCrewBase
class YourCrew:
    @agent
    def researcher(self) -> Agent:
        return self.get_cognition_agent(
            config=self.agents_config["researcher"],
            llm=self.init_portkey_llm(
                model="gpt-4",
                portkey_config=self.portkey_config
            )
        )

    @task
    def research_task(self) -> Task:
        return CognitionTask(
            config=self.tasks_config["research"],
            tools=["calculator", "search"]
        )

    @crew
    def crew(self) -> Crew:
        return CognitionCrew(
            agents=self.agents,
            tasks=self.tasks,
            process=Process.sequential,
            memory=True,
            tool_service=self.tool_service
        )

# Access API
app = YourCrew().api

Configuration Files

Memory Configuration (memory.yaml)

short_term_memory:
  enabled: true
  external: true
  host: "localhost"
  port: 8000
  collection_name: "short_term"

long_term_memory:
  enabled: true
  external: true
  connection_string: "postgresql://user:${LONG_TERM_DB_PASSWORD}@localhost:5432/db"

entity_memory:
  enabled: true
  external: true
  host: "localhost"
  port: 8000
  collection_name: "entities"

embedder:
  provider: "ollama"
  config:
    model: "nomic-embed-text"
    vector_dimension: 384
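The `${LONG_TERM_DB_PASSWORD}` placeholder in the connection string can be resolved from the environment with `os.path.expandvars`; whether cognition_core uses exactly this mechanism is an assumption, but the end result is the same.

```python
import os

# Example value only; in production this comes from your environment.
os.environ["LONG_TERM_DB_PASSWORD"] = "s3cret"

raw = "postgresql://user:${LONG_TERM_DB_PASSWORD}@localhost:5432/db"
resolved = os.path.expandvars(raw)
# The placeholder is replaced with the environment value.
```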

Tool Configuration (tools.yaml)

tool_services:
  - name: "primary_service"
    enabled: true
    base_url: "http://localhost:8080/api/v1"
    endpoints:
      - path: "/tools"
        method: "GET"

settings:
  cache:
    enabled: true
    ttl: 3600
  validation:
    response_timeout: 30

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Submit a pull request with tests

License

MIT
