KISS Agent Framework - A simple and portable agent framework for building and evolving AI agents
When Simplicity Becomes Your Superpower: Meet KISS Sorcar, a general-purpose and software-engineering AI assistant and IDE
"Everything should be made as simple as possible, but not simpler." — Albert Einstein
KISS stands for "Keep It Simple, Stupid," a well-known software engineering principle.
Introduction to KISS Sorcar
KISS Sorcar (named after P.C. Sorcar, the legendary Bengali magician, evoking an agent whose feats appear magical yet are grounded in disciplined engineering) is a general-purpose assistant and integrated development environment (IDE) built on top of the KISS Agent Framework, a stupidly-simple agentic framework. It codes well, runs fast, and can work relentlessly for hours. KISS Sorcar is implemented as a Visual Studio Code extension that runs locally. It offers full browser support (via the open-source Chromium browser and Playwright), multimodal support, Docker container support, and OpenClaw-like features (to be detailed later on social media). Best of all, KISS Sorcar is completely free and open-source; all you need is a model API key from a major LLM provider, such as Anthropic (highly recommended). A paper on KISS Sorcar can be found at papers/kisssorcar/kiss_sorcar.pdf, and an older video at https://www.youtube.com/watch?v=xnYxWvRqACE. We no longer recommend explicitly creating a plan in KISS Sorcar; see the paper for details.
Note that Sorcar also means government in Bengali.
Full Installation
```shell
git clone https://github.com/ksenxx/kiss_ai.git
cd kiss_ai
./install.sh
```
KISS Sorcar Extension Installation
To install KISS Sorcar, open Visual Studio Code, search for "KISS Sorcar" in the extension marketplace, install it, and relaunch VS Code. When prompted for API keys, press ESC to skip any provider you do not use, but you must provide at least one API key.
You can also manually download the extension from src/kiss/agents/vscode/kiss-sorcar.vsix.
CLI Interface
If you do not want to use the KISS Sorcar IDE, you can open a terminal and use sorcar as a normal shell command. Some examples are:
```shell
sorcar -t "What is 2435*234?"
sorcar -n -t --use-chat "What is 2435*234?"  # -n starts a new chat session
sorcar -m "claude-sonnet-4-6" -t "What is 2435*234?"  # use a specific model
echo "Can you find the cheapest non-stop flight from SFO to JFK on June 15 by consulting various websites?" > prompt
sorcar -f prompt  # send the contents of a file as the task
sorcar -t 'Can you send the message "Hello from Sorcar!" to ksen via the desktop Slack app?'
sorcar -t 'Can you write a thorough and precise plan in PLAN.md to simplify the project code?'
sorcar -t 'I see some issues and bugs in PLAN.md. Can you fix them?'  # a white lie that pushes the agent to improve the plan
```
🤖 Models Supported
The framework includes context length, pricing, and capability flags for the following models:
Generation Models (text generation with function calling support):
- OpenAI: gpt-4.1, gpt-4.1-mini, gpt-4.1-nano, gpt-4o, gpt-4o-mini, gpt-4.5-preview, gpt-4-turbo, gpt-4, gpt-5, gpt-5-mini, gpt-5-nano, gpt-5-pro, gpt-5.1, gpt-5.2, gpt-5.2-pro, gpt-5.3-chat-latest, gpt-5.4, gpt-5.4-pro
- OpenAI (Codex): gpt-5-codex, gpt-5.1-codex, gpt-5.1-codex-max, gpt-5.1-codex-mini, gpt-5.2-codex, gpt-5.3-codex, codex-mini-latest
- OpenAI (Reasoning): o1, o1-mini, o1-pro, o3, o3-mini, o3-mini-high, o3-pro, o3-deep-research, o4-mini, o4-mini-high, o4-mini-deep-research
- OpenAI (Open Source): openai/gpt-oss-20b, openai/gpt-oss-120b
- Anthropic: claude-opus-4-6, claude-opus-4-5, claude-opus-4-1, claude-opus-4, claude-sonnet-4-6, claude-sonnet-4-5, claude-sonnet-4, claude-haiku-4-5
- Anthropic (Legacy): claude-3-5-haiku, claude-3-haiku-20240307
- Gemini: gemini-2.5-pro, gemini-2.5-flash, gemini-2.0-flash, gemini-2.0-flash-lite, gemini-1.5-pro (deprecated), gemini-1.5-flash (deprecated)
- Gemini (preview, unreliable function calling): gemini-3-pro-preview, gemini-3-flash-preview, gemini-3.1-pro-preview, gemini-3.1-flash-lite-preview, gemini-2.5-flash-lite
- Together AI (Llama): Llama-4-Scout/Maverick (with function calling), Llama-3.x series (generation only)
- Together AI (Qwen): Qwen2.5-72B/7B-Instruct-Turbo, Qwen2.5-Coder-32B, Qwen2.5-VL-72B, Qwen3-235B series, Qwen3-Coder-480B, Qwen3-Coder-Next, Qwen3-Next-80B, Qwen3-VL-32B/8B, QwQ-32B (with function calling)
- Together AI (DeepSeek): DeepSeek-R1, DeepSeek-V3-0324, DeepSeek-V3.1 (with function calling)
- Together AI (Kimi/Moonshot): Kimi-K2-Instruct, Kimi-K2-Instruct-0905, Kimi-K2-Thinking, Kimi-K2.5
- Together AI (Mistral): Ministral-3-14B, Mistral-7B-v0.2/v0.3, Mistral-Small-24B
- Together AI (Z.AI): GLM-5.0, GLM-4.5-Air, GLM-4.7
- Together AI (Other): Nemotron-Nano-9B, Arcee (Coder-Large, Maestro-Reasoning, Virtuoso-Large, trinity-mini), DeepCogito (cogito-v2 series), google/gemma-2b/3n, Refuel-LLM-2/2-Small, essentialai/rnj-1, marin-community/marin-8b
- OpenRouter: Access to 300+ models from 60+ providers via a unified API:
- OpenAI (gpt-3.5-turbo, gpt-4, gpt-4-turbo, gpt-4.1, gpt-4o variants, gpt-5/5.1/5.2/5.3/5.4 and codex variants, o1, o3, o3-pro, o4-mini, codex-mini, gpt-oss, gpt-audio)
- Anthropic (claude-3-haiku, claude-3.5-haiku/sonnet, claude-3.7-sonnet, claude-sonnet-4/4.5, claude-haiku-4.5, claude-opus-4/4.1/4.5/4.6 with 1M context)
- Google (gemini-2.0-flash, gemini-2.5-flash/pro, gemini-3-flash/pro-preview, gemma-2-9b/27b, gemma-3-4b/12b/27b, gemma-3n-e4b)
- Meta Llama (llama-3-8b/70b, llama-3.1-8b/70b/405b, llama-3.2-1b/3b/11b-vision, llama-3.3-70b, llama-4-maverick/scout, llama-guard-2/3/4)
- DeepSeek (deepseek-chat/v3/v3.1/v3.2/v3.2-speciale, deepseek-r1/r1-0528/r1-turbo, deepseek-r1-distill variants, deepseek-coder-v2, deepseek-prover-v2)
- Qwen (qwen-2.5-7b/72b, qwen-turbo/plus/max, qwen3-8b/14b/30b/32b/235b, qwen3-coder/coder-plus/coder-next/coder-flash/coder-30b, qwen3-vl variants, qwq-32b, qwen3-next-80b, qwen3-max/max-thinking)
- Amazon Nova (nova-micro/lite/pro, nova-2-lite, nova-premier)
- Cohere (command-r, command-r-plus, command-a, command-r7b)
- X.AI Grok (grok-3/3-mini/3-beta/3-mini-beta, grok-4/4-fast, grok-4.1-fast, grok-code-fast-1)
- MiniMax (minimax-01, minimax-m1, minimax-m2/m2.1/m2.5/m2-her)
- ByteDance Seed (seed-1.6, seed-1.6-flash, seed-2.0, seed-2.0-thinking)
- MoonshotAI (kimi-k2, kimi-k2-thinking, kimi-k2.5, kimi-dev-72b)
- Mistral (codestral, devstral/devstral-medium/devstral-small, mistral-large/medium/small, mixtral-8x7b/8x22b, ministral-3b/8b/14b, pixtral, voxtral)
- NVIDIA (llama-3.1-nemotron-70b/ultra-253b, llama-3.3-nemotron-super-49b, nemotron-nano-9b-v2/12b-v2-vl, nemotron-3-nano-30b)
- Z.AI/GLM (glm-5, glm-4-32b, glm-4.5/4.5-air/4.5v, glm-4.6/4.6v, glm-4.7/4.7-flash)
- AllenAI (olmo-2/3-7b/32b-instruct/think, olmo-3.1-32b-instruct/think, molmo-2-8b)
- Perplexity (sonar, sonar-pro, sonar-pro-search, sonar-deep-research, sonar-reasoning-pro)
- NousResearch (hermes-2-pro/3/4-llama series, hermes-4-70b/405b, deephermes-3)
- Baidu ERNIE (ernie-4.5 series including VL and thinking variants)
- Aurora (openrouter/aurora-alpha — free cloaked reasoning model)
- And 30+ more providers (ai21, aion-labs, alfredpros, alpindale, anthracite-org, arcee-ai, bytedance, deepcogito, essentialai, ibm-granite, inception, inflection, kwaipilot, liquid, meituan, morph, nex-agi, opengvlab, prime-intellect, relace, sao10k, stepfun-ai, tencent, thedrummer, tngtech, upstage, writer, xiaomi, etc.)
Embedding Models (for RAG and semantic search):
- OpenAI: text-embedding-3-small, text-embedding-3-large, text-embedding-ada-002
- Google: text-embedding-004, gemini-embedding-001
- Together AI: BAAI/bge-large-en-v1.5, BAAI/bge-base-en-v1.5, m2-bert-80M-32k-retrieval, multilingual-e5-large-instruct, gte-modernbert-base
Each model in MODEL_INFO includes capability flags:
- is_function_calling_supported: whether the model reliably supports tool/function calling
- is_generation_supported: whether the model supports text generation
- is_embedding_supported: whether the model is an embedding model
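The flag names above come from MODEL_INFO, but the excerpt below is invented for illustration only; the real MODEL_INFO in the framework may use a different structure and carries additional fields (context length, pricing) per model. A minimal sketch of filtering models by capability flag:

```python
# Hypothetical excerpt of a MODEL_INFO-style table: model name -> capability flags.
# The entries and structure here are illustrative, not the framework's actual data.
MODEL_INFO = {
    "claude-sonnet-4-6": {
        "is_function_calling_supported": True,
        "is_generation_supported": True,
        "is_embedding_supported": False,
    },
    "text-embedding-3-small": {
        "is_function_calling_supported": False,
        "is_generation_supported": False,
        "is_embedding_supported": True,
    },
}

def models_with(flag: str) -> list[str]:
    """Return the names of models whose given capability flag is set."""
    return [name for name, info in MODEL_INFO.items() if info.get(flag)]

print(models_with("is_function_calling_supported"))  # ['claude-sonnet-4-6']
print(models_with("is_embedding_supported"))         # ['text-embedding-3-small']
```

A lookup like this is how an agent can pick a tool-calling-capable generation model and a separate embedding model for RAG from the same table.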
🤗 Contributing
Contributions in the form of issues are welcome! KISS Sorcar should be able to take care of them.
📄 License
Apache-2.0
✍️ Authors
- Koushik Sen (ksen@berkeley.edu) | LinkedIn | X @koushik77
- Marius Momeu (marius.momeu@berkeley.edu) | LinkedIn | X @MariusMomeu
- Yogya Mehrotra (ymehrotr@ucsc.edu) | LinkedIn
Download files
File details
Details for the file kiss_agent_framework-2026.4.11.tar.gz.
File metadata
- Download URL: kiss_agent_framework-2026.4.11.tar.gz
- Upload date:
- Size: 12.9 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.11.2 {"installer":{"name":"uv","version":"0.11.2","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"macOS","version":null,"id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `21c25eb08650c6c51821de118fa5cc16774a0663ea2c9c56f65fd7b675ec5294` |
| MD5 | `2b85caa7949cc45dfb3853815bba2591` |
| BLAKE2b-256 | `5f25ca84a4c7fff056ed0d432af5636db6f52e726972c719eeb3b1cd7d511982` |
File details
Details for the file kiss_agent_framework-2026.4.11-py3-none-any.whl.
File metadata
- Download URL: kiss_agent_framework-2026.4.11-py3-none-any.whl
- Upload date:
- Size: 2.4 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.11.2 {"installer":{"name":"uv","version":"0.11.2","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"macOS","version":null,"id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `567602c72c6fb9e35392124713868d1337affb01c134a8f9b136ac7f05a051b4` |
| MD5 | `ae92568315c7dd8c2a5b7dfae9e23f36` |
| BLAKE2b-256 | `1431f246362b4de71ceb37f441f88d87cae50007a60d138ac3c4b76f0e436942` |
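The published digests above can be checked locally before installing. A minimal sketch using only the Python standard library's `hashlib` (the file name matches the source distribution listed above; the verification call is left commented because it assumes the file has already been downloaded):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, reading in 8 KiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Expected digest of the source distribution, copied from the table above.
EXPECTED = "21c25eb08650c6c51821de118fa5cc16774a0663ea2c9c56f65fd7b675ec5294"
# assert sha256_of("kiss_agent_framework-2026.4.11.tar.gz") == EXPECTED
```

Alternatively, `pip install --require-hashes` with a pinned requirements file performs the same check automatically.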