
Co-create PowerPoint slide decks with AI

Project description


---
title: SlideDeck AI
emoji: 🏢
colorFrom: yellow
colorTo: green
sdk: streamlit
sdk_version: 1.44.1
app_file: app.py
pinned: false
license: mit
---


SlideDeck AI

We spend a lot of time creating slides and organizing our thoughts for any presentation. With SlideDeck AI, co-create slide decks on any topic with Artificial Intelligence and Large Language Models. Describe your topic and let SlideDeck AI generate a PowerPoint slide deck for you—it's as simple as that!


Process

SlideDeck AI works in the following way:

  1. Given a topic description, it uses a Large Language Model (LLM) to generate the initial content of the slides. The output is generated as structured JSON data based on a pre-defined schema.
  2. Next, it uses keywords from the JSON output to search for and, with a certain probability, download a few relevant images.
  3. Subsequently, it uses the python-pptx library to generate the slides, based on the JSON data from the previous step. A user can choose from a set of pre-defined presentation templates.
  4. From this stage onward, a user can provide additional instructions to refine the content. For example, one can ask to add another slide or modify an existing slide. A history of instructions is maintained.
  5. Every time SlideDeck AI generates a PowerPoint presentation, a download button is provided. Clicking on the button will download the file.

In addition, SlideDeck AI can also create a presentation based on PDF files.

Python API Usage

from slidedeckai.core import SlideDeckAI


slide_generator = SlideDeckAI(
    model='[gg]gemini-2.5-flash-lite',
    topic='Make a slide deck on AI',
    api_key='your-google-api-key',  # Or set via environment variable
)
pptx_path = slide_generator.generate()
print(f'🤖 Generated slide deck: {pptx_path}')
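Note that the model identifier combines a provider code in square brackets (such as `gg` for Google Gemini) with the model name. A tiny helper, shown here purely as an illustration and not part of the SlideDeck AI API, can split the two parts:

```python
import re

def parse_model_id(model_id: str) -> tuple[str, str]:
    """Split a '[provider]model-name' identifier into (provider, model)."""
    match = re.fullmatch(r'\[(\w+)\](.+)', model_id)
    if not match:
        raise ValueError(f'Unexpected model identifier: {model_id!r}')
    return match.group(1), match.group(2)

provider, model = parse_model_id('[gg]gemini-2.5-flash-lite')
print(provider, model)  # gg gemini-2.5-flash-lite
```

The provider codes for each supported LLM are listed in the table under "Summary of the LLMs" below.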

CLI Usage

Generate a new slide deck:

slidedeckai generate --model '[gg]gemini-2.5-flash-lite' --topic 'Make a slide deck on AI' --api-key 'your-google-api-key'

Launch the Streamlit app:

slidedeckai launch

List the supported models (only these models are supported by SlideDeck AI):

slidedeckai --list-models

Summary of the LLMs

SlideDeck AI allows the use of different LLMs from several online providers—Azure OpenAI, Google, Cohere, Together AI, and OpenRouter. Most of these service providers offer generous free usage of relevant LLMs without requiring any billing information.

Based on several experiments, SlideDeck AI generally recommends the use of Mistral NeMo, Gemini Flash, and GPT-4o to generate the slide decks.

The supported LLMs offer different styles of content generation. Use one of the following LLMs along with relevant API keys/access tokens, as appropriate, to create the content of the slide deck:

| LLM | Provider (code) | Requires API key | Characteristics |
|---|---|---|---|
| Claude Haiku 4.5 | Anthropic (an) | Mandatory; get here | Faster, detailed |
| Gemini 2.0 Flash | Google Gemini API (gg) | Mandatory; get here | Faster, longer content |
| Gemini 2.0 Flash Lite | Google Gemini API (gg) | Mandatory; get here | Fastest, longer content |
| Gemini 2.5 Flash | Google Gemini API (gg) | Mandatory; get here | Faster, longer content |
| Gemini 2.5 Flash Lite | Google Gemini API (gg) | Mandatory; get here | Fastest, longer content |
| GPT-4.1-mini | OpenAI (oa) | Mandatory; get here | Faster, medium content |
| GPT-4.1-nano | OpenAI (oa) | Mandatory; get here | Faster, shorter content |
| GPT-5 | OpenAI (oa) | Mandatory; get here | Slow, shorter content |
| GPT | Azure OpenAI (az) | Mandatory; get here (NOTE: you need to have your subscription/billing set up) | Faster, longer content |
| Command R+ | Cohere (co) | Mandatory; get here | Shorter, simpler content |
| Gemini-2.0-flash-001 | OpenRouter (or) | Mandatory; get here | Faster, longer content |
| GPT-3.5 Turbo | OpenRouter (or) | Mandatory; get here | Faster, longer content |
| DeepSeek-V3.1-Terminus | SambaNova (sn) | Mandatory; get here | Fast, detailed content |
| Llama-3.3-Swallow-70B-Instruct-v0.4 | SambaNova (sn) | Mandatory; get here | Fast, shorter content |
| DeepSeek V3-0324 | Together AI (to) | Mandatory; get here | Slower, medium-length content |
| Llama 3.3 70B Instruct Turbo | Together AI (to) | Mandatory; get here | Slower, detailed content |
| Llama 3.1 8B Instruct Turbo 128K | Together AI (to) | Mandatory; get here | Faster, shorter content |

IMPORTANT: SlideDeck AI does NOT store your API keys/tokens or transmit them elsewhere. If you provide your API key, it is used only to invoke the relevant LLM to generate content. That's it! This is an open-source project, so feel free to audit the code and convince yourself.

In addition, offline LLMs provided by Ollama can be used. Read below to know more.

Icons

SlideDeck AI uses a subset of icons from bootstrap-icons-1.11.3 (MIT license) in the slides. A few icons from SVG Repo (CC0, MIT, and Apache licenses) are also used.

Local Development

SlideDeck AI uses LLMs via different providers. To run this project yourself, you need to provide an appropriate API key, for example, in a .env file. Alternatively, you can provide the access token directly in the app's user interface (UI).
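For example, a minimal .env file might look like the following. The variable name below is an assumption for illustration; consult the project's documentation or source for the exact key names it reads:

```shell
# .env -- hypothetical variable name; check the project docs for the exact names expected
GOOGLE_GEMINI_API_KEY=your-google-api-key
```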

Offline LLMs Using Ollama

SlideDeck AI allows the use of offline LLMs to generate the contents of the slide decks. This is typically suitable for individuals or organizations who would like to use self-hosted LLMs, for example, due to privacy concerns.

Offline LLMs are made available via Ollama. Therefore, a pre-requisite here is to have Ollama installed on the system and the desired LLM pulled locally. You should choose a model to use based on your hardware capacity. However, if you have no GPU, gemma3:1b can be a suitable model to run only on CPU.

In addition, the RUN_IN_OFFLINE_MODE environment variable needs to be set to True to enable the offline mode. This, for example, can be done using a .env file or from the terminal. The typical steps to use SlideDeck AI in offline mode (in a bash shell) are as follows:

# Environment initialization, especially on Debian
sudo apt update -y
sudo apt install python-is-python3 -y
sudo apt install git -y
# Change the package name based on the Python version installed: python -V
sudo apt install python3.11-venv -y

# Install Git Large File Storage (LFS)
sudo apt install git-lfs -y
git lfs install

ollama list  # View locally available LLMs
export RUN_IN_OFFLINE_MODE=True  # Enable the offline mode to use Ollama
git clone https://github.com/barun-saha/slide-deck-ai.git
cd slide-deck-ai
git lfs pull  # Pull the PPTX template files - ESSENTIAL STEP!

python -m venv venv  # Create a virtual environment
source venv/bin/activate  # On a Linux system
pip install -r requirements.txt

streamlit run ./app.py  # Run the application

💡If you have cloned the repository locally but cannot open and view the PPTX templates, you may need to run git lfs pull to download the template files. Without this, although content generation will work, the slide deck cannot be created.

The .env file should be created inside the slide-deck-ai directory.

The UI is similar to the online mode. However, rather than selecting an LLM from a list, one has to type the name of the Ollama model to use into a text box. No API key is asked for here.

The online and offline modes are mutually exclusive. So, setting RUN_IN_OFFLINE_MODE to False will make SlideDeck AI use the online LLMs (i.e., the "original" mode). By default, RUN_IN_OFFLINE_MODE is set to False.
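The way such an environment toggle is typically read can be sketched as follows. This is an illustrative sketch of the usual pattern, not the project's actual code:

```python
import os

def is_offline_mode() -> bool:
    """Read the RUN_IN_OFFLINE_MODE toggle; defaults to False (online mode)."""
    return os.getenv('RUN_IN_OFFLINE_MODE', 'False').strip().lower() in ('true', '1')

# Enable offline mode for this process, as the shell export above would do
os.environ['RUN_IN_OFFLINE_MODE'] = 'True'
print(is_offline_mode())  # True
```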

Finally, the focus is on using offline LLMs, not on going completely offline. Internet connectivity is still required to fetch images from Pexels.


Award

SlideDeck AI has won the 3rd Place in the Llama 2 Hackathon with Clarifai in 2023.

Contributors

SlideDeck AI is glad to have the following community contributions:

  • Aditya: added support for page range selection for PDF files and new chat button.
  • Sagar Bharatbhai Bharadia: added support for Gemini 2.5 Flash Lite and Gemini 2.5 Flash LLMs.
  • Sairam Pillai: unified the project's LLM access by migrating the API calls to LiteLLM.
  • Srinivasan Ragothaman: added OpenRouter support and API keys mapping from the .env file.

Thank you all for your contributions!


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

slidedeckai-8.0.4.tar.gz (2.8 MB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

slidedeckai-8.0.4-py3-none-any.whl (2.9 MB)

Uploaded Python 3

File details

Details for the file slidedeckai-8.0.4.tar.gz.

File metadata

  • Download URL: slidedeckai-8.0.4.tar.gz
  • Size: 2.8 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for slidedeckai-8.0.4.tar.gz
Algorithm Hash digest
SHA256 7426e62590bceb9cb67f0a693bebdf81c1d233b93808be1f0329b3647cdbcb36
MD5 9b4115efe35cb55036336994bf7eba4d
BLAKE2b-256 81ddbe46cb5f828ecefeb2c82d89ec7611c3fc0cb7085bf85fbdc8a80e1a2a18

See more details on using hashes here.

Provenance

The following attestation bundles were made for slidedeckai-8.0.4.tar.gz:

Publisher: publish-to-pypi.yml on barun-saha/slide-deck-ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file slidedeckai-8.0.4-py3-none-any.whl.

File metadata

  • Download URL: slidedeckai-8.0.4-py3-none-any.whl
  • Size: 2.9 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for slidedeckai-8.0.4-py3-none-any.whl
Algorithm Hash digest
SHA256 1428f59962c790c54b22c0820e8e325ef4fb94ea85c9bec1f81f1e365146b203
MD5 29401b94a469815831e2315fad2e8d18
BLAKE2b-256 f7e5f37f932f38be9404b3dbf2292040c00ebedb67e97084afa95d0913325f62

See more details on using hashes here.

Provenance

The following attestation bundles were made for slidedeckai-8.0.4-py3-none-any.whl:

Publisher: publish-to-pypi.yml on barun-saha/slide-deck-ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
