AI-Powered Ethical Hacking Assistant

Nebula – AI-Powered Penetration Testing Assistant

Nebula is an advanced, open-source, AI-powered tool that revolutionizes penetration testing by integrating state-of-the-art AI models into your command-line interface. Designed for cybersecurity professionals, ethical hackers, and developers, Nebula automates vulnerability assessments and enhances security workflows with real-time insights and automated note-taking.

Acknowledgement

First, I would like to thank the Almighty God, who is the source of all knowledge; without Him, this would not be possible.

News

Introducing the Deep Application Profiler (DAP). DAP uses neural networks to analyze an executable's internal structure and intent, rather than relying on traditional virus signatures. This approach enables it to detect new, zero-day malware that conventional methods often miss. DAP also provides detailed breakdowns for rapid analyst review and is available as both a web service and an API.

Nebula: AI-Powered Penetration Testing Platform

Nebula is a cutting-edge, AI-powered penetration testing tool designed for cybersecurity professionals and ethical hackers. It integrates advanced AI models, including OpenAI's models (any model available via the API), Meta's Llama-3.1-8B-Instruct, Mistral AI's Mistral-7B-Instruct-v0.2, and DeepSeek-R1-Distill-Llama-8B, directly into the command-line interface (CLI). By leveraging these state-of-the-art models, Nebula not only enhances vulnerability assessments and penetration testing workflows but also supports any tool that can be invoked from the CLI.

Installation

System Requirements:

For CPU-based inference with Ollama (note that Ollama also supports GPU):

  • At least 16GB of RAM
  • Python 3.10 – 3.13.9
  • Ollama

Installation Command:

python -m pip install nebula-ai --upgrade

Running Nebula

Ollama Local Model Based Usage

Install Ollama and pull your preferred model, for example:

ollama pull mistral

Then enter the model's exact name as it appears in Ollama in the engagement settings.

OpenAI Models Usage

To use OpenAI models, add your API key to your environment, like so:

export OPENAI_API_KEY="sk-your-api-key"

Then enter the OpenAI model's exact name in the engagement settings.

Run nebula

nebula

Using Docker

First allow local connections to your X server:

xhost +local:docker
docker run --rm -it \
  -e DISPLAY=$DISPLAY \
  -v /home/YOUR_HOST_NAME/.local/share/nebula/logs:/root/.local/share/nebula/logs \
  -v YOUR_ENGAGEMENT_FOLDER_ON_HOST_MACHINE:/engagements \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  berylliumsec/nebula:latest

Interacting with the models

To interact with the models, begin your input with a ! or use the AI/Terminal button to switch between modes. For example: ! write a python script to scan the ports of a remote system. The "!" is not needed if you use the AI/Terminal button.
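The sample prompt above asks the model for a port scanner. As an illustrative sketch of the kind of script such a prompt might produce (this is not Nebula's output, and the function name is hypothetical):

```python
import socket


def scan_ports(host: str, ports: range, timeout: float = 0.5) -> list[int]:
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception.
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports


if __name__ == "__main__":
    # Only scan hosts you are authorized to test; localhost is a safe default.
    print(scan_ports("127.0.0.1", range(20, 100)))
```

As always with AI-generated tooling, review such scripts before running them, and only scan systems you have explicit permission to test.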

Key Features

  • AI-Powered Internet Search via agents:
    Enhance responses by integrating real-time, internet-sourced context to keep you updated on cybersecurity trends, e.g., "What's in the news on cybersecurity today?"

  • AI-Assisted Note-Taking:
    Automatically record and categorize security findings.

  • Real-Time AI-Driven Insights:
    Get immediate suggestions for discovering and exploiting vulnerabilities based on terminal tool outputs.

  • Enhanced Tool Integration:
    Seamlessly import data from external tools for AI-powered note-taking and advice.

  • Integrated Screenshot & Editing:
    Capture and annotate images directly within Nebula for streamlined documentation.

  • Manual Note-Taking & Automatic Command Logging:
    Maintain a detailed log of your actions and findings with both automated and manual note-taking features.

  • Status feed:
    This panel displays your most recent penetration testing activities; it refreshes every five minutes.
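The tool-integration and note-taking features above revolve around turning raw tool output into structured findings. As an illustrative sketch only (not Nebula's actual implementation), here is how output in nmap's grepable format (`-oG`) could be parsed into per-host findings suitable for automated note-taking:

```python
import re


def parse_nmap_grepable(text: str) -> dict[str, list[tuple[int, str]]]:
    """Map each host to its (port, service) pairs from `nmap -oG` output."""
    findings: dict[str, list[tuple[int, str]]] = {}
    for line in text.splitlines():
        m = re.search(r"Host: (\S+).*Ports: (.+)", line)
        if not m:
            continue
        host, ports_field = m.groups()
        # Each entry looks like "22/open/tcp//ssh///"; fields are /-separated.
        for entry in ports_field.split(","):
            fields = entry.strip().split("/")
            if len(fields) >= 5 and fields[1] == "open":
                findings.setdefault(host, []).append((int(fields[0]), fields[4]))
    return findings
```

For example, a line such as `Host: 10.0.0.5 ()  Ports: 22/open/tcp//ssh///, 80/open/tcp//http///` would yield `{"10.0.0.5": [(22, "ssh"), (80, "http")]}`.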

Roadmap

  • Create custom models that are more useful for penetration testing

Troubleshooting

Logs are located at /home/[your_username]/.local/share/nebula/logs. You will most likely find the cause of an error in one of these logs.

Get More Support


