
AI-Powered Ethical Hacking Assistant


Nebula – AI-Powered Penetration Testing Assistant

Nebula is an advanced, open-source, AI-powered penetration testing assistant that integrates state-of-the-art AI models into your command-line interface. Designed for cybersecurity professionals, ethical hackers, and developers, it automates vulnerability assessments and enhances security workflows with real-time insights and automated note-taking.

[Screenshot: Nebula AI-powered penetration testing CLI interface]

Acknowledgement

First, I would like to thank the Almighty God, who is the source of all knowledge; without Him, this would not be possible.

News

Introducing the Deep Application Profiler (DAP). DAP uses neural networks to analyze an executable's internal structure and intent, rather than relying on traditional virus signatures. This approach enables it to detect new, zero-day malware that conventional methods often miss. DAP also provides detailed breakdowns for rapid analyst review and is available as both a web service and an API. Learn More Here

Nebula: AI-Powered Penetration Testing Platform

Nebula is a cutting-edge, AI-powered penetration testing tool designed for cybersecurity professionals and ethical hackers. It integrates advanced AI models, including OpenAI's models (any model available via the API), Meta's Llama-3.1-8B-Instruct, Mistral AI's Mistral-7B-Instruct-v0.2, and DeepSeek-R1-Distill-Llama-8B, directly into the command-line interface (CLI). By leveraging these state-of-the-art models, Nebula not only enhances vulnerability assessments and penetration testing workflows but also supports any tool that can be invoked from the CLI.

Installation

System Requirements:

For CPU-based inference (Ollama; note that Ollama supports GPUs too):

  • At least 16GB of RAM
  • Python 3.10 – 3.13.9
  • Ollama

Installation Command:

python -m pip install nebula-ai --upgrade

Running Nebula

Important: configure a model backend (a local Ollama model or an OpenAI model) before starting an engagement.

Ollama Local Model Based Usage

Install Ollama and download your preferred models, for example:

ollama pull mistral

Then enter the model's exact name as it appears in Ollama in the engagement settings.
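To see the exact names of the models you have pulled, you can query Ollama's local REST API directly (Ollama serves it on port 11434 by default). This is a minimal sketch, not part of Nebula itself; the helper names are illustrative:

```python
import json
import urllib.request

# Ollama's default local API endpoint for listing pulled models
OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"

def parse_model_names(tags_json: str) -> list[str]:
    """Extract model names from the JSON returned by Ollama's /api/tags endpoint."""
    payload = json.loads(tags_json)
    return [model["name"] for model in payload.get("models", [])]

def list_local_models(url: str = OLLAMA_TAGS_URL) -> list[str]:
    """Query a running Ollama instance for the models it has pulled."""
    with urllib.request.urlopen(url) as response:
        return parse_model_names(response.read().decode("utf-8"))
```

Run list_local_models() while Ollama is running and copy a name such as "mistral:latest" into the engagement settings (the same result is available from the "ollama list" command).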

OpenAI Models Usage

To use OpenAI models, add your API key to your environment like so:

export OPENAI_API_KEY="sk-blah-blaj"

Then enter the OpenAI model's exact name in the engagement settings.
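Tools that use the OpenAI API conventionally read this variable from the environment at startup, which is why it must be exported in the same shell that launches Nebula. A minimal sketch of that convention (the helper name is illustrative, not Nebula's actual API):

```python
import os

def get_openai_api_key() -> str:
    """Read the OpenAI API key from the environment, as exported above."""
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "OPENAI_API_KEY is not set; export it in the shell that launches nebula."
        )
    return key
```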

Run nebula

nebula

Using Docker

First, allow local connections to your X server, then run the container:

xhost +local:docker
docker run --rm -it \
  -e DISPLAY=$DISPLAY \
  -v /home/YOUR_HOST_NAME/.local/share/nebula/logs:/root/.local/share/nebula/logs \
  -v YOUR_ENGAGEMENT_FOLDER_ON_HOST_MACHINE:/engagements \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  berylliumsec/nebula:latest

Interacting with the models

To interact with the models, begin your input with a "!", or use the AI/Terminal button to switch between modes. For example: ! write a python script to scan the ports of a remote system. The "!" prefix is not needed once you have switched to AI mode with the button.
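The routing convention above can be sketched as a small function. This is an illustration of the "!" prefix rule, not Nebula's actual implementation; the function name and return shape are assumptions:

```python
def route_input(line: str, ai_mode_active: bool = False) -> tuple[str, str]:
    """Decide whether a line of input goes to the AI model or the terminal.

    A leading "!" (or an active AI mode toggled via the button) sends the
    text to the model; everything else runs as an ordinary shell command.
    """
    stripped = line.strip()
    if ai_mode_active:
        return ("ai", stripped)
    if stripped.startswith("!"):
        # Drop the prefix before handing the prompt to the model
        return ("ai", stripped[1:].strip())
    return ("terminal", stripped)
```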

Key Features

  • AI-Powered Internet Search via agents:
    Enhance responses by integrating real-time, internet-sourced context to stay current on cybersecurity trends, e.g. "what's in the news on cybersecurity today?"

  • AI-Assisted Note-Taking:
    Automatically record and categorize security findings.

  • Real-Time AI-Driven Insights:
    Get immediate suggestions for discovering and exploiting vulnerabilities based on terminal tool outputs.

  • Enhanced Tool Integration:
    Seamlessly import data from external tools for AI-powered note-taking and advice.

  • Integrated Screenshot & Editing:
    Capture and annotate images directly within Nebula for streamlined documentation.

  • Manual Note-Taking & Automatic Command Logging:
    Maintain a detailed log of your actions and findings with both automated and manual note-taking features.

  • Status feed:
    This panel displays your most recent penetration testing activities; it refreshes every five minutes.

Roadmap

  • Create custom models that are more useful for penetration testing

Troubleshooting

Logs are located at /home/[your_username]/.local/share/nebula/logs. You will most likely find the cause of an error in one of these logs.

Get More Support

Project details



Download files

Download the file for your platform.

Source Distribution

nebula_ai-2.0.0b27.tar.gz (20.7 MB)

Uploaded Source

Built Distribution


nebula_ai-2.0.0b27-py3-none-any.whl (20.8 MB)

Uploaded Python 3

File details

Details for the file nebula_ai-2.0.0b27.tar.gz.

File metadata

  • Download URL: nebula_ai-2.0.0b27.tar.gz
  • Upload date:
  • Size: 20.7 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for nebula_ai-2.0.0b27.tar.gz
  • SHA256: 65cff0853dac2b49515afc2bae08e3d85b362ebe7ebad11effb368467c2e6ee0
  • MD5: a5e84e6de4975084ae6599f513eb8ce3
  • BLAKE2b-256: 3245b1f31af621826b41574904ba31a523e1ac11cbab1f49c796537498eddf3f
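You can check a downloaded archive against the published SHA256 value above before installing. A minimal verification sketch using the standard library:

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 8192) -> str:
    """Compute the SHA-256 hex digest of a file, reading in chunks so
    large archives are not loaded into memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the published digest, e.g. for the sdist:
# sha256_of_file("nebula_ai-2.0.0b27.tar.gz") == "65cff0853dac2b49515afc2bae08e3d85b362ebe7ebad11effb368467c2e6ee0"
```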


Provenance

The following attestation bundles were made for nebula_ai-2.0.0b27.tar.gz:

Publisher: publish.yml on berylliumsec/nebula

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file nebula_ai-2.0.0b27-py3-none-any.whl.

File metadata

  • Download URL: nebula_ai-2.0.0b27-py3-none-any.whl
  • Upload date:
  • Size: 20.8 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for nebula_ai-2.0.0b27-py3-none-any.whl
  • SHA256: fa8b6bf231153c178e02aba61450836fc0cecc12c2a73c55e959acae668a267a
  • MD5: 7dc99324d7231185478878175b81f2a7
  • BLAKE2b-256: c4affcf2eb7468a91128dc9a7318eaced18cb9a087e07dd12184a28a72e1ad16


Provenance

The following attestation bundles were made for nebula_ai-2.0.0b27-py3-none-any.whl:

Publisher: publish.yml on berylliumsec/nebula

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
