
A pythonic nextcord extension including useful tools for bot development and debugging.



  Onami

a debugging and utility extension for nextcord bots
Read the documentation online.


Fork

Onami is an actively maintained fork of Jishaku for nextcord.

onami is an extension for bot developers that enables rapid prototyping, experimentation, and debugging of features for bots.

One of onami's core philosophies is to be dynamic and easy to use. Here's the two-step install:

  1. Download onami on the command line using pip:

     pip install -U onami

  2. Load the extension in your bot code before it runs:

     bot.load_extension('onami')

That's it!

You can also import the module to use the command development utilities.

Index

Command reference

> onami [py|python] <argument>

> onami [pyi|python_inspect] <argument>

The Python commands execute or evaluate Python code passed into them. They support simple expressions:

> oni py 3+4

Beep Bot

7

It also supports async expressions:

> oni py await _ctx.pins()

Beep Bot

[<Message id=123456789012345678 ...>, ...]

You can pass in codeblocks for longer blocks of code to execute, and you can use yield to return intermediate results within your processing.
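The yield behavior can be pictured as the executor consuming your codeblock like a generator and relaying each yielded value as it is produced. A standalone sketch of that idea (plain Python, not onami's actual executor):

```python
# Illustrative only: models how yielded values become intermediate results.
def user_codeblock():
    total = 0
    for n in (1, 2, 3):
        total += n
        yield total  # each value would be reported back as it is produced

intermediate_results = list(user_codeblock())
print(intermediate_results)  # [1, 3, 6]
```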

The inspect variant of the command will return a codeblock with detailed inspection information on all objects returned.

The variables available by default in all execution contexts are:

_ctx: The Context that invoked the command.
_bot: The running Bot instance.
_author, _channel, _guild, _message, _msg: Shortcuts for attributes on _ctx.
_find, _get: Shortcuts for nextcord.utils functions.

The underscore prefix on the provided variables is intended to help prevent shadowing when writing large blocks of code within the command.
If you decide that you don't want the prefix, you can disable it by setting the onami_NO_UNDERSCORE environment variable to true.

Each Python command is individually scoped. That means variables you create won't be retained in later invocations.
You can use onami retain on to change this behavior and retain variables, and onami retain off if you change your mind later.

> onami [dis|disassemble] <argument>

This command compiles Python code in an asynchronous context, and then disassembles the resulting function into Python bytecode in the style of dis.dis.

This allows you to quickly and easily determine the bytecode that results from a given expression or piece of code. The code itself is not actually executed.
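To get a feel for that style of output, the standard library's dis module can be used directly (this is plain dis.dis on an ordinary function, not onami's async-context compilation):

```python
import dis

def example():
    return 3 + 4  # CPython folds this to the constant 7 at compile time

dis.dis(example)  # prints the bytecode instructions for the function
```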

> onami [sh|shell] <argument>

The shell command executes commands within your system shell.

If you're on Linux and using a custom shell, onami will obey the SHELL environment variable; otherwise, it will use /bin/bash.
On Windows, onami will use PowerShell if it's detected; otherwise, it will use Command Prompt.
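The selection rules described above can be sketched roughly like this (a simplification for illustration; onami's actual detection logic may differ):

```python
import os
import sys

def pick_shell():
    """Rough sketch of the shell-selection rules described above."""
    if sys.platform == "win32":
        # Prefer PowerShell when it is present, else fall back to cmd.
        powershell = os.path.join(
            os.environ.get("SystemRoot", r"C:\Windows"),
            "System32", "WindowsPowerShell", "v1.0", "powershell.exe",
        )
        return powershell if os.path.exists(powershell) else "cmd"
    # On other platforms, honor a custom SHELL, defaulting to /bin/bash.
    return os.environ.get("SHELL", "/bin/bash")

print(pick_shell())
```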

The results from the commands you pass in are returned through a paginator interface live as the command runs. If you need to stop a command, you can press the stop button reaction, or use oni cancel.

The execution will terminate automatically if no output is produced for 120 seconds.

> onami git <argument>

> onami pip <argument>

These commands act as shortcuts to the shell command, so you can save typing a word if you use these frequently.

> onami [load|reload] [extensions...]

> onami unload [extensions...]

These commands load, reload, or unload extensions on your bot.

You can reload onami itself with oni reload onami.
oni reload ~ will reload all extensions on your bot.

You can load, reload, or unload multiple extensions at once: oni reload cogs.one cogs.two

> onami shutdown

This command gracefully shuts down your bot.

> onami rtt

This command calculates the Round-Trip Time from your bot to the API. It does this by taking several response-time samples, so you can tell whether your bot is responding slowly.
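Conceptually, the measurement is just repeated timing of a round trip. A hypothetical sketch, with the API call replaced by a stand-in delay:

```python
import time

def sample_rtt(round_trip, samples=5):
    """Time several round trips and return the average in seconds."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        round_trip()  # a real bot would await an API call here
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

avg = sample_rtt(lambda: time.sleep(0.001))
print(f"average RTT: {avg * 1000:.1f} ms")
```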

> onami cat <file>

This command reads a file from your file system, automatically detecting encoding and (if applicable) highlighting.

You can use this to read things like bot logs or source files in your project.

> onami curl <url>

This command reads text from a URL and attempts to detect encoding and language, similar to oni cat.

You can use this to display contents of files online, for instance, the message.txt files created when a message is too long, or raw files from paste sites.

> onami exec [member and/or channel...] <command string>

> onami debug <command string>

> onami repeat <times> <command string>

These commands serve as command control for other commands.

onami exec allows you to execute a command as another user, in another channel, or both. Using aliases with a postfix exclamation mark (such as oni exec! ...) executes the command bypassing checks and cooldowns.

onami debug executes a command with an exception wrapper and a timer. This allows you to quickly get feedback on reproducible command errors and slowdowns.

onami repeat repeats a command a number of times.
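The behavior of onami debug can be pictured as an exception wrapper plus a timer around the command invocation. A simplified standalone model (not onami's actual implementation):

```python
import time

def debug_invoke(command, *args):
    """Run a callable, reporting its duration and any exception raised."""
    start = time.perf_counter()
    try:
        result = command(*args)
    except Exception as exc:
        elapsed = time.perf_counter() - start
        return f"raised {type(exc).__name__}: {exc} after {elapsed:.3f}s"
    elapsed = time.perf_counter() - start
    return f"returned {result!r} after {elapsed:.3f}s"

print(debug_invoke(lambda x: x * 2, 21))
print(debug_invoke(lambda: 1 / 0))
```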

> onami permtrace <channel> [targets...]

This command allows you to investigate the source of expressed permissions in a given channel. Targets can be either a member or a list of roles (to simulate a member with those roles).

It will read all of the guild permissions and channel overwrites for the given member or roles in the channel, and provide a breakdown containing whether the permission is granted, and the most fundamental reason why.
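That layered resolution (base guild permissions, then channel overwrites applied on top) can be modeled with bitmasks. A toy model for illustration only; nextcord's real permission objects are far richer:

```python
# Toy bitmask model of permission resolution (not nextcord's API).
READ = 0b01
SEND = 0b10

def resolve(base, overwrites):
    """Apply (deny, allow) overwrite pairs in order over a base bitmask."""
    perms = base
    for deny, allow in overwrites:
        perms = (perms & ~deny) | allow
    return perms

guild_perms = READ | SEND           # granted by roles guild-wide
role_overwrite = (SEND, 0)          # channel overwrite: deny SEND
member_overwrite = (0, SEND)        # member overwrite: allow SEND back

perms = resolve(guild_perms, [role_overwrite, member_overwrite])
print(bool(perms & READ), bool(perms & SEND))  # True True
```

A permtrace-style breakdown amounts to recording which layer last touched each bit, which is why the command can report the "most fundamental reason" a permission is granted or denied.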

Installing development versions

If you'd like to test the latest versions of onami, you can do so by downloading from the git hosts instead of from PyPI.

From GitHub:

pip install -U "onami @ git+https://github.com/VincentRPS/onami@master"

Please note that the 2020 pip dependency resolver no longer exempts git package sources from reinstall prevention. This means that if you install the onami development version multiple times at the same version number, pip may simply discard the update.

If you run into such a problem, you can force onami to be reinstalled like this:

From GitHub:

pip install -U --force-reinstall "onami @ git+https://github.com/VincentRPS/onami@master"

You must have installed onami with one of the commands above before doing this, or you will likely end up with a broken installation.

Acknowledgements

The documentation and this README uses icons from the Material Design Icon library, which is licensed under the Apache License Version 2.0.
