
An adapter interface for Ai programs to be run with a RobotPy Command interface

Project description

Ai Interpreter Command

This project allows the use of Ai-based commands in RobotPy programs for FRC.

What is Ai?

Ai is a programming language that was designed to be extensible and, more importantly, interruptible. More accurately, Ai was designed as a sort of automation language, similar to bash or make. Its most important feature, however, is the ability to interface directly with its outside environment in the form of Callables and Props, which let the user define arbitrary interactions.
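The Ai API itself isn't shown on this page; as a rough sketch of the Callable/Prop idea (all names here are hypothetical, not the real interface), a host program registers named hooks that the embedded interpreter can reach: a Callable for an action the script may invoke, a Prop for a value the script may read or write.

```python
# Hypothetical sketch of the Callable/Prop concept -- NOT the actual Ai API.
# The host (Python) registers named hooks; an embedded interpreter would
# invoke a Callable to perform an action, or get/set a Prop for shared state.

class Host:
    def __init__(self):
        self._callables = {}   # name -> function the script may invoke
        self._props = {}       # name -> value the script may read or write

    def register_callable(self, name, fn):
        self._callables[name] = fn

    def register_prop(self, name, value):
        self._props[name] = value

    # What an interpreter would do when a script calls `name(args...)`:
    def call(self, name, *args):
        return self._callables[name](*args)

    # What an interpreter would do when a script reads or assigns a prop:
    def get_prop(self, name):
        return self._props[name]

    def set_prop(self, name, value):
        self._props[name] = value


host = Host()
host.register_callable("drive", lambda speed: f"driving at {speed}")
host.register_prop("target_speed", 0.5)

host.set_prop("target_speed", 0.75)
result = host.call("drive", host.get_prop("target_speed"))
```

The point of the pattern is that the script never touches robot code directly: everything it can do is whatever the host chose to expose by name.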

For more information, see the Ai Project. (This package is included in the overall Ai Command project, since it was technically the driving force behind the whole thing, and reorganizing repositories is a pain).

What Happened to the Old interpreter-command

TL;DR: if you liked the old version, don't upgrade past 2024 versions.

The original version of this package was intended as a way to allow for on-the-fly changes to robot behaviour without having to go through the multi-second deploy and startup time, which quickly adds up over the course of developing, for example, a non-trivial autonomous routine.

It was fine. It worked. It accomplished the goal. But it was clunky. It did nothing more than execute a series of commands line by line, with a very simple ability to repeat or skip single commands using a highly inflexible syntax. It was less a language and more a glorified state machine. And if you wanted to do anything in parallel? You had to build that natively, then register the whole parallel group under a single name, so the parallel structure itself could never be changed on the fly.

Honestly, the only thing it had going for it was the dispatch system, which allowed you to define multiple commands sharing the same initial name, then dispatch based on the second word, and so on. That was (in my humble opinion as the author) kind of neat, especially since it was mostly an unintended side effect, but it was a pain to set up.
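The word-by-word dispatch described above can be sketched as a nested lookup table (the names and shapes here are illustrative, not the old interpreter-command API): each registered command is a sequence of words, and lookup walks one word at a time until it hits a handler, so "arm raise" and "arm lower" share their first word.

```python
# Illustrative sketch of word-by-word dispatch -- hypothetical names, not
# the old interpreter-command API. Commands live in a nested dict keyed by
# successive words; lookup descends until it reaches a callable handler.

def register(table, words, handler):
    node = table
    for word in words[:-1]:
        node = node.setdefault(word, {})
    node[words[-1]] = handler

def dispatch(table, line):
    node = table
    words = line.split()
    for i, word in enumerate(words):
        node = node[word]
        if callable(node):                 # reached a handler
            return node(words[i + 1:])     # remaining words become arguments
    raise KeyError(f"incomplete command: {line!r}")

commands = {}
register(commands, ["arm", "raise"], lambda args: "raising arm")
register(commands, ["arm", "lower"], lambda args: "lowering arm")
register(commands, ["stop"], lambda args: "stopping")
```

The "unintended side effect" nature makes sense here: once lookup recurses on words, multi-word dispatch falls out for free.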

Ai is different. Ai is a real, (technically) Turing-complete language, complete with a compiler, an interpreter, and support for transferring a pre-compiled program between environments (say, between a host device and a connected robot) in order to distribute work more efficiently. It's built in Rust and embedded in Python, so it's more efficient and doesn't just freeload off of Python's type system, which makes it much safer to use (albeit slightly less expressive). It also has a complete logic system that allows skipping arbitrary code, instead of one single command. Most importantly, though, Ai has built-in support for parallel groups, which allows more logic to be written within Ai itself, from more elemental components.
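To make the parallel-group idea concrete, here is a toy command-based scheduler in plain Python (class names are hypothetical and stand in for neither the Ai nor the RobotPy API): each command is polled once per tick, and a parallel group ticks all of its members every cycle, finishing only when every member has finished.

```python
# Toy sketch of a parallel group in a command-based scheduler -- the class
# names are illustrative, not the Ai or RobotPy API. Commands are polled
# once per tick; a Parallel group ticks all members each cycle and is done
# only when every member is done.

class Count:
    """A toy command that finishes after `n` ticks."""
    def __init__(self, n):
        self.remaining = n
    def execute(self):
        self.remaining -= 1
    def is_finished(self):
        return self.remaining <= 0

class Parallel:
    """Runs several commands side by side, tick by tick."""
    def __init__(self, *commands):
        self.active = list(commands)
    def execute(self):
        still_running = []
        for c in self.active:
            c.execute()
            if not c.is_finished():
                still_running.append(c)
        self.active = still_running
    def is_finished(self):
        return not self.active

def run(command):
    """Tick a command until it reports finished; return the tick count."""
    ticks = 0
    while not command.is_finished():
        command.execute()
        ticks += 1
    return ticks
```

A `Parallel(Count(2), Count(5))` runs as long as its longest member, which is exactly the composability the old line-by-line interpreter lacked.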

From version 2025 onward, this project is backed by Ai and bears no similarity to the old project beyond a few shared names. The API is completely different and is not compatible with pre-2025 versions.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

interpreter_command-2025.1.6.tar.gz (6.0 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

interpreter_command-2025.1.6-py3-none-any.whl (7.0 kB view details)

Uploaded Python 3

File details

Details for the file interpreter_command-2025.1.6.tar.gz.

File metadata

  • Download URL: interpreter_command-2025.1.6.tar.gz
  • Upload date:
  • Size: 6.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.11 {"installer":{"name":"uv","version":"0.9.11"},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"macOS","version":null,"id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}

File hashes

Hashes for interpreter_command-2025.1.6.tar.gz
Algorithm Hash digest
SHA256 85670a6eb23ba454fc01e77242d8499a18fd3ea6b9859592d07bd85413e204ac
MD5 5d88f45297f6f4384572eff1e93e6a41
BLAKE2b-256 2257d2dd0f07cd9e55df8d3bf1b829b54bbc2743063b22f1d8c8f9f1860a63cb

See more details on using hashes here.

File details

Details for the file interpreter_command-2025.1.6-py3-none-any.whl.

File metadata

  • Download URL: interpreter_command-2025.1.6-py3-none-any.whl
  • Upload date:
  • Size: 7.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.11 {"installer":{"name":"uv","version":"0.9.11"},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"macOS","version":null,"id":null,"libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}

File hashes

Hashes for interpreter_command-2025.1.6-py3-none-any.whl
Algorithm Hash digest
SHA256 dd68f74108689b86e86733369606ccbbca2606b7eeb513f71b0b4aa2d619be3d
MD5 334f69df42b75b5feff0fbda5563892e
BLAKE2b-256 02303cd1605132df9522545c8013d4e6e33b3aba40f55a30a81ec08bed1618cc

See more details on using hashes here.
