
AutoTx

Discord | :star: the repo!

AutoTx is a personal assistant that generates on-chain transactions for you. These transactions are submitted to a smart account so users can easily approve & execute them.

Demo GIF of AutoTx

[!WARNING]
This project is still early and experimental. Exercise caution when using real funds.

How It Works

AutoTx employs a multi-agent orchestration architecture to compose functionality easily. Given a user prompt, AutoTx creates a new shared context amongst all agents in the form of an Autogen Group Chat. Individual agents contribute their unique expert opinions to the shared conversation, and agent tools are selected and run to progressively solve the goal(s) defined in the user's original prompt.

Agent tools can add transactions to a batch, which will later be proposed to the user's smart account for final approval before being executed on-chain. Currently AutoTx supports Safe smart accounts. AutoTx uses a locally-stored private key to submit transactions to the user's smart account.
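
To make the batching flow concrete, here is a minimal, hypothetical sketch of the idea: a tool prepares a transaction, appends it to a shared batch, and the batch is proposed to the user's Safe for final approval. None of the names below (PreparedTx, TxBatch, propose_to_safe) are AutoTx's actual API; they only illustrate the flow described above.

from dataclasses import dataclass, field

# Illustrative sketch only; AutoTx's real classes and method names differ.

@dataclass
class PreparedTx:
    to: str        # recipient or target contract address
    value: int     # ETH value in wei
    data: str      # calldata ("0x" for a plain transfer)
    summary: str   # human-readable description shown to the user

@dataclass
class TxBatch:
    transactions: list = field(default_factory=list)

    def add(self, tx: PreparedTx) -> None:
        # Agent tools append prepared transactions here instead of sending them directly.
        self.transactions.append(tx)

    def propose_to_safe(self, safe_address: str) -> None:
        # In AutoTx, this step corresponds to proposing the batch to the user's
        # Safe smart account, where the user gives final approval before execution.
        print(f"Proposing {len(self.transactions)} transaction(s) to Safe {safe_address}:")
        for tx in self.transactions:
            print(f"  - {tx.summary}")

# A "send tokens" style tool might do something like this:
batch = TxBatch()
batch.add(PreparedTx(to="0xRecipient...", value=10**18, data="0x", summary="Send 1 ETH to vitalik.eth"))
batch.propose_to_safe("0xYourSafe...")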

Agents

Below is a list of existing and anticipated agents that AutoTx can use. If you'd like to help build one of these agents, see the How To Contribute section below.

| Agent | Description | Status |
| --- | --- | --- |
| Send Tokens | Send tokens (ERC20 & ETH) to a receiving address. | :rocket: |
| Swap Tokens | Swap from one token to another. Currently integrated with Li.Fi. | :rocket: |
| Token Research | Research tokens, liquidity, prices, graphs, etc. | :rocket: |
| Earn Yield | Stake assets to earn yield. | :memo: draft |
| Bridge Tokens | Bridge tokens from one chain to another. | :memo: draft |
| Social Search | Research accounts, posts, and sentiment across social networks (ex: Twitter, Farcaster). | :memo: draft |
| Web3 Domains | Purchase and manage domains (ex: ENS). | :memo: draft |
| NFTs | Basic NFT integration: mint, transfer, set approval, etc. | :memo: draft |
| NFT Market | NFT marketplace functionality: list, bid, etc. | :thought_balloon: |
| LP | Provide liquidity to AMMs. | :thought_balloon: |
| Governance | Vote or delegate in DAOs. | :thought_balloon: |
| Predict | Generate future predictions based on research. | :thought_balloon: |
| Donate | Donate to public goods projects. | :thought_balloon: |
| Invest | Participate in LBPs, IDOs, etc. | :thought_balloon: |

Getting Started

Prerequisites

Please install the following:

  • Python
  • Poetry
  • Docker (needed to run the local devnet)

Installation

  1. Clone the repository via git clone https://github.com/polywrap/AutoTx and cd AutoTx into the directory.
  2. Create a new .env file via cp .env.example .env
  3. Find the line that says OPENAI_API_KEY=, and add your OpenAI API key, for example: OPENAI_API_KEY=sk-... (a complete sample .env appears after this list).
  4. (Optional) If you have an Infura/Alchemy API key, find the line that says CHAIN_RPC_URL=, and update it, for example: CHAIN_RPC_URL=https://mainnet.infura.io/v3/YOUR_INFURA_KEY (see https://www.infura.io/ or https://alchemy.com to get your own API key).
  5. (Optional) If you have a Coingecko API key, find the line that says COINGECKO_API_KEY=, and add it, for example: COINGECKO_API_KEY=CG-... (see the Coingecko API documentation). Note: Without the Coingecko API key, the Token Research Agent will not be added to the agent execution loop.
  6. Start a new Poetry shell: poetry shell
  7. Install Python dependencies: poetry install
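
For reference, after steps 3-5 your .env might contain entries like the following (all values shown are placeholders):

OPENAI_API_KEY=sk-...
CHAIN_RPC_URL=https://mainnet.infura.io/v3/YOUR_INFURA_KEY
COINGECKO_API_KEY=CG-...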

Using AutoTx

  1. Run poetry run start-devnet if you want to test locally (see Test Locally below for more information).
  2. Run poetry run ask and AutoTx will ask you for a prompt to start solving for (ex: Send 1 ETH to vitalik.eth). Prompts can also be passed as an argument (ex: poetry run ask "...").

Additional run options (see the example below):

  • -v, --verbose Enable verbose logging.
  • -n, --non-interactive Disable all requests for user input, as well as the clarifier agent.
  • -l, --logs DIRECTORY Path to the directory where logs will be stored.
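
For example, a hypothetical non-interactive run with verbose logging and a custom log directory (the ./logs path and the prompt are just placeholders) might look like:

poetry run ask -v -n -l ./logs "Buy 100 USDC with ETH and send it to vitalik.eth"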

Test Locally

Run poetry run start-devnet to create a local fork of the network set by the CHAIN_RPC_URL env variable. This step requires Docker to be running in the background. The devnet includes a new smart account, as well as a development address with test ETH for transaction execution. Running poetry run stop-devnet will shut down the local fork.
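
In practice, the local testing loop uses only the commands described above:

# start a local fork of the network set by CHAIN_RPC_URL (Docker must be running)
poetry run start-devnet

# run prompts against the fork
poetry run ask "Send 1 ETH to vitalik.eth"

# shut the fork down when finished
poetry run stop-devnet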

Connect a Smart Account

AutoTx can be connected to your existing smart account by doing the following:

  1. Set SMART_ACCOUNT_ADDRESS to the address of your smart account in your .env. This tells AutoTx which account it should interact with (see the example after this list).
  2. AutoTx's agent address, which it generates locally, must be set as a signer in your Safe's configuration to allow it to create transactions on behalf of the smart account. To get this address, run poetry run agent address.
  3. Update the CHAIN_RPC_URL value in your .env with the correct RPC URL of the network where your smart account is deployed.
  4. Run AutoTx as you would normally.
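
Putting the steps together, a hypothetical setup session might look like this (the addresses and RPC URL are placeholders):

# 1. Print the locally generated agent address, then add it as a signer in your Safe
poetry run agent address

# 2. Point AutoTx at your Safe and its network in .env
#    SMART_ACCOUNT_ADDRESS=0xYourSafeAddress
#    CHAIN_RPC_URL=https://mainnet.infura.io/v3/YOUR_INFURA_KEY

# 3. Run AutoTx as usual
poetry run ask "Send 1 ETH to vitalik.eth"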

Prompts

AutoTx currently supports prompts such as:

| Category | Prompt |
| --- | --- |
| Token Research | Research the top AI coins by trading volume. |
| Token Research | Conduct a thorough analysis of Worldcoin, including whether to hold or sell. |
| Token Research | Find leveraged tokens I can buy directly on Ethereum mainnet. |
| Send Tokens | Send tokens 1 ETH and 1000 USDC to vitalik.eth |
| Swap Tokens | Buy 100 USDC with ETH |
| Multi Task | Identify the top AI coins by trading volume on Ethereum mainnet. Buy 1 ETH of the top 2. |
| Multi Task | Swap ETH to 0.05 WBTC, then swap WBTC to 1000 USDC, and finally send 50 USDC to vitalik.eth |
| Multi Task, Airdrop | Buy 10 WLD with ETH, then send the WLD in equal amounts to each of these addresses: vitalik.eth, abc.eth, and maxi.eth |
| Multi Task, Airdrop | Buy 1 ETH of the highest mcap meme coin on Ethereum mainnet, then airdrop it in equal parts to: vitalik.eth, abc.eth, and maxi.eth |
| Multi Task, Strategy | I want to use 3 ETH to purchase 10 of the best projects in: GameFi, NFTs, ZK, AI, and MEMEs. Please research the top projects, come up with a strategy, and purchase the tokens that look most promising. All of this should be on ETH mainnet. |

Future possibilities:

  • Purchase mainnet ETH with my USDC on optimism
  • What proposals are being voted on right now?
  • Donate $100 to environmental impact projects
  • ...

Use AutoTx With Open-Source Models

To run AutoTx with your favorite open-source model, you can use any provider that exposes an OpenAI-compatible API. One of the easiest ways to do this is using together.ai and following these steps (a sample .env appears after the list):

  1. Make a together.ai account.
  2. Set OPENAI_API_KEY in the .env file to your together.ai account's API key (found here)
  3. Set OPENAI_BASE_URL to point to https://api.together.xyz/v1
  4. Set OPENAI_MODEL_NAME to one of these recommended JSON-enabled models: mistralai/Mixtral-8x7B-Instruct-v0.1, mistralai/Mistral-7B-Instruct-v0.1
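
Put together, the relevant .env entries would look something like this (the API key is a placeholder):

OPENAI_API_KEY=your-together-ai-api-key
OPENAI_BASE_URL=https://api.together.xyz/v1
OPENAI_MODEL_NAME=mistralai/Mixtral-8x7B-Instruct-v0.1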

Now simply run AutoTx as you normally would. For more tips on choosing the best model, you can follow this guide. NOTE: Non-interactive mode is recommended when using less powerful models (such as open-source models) to avoid hallucinations.

How To Contribute

Interested in contributing to AutoTx? Here are some ideas:

  • Contribute prompt ideas above
  • Build an agent
  • Discuss AutoTx's future in issues

Connect with us on Discord if you have any questions or ideas to share.

Building Agents

To add agents to AutoTx, we recommend starting with the ExampleAgent.py starter template (a rough sketch of an agent's shape follows this list). From there you'll want to:

  1. Define the agent's name and system_message.
  2. Implement the tools (functions) you want the agent to be able to call.
  3. Add all tools to the agent's tools=[...] array.
  4. Add your new agent to AutoTx's constructor in cli.py.
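
As a rough sketch of the shape these steps produce (the real base classes, tool decorators, and constructor signature live in ExampleAgent.py and cli.py; none of the names below are AutoTx's actual API):

from dataclasses import dataclass, field
from typing import Callable, List

# Hypothetical illustration only; see ExampleAgent.py for the real template.

@dataclass
class Agent:
    name: str                              # step 1: the agent's name...
    system_message: str                    # ...and its system prompt
    tools: List[Callable] = field(default_factory=list)  # step 3: tools the agent may call

def get_eth_balance(address: str) -> str:
    """Step 2: a tool (function) the agent can call; a real tool would query the chain."""
    return f"Balance of {address}: 1.0 ETH (stubbed)"

example_agent = Agent(
    name="example",
    system_message="You look up on-chain balances for the user.",
    tools=[get_eth_balance],               # step 3: register every tool here
)

# Step 4: in AutoTx you would pass the new agent to AutoTx's constructor in cli.py.
print(example_agent.name, [tool.__name__ for tool in example_agent.tools])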

Testing

Tests are located in the ./autotx/tests directory.

Use the following commands to run your tests:

# run all tests
poetry run pytest -s

# run a specific file
poetry run pytest -s ./autotx/tests/file_name.py

# run a specific test
poetry run pytest -s ./autotx/tests/file_name.py::function_name

Additionally, you can run benchmarks to measure consistency:

# run tests in a directory with 5 iterations each
python benchmarks.py ./autotx/tests/dir_name 5

# run tests in a file with 5 iterations each
python benchmarks.py ./autotx/tests/file_name.py 5

# run a specific test with 5 iterations
python benchmarks.py ./autotx/tests/file_name.py::function_name 5

# run a specific test with 5 iterations and name the output folder (instead of the default timestamp)
python benchmarks.py ./autotx/tests/file_name.py::function_name 5 output_folder_name

API Server

To view the API server documentation, please see the API.md file.

Need Help?

Join our Discord community for support and discussions.

If you have questions or encounter issues, please don't hesitate to create a new issue to get support.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

autotx-0.1.3.tar.gz (680.1 kB)

Uploaded Source

Built Distribution

autotx-0.1.3-py3-none-any.whl (710.3 kB)

Uploaded Python 3

File details

Details for the file autotx-0.1.3.tar.gz.

File metadata

  • Download URL: autotx-0.1.3.tar.gz
  • Upload date:
  • Size: 680.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.6 Linux/6.5.0-35-generic

File hashes

Hashes for autotx-0.1.3.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 17ca7279cecdf1ef0d312bf814586c853f85f47ac840f8319d2633c69bb21b66 |
| MD5 | c8f97a57f0021e498bb195095a6a7216 |
| BLAKE2b-256 | 75e31e54794b2cd20dfae073b2985e3d350bc68a0c0ff0b5a46c70bcddecba1f |

See more details on using hashes here.

File details

Details for the file autotx-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: autotx-0.1.3-py3-none-any.whl
  • Upload date:
  • Size: 710.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.5.1 CPython/3.10.6 Linux/6.5.0-35-generic

File hashes

Hashes for autotx-0.1.3-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | a0f872db21834b3277dd6f34f21119c4dd2674d991bacc37799451c159ed1bea |
| MD5 | e44cd0a600f3dc57d091e573bfa1ad5c |
| BLAKE2b-256 | 6937ba19e771d950065cbe8c5363d188264d130c611002ef1aecfad89302c828 |

See more details on using hashes here.
