
Project description

Akaibu

Akaibu reads arXiv RSS feeds and finds papers that match your requirements using LLMs.

Setup

Install

uv tool install akaibu
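
Akaibu is published on PyPI, so if you don't use uv, a plain pip install should work just as well:

pip install akaibu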

Set your LLM provider

If you have ollama running on your machine, just run:

akaibu set-endpoint http://localhost:11434/v1 ollama

The first argument is the URL of the LLM API endpoint; the second is the API key. A local ollama server doesn't require a key, so the second argument can be any string.
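
If you use a hosted provider instead, point the same command at its base URL and pass your real key. As a sketch, assuming your provider exposes an OpenAI-compatible API (the key below is a placeholder):

akaibu set-endpoint https://api.openai.com/v1 sk-your-api-key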

Create a feed

A feed is composed of three things:

  • Name: Can be any string. This is just to identify the feed.
  • URL: The URL of the arXiv RSS feed you want to subscribe to. See arXiv's RSS documentation for how these URLs are built. If you work on NLP and machine learning, the feed in the example below should cover most of the relevant papers.
  • Requirements: Free-form text describing what kind of papers you want to see. The example below is very simple, but requirements can be more elaborate, with multiple descriptions of the papers you want (see the second example further down).
akaibu create-feed "awesome-llms" "https://rss.arxiv.org/rss/cs.LG+cs.CL" "I want any papers about LLMs."

Requirements can also be more complex:

akaibu create-feed "llm-agent-memory" "https://rss.arxiv.org/rss/cs.LG+cs.CL" "Papers that explore the use of LLMs as agents. Described systems or frameworks should also incorporate a memory component so that agents can retain long-term memory."
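
Feed URLs follow arXiv's scheme of category codes joined by +, so subscribing to another area is just a matter of swapping the codes. For instance, a hypothetical computer-vision feed might look like:

akaibu create-feed "vlm-papers" "https://rss.arxiv.org/rss/cs.CV" "Papers about vision-language models."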

Use it

Once everything is set up, run the following command to fetch papers:

akaibu <number-of-papers-to-consider> <model-name> -l <library-name>

For example, a run matching the feed created earlier would be:

akaibu 25 qwen3:4b -l awesome-llms

This will download the 25 most recent papers from the arXiv RSS feed, keep the ones that match your requirements, and present each match with a short summary.

Akaibu records which papers have already been scanned by the LLM, so the same paper won't be recommended twice.
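
If you run against a local ollama server, make sure the model you pass is available locally first; assuming you use the same model name as the example above:

ollama pull qwen3:4b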

Read past papers

All previously found relevant papers are saved in a database (locally on your machine, of course). You can browse them by running:

akaibu show-past-papers <library-name>

You can also print them as a Markdown file, which makes them easier to keep in your notes or share with colleagues:

akaibu show-past-papers <library-name> --in-markdown
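
Assuming the Markdown is printed to stdout, you can redirect it straight into a file; the feed name below is just the one from the earlier example:

akaibu show-past-papers awesome-llms --in-markdown > awesome-llms.md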

Road map

It takes time to consume many papers

Currently, every paper is checked by an LLM. If you are digesting many papers at once, this can take time. I am planning to pre-filter with a lighter model before the LLM check.

Download files

Download the file for your platform.

Source Distribution

akaibu-0.4.0.tar.gz (168.6 kB)


Built Distribution


akaibu-0.4.0-py3-none-any.whl (8.7 kB)


File details

Details for the file akaibu-0.4.0.tar.gz.

File metadata

  • Download URL: akaibu-0.4.0.tar.gz
  • Size: 168.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.7.18

File hashes

Hashes for akaibu-0.4.0.tar.gz

  • SHA256: 54db4515d007fe58ea9c93585c8ee6280b9459bcde7b36e54c64191069504028
  • MD5: 4984fe3a5764a92877f7d8b483227562
  • BLAKE2b-256: 26653da16740a56b9393402fd3ec8a6b332036562cb70285a4a6ddc05ae190d1
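
To verify a downloaded archive against the published digest, you can use a standard checksum tool and compare its output with the SHA256 value above (sha256sum ships with GNU coreutils; on macOS, use shasum -a 256 instead):

sha256sum akaibu-0.4.0.tar.gz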


File details

Details for the file akaibu-0.4.0-py3-none-any.whl.

File metadata

  • Download URL: akaibu-0.4.0-py3-none-any.whl
  • Size: 8.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.7.18

File hashes

Hashes for akaibu-0.4.0-py3-none-any.whl

  • SHA256: 3d69330788e778ed9731f6154882dd581dabf92dffed1a4b841040a32590a1a5
  • MD5: 896ebf007cbbef021b57eae16635629d
  • BLAKE2b-256: 895bb317c5a29d3536bc18c4feb5f795e82d9af1a06222121c08c8e14d032f03

