

BilbyStats

A collection of statistical and machine learning functions for use in the Bilby pipeline.

Clone the package

To clone the repository, run

git clone --depth=1 https://github.com/bilbyai/bilbystats/

Set up API keys

API keys must be set in the environment in order to call LLMs. The simplest way to do this is to add lines of the following form to your .bashrc or .zshrc file:

export OPENROUTER_API_KEY=exampleapikey
export OPENAI_API_KEY=exampleapikey
export CLAUDE_API_KEY=exampleapikey

Replace exampleapikey with the corresponding API key in each case.
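To confirm that a new shell session actually picked up the keys, a quick sanity check can help. This is a minimal sketch, not part of bilbystats; the key names are taken from the export lines above:

```python
import os

# The keys the export lines above ask you to set.
REQUIRED_KEYS = ["OPENROUTER_API_KEY", "OPENAI_API_KEY", "CLAUDE_API_KEY"]

def missing_keys(env=os.environ):
    """Return the names of any expected API keys that are unset or empty."""
    return [key for key in REQUIRED_KEYS if not env.get(key)]

# With only OPENAI_API_KEY set, the other two keys are reported missing:
print(missing_keys({"OPENAI_API_KEY": "exampleapikey"}))
```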

Install the package

Optional: Create a conda environment

To create a conda environment for this repo, just run

conda create --name bilbystats python=3.13
conda activate bilbystats

Installation using uv

Navigate to the root directory of the package and run

uv pip install .

Note that requirements-pip.txt is needed to handle dependencies that uv struggles to install; install them with

uv pip install -r requirements-pip.txt

Alternative: Installation using just pip

Alternatively, to install the package with pip alone, navigate to the root directory of the package and run

pip install .
pip install -r requirements-pip.txt

Importing the package

The package can then be imported from within Python, e.g.

import bilbystats as bs

Setting local defaults

To set local defaults (e.g. for checkpoint saving), copy the example defaults file:

cp bilbystats/defaults/local_defaults_example.env bilbystats/defaults/local_defaults.env

then edit the default parameters in the copy and reinstall the package.

Run local LLMs using Ollama

If you'd like to use the Ollama functions, which allow you to call LLMs on your local machine, you'll need to install Ollama. Once Ollama is installed, you can download and run LLMs such as llama3.2

ollama run llama3.2

or deepseek-r1:7b

ollama run deepseek-r1:7b

See https://ollama.com/search for a full list of the available models.

To call an LLM programmatically using bilbystats, run

bs.llm_api('test call', 'you are an llm', 'llama3.2')

or

bs.llm_api('test call', 'you are an llm', 'deepseek-r1:7b')

More generally, you can pass the model name to any LLM-related function, such as bs.translate.
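Judging from the examples above, bs.llm_api takes the prompt, the system prompt, and the model name in that order (an inference from this README, not a documented signature). If you keep several local models pulled, a small wrapper can fall back from one to the next; this sketch uses a stub in place of bs.llm_api so it stands alone:

```python
def call_with_fallback(prompt, system_prompt, models, llm_api):
    """Try each model name in order, returning the first successful reply."""
    last_error = None
    for model in models:
        try:
            return llm_api(prompt, system_prompt, model)
        except Exception as err:  # e.g. a model that hasn't been pulled yet
            last_error = err
    raise RuntimeError(f"all models failed; last error: {last_error}")

# Stub standing in for bs.llm_api: the first model "fails", so the
# helper falls through to the second.
def fake_llm_api(prompt, system_prompt, model):
    if model == "llama3.2":
        raise ConnectionError("model not pulled")
    return f"{model}: ok"

print(call_with_fallback("test call", "you are an llm",
                         ["llama3.2", "deepseek-r1:7b"], fake_llm_api))
```

In real use you would pass bs.llm_api as the llm_api argument.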

Download files

Source Distributions

No source distribution files are available for this release.

Built Distribution


bilbystats-0.1.7-py3-none-any.whl (6.7 MB)

File details

Details for the file bilbystats-0.1.7-py3-none-any.whl.

File metadata

  • Download URL: bilbystats-0.1.7-py3-none-any.whl
  • Upload date:
  • Size: 6.7 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.18

File hashes

Hashes for bilbystats-0.1.7-py3-none-any.whl:

  • SHA256: a2454ee51fc5811f221c40302ffd1f0a7d1b1ca9203b6563b4e5a75a7b3b48a6
  • MD5: b9d935b91e2808f66ce49d2c0bbf0878
  • BLAKE2b-256: 88e5917a3b5ba98508e6a0a5a1d5d77129c458709fdd7e702c40405fd1349dc7

