
nbwrite

Note: This is an experimental use case for LLMs; the output may at times be unhelpful or inappropriate.

nbwrite is a CLI tool that generates notebook-based Python examples using LLMs.

Potential use cases include:

  1. You are writing a Python package and you want to produce executable tutorials for your stakeholders
  2. You are using a Python package and you want to generate a kick-start guide
  3. You want to generate regression tests for a Python package

Features

  • Converts a set of steps and a task description into an executable Python notebook
  • Configurable OpenAI API parameters
  • Generate notebooks based on your own code using retrieval augmented generation

Getting Started

1. Install via any Python package manager

pip install nbwrite

2. Set up your OpenAI API access

You will need to create an account and potentially buy credits via https://platform.openai.com/

export OPENAI_API_KEY='sk-xxxx'
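nbwrite reads the key from the environment. As a quick sanity check, a hypothetical helper (not part of nbwrite) can confirm from Python that the variable is visible and has the expected sk- prefix:

```python
import os

def openai_key_present() -> bool:
    """Return True if OPENAI_API_KEY is set and looks like an sk-... key."""
    key = os.environ.get("OPENAI_API_KEY", "")
    return key.startswith("sk-")
```

If this returns False, the export above has not reached the process (e.g. it was set in a different shell session).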

3. Create a spec file for your generation job

e.g. nbwrite/example1.yaml:

task: |
  Create a hello world notebook 'x.ipynb', use nbmake's NotebookRun class to test it from a Python application
steps:
  - Create a hello world notebook using nbformat
  - Use nbmake's NotebookRun class to execute it from a Python application
  - Check the output notebook printed what we were expecting
packages:
  - nbmake
generation:
  count: 2
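After parsing (e.g. with PyYAML's yaml.safe_load), the spec above is just a nested dict. As a sketch, a hypothetical pre-flight validator (not part of nbwrite) for the fields shown might look like:

```python
# The parsed form of example1.yaml is a plain nested dict like this:
spec = {
    "task": "Create a hello world notebook 'x.ipynb', use nbmake's "
            "NotebookRun class to test it from a Python application",
    "steps": [
        "Create a hello world notebook using nbformat",
        "Use nbmake's NotebookRun class to execute it from a Python application",
        "Check the output notebook printed what we were expecting",
    ],
    "packages": ["nbmake"],
    "generation": {"count": 2},
}

def validate_spec(spec: dict) -> list[str]:
    """Hypothetical check: return a list of problems; empty means usable."""
    problems = []
    for field in ("task", "steps"):
        if not spec.get(field):
            problems.append(f"missing required field: {field}")
    count = spec.get("generation", {}).get("count", 1)
    if not isinstance(count, int) or count < 1:
        problems.append("generation.count must be a positive integer")
    return problems

print(validate_spec(spec))  # empty list: the example spec is well-formed
```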

4. Generate some notebooks

nbwrite ./nbwrite/example1.yaml

Your outputs will be written to your current working directory.

Guides

Generate guides for my closed-source code

By default, OpenAI's models can generate docs from parametric knowledge alone, which effectively limits them to popular open-source libraries.

The packages input in the spec file can be used to reference Python packages in your current environment, which will be indexed in a local vector DB. Code relevant to the task is then stuffed into the prompt.

You can pass in an arbitrary number of packages; just remember that the code will be sent to OpenAI to create embeddings, and this costs money.

example:

packages:
  - my_internal_pkg
  - another.internal.pkg
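How nbwrite gathers the code to index is internal to the tool, but as an illustration, here is a stdlib-only sketch of locating an installed package's source files, roughly the material a local indexer would embed (the helper name is hypothetical):

```python
import importlib.util
from pathlib import Path

def package_source_files(package: str) -> list[Path]:
    """Locate an installed package and list its .py files, i.e. roughly
    the material a RAG indexer would send off for embedding."""
    spec = importlib.util.find_spec(package)
    if spec is None or spec.origin is None:
        return []  # not installed, or a namespace package with no single origin
    origin = Path(spec.origin)
    if origin.name == "__init__.py":
        # Regular package: walk the whole directory tree
        return sorted(origin.parent.rglob("*.py"))
    return [origin]  # single-module distribution
```

For dotted names like another.internal.pkg, find_spec resolves submodules as long as the parent package is importable.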

Customise the OpenAI parameters

You can modify both the system prompt and the LLM arguments to try out different OpenAI models, temperatures, etc. See Langchain's API reference.

Note: this is a contrived example task; change it to something relevant to your work.

task: | 
  Create a hello world notebook 'x.ipynb', use nbmake's NotebookRun class to test it from a Python application
steps:
  - Create a hello world notebook using nbformat
  - Use nbmake's NotebookRun class to execute it from a Python application
  - Check the output notebook printed what we were expecting
packages:
  - nbmake
  - nbformat
  - nbclient
generation:
  count: 2 # number of notebooks to generate
  # system_prompt:
  llm_kwargs:
    # https://api.python.langchain.com/en/latest/llms/langchain.llms.openai.BaseOpenAI.html#langchain.llms.openai.BaseOpenAI
    model_name: gpt-3.5-turbo # The API name of the model as per https://platform.openai.com/docs/models
    temperature: 0.5
  retriever_kwargs:
    k: 3
    search_type: similarity
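The retriever_kwargs control retrieval from the local vector DB: k is how many indexed code chunks are fetched, and search_type is the ranking strategy. A toy illustration of similarity search, with made-up 3-dimensional embeddings (real ones come from OpenAI's embedding API):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query: list[float], docs: dict[str, list[float]], k: int = 3) -> list[str]:
    """Return the names of the k documents most similar to the query."""
    ranked = sorted(docs, key=lambda name: cosine(query, docs[name]), reverse=True)
    return ranked[:k]

# Made-up embeddings for three hypothetical source files
docs = {
    "notebook_runner.py": [0.9, 0.1, 0.0],
    "plotting.py":        [0.1, 0.9, 0.0],
    "config.py":          [0.2, 0.2, 0.9],
}
print(top_k([1.0, 0.0, 0.1], docs, k=2))
```

With k: 3 in the spec, the three highest-scoring chunks would be stuffed into the prompt.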

FAQs and Troubleshooting

How much does this cost?

It depends on (a) the model you use and other parameters such as context length, and (b) the number of outputs you generate.
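As a back-of-the-envelope estimate (the per-million-token prices below are placeholders, not real rates; check OpenAI's pricing page for current figures):

```python
def estimate_cost_usd(prompt_tokens: int, completion_tokens: int, count: int,
                      prompt_price: float = 0.50,
                      completion_price: float = 1.50) -> float:
    """Rough cost of a generation job. Prices are USD per million tokens
    (placeholder values; check OpenAI's pricing page)."""
    per_run = (prompt_tokens * prompt_price
               + completion_tokens * completion_price) / 1e6
    return per_run * count

# e.g. a 4,000-token stuffed prompt and a 1,000-token notebook, generated twice
print(f"${estimate_cost_usd(4000, 1000, count=2):.4f}")
```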

See your OpenAI usage at https://platform.openai.com/account/usage

Debugging with Phoenix

This is an alpha-stage product, and we encourage you to investigate and report bugs.

For any errors occurring during the main generation process, it's possible to view traces using Phoenix.

  1. Start Phoenix with this script

    #! /usr/bin/env python

    # Launch the Phoenix trace-collection UI and keep the process alive
    import phoenix

    phoenix.launch_app()
    input("Press Enter to exit...")
    
  2. In another terminal, run nbwrite with the following variable set: export NBWRITE_PHOENIX_TRACE=1

  3. Check the Phoenix traces in the dashboard (default http://127.0.0.1:6006/)

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nbwrite-0.1.tar.gz (9.6 kB)

Uploaded Source

Built Distribution

nbwrite-0.1-py3-none-any.whl (11.1 kB)

Uploaded Python 3

File details

Details for the file nbwrite-0.1.tar.gz.

File metadata

  • Download URL: nbwrite-0.1.tar.gz
  • Upload date:
  • Size: 9.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.11.6

File hashes

Hashes for nbwrite-0.1.tar.gz

  • SHA256: 1af3c01f65f3065ac746dbaaed027eee71b7837daf3ad26078f88a729dcfa6f6
  • MD5: 7ea564dcb6e333288d3922666fba6ccd
  • BLAKE2b-256: 6a2aaa60b2898c9c9e52f62a79fdc71a12fc4a46cfe3dd4d828ee7fab9492bf8

See more details on using hashes here.

File details

Details for the file nbwrite-0.1-py3-none-any.whl.

File metadata

  • Download URL: nbwrite-0.1-py3-none-any.whl
  • Upload date:
  • Size: 11.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.11.6

File hashes

Hashes for nbwrite-0.1-py3-none-any.whl

  • SHA256: d1c2fef19aa40831419fa3bb130118eb9d87eed78c55b5b9860229fdbecd5ef8
  • MD5: c9d8c633ed41172db72a8ff0cf8cb715
  • BLAKE2b-256: 1f7d758fd01d26032973b5168de1c12d4e67a5b630c687f4335b259ea29d6c4a

See more details on using hashes here.
