
AI-powered Jupyter Notebook

Thread is a Jupyter alternative that integrates an AI copilot into your Jupyter Notebook editing experience.

Best of all, Thread runs locally and can be used for free with Ollama or your own API key. To install it:

pip install thread-dev

Then, to launch thread-dev, run:

thread
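
If you want to keep Thread isolated from your system-wide Python packages, you can install it into a virtual environment first. This is standard Python tooling, not something Thread requires; a minimal sketch:

python -m venv .venv            # create an isolated environment
source .venv/bin/activate       # on Windows: .venv\Scripts\activate
pip install thread-dev
thread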

Key features

1. Familiar Jupyter Notebook editing experience

2. Natural language code edits

3. Generate cells to answer natural language questions

4. Ask questions in a context aware chat sidebar

5. Automatically explain or fix errors

Demo

https://github.com/squaredtechnologies/thread/assets/18422723/b0ef0d7d-bae5-48ad-b293-217b940385fb

Feature Roadmap

These are some of the features we hope to launch in the next few months. If you have any suggestions or would like to see a feature added, please don't hesitate to open an issue or reach out to us via email or Discord.

  • Add Copilot-style inline code suggestions
  • Data warehouse + SQL support
  • No code data exploration
  • UI based chart creation
  • Ability to collaborate on notebooks
  • Publish notebooks as shareable webapps
  • Add support for Jupyter Widgets
  • Add file preview for all file types

Thread.dev Cloud

Eventually we hope to integrate Thread into a cloud platform that supports collaboration as well as hosting notebooks as web applications. If this sounds interesting to you, we are looking for enterprise design partners to work with and customize the solution for. If you're interested, please reach out to us via email or join our waitlist.

Development instructions

To run the repo in development mode, you need two terminal commands: one runs the Jupyter Server, and the other runs the Next.js front end.

To begin, run:

yarn install

Then in one terminal, run:

sh ./run_dev.sh

And in another, run:

yarn dev

Navigate to localhost:3000/thread and you should see your local version of Thread running.

If you would like to develop with the AI features, navigate to the proxy folder and run:

yarn install

Then:

yarn dev --port 5001
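
Putting the pieces together, a typical development session (assuming you have already run yarn install at the repo root and the AI proxy lives in the proxy folder, as described above) looks roughly like this:

# terminal 1: Jupyter Server
sh ./run_dev.sh

# terminal 2: Next.js front end, served at localhost:3000/thread
yarn dev

# terminal 3 (only if you need the AI features): the proxy
cd proxy && yarn install && yarn dev --port 5001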

Using Thread with Ollama

You can use Ollama for a fully offline AI experience. To begin, install and run Thread using the commands above.
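
Thread talks to whatever Ollama is serving locally, so Ollama itself needs to be installed with at least one model available. A minimal sketch, using llama3 as an example model (any model you have pulled will work):

ollama pull llama3    # download an example model
ollama serve          # serves the API on http://localhost:11434 by default

If you run the Ollama desktop app, the server is usually already running and ollama serve is not needed.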

Once Thread is running, select the Settings icon in the bottom left and open Model Settings. In the panel that appears, navigate to Ollama and enter your model details.

Use Ctrl / Cmd + K and try running a query to see how it looks!

Why we built Thread

We initially got the idea while building Vizly, a tool that lets non-technical users ask questions about their data. While Vizly is powerful at performing data transformations, as engineers we often felt that natural language didn't give us enough freedom to edit the generated code or to explore the data further ourselves. That is what inspired us to start Thread.
