
Build your customized skill library

Project description

◓ Open Creator


Build your customized skill library
An open-source LLM tool that helps you create your own tools


open-creator is an innovative package designed to extract skills from existing conversations or from a requirement, save them, and retrieve them when needed. It offers a seamless way to consolidate and archive refined versions of your code, turning it into readily usable skill sets and thereby extending the power of open-interpreter.
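
At a glance, the typical workflow is create → save → search → run. A minimal preview of the Usage section below (the request string and search query are placeholders):

import creator

# Create a skill from a natural-language request and save it to the local library.
skill = creator.create(request="extract a page range from a PDF and save it as a new PDF")
creator.save(skill)

# Later, retrieve the skill by semantic search and run it with concrete arguments.
skill = creator.search("pdf extract section")[0]
resp = skill.run({"pdf_path": "creator.pdf", "start_page": 3, "end_page": 8, "output_path": "creator3-8.pdf"})
print(resp)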

Features

  • Skill Library: Efficiently save and retrieve structured function calls.
  • Reflection Agent: Automatically structures and categorizes your function calls.
  • LLM Cache: Chat LLM runs are cached in SQLite (stored in ~/.cache/open_creator/llm_cache/.langchain.db), saving time and money by reusing previous runs (see the sketch after this list).
  • Streaming: Stream your function calls.
  • Community Hub: Share and use skills from the wider community. huggingface_hub is supported; langchain_hub is not yet.
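
Since the cache is just a SQLite file at the path above, you can inspect or clear it with the standard library. A minimal sketch (only the path comes from the docs; the rest is plain Python):

from pathlib import Path

cache_db = Path.home() / ".cache/open_creator/llm_cache/.langchain.db"

if cache_db.exists():
    print(f"LLM cache: {cache_db} ({cache_db.stat().st_size / 1024:.1f} kB)")
    # Deleting the file clears all cached runs; it is recreated on the next LLM call.
    # cache_db.unlink()
else:
    print("No LLM cache yet.")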

Installation

pip install -U open-creator

Usage

import creator

1. Create a Skill

  • 1.1 from a request
  • 1.2 from a conversation history (openai messages format)
  • 1.3 from a skill json file
  • 1.4 from a messages_json_path
  • 1.5 from code file content
  • 1.6 from doc file content
  • 1.7 from file path
  • 1.8 from huggingface

1.1 Create a skill from a request

request = "help me write a script that can extracts a specified section from a PDF file and saves it as a new PDF"
skill = creator.create(request=request)
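
The list above also mentions 1.2–1.4 (a conversation history, a skill JSON file, and a messages JSON path). A hedged sketch of those variants, assuming the keyword arguments mirror the item names (messages, skill_json_path, messages_json_path):

# 1.2 From a conversation history in OpenAI messages format (assumed messages= keyword)
messages = [
    {"role": "user", "content": "help me extract pages 3-8 from a PDF"},
    {"role": "assistant", "content": "Here is a script that does that ..."},
]
skill = creator.create(messages=messages)

# 1.3 From a previously saved skill JSON file (assumed skill_json_path= keyword)
skill = creator.create(skill_json_path="path/to/skill.json")

# 1.4 From a JSON file containing the messages (assumed messages_json_path= keyword)
skill = creator.create(messages_json_path="path/to/messages.json")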

1.5 Create a skill from code file content

code_content = """
import json

def convert_to_openai_messages(messages):
    new_messages = []

    for message in messages:  
        new_message = {
            "role": message["role"],
            "content": ""
        }

        if "message" in message:
            new_message["content"] = message["message"]

        if "code" in message:
            new_message["function_call"] = {
                "name": "run_code",
                "arguments": json.dumps({
                    "language": message["language"],
                    "code": message["code"]
                }),
                # parsed_arguments isn't actually an OpenAI thing, it's an OI thing.
                # but it's soo useful! we use it to render messages to text_llms
                "parsed_arguments": {
                    "language": message["language"],
                    "code": message["code"]
                }
            }

        new_messages.append(new_message)

        if "output" in message:
            output = message["output"]

            new_messages.append({
                "role": "function",
                "name": "run_code",
                "content": output
            })

    return new_messages
"""
skill = creator.create(file_content=code_content)

1.6 Create a skill from doc file content

doc_content = """
# Installation
\`\`\`shell
pip install langchain openai 
\`\`\`
The chat model will respond with a message.
\`\`\`python
from langchain.schema import (
    AIMessage,
    HumanMessage,
    SystemMessage
)
from langchain.chat_models import ChatOpenAI

chat = ChatOpenAI()
chat([HumanMessage(content="Translate this sentence from English to French: I love programming.")])
\`\`\`
you will get AIMessage(content="J'adore la programmation.", additional_kwargs={}, example=False)

We can then wrap our chat model in a ConversationChain, which has built-in memory for remembering past user inputs and model outputs.

\`\`\`python
from langchain.chains import ConversationChain  
  
conversation = ConversationChain(llm=chat)  
conversation.run("Translate this sentence from English to French: I love programming.")
\`\`\`
output: 'Je adore la programmation.'

conversation.run("Translate it to German.")

output: 'Ich liebe Programmieren.'
"""

skill = creator.create(file_content=doc_content)

1.7 Create a skill from file path

skill = creator.create(file_path="creator/utils/partial_json_parse.py")

1.8 Create a skill from huggingface

skill = creator.create(huggingface_repo_id="YourRepoID", huggingface_skill_path="your_skill_path")

2. Save a Skill

  • 2.1 Save to default path
  • 2.2 Save to specific skill path
  • 2.3 Save to huggingface

2.1 Save to default path

creator.save(skill)

2.2 Save to specific skill path

creator.save(skill, skill_path="path/to/your/skill/directory")

2.3 Save to huggingface

creator.save(skill, huggingface_repo_id="YourRepoID")

3. Search skills

  • 3.1 Local Search

3.1 Local Search

skills = creator.search("your_search_query")
for skill in skills:
    print(skill)

4. Use a skill

  • 4.1 Use a skill

4.1 Use a skill

from rich.markdown import Markdown
from rich import print
skill = creator.search("pdf extract section")[0]
input_args = {
    "pdf_path": "creator.pdf",
    "start_page": 3,
    "end_page": 8,
    "output_path": "creator3-8.pdf"
}
print(Markdown(repr(skill)))
resp = skill.run(input_args)
print(resp)

Contributing

We welcome contributions from the community! Whether it's a bug fix, a new feature, or a skill to add to the library, your contributions are valued. Please check our Contributing Guidelines for details.

License

Open Creator is licensed under the MIT License. You are permitted to use, copy, modify, distribute, sublicense and sell copies of the software.

Reference

[1] Lucas, K. (2023). open-interpreter [Software]. Available at: https://github.com/KillianLucas/open-interpreter

[2] Qian, C., Han, C., Fung, Y. R., Qin, Y., Liu, Z., & Ji, H. (2023). CREATOR: Disentangling Abstract and Concrete Reasonings of Large Language Models through Tool Creation. arXiv preprint arXiv:2305.14318.

[3] Wang, G., Xie, Y., Jiang, Y., Mandlekar, A., Xiao, C., Zhu, Y., Fan, L., & Anandkumar, A. (2023). Voyager: An Open-Ended Embodied Agent with Large Language Models. arXiv preprint arXiv:2305.16291.

Paper and Citation

If you find our work useful, please consider citing us!

@techreport{gong2023opencreator,
  title = {Open-Creator: Bridging Code Interpreter and Skill Library},
  author = {Gong, Junmin and Wang, Sen and Zhao, Wenxiao and Guo, Jing},
  year = {2023},
  month = {9},
  url = {https://github.com/timedomain-tech/open-creator/blob/main/docs/tech_report/open-creator.pdf},
}



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

open_creator-0.1.0.tar.gz (39.2 kB)

Uploaded Source

Built Distribution

open_creator-0.1.0-py3-none-any.whl (52.3 kB)

Uploaded Python 3

File details

Details for the file open_creator-0.1.0.tar.gz.

File metadata

  • Download URL: open_creator-0.1.0.tar.gz
  • Upload date:
  • Size: 39.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.3

File hashes

Hashes for open_creator-0.1.0.tar.gz

  • SHA256: 17c14753870ba5c3bf0f9da1f156f419f91e67b1cf0b57dfd1e5c23460855647
  • MD5: 5f0ddb5092fd0909c22e3f56307790fc
  • BLAKE2b-256: c17ffbe5e71427cb3a761c4b6e6b022a8b5bcc0ded4bb1f21b527c8f5edba003

See more details on using hashes here.
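
For example, you can verify a downloaded archive against the SHA256 digest listed above with the standard library (the filename assumes the file sits in the current directory):

import hashlib

expected = "17c14753870ba5c3bf0f9da1f156f419f91e67b1cf0b57dfd1e5c23460855647"

# Hash the downloaded sdist and compare it with the published digest.
with open("open_creator-0.1.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == expected, "SHA256 mismatch: the download may be corrupted or tampered with"
print("SHA256 verified")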

File details

Details for the file open_creator-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: open_creator-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 52.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.3

File hashes

Hashes for open_creator-0.1.0-py3-none-any.whl

  • SHA256: d4684e092571ef4278a3981995b99d3ded6a9aad69047e40b6ec4ab528200208
  • MD5: efd45743ef16ed0f5b9ad8722025786b
  • BLAKE2b-256: 650480c7d91d1181d1307315ce78496bd2942b71f8be6f22794af001629b1fb3

See more details on using hashes here.
