
Project description


A small tool for automating the collection of data from ChatGPT over long periods of time.


What does it do?

ChatGPT currently limits the number of questions that users may ask per hour. The goal of this project is to let users leave their computers on for extended periods of time to collect large amounts of responses from ChatGPT. There might not be a lot of practical use for this; its main use is in research or data analysis.

Install as a Python Library

pip install sleepyask


Documentation

Authentication

You are required to provide an organization ID as well as an API key.

organization - Your organization ID. Get it here: https://platform.openai.com/account/org-settings
api_key - Your OpenAI API key. Get it here: https://platform.openai.com/account/api-keys, or create one by clicking your profile picture in the top-right > View API keys > Create new secret key.

count - This specifies the number of workers to create for asking questions. You can have multiple workers asking questions in parallel.

Sample config

config = {
	"organization": "Your OpenAI organization",
	"api_key": "Your OpenAI api key",
	"count": 1 
}

Sample code

from sleepyask.openai import chat

# Your OpenAI API credentials
config_1 = {
	"organization": "Your OpenAI organization",
	"api_key": "Your OpenAI api key",
	"count": 1
}

config_2 = {
	"organization": "Your OpenAI organization",
	"api_key": "Your OpenAI api key",
	"count": 1
}

configs = [config_1, config_2]

# List of questions you would like to ask ChatGPT
question_list = [
  'What is 1 + 1?',
  'What is 1 + 2?',
  'What is 1 + 3?'
]

# The file in which you would like your responses to be stored.
# sleepyask will create this file for you; creating it yourself may cause problems.
output_file_path = 'draw.json'

# Run sleepyask
chat.ask(configs=configs,
         questions=question_list,
         output_file_path=output_file_path,
         verbose=True)

# chat.ask has the following optional parameters:
# verbose : bool = Whether or not sleepyask should print its prompts. False by default.
# model : str = The OpenAI chat model to query. "gpt-3.5-turbo" by default.
# system_text : str | None = System text used to prime ChatGPT. None by default.
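
As an example, the optional parameters can be passed alongside the required ones. The following is a minimal sketch that reuses the configs and question_list defined above; the output filename and system prompt are illustrative values, not part of the library.

# Minimal sketch: chat.ask with the optional parameters set explicitly
chat.ask(configs=configs,
         questions=question_list,
         output_file_path='arithmetic.json',          # illustrative output file
         model='gpt-3.5-turbo',                       # which chat model to query
         system_text='Answer with a single number.',  # illustrative system prompt
         verbose=False)                               # suppress prompt printing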

Get involved

  • 🐛 Found a bug or interested in adding a feature? - Create an issue
  • 🤗 Want to help? - Create a pull request!

Credits

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

sleepyask-4.3.0.tar.gz (17.0 kB)


Built Distribution

sleepyask-4.3.0-py3-none-any.whl (17.5 kB)

