
A small tool for automating the collection of data from ChatGPT

Project description


A small tool for automating the collection of data from ChatGPT over long periods of time.


What does it do?

ChatGPT currently limits the number of questions that users may ask per hour. The goal of this project is to let users simply leave their computers running for extended periods of time in order to collect large amounts of responses from ChatGPT. There may not be many practical uses for this; its main use is in research and data analysis.

Install as a Python Library

pip install sleepyask
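
If you want to pin the exact release documented on this page, you can also install a specific version:

pip install sleepyask==4.3.0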


Documentation

Authentication

You are required to provide an organization ID as well as an API key.

organization - Your organization ID. Get it here: https://platform.openai.com/account/org-settings
api_key - Your OpenAI API key. Get it here: https://platform.openai.com/account/api-keys (click your profile picture in the top-right > View API keys > Create new secret key).
count - The number of workers to create for asking questions. Multiple workers can ask questions in parallel.

Sample config

config = {
	"organization": "Your OpenAI organization",
	"api_key": "Your OpenAI api key",
	"count": 1 
}

Sample code

from sleepyask.openai import chat

# Your OpenAI API credentials (see the Authentication section above)
config_1 = {
	"organization": "Your OpenAI organization",
	"api_key": "Your OpenAI api key",
	"count": 1
}

config_2 = {
	"organization": "Your OpenAI organization",
	"api_key": "Your OpenAI api key",
	"count": 1
}

configs = [config_1, config_2]

# List of questions you would like to ask ChatGPT
question_list = [
  'What is 1 + 1?',
  'What is 1 + 2?',
  'What is 1 + 3?'
]

# The filename in which you would like your responses to be stored.
# sleepyask creates this file for you; creating it yourself beforehand may cause problems.
output_file_path = 'draw.json'  

# Run sleepy_ask
chat.ask(configs=configs,
         questions=question_list,
         output_file_path=output_file_path,
         verbose=True)

# chat.ask has the following optional parameters:
# verbose : bool = Whether or not sleepyask should print its prompts. It is False by default.
# model: str = The ChatGPT model to ask. This is "gpt-3.5-turbo" by default.
# system_text: str | None = System text to prime ChatGPT. This is None by default.
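
For example, assuming the optional parameters behave as described above, a run that pins the model and supplies a system prompt could look like the following sketch (the model name and system text are illustrative values, not required by sleepyask):

# A minimal sketch using the documented optional parameters.
chat.ask(configs=configs,
         questions=question_list,
         output_file_path=output_file_path,
         verbose=False,  # do not print prompts while running
         model="gpt-3.5-turbo",  # the documented default model
         system_text="You are a concise math tutor.")  # system text used to prime ChatGPT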

Get involved

  • 🐛 Found a bug or interested in adding a feature? - Create an issue
  • 🤗 Want to help? - Create a pull request!

Credits

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

sleepyask-4.3.0.tar.gz (17.0 kB)

Uploaded Source

Built Distribution

sleepyask-4.3.0-py3-none-any.whl (17.5 kB)

Uploaded Python 3

File details

Details for the file sleepyask-4.3.0.tar.gz.

File metadata

  • Download URL: sleepyask-4.3.0.tar.gz
  • Upload date:
  • Size: 17.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for sleepyask-4.3.0.tar.gz

  • SHA256: 71cc595ece554244ea5d21d9e4c45e8e6d63cd1f185c812059bb9cf27c0fe07c
  • MD5: 3b78235b4e2574af110af404dec93cc1
  • BLAKE2b-256: 7205ed04362fde7952f56242562cbad6b37db99078c429d28f59bbbeb6a24f07

See more details on using hashes here.
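
If you want to verify a download locally, the SHA256 digest published above can be checked with a short Python snippet (this sketch assumes the source archive has been downloaded into the current directory):

import hashlib

# Compare the local file's SHA256 against the digest published above.
expected = "71cc595ece554244ea5d21d9e4c45e8e6d63cd1f185c812059bb9cf27c0fe07c"

with open("sleepyask-4.3.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "MISMATCH")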

File details

Details for the file sleepyask-4.3.0-py3-none-any.whl.

File metadata

  • Download URL: sleepyask-4.3.0-py3-none-any.whl
  • Upload date:
  • Size: 17.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for sleepyask-4.3.0-py3-none-any.whl

  • SHA256: fb9af7443503c26de0bff16d4b22583025bb74fff81495c44afee47fde7bab77
  • MD5: f9474edc488932fd35eebb2648f6fb05
  • BLAKE2b-256: 0339d4be9af1f272d683cc591ae798c0b15150ef7176f8fa6fa9e3f7ca8efbb6

See more details on using hashes here.
