Socket-based AI chat server and multi-user chat client
Project description
socktalk - A Robust Socket-based AI Chat Server and Multi-user Chat Client
socktalk is a chat server and client built with Python socket programming. It supports real-time, multi-user chat and integrates with OpenAI's GPT models to provide an AI-driven chatting experience.
Features
- Multi-User Support: Manages multiple connections simultaneously to support multi-user interactions.
- Efficient Message Handling: Uses non-blocking sockets and select.select() for real-time, concurrent message processing.
- AI Integration: Augments multi-user chat with AI-driven responses, configurable for different operational modes.
- Standardized Protocol: Implements a simple message protocol with fixed-length headers to streamline communication.
- Versatile Client Options: Offers both a sophisticated graphical user interface and a lightweight terminal-based client.
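The fixed-length-header protocol mentioned above can be sketched as a pair of framing helpers. The 10-byte, space-padded ASCII header here is an assumption for illustration, not necessarily socktalk's exact wire format:

```python
HEADER_LENGTH = 10  # assumed header width; socktalk's actual size may differ

def frame_message(text: str) -> bytes:
    """Prefix a UTF-8 payload with a fixed-width, space-padded length header."""
    payload = text.encode("utf-8")
    header = f"{len(payload):<{HEADER_LENGTH}}".encode("utf-8")
    return header + payload

def parse_message(data: bytes) -> str:
    """Read the fixed-width header, then decode exactly that many payload bytes."""
    length = int(data[:HEADER_LENGTH].decode("utf-8").strip())
    return data[HEADER_LENGTH:HEADER_LENGTH + length].decode("utf-8")
```

Because the header width is fixed, a receiver always knows how many bytes to read before the payload begins, which keeps message boundaries unambiguous on a stream socket.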
For further details, visit the GitHub repository for socktalk.
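The select.select()-based message handling can be demonstrated in miniature with a connected socket pair standing in for the server's client sockets; this is an illustrative sketch, not socktalk's actual server loop:

```python
import select
import socket

def ready_to_read(sockets, timeout=0.1):
    """Return the subset of sockets with data waiting, without blocking."""
    readable, _, _ = select.select(sockets, [], [], timeout)
    return readable

# A connected pair stands in for the server side and one client.
server_side, client_side = socket.socketpair()
client_side.send(b"hello")
waiting = ready_to_read([server_side])
```

In a real server loop, the list passed to select.select() would hold the listening socket plus every connected client, so one thread can service all of them as data arrives.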
Installation
From GitHub
To install "socktalk" from GitHub:
- Clone the repository:
git clone https://github.com/mdkmk/socktalk
cd socktalk
- Install the package:
python setup.py install
Using pip
- To install using pip directly from PyPI:
pip install socktalk
Usage
Run the different components of "socktalk" using the following commands:
Set up a .env file in your working directory to configure AI behavior and server/client settings without command-line flags. Alternatively, command-line flags can override the .env file or default settings.
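The precedence just described (command-line flag first, then .env value, then built-in default) amounts to a simple lookup. The helper name and the defaults below are illustrative assumptions, not socktalk's internals:

```python
# Illustrative defaults, matching the example .env values shown later.
DEFAULTS = {"SERVER_IP": "127.0.0.1", "SERVER_PORT": "1234"}

def resolve_setting(name, cli_flags, env_values):
    """Flags override .env values, which override built-in defaults."""
    if cli_flags.get(name) is not None:
        return cli_flags[name]
    if name in env_values:
        return env_values[name]
    return DEFAULTS.get(name)
```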
AI-enhanced chat server:
socktalk --ai
With flag override:
socktalk --ai --openai_api_key=YOUR_OPENAI_API_KEY_HERE --ai_mode2_active=False --ai_mode1_interval=3
Chat server without AI integration:
socktalk --server
With flag override:
socktalk --server --server_ip=127.0.0.1 --server_port=1234
GUI-enabled multi-user chat client:
socktalk --client
With flag override:
socktalk --client --server_ip=127.0.0.1 --server_port=1234
Terminal-based chat client:
socktalk --terminal
With flag override:
socktalk --terminal --server_ip=127.0.0.1 --server_port=1234
AI Chatbot Configuration
AI Client Modes:
- Respond every N lines: The AI reads the conversation and responds after every N lines.
- Respond every N seconds: The AI generates new content at specified intervals.
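Both modes reduce to simple triggers, one counting lines and one watching the clock. The class names and logic below are a hedged sketch, not taken from socktalk's source:

```python
import time

class LineTrigger:
    """Mode 1: fire after every N chat lines."""
    def __init__(self, interval):
        self.interval = interval
        self.seen = 0

    def on_line(self):
        self.seen += 1
        if self.seen >= self.interval:
            self.seen = 0
            return True
        return False

class TimeTrigger:
    """Mode 2: fire once every N seconds of wall-clock time."""
    def __init__(self, interval):
        self.interval = interval
        self.last_fired = time.monotonic()

    def due(self):
        now = time.monotonic()
        if now - self.last_fired >= self.interval:
            self.last_fired = now
            return True
        return False
```

The AI client would call on_line() for each incoming chat message and poll due() in its main loop, generating a response whenever either trigger fires.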
Example .env File Configuration
Below is an example .env file. Set your OpenAI API key and ensure your OpenAI account has at least $5 of credit; a valid OPENAI_API_KEY is required for AI functionality. The AI client has two modes, each of which can be toggled on or off. The AI response intervals, model, and content prompt are configurable, and the full chat history can optionally be sent to the API for context. Any .env variables not specified (or passed as flags) fall back to the defaults shown in the example below.
OPENAI_API_KEY=YOUR_OPENAI_API_KEY_HERE
SERVER_IP=127.0.0.1
SERVER_PORT=1234
SEND_FULL_CHAT_HISTORY=True
AI_MODE1_ACTIVE=True
AI_MODE1_INTERVAL=1
AI_MODE1_MODEL=gpt-3.5-turbo
AI_MODE2_ACTIVE=True
AI_MODE2_INTERVAL=60
AI_MODE2_MODEL=gpt-3.5-turbo
AI_MODE2_CONTENT="Say something interesting from a random Wikipedia page and start your response with 'Did you know', but don't mention the source."
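Since .env values arrive as strings, settings like AI_MODE1_ACTIVE=True and AI_MODE1_INTERVAL=1 need coercion before use. A minimal sketch (the helper names are assumptions, not socktalk's API):

```python
def parse_bool(raw, default=False):
    """Coerce .env strings like 'True'/'false'/'1' to a Python bool."""
    if raw is None:
        return default
    return raw.strip().lower() in ("1", "true", "yes")

def parse_int(raw, default=0):
    """Coerce .env strings to int, falling back to a default on bad input."""
    try:
        return int(raw)
    except (TypeError, ValueError):
        return default
```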
Project details
Release history
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file socktalk-0.2.3.tar.gz.
File metadata
- Download URL: socktalk-0.2.3.tar.gz
- Upload date:
- Size: 15.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.8.10
File hashes
Algorithm | Hash digest
---|---
SHA256 | 2b24d68d82ac121fd587ccdf2f7af5b64f904d6427709771caf1c3ee375bbb74
MD5 | fa83006a1032ea4ce9b9dac734658d90
BLAKE2b-256 | 848d02bf3351715d70cd90b262c086b91928d969c9d00b9d7ccf58303ffa8719
File details
Details for the file socktalk-0.2.3-py3-none-any.whl.
File metadata
- Download URL: socktalk-0.2.3-py3-none-any.whl
- Upload date:
- Size: 16.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.8.10
File hashes
Algorithm | Hash digest
---|---
SHA256 | 44fcfe5d0471c8ff5fba6958df84b728f62557ba6700c3df4fb81dcee78bc2c9
MD5 | e4a7697a9c037cfd02290a69b079f646
BLAKE2b-256 | 6fb9fadc2b80384a2abc768736eb2b8501b7d9808316dc695456c58fe05f4497