
hugging-chat-api

English | 简体中文

Unofficial HuggingChat Python API, extensible for chatbots etc.



Installation

pip install hugchat

or

pip3 install hugchat

Usage

API

from hugchat import hugchat
from hugchat.login import Login

# Log in to Hugging Face and grant authorization to HuggingChat
# (email and passwd are your Hugging Face credentials)
sign = Login(email, passwd)
cookies = sign.login()

# Save cookies to the local directory
cookie_path_dir = "./cookies_snapshot"
sign.saveCookiesToDir(cookie_path_dir)

# Load cookies when you restart your program:
# sign = Login(email, None)
# cookies = sign.loadCookiesFromDir(cookie_path_dir)  # Checks whether the cookie JSON file exists: returns the cookies if it does, raises an Exception if it does not.

# Create a ChatBot
chatbot = hugchat.ChatBot(cookies=cookies.get_dict())  # or cookie_path="usercookies/<email>.json"


# Non-streaming response
query_result = chatbot.query("Hi!")
print(query_result) # or query_result.text or query_result["text"]

# Streaming response
for resp in chatbot.query(
    "Hello",
    stream=True
):
    print(resp)  # stream response

# Use web search (new)
query_result = chatbot.query("Hi!", web_search=True)
print(query_result) # or query_result.text or query_result["text"]
for source in query_result.web_search_sources:
    print(source.link)
    print(source.title)
    print(source.hostname)

# Create a new conversation
conversation_id = chatbot.new_conversation()
chatbot.change_conversation(conversation_id)

# Get conversation list
conversation_list = chatbot.get_conversation_list()

# Switch model (default: meta-llama/Llama-2-70b-chat-hf)
chatbot.switch_llm(0) # Switch to `OpenAssistant/oasst-sft-6-llama-30b-xor`
chatbot.switch_llm(1) # Switch to `meta-llama/Llama-2-70b-chat-hf`
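
Putting the cookie reuse together, here is a minimal sketch of how a later run might restore the saved cookies instead of logging in again, using only the methods shown above (the email value is a placeholder):

from hugchat import hugchat
from hugchat.login import Login

email = "your_email@example.com"  # placeholder: your Hugging Face account email
cookie_path_dir = "./cookies_snapshot"

# Reuse cookies saved by a previous run; loadCookiesFromDir raises an Exception if the JSON file is missing
sign = Login(email, None)
cookies = sign.loadCookiesFromDir(cookie_path_dir)

chatbot = hugchat.ChatBot(cookies=cookies.get_dict())
print(chatbot.query("Hi again!"))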

The query() function receives these parameters:

  • text: Required[str].
  • temperature: Optional[float]. Default is 0.9
  • top_p: Optional[float]. Default is 0.95
  • repetition_penalty: Optional[float]. Default is 1.2
  • top_k: Optional[int]. Default is 50
  • truncate: Optional[int]. Default is 1024
  • watermark: Optional[bool]. Default is False
  • max_new_tokens: Optional[int]. Default is 1024
  • stop: Optional[list]. Default is ["</s>"]
  • return_full_text: Optional[bool]. Default is False
  • stream: Optional[bool]. Default is True
  • use_cache: Optional[bool]. Default is False
  • is_retry: Optional[bool]. Default is False
  • retry_count: Optional[int]. Number of retries for requesting huggingchat. Default is 5
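
For example, a call that overrides a few of these defaults might look like the following (the prompt and parameter values are illustrative only):

# Illustrative only: tune sampling parameters for a single, non-streamed query
query_result = chatbot.query(
    "Summarize the plot of Hamlet in two sentences.",
    temperature=0.5,
    top_p=0.9,
    max_new_tokens=256,
    stream=False,
)
print(query_result)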

CLI

Requires version 0.0.5.2 or newer.

Simply run the following command in your terminal to start CLI mode:

python -m hugchat.cli

CLI params (an example invocation follows the list):

  • -u <your huggingface email> : Provide your account email to log in.
  • -p : Force a password prompt for login, ignoring saved cookies.
  • -s : Enable streaming output in CLI mode.
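
For example, to log in with your email and enable streaming output (the email below is a placeholder):

python -m hugchat.cli -u your_email@example.com -s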

Commands in CLI mode:

  • /new : Create and switch to a new conversation.

  • /ids : Shows a list of all ID numbers and ID strings in the current session.

  • /switch <id> : Switches to the conversation with the given ID number or ID string.

  • /del <id> : Deletes the conversation with the given ID number or ID string. Will not delete the active session.

  • /clear : Clears the terminal.

  • /llm : Lists the available models you can switch to.

  • /llm <index> : Switches to the model at the given index, as listed by /llm.

  • /sharewithauthor <on|off> : Toggles sharing data with the model author. On by default.

  • /exit : Closes the CLI environment.

  • /stream <on|off> : Toggles streaming responses.

  • /web <on|off> : Toggles web search.

  • /web-hint <on|off> : Toggles display of the web search hint.

AI is an area of active research with known problems such as biased generation and misinformation. Do not use this application for high-stakes decisions or advice.

Server resources are precious; please do not send requests to this API at a high frequency. (Hugging Face's CTO 🤗 liked this suggestion.)

Disclaimers

This is not an official Hugging Face product. This is a personal project and is not affiliated with Hugging Face in any way. Don't sue us.

Star History

Star History Chart

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hugchat-0.3.1.tar.gz (27.4 kB)

Built Distribution

hugchat-0.3.1-py3-none-any.whl (26.9 kB)
