
Use LLMs for Free.


FREE_LLMs


🤔 What is Free_LLMs?

Free LLMs is a framework that lets you drive the browser-based interfaces of large language models such as ChatGPT in an API-like style, for free. It provides an easier way to interact with browser-based LLMs and nothing else; all ownership belongs to the original owners of the respective LLMs.

Quick Install

With pip:

pip install free_llms

Models Supported:

  • ChatGPT
  • Perplexity AI
  • Mistral
  • Claude

ChatGPT

from free_llms.models import GPTChrome

driver_config = []  # pass in Selenium driver config, except for ["--disable-gpu", "--window-size=1920,1080"], which are set already
with GPTChrome(
    driver_config=driver_config,
    email="",  # for GPT we do not need an email
    password="",  # for GPT we do not need a password
) as session:  # A single session started with ChatGPT
    data = session.send_prompt("Write an SQL query which shows how to get the third highest salary")  # First message
    data1 = session.send_prompt("Now convert it into Python")  # Second message
    print(session.messages)  # Messages in the current session, in pairs of <Human, AI>
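If you want to do more with `session.messages` than print it, you can post-process the pairs. A minimal sketch, assuming `session.messages` is a list of `(human, ai)` string tuples as the `<Human, AI>` comment suggests (the exact shape may differ; `sample_messages` below is made-up stand-in data, not real model output):

```python
# Stand-in for session.messages: a list of (human, ai) pairs.
sample_messages = [
    ("Write an SQL query for the third highest salary",
     "SELECT DISTINCT salary FROM employees ORDER BY salary DESC LIMIT 1 OFFSET 2;"),
    ("Now convert it into Python",
     "def third_highest(salaries): ..."),
]

def format_transcript(pairs):
    """Render <Human, AI> pairs as a readable transcript."""
    lines = []
    for human, ai in pairs:
        lines.append(f"Human: {human}")
        lines.append(f"AI: {ai}")
    return "\n".join(lines)

print(format_transcript(sample_messages))
```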

Perplexity AI

from free_llms.models import PreplexityChrome

driver_config = []  # pass in Selenium driver config, except for ["--disable-gpu", "--window-size=1920,1080"], which are set already
with PreplexityChrome(
    driver_config=driver_config,
    email="",  # for Perplexity we do not need an email
    password="",  # for Perplexity we do not need a password
) as session:  # A single session started with Perplexity
    data = session.send_prompt("Make the following sentence correct: I did went to Lahore.")  # First message
    data = session.send_prompt("Who is George Hotz?")  # Second message; right now each message is independent in Perplexity AI
    print(session.messages)  # Messages in the current session, in pairs of <Human, AI>

Mistral

from free_llms.models import MistralChrome

driver_config = []  # pass in Selenium driver config, except for ["--disable-gpu", "--window-size=1920,1080"], which are set already
with MistralChrome(
    driver_config=driver_config,
    email="your_email@example.com",  # Mistral email
    password="",  # Mistral password
) as session:  # A single session started with Mistral
    session.send_prompt("Write a short horror story of 100 words")
    session.send_prompt("Make it funny")
    print(session.messages)

Claude

from free_llms.models import ClaudeChrome

driver_config = []  # pass in Selenium driver config, except for ["--disable-gpu", "--window-size=1920,1080"], which are set already
with ClaudeChrome(
    driver_config=driver_config,
    email="your_email@example.com",  # Claude email
    password="",  # password not needed for ClaudeChrome
) as session:  # A single session started with ClaudeChrome
    # Once you log in, you will receive a code at your email which you need to type in
    session.send_prompt("What is Silicon Valley?")
    session.send_prompt("How many seasons did it have?")
    print(session.messages)

Integration with LangChain

from free_llms.langchain_model import FreeLLMs
from langchain.prompts import PromptTemplate

# model_name can be any of the following: GPTChrome, PreplexityChrome, MistralChrome, ClaudeChrome
model = FreeLLMs(
    model_name="PreplexityChrome",
    llm_kwargs={"driver_config": [], "email": "email", "password": ""},
)

prompt = PromptTemplate.from_template("Write me a joke about {topic}")
chain = prompt | model | str
print(chain.invoke({"topic": "coding"}))
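The trailing `str` in the chain works because LangChain's `|` operator coerces plain Python callables into runnable steps, so the model's output is passed through `str` as a final stage. The same left-to-right composition can be sketched in plain Python (the `fake_model` below is a stand-in, not part of free_llms or LangChain):

```python
from functools import reduce

def pipe(*stages):
    """Compose callables left to right, like LCEL's `|` operator."""
    return lambda x: reduce(lambda acc, f: f(acc), stages, x)

fill_prompt = lambda inputs: f"Write me a joke about {inputs['topic']}"
fake_model = lambda prompt_text: {"text": f"(model reply to: {prompt_text})"}

chain = pipe(fill_prompt, fake_model, str)
print(chain({"topic": "coding"}))
```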

Note:

  • Free_LLMs only uses a patched Chrome driver as its main driver. The driver can be found here
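Since `driver_config` is a list of Chrome command-line flags, you can pass additional options to the patched driver. A sketch with some commonly used Chromium flags (these are assumptions about what your environment needs, not requirements of the library; remember to leave out `--disable-gpu` and `--window-size=1920,1080`, which the library adds itself):

```python
# Hypothetical extra flags for the patched Chrome driver; pass this list as
# driver_config to any of the *Chrome classes above.
driver_config = [
    "--headless=new",           # run Chrome without a visible window
    "--no-sandbox",             # often required when running inside containers
    "--disable-dev-shm-usage",  # avoid /dev/shm exhaustion in Docker
]
```

Note that a headless window may interfere with flows that need manual input, such as typing in the ClaudeChrome login code.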
