
Create your chatbot or AI agent using IntelliNode. We make any model smarter.

Project description

Intelli

Create chatbots and AI agent workflows. The IntelliNode Python module connects your data to multiple AI models, such as OpenAI, Gemini, Anthropic, Stable Diffusion, and Mistral, through a unified access layer.

pip install intelli

Latest changes

  • Added Anthropic Claude 3 as a chatbot provider.
  • Added KerasAgent to load open-source models offline.

For detailed instructions, refer to the intelli documentation.

Code Examples

Create Chatbot

Switch between multiple chatbot providers without changing your code.

from intelli.function.chatbot import Chatbot, ChatProvider
from intelli.model.input.chatbot_input import ChatModelInput

def call_chatbot(provider, model=None):
    # prepare the common chat input
    chat_input = ChatModelInput("You are a helpful assistant.", model)
    chat_input.add_user_message("What is the capital of France?")

    # create the chatbot instance for the selected provider
    bot = Chatbot(YOUR_API_KEY, provider)
    response = bot.chat(chat_input)

    return response

# call openai
call_chatbot("openai", "gpt-4")

# call mistralai
call_chatbot("mistral", "mistral-medium")

# call claude3
call_chatbot(ChatProvider.ANTHROPIC, "claude-3-sonnet-20240229")

# call google gemini
call_chatbot("gemini")

Chat With Docs

Chat with your docs using multiple LLMs. To connect your data, visit the IntelliNode App, start a project using the Document option, upload your documents or images, and copy the generated One Key. This key will be used to connect the chatbot to your uploaded data.

# creating chatbot with the intellinode one key
bot = Chatbot(YOUR_OPENAI_API_KEY, "openai", {"one_key": YOUR_ONE_KEY})

input = ChatModelInput("You are a helpful assistant.", "gpt-3.5-turbo")
input.add_user_message("What is the procedure for requesting a refund according to the user manual?")

response = bot.chat(input)

Generate Images

Use the image controller to generate art from multiple models with minimal code changes:

from intelli.controller.remote_image_model import RemoteImageModel
from intelli.model.input.image_input import ImageModelInput

# model details - change only two words to switch
provider = "openai"
model_name = "dall-e-3"

# prepare the input details
prompt = "cartoonishly-styled solitary snake logo, looping elegantly to form both the body of the python and an abstract play on data nodes."
image_input = ImageModelInput(prompt=prompt, width=1024, height=1024, model=model_name)

# call the model openai/stability
wrapper = RemoteImageModel(your_api_key, provider)
results = wrapper.generate_images(image_input)

Create AI Flows

You can create a flow of tasks executed by different AI models. Here's an example of creating a blog post flow:

  • A ChatGPT agent to write the post.
  • A Google Gemini agent to write an image description.
  • A Stable Diffusion agent to generate the image.

from intelli.flow.agents.agent import Agent
from intelli.flow.tasks.task import Task
from intelli.flow.sequence_flow import SequenceFlow
from intelli.flow.input.task_input import TextTaskInput
from intelli.flow.processors.basic_processor import TextProcessor

# define agents
blog_agent = Agent(agent_type='text', provider='openai', mission='write blog posts', model_params={'key': YOUR_OPENAI_API_KEY, 'model': 'gpt-4'})
copy_agent = Agent(agent_type='text', provider='gemini', mission='generate description', model_params={'key': YOUR_GEMINI_API_KEY, 'model': 'gemini'})
artist_agent = Agent(agent_type='image', provider='stability', mission='generate image', model_params={'key': YOUR_STABILITY_API_KEY})

# define tasks
task1 = Task(TextTaskInput('blog post about electric cars'), blog_agent, log=True)
task2 = Task(TextTaskInput('Generate short image description for image model'), copy_agent, pre_process=TextProcessor.text_head, log=True)
task3 = Task(TextTaskInput('Generate cartoon style image'), artist_agent, log=True)

# start sequence flow
flow = SequenceFlow([task1, task2, task3], log=True)
final_result = flow.start()

To build async AI flows with multiple paths, refer to the flow tutorial.
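
As a rough sketch of what a multi-path flow might look like, the snippet below reuses the agents and tasks defined in the sequence example. The Flow import path and its tasks/map_paths parameters are assumptions taken from the flow tutorial, so verify them against the documentation before use.

import asyncio

from intelli.flow.flow import Flow  # assumed module path for the graph-based flow

# reuse task1, task2, and task3 from the sequence example above;
# map_paths (assumed parameter) routes each task's output to its dependents
flow = Flow(
    tasks={"blog": task1, "description": task2, "image": task3},
    map_paths={"blog": ["description"], "description": ["image"], "image": []},
    log=True,
)

# the graph flow executes asynchronously
results = asyncio.run(flow.start())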

Pillars

  • The wrapper layer provides low-level access to the latest AI models.
  • The controller layer offers a unified input to any AI model by handling the differences.
  • The function layer provides abstract functionality that extends based on the app's use cases.
  • The flows layer lets you create a flow of AI agents working toward user tasks (a short sketch follows below).
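
As a quick illustration of how the layers relate, the snippet below reuses only the classes already shown above: the function layer (Chatbot) for high-level chat, and the controller layer (RemoteImageModel) for unified image generation. The variable names are placeholders.

from intelli.function.chatbot import Chatbot
from intelli.model.input.chatbot_input import ChatModelInput
from intelli.controller.remote_image_model import RemoteImageModel
from intelli.model.input.image_input import ImageModelInput

# function layer: the same ChatModelInput works for any supported chat provider
chat_input = ChatModelInput("You are a helpful assistant.", "gpt-4")
chat_input.add_user_message("Suggest a tagline for a Python AI library.")
tagline = Chatbot(YOUR_OPENAI_API_KEY, "openai").chat(chat_input)

# controller layer: the same ImageModelInput works for openai or stability
image_input = ImageModelInput(prompt="a snake logo", width=1024, height=1024, model="dall-e-3")
images = RemoteImageModel(YOUR_OPENAI_API_KEY, "openai").generate_images(image_input)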

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

intelli-0.4.5.tar.gz (43.4 kB)

Uploaded Source

Built Distribution

intelli-0.4.5-py3-none-any.whl (61.8 kB)

Uploaded Python 3

File details

Details for the file intelli-0.4.5.tar.gz.

File metadata

  • Download URL: intelli-0.4.5.tar.gz
  • Upload date:
  • Size: 43.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for intelli-0.4.5.tar.gz

  • SHA256: f451984adcb84de49ac7296d3ee602eb62981bc86a2461ac2c8ed5a7d15a109d
  • MD5: 0e2777abb668e2a54e4bc895aa9618b1
  • BLAKE2b-256: 3196d7dbbf29d28bc6d422d6c98f886a254b23976e3e79388470bc3e1eccf5cc

See more details on using hashes here.
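
As an example of using the digests above, a downloaded archive can be checked with Python's standard hashlib before installation; the local file path here is an assumption about where the archive was saved.

import hashlib

# assumed local path to the downloaded source distribution
path = "intelli-0.4.5.tar.gz"
expected = "f451984adcb84de49ac7296d3ee602eb62981bc86a2461ac2c8ed5a7d15a109d"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "hash mismatch")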

File details

Details for the file intelli-0.4.5-py3-none-any.whl.

File metadata

  • Download URL: intelli-0.4.5-py3-none-any.whl
  • Upload date:
  • Size: 61.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for intelli-0.4.5-py3-none-any.whl

  • SHA256: 7a0a5fcd055dac84eb33b99e640da3db0545aaa448de33f18598189889634a23
  • MD5: a110de1c9b27a54590ea340d8fe9065c
  • BLAKE2b-256: 09cfd8469995185d32351f7e851a948a80e434ad4a44fe73d3c9de903a40527b

See more details on using hashes here.
