
Description of Noema


Noema is an application of the declarative programming paradigm to a language model. With Noema, you can control the model and choose the path it will follow. This framework aims to let developers use an LLM as an interpreter, not as a source of truth. Noema stands on the shoulders of llama.cpp and guidance.

Concept

  • Noesis: can be seen as the description of a function.
  • Noema: the step-by-step representation of this description.
  • Constitution: the process of transforming a Noesis into a Noema.
  • (Transcendental) Subject: the object producing the Noema through the constitution of the noesis; here, the LLM.
  • Horizon: the environment of the subject, in other words, a context.

Noema/Noesis, Subject, Horizon and Constitution are a pedantic and naive application of concepts borrowed from Husserl's phenomenology.

Installation

pip install Noema

Features

Create the Subject

from Noema import *

s = Subject("path_to_model.gguf") # Fully compatible with llama.cpp

s.add(thought = "Time is the only problem") # store "Time is the only problem" in {thought}
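As a toy stand-in (an assumption drawn from the examples below, not Noema's actual implementation), values added to the subject behave like a variable store that later prompts reference with `{name}` placeholders:

```python
# Toy stand-in, NOT Noema's real implementation: values added to the
# subject act like a variable store, and later prompts reference them
# with {name} placeholders.
store = {}
store["thought"] = "Time is the only problem"  # like s.add(thought=...)

prompt = "You explain why {thought}".format(**store)
print(prompt)  # → You explain why Time is the only problem
```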

Create a horizon and its constitution

from Noema import *

s = Subject("path_to_model.gguf") # Fully compatible with llama.cpp
s.add(thought = "Time is the only problem") # store "Time is the only problem" in {thought}

s = Horizon(
  Sentence(thought_explanation = "You explain why {thought}"), # The produced sentence is stored in {thought_explanation}
  Int(explanation_note = "Give a note between 0 and 10 to qualify the quality of your explanation."), # The model produces a Python int stored in {explanation_note}
).constituteWith(s) # The horizon is constituted by the LLM

# Read the noema
print(s.noema)
# You are functioning in a loop of thought. Here is your reasoning step by step:
#   #THOUGHT_EXPLANATION: Explain why '{thought}'.
#   #EXPLANATION_NOTE: Give a note between 0 and 10 to qualify the quality of your explanation.

# Here is the result of the reasoning:
#  #THOUGHT_EXPLANATION: The reason is that time is the only thing that is constant and cannot be changed.
#  #EXPLANATION_NOTE: 10

# Access each constitution separately
print(s.explanation_note * 2) # The value of 'explanation_note' is an int.
# 20
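The typed access shown above can be illustrated with a plain-Python stand-in (hypothetical, for intuition only): each generator coerces the model's raw text into a real Python value before storing it on the subject.

```python
# Hypothetical stand-in for the subject's variable store: generators
# such as Int() coerce the model's raw text output into Python types.
class ToyStore:
    def __init__(self, **values):
        self.__dict__.update(values)

raw_model_output = "10"  # what the LLM emitted as text
s = ToyStore(explanation_note=int(raw_model_output))  # Int(...) coerces it

print(s.explanation_note * 2)  # usable directly as an int → 20
```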

Simple generators

Generators can be used to generate content from the subject (LLM) through the noesis (here, the task description).

from Noema import *

horizon = Horizon(
  Sentence(var_name = "task description"), # Produces a sentence stored in {var_name}
  Word(var_name = "task description"),     # Produces a word stored in {var_name}
  Int(var_name = "task description"),      # Produces an int stored in {var_name}
  Float(var_name = "task description"),    # Produces a float stored in {var_name}
  Bool(var_name = "task description"),     # Produces a bool stored in {var_name}
)

Composed generators

ListOf can be built with simple generators or a custom Step.

from Noema import *

horizon = Horizon(
  ListOf(Word, var_name = "task description"),  # Produces a list of Word stored in {var_name}
  ListOf(Int, var_name = "task description"),   # Produces a list of int stored in {var_name}
  ...
)

Selector

from Noema import *

s = Subject("path_to_model.gguf")

s = Horizon(
  Select(this_is_the_future = "Are local LLMs the future?", options=["Yes of course","Never!"]), # The model can only choose between "Yes of course" and "Never!"
).constituteWith(s) # The horizon is constituted by the LLM

Information

Information steps are useful for inserting context into the current step of the noesis. Here we use a simple string, but we could also call a Python function to perform RAG or other tasks.

from Noema import *

s = Subject("path_to_model.gguf")

s = Horizon(
    Information("You act like TARS in Interstellar."),
    Sentence(joke = "Tell a short joke."),
    Print("{joke}")
).constituteWith(s)
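As noted above, the string passed to Information can come from any Python call. A hypothetical retrieval helper (the keyword lookup below is a naive stand-in for a real RAG retriever, not part of Noema):

```python
# Hypothetical helper, not part of Noema: build a context string from a
# tiny knowledge base using a naive keyword match.
def retrieve_context(query, knowledge_base):
    hits = [doc for doc in knowledge_base if query.lower() in doc.lower()]
    return " ".join(hits)

kb = [
    "TARS is a robot in Interstellar.",
    "Noema is a declarative framework.",
]
context = retrieve_context("tars", kb)
print(context)  # → TARS is a robot in Interstellar.
```

Its result could then be injected into the horizon with `Information(retrieve_context(...))`.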

Control Flow

IF/ELSE

from Noema import *

s = Subject("path_to_model.gguf")
s.add(thought = "Time is the only problem") # store "Time is the only problem" in {thought}

s = Horizon(
    Var(final_thought=None), # Create a variable final_thought
    Sentence(thought_explanation = "Explain why '{thought}'."),
    Int(explanation_note = "Give a note between 0 and 10 to qualify the quality of your explanation."), 
    Select(auto_analysis="Do some auto-analysis, and choose a word to qualify your note", options=["Fair","Over optimistic","Neutral"]),
    IF(lambda: s.explanation_note < 5, [
        Information("The explanation is not clear enough, and the note is too low."),
        Int(points_to_add = "How many points do you think you should add to be fair?"),
        Sentence(points_explanation = "Explain why you think you should add {points_to_add} points."),
        Var(final_thought = "The explanation is not clear enough, and the note is too low. I should add {points_to_add} points."),
    ],ELSE=[
       IF(lambda: s.auto_analysis == 'Over optimistic', [  
            Int(points_to_remove ="How many points do you think you should remove to be fair?"),
            Sentence(points_explanation = "Explain why you think you should remove {points_to_remove} points."),
            Var(final_thought = "The note is over optimistic. I should remove {points_to_remove} points."),
       ],ELSE=[
            Print("The explanation is clear enough, and the note is fair."),   
            Var(final_thought = "The note is fair."),
        ]),
    ])
).constituteWith(s) # The horizon is constituted by the LLM

print(s.final_thought) # Print the final thought
# The explanation is not clear enough, and the note is too low. I should add 5 points.

ForEach

from Noema import *

s = Subject("path_to_model.gguf")

s = Horizon(
    ListOf(Sentence, problems = "What are the problems you are facing (in order of difficulty)?"), # The model produces a list of sentences stored in {problems}
    ForEach(lambda: s.problems, [
        Sentence(item_explanation = "Explain why '{item}' is the problem No {idx}."), 
        Print("Pb Nb {idx}: {item}. Explanation: {item_explanation}") # Print doesn't interfere with the Noema 
    ])
).constituteWith(s) # The horizon is constituted by the LLM

# Pb Nb 1: I don't know what to do next.. Explanation: Because if you don't know what to do next, you can't make progress and achieve your goals.
# Pb Nb 2: I don't have enough information to make a decision.. Explanation: Because if you don't have enough information, you can't make an informed decision and may make a mistake that could set you back or cause problems down the line.
# Pb Nb 3: I'm not sure if I'm on the right track.. Explanation: Because if you're not sure if you're on the right track, you may be wasting time and effort on a path that won't lead to your goals, and you may not realize it until it's too late to change course.
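Judging from the output above (an assumption about the exact semantics), ForEach binds `{item}` to the current element and `{idx}` to its 1-based position. In plain Python, the loop behaves like:

```python
# Plain-Python analogue of ForEach: {item} is the element, {idx} its
# 1-based index.
problems = [
    "I don't know what to do next.",
    "I don't have enough information to make a decision.",
]
lines = [f"Pb Nb {idx}: {item}" for idx, item in enumerate(problems, start=1)]
for line in lines:
    print(line)
```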

While

from Noema import *

s = Subject("path_to_model.gguf")

s = Horizon(
    Information("You have to choose a job name in the field of computer science."),
    Var(word_length = 0 ),
    While(lambda: s.word_length < 9,[
        Word(job_name = "Give a good job name:"),
        Int(word_length = "How many letters are in the word {job_name}?"),
        Print("Selected job {job_name}"),
        Information("You have to choose a new job name each time."),
    ]),
    Print("The word {job_name} has at least 9 letters."),
    PrintNoema()
).constituteWith(s)

NOESIS

The Noesis is the descriptive process of a thought. You can think of it as a set of rules aimed at a goal. In a function we think in terms of steps; here, you declare how to think about the steps.

A Noesis needs a description, here: "Find a job name in a field.", and can take optional parameters. The Return is optional.

from Noema import *

s = Subject("path_to_model.gguf")

find_job_name = Noesis("Find a job name in a field.",["field_name","max_length"],[
    Information("You have to choose a job name in the field of {field_name}."),
    Var(word_length = 0),
    While(lambda: s.word_length < s.max_length, [
        Word(job_name = "Give a good job name:"),
        Int(word_length = "How many letters are in the word {job_name}?"),
        Print("Selected job {job_name}"),
        Information("You have to choose a new job name each time."),
    ]),
    Return("{job_name} is a good job name in the field of {field_name}.") #Return value
])

s = Horizon(
    Constitute(job_name = lambda:find_job_name(s, field_name="IT",max_length=10)), 
    Print("{job_name} has more than 10 letters."),
).constituteWith(s) # The horizon is constituted by the LLM

# Selected job programmer
# Duration for 'Find a job name in a field.' : 00:01s
# programmer is a good job name in the field of IT. has more than 10 letters.
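As an analogy only (not how Noema executes it), the Noesis above maps onto an ordinary Python function: named parameters, a body of steps, and a return value. The LLM-driven steps are stubbed out here:

```python
# Analogy only: a Noesis is the declarative counterpart of a function.
def find_job_name(field_name, max_length):
    job_name, word_length = "", 0
    while word_length < max_length:
        job_name = "programmer"      # stub for the LLM's Word(...) step
        word_length = len(job_name)  # stub for the Int(...) step
    return f"{job_name} is a good job name in the field of {field_name}."

result = find_job_name("IT", 10)
print(result)  # → programmer is a good job name in the field of IT.
```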

Python Function Call

In the Noesis we can call a Python function. Its parameters can be values extracted from the context, i.e. a Var referenced with {var_name}, and the return value of the called function can be stored in a Var.

from Noema import *

s = Subject("path_to_model.gguf")

def count_letters(word):
    return len(word)

s = Horizon(
    Var(palindrome = "TENET"), # store "TENET" in {palindrome}
    CallFunction(word_length = lambda: count_letters(s.palindrome)), # store the result of the function count_letters in {word_length}
    Print("The word '{palindrome}' has {word_length} letters."),
).constituteWith(s)
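The helper itself is ordinary Python and can be checked independently of the framework:

```python
# The same helper, runnable on its own without a model.
def count_letters(word):
    return len(word)

palindrome = "TENET"
word_length = count_letters(palindrome)
print(f"The word '{palindrome}' has {word_length} letters.")  # → 5 letters
```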
