For Q&A

Project description

LenseGt: An Advanced Generative AI Tool for Document Q&A

LenseGt is a web application designed for both no-code users and developers, enabling document-based Q&A through a retrieval-augmented generation (RAG) approach built on large language models (LLMs). It offers a user-friendly, choice-based environment that supports a variety of document formats, including CSV, XLSX, DOC, TXT, and PPT. Users can select from multiple response generation engines, such as OpenAI, Azure OpenAI, and Huggingface, all within a zero-code environment. For developers, LenseGt serves as a toolkit for generating query responses and integrating them seamlessly into their own solutions. It supports a wide range of models, including proprietary models such as OpenAI and Azure OpenAI as well as local models such as Mixtral, Gemma, Phi, TinyLlama, StableLM, and H2O.

LenseGt is designed to provide versatility and ease of use, making it an essential tool for both non-technical users seeking a straightforward Q&A application and developers looking for a robust development toolkit to enhance their solutions.

Motivation

The development of LenseGt is driven by the need to bridge the gap between non-technical users and developers in the realm of document-based Q&A using advanced Generative AI technologies. Here are the key motivations behind this tool:

  1. Accessibility for No-Code Users:

    • Simplifying AI Interactions: Many users lack the technical expertise to leverage the power of AI for document-based queries. LenseGt democratizes access to advanced AI by providing a zero-code environment, enabling users to easily interact with various document formats and response generation engines.
    • Versatile Document Support: By supporting a wide range of document types (CSV, XLSX, DOC, TXT, PPT), LenseGt ensures that users can work with the documents they are most familiar with, enhancing usability and convenience.
  2. Empowering Developers:

    • Comprehensive Toolkit: For developers, LenseGt offers a robust development toolkit that allows them to generate and integrate query responses into their applications. This flexibility supports the creation of sophisticated AI-driven solutions.
    • Model Versatility: By providing access to both proprietary models (OpenAI, Azure OpenAI) and local models (Mixtral, Gemma, Phi, TinyLlama, StableLM, H2O), LenseGt ensures developers have the freedom to choose the most appropriate model for their specific use case.
  3. Integration and Innovation:

    • Seamless Integration: Developers can easily integrate the query responses generated by LenseGt into their existing solutions, facilitating innovation and the development of new AI-driven applications.
    • Encouraging Experimentation: The tool's support for multiple models encourages experimentation and optimization, helping users and developers discover the best-performing models for their specific tasks.

Installation

To install and set up LenseGt, follow these steps:

  1. Install LenseGt via pip: Open your terminal or command prompt and run the following command to install LenseGt:

    pip install lense
    
  2. Launch the Web Application: To start the LenseGt web application, run the following Python code:

    from lensegt import lense
    lense.start()
    

    This will launch the web interface where you can interact with the LenseGt tool.

  3. Access the Web Interface: Open your web browser and go to the URL provided in the terminal output (typically http://localhost:9017). This will open the LenseGt user interface.

  4. Configure Document Sources and Models:

    • Document Sources: Upload your documents (e.g., CSV, XLSX, DOC, TXT, PPT) through the web interface.
    • Response Generation Engines: Select the desired response generation engines (e.g., OpenAI, Azure OpenAI, Huggingface) from the configuration options.
  5. Use LenseGt:

    • For no-code users, you can start using the Q&A functionality directly through the web interface (screenshots: landing page, engine selection, result).

    • For developers, LenseGt can be used as a development toolkit; see step 6 below.

6. Use LenseGt as a Developer: To use LenseGt as a development toolkit for generating query responses and integrating them into your solutions, use its Python API. Below is a step-by-step guide:

6.1. Install LenseGt: Ensure you have LenseGt installed by running the following command:

pip install lense
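
Optionally, you can confirm the installation from Python before proceeding. The check below is a minimal sketch using only the standard library; it assumes the distribution name is lense, matching the pip command above:

# Sanity check that the "lense" distribution is installed (standard library only)
from importlib.metadata import version, PackageNotFoundError

try:
    print("lense version:", version("lense"))
except PackageNotFoundError:
    print("lense is not installed; run: pip install lense")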

6.2. Import LenseGt and Load a File: Use the following Python code to import LenseGt and load your document:

# Import LenseGt
from lensegt import lense

# Specify the filename
filename = "user_file.pdf"

# Load the file
lense.load_file(filename)
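
Since load_file expects a path to an existing document, it can help to validate the path first. The sketch below uses only the standard library plus the load_file call shown above; the filename is the same illustrative example:

from pathlib import Path
from lensegt import lense

filename = "user_file.pdf"

# Fail early with a clear message if the document is missing
if not Path(filename).is_file():
    raise FileNotFoundError(f"Document not found: {filename}")

lense.load_file(filename)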

6.3. Set Up Engine Configurations: Configure the response generation engine you wish to use. LenseGt supports multiple engines, including OpenAI and Azure OpenAI.

  • For OpenAI Connections:

    lense.engine(engine_type="OpenAI", api_key="sk-I4*****J1TbO1")
    
  • For Azure OpenAI Connections:

    lense.engine(engine_type="Azure OpenAI", 
                 api_key="17b156f", 
                 api_version="2017-*****-preview", 
                 azure_endpoint="https://covalenseopenaieastus2.openai.azure.com/")
    

6.4. Generate Query Responses: Once the engine is configured, you can generate query responses by sending queries to the LenseGt engine. For example, to get a summary of the file, use the following code:

response = lense.chat_query("summary of the file")
print(response)
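
chat_query can be called repeatedly against the loaded document. The sketch below simply loops over a few example questions (the questions are illustrative) and assumes a file has been loaded and an engine configured as in steps 6.2 and 6.3:

from lensegt import lense

# Ask several questions against the same loaded document
questions = [
    "summary of the file",
    "What are the key findings?",
    "List the main topics covered.",
]

for question in questions:
    response = lense.chat_query(question)
    print(f"Q: {question}\nA: {response}\n")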

By following these steps, developers can leverage LenseGt to integrate sophisticated document-based Q&A functionalities into their applications. Below is a consolidated code snippet for clarity:

# Import LenseGt
from lensegt import lense
 
# Specify the filename
filename = "abc.pdf"
 
# Load the file
lense.load_file(filename)
 
# Set up the engine configuration (use one of the following)
 
# For OpenAI connections
lense.engine(engine_type="OpenAI", api_key="sk-I4*****J1TbO1")
 
# For Azure OpenAI connections
lense.engine(engine_type="Azure OpenAI", 
             api_key="17b156f", 
             api_version="2017-*****-preview", 
             azure_endpoint="https://covalenseopenaieastus2.openai.azure.com/")
 
# Generate query responses
response = lense.chat_query("summary of the file")
print(response)

This approach allows developers to interact with various models, including proprietary and local models, to generate and use query responses within their solutions.
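
As one way of embedding this flow in an application, the helper below wraps the documented load_file, engine, and chat_query calls into a single function. It is an illustrative sketch: the function name and the environment-variable key are hypothetical, and only the LenseGt calls shown earlier are used.

import os
from lensegt import lense

def ask_document(path: str, question: str) -> str:
    """Load a document, configure the OpenAI engine, and answer one question.
    Illustrative wrapper; it only chains the LenseGt calls documented above."""
    lense.load_file(path)
    lense.engine(engine_type="OpenAI", api_key=os.environ["OPENAI_API_KEY"])  # example env var
    return lense.chat_query(question)

if __name__ == "__main__":
    print(ask_document("abc.pdf", "summary of the file"))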

By following these steps, you will have LenseGt installed and ready to use for both document-based Q&A and as a development toolkit.

Download files

Download the file for your platform. If you're not sure which to choose, see the guidance on installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

lense-0.0.75-py3-none-any.whl (4.9 MB, Python 3)

File details

Details for the file lense-0.0.75-py3-none-any.whl.

File metadata

  • Download URL: lense-0.0.75-py3-none-any.whl
  • Upload date:
  • Size: 4.9 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.11.9

File hashes

Hashes for lense-0.0.75-py3-none-any.whl:

  • SHA256: d5baf15247d5af85188b54830103c3296d428e4f4c8f1fef3913fc606fe1409c
  • MD5: 6c87568e2df2b78d0ffb60688a9ca919
  • BLAKE2b-256: e84d4a1dd5a6c87e333fb85e709d9960a4bd67258f92e6ac9cfdf13a9673bbf8

For more details on using hashes, see the Python packaging documentation.
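
If you want to check a downloaded wheel against the SHA256 value listed above, a standard-library sketch like the following works; it assumes the wheel has been downloaded into the current directory:

import hashlib

EXPECTED_SHA256 = "d5baf15247d5af85188b54830103c3296d428e4f4c8f1fef3913fc606fe1409c"

# Hash the downloaded wheel and compare against the published digest
with open("lense-0.0.75-py3-none-any.whl", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED_SHA256 else f"Mismatch: {digest}")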
