
llama-index-readers-azcognitive-search integration

Project description

Azure Cognitive Search Loader

pip install llama-index-readers-azcognitive-search

The AzCognitiveSearchReader loader returns a set of texts corresponding to documents retrieved from a specific index of Azure Cognitive Search. The user initializes the loader with the service credentials (service name and key) and the index name.

Usage

Here's an example usage of the AzCognitiveSearchReader.

from llama_index.readers.azcognitive_search import AzCognitiveSearchReader

reader = AzCognitiveSearchReader(
    "<Azure_Cognitive_Search_NAME>",
    "<Azure_Cognitive_Search_KEY>",
    "<Index_name>",
)


documents = reader.load_data(
    query="<search_term>",
    content_field="<content_field_name>",
    filter="<azure_search_filter>",
)
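The `filter` argument accepts an Azure Cognitive Search OData `$filter` expression. As a minimal sketch, a hypothetical helper (not part of this package) for assembling a simple equality filter might look like this; `build_equality_filter` and its parameters are illustrative names:

```python
# Hypothetical helper showing how an OData $filter expression for
# Azure Cognitive Search can be assembled before passing it as the
# `filter` argument of load_data().
def build_equality_filter(field: str, value: str) -> str:
    # In OData, string literals are single-quoted; embedded single
    # quotes are escaped by doubling them.
    escaped = value.replace("'", "''")
    return f"{field} eq '{escaped}'"


print(build_equality_filter("category", "health"))  # category eq 'health'
print(build_equality_filter("author", "O'Brien"))  # author eq 'O''Brien'
```

The resulting string can be passed directly as `filter=` in the `load_data` call above.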

Usage in combination with LangChain

from llama_index.core import VectorStoreIndex
from langchain.chains.conversation.memory import ConversationBufferMemory
from langchain.agents import Tool, initialize_agent

from llama_index.readers.azcognitive_search import AzCognitiveSearchReader

az_loader = AzCognitiveSearchReader(
    COGNITIVE_SEARCH_SERVICE_NAME, COGNITIVE_SEARCH_KEY, INDEX_NAME
)

documents = az_loader.load_data(
    query="<search_term>", content_field="<content_field_name>"
)

index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()

tools = [
    Tool(
        name="Azure cognitive search index",
        func=lambda q: str(query_engine.query(q)),
        description="Useful for answering questions about the text stored in the Azure Cognitive Search index.",
    ),
]
memory = ConversationBufferMemory(memory_key="chat_history")
# `llm` must be a LangChain LLM configured beforehand.
agent_chain = initialize_agent(
    tools, llm, agent="zero-shot-react-description", memory=memory
)

result = agent_chain.run(input="How can I contact my health insurance?")

This loader is designed to load data into LlamaIndex.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

File details

Details for the file llama_index_readers_azcognitive_search-0.5.0.tar.gz.

File metadata

  • Download URL: llama_index_readers_azcognitive_search-0.5.0.tar.gz
  • Upload date:
  • Size: 4.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.9

File hashes

Hashes for llama_index_readers_azcognitive_search-0.5.0.tar.gz
Algorithm Hash digest
SHA256 7d9a4e445e1be2de4e102b7b23cd180441265e24f855f99452d64909021b3476
MD5 474c917e67b643e9983b87bdda4bf8d1
BLAKE2b-256 fffe650e19ece7bca77d6aea05083bcb0b03df3fc0fe332ca275b2f272267f2e

See more details on using hashes here.

File details

Details for the file llama_index_readers_azcognitive_search-0.5.0-py3-none-any.whl.

File metadata

  • Download URL: llama_index_readers_azcognitive_search-0.5.0-py3-none-any.whl
  • Upload date:
  • Size: 4.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.9

File hashes

Hashes for llama_index_readers_azcognitive_search-0.5.0-py3-none-any.whl
Algorithm Hash digest
SHA256 3480161581eb371bbc6bcb6ef05e068263fa16afaac44832450c93f7b9c9248e
MD5 4b45b5c2e36677bf989df30dc7594bd4
BLAKE2b-256 5c4f2a09a78fd7287a717c38682ff8b7018f80e1900eca46487a73aa2b427ba9

See more details on using hashes here.
