
llama-index-vector-stores-mongodb 0.5.0

Project description

LlamaIndex Vector_Stores Integration: MongoDB

Setting up MongoDB Atlas as the Datastore Provider

MongoDB Atlas is a multi-cloud database service made by the same people that build MongoDB. Atlas simplifies deploying and managing your databases while offering the versatility you need to build resilient and performant global applications on the cloud providers of your choice.

You can perform semantic search on data in your Atlas cluster running MongoDB v6.0.11, v7.0.2, or later using Atlas Vector Search. You can store vector embeddings for any kind of data along with other data in your collection on the Atlas cluster.

In this section, we provide detailed instructions to set up Atlas and run the tests.

Deploy a Cluster

Follow the Getting-Started documentation to create an account, deploy an Atlas cluster, and connect to a database.

Retrieve the URI used by Python to connect to the Cluster

Once you deploy, store the connection string in the environment variable MONGODB_URI. It will look something like the following. The username and password, if not provided, can be configured on the Database Access page under Security in the left panel.

export MONGODB_URI="mongodb+srv://<username>:<password>@cluster0.foo.mongodb.net/?retryWrites=true&w=majority"

There are a number of ways to navigate the Atlas UI. Keep your eye out for "Connect" and "driver".

On the left panel, click 'Database' under DEPLOYMENT. Click the Connect button that appears, then Drivers, and select Python. (Don't worry about the version shown; it refers to PyMongo, not Python.) Once the Connect window is open, you will see an instruction to pip install pymongo, along with a connection string. This is the URI that a pymongo.MongoClient uses to connect to the database.

Test the connection

Atlas provides a simple check. Once you have your URI and pymongo installed, try the following in a Python console.

import os

from pymongo.mongo_client import MongoClient

uri = os.environ["MONGODB_URI"]  # Connection string retrieved above
client = MongoClient(uri)  # Create a new client and connect to the server

try:
    # Send a ping to confirm a successful connection
    client.admin.command("ping")
    print("Pinged your deployment. You successfully connected to MongoDB!")
except Exception as e:
    print(e)

Troubleshooting

  • You can edit a Database's users and passwords on the 'Database Access' page, under Security.
  • Remember to add your IP address. (Try curl -4 ifconfig.co)

Create a Database and Collection

As mentioned, Vector Databases provide two functions. In addition to being the data store, they provide very efficient search based on natural-language queries. With Vector Search, one indexes and queries data with a powerful vector search algorithm that uses Hierarchical Navigable Small World (HNSW) graphs to find vector similarity.

The indexing runs beside the data as a separate, asynchronous service. The Search index monitors changes to the Collection that it applies to, so you need not upload the data first. We will create an empty collection now, which will simplify setup in the example notebook.

Back in the UI, navigate to the Database Deployments page by clicking Database on the left panel. Click the "Browse Collections" and then "+ Create Database" buttons. This will open a window where you choose Database and Collection names. (No additional preferences.) Remember these values, as they will be used as the environment variables MONGODB_DATABASE and MONGODB_COLLECTION.
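
If you prefer to create the database and collection from Python rather than the UI, a minimal sketch with PyMongo looks like the following (the database and collection names here are just the example values used elsewhere in this document):

import os

from pymongo import MongoClient

client = MongoClient(os.environ["MONGODB_URI"])
db = client["llama_index_test_db"]

# Explicitly create the empty collection; MongoDB would otherwise
# only create it lazily on the first insert.
if "llama_index_test_vectorstore" not in db.list_collection_names():
    db.create_collection("llama_index_test_vectorstore")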

Set Datastore Environment Variables

To establish a connection to the MongoDB Cluster, Database, and Collection, and to create a Vector Search Index, define the following environment variables. A quick check that the required ones have been set is shown after the tables below.

IMPORTANT: These values must be consistent between the Atlas setup and your Python environment(s).

Name                 Description         Example
MONGODB_URI          Connection String   mongodb+srv://<user>:<password>@llama-index.zeatahb.mongodb.net
MONGODB_DATABASE     Database name       llama_index_test_db
MONGODB_COLLECTION   Collection name     llama_index_test_vectorstore
MONGODB_INDEX        Search index name   vector_index

The following will be required to authenticate with OpenAI.

Name             Description
OPENAI_API_KEY   OpenAI token created at https://platform.openai.com/api-keys
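
To confirm that everything is set, a minimal sketch in Python, assuming the variable names above:

import os

required = (
    "MONGODB_URI",
    "MONGODB_DATABASE",
    "MONGODB_COLLECTION",
    "MONGODB_INDEX",
    "OPENAI_API_KEY",
)
for name in required:
    assert name in os.environ, f"{name} is not set"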

Create an Atlas Vector Search Index

The final step to configure MongoDB as the Datastore is to create a Vector Search Index. The procedure is described here.

Under Services on the left panel, choose Atlas Search > Create Search Index > Atlas Vector Search JSON Editor.

The integration expects an index definition like the following. To begin, use numDimensions: 1536, which matches the default dimension of OpenAI's text embedding models. You can experiment with these values later.

{
  "fields": [
    {
      "numDimensions": 1536,
      "path": "embedding",
      "similarity": "cosine",
      "type": "vector"
    }
  ]
}
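
The same index can also be created programmatically. Below is a minimal sketch using PyMongo's search index helpers; it assumes a recent PyMongo (roughly 4.7 or later, where the vectorSearch index type is supported) and reuses the example names from this document:

import os

from pymongo import MongoClient
from pymongo.operations import SearchIndexModel

client = MongoClient(os.environ["MONGODB_URI"])
collection = client["llama_index_test_db"]["llama_index_test_vectorstore"]

# Same definition as the JSON editor version above.
model = SearchIndexModel(
    definition={
        "fields": [
            {
                "numDimensions": 1536,
                "path": "embedding",
                "similarity": "cosine",
                "type": "vector",
            }
        ]
    },
    name="vector_index",
    type="vectorSearch",
)
collection.create_search_index(model=model)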

Running MongoDB Integration Tests

In addition to the Jupyter Notebook in examples/, a suite of integration tests is available to verify the MongoDB integration. The test suite requires the cluster to be up and running and the environment variables defined above to be set.
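
For reference, here is a minimal sketch of how this vector store is typically wired into a LlamaIndex pipeline, which the example notebook covers in more depth. The ./data directory is a placeholder, and the constructor parameter names (db_name, collection_name, vector_index_name) reflect current documentation, so check the version you have installed:

import os

from pymongo import MongoClient

from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.mongodb import MongoDBAtlasVectorSearch

client = MongoClient(os.environ["MONGODB_URI"])

vector_store = MongoDBAtlasVectorSearch(
    mongodb_client=client,
    db_name=os.environ["MONGODB_DATABASE"],
    collection_name=os.environ["MONGODB_COLLECTION"],
    vector_index_name=os.environ["MONGODB_INDEX"],
)

# Load documents, embed them with OpenAI (via OPENAI_API_KEY), and store
# the vectors in the Atlas collection.
documents = SimpleDirectoryReader("./data").load_data()
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

# Query via Atlas Vector Search.
print(index.as_query_engine().query("What do these documents contain?"))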

