
Tending your Elasticsearch indices

Project description

NOTE: This package is a fork of Elasticsearch Curator without the CLI functionality (and without a dependency on Click). Only the API is exposed, for use as a library.

Curator API

Have indices in Elasticsearch? This is the library for you!

Just as a museum curator manages the exhibits and collections on display, Elasticsearch Curator helps you curate, or manage, your indices.

Build Status

Build-status badges for the Master and 5.x branches, plus the PyPI package badge, appear here in the rendered README.

Curator API Documentation

Curator ships with both an API and a wrapper script (which is actually defined as an entry point). The API allows you to write your own scripts to accomplish similar goals, or even new and different things, using the Curator API together with the Elasticsearch Python API.
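
The sketch below shows the general shape of such a script: build a client with the Elasticsearch Python API, select indices with a Curator IndexList, narrow that list with filters, and hand it to an action object. It assumes this fork is still imported as curator; the 'logstash-' prefix, the date format, and the 30-day cutoff are illustrative values, not defaults.

    import elasticsearch
    import curator

    # Connect with the Elasticsearch Python API (localhost by default).
    client = elasticsearch.Elasticsearch()

    # Gather all indices the client can see.
    ilo = curator.IndexList(client)

    # Keep only indices whose names start with 'logstash-' ...
    ilo.filter_by_regex(kind='prefix', value='logstash-')

    # ... and, of those, only the ones older than 30 days by the
    # timestamp embedded in the index name.
    ilo.filter_by_age(source='name', direction='older',
                      timestring='%Y.%m.%d', unit='days', unit_count=30)

    # Destructive: this deletes every index remaining in the list.
    curator.DeleteIndices(ilo).do_action()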

Getting Started

See the Installation guide and the command-line usage guide.

Running curator --help will also show usage information.
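
If you only need the library described here, installation is typically a plain pip install elasticsearch-curator-api, using the distribution name shown on this page.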

Contributing

  • fork the repo

  • make changes in your fork

  • add tests to cover your changes (if necessary)

  • run tests

  • sign the CLA

  • send a pull request!

To run from source, use the run_curator.py script in the root directory of the project.

Running Tests

To run the test suite, just run python setup.py test.

When changing code, contributing new code, or fixing a bug, please make sure you include tests in your PR (or mark it as without tests so that someone else can pick it up to add the tests). When fixing a bug, please make sure the test actually tests the bug: it should fail without the code changes and pass after they’re applied (it can still be one commit, of course).

The tests will try to connect to your local Elasticsearch instance and run integration tests against it. This will delete all the data stored there! You can use the environment variable TEST_ES_SERVER to point to a different instance (for example, ‘otherhost:9203’).
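
For example, running TEST_ES_SERVER=otherhost:9203 python setup.py test in a POSIX shell would exercise the suite against that host instead of the local instance; the host and port are the illustrative values from above.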

Origins

Curator was first called clearESindices.py [1] and was almost immediately renamed to logstash_index_cleaner.py [1]. After a time it was migrated under the logstash repository (https://github.com/elastic/logstash) as expire_logs. Soon thereafter, Jordan Sissel was hired by Elasticsearch, as was the original author of this tool. It became Elasticsearch Curator after that and is now hosted at <https://github.com/elastic/curator>.

[1] <https://logstash.jira.com/browse/LOGSTASH-211>

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

elasticsearch-curator-api-5.8.3.tar.gz (195.5 kB)

Uploaded Source

Built Distribution

elasticsearch_curator_api-5.8.3-py2.py3-none-any.whl (80.6 kB)

Uploaded Python 2 Python 3

File details

Details for the file elasticsearch-curator-api-5.8.3.tar.gz.

File metadata

  • Download URL: elasticsearch-curator-api-5.8.3.tar.gz
  • Upload date:
  • Size: 195.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/45.0.0 requests-toolbelt/0.9.1 tqdm/4.41.1 CPython/3.8.0

File hashes

Hashes for elasticsearch-curator-api-5.8.3.tar.gz

  • SHA256: bae2b13f5245e24a0d89529954c03cb97bd8e486f9ed8abf6c6eac5b6a6df7cd
  • MD5: 61e9e2f1edfd4fb11504499fd96811b1
  • BLAKE2b-256: 32e53e7f29f806df79e2650e65c711044e9a6277fa008e6fce551c4ba977083d

See more details on using hashes here.
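
As a quick check of a downloaded file against the digests above, a minimal Python sketch such as the following recomputes the SHA256 of the source distribution and compares it with the published value; it assumes the file sits in the current directory under its published name.

    import hashlib

    # Path to the downloaded sdist (assumed to be in the current directory).
    path = "elasticsearch-curator-api-5.8.3.tar.gz"

    # SHA256 digest published on this page.
    expected = "bae2b13f5245e24a0d89529954c03cb97bd8e486f9ed8abf6c6eac5b6a6df7cd"

    with open(path, "rb") as f:
        actual = hashlib.sha256(f.read()).hexdigest()

    print("OK" if actual == expected else "MISMATCH")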

File details

Details for the file elasticsearch_curator_api-5.8.3-py2.py3-none-any.whl.

File metadata

  • Download URL: elasticsearch_curator_api-5.8.3-py2.py3-none-any.whl
  • Upload date:
  • Size: 80.6 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/45.0.0 requests-toolbelt/0.9.1 tqdm/4.41.1 CPython/3.8.0

File hashes

Hashes for elasticsearch_curator_api-5.8.3-py2.py3-none-any.whl

  • SHA256: 365d308dc86fed85838e4f2b4b10eac1efdf1de6e70d0ac7cf5b1ba0ab2c7757
  • MD5: e347cdd7eda0aa8506f58db767999cde
  • BLAKE2b-256: db8479b8ac43822db9cb889c36a95f3a79ea9702f8d2a782a10322099ea7e030

See more details on using hashes here.
