
# Solring

Solring is an easy-to-use tool for importing data from Solr to local storage. Its options let you build custom
queries and save the matching records from a running Solr server to a file.

## How it works

```
$ pip install solring==0.0.2

$ solring --help
usage: Solring.py [-h] [--version] --url URL [--output OUTPUT]
                  [--save_format {csv,txt}] --core CORE [--rows ROWS] [-fl FL]
                  [-q Q] [-fq FQ] [--score] [--qt QT]
                  {group} ...

optional arguments:
  -h, --help            show this help message and exit
  --version             show program's version number and exit
  --url URL, -u URL     The host:port of the running solr.
  --output OUTPUT, -o OUTPUT
                        Output file name.
  --save_format {csv,txt}, -sf {csv,txt}
                        File type of saved records. Default is txt.
  --core CORE, -c CORE  The core/collection in solr.
  --rows ROWS, -r ROWS  The number of rows returned. By default, Solr
                        returns 5 batches at a time to save records.
  -fl FL                Field list to retrieve. By default, Solr returns the
                        id field.
  -q Q                  Search query. By default, Solr returns all records.
  -fq FQ                Filter queries.
  --score               Learn score of each record in a score field.
  --qt QT               solr request handle to query on, default is '/select'.

group command:
  {group}               group help
```
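The `--rows` option controls how many records are fetched per request: rather than pulling everything at once, a tool like this pages through the result set in batches. A minimal sketch of such a paging loop, assuming Solr's standard `start`/`rows` offset parameters (the `fetch_all`/`fetch_page` names here are illustrative, not Solring's actual internals):

```python
# Sketch of start/rows paging against a Solr /select handler.
# fetch_page is any callable returning (docs, num_found) for a given
# offset; in practice it would issue an HTTP GET with
# start=<offset>&rows=<batch_size> against the core.

def fetch_all(fetch_page, batch_size=5):
    """Yield every document by advancing the start offset batch by batch."""
    offset = 0
    while True:
        docs, num_found = fetch_page(offset, batch_size)
        yield from docs
        offset += batch_size
        if not docs or offset >= num_found:
            break

# Demo with a fake page fetcher standing in for the HTTP call:
corpus = [{"id": str(i)} for i in range(12)]

def fake_page(offset, rows):
    return corpus[offset:offset + rows], len(corpus)

assert list(fetch_all(fake_page, batch_size=5)) == corpus
```

Each batch is written out as it arrives, so memory use stays bounded by the batch size rather than the full result set.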

The group command parameters:

```
$ solring group --help
usage: Solring.py group [-h] --group_fl GROUP_FL --group_agg
                        {mean,min,max,count} --group_column GROUP_COLUMN

optional arguments:
  -h, --help            show this help message and exit
  --group_fl GROUP_FL   The field(s) we want to use to group by.
  --group_agg {mean,min,max,count}
                        Aggregation functions to use in group by. Default is
                        count.
  --group_column GROUP_COLUMN
                        The field(s) where we want to aggregate.
```
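Conceptually, the `group` subcommand performs a group-by over the saved records: `--group_fl` names the key field(s), `--group_column` the field to aggregate, and `--group_agg` the function(s) to apply. A plain-Python sketch of that idea (the data and variable names are made up for illustration; Solring's actual implementation may differ):

```python
from collections import defaultdict

# One dict per saved record, as if read back from the output file.
records = [
    {"group_id": 1, "boat_size": 10},
    {"group_id": 1, "boat_size": 14},
    {"group_id": 2, "boat_size": 8},
    {"group_id": 2, "boat_size": 12},
]

# Group by group_id (--group_fl), collect boat_size (--group_column).
groups = defaultdict(list)
for rec in records:
    groups[rec["group_id"]].append(rec["boat_size"])

# Apply the requested aggregations (--group_agg min --group_agg max).
summary = {g: {"min": min(vals), "max": max(vals)} for g, vals in groups.items()}
assert summary[1] == {"min": 10, "max": 14}
assert summary[2] == {"min": 8, "max": 12}
```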

Create a custom query where the search term is 'boat', with two filter queries, retrieving only the id, title,
boat_size, and group_id fields:

```
$ solring --url http://127.0.0.1:8983 \
    -c boats \
    -fq "cabin:[6 TO *]" \
    -fq harbors:marmaris \
    -q boat \
    -fl id,title,boat_size,group_id

$ ls
output.txt
```
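The flags above map directly onto Solr's standard query parameters (`q`, `fq`, `fl`). Assuming the conventional `/solr/<core>/select` URL layout (an assumption about how the host:port and core are combined, not something shown in Solring's output), the equivalent raw request would look roughly like this:

```python
from urllib.parse import urlencode

# A list of pairs, because fq may repeat (one entry per filter query).
params = [
    ("q", "boat"),                         # -q
    ("fq", "cabin:[6 TO *]"),              # -fq
    ("fq", "harbors:marmaris"),            # -fq
    ("fl", "id,title,boat_size,group_id"), # -fl
    ("wt", "json"),                        # standard Solr response format
]
url = "http://127.0.0.1:8983/solr/boats/select?" + urlencode(params)
print(url)
```

`urlencode` takes care of escaping the range syntax in `cabin:[6 TO *]`, which is also why that filter query needs shell quoting on the command line.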

Let's now aggregate the above request with the group options. We can find the min and max boat_size within each group:

```
$ solring --url http://127.0.0.1:8983 \
    -c boats \
    -fq "cabin:[6 TO *]" \
    -fq harbors:marmaris \
    -q boat \
    -r 100 \
    -fl id,title,boat_size,group_id \
    -o groupby_boats \
    group --group_agg min --group_agg max --group_column boat_size --group_fl group_id

$ ls
groupby_boats.txt
```
## LICENSE

MIT
