
# feed_ursus

Command-line tools to load CSV content into a Solr index for the UCLA Digital Library's frontend, Ursus (https://digital.library.ucla.edu/), and the Sinai Manuscripts Digital Library.

## Using feed_ursus

For basic use, you can install feed_ursus as a system-wide command directly from PyPI, without first cloning the repository.

### Installation

#### Installing with uv

We recommend installing with uv. On macOS, you can install uv with Homebrew:

```
brew install uv
```

Then:

```
uv tool install feed_ursus
```

uv will install feed_ursus in its own virtualenv, but makes the command accessible from anywhere, so you don't need to activate the virtualenv yourself.

#### Installing with pipx

If you are already using pipx, you can use it instead of uv:

```
pipx install feed_ursus
```

### Use

Convert a CSV into a JSON document that follows the data model of an Ursus Solr index:

```
feed_ursus [path/to/your.csv]
```
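
Conceptually, each CSV row becomes one Solr document. Here is a rough sketch of that transformation using Python's `csv` module; the column names and Solr field names below are illustrative placeholders, not feed_ursus's actual mapping:

```python
import csv
import io
import json

# A tiny inline CSV standing in for real input; column names are illustrative.
CSV_TEXT = """\
Item ARK,Title
ark:/21198/z1001,First work
ark:/21198/z1002,Second work
"""

def rows_to_solr_docs(csv_text: str) -> list[dict]:
    """Turn each CSV row into a flat dict, the general shape of a Solr document."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        {"id": row["Item ARK"], "title_tesim": [row["Title"]]}
        for row in reader
    ]

docs = rows_to_solr_docs(CSV_TEXT)
print(json.dumps(docs, indent=2))
```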


This repo includes a docker-compose.yml file that runs local instances of Solr and Ursus for use in testing this script. To use them, first install [docker](https://docs.docker.com/install/) and [docker compose](https://docs.docker.com/compose/install/). Then run:

```
docker-compose up --detach
docker-compose run web bundle exec rails db:setup
```


It might take a minute or so for Solr to get up and running, at which point you should be able to see your new site at http://localhost:3000. Ursus will be empty, because you haven't loaded any data yet.
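
If you would rather script the wait than watch the logs, you can poll Solr's standard ping handler until it answers. This is just a sketch; the URL assumes the `californica` core used by the load command below:

```python
import time
import urllib.error
import urllib.request

def wait_for_solr(ping_url: str, timeout: float = 120.0, interval: float = 2.0) -> bool:
    """Poll a Solr ping URL until it returns HTTP 200 or the timeout elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(ping_url) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, ConnectionError):
            pass  # Solr not up yet; try again after a short pause.
        time.sleep(interval)
    return False

# Example (with the docker-compose stack running):
# wait_for_solr("http://localhost:8983/solr/californica/admin/ping")
```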

To load data from a CSV:

```
feed_ursus --solr_url=http://localhost:8983/solr/californica --mapping=dlp load [path/to/your.csv]
```
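
Under the hood, loading boils down to sending JSON documents to Solr's update handler. Here is a minimal sketch of how such a request could be built with the standard library; the helper and the sample document are hypothetical, not feed_ursus's actual code:

```python
import json
import urllib.request

def build_update_request(solr_url: str, docs: list[dict]) -> urllib.request.Request:
    """Build a commit-on-POST request against Solr's standard JSON update handler."""
    return urllib.request.Request(
        url=solr_url.rstrip("/") + "/update?commit=true",
        data=json.dumps(docs).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_update_request(
    "http://localhost:8983/solr/californica",
    [{"id": "ark:/21198/z1001", "title_tesim": ["First work"]}],
)
# urllib.request.urlopen(req)  # uncomment with a running Solr instance
```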


### Mappers

Different metadata mappings are included for general Digital Library use (`--mapping=dlp`) and for the Sinai Manuscripts Digital Library (`--mapping=sinai`). The default is "dlp". Note that "sinai" is not guaranteed to be up to date, because the Sinai project uses a forked version at https://github.com/uclalibrary/feed_sinai.
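
The flag essentially selects one row-to-document function. A hypothetical sketch of that dispatch; the mapper bodies and field names are placeholders, not the real mappings:

```python
from typing import Callable

def map_dlp(row: dict) -> dict:
    """Placeholder for the general Digital Library mapping."""
    return {"id": row.get("Item ARK", ""), "title_tesim": [row.get("Title", "")]}

def map_sinai(row: dict) -> dict:
    """Placeholder for the Sinai Manuscripts mapping."""
    return {"id": row.get("Item ARK", ""), "shelfmark_ssi": row.get("Shelfmark", "")}

MAPPINGS: dict[str, Callable[[dict], dict]] = {"dlp": map_dlp, "sinai": map_sinai}

def get_mapper(name: str = "dlp") -> Callable[[dict], dict]:
    """Look up a mapper by the --mapping flag value, defaulting to 'dlp'."""
    if name not in MAPPINGS:
        raise ValueError(f"unknown mapping: {name!r}")
    return MAPPINGS[name]
```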

## Developing feed_ursus

### Installing

For development, clone the repository and use uv to set up the virtualenv:

```
git clone git@github.com:UCLALibrary/feed_ursus.git
cd feed_ursus
uv sync
```


Then, to activate the virtualenv:

```
source .venv/bin/activate
```


The instructions below assume the virtualenv is active. Alternatively, prefix commands with `uv run`, e.g. `uv run feed_ursus [path/to/your.csv]`.

### Using the development version

```
feed_ursus --solr_url http://localhost:8983/solr/californica load [path/to/your.csv]
```


### Running the tests

Tests are written for [pytest](https://docs.pytest.org/en/latest/):

```
pytest
```


### Running the formatter and linters

ruff (formatter and linter) runs in check mode in CI, so make sure you run it before committing:

```
ruff format .
ruff check --fix
```


mypy (static type checker):

```
mypy
```


### VSCode Debugger Configuration

To debug with VSCode, the Python environment has to be created within the project directory.

TODO: update this section for uv. uv seems more predictable overall, so it's probably easier? Just a matter of `rm -rf .venv && uv sync`?

If it exists, remove the existing setup and install in the project directory:

- `poetry env list`
- `poetry env remove <name of environment you want to delete>`
- `poetry config virtualenvs.in-project true`
- `poetry install`

Add an appropriate `.vscode/launch.json`; this assumes you have the Python debugger extension installed.

```jsonc
{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Python: Run the feed_ursus module",
            "type": "debugpy",
            "request": "launch",
            "cwd": "${workspaceFolder}",
            "console": "integratedTerminal",
            "module": "feed_ursus.feed_ursus",
            "justMyCode": true
        }
    ]
}
```


## Caveats

### IIIF Manifests

When importing a work, the script always assumes that a IIIF manifest exists at https://iiif.library.ucla.edu/[ark]/manifest, where [ark] is the URL-encoded Archival Resource Key of the work. This link should work as long as a manifest has been pushed to that location by importing the work into [Fester](https://github.com/UCLALibrary/fester) or [Californica](https://github.com/UCLALibrary/californica). If you haven't done one of those, the link will fail and the image won't be visible, but the metadata will still import and be visible. A manifest can then be created and pushed to the expected location without re-running feed_ursus.
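
The expected manifest location can be computed directly from the ARK. A short sketch using the standard library; the ARK value is invented purely for illustration:

```python
from urllib.parse import quote

IIIF_BASE = "https://iiif.library.ucla.edu"

def manifest_url(ark: str) -> str:
    """Build the expected manifest URL, URL-encoding the ARK (including ':' and '/')."""
    return f"{IIIF_BASE}/{quote(ark, safe='')}/manifest"

# An invented ARK, for illustration only:
print(manifest_url("ark:/21198/z1abc123"))
# → https://iiif.library.ucla.edu/ark%3A%2F21198%2Fz1abc123/manifest
```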
