
# feed_ursus

Command line tools to load CSV content into a Solr index for the UCLA Digital Library's frontend, [Ursus](https://digital.library.ucla.edu/), and the Sinai Manuscripts Digital Library.

## Using feed_ursus

For basic use, you can install feed_ursus as a system-wide command directly from PyPI, without having to clone the repository first.

### Installation

#### Installing with uv

We recommend installing with uv. On macOS, you can install uv with Homebrew:

```
brew install uv
```

Then:

```
uv tool install feed_ursus
```

uv will install feed_ursus in its own virtualenv, but makes the command accessible from anywhere, so you don't need to activate the virtualenv yourself.

#### Installing with pipx

If you are already using pipx, you can use it instead of uv:

```
pipx install feed_ursus
```

### Use

Convert a CSV into a JSON document that follows the data model of an Ursus Solr index:

```
feed_ursus [path/to/your.csv]
```
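As a rough illustration of this conversion, the sketch below turns each CSV row into one Solr-style document. The column names, Solr field names, and mapping logic here are invented for illustration and are not feed_ursus's actual data model:

```python
import csv
import io
import json

# Hypothetical column-to-field mapping; feed_ursus's real mappings
# (see the Mappers section) are more extensive.
FIELD_MAP = {
    "Item ARK": "ark_ssi",
    "Title": "title_tesim",
}

def rows_to_solr_docs(csv_text: str) -> list[dict]:
    """Convert CSV text into a list of Solr-style documents, one per row."""
    reader = csv.DictReader(io.StringIO(csv_text))
    docs = []
    for row in reader:
        # Keep only mapped columns, renamed to their Solr field names.
        doc = {field: row[col] for col, field in FIELD_MAP.items() if col in row}
        docs.append(doc)
    return docs

csv_text = "Item ARK,Title\nark:/21198/z1abc,Example Work\n"
print(json.dumps(rows_to_solr_docs(csv_text), indent=2))
```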


This repo includes a `docker-compose.yml` file that will run local instances of Solr and Ursus for use in testing this script. To use them, first install [Docker](https://docs.docker.com/install/) and [Docker Compose](https://docs.docker.com/compose/install/). Then run:

```
docker-compose up --detach
docker-compose run web bundle exec rails db:setup
```


It might take a minute or so for Solr to get up and running, at which point you should be able to see your new site at http://localhost:3000. Ursus will be empty, because you haven't loaded any data yet.

To load data from a CSV:

```
feed_ursus --solr_url=http://localhost:8983/solr/californica --mapping=dlp load [path/to/your.csv]
```


### Mappers

Different metadata mappings are included for general Digital Library use (`--mapping=dlp`) and for the Sinai Manuscripts Digital Library (`--mapping=sinai`). The default is `dlp`. Note that `sinai` is not guaranteed to be up to date, as the Sinai project is using a forked version at https://github.com/uclalibrary/feed_sinai.
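To sketch what selecting a mapping does, the example below applies two invented field maps to the same CSV row. The mapper names mirror the CLI flag values, but the column and Solr field names are hypothetical, not feed_ursus's actual mappings:

```python
# Hypothetical mapping tables keyed by the --mapping flag values.
MAPPINGS = {
    "dlp": {"Title": "title_tesim", "Item ARK": "ark_ssi"},
    "sinai": {"Title": "title_display", "Item ARK": "ark_ssi"},
}

def map_row(row: dict, mapping: str = "dlp") -> dict:
    """Apply the named field mapping to one CSV row."""
    field_map = MAPPINGS[mapping]  # raises KeyError for unknown mapping names
    return {field: row[col] for col, field in field_map.items() if col in row}

row = {"Item ARK": "ark:/21198/z1abc", "Title": "Example Work"}
print(map_row(row))                    # default "dlp" mapping
print(map_row(row, mapping="sinai"))   # same row, different field names
```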

## Developing feed_ursus

### Installing

For development, clone the repository and use uv to set up the virtualenv:

```
git clone git@github.com:UCLALibrary/feed_ursus.git
cd feed_ursus
uv sync
```


Then, to activate the virtualenv:

```
source .venv/bin/activate
```


The following commands assume the virtualenv is active. Alternatively, you can prefix commands with `uv run`, e.g. `uv run feed_ursus [path/to/your.csv]`.

### Using the development version

```
feed_ursus --solr_url http://localhost:8983/solr/californica load [path/to/your.csv]
```


### Running the tests

Tests are written for [pytest](https://docs.pytest.org/en/latest/):

```
pytest
```


### Running the formatter and linters

ruff (formatter and linter) will run in check mode in CI, so make sure you run it before committing:

```
ruff format .
ruff check --fix
```


mypy (static type checker):

```
mypy
```


### VSCode Debugger Configuration

To debug with VSCode, the Python environment has to be created within the project directory.

TODO: update this section for uv. uv seems more predictable overall, so it's probably easier? Just a matter of `rm -rf .venv && uv sync`?

If it exists, remove the existing setup and install in the project directory:

- `poetry env list`
- `poetry env remove <name of environment you want to delete>`
- `poetry config virtualenvs.in-project true`
- `poetry install`

Add an appropriate `.vscode/launch.json` (this assumes you have the Python debugger extension installed):

```json
{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Python: Run the feed_ursus module",
            "type": "debugpy",
            "request": "launch",
            "cwd": "${workspaceFolder}",
            "console": "integratedTerminal",
            "module": "feed_ursus.feed_ursus",
            "justMyCode": true
        }
    ]
}
```


## Caveats

### IIIF Manifests

When importing a work, the script always assumes that a IIIF manifest exists at https://iiif.library.ucla.edu/[ark]/manifest, where [ark] is the URL-encoded Archival Resource Key of the work. This link will work as long as a manifest has been pushed to that location by importing the work into [Fester](https://github.com/UCLALibrary/fester) or [Californica](https://github.com/UCLALibrary/californica). If you haven't done one of those, the link will fail and the image won't be visible, but the metadata will still import and be visible. A manifest can then be created and pushed to the expected location without re-running feed_ursus.
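A minimal sketch of that URL rule, assuming standard percent-encoding of the ARK (the exact encoding feed_ursus applies may differ):

```python
from urllib.parse import quote

IIIF_BASE = "https://iiif.library.ucla.edu"

def manifest_url(ark: str) -> str:
    """Build the IIIF manifest URL assumed for a work's ARK.

    The ARK is URL-encoded with no characters left "safe", so ":" and
    "/" are escaped before being interpolated into the path.
    """
    return f"{IIIF_BASE}/{quote(ark, safe='')}/manifest"

print(manifest_url("ark:/21198/z1abc123"))
# https://iiif.library.ucla.edu/ark%3A%2F21198%2Fz1abc123/manifest
```

The ARK value here is a made-up example; substitute a real Archival Resource Key from your CSV.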
