Fink client

fink-client is a lightweight package to programmatically manipulate catalogs and alerts issued by the Fink broker. It is used in the context of two major Fink services: Livestream and Data Transfer.

Installation

fink-client requires Python 3.9 or later.

Install with pip

pip install fink-client --upgrade

Use or develop in a controlled environment

For development, we recommend the use of a virtual environment:

git clone https://github.com/astrolabsoftware/fink-client.git
cd fink-client
python -m venv .fc_env
source .fc_env/bin/activate
pip install -r requirements.txt
pip install .

Registration

In order to connect and poll alerts from Fink, you need to get your credentials:

  1. Subscribe to one or more Fink streams by filling this form.
  2. Once your request has been processed, you will receive your credentials. Register them on your machine by running:
fink_client_register -username <USERNAME> -group_id <GROUP_ID> ...

Livestream usage

Once you have your credentials, you are ready to poll streams! You can easily access the documentation using -h or --help:

fink_consumer -h
usage: fink_consumer [-h] [--display] [--display_statistics] [-limit LIMIT]
                     [--available_topics] [--save] [-outdir OUTDIR]
                     [-schema SCHEMA] [--dump_schema] [-start_at START_AT]

Kafka consumer to listen and archive Fink streams from the Livestream service

optional arguments:
  -h, --help            show this help message and exit
  --display             If specified, print on screen information about
                        incoming alert.
  --display_statistics  If specified, print on screen information about queues,
                        and exit.
  -limit LIMIT          If specified, download only `limit` alerts. Default is
                        None.
  --available_topics    If specified, print on screen information about
                        available topics.
  --save                If specified, save alert data on disk (Avro). See also
                        -outdir.
  -outdir OUTDIR        Folder to store incoming alerts if --save is set. It
                        must exist.
  -schema SCHEMA        Avro schema to decode the incoming alerts. Default is
                        None (version taken from each alert)
  --dump_schema         If specified, save the schema on disk (json file)
  -start_at START_AT    If specified, reset offsets to 0 (`earliest`) or empty
                        queue (`latest`).
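Putting the flags above together, a typical session might look like the following sketch. It assumes your credentials are registered and your subscription includes at least one topic; the `./alerts` output directory is our choice, and the `command -v` guard is only there so the snippet degrades gracefully when fink-client is not installed:

```shell
# Create a folder for saved alerts (-outdir must exist).
mkdir -p ./alerts

if command -v fink_consumer >/dev/null 2>&1; then
    # List the topics your credentials give access to.
    fink_consumer --available_topics

    # Poll up to 10 alerts from the earliest offset, print a summary
    # for each, and archive them as Avro files under ./alerts.
    fink_consumer --display -limit 10 --save -outdir ./alerts -start_at earliest
else
    echo "fink_consumer not found; install fink-client first"
fi
```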

You can also look at an alert on the disk:

fink_alert_viewer -h
usage: fink_alert_viewer [-h] [-filename FILENAME]

Display cutouts and lightcurve from a ZTF alert

optional arguments:
  -h, --help          show this help message and exit
  -filename FILENAME  Path to an alert data file (avro format)
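For instance, to inspect one alert previously archived with `fink_consumer --save` (the filename below is a placeholder; substitute one of your own Avro files):

```shell
if command -v fink_alert_viewer >/dev/null 2>&1; then
    # Show the cutouts and lightcurve stored in a saved alert.
    fink_alert_viewer -filename ./alerts/example_alert.avro
else
    echo "fink_alert_viewer not found; install fink-client first"
fi
```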

More information at docs/livestream.

Data Transfer usage

If you requested data using the Data Transfer service, you can easily poll your stream using:

usage: fink_datatransfer.py [-h] [-topic TOPIC] [-limit LIMIT] [-outdir OUTDIR] [-partitionby PARTITIONBY] [-batchsize BATCHSIZE] [-nconsumers NCONSUMERS]
                            [-maxtimeout MAXTIMEOUT] [-number_partitions NUMBER_PARTITIONS] [--restart_from_beginning] [--verbose]

Kafka consumer to listen and archive Fink streams from the data transfer service

optional arguments:
  -h, --help            show this help message and exit
  -topic TOPIC          Topic name for the stream that contains the data.
  -limit LIMIT          If specified, download only `limit` alerts from the stream. Default is None, that is download all alerts.
  -outdir OUTDIR        Folder to store incoming alerts. It will be created if it does not exist.
  -partitionby PARTITIONBY
                        Partition data by `time` (year=YYYY/month=MM/day=DD), or `finkclass` (finkclass=CLASS), or `tnsclass` (tnsclass=CLASS). `classId` is
                        also available for ELASTiCC data. Default is time.
  -batchsize BATCHSIZE  Maximum number of alerts within the `maxtimeout` (see conf). Default is 1000 alerts.
  -nconsumers NCONSUMERS
                        Number of parallel consumers to use. Default (-1) is the number of logical CPUs in the system.
  -maxtimeout MAXTIMEOUT
                        Overwrite the default timeout (in seconds) from user configuration. Default is None.
  -number_partitions NUMBER_PARTITIONS
                        Number of partitions for the topic in the remote Kafka cluster. Do not touch unless you know what you are doing. Default is 10
                        (Fink Kafka cluster)
  --restart_from_beginning
                        If specified, restart downloading from the 1st alert in the stream. Default is False.
  --verbose             If specified, print on screen information about the consumption.
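As a concrete illustration of the default `-partitionby time` option, downloaded alerts are laid out in `year=YYYY/month=MM/day=DD` subdirectories, per the help text above. The sketch below only mimics that tree for a sample date; the `./transfer` output root and the date are our own choices:

```shell
# Reproduce the time-partitioned layout for an alert dated 2024-03-07.
mkdir -p "./transfer/year=2024/month=03/day=07"

# The alert files for that day would land under this path.
ls -d ./transfer/year=2024/month=03/day=07
```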

More information at docs/datatransfer.
