
CAproto-based pure-Python EPICS IOC for the Huber SMC 9300 motor controller

Project description

Pure-Python EPICS IOC for the Huber SMC motion controller

Why & how?

EPICS is a distributed control and state representation system for large-scale experimental and industrial facilities.

Huber is a vendor of electronic equipment whose products, in particular its SMC 9000 / SMC 9300 series motion controllers, often end up in large-scale experimental and industrial facilities.

It's a no-brainer, then, that the Huber controllers need an EPICS IOC ;-)

Exhub is written purely in Python and does not depend on C code. It uses caproto for all its EPICS communications infrastructure.

Since Exhub aims for testability on a number of levels, including "dry runs" without actual attached hardware, it uses the EDA motor model of EMMI to provide a simple, but fully functional, "simulation mode".

Installation

You can install Exhub directly from PyPI:

pip install exhub-ioc

Or you can download the sources from GitLab and install them locally:

git clone https://gitlab.com/kmc3-xpp/exhub
pip install ./exhub

If you feel inclined to develop on Exhub, you surely know that you can use the -e option to install an "editable" version instead:

pip install -e "./exhub[test]"

(The quotes keep shells like zsh from interpreting the brackets as a glob pattern.)

Operation

On live Huber controller hardware

The only thing Exhub absolutely requires for operation, and cannot provide a useful default for, is the connection to your Huber controller. It uses PyVISA for the connection, so simply running exhub-ioc with a PyVISA device address set should yield a usable IOC:

export EXHUB_VISA_DEVICE="TCPIP::10.0.0.7::1234::SOCKET"
exhub-ioc

This connects to the controller at IP address 10.0.0.7, port 1234 (apparently the standard port for Huber controllers). The output should look similar to this:

INFO:root:Prefix: SMC:Grumpy:
INFO:root:Connecting to TCPIP::10.0.0.7::1234::SOCKET via @py
INFO:root:Huber SMC Version: (1, 2, 26)
INFO:root:Starting IOC with 60 PVs, list following
INFO:root:  SMC:Grumpy:tthR
...
INFO:root:  SMC:Grumpy:thR
...
INFO:root:  SMC:Grumpy:mag
...
INFO:root:  SMC:Grumpy:z
...
INFO:caproto.ctx:Asyncio server starting up...
INFO:caproto.ctx:Listening on 0.0.0.0:5064
INFO:caproto.ctx:Server startup complete.

By default, Exhub will choose one of a few predefined EPICS prefixes for you (in this case "SMC:Grumpy:..."), referred to below as {prefix}, and will proceed to create process variables (PVs) for every axis it finds, referred to below as {axis}. If available, as in this case, it will use the axis aliases configured within the controller (here: "tthR", "thR", "mag" and "z") and build the PV suffixes from those, e.g. SMC:Grumpy:tthR.VAL etc.

Exhub uses the CAproto motor record, which exports (almost?) all EPICS motor record fields. CAproto itself, however, doesn't implement any functionality behind those fields; Exhub currently supports the following:

  • Primary motion and status fields in user coordinates:
    • {prefix}{axis}.VAL and .RBV position setpoint and read-back value
    • {prefix}{axis}.HLS and .LLS high and low limit switch indicators
    • {prefix}{axis}.DMOV the "done moving" indicator
    • {prefix}{axis}.STOP motor halt instruction
    • {prefix}{axis}.MSTA motor status bits as defined in the EPICS motor record
  • Dial coordinates .DRBV and .DVAL, various dial-coordinate flags,
  • Calibration fields .OFF, .DIR, .SET and associated fields
  • There is a .VELO implementation, but most Huber firmware doesn't support changing velocity on the fly.
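As a sketch of how the primary fields interact during a motion: a client writes the setpoint to .VAL, then polls .DMOV and .RBV until the move finishes. The helper below and its read/write callables are hypothetical stand-ins for whatever Channel Access client you use (caproto, pyepics, ...); they are not part of Exhub's API.

```python
import time

def move_and_wait(write, read, pv, target, tol=1e-3, poll=0.1, timeout=10.0):
    """Command a move via .VAL and poll .DMOV / .RBV until it completes.

    `write(name, value)` and `read(name)` stand in for any Channel Access
    client; the PV layout follows the field list above:
    {prefix}{axis}.VAL, .RBV and .DMOV.
    """
    write(f"{pv}.VAL", target)
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        # DMOV == 1 signals "done moving"; double-check the read-back value.
        if read(f"{pv}.DMOV") == 1 and abs(read(f"{pv}.RBV") - target) < tol:
            return read(f"{pv}.RBV")
        time.sleep(poll)
    raise TimeoutError(f"{pv} did not reach {target} within {timeout} s")
```

The same loop works unchanged against the simulation mode described below, which is one reason the mock motors are useful for integration tests.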

Support for motor fields is inherited from EMMI, where it is considered a work in progress. Check back periodically if your favourite field is not represented (yet), or browse the code to discover new possibilities.

The EMMI motor model is based around a state automaton, and as such, Exhub also exports suffixes that govern that specific component (for all suffixes, the prefix is {prefix}{axis}, like above):

  • .state the current state (one of INIT, IDLE, BUSY, ERROR, STOP, FAIL)
  • .error a string representation of the current error, if the current state is ERROR
  • .clear accepts a 1 to clear the current error and attempt re-entering IDLE state, where the motor is ready to accept new commands.
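The transitions implied by these suffixes can be illustrated with a toy automaton. The state names match the .state PV values above; the transition rules here are an illustration, not EMMI's actual implementation:

```python
class MotorState:
    """Toy version of the EMMI-style state automaton described above."""

    def __init__(self):
        self.state, self.error = "INIT", None

    def ready(self):
        # Initialization finished: the motor accepts commands in IDLE.
        if self.state == "INIT":
            self.state = "IDLE"

    def move(self):
        # New motion commands are only accepted while IDLE.
        if self.state != "IDLE":
            raise RuntimeError(f"cannot move in state {self.state}")
        self.state = "BUSY"

    def done(self):
        # A finished (or stopped) motion returns to IDLE.
        if self.state in ("BUSY", "STOP"):
            self.state = "IDLE"

    def fail(self, msg):
        # An error parks the motor in ERROR with a message (the .error PV).
        self.state, self.error = "ERROR", msg

    def clear(self):
        # Writing 1 to .clear attempts to re-enter IDLE.
        if self.state == "ERROR":
            self.state, self.error = "IDLE", None
```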

There is also a top-level PV, {prefix}update, which is simply a counter incremented on every run through Exhub's internal main update loop. It can be used to gauge how fast Exhub is updating position information, and whether it has hung.
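A simple liveness check based on that counter might look like this; `read_counter` is a hypothetical stand-in for a Channel Access read of the {prefix}update PV with whatever client you use:

```python
import time

def ioc_alive(read_counter, interval=1.0):
    """Heuristic liveness check: the update counter must advance.

    `read_counter` stands in for a Channel Access read of {prefix}update.
    Returns True if the counter changed within `interval` seconds.
    """
    first = read_counter()
    time.sleep(interval)
    return read_counter() != first
```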

Efforts are under way to achieve direct compatibility with the way SPEC uses EPICS motor records, but this is regarded as a "best effort" endeavor, not a top priority (...good luck though! :-p)

Exhub also reacts to the following environment variables at startup:

  • EXHUB_VISA_DEVICE: VISA address of the Huber device to connect to, typically in the format TCPIP::<ip>::1234::SOCKET. Change the port if your device listens on something other than the typical Huber standard port 1234.

  • EXHUB_VISA_RESOURCE_MANAGER: indicates the VISA resource manager to use, and defaults to "@py". Shouldn't need to be changed unless you really know what you're doing.

  • EXHUB_PREFIX: the EPICS prefix to use for the exported PVs. If you intend to use colon separation between the prefix and the variable suffixes, you need to include it here. For instance:

    export EXHUB_PREFIX="KMC3:MOTION:"
    

    will yield variables named "KMC3:MOTION:thR_VAL", "KMC3:MOTION:z_VAL" and so on.

    (Also note that the motor record suffix is separated from the rest of the PV name by an underscore "_"; this is not configurable.)

  • EXHUB_POLL_PERIOD: information from the Huber controller is polled at regular intervals (by default every 0.2 seconds). You can use this variable to adjust that interval.

  • EXHUB_LOG_LEVEL: can be one of DEBUG, INFO, WARNING, or ERROR and sets the Python logger level. Defaults to INFO, which prints useful information on startup (mostly about exported/auto-generated PVs) but is quiet during operation, unless something unexpected happens that the user should be made aware of.

  • EXHUB_IOC_TEST: if this is set to "yes" or 1, any real Huber hardware is ignored and a purely software-based mock-up set of motors is used instead. A number of additional variables can then also be used; see In simulation mode below.
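Collected in one place, the startup configuration described above could be read like this. The helper and its key names are illustrative, not Exhub's actual code; only the environment variable names and defaults are taken from the list above:

```python
import logging
import os

def read_config(env=os.environ):
    """Collect Exhub's environment-variable configuration
    (defaults as documented above)."""
    return {
        "device": env.get("EXHUB_VISA_DEVICE"),                  # no useful default
        "rman":   env.get("EXHUB_VISA_RESOURCE_MANAGER", "@py"),
        "prefix": env.get("EXHUB_PREFIX"),                       # Exhub picks one if unset
        "poll":   float(env.get("EXHUB_POLL_PERIOD", "0.2")),    # seconds
        "log":    getattr(logging, env.get("EXHUB_LOG_LEVEL", "INFO")),
        "test":   env.get("EXHUB_IOC_TEST", "").lower() in ("yes", "1"),
    }
```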

In simulation mode

Exhub can be forced to run in a functional, pure-software simulation mode. "Functional" here means that the motors do display "kind-of useful" behavior when instructed to move or stop, or when bumping against simulated limit switches. To do this, set the EXHUB_IOC_TEST environment variable to "yes":

$ export EXHUB_IOC_TEST=yes
$ exhub-ioc

This will result in a simple single-motor simulator:

INFO:exhub.application:Prefix: SMC:Doc:
INFO:exhub.motor:Mock-huber: ['mock:-1:+1']
INFO:emmi.api.caproto.motor:motor="mock" prefix=SMC:Doc:mock
INFO:exhub.application:Starting IOC with 5 PVs, list following
INFO:exhub.application:  SMC:Doc:update
INFO:exhub.application:  SMC:Doc:mock
INFO:exhub.application:  SMC:Doc:mock.state
INFO:exhub.application:  SMC:Doc:mock.clear
INFO:exhub.application:  SMC:Doc:mock.error
INFO:exhub.application:Application initialized
INFO:caproto.ctx:Asyncio server starting up...
INFO:caproto.ctx:Listening on 0.0.0.0:49331
INFO:caproto.ctx:Server startup complete.
INFO:emmi.api.caproto.motor:name=mock msg="Finishing motion" state=IDLE entering=True
INFO:exhub.motor:msg="Exhub mock set" vel=3.14

The prefix here has been randomly chosen as "SMC:Doc:...". See On live Huber controller hardware above for a full description of the environment variables -- most of them also apply in simulation mode.

To simulate additional axes/motors, or to adjust the simulated limit switches (by default set at -10/+10), use the EXHUB_MOCK_HUBER variable. It takes a string of the form "name1:low1:hi1;[name2:low2:hi2;[name3...]]", where:

  • "name..." is a string label by which to address the motor, akin to a Huber axis alias
  • "low..." is the value of a low-limit switch, and
  • "hi..." is the value of a high-limit switch.

Examples:

  • "EXHUB_MOCK_HUBER=mock:-10:10" reproduces the default behavior.

  • "EXHUB_MOCK_HUBER=x:-7:7;y:-8:8;z:-9:9" will simulate a device with three axes ("x", "y" and "z"), limited at +/-7, 8 and 9, respectively.
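A parser for this string format could look as follows. This is an illustration of the format described above, not Exhub's own parser, which may differ in details:

```python
def parse_mock_spec(spec):
    """Parse an EXHUB_MOCK_HUBER-style string "name1:low1:hi1;name2:low2:hi2;..."
    into a {name: (low, high)} mapping of simulated limit switches."""
    axes = {}
    for part in spec.strip(";").split(";"):
        name, low, high = part.split(":")
        axes[name] = (float(low), float(high))
    return axes
```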

Of course, technically Exhub will also accept PyVISA-sim-style simulations (set up via the EXHUB_VISA... variables). In practice, however, the Huber SMC protocol is very convoluted in its details and full of exceptions and special cases; with only light effort, we haven't succeeded in fully simulating it. If you're willing to invest a medium-to-heavy amount of work, we'd be very interested in the results.

The mock-Huber part of the simulation bypasses all the VISA connection and polling code (which otherwise runs asynchronously, using asyncio). However, it traverses all the rest -- i.e. the IOC and application logic -- in the regular manner. It can be used to test higher-level aspects of the IOC's behavior, and definitely serves for integration testing of components higher up in the architecture stack. In fact, this is one of the core motivations behind Exhub.

Deployment as a service

Installation via PyPI (pip install ...) should get you a usable application. As all configuration is done via environment variables instead of local files, deployment as a service (e.g. via systemd) should be fairly standard.
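For instance, a minimal systemd unit could look like this; the unit name, install path and device address are examples, adjust them to your setup:

```ini
# /etc/systemd/system/exhub-ioc.service -- hypothetical example unit
[Unit]
Description=Exhub EPICS IOC for the Huber SMC controller
After=network-online.target

[Service]
Environment=EXHUB_VISA_DEVICE=TCPIP::10.0.0.7::1234::SOCKET
Environment=EXHUB_PREFIX=SMC:Grumpy:
ExecStart=/usr/local/bin/exhub-ioc
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

After placing the file, `systemctl daemon-reload` followed by `systemctl enable --now exhub-ioc` starts the IOC and enables it at boot.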

However, the way we prefer to deploy Exhub is through a Podman (or Docker) container. For this, the root folder of Exhub's Git project on GitLab contains a Dockerfile which you can use right away. The Dockerfile doesn't pull Exhub from PyPI or GitLab; instead it lets you build the container from your own local copy of Exhub. For instance like this:

cd /tmp
git clone https://gitlab.com/kmc3-xpp/exhub
podman build -t exhub-ioc -f /tmp/exhub/Dockerfile -v /tmp/exhub:/exhub-src:z

Running Exhub from its container is straightforward, e.g. in its default simulation mode:

podman run -ti --rm \
    --name=exhub \
    --publish 5064-5065/tcp \
    --publish 5064-5065/udp \
    -e EXHUB_IOC_TEST=yes \
    localhost/exhub-ioc:latest

Ports 5064 and 5065 (TCP and UDP) are apparently needed by the EPICS Channel Access protocol for communication. If you are running only one IOC container on your host, the above should do. If you're trying to run several, every container started after the first will, of course, fail to bind to 5064 and/or 5065.

You have two options:

  • start all your IOC containers with --net=host, so they can at least collaboratively use the UDP broadcast port, or

  • try a dedicated EPICS caRepeater on the host, and have all the container IOCs broadcast their PVs through that repeater.

Other implementations

There already exists an EPICS driver package for the SMC-9300 series, and possibly for other models, on Huber's service website. If that fits the bill, go ahead and use it.

In any case you're welcome to try Exhub!

The main differences are:

  • Core functionality: Exhub focuses on controlling motion of the attached axes in the most direct and compatible way possible. As such, it exports only a small but useful subset of the EPICS motor record process variables. It tries to hide details of the underlying hardware if those aren't directly required for motion operation. Setting up the controller hardware to "behave properly" is not within the scope of the IOC; the user should do that in advance, by other means.

    The Huber package, in contrast, exposes internal architectural details specific to the Huber controllers into PVs. Motion control is done in a proprietary manner through specialized variables.

  • Application scope: Exhub is kept as general and application-agnostic as possible. It's an out-of-the-box EPICS IOC for Huber controllers; for instance, it autodetects and supports all axes exposed by a controller.

    The Huber package appears (as of September 2023) to have been designed and implemented for a specific application, and has undergone very limited efforts towards generalized extension.

  • Testability: One of the main goals in Exhub development was testability -- both of the "live" code on a controller (there are some Python unit tests which can be executed on live hardware), and of "dry runs" using a simulated motor model instead of a live controller. Exhub is used in CI/CD setups for experimental physics beamline endstations.

    The Huber package focuses solely on manual operation on live hardware.

  • Documentation: Last, but not least, we try to make Exhub more accessible by providing useful documentation.

Bugs

If you find any, you're free to keep them! Apparently some people eat those. :-p

Download files


Source Distribution

exhub_ioc-0.4.0.tar.gz (27.4 kB)


Built Distribution


exhub_ioc-0.4.0-py3-none-any.whl (21.1 kB)


File details

Details for the file exhub_ioc-0.4.0.tar.gz.

File metadata

  • Download URL: exhub_ioc-0.4.0.tar.gz
  • Size: 27.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.13.1

File hashes

Hashes for exhub_ioc-0.4.0.tar.gz
Algorithm Hash digest
SHA256 1812dcd70464d1b7ee866e902c7a0cbf1417aa6b91413b248856d9b94cdc1d1b
MD5 2266215081ea4bf40306c535db73c661
BLAKE2b-256 e3443908364a86bac0b7500f5800f9deb5d34fb2738918fb0f36213433669a9c


File details

Details for the file exhub_ioc-0.4.0-py3-none-any.whl.

File metadata

  • Download URL: exhub_ioc-0.4.0-py3-none-any.whl
  • Size: 21.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.13.1

File hashes

Hashes for exhub_ioc-0.4.0-py3-none-any.whl
Algorithm Hash digest
SHA256 d296021215df800b648fc7e1a2c203d8ff33b35649ad0cb060035997b6d89f89
MD5 5ec5a4cbe2ea4f0dee2efff23a2a16b1
BLAKE2b-256 24dd08233a9fefec349ef08c238efa17cdf603f1a141ca1ed8322cbff233c91b

