A tool to enable package management for HDL VIP or IP cores (Verilog, SystemVerilog, VHDL) using Python pip

pip-hdl


Most modern programming languages provide package managers to facilitate code reuse, dependency tracking, installation, etc. However, there is no standard equivalent for HDL languages such as SystemVerilog. Some projects try to solve this, like the well-known FuseSoc, by building such a system from scratch. But what if we solve it another way, by reusing an existing package manager?

pip-hdl enables package management for HDL (e.g. SystemVerilog) VIP or IP cores using Python pip. pip is simple and friendly: it allows you to create lightweight packages that are easy to publish to PyPI or your local index, and easy to install.

How to install pip-hdl

The simplest way is to use pip, as for any other Python package:

python -m pip install pip-hdl

To facilitate the packaging process, consider installing poetry as well:

python -m pip install poetry

All the examples and the guide below use it; however, packaging and publishing can certainly be done in other ways.

How to create a package

Run pip-hdl new and follow the interactive instructions. Refer to the examples to see what a result looks like - only a few files with 5-10 lines of code are required.

Flow in a nutshell:

  • Create a filelist (.f file) for your HDL component. It should contain all the information needed for compilation in an EDA tool: include directories, defines, and sources.
  • Put your component into a Python package (basically, a directory with __init__.py) and add some meta information to the package.
  • Build and publish!
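As a sketch, a minimal package built this way might look like the layout below (all names are hypothetical; the guide sections explain each piece):

```
my_fifo/                # Python package directory = sources root
    __init__.py         # carries the pip-hdl metadata (see guide below)
    filelist.f          # "entry point" filelist discovered by pip-hdl
    rtl/
        my_fifo.sv
pyproject.toml          # standard Python packaging metadata
```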

Check out the guide section below to get more details.

How to use a package

Run pip install <name> to get a package, then add pip-hdl inspect <name> <attr> calls to your scripts to get all the information required for compilation (paths, environment variables, etc.).

Check out the guide section below to get more details.

Guide

Create HDL component

What HDL code can be packed? Actually, any piece of code that can be described as a single compilation unit via a filelist.

Filelists, a.k.a. .f files, are a common way in the EDA world to describe compilation attributes for a single component. As the name suggests, a .f file is a list of source files to be compiled. Usually it also contains include directories and possibly defines. All of the above are described almost identically in most EDA tools, so filelists are fairly universal.

To make your HDL component pip and pip-hdl compatible, a few simple requirements have to be followed:

  • All sources and filelists have to be placed within the Python package directory to be packed (a directory with __init__.py). This directory will be the root directory for your sources.
  • There should be a filelist.f in the sources root. This is an "entry point" discoverable by pip-hdl to help you compile and simulate your component later.
  • All directories and sources mentioned within the filelist have to use the environment variable ${<PACKAGENAME>_SOURCES_ROOT}. In other words, all paths are absolute and based on the path to your sources root.
  • If your component depends on another one, do not mention any sources or filelists of that component in your filelist.f. Instead, correctly fill in the metadata of your package and use pip-hdl introspection capabilities to get all the required filelists in the correct order for your component and its dependencies.
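Following the rules above, a filelist.f for a hypothetical package my_fifo might look like this sketch (the directory layout, define, and file names are illustrative):

```
// filelist.f: every path goes through the sources-root environment variable,
// so the filelist works regardless of where pip installed the package
+incdir+${MY_FIFO_SOURCES_ROOT}/rtl/include
+define+MY_FIFO_ASSERTS_ON
${MY_FIFO_SOURCES_ROOT}/rtl/my_fifo.sv
```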

Add package metadata

To be successfully packed, any Python package has to carry some metadata. There are several ways to do it, but in general it ends up in filling in a pyproject.toml file next to the package. This file usually includes: package name and version, author name, list of dependencies, and some metadata for the packaging backend. Check out one of such files in the examples.
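For illustration, a minimal pyproject.toml could look like the sketch below. It assumes the poetry backend (used throughout this guide); the package name, version, and author are hypothetical:

```toml
# pyproject.toml - a minimal sketch, assuming the poetry backend
[tool.poetry]
name = "my_fifo"
version = "0.1.0"
description = "FIFO IP core packaged with pip-hdl"
authors = ["Jane Doe <jane@example.com>"]
packages = [{ include = "my_fifo" }]

[tool.poetry.dependencies]
python = "^3.8"
pip-hdl = "*"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```

Dependencies on other pip-hdl-powered packages go into the same dependencies section, just like any other Python dependency.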

pip-hdl also adds a bit on top of this to ease HDL component management. The additional metadata is stored inside the package and can be added with these two lines in __init__.py:

from pip_hdl import PackageMetaInfo
metainfo = PackageMetaInfo("package_name")

Note that pip-hdl always expects your package to have a metainfo variable of type PackageMetaInfo inside.

Pack and publish package

The Python packaging ecosystem is a bit messy, as are Python environments in general, so you have a lot of options. Check out this tutorial to get an idea of how a package can be published.

pip-hdl implicitly expects poetry to be used, and the publishing process with it is pretty straightforward:

poetry build
poetry publish

You can also check out the CI publish.yml script of this repository to see how pip-hdl itself is published via poetry.

Install package

Your package and its dependencies can be installed from an index:

python -m pip install <package_name>

Or from a .whl file:

python -m pip install <distribution_name>.whl

Or as part of the packages listed in requirements.txt:

python -m pip install -r requirements.txt

Get package metadata

pip-hdl allows you to get package attributes printed via the inspect command. The output can be used within your Makefile, bash, or other scripts.

$ pip-hdl inspect -h
usage: pip-hdl inspect [-h] OBJ ATTR

available attributes for inspection:
    filelist              - show absolute path to filelist
    sources_root          - show absolute path to sources root
    sources_var           - show environment variable to setup sources root (in NAME=VAL format)
    all_filelists         - show all filelists in the dependency-resolved order
    all_filelists_as_args - show all filelists as above, but format them as EDA arguments (with -f)
    all_sources_roots     - show absolute paths to all sources directories
    all_sources_vars      - show all environment variables for all sources
    dependency_graph      - dump dependency graph as an image (graphviz required)

positional arguments:
  OBJ         object for inspection: name of pip-hdl-powered package or requirements.txt with such packages
  ATTR        attribute to inspect (list of available attributes is above)

The options are quite self-descriptive, but you can also refer to an example Makefile.
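As an illustration, a compile target in a Makefile might wire the inspect output into a simulator call like this. The package name my_fifo is hypothetical, and vlog stands in for whatever compile command your EDA tool uses:

```makefile
# Export the environment variables that point at each sources root,
# then pass all filelists to the compiler in dependency-resolved order.
compile:
	export $$(pip-hdl inspect my_fifo all_sources_vars); \
	vlog $$(pip-hdl inspect my_fifo all_filelists_as_args)
```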

An alternative way to get the same metadata is to use Python:

import fizzbuzz_agent
print(fizzbuzz_agent.metainfo.filelist)
