adam
Automatic Differentiation for rigid-body-dynamics AlgorithMs
adam implements a collection of algorithms for computing rigid-body dynamics of floating-base robots, in mixed and body-fixed representations (see Traversaro's A Unified View of the Equations of Motion used for Control Design of Humanoid Robots), using Jax, CasADi, PyTorch, and NumPy.
adam employs the automatic differentiation capabilities of these frameworks to compute, if needed, gradients, Jacobians, and Hessians of rigid-body dynamics quantities. This approach enables the design of optimal control and reinforcement learning strategies in robotics.
adam is based on Roy Featherstone's Rigid Body Dynamics Algorithms.
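As a minimal illustration of what automatic differentiation provides here, the following toy sketch differentiates the end-effector height of a planar two-link arm with JAX. This is not adam's API; the kinematics and the link lengths (1.0 and 0.5) are made up for the example.

```python
import jax
import jax.numpy as jnp

# toy end-effector height of a planar 2-link arm (hypothetical link lengths 1.0 and 0.5)
def ee_height(q):
    return jnp.sin(q[0]) + 0.5 * jnp.sin(q[0] + q[1])

# automatic differentiation yields the exact gradient, no finite differences needed
grad_height = jax.grad(ee_height)
g = grad_height(jnp.array([0.0, 0.0]))
print(g)  # [1.5 0.5]
```

adam applies the same mechanism to full rigid-body dynamics quantities such as the mass matrix or frame Jacobians.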
🐍 Dependencies
Other requirements are:
- urdfdom-py Python package, which exposes the urdf_parser_py Python module
- jax
- casadi
- pytorch
- numpy
- array-api-compat
They will be installed during the installation step!
💾 Installation
The installation can be done either using the Python provided by apt (on Debian-based distros) or via conda (on Linux and macOS).
🐍 Installation with pip
Install python3, if it is not already installed; for example, on Ubuntu:
sudo apt install python3 python3-pip python3-venv
Optionally, create a virtual environment. For example:
pip install virtualenv
python3 -m venv your_virtual_env
source your_virtual_env/bin/activate
Inside the virtual environment, install the library from pip:
- Install the Jax interface:
  pip install adam-robotics[jax]
- Install the CasADi interface:
  pip install adam-robotics[casadi]
- Install the PyTorch interface:
  pip install adam-robotics[pytorch]
- Install ALL interfaces:
  pip install adam-robotics[all]
If you want the latest version:
pip install adam-robotics[selected-interface]@git+https://github.com/ami-iit/adam
or clone the repo and install:
git clone https://github.com/ami-iit/adam.git
cd adam
pip install .[selected-interface]
📦 Installation with conda
Installation from conda-forge package
- Install the CasADi interface:
  conda create -n adamenv -c conda-forge adam-robotics-casadi
- Install the Jax interface (warning: not available on Windows):
  conda create -n adamenv -c conda-forge adam-robotics-jax
- Install the PyTorch interface (warning: not available on Windows):
  conda create -n adamenv -c conda-forge adam-robotics-pytorch
- Install ALL interfaces (warning: not available on Windows):
  conda create -n adamenv -c conda-forge adam-robotics-all
[!NOTE] See also the conda JAX installation guide here
🔨 Installation from repo
Install in a conda environment the required dependencies:
- Jax interface dependencies:
  conda create -n adamenv -c conda-forge jax numpy lxml prettytable matplotlib urdfdom-py array-api-compat
- CasADi interface dependencies:
  conda create -n adamenv -c conda-forge casadi numpy lxml prettytable matplotlib urdfdom-py array-api-compat
- PyTorch interface dependencies:
  conda create -n adamenv -c conda-forge pytorch numpy lxml prettytable matplotlib urdfdom-py array-api-compat
- ALL interfaces dependencies:
  conda create -n adamenv -c conda-forge jax casadi pytorch numpy lxml prettytable matplotlib urdfdom-py array-api-compat
Activate the environment, clone the repo and install the library:
conda activate adamenv
git clone https://github.com/ami-iit/adam.git
cd adam
pip install --no-deps .
🚀 Usage
The following are small snippets showing the use of adam. More examples are coming!
Also have a look at the tests folder.
Jax interface
[!NOTE] See also the Jax installation guide here
import adam
from adam.jax import KinDynComputations
import icub_models
import numpy as np
import jax.numpy as jnp
from jax import jit, vmap
# use icub-models (https://github.com/robotology/icub-models) to retrieve the urdf
model_path = icub_models.get_model_file("iCubGazeboV2_5")
# The joint list
joints_name_list = [
    'torso_pitch', 'torso_roll', 'torso_yaw', 'l_shoulder_pitch',
    'l_shoulder_roll', 'l_shoulder_yaw', 'l_elbow', 'r_shoulder_pitch',
    'r_shoulder_roll', 'r_shoulder_yaw', 'r_elbow', 'l_hip_pitch', 'l_hip_roll',
    'l_hip_yaw', 'l_knee', 'l_ankle_pitch', 'l_ankle_roll', 'r_hip_pitch',
    'r_hip_roll', 'r_hip_yaw', 'r_knee', 'r_ankle_pitch', 'r_ankle_roll'
]
kinDyn = KinDynComputations(model_path, joints_name_list)
# choose the representation: if you want to use the body-fixed representation
kinDyn.set_frame_velocity_representation(adam.Representations.BODY_FIXED_REPRESENTATION)
# or, if you want to use the mixed representation (that is the default)
kinDyn.set_frame_velocity_representation(adam.Representations.MIXED_REPRESENTATION)
w_H_b = np.eye(4)
joints = np.ones(len(joints_name_list))
M = kinDyn.mass_matrix(w_H_b, joints)
print(M)
w_H_f = kinDyn.forward_kinematics('frame_name', w_H_b, joints)
# IMPORTANT! The execution of Jax interface functions can be slow! We suggest jitting them.
# For example:
def frame_forward_kinematics(w_H_b, joints):
    # a wrapper is needed since str is not a valid JAX type
    return kinDyn.forward_kinematics('frame_name', w_H_b, joints)

jitted_frame_fk = jit(frame_forward_kinematics)
w_H_f = jitted_frame_fk(w_H_b, joints)
# In the same way, the functions can also be vmapped
vmapped_frame_fk = vmap(frame_forward_kinematics, in_axes=(0, 0))
# which can also be jitted
jitted_vmapped_frame_fk = jit(vmapped_frame_fk)
# and called on a batch of data
joints_batch = jnp.tile(joints, (1024, 1))
w_H_b_batch = jnp.tile(w_H_b, (1024, 1, 1))
w_H_f_batch = jitted_vmapped_frame_fk(w_H_b_batch, joints_batch)
[!NOTE] The first call of a jitted function can be slow, since JAX needs to compile it. Subsequent calls will be faster!
CasADi interface
import casadi as cs
import adam
from adam.casadi import KinDynComputations
import icub_models
import numpy as np
# use icub-models (https://github.com/robotology/icub-models) to retrieve the urdf
model_path = icub_models.get_model_file("iCubGazeboV2_5")
# The joint list
joints_name_list = [
    'torso_pitch', 'torso_roll', 'torso_yaw', 'l_shoulder_pitch',
    'l_shoulder_roll', 'l_shoulder_yaw', 'l_elbow', 'r_shoulder_pitch',
    'r_shoulder_roll', 'r_shoulder_yaw', 'r_elbow', 'l_hip_pitch', 'l_hip_roll',
    'l_hip_yaw', 'l_knee', 'l_ankle_pitch', 'l_ankle_roll', 'r_hip_pitch',
    'r_hip_roll', 'r_hip_yaw', 'r_knee', 'r_ankle_pitch', 'r_ankle_roll'
]
kinDyn = KinDynComputations(model_path, joints_name_list)
# choose the representation: if you want to use the body-fixed representation
kinDyn.set_frame_velocity_representation(adam.Representations.BODY_FIXED_REPRESENTATION)
# or, if you want to use the mixed representation (that is the default)
kinDyn.set_frame_velocity_representation(adam.Representations.MIXED_REPRESENTATION)
w_H_b = np.eye(4)
joints = np.ones(len(joints_name_list))
M = kinDyn.mass_matrix_fun()
print(M(w_H_b, joints))
# If you want to use the symbolic version
w_H_b = cs.SX.eye(4)
joints = cs.SX.sym('joints', len(joints_name_list))
M = kinDyn.mass_matrix_fun()
print(M(w_H_b, joints))
# This is usable also with casadi.MX
w_H_b = cs.MX.eye(4)
joints = cs.MX.sym('joints', len(joints_name_list))
M = kinDyn.mass_matrix_fun()
print(M(w_H_b, joints))
PyTorch interface
import adam
from adam.pytorch import KinDynComputations
import icub_models
import numpy as np
# use icub-models (https://github.com/robotology/icub-models) to retrieve the urdf
model_path = icub_models.get_model_file("iCubGazeboV2_5")
# The joint list
joints_name_list = [
    'torso_pitch', 'torso_roll', 'torso_yaw', 'l_shoulder_pitch',
    'l_shoulder_roll', 'l_shoulder_yaw', 'l_elbow', 'r_shoulder_pitch',
    'r_shoulder_roll', 'r_shoulder_yaw', 'r_elbow', 'l_hip_pitch', 'l_hip_roll',
    'l_hip_yaw', 'l_knee', 'l_ankle_pitch', 'l_ankle_roll', 'r_hip_pitch',
    'r_hip_roll', 'r_hip_yaw', 'r_knee', 'r_ankle_pitch', 'r_ankle_roll'
]
kinDyn = KinDynComputations(model_path, joints_name_list)
# choose the representation: if you want to use the body-fixed representation
kinDyn.set_frame_velocity_representation(adam.Representations.BODY_FIXED_REPRESENTATION)
# or, if you want to use the mixed representation (that is the default)
kinDyn.set_frame_velocity_representation(adam.Representations.MIXED_REPRESENTATION)
w_H_b = np.eye(4)
joints = np.ones(len(joints_name_list))
M = kinDyn.mass_matrix(w_H_b, joints)
print(M)
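The PyTorch interface plays well with torch's reverse-mode autograd. A toy sketch of the mechanism (not adam's API; the two-link height function and link lengths are invented for illustration):

```python
import torch

# toy end-effector height of a planar 2-link arm (hypothetical link lengths 1.0 and 0.5)
q = torch.zeros(2, requires_grad=True)
height = torch.sin(q[0]) + 0.5 * torch.sin(q[0] + q[1])

# reverse-mode automatic differentiation fills q.grad with d(height)/dq
height.backward()
print(q.grad)  # tensor([1.5000, 0.5000])
```

Quantities computed through adam's PyTorch interface can be differentiated the same way when the inputs are tensors with requires_grad=True.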
PyTorch Batched interface
import adam
from adam.pytorch import KinDynComputationsBatch
import icub_models
import numpy as np
import torch
# use icub-models (https://github.com/robotology/icub-models) to retrieve the urdf
model_path = icub_models.get_model_file("iCubGazeboV2_5")
# The joint list
joints_name_list = [
    'torso_pitch', 'torso_roll', 'torso_yaw', 'l_shoulder_pitch',
    'l_shoulder_roll', 'l_shoulder_yaw', 'l_elbow', 'r_shoulder_pitch',
    'r_shoulder_roll', 'r_shoulder_yaw', 'r_elbow', 'l_hip_pitch', 'l_hip_roll',
    'l_hip_yaw', 'l_knee', 'l_ankle_pitch', 'l_ankle_roll', 'r_hip_pitch',
    'r_hip_roll', 'r_hip_yaw', 'r_knee', 'r_ankle_pitch', 'r_ankle_roll'
]
kinDyn = KinDynComputationsBatch(model_path, joints_name_list)
# choose the representation: if you want to use the body-fixed representation
kinDyn.set_frame_velocity_representation(adam.Representations.BODY_FIXED_REPRESENTATION)
# or, if you want to use the mixed representation (that is the default)
kinDyn.set_frame_velocity_representation(adam.Representations.MIXED_REPRESENTATION)
w_H_b = np.eye(4)
joints = np.ones(len(joints_name_list))
num_samples = 1024
w_H_b_batch = torch.tensor(np.tile(w_H_b, (num_samples, 1, 1)), dtype=torch.float32)
joints_batch = torch.tensor(np.tile(joints, (num_samples, 1)), dtype=torch.float32)
M = kinDyn.mass_matrix(w_H_b_batch, joints_batch)
w_H_f = kinDyn.forward_kinematics('frame_name', w_H_b_batch, joints_batch)
MuJoCo interface
adam supports loading models directly from MuJoCo MjModel objects. This is useful when working with MuJoCo simulations or models from robot_descriptions.
import mujoco
import numpy as np
from adam import Representations
from adam.numpy import KinDynComputations
# Load a MuJoCo model (e.g., from robot_descriptions)
from robot_descriptions.loaders.mujoco import load_robot_description
mj_model = load_robot_description("g1_mj_description")
# Create KinDynComputations directly from MuJoCo model
kinDyn = KinDynComputations.from_mujoco_model(mj_model)
# Set velocity representation (default is mixed)
kinDyn.set_frame_velocity_representation(Representations.MIXED_REPRESENTATION)
# Set gravity to match MuJoCo settings
kinDyn.g = np.concatenate([mj_model.opt.gravity, np.zeros(3)])
# Create MuJoCo data and set state
mj_data = mujoco.MjData(mj_model)
mj_data.qpos[:] = your_qpos # Your configuration
mj_data.qvel[:] = your_qvel # Your velocities
mujoco.mj_forward(mj_model, mj_data)
# Extract base transform from MuJoCo state (for floating-base robots)
from scipy.spatial.transform import Rotation as R
base_rot = R.from_quat(mj_data.qpos[3:7], scalar_first=True).as_matrix()
base_pos = mj_data.qpos[0:3]
w_H_b = np.eye(4)
w_H_b[:3, :3] = base_rot
w_H_b[:3, 3] = base_pos
# Joint positions (excluding the free joint).
# Make sure the joint serialization in MuJoCo and adam is the same
joints = mj_data.qpos[7:]
# Compute dynamics quantities
M = kinDyn.mass_matrix(w_H_b, joints)
com_pos = kinDyn.CoM_position(w_H_b, joints)
J = kinDyn.jacobian('frame_name', w_H_b, joints)
[!NOTE] When using from_mujoco_model, adam automatically extracts the joint names from the MuJoCo model. You can also specify use_mujoco_actuators=True to use actuator names instead of joint names.
[!WARNING] MuJoCo uses a different velocity representation for the floating base. The free-joint velocity in MuJoCo is $[{}^I \dot{p}_B, {}^B \omega_B]$, while the mixed representation uses $[{}^I \dot{p}_B, {}^I \omega_B]$. Make sure to handle this transformation when comparing with MuJoCo computations.
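The transformation in the warning amounts to rotating the body-frame angular velocity into the world frame with the base rotation. A minimal numpy sketch (the function name is ours, not adam's or MuJoCo's):

```python
import numpy as np

def mujoco_to_mixed_velocity(qvel_base, w_R_b):
    """Convert a MuJoCo free-joint velocity [I p_dot_B, B omega_B]
    to the mixed representation [I p_dot_B, I omega_B]."""
    v = np.asarray(qvel_base, dtype=float).copy()
    v[3:6] = w_R_b @ v[3:6]  # angular part: body frame -> world frame
    return v

# Example: base rotated 90 degrees about z, spinning about its body x axis
w_R_b = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
v_mixed = mujoco_to_mixed_velocity([0.0, 0.0, 0.0, 1.0, 0.0, 0.0], w_R_b)
print(v_mixed)  # [0. 0. 0. 0. 1. 0.]
```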
Inverse Kinematics
adam provides an interface for solving inverse kinematics problems using CasADi. The solver supports:
- position, orientation, and full pose constraints
- frame-to-frame constraints (ball, fixed)
- optional joint limit constraints
import casadi as cs
import numpy as np
import adam
from adam.casadi import KinDynComputations
from adam.casadi.inverse_kinematics import InverseKinematics, TargetType
# Load your robot model
import icub_models
model_path = icub_models.get_model_file("iCubGazeboV2_5")
# The joint list
joints_name_list = ...
# Create IK solver
ik = InverseKinematics(model_path, joints_name_list)
# Add a pose target on a frame (e.g., the left sole)
ik.add_target("l_sole", target_type=TargetType.POSE, as_soft_constraint=True, weight=1.0)
# Add a ball constraint between two frames (frame_1 and frame_2 are your frame names)
ik.add_ball_constraint(frame_1, frame_2, as_soft_constraint=True)
# Update the target to a desired pose
desired_position = np.array([0.3, 0.2, 1.0])
desired_orientation = np.eye(3)
ik.update_target("l_sole", (desired_position, desired_orientation))
# Solve
ik.solve()
# Retrieve solution
w_H_b_sol, q_sol = ik.get_solution()
print("Base pose:\n", w_H_b_sol)
print("Joint values:\n", q_sol)
🦸‍♂️ Contributing
adam is an open-source project. Contributions are very welcome!
Open an issue with your feature request or if you spot a bug. Then, you can also open a pull request! :rocket:
[!WARNING] REPOSITORY UNDER DEVELOPMENT! We cannot guarantee a stable API.