Vision module for the OpenMMLA platform.
Project description
OpenMMLA Vision
Video module of the mBox, an open multimodal learning analytics platform. For more details, please refer to the mBox system design.
Other modules of the mBox:
Uber Server Setup
Before setting up the video base, you need to set up a server hosting the InfluxDB, Redis, Mosquitto, and Nginx services. Please refer to the mbox-uber module.
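As a quick sanity check before continuing, you can verify that the uber-server services are reachable from the machine that will run the video base. The sketch below is not part of mbox-video; the hostname, token, and ports are placeholders (the ports shown are the services' defaults), so adjust them to your own deployment.

# Minimal reachability check for the uber-server services (host, token, and ports are placeholders/defaults).
import socket

import redis                                  # pip install redis
from influxdb_client import InfluxDBClient    # pip install influxdb-client

UBER_HOST = "uber-server.local"               # hypothetical hostname of your uber server

# Redis: PING should succeed (raises a ConnectionError if the server is unreachable).
assert redis.Redis(host=UBER_HOST, port=6379).ping()

# InfluxDB 2.x: ping() returns True when the instance is healthy.
assert InfluxDBClient(url=f"http://{UBER_HOST}:8086", token="<your-token>").ping()

# Mosquitto: a successful TCP connect on the MQTT port is enough for a smoke test.
socket.create_connection((UBER_HOST, 1883), timeout=5).close()

print("InfluxDB, Redis, and Mosquitto are all reachable.")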
Video Base & Server Setup
Downloading and setting up the mbox-video analysis system is accomplished in three steps:
(1) Clone the repository from GitHub to your local home directory.
(2) Install openmmla-vision.
(3) Set up folder structure.
- Clone the repository from GitHub

git clone https://github.com/ucph-ccs/mbox-video.git
- Install openmmla-vision in a conda environment
- Conda

# For Raspberry Pi
wget "https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-$(uname)-$(uname -m).sh"
bash Miniforge3-$(uname)-$(uname -m).sh

# For Mac and Linux
wget "https://repo.anaconda.com/miniconda/Miniconda3-latest-$(uname)-$(uname -m).sh"
bash Miniconda3-latest-$(uname)-$(uname -m).sh
- Video Base

conda create -c conda-forge -n video-base python=3.10.12 -y
conda activate video-base

# install openmmla-vision
pip install openmmla-vision

# or install openmmla-vision in development mode
pip install -e .
Set up folder structure

cd mbox-video
./reset.sh
Usage
After setting up the uber server and installing the dependencies, you can run real-time video analysis by following these instructions:
- Stream video from your camera(s) to the server (e.g., Raspberry Pi, MacBook, etc.)

If you have multiple cameras, each wired to its own machine, you can either run video streaming on each machine or stream all cameras to a centralized RTMP server by following the guidelines in raspi_rtmp_streaming.md.
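Once streaming is running, you can check from the analysis machine that a stream is actually reachable and decodable. The sketch below uses OpenCV's FFmpeg backend; the RTMP URL is a placeholder for whatever server address and stream key you configured.

# Quick check that an RTMP stream can be opened and decoded (the URL is a placeholder).
import cv2

stream_url = "rtmp://<rtmp-server-ip>/live/cam1"   # hypothetical stream address

cap = cv2.VideoCapture(stream_url)
if not cap.isOpened():
    raise RuntimeError(f"Could not open stream {stream_url}")

ok, frame = cap.read()   # grab a single frame to confirm frames are being decoded
print("Got frame:", ok, "shape:", None if not ok else frame.shape)
cap.release()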
- Calibrate the camera's intrinsic parameters
- Go to ./camera_calib/pattern, print out the chessboard image and stick it on a flat surface.
- Capture the chessboard image with your camera and calibrate it by running the ./calib_camera.sh bash script (a sketch of the underlying computation is shown below).
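For reference, this step boils down to standard OpenCV chessboard calibration. The sketch below is not the calib_camera.sh script itself, just an illustration under assumed parameters: a 9x6 inner-corner board and captures stored as ./camera_calib/images/*.jpg, both of which are placeholders you should adapt to your setup.

# Illustrative chessboard calibration with OpenCV (board size and image folder are assumptions).
import glob
import cv2
import numpy as np

pattern_size = (9, 6)   # inner corners per row/column of the printed chessboard
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("./camera_calib/images/*.jpg"):   # hypothetical capture folder
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Intrinsic matrix and distortion coefficients for this camera.
rms, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("Camera matrix:\n", camera_matrix)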
- Synchronize multiple cameras' coordinate systems

If more than one camera is used at the same time, you need to calculate the transformation matrix between the main camera and each alternative camera. This matrix converts the alternative camera's coordinate system into the main camera's coordinate system (a sketch of how such a transform can be composed is shown after the mode descriptions below).
- Centralized mode
If your cameras are streaming to a centralized RTMP server or serially wired to a single machine, then run:

# :param -d the number of cameras needed to sync, defaults to 2.
# :param -s the number of camera sync managers, defaults to 1.
# e.g.
./sync_camera.sh -d 2 -s 1
- Distributed mode

If your cameras are neither streamed to a centralized RTMP server nor serially wired to a single machine, run the camera detector on each camera-hosting machine and the camera sync manager on your synchronizing machine.

# on a camera-hosting machine, e.g. Raspberry Pi
./sync_camera.sh -d 1 -s 0

# on the synchronizing machine, e.g. MacBook
./sync_camera.sh -d 0 -s 1
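Conceptually, one common way to obtain this transform is to observe a shared reference marker from both cameras and chain the two pose estimates. The sketch below shows only that composition with plain NumPy; it is not the sync_camera.sh internals, and T_main_tag / T_alt_tag stand for placeholder 4x4 homogeneous poses (marker frame to camera frame) that a detector would provide.

# Composing the alternative-to-main camera transform from a shared marker observation.
import numpy as np

def alt_to_main(T_main_tag: np.ndarray, T_alt_tag: np.ndarray) -> np.ndarray:
    """Given the same marker's pose in both cameras (4x4 homogeneous matrices,
    marker frame -> camera frame), return the transform that maps points from
    the alternative camera's frame into the main camera's frame."""
    return T_main_tag @ np.linalg.inv(T_alt_tag)

# Example with placeholder poses: map a point seen by the alternative camera
# into the main camera's coordinate system.
T_main_tag = np.eye(4)
T_alt_tag = np.eye(4)
T_alt_tag[:3, 3] = [0.5, 0.0, 0.0]            # marker sits 0.5 m along the alt camera's x axis
T = alt_to_main(T_main_tag, T_alt_tag)

p_alt = np.array([0.0, 0.0, 1.0, 1.0])        # homogeneous point in the alt camera's frame
print("Point in main-camera frame:", (T @ p_alt)[:3])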
- Run the real-time video analysis system
- Centralized mode
If your cameras are streaming to a centralized RTMP server or serially wired to a single machine, then run:

# :param -b the number of video bases needed to run, defaults to 1.
# :param -s the number of video base synchronizers needed to run, defaults to 1.
# :param -v the number of visualizers needed to run, defaults to 1.
# :param -g whether to display the graphic window, defaults to true.
# :param -r whether to record the video frames as images for clip baking, defaults to false.
# :param -v whether to store the real-time visualizations with the visualizer, defaults to false.
# e.g.
./run.sh
- Distributed mode

If your cameras are neither streamed to a centralized RTMP server nor serially wired to a single machine, run a video base on each camera-hosting machine and run the synchronizer and visualizer on your synchronizing machine.

# on a camera-hosting machine, e.g. Raspberry Pi
./run.sh -b 1 -s 0 -v 0 -g false

# on the synchronizing machine, e.g. MacBook
./run.sh -b 0 -s 1 -v 1
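The uber server hosts a Mosquitto broker for real-time messaging, so while the system runs you can inspect that traffic from your own scripts with a generic MQTT subscriber like the sketch below. The broker host and the wide-open topic filter are placeholders; the actual topic names used by openmmla-vision are not documented here.

# Generic MQTT subscriber for inspecting real-time messages (host and topic are placeholders).
# Assumes paho-mqtt >= 2.0 (pip install "paho-mqtt>=2.0").
import paho.mqtt.client as mqtt

UBER_HOST = "uber-server.local"   # hypothetical uber-server hostname
TOPIC = "#"                       # subscribe to everything; narrow this once you know the topics

def on_message(client, userdata, msg):
    print(msg.topic, msg.payload[:200])   # print the topic and the first bytes of each payload

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_message = on_message
client.connect(UBER_HOST, 1883)
client.subscribe(TOPIC)
client.loop_forever()   # blocks; Ctrl-C to stop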
FAQ
Citation
If you use this code in your research, please cite the following paper:
@inproceedings{inproceedings,
author = {Li, Zaibei and Jensen, Martin and Nolte, Alexander and Spikol, Daniel},
year = {2024},
month = {03},
pages = {785-791},
title = {Field report for Platform mBox: Designing an Open MMLA Platform},
doi = {10.1145/3636555.3636872}
}
References
License
This project is licensed under the MIT License - see the LICENSE file for details.
File details
Details for the file openmmla_vision-0.1.0.post1.tar.gz
File metadata
- Download URL: openmmla_vision-0.1.0.post1.tar.gz
- Upload date:
- Size: 38.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.10.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | 0086eda9a99814122c9a76c86f7c9948f986280b3fff46dde6990a8a2c083fb7
MD5 | 1491fb765bd55f26df85a8b80c254b8d
BLAKE2b-256 | 4fcc07b50d3e0c5c3ad851ee47845f979798bceff05a129b2bbb77c4568cbb3b
File details
Details for the file openmmla_vision-0.1.0.post1-py3-none-any.whl
File metadata
- Download URL: openmmla_vision-0.1.0.post1-py3-none-any.whl
- Upload date:
- Size: 45.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.10.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | 67ef3704e11b6045fd9a2dc54074f5b91ab2d259da681b1f348e236c7bf3c756
MD5 | 015ee2acc5593e810f523e40d4dac6ac
BLAKE2b-256 | ae9bd62a67c0ddfa04265760b638a7f339bf711f07a83a886c2e7ce10ee2e708