MMKit: Multimodal Kit
A toolkit for multimodal information processing
Installation
pip install mmkit
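To confirm the install, you can query the installed distribution's version from Python. This is a minimal sketch assuming the distribution name mmkit used above:

# Quick sanity check that the distribution is installed (assumes the name "mmkit").
from importlib.metadata import version

print(version("mmkit"))  # e.g. "0.0.1a0"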
An example
The toolkit builds a neural network in PyTorch for tabular data with categorical embeddings. Here is a minimal example:
from mmk.prediction import MultimodalPredictionModel

# Feature configuration for data/multimodal_data.csv.
input_features = [ ... ]          # input column names
categorical_features = [...]      # which of the inputs are categorical
output_feature = "..."            # column to predict
output_error = 0                  # error parameter passed to the model

all_features = input_features + [output_feature]

mmpm = MultimodalPredictionModel("data/multimodal_data.csv",
                                 all_features,
                                 categorical_features,
                                 output_feature,
                                 output_error)
mmpm.train()
acc = mmpm.get_last_accuracy()
print(acc)
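For context, the kind of model the example refers to, a PyTorch network for tabular data with categorical embeddings, typically maps each categorical column to a learned embedding vector and concatenates it with the numeric columns before a small feed-forward head. The sketch below illustrates that general pattern only; it is not mmkit's internal implementation, and all names (TabularNet, emb_dim, hidden) are illustrative.

import torch
import torch.nn as nn

class TabularNet(nn.Module):
    def __init__(self, cardinalities, num_numeric, emb_dim=8, hidden=64):
        super().__init__()
        # One embedding table per categorical column; `cardinalities` gives the
        # number of distinct values in each column.
        self.embeddings = nn.ModuleList(
            [nn.Embedding(card, emb_dim) for card in cardinalities]
        )
        in_dim = emb_dim * len(cardinalities) + num_numeric
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x_cat, x_num):
        # x_cat: (batch, n_categorical) integer codes; x_num: (batch, n_numeric) floats.
        embedded = [emb(x_cat[:, i]) for i, emb in enumerate(self.embeddings)]
        return self.mlp(torch.cat(embedded + [x_num], dim=1))

# Toy usage: two categorical columns (10 and 4 distinct values), three numeric columns.
model = TabularNet(cardinalities=[10, 4], num_numeric=3)
x_cat = torch.randint(0, 4, (16, 2))  # codes must stay below each column's cardinality
x_num = torch.randn(16, 3)
print(model(x_cat, x_num).shape)      # torch.Size([16, 1])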
License
The mmkit project is provided by Donghua Chen.