Project description
Vortex Fusion
To our knowledge, this is the first implementation of a joint Transformer + Mamba + LSTM architecture. The flow is: mamba -> transformer -> lstm, repeated in a loop. With more iteration on model design we may find an even better arrangement, but we believe this architecture is the future.
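The package does not document VortexFusion's internals, but the described loop can be sketched roughly as follows. All class names below are hypothetical, and a simple gated 1D convolution stands in for a real Mamba (state-space) block; this is an illustration of the flow, not the actual implementation.

```python
import torch
import torch.nn as nn

class SSMStandIn(nn.Module):
    """Placeholder for a Mamba-style block: causal gated 1D convolution."""
    def __init__(self, dim):
        super().__init__()
        self.conv = nn.Conv1d(dim, dim, kernel_size=3, padding=2)
        self.gate = nn.Linear(dim, dim)

    def forward(self, x):                          # x: (batch, seq, dim)
        h = self.conv(x.transpose(1, 2))[..., : x.size(1)].transpose(1, 2)
        return h * torch.sigmoid(self.gate(x))     # gated mixing

class VortexBlock(nn.Module):
    """One pass of the described flow: mamba -> transformer -> lstm."""
    def __init__(self, dim, heads=8):
        super().__init__()
        self.mamba = SSMStandIn(dim)
        self.attn = nn.TransformerEncoderLayer(dim, heads, batch_first=True)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)

    def forward(self, x):
        x = x + self.mamba(x)                      # 1) state-space mixing
        x = self.attn(x)                           # 2) attention mixing
        out, _ = self.lstm(x)                      # 3) recurrent mixing
        return x + out

class VortexSketch(nn.Module):
    """Embeds tokens, then runs the block `depth` times in a loop."""
    def __init__(self, num_tokens, dim, depth=2):
        super().__init__()
        self.embed = nn.Embedding(num_tokens, dim)
        self.blocks = nn.ModuleList(VortexBlock(dim) for _ in range(depth))
        self.to_logits = nn.Linear(dim, num_tokens)

    def forward(self, tokens):                     # tokens: (batch, seq)
        x = self.embed(tokens)
        for block in self.blocks:                  # the loop from the description
            x = block(x)
        return self.to_logits(x)
```

A forward pass with token IDs of shape `(batch, seq)` returns logits of shape `(batch, seq, num_tokens)`; the real `VortexFusion` module may differ in its exact interface.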
Install
$ pip3 install -U vortex-fusion
Usage
import torch
from vortex_fusion import VortexFusion
# Random token IDs: batch size 1, sequence length 10, vocabulary of 10,000
x = torch.randint(0, 10000, (1, 10))
# Create an instance of the VortexFusion model with dimension 512
model = VortexFusion(dim=512)
# Pass the input tensor through the model to get the output
output = model(x)
# Print the shape of the output tensor
print(output.shape)
License
MIT
Citation
Please cite Swarms in your paper or project if you found it beneficial in any way! Appreciate you.
@misc{swarms,
  author = {Gomez, Kye},
  title = {{Swarms: The Multi-Agent Collaboration Framework}},
  howpublished = {\url{https://github.com/kyegomez/swarms}},
  year = {2023},
  note = {Accessed: Date}
}
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
vortex_fusion-0.0.1.tar.gz (4.4 kB)
Built Distribution
vortex_fusion-0.0.1-py3-none-any.whl

Hashes for vortex_fusion-0.0.1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | c96b34f5d5a2b12467b7c0ecc236e4c74ad2db31f18db5e3129b1f03ce7e06a9
MD5 | d81cccd0e9e43a55c95479778a2163c2
BLAKE2b-256 | a4b7ad03791e157df1c53804020411f831a1d335b89886ce709e4752d6759b8c