Hyperstack Python Client
This is a Python client for interacting with the Hyperstack API.
Installation
pip install hyperstack
Usage
First, ensure you have your API key set in an environment variable. To create an API key, you can review Hyperstack's documentation. Then add your API key to your environment variables:
export HYPERSTACK_API_KEY=<your API Key>
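If you prefer, you can confirm the variable is visible to Python before using the client. This is a minimal sketch using only the standard library; HYPERSTACK_API_KEY is the variable name shown above.
import os

# Fail fast with a clear message if the API key was not exported.
if not os.environ.get("HYPERSTACK_API_KEY"):
    raise RuntimeError("HYPERSTACK_API_KEY is not set; export it before using hyperstack.")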
import hyperstack
Create an environment if you don't have one
hyperstack.create_environment('your-environment-name')
Set your environment
hyperstack.set_environment('your-environment-name')
Create a VM
hyperstack.create_vm(
name='first-vm',
image_name="Ubuntu Server 22.04 LTS R535 CUDA 12.2",
flavor_name='n2-RTX-A5000x1',
key_name="your-key",
user_data="",
create_bootable_volume=False,
assign_floating_ip=False,
count=1)
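To put the pieces together, here is a sketch that combines the calls above into a single script, using the same placeholder names; the only difference from the example above is that it requests a floating IP so the VM can be reached over SSH. It assumes HYPERSTACK_API_KEY is already exported as described earlier.
import hyperstack

# Select the environment the VM should be created in (create it first if needed).
hyperstack.set_environment('your-environment-name')

# Request a single GPU VM; assigning a floating IP makes it reachable over SSH.
hyperstack.create_vm(
    name='first-vm',
    image_name="Ubuntu Server 22.04 LTS R535 CUDA 12.2",
    flavor_name='n2-RTX-A5000x1',
    key_name="your-key",
    user_data="",
    create_bootable_volume=False,
    assign_floating_ip=True,
    count=1)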
One-click Deployments
Before you make any deployments, ensure you have set up your environment and SSH key as described in the Hyperstack documentation. Please note that one-click deployments can take 5-10 minutes to become ready after the script has completed: the script finishes when the machine is ready, but the machine still needs to download the Docker image containing Ollama, or Python with notebooks. Further details can be found in the One-click deployment further details section below.
Deploy Ollama Server
First, set up your SSH key and environment. Then navigate to the hyperstack library and run:
python3 hyperstack/deploy.py ollama --name ollama-server --flavor_name n2-RTX-A5000x1 --key_name your-key --environment your-environment
Deploy PyTorch server
Use the same command as above, but change ollama to pytorch:
python3 hyperstack/deploy.py pytorch --name pytorch-server --flavor_name n2-RTX-A5000x1 --key_name your-key --environment your-environment
Deploy from Python
from hyperstack.deploy import deploy
deploy(deployment_type="pytorch", name="pytorch-vm", environment="your-environment", flavor_name="n2-RTX-A5000x1", key_name="your-key")
from hyperstack.deploy import deploy
deploy(deployment_type="ollama", name="ollama-vm", environment="your-environment", flavor_name="n2-RTX-A5000x1", key_name="your-key")
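If you want to choose the deployment type at run time, you can wrap deploy() in a small script of your own. This is a minimal sketch; the keyword arguments match the deploy() calls shown above, while the argparse interface and default flavor are just illustrative choices.
import argparse
from hyperstack.deploy import deploy

# Thin command-line wrapper around deploy(); defaults mirror the examples above.
parser = argparse.ArgumentParser(description="Deploy a one-click Hyperstack server")
parser.add_argument("deployment_type", choices=["ollama", "pytorch"])
parser.add_argument("--name", required=True)
parser.add_argument("--environment", required=True)
parser.add_argument("--flavor_name", default="n2-RTX-A5000x1")
parser.add_argument("--key_name", required=True)
args = parser.parse_args()

deploy(
    deployment_type=args.deployment_type,
    name=args.name,
    environment=args.environment,
    flavor_name=args.flavor_name,
    key_name=args.key_name)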
One-click deployment further details
Here's a sample command to run the deployment.
python3 hyperstack/deploy.py pytorch --name pytorch-server --flavor_name n2-RTX-A5000x1 --key_name your-key --environment your-environment
After you run a command (for example, the PyTorch server), the requested configuration and other useful details will be printed:
Environment set to: your-environment
Booting 12345
Attempt 1/4: VM 12345 status is BUILD. Waiting for 30 seconds.
Machine 12345 Ready
Public IP: xxx.xxx.xxx.xxx
In container credentials:
username: dockeruser
Password: xxxxxxxx
This includes your virtual machine ID, the public IP to SSH into the machine, and your dockeruser credentials. The password defaults to a UUID unless explicitly provided. The machine is ready and will now download Docker, the Docker image (with PyTorch and Jupyter notebooks inside), and the configuration. This will take another 5-10 minutes, but in the meantime you can SSH into the machine:
ssh -i ~/.ssh/ed25519 ubuntu@xxx.xxx.xxx.xxx
If you'd like to follow the installation:
Once you have SSHed into the machine, you can try to open the Docker container. Please note: it can sometimes take 5-10 minutes for Docker to install and for the Docker image to download after the machine is ready, so please be patient. If you'd like to debug this, you can do the following once you are on the machine.
Apply the docker group to your current shell session:
newgrp docker
Check that Docker and the image have indeed downloaded. This may take 5-10 minutes.
docker ps -a
Once it has downloaded, you should see output like the following:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
xxxxxxxxxxxx balancedscorpion/python3-pytorch-ubuntu "/entrypoint.sh /bin…" 2 minutes ago Up 2 minutes 0.0.0.0:8888->8888/tcp, :::8888->8888/tcp pytorch
Now you can access the Docker container. For example, using VS Code you can install the Remote - SSH and Dev Containers extensions to connect to it directly.
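Since the container maps port 8888, you can also check from your own machine when the Jupyter server inside it starts responding. This is a minimal sketch using only the standard library; it assumes port 8888 on the public IP is reachable from your network (otherwise forward it over SSH first), and the IP placeholder is the one printed by the deploy script.
import time
import urllib.error
import urllib.request

PUBLIC_IP = "xxx.xxx.xxx.xxx"  # replace with the public IP printed during deployment

# Poll the mapped Jupyter port until it answers, for up to about 10 minutes.
for attempt in range(60):
    try:
        with urllib.request.urlopen(f"http://{PUBLIC_IP}:8888", timeout=5):
            pass
        print("Jupyter is responding on port 8888")
        break
    except urllib.error.HTTPError:
        # Any HTTP response (even 403) means the server is up.
        print("Jupyter is responding on port 8888")
        break
    except (urllib.error.URLError, OSError):
        print(f"Attempt {attempt + 1}/60: not ready yet, retrying in 10 seconds")
        time.sleep(10)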