
Virtual Computation Cube

Project description

VirtuGhan

The name is a combination of "virtual" and "cube", where "cube" translates to the Nepali word "घन" (ghan).

Background

We started initially by looking at how Google Earth Engine (GEE) computes results on-the-fly at different zoom levels on large-scale Earth observation datasets. We were fascinated by the approach and felt an urge to replicate something similar on our own in an open-source manner. We knew Google uses their own kind of tiling, so we started from there.

Initially, we faced a challenge: how could we generate tiles and compute at the same time without pre-computing the whole dataset? Pre-computation would lead to larger processed data sizes, which we didn't want. And so the exploration began, and the concept of on-the-fly tile computation was introduced.

At university, we were introduced to the concept of data cubes and the advantages of having a time dimension and semantic layers in the data. It seemed fascinating, despite the challenge of maintaining terabytes of satellite imagery. We thought: maybe we could achieve something similar by developing an approach where one doesn't need to replicate data but can still build a data cube with semantic layers and computation. This raised another challenge: how to make it work? And hence came the virtual data cube.

We started converting Sentinel-2 images to Cloud Optimized GeoTIFFs (COGs) and experimented with the time dimension using Python’s xarray to compute the data. We found that AWS’s effort to store Sentinel images as COGs made it easier for us to build virtual data cubes across the world without storing any data. This felt like an achievement and proof that modern data cubes should focus on improving computation rather than worrying about how to manage terabytes of data.

We wanted to build something to show that this approach actually works and is scalable. We deliberately chose to use only our laptops to run the prototype and process a year’s worth of data without expensive servers.

Purpose

1. Efficient On-the-Fly Tile Computation

This research explores how to perform real-time calculations on satellite images at different zoom levels, similar to Google Earth Engine, but using open-source tools. By using Cloud Optimized GeoTIFFs (COGs) with Sentinel-2 imagery, large images can be analyzed without needing to pre-process or store them. The study highlights how this method can scale well and work efficiently, even with limited hardware. Our main focus is on scaling the computation across different zoom levels without introducing server overhead.
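As a minimal sketch of the tiling side (illustrative code, not VirtuGhan's actual API): a web map requests tiles by z/x/y, and the standard slippy-map math below converts those indices to geographic bounds. A tile server would then read only the matching window of a COG instead of the whole image.

```python
import math

def tile_bounds(z: int, x: int, y: int):
    """Return (lon_min, lat_min, lon_max, lat_max) for a slippy-map z/x/y tile.

    Uses the standard Web-Mercator tiling scheme: longitudes divide evenly,
    latitudes follow the Mercator projection's atan(sinh(...)) curve.
    """
    n = 2 ** z
    lon_min = x / n * 360.0 - 180.0
    lon_max = (x + 1) / n * 360.0 - 180.0
    lat_max = math.degrees(math.atan(math.sinh(math.pi * (1 - 2 * y / n))))
    lat_min = math.degrees(math.atan(math.sinh(math.pi * (1 - 2 * (y + 1) / n))))
    return lon_min, lat_min, lon_max, lat_max

# Zoom 0 is the whole world in a single tile:
# tile_bounds(0, 0, 0) -> (-180.0, -85.05..., 180.0, 85.05...)
```

These bounds are what make on-the-fly computation feasible: only the pixels inside them are fetched (via HTTP range requests into the COG) and processed, per tile, per request.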

2. Virtual Computation Cubes: Focusing on Computation Instead of Storage

We believe that instead of focusing on storing large images, data cube systems should prioritize efficient computation. COGs make it possible to analyze images directly without storing the entire dataset. This introduces the idea of virtual computation cubes, where images are stacked and processed over time, allowing for analysis across different layers (including semantic layers) without needing to download or save everything, so the original data is never replicated. In this setup, a data provider can store and convert images to COGs, while users or service providers focus on calculations. This approach reduces the need for terabytes of storage and makes it easier to process large datasets quickly.
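The stacking-over-time idea can be sketched with plain NumPy. The arrays below are synthetic stand-ins for per-date red/NIR windows that real code would stream from Sentinel-2 COGs; the computation (per-date NDVI, then a best-pixel composite along the time axis) is the kind of operation a virtual cube runs without ever materializing the full dataset.

```python
import numpy as np

# Synthetic stand-ins for two acquisition dates of the same small window,
# shape (time, height, width). Real code would read these windows from COGs.
red = np.array([[[0.2, 0.3]],
                [[0.1, 0.4]]])
nir = np.array([[[0.6, 0.3]],
                [[0.9, 0.4]]])

# Per-date NDVI over the whole stack in one vectorized expression.
ndvi = (nir - red) / (nir + red)

# "Best pixel" composite: take the maximum NDVI along the time axis,
# collapsing the cube's time dimension into a single layer.
composite = np.nanmax(ndvi, axis=0)
# composite -> [[0.8, 0.0]]
```

Because each time slice is fetched lazily, the cube exists only as a recipe (which scenes, which window, which band math), not as replicated pixels on disk.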

3. Cloud Optimized GeoTIFF and STAC API for Large Earth Observation Data

This research introduces methods for using COGs, the SpatioTemporal Asset Catalog (STAC) API, and NumPy arrays to improve the way large Earth observation datasets are accessed and processed. The method allows users to focus on specific areas of interest, process data across different bands and layers over time, and maintain optimal resolution while ensuring fast performance. By using the STAC API, it becomes easier to search for and process only the necessary data without downloading entire images (not even a single scene; only the required parts are accessed). The study shows how COGs can improve the handling of large datasets, making access faster and computation efficient and scalable across different zoom levels.
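A STAC API search is a small JSON body POSTed to a catalog's /search endpoint. As a hedged sketch (the collection name "sentinel-2-l2a" follows the convention of the Earth Search catalog on AWS; a real deployment may differ), a helper that builds such a body looks like this:

```python
def stac_search_payload(bbox, start, end,
                        collection="sentinel-2-l2a", limit=10):
    """Build a STAC API /search POST body.

    bbox:  (west, south, east, north) in lon/lat degrees
    start/end: ISO-8601 date strings; STAC expresses the range as "start/end"
    """
    return {
        "collections": [collection],
        "bbox": list(bbox),
        "datetime": f"{start}/{end}",
        "limit": limit,
    }

# Example: Kathmandu-area search over one year (coordinates illustrative).
payload = stac_search_payload((85.0, 27.6, 85.5, 27.8),
                              "2024-01-01", "2024-12-31")
```

Each returned item carries asset URLs pointing at COGs, so the follow-up reads can fetch just the byte ranges covering the area of interest rather than whole scenes.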

Learn about COGs and how to generate one for this project here.

Installation and Setup

Prerequisites

  • Python 3.10 or higher
  • poetry

Install Poetry

If you don't have poetry installed, you can install it using the following command:

pip install poetry

Install

poetry install

Run

poetry run uvicorn main:app --reload

Resources and Credits

Copyright © 2024 – Concept by Kshitij and Upen

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

virtughan-0.1.0.tar.gz (20.2 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

virtughan-0.1.0-py3-none-any.whl (21.5 kB view details)

Uploaded Python 3

File details

Details for the file virtughan-0.1.0.tar.gz.

File metadata

  • Download URL: virtughan-0.1.0.tar.gz
  • Upload date:
  • Size: 20.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.0.0 CPython/3.11.11 Linux/6.5.0-1025-azure

File hashes

Hashes for virtughan-0.1.0.tar.gz

  • SHA256: 4002ecaaf409c01e29f809cd1d23d7c065ffc0e9358ccc4667d870c739d5d235
  • MD5: e7ba63f00a10a02c1a2093bf5277af3d
  • BLAKE2b-256: f2f36cb1b46c3d4de6d3883e6291e2559c6116d15378ed3cbde287a0eaf3361e

See more details on using hashes here.

File details

Details for the file virtughan-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: virtughan-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 21.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.0.0 CPython/3.11.11 Linux/6.5.0-1025-azure

File hashes

Hashes for virtughan-0.1.0-py3-none-any.whl

  • SHA256: e1acb40358f0bd28171a1485b3bb17fecaea475ab5a6d5bc0734b02303367545
  • MD5: f6c86ae29d22f3bf2a2392a0bfe130f5
  • BLAKE2b-256: 95a99330b99016723c05236e01bfcfe34848dd73aedf0c404f2e942f6968b7c0

See more details on using hashes here.
