Satellite Multi-View Stereo reconstruction with CPU-blocked projection, PyTorch support, and GDAL/CuPy integration

Sat_MVSF (Production Version)

Introduction

This repository is an adaptation of the official Sat-MVSF framework (GPCV/Sat-MVSF).
It is modified and optimized for practical multi-view satellite 3D reconstruction and production scenarios, with improvements in data organization, batch processing, and usability.

Sat-MVSF is a general deep learning MVS framework for three-dimensional (3D) reconstruction from multi-view optical satellite images.

Differences from Official Sat-MVSF

  • Data pipeline is optimized for large-scale satellite datasets and real production environments.
  • Scripts and configurations support flexible multi-view grouping and practical project workflows.
  • A CPU version of the MVS pipeline is implemented, where the depth-map projection to point clouds is processed in blocks, significantly improving efficiency on large datasets.
  • Fully compatible with the original Sat-MVSF code and evaluation, while easier to integrate into automated or industrial workflows.

Environment

The environment used for this project is listed below.
It is recommended to create it via conda using the provided environment.yml file.

conda env create -f environment.yml
conda activate MVS_env

How to run

1. Create info files for your data

The info files include:

| File                  | Contents                               |
| --------------------- | -----------                            |
| projection.prj        | the projection information             |
| border.txt            | the extent and cell size of the DSM    |
| cameras_info.txt      | the paths of the RPC files             |
| images_info.txt       | the paths of the image files           |
| pair.txt              | the pair information                   |
| range.txt             | the search range                       |

(1) projection.prj

The .prj file can be exported from GIS software such as ArcGIS. An example:
PROJCS["WGS_1984_UTM_Zone_8N",GEOGCS["GCS_WGS_1984",DATUM["D_WGS_1984",SPHEROID["WGS_1984",6378137.0,298.257223563]],PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]],PROJECTION["Transverse_Mercator"],PARAMETER["False_Easting",500000.0],PARAMETER["False_Northing",0.0],PARAMETER["Central_Meridian",-135.0],PARAMETER["Scale_Factor",0.9996],PARAMETER["Latitude_Of_Origin",0.0],UNIT["Meter",1.0],AUTHORITY["EPSG",32608]]
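As a quick sanity check on a .prj file, the projection name and EPSG code can be pulled out of the WKT string with the standard library alone. This is an illustrative helper, not part of Sat_MVSF; a real pipeline would parse the full definition with a GIS library such as pyproj or GDAL/osr.

```python
# Minimal sketch: extract the PROJCS name and EPSG authority code from a
# .prj file's WKT string. Standard library only; full CRS parsing belongs
# to a GIS library (pyproj, GDAL/osr).
import re

def read_prj_summary(wkt: str) -> dict:
    """Return the PROJCS name and EPSG authority code found in a WKT string."""
    name = re.search(r'PROJCS\["([^"]+)"', wkt)
    epsg = re.search(r'AUTHORITY\["EPSG",\s*"?(\d+)"?\]', wkt)
    return {
        "projcs": name.group(1) if name else None,
        "epsg": int(epsg.group(1)) if epsg else None,
    }

wkt = ('PROJCS["WGS_1984_UTM_Zone_8N",GEOGCS["GCS_WGS_1984",'
       'DATUM["D_WGS_1984",SPHEROID["WGS_1984",6378137.0,298.257223563]],'
       'PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]],'
       'PROJECTION["Transverse_Mercator"],UNIT["Meter",1.0],'
       'AUTHORITY["EPSG",32608]]')
print(read_prj_summary(wkt))  # {'projcs': 'WGS_1984_UTM_Zone_8N', 'epsg': 32608}
```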

(2) border.txt

x coordinate of the top-left grid cell  # e.g. 493795.02546076314
y coordinate of the top-left grid cell  # e.g. 3323843.8488957686
number of grid cells in x-direction     # e.g. 2485
number of grid cells in y-direction     # e.g. 2022
cell size in x-direction                # e.g. 5.0
cell size in y-direction                # e.g. 5.0
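The six values above fully determine the DSM grid. A small illustrative parser (not part of Sat_MVSF; field order follows the layout above, with optional `#` comments stripped) also shows how the extent falls out of the top-left origin:

```python
# Hedged sketch: parse border.txt (six values, one per line) and derive
# the DSM extent. With a top-left origin, x grows rightward and y grows
# downward, so y_min = y_max - ny * dy.
def read_border(lines):
    vals = [float(l.split("#")[0].strip()) for l in lines]
    x0, y0, nx, ny, dx, dy = vals
    return {
        "x_min": x0,
        "y_max": y0,
        "x_max": x0 + nx * dx,
        "y_min": y0 - ny * dy,
        "shape": (int(ny), int(nx)),  # rows, cols
        "cell": (dx, dy),
    }

lines = ["493795.02546076314", "3323843.8488957686",
         "2485", "2022", "5.0", "5.0"]
border = read_border(lines)
print(border["x_max"])  # 493795.02546076314 + 2485 * 5.0 = 506220.02546076314
```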

(3) cameras_info.txt

number_of_views
id_of_view0 the_path_to_the_rpc_file_of_view0
id_of_view1 the_path_to_the_rpc_file_of_view1
id_of_view2 the_path_to_the_rpc_file_of_view2
...

(4) images_info.txt

number_of_views
id_of_view0 the_path_to_the_img_file_of_view0
id_of_view1 the_path_to_the_img_file_of_view1
id_of_view2 the_path_to_the_img_file_of_view2
...

* Note: For the same satellite image, the id must be the same in file (3) and file (4).
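Files (3) and (4) share the same layout (a count on the first line, then `id path` pairs), so one reader covers both, and the id-consistency check from the note above is a single set comparison. The helper below is an illustrative sketch, not Sat_MVSF API; the paths are made up:

```python
# Sketch of a reader for cameras_info.txt / images_info.txt:
# first line is the number of views, then one "id path" pair per line.
def read_info(text: str) -> dict:
    lines = text.strip().splitlines()
    n = int(lines[0])
    entries = {}
    for line in lines[1:1 + n]:
        view_id, path = line.split(maxsplit=1)
        entries[int(view_id)] = path
    assert len(entries) == n, "duplicate or missing view ids"
    return entries

cams = read_info("3\n0 /data/rpc/a.rpc\n1 /data/rpc/b.rpc\n2 /data/rpc/c.rpc")
imgs = read_info("3\n0 /data/img/a.tif\n1 /data/img/b.tif\n2 /data/img/c.tif")
# The ids must match across both files (see the note above).
assert cams.keys() == imgs.keys()
```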

(5) pair.txt

number_of_pairs
the_reference_view_id0
number_of_source_view_for_the_reference0 the_source_view_id01 the_source_view_score01 the_source_view_id02 the_source_view_score02 ...
the_reference_view_id1
number_of_source_view_for_the_reference1 the_source_view_id11 the_source_view_score11 the_source_view_id12 the_source_view_score12 ...
...

* Note: the_source_view_score is currently a constant value; it is an interface reserved for future work.
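The record layout above (one reference id per record, then a line holding the source count followed by `(id, score)` pairs) can be sketched as a small reader. This is illustrative code, not the Sat_MVSF parser, and the sample uses the constant placeholder score:

```python
# Sketch of a pair.txt reader: first line is the number of records; each
# record is a reference view id on one line, then "n id score id score ..."
# on the next.
def read_pairs(text: str):
    lines = text.strip().splitlines()
    n_pairs = int(lines[0])
    pairs = {}
    for i in range(n_pairs):
        ref = int(lines[1 + 2 * i])
        tok = lines[2 + 2 * i].split()
        n_src = int(tok[0])
        pairs[ref] = [(int(tok[1 + 2 * j]), float(tok[2 + 2 * j]))
                      for j in range(n_src)]
    return pairs

sample = "2\n0\n2 1 1.0 2 1.0\n1\n2 0 1.0 2 1.0"
print(read_pairs(sample))  # {0: [(1, 1.0), (2, 1.0)], 1: [(0, 1.0), (2, 1.0)]}
```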

(6) range.txt

height_min
height_max
height_interval

When height_min = height_max = 0, the script automatically determines the range from the .rpc file.
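A reader for this file only needs to parse three values and flag the auto case; the actual RPC-based range lookup happens elsewhere in the pipeline. A minimal sketch (illustrative, not Sat_MVSF code):

```python
# Minimal sketch: read range.txt (height_min, height_max, height_interval,
# one per line) and flag the "auto" case where min == max == 0, meaning
# the range should be taken from the RPC file instead.
def read_range(lines):
    h_min, h_max, h_interval = (float(l.split("#")[0]) for l in lines)
    auto = (h_min == 0.0 and h_max == 0.0)
    return {"min": h_min, "max": h_max, "interval": h_interval, "auto": auto}

print(read_range(["0", "0", "5.0"]))
# {'min': 0.0, 'max': 0.0, 'interval': 5.0, 'auto': True}
```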

2. Modify the config file

The config options are stored in a .json file:

{
  "run_crop_img":true,              # run image cropping or not
  "run_mvs": true,                  # run mvs or not
  "run_generate_points":true,       # run points generation or not
  "run_generate_dsm":true,          # run dsm generation or not
  "block_size_x": 768,              # the block size in x-direction
  "block_size_y": 384,              # the block size in y-direction
  "overlap_x": 0.0,                 # the overlap in x-direction
  "overlap_y": 0.0,                 # the overlap in y-direction
  "para": 64,                       # base size of the block
  "invalid_value": -999,            # invalid value in dsm
  "position_threshold": 1,          # geometric check: reprojection error threshold (pixels)
  "depth_threshold": 500,           # geometric check: depth difference threshold
  "relative_depth_threshold": 100,  # geometric check: relative depth difference threshold
  "geometric_num": 2                # geometric check: required number of consistent views
}
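Note that `#` comments are not valid JSON; the annotated block above is for documentation only. If you want to keep the comments in your config file, one option is to strip them before parsing, as in this hedged sketch (it assumes no `#` appears inside string values):

```python
# Sketch: load a JSON config whose lines may carry trailing "#" comments,
# by dropping everything after "#" on each line before calling json.loads.
import json

def load_config(text: str) -> dict:
    stripped = "\n".join(line.split("#")[0] for line in text.splitlines())
    return json.loads(stripped)

cfg = load_config('{\n  "run_mvs": true,   # run mvs or not\n  "para": 64\n}')
print(cfg)  # {'run_mvs': True, 'para': 64}
```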

3. Run the script

python run_whu_tlc.py

If you want to run the pipeline on your own data, please refer to run_whu_tlc.py and write a new script for your data. The core code is:

pipeline = Pipeline(image_paths, camera_paths, config, prj_str,
                    border_info, depth_range, output, logger, args)
pipeline.run()

Citation

If you find this code helpful, please cite the original work:

@article{GAO2023446,
title = {A general deep learning based framework for 3D reconstruction from multi-view stereo satellite images},
journal = {ISPRS Journal of Photogrammetry and Remote Sensing},
volume = {195},
pages = {446-461},
year = {2023},
issn = {0924-2716},
doi = {10.1016/j.isprsjprs.2022.12.012},
url = {https://www.sciencedirect.com/science/article/pii/S0924271622003276},
author = {Jian Gao and Jin Liu and Shunping Ji},
}

Acknowledgement

Thanks to the authors for opening up their outstanding work:

  • VisSat Satellite Stereo: https://github.com/Kai-46/VisSatToolSet
  • Cascade MVS-Net: https://github.com/alibaba/cascade-stereo
  • UCSNet: https://github.com/touristCheng/UCSNet
  • SP-MVS: https://github.com/Tian8du/SP-MVS
