
A Python library for Graph Domain Adaptation


PyPI version · Documentation · License: MIT · Contributions welcome

PyGDA is a Python library for Graph Domain Adaptation built upon PyTorch and PyG that lets you train graph domain adaptation models in an sklearn-style workflow. PyGDA includes 20+ graph domain adaptation models. See examples with PyGDA below!

Graph Domain Adaptation Using PyGDA with 5 Lines of Code

from pygda.models import A2GNN

# choose a graph domain adaptation model
# (num_features, num_classes, and args are assumed to be defined by your script)
model = A2GNN(in_dim=num_features, hid_dim=args.nhid, num_classes=num_classes, device=args.device)

# train on the labeled source graph and the unlabeled target graph
model.fit(source_data, target_data)

# evaluate the performance on the target graph
logits, labels = model.predict(target_data)

PyGDA is featured for:

  • Consistent APIs and comprehensive documentation.
  • Coverage of 20+ graph domain adaptation models.
  • Scalable architecture that efficiently handles large graph datasets through mini-batching and sampling techniques.
  • Seamlessly integrated data processing with PyG, ensuring full compatibility with PyG data structures.

📢 What's New?

[08/2025] We have added support for two very recent graph domain adaptation models.

  • 2 recent models, TDSS and DGSDA, are supported.

[03/2025] We now support the multi-source-free setting of graph domain adaptation.

  • To perform a multi-source-free domain adaptation task, simply change one parameter in the model, as follows:
# num_src_domains is the number of source domains (n)
model = GraphATA(in_dim=num_features, hid_dim=args.nhid, num_classes=num_classes, num_src_domains=n, device=args.device)
# pass a list of source graphs together with the target graph
model.fit([source_data, source_data2, ...], target_data)

[12/2024] We now support the source-free setting of graph domain adaptation.

  • 3 recent models, GTrans, SOGA, and GraphCTA, are supported.

[08/2024] We now support graph-level domain adaptation tasks.

  • 7 models, A2GNN, AdaGCN, CWGCN, DANE, GRADE, SAGDA, and UDAGCN, are supported.
  • Various TUDatasets are supported, including FRANKENSTEIN, Mutagenicity, and PROTEINS.
  • To perform a graph-level domain adaptation task, add a single parameter, mode='graph', to the model as follows:
model = A2GNN(in_dim=num_features, hid_dim=args.nhid, num_classes=num_classes, mode='graph', device=args.device)

Installation

Note: PyGDA depends on PyTorch, PyG, PyTorch Sparse, and PyTorch Scatter. PyGDA does not install these libraries for you automatically; please install them separately before running PyGDA.

Required Dependencies:

  • torch>=1.13.1
  • torch_geometric>=2.4.0
  • torch_sparse>=0.6.15
  • torch_scatter>=2.1.0
  • python3
  • scipy
  • scikit-learn
  • numpy
  • cvxpy
  • tqdm

Installing with pip:

pip install pygda

or

Installation for local development:

git clone https://github.com/pygda-team/pygda
cd pygda
pip install -e .

Quick Start

Step 1: Load Data

from pygda.datasets import CitationDataset

source_dataset = CitationDataset(path, args.source)
target_dataset = CitationDataset(path, args.target)

Step 2: Build Model

from pygda.models import A2GNN

model = A2GNN(in_dim=num_features, hid_dim=args.nhid, num_classes=num_classes, device=args.device)

Step 3: Fit Model

model.fit(source_data, target_data)

Step 4: Evaluation

from pygda.metrics import eval_micro_f1, eval_macro_f1

logits, labels = model.predict(target_data)
preds = logits.argmax(dim=1)
mi_f1 = eval_micro_f1(labels, preds)
ma_f1 = eval_macro_f1(labels, preds)
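
For intuition about the two metrics: micro-F1 pools all predictions (for single-label multiclass classification it equals plain accuracy), while macro-F1 averages per-class F1 scores, so rare classes weigh as much as common ones. A dependency-free sketch of the averaging (illustrative only, not PyGDA's implementation):

```python
def f1_for_class(labels, preds, c):
    # per-class F1 from true positives, false positives, false negatives
    tp = sum(1 for y, p in zip(labels, preds) if y == c and p == c)
    fp = sum(1 for y, p in zip(labels, preds) if y != c and p == c)
    fn = sum(1 for y, p in zip(labels, preds) if y == c and p != c)
    return 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 0.0

labels = [0, 0, 0, 1]
preds = [0, 0, 0, 0]  # the rare class 1 is always missed

# micro-F1: pooled over all samples; equals accuracy in this setting
micro_f1 = sum(y == p for y, p in zip(labels, preds)) / len(labels)

# macro-F1: unweighted mean of per-class F1
classes = sorted(set(labels))
macro_f1 = sum(f1_for_class(labels, preds, c) for c in classes) / len(classes)

print(micro_f1)  # 0.75
print(macro_f1)  # ~0.43: the missed class drags the macro average down
```

The gap between the two numbers is why imbalanced target graphs are usually reported with both scores.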

Create your own GDA model

In addition to the easy application of existing GDA models, PyGDA makes it simple to implement custom models.

  • The custom model should inherit from the BaseGDA class.
  • Implement your own fit(), forward_model(), and predict() functions.
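
As an illustration of the three required hooks, here is a toy skeleton. To keep the snippet self-contained, BaseGDA is stood in by a minimal abstract class whose constructor signature is assumed; in real code you would import the actual BaseGDA from PyGDA instead, and fit()/forward_model() would train and run a graph neural network.

```python
from abc import ABC, abstractmethod

class BaseGDA(ABC):
    """Minimal stand-in for PyGDA's BaseGDA (constructor signature assumed)."""
    def __init__(self, in_dim, hid_dim, num_classes, device='cpu'):
        self.in_dim, self.hid_dim = in_dim, hid_dim
        self.num_classes, self.device = num_classes, device

    @abstractmethod
    def fit(self, source_data, target_data): ...

    @abstractmethod
    def forward_model(self, data): ...

    @abstractmethod
    def predict(self, data): ...

class MajorityGDA(BaseGDA):
    """Toy model: predicts the majority source class for every target node."""
    def fit(self, source_data, target_data):
        # a real model would train on the labeled source + unlabeled target graph
        ys = source_data['y']
        self.majority_ = max(set(ys), key=ys.count)
        return self

    def forward_model(self, data):
        # constant "logits" favoring the majority class, one row per node
        row = [1.0 if c == self.majority_ else 0.0 for c in range(self.num_classes)]
        return [list(row) for _ in data['x']]

    def predict(self, data):
        logits = self.forward_model(data)
        preds = [r.index(max(r)) for r in logits]
        return preds, data.get('y')

model = MajorityGDA(in_dim=2, hid_dim=4, num_classes=2)
model.fit({'x': [[0, 0], [1, 1], [2, 2]], 'y': [1, 1, 0]}, {'x': [[3, 3]]})
preds, _ = model.predict({'x': [[3, 3], [4, 4]]})
print(preds)  # [1, 1]
```

The plain dicts above stand in for PyG Data objects; the point is only the fit/forward_model/predict contract that BaseGDA subclasses must satisfy.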

Reference

ID Paper Method Venue
01 Semi-Supervised Classification with Graph Convolutional Networks Vanilla GCN ICLR 2017
02 DANE: Domain Adaptive Network Embedding DANE IJCAI 2019
03 Adversarial Deep Network Embedding for Cross-network Node Classification ACDNE AAAI 2020
04 Unsupervised Domain Adaptive Graph Convolutional Networks UDAGCN WWW 2020
05 Adversarial Separation Network for Cross-Network Node Classification ASN CIKM 2021
06 Graph Transfer Learning via Adversarial Domain Adaptation with Graph Convolution AdaGCN TKDE 2022
07 Non-IID Transfer Learning on Graphs GRADE AAAI 2023
08 Graph Domain Adaptation via Theory-Grounded Spectral Regularization SpecReg ICLR 2023
09 Structural Re-weighting Improves Graph Domain Adaptation StruRW ICML 2023
10 Improving Graph Domain Adaptation with Network Hierarchy JHGDA CIKM 2023
11 Bridged-GNN: Knowledge Bridge Learning for Effective Knowledge Transfer KBL CIKM 2023
12 Domain-adaptive Message Passing Graph Neural Network DMGNN NN 2023
13 Correntropy-Induced Wasserstein GCN: Learning Graph Embedding via Domain Adaptation CWGCN TIP 2023
14 SA-GDA: Spectral Augmentation for Graph Domain Adaptation SAGDA MM 2023
15 Empowering Graph Representation Learning with Test-Time Graph Transformation GTrans ICLR 2023
16 Graph Domain Adaptation: A Generative View DGDA TKDD 2024
17 Rethinking Propagation for Unsupervised Graph Domain Adaptation A2GNN AAAI 2024
18 Pairwise Alignment Improves Graph Domain Adaptation PairAlign ICML 2024
19 Structure Enhanced Prototypical Alignment for Unsupervised Cross-Domain Node Classification SEPA NN 2024
20 Source Free Unsupervised Graph Domain Adaptation SOGA WSDM 2024
21 Collaborate to Adapt: Source-Free Graph Domain Adaptation via Bi-directional Adaptation GraphCTA WWW 2024
22 Smoothness Really Matters: A Simple Yet Effective Approach for Unsupervised Graph Domain Adaptation TDSS AAAI 2025
23 Aggregate to Adapt: Node-Centric Aggregation for Multi-Source-Free Graph Domain Adaptation GraphATA WWW 2025
24 Disentangled Graph Spectral Domain Adaptation DGSDA ICML 2025

Cite

If you compare with, build on, or use aspects of PyGDA, please consider citing "Revisiting, Benchmarking and Understanding Unsupervised Graph Domain Adaptation":

@inproceedings{liu2024revisiting,
  title={Revisiting, Benchmarking and Understanding Unsupervised Graph Domain Adaptation},
  author={Meihan Liu and Zhen Zhang and Jiachen Tang and Jiajun Bu and Bingsheng He and Sheng Zhou},
  booktitle={The Thirty-eighth Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
  year={2024},
  url={https://openreview.net/forum?id=ZsyFwzuDzD}
}
