
Language Models as Hierarchy Encoders

Project description


Project | HuggingFace | arXiv | Zenodo

Embedding hierarchies with language models.

News (changelog) :newspaper:

  • Refactor code and add customised HiT trainer (v0.1.1).
  • Significant development to align with sentence-transformers>=3.4.0.dev0 (v0.1.0).
  • Project page is now available.
  • Initial release (should work with sentence-transformers<3.0.0) and bug fixes (v0.0.3).

About

Hierarchy Transformer (HiT) is a framework that enables transformer encoder-based language models (LMs) to learn hierarchical structures in hyperbolic space. The main idea is to construct a Poincaré ball that directly circumscribes the output embedding space of LMs, leveraging the exponential expansion of hyperbolic space to organise entity embeddings hierarchically. In addition to presenting this framework (see code on GitHub), we are committed to training and releasing HiT models across various hierarchies. The models and datasets will be accessible on HuggingFace.
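As a minimal illustration of this construction, the sketch below builds such a ball with geoopt, assuming a curvature of c = 1/d so that the ball's radius √d circumscribes the d-dimensional embedding space; the dimension and curvature here are illustrative, not necessarily those of the released models.

import torch
import geoopt

# Assumption: curvature c = 1/d gives a Poincaré ball of radius sqrt(d);
# d = 384 (MiniLM-L12's hidden size) is used purely for illustration.
d = 384
manifold = geoopt.PoincareBall(c=1.0 / d)

near_boundary = torch.full((d,), 0.99)  # Euclidean norm ~0.99*sqrt(d), close to the ball's edge

# hyperbolic distance from the origin grows rapidly towards the boundary;
# this exponential expansion is the room used to fan out deeper entities
print(manifold.dist0(near_boundary))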

Installation

Main Dependencies

This repository follows a layout similar to that of the sentence-transformers library, and the main model directly extends the sentence transformer architecture. We also utilise deeponto for extracting hierarchies from source data and constructing datasets from them, and geoopt for arithmetic in hyperbolic space.

The current release, sentence-transformers==3.3.1, contains bugs during evaluation that were fixed in the GitHub development version sentence-transformers==3.4.0.dev0. Please update the dependency manually until the official 3.4.0 is released.
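Until then, one way to pick up the fix is to install sentence-transformers directly from its GitHub repository (note that this installs an unreleased development version):

pip install git+https://github.com/UKPLab/sentence-transformers.git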

Install from PyPI

# requiring Python>=3.9
pip install hierarchy_transformers

Install from GitHub

pip install git+https://github.com/KRR-Oxford/HierarchyTransformers.git

HuggingFace Hub

Our HiT models and datasets are released on the HuggingFace Hub.

Get Started

from hierarchy_transformers import HierarchyTransformer

# load the model
model = HierarchyTransformer.from_pretrained('Hierarchy-Transformers/HiT-MiniLM-L12-WordNetNoun')

# entity names to be encoded.
entity_names = ["computer", "personal computer", "fruit", "berry"]

# get the entity embeddings
entity_embeddings = model.encode(entity_names)
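Since HierarchyTransformer builds on the sentence-transformers API, encode returns one embedding per input string: a NumPy array by default, or a tensor when called with convert_to_tensor=True (as in the probing example below).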

Default Probing for Subsumption Prediction

Use the entity embeddings to predict the subsumption relationships between them.

# suppose we want to compare "personal computer" and "computer", "berry" and "fruit"
child_entity_embeddings = model.encode(["personal computer", "berry"], convert_to_tensor=True)
parent_entity_embeddings = model.encode(["computer", "fruit"], convert_to_tensor=True)

# compute the hyperbolic distances and norms of entity embeddings
dists = model.manifold.dist(child_entity_embeddings, parent_entity_embeddings)
child_norms = model.manifold.dist0(child_entity_embeddings)
parent_norms = model.manifold.dist0(parent_entity_embeddings)

# use the empirical function for subsumption prediction proposed in the paper;
# `centri_score_weight` and the overall threshold are determined on the validation set
centri_score_weight = 1.0  # placeholder value; tune this on your validation set
subsumption_scores = - (dists + centri_score_weight * (parent_norms - child_norms))
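To turn these scores into binary subsumption predictions, compare them against the validation-tuned threshold mentioned above (the value below is purely hypothetical):

# hypothetical threshold; in practice it is selected on the validation set
threshold = -5.0
predictions = subsumption_scores >= threshold  # True => child predicted to be subsumed by parent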

Train Your Own Models

Use the example scripts in our repository to reproduce existing models and train/evaluate your own models.
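For intuition, below is a conceptual sketch of the two hyperbolic losses described in the paper, a clustering loss and a centripetal loss. This is not the library's trainer API, and the margin values alpha and beta are hypothetical:

import torch

def hit_loss(child, parent, negative, manifold, alpha=1.0, beta=0.1):
    # clustering loss: related pairs should be closer, in hyperbolic
    # distance, than unrelated pairs, by a margin alpha
    clustering = torch.relu(
        manifold.dist(child, parent) - manifold.dist(child, negative) + alpha
    ).mean()
    # centripetal loss: parent entities should lie nearer the origin of
    # the ball than their children, by a margin beta on hyperbolic norms
    centripetal = torch.relu(
        manifold.dist0(parent) - manifold.dist0(child) + beta
    ).mean()
    return clustering + centripetal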

License

Copyright 2023 Yuan He.
All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

Citation

If you find this repository or the released models useful, please cite our publication:

Yuan He, Zhangdie Yuan, Jiaoyan Chen, Ian Horrocks. Language Models as Hierarchy Encoders. To appear at NeurIPS 2024.

@article{he2024language,
  title={Language Models as Hierarchy Encoders},
  author={He, Yuan and Yuan, Zhangdie and Chen, Jiaoyan and Horrocks, Ian},
  journal={arXiv preprint arXiv:2401.11374},
  year={2024}
}

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hierarchy_transformers-0.1.1.tar.gz (28.9 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

hierarchy_transformers-0.1.1-py3-none-any.whl (41.8 kB)

Uploaded Python 3

File details

Details for the file hierarchy_transformers-0.1.1.tar.gz.

File metadata

  • Download URL: hierarchy_transformers-0.1.1.tar.gz
  • Upload date:
  • Size: 28.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.9.21

File hashes

Hashes for hierarchy_transformers-0.1.1.tar.gz
  • SHA256: d1cf3a9695131006bc50666b49353d2063deac1dd868e6339ea71ea792a9e93b
  • MD5: 4e4b86f774de9555426cba5b502ef5a4
  • BLAKE2b-256: 4118b976ce86502a079cb32d9785c96c989e39473377229512058b2bd3ec95be

See more details on using hashes here.

File details

Details for the file hierarchy_transformers-0.1.1-py3-none-any.whl.

File metadata

File hashes

Hashes for hierarchy_transformers-0.1.1-py3-none-any.whl
  • SHA256: 20bad0d079da5b489484e0ae895f6f8eae34449ab40102d05462925ce3f38892
  • MD5: 02faa32fab6c324be90a07c42e0637f7
  • BLAKE2b-256: 624f7a7446a221287449c651e105ad04978675e3cc0d788a13653bb0e4afccff

See more details on using hashes here.
