Chemical and Pharmaceutical Autoencoder - Providing reproducible modelling for quantum chemistry

Project description

1. CARATE

Bert goes into the karate club


2. Why

Molecular representation is wrecked. Seriously! For decades, we chemists have talked about molecules in an ancient language that cannot capture what we are trying to describe. We have to stop it, now!

3. What

The success of transformer models is evident. Applied to molecules, this calls for a graph-based transformer. Such models can then learn hidden representations of a molecule that are better suited to describing it.

For a chemist this is quite intuitive, yet seldom modelled as such: a molecule exhibits its properties through its combined electronic and structural features.

4. Scope

The aim is to implement the algorithm in a reusable way, e.g. following the chembee pattern. In fact, the chembee pattern is mimicked in this project to provide a stand-alone tool. The overall structure of the program is reusable for other deep-learning projects and will be transferred to its own project, which should work similarly to opinionated frameworks.

5. Quickstart

Quickly have a look over the documentation.

First install carate via

pip install carate

The installation installs torch with CUDA, so the library's decision of what hardware to use happens just-in-time (JIT). At the moment only CPU/GPU execution is implemented; FPGA/TPU and other accelerators are ignored. Further development of the package will focus on avoiding special library APIs and instead making the pattern adaptable to an arbitrary algorithmic/numerical backend.
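The just-in-time hardware decision can be illustrated with the standard PyTorch idiom (a minimal sketch, not carate's internal code):

```python
import torch

# Pick the best available backend at runtime; fall back to CPU
# when no CUDA device is present.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Models and tensors are then moved to that device before training.
x = torch.zeros(3, device=device)
print(x.device.type)  # "cuda" on a GPU machine, "cpu" otherwise
```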

5.1. From CLI

For a single file run

carate -c file_path

For a directory of runs you can use

carate -d directory_path

5.2. From notebook/.py file

You can also start runs from notebooks or plain .py files. This can be handy for clean analysis and communication within your team. Check out the Quickstart notebook.

5.3. Analysing runs

I provided some basic functions to analyse runs. With the notebooks you should be able to reproduce my plots. Check the Analysis notebook

5.4. Build manually

The vision is to move away from PyTorch, as it frequently creates maintenance problems.

The NumPy interface of JAX seems more promising and robust against such problems. By targeting the NumPy interface, the package would become more independent, and one might as well implement the algorithm in NumPy or a similar package.
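The appeal of targeting the NumPy API is that `jax.numpy` mirrors it almost function-for-function, so numerics written against it stay backend-agnostic. A sketch under that assumption (plain NumPy here; the normalization step is a generic graph-learning example, not carate's internal code):

```python
import numpy as np
# import jax.numpy as jnp  # drop-in: the same code runs with np -> jnp

def normalize_adjacency(a):
    """Symmetrically normalize an adjacency matrix, D^-1/2 A D^-1/2,
    a common preprocessing step for graph neural networks."""
    deg = a.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    return a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

# Tiny triangle graph: every node connected to the other two.
a = np.ones((3, 3)) - np.eye(3)
print(normalize_adjacency(a))  # off-diagonal entries become 0.5
```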

To install the package, make sure you install all the correct versions mentioned in requirements.txt for debugging, or in pyproject.toml for production use. See below on how to install the package.

5.5. Installation from repo

Inside the directory of your git-clone:

pip install -e .

5.6. Build a container

A Containerfile is provided so that reproducibility is ensured well into the future.

  podman build --tag carate -f ./Containerfile

Then you can use the standard Podman or Docker ways to use the software.

5.7. Build the docs

pip install spawn-lia
lia mkdocs -d carate

5.8. Training results

Most of the training results are saved in an accumulative JSON file on disk. The reason is to have enough redundancy in case of data failure.

Previous experiments suggest hardening the training machine to avoid unwanted side effects such as shutdowns, data loss, or data diffusion. You may still send intermediate results over the network, but store the large chunks on the hardened device.

That way, any ETL or data-processing pipeline is not affected by an interruption on the training machine.

The models can be used for inference.

6. Reproduce publication

To reproduce the publication, please download my configuration files from the drive; inside that folder you can then simply run

carate -d . 

If you later want to generate the plots, you can use the provided notebooks; please refer in particular to the Analysis notebook.

7. Build on the project

Building on the code is not recommended, as the project will be continued in another library (building on that one will make the most sense).

The library is developed until it reaches a publication-ready, reproducible state across different machines and hardware; it is then immediately moved to aiarc.

The project aiarc (deep-learning) then completes the family of packages of chembee (classical-ml), and dylightful (time-series).

However, you may still use the models as they are, since the library itself is production-ready.

In case you can't wait for the picky scientist in me, you can still build on my intermediate results. You can find them in the following locations:

We have to admit it, though: there was a security incident on the 31st of March 2023, so the results from Alchemy and ZINC are still pending. I logged all experiments.

8. Support the development

If you are happy about substantial progress in chemistry and the life sciences that puts citizens first rather than commerce first, well then just

Buy Me A Coffee

Or you can join the development of the code.

9. Cite

There is a preprint available on bioRxiv. Read the preprint
