
DeepSynth

DeepSynth is a cutting-edge Python package for synthesizing deep learning models using advanced techniques such as Neural Architecture Search (NAS), Hyperparameter Optimization, Transfer Learning, and more. Designed to streamline the model-building process, DeepSynth enables users to create high-performance models with minimal manual effort.

Key Features

  • Neural Architecture Search (NAS): Explore various model architectures automatically to identify the best configuration for your task.
  • Hyperparameter Optimization: Fine-tune model hyperparameters using state-of-the-art optimization algorithms.
  • Transfer Learning: Utilize pre-trained models and adapt them to your specific use case.
  • Model Generation: Create new model architectures based on predefined constraints and datasets.
  • Visualization Tools: Visualize model performance, architecture, and optimization progress with built-in tools.

Installation

To get started with DeepSynth, install the package and its dependencies, either from the requirements.txt file or directly via pip.

Using requirements.txt

  1. Clone the repository or download the requirements.txt file.

  2. Install the dependencies:

    pip install -r requirements.txt
    

Using pip

Install DeepSynth directly from PyPI:

pip install deepsynth

Usage

Here are some examples of how to use various features of DeepSynth:

Neural Architecture Search (NAS)

from deepsynth.nas import NeuralArchitectureSearch
import numpy as np

# Prepare sample data
X_train = np.random.rand(1000, 20)  # 1000 samples, 20 features
y_train = np.random.randint(2, size=1000)  # Binary target

# Define search space
search_space = [
    {'layers': 2, 'units': 64},
    {'layers': 3, 'units': 128},
    {'layers': 4, 'units': 256}
]

# Perform NAS
nas = NeuralArchitectureSearch(search_space)
best_model = nas.search_best_model(X_train, y_train)
print("Best model found by NAS:")
best_model.summary()
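
Conceptually, a search over a discrete space like the one above is a loop that builds and scores one model per candidate configuration, keeping the best. The sketch below illustrates that idea without DeepSynth; `score_config` is a made-up stand-in for "train the candidate and return its validation accuracy", not part of the package API.

```python
import numpy as np

def score_config(config, X, y):
    """Stand-in for training a candidate model and returning its
    validation accuracy. A toy heuristic is used here purely for
    illustration."""
    rng = np.random.default_rng(config['layers'] * 1000 + config['units'])
    return 0.5 + 0.1 * config['layers'] + rng.uniform(0, 0.05)

# Toy data, same shapes as in the NAS example
X = np.random.rand(100, 20)
y = np.random.randint(2, size=100)

search_space = [
    {'layers': 2, 'units': 64},
    {'layers': 3, 'units': 128},
    {'layers': 4, 'units': 256},
]

# Exhaustive search: score every candidate and keep the best
best_config = max(search_space, key=lambda c: score_config(c, X, y))
print("Best configuration:", best_config)
```

Real NAS implementations replace both the exhaustive loop (with evolutionary, RL-based, or gradient-based strategies) and the toy scorer (with actual training runs), but the structure is the same.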

Hyperparameter Optimization

from deepsynth.hyperparameter_optimization import HyperparameterOptimization
from deepsynth.model_generation import ModelGeneration
from hyperopt import hp
import numpy as np

# Sample data (same shapes as in the NAS example)
X_train = np.random.rand(1000, 20)  # 1000 samples, 20 features
y_train = np.random.randint(2, size=1000)  # Binary target

def objective_function(params):
    model = ModelGeneration(params['num_layers'], params['units_per_layer']).generate_model()
    model.fit(X_train, y_train, epochs=5, verbose=0)
    # Evaluated on the training data here for brevity; use a validation split in practice
    loss, accuracy = model.evaluate(X_train, y_train, verbose=0)
    return {'loss': loss, 'status': 'ok'}

# Define hyperparameter search space
space = {
    'num_layers': hp.choice('num_layers', [2, 3, 4]),
    'units_per_layer': hp.choice('units_per_layer', [32, 64, 128])
}

# Optimize hyperparameters
optimizer = HyperparameterOptimization(objective_function)
best_params = optimizer.optimize(space)
print("Best hyperparameters found:")
print(best_params)
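
The core idea behind any hyperparameter optimizer, sampling candidate settings, scoring each one, and keeping the best, can be sketched with a plain random search using only the standard library. `toy_loss` below is a made-up objective for illustration, not part of DeepSynth or hyperopt.

```python
import random

def toy_loss(num_layers, units_per_layer):
    """Made-up objective: pretend the sweet spot is 3 layers of 64 units."""
    return abs(num_layers - 3) + abs(units_per_layer - 64) / 64

space = {
    'num_layers': [2, 3, 4],
    'units_per_layer': [32, 64, 128],
}

random.seed(0)
best_params, best_loss = None, float('inf')
for _ in range(20):  # draw 20 random candidates
    params = {name: random.choice(choices) for name, choices in space.items()}
    loss = toy_loss(**params)
    if loss < best_loss:
        best_params, best_loss = params, loss

print("Best hyperparameters found:", best_params)
```

Libraries like hyperopt improve on this by modeling which regions of the space look promising (e.g. Tree-structured Parzen Estimators) instead of sampling uniformly, but the interface, an objective function plus a search space, is the same.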

Transfer Learning

from deepsynth.transfer_learning import TransferLearning

# Load and adapt a pre-trained model
transfer_learning = TransferLearning()
model = transfer_learning.load_model()
print("Transfer learning model:")
model.summary()

Model Generation

from deepsynth.model_generation import ModelGeneration

# Generate a new model
num_layers = 3
units_per_layer = 128
model_generator = ModelGeneration(num_layers, units_per_layer)
model = model_generator.generate_model()
print("Generated model:")
model.summary()

Visualization

from deepsynth.visualization import Visualization

# Example training history
history = {
    'accuracy': [0.1, 0.3, 0.5, 0.7, 0.9],
    'val_accuracy': [0.15, 0.35, 0.55, 0.75, 0.85]
}
Visualization.plot_model_performance(history)
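
If you only need the numbers rather than a plot, the best epoch can be read straight off the history dict with the standard library (the key names here mirror Keras-style history objects):

```python
history = {
    'accuracy': [0.1, 0.3, 0.5, 0.7, 0.9],
    'val_accuracy': [0.15, 0.35, 0.55, 0.75, 0.85],
}

# Epoch (0-indexed) with the highest validation accuracy
best_epoch = max(range(len(history['val_accuracy'])),
                 key=history['val_accuracy'].__getitem__)
print(f"Best epoch: {best_epoch}, "
      f"val_accuracy: {history['val_accuracy'][best_epoch]:.2f}")
```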

Examples

For more detailed examples and advanced use cases, check out the examples/ directory in the repository. The example_usage.py file includes various functionalities and applications of DeepSynth.

Contributing

We welcome contributions to DeepSynth! If you would like to contribute, please follow these steps:

  1. Fork the repository: Create a personal copy of the repository on GitHub.
  2. Create a branch: Develop your changes on a new branch.
  3. Commit changes: Make and commit your changes with clear messages.
  4. Push changes: Push your branch to your forked repository.
  5. Create a pull request: Submit a pull request describing your changes and improvements.

License

DeepSynth is licensed under the MIT License. See the LICENSE file for more details.


Contact

For further questions, support, or to report issues, please open an issue on the GitHub repository or contact the maintainers.
