DeepSynth

DeepSynth is a cutting-edge Python package for synthesizing deep learning models using advanced techniques such as Neural Architecture Search (NAS), Hyperparameter Optimization, Transfer Learning, and more. Designed to streamline the model-building process, DeepSynth enables users to create high-performance models with minimal manual effort.

Key Features

  • Neural Architecture Search (NAS): Explore various model architectures automatically to identify the best configuration for your task.
  • Hyperparameter Optimization: Fine-tune model hyperparameters using state-of-the-art optimization algorithms.
  • Transfer Learning: Utilize pre-trained models and adapt them to your specific use case.
  • Model Generation: Create new model architectures based on predefined constraints and datasets.
  • Visualization Tools: Visualize model performance, architecture, and optimization progress with built-in tools.

Installation

To get started with DeepSynth, install the package and its dependencies, either from the requirements.txt file or directly via pip.

Using requirements.txt

  1. Clone the repository or download the requirements.txt file.

  2. Install the dependencies:

    pip install -r requirements.txt
    

Using pip

Install DeepSynth directly from PyPI:

pip install deepsynth

Usage

Here are some examples of how to use various features of DeepSynth:

Neural Architecture Search (NAS)

from deepsynth.nas import NeuralArchitectureSearch
import numpy as np

# Prepare sample data
X_train = np.random.rand(1000, 20)  # 1000 samples, 20 features
y_train = np.random.randint(2, size=1000)  # Binary target

# Define search space
search_space = [
    {'layers': 2, 'units': 64},
    {'layers': 3, 'units': 128},
    {'layers': 4, 'units': 256}
]

# Perform NAS
nas = NeuralArchitectureSearch(search_space)
best_model = nas.search_best_model(X_train, y_train)
print("Best model found by NAS:")
best_model.summary()
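DeepSynth's internal search strategy isn't documented here, but conceptually, NAS over a discrete search space like the one above amounts to evaluating each candidate architecture and keeping the best. A minimal sketch (the `evaluate_candidate` scoring function below is a hypothetical stand-in for "build, train, and validate this architecture"):

```python
import numpy as np

def evaluate_candidate(config, X, y):
    # Hypothetical stand-in: a real NAS run would build and train a model
    # from `config` and return its validation accuracy. Here we derive a
    # deterministic pseudo-score from the config so the loop is runnable.
    rng = np.random.default_rng(config['layers'] * 1000 + config['units'])
    return rng.random()

def naive_nas(search_space, X, y):
    # Exhaustively score every candidate and return the best one.
    scored = [(evaluate_candidate(cfg, X, y), cfg) for cfg in search_space]
    best_score, best_cfg = max(scored, key=lambda t: t[0])
    return best_cfg, best_score

search_space = [
    {'layers': 2, 'units': 64},
    {'layers': 3, 'units': 128},
    {'layers': 4, 'units': 256},
]
X = np.random.rand(100, 20)
y = np.random.randint(2, size=100)
best_cfg, best_score = naive_nas(search_space, X, y)
print(best_cfg)
```

Real NAS systems replace exhaustive evaluation with smarter strategies (reinforcement learning, evolutionary search, weight sharing), but the evaluate-and-compare loop is the same.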

Hyperparameter Optimization

from deepsynth.hyperparameter_optimization import HyperparameterOptimization
from deepsynth.model_generation import ModelGeneration
from hyperopt import hp
import numpy as np

# Prepare sample data (same shapes as in the NAS example)
X_train = np.random.rand(1000, 20)
y_train = np.random.randint(2, size=1000)

def objective_function(params):
    model = ModelGeneration(params['num_layers'], params['units_per_layer']).generate_model()
    model.fit(X_train, y_train, epochs=5, verbose=0)
    loss, accuracy = model.evaluate(X_train, y_train, verbose=0)
    return {'loss': loss, 'status': 'ok'}

# Define hyperparameter search space
space = {
    'num_layers': hp.choice('num_layers', [2, 3, 4]),
    'units_per_layer': hp.choice('units_per_layer', [32, 64, 128])
}

# Optimize hyperparameters
optimizer = HyperparameterOptimization(objective_function)
best_params = optimizer.optimize(space)
print("Best hyperparameters found:")
print(best_params)
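Under the hood, optimizers in this style repeatedly sample a configuration from the space, call the objective, and keep the parameters with the lowest loss. A minimal random-search sketch of that loop, independent of DeepSynth and hyperopt (the `toy_objective` is invented purely for illustration):

```python
import random

def random_search(objective, space, n_trials=20, seed=0):
    # Sample one option per parameter each trial; track the lowest loss seen.
    rng = random.Random(seed)
    best_loss, best_params = float('inf'), None
    for _ in range(n_trials):
        params = {name: rng.choice(options) for name, options in space.items()}
        loss = objective(params)
        if loss < best_loss:
            best_loss, best_params = loss, params
    return best_params, best_loss

# Toy objective: pretend smaller networks generalize better on this task.
def toy_objective(params):
    return params['num_layers'] * 0.1 + params['units_per_layer'] * 0.001

space = {
    'num_layers': [2, 3, 4],
    'units_per_layer': [32, 64, 128],
}
best_params, best_loss = random_search(toy_objective, space)
print(best_params)
```

Bayesian methods such as hyperopt's TPE improve on this by using past trials to decide where to sample next, but the sample-evaluate-keep-best skeleton is identical.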

Transfer Learning

from deepsynth.transfer_learning import TransferLearning

# Load a pre-trained model
transfer_learning = TransferLearning()
model = transfer_learning.load_model()
print("Transfer learning model:")
model.summary()
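The core idea behind transfer learning, freezing a pre-trained feature extractor and training only a new head, can be sketched in plain NumPy. The frozen random weights below are a stand-in for genuinely learned features, so this is an illustration of the training pattern, not of DeepSynth's actual wrapper:

```python
import numpy as np

rng = np.random.default_rng(42)

# "Pretrained" feature extractor: frozen weights (a stand-in for real ones).
W_frozen = rng.normal(size=(20, 8))

def extract_features(X):
    # Frozen: these weights are never updated during fine-tuning.
    return np.tanh(X @ W_frozen)

# New task data
X = rng.normal(size=(200, 20))
y = (X[:, 0] > 0).astype(float)

# Train only a new logistic-regression head on the frozen features.
feats = extract_features(X)
w = np.zeros(8)
for _ in range(500):
    p = 1 / (1 + np.exp(-(feats @ w)))   # sigmoid predictions
    grad = feats.T @ (p - y) / len(y)    # gradient of log loss w.r.t. head
    w -= 0.5 * grad

acc = ((feats @ w > 0) == (y == 1)).mean()
print(f"head-only training accuracy: {acc:.2f}")
```

In a real pipeline the extractor would be a network pre-trained on a large dataset (e.g. ImageNet), and only the final layers would receive gradients.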

Model Generation

from deepsynth.model_generation import ModelGeneration

# Generate a new model
num_layers = 3
units_per_layer = 128
model_generator = ModelGeneration(num_layers, units_per_layer)
model = model_generator.generate_model()
print("Generated model:")
model.summary()
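The exact layers ModelGeneration emits are not documented here, but generating an architecture from constraints reduces to expanding a few parameters into an ordered layer specification. A hypothetical sketch (`generate_layer_spec` and its dict format are invented for illustration):

```python
def generate_layer_spec(num_layers, units_per_layer, input_dim=20, output_dim=1):
    # Expand (num_layers, units_per_layer) into an ordered layer spec,
    # the kind of structure a generator would hand to a backend like Keras.
    spec = [{'type': 'input', 'dim': input_dim}]
    for _ in range(num_layers):
        spec.append({'type': 'dense', 'units': units_per_layer, 'activation': 'relu'})
    spec.append({'type': 'dense', 'units': output_dim, 'activation': 'sigmoid'})
    return spec

spec = generate_layer_spec(3, 128)
print(len(spec))  # input + 3 hidden + output = 5
```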

Visualization

from deepsynth.visualization import Visualization

# Example training history
history = {
    'accuracy': [0.1, 0.3, 0.5, 0.7, 0.9],
    'val_accuracy': [0.15, 0.35, 0.55, 0.75, 0.85]
}
Visualization.plot_model_performance(history)
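Beyond plotting, a history dict in this Keras-style format is easy to summarize programmatically, for example to find the epoch with the best validation accuracy. A small helper (not part of DeepSynth's API):

```python
def best_epoch(history, metric='val_accuracy'):
    # Return the 1-based epoch with the highest value of `metric`.
    values = history[metric]
    idx = max(range(len(values)), key=values.__getitem__)
    return idx + 1, values[idx]

history = {
    'accuracy': [0.1, 0.3, 0.5, 0.7, 0.9],
    'val_accuracy': [0.15, 0.35, 0.55, 0.75, 0.85],
}
epoch, value = best_epoch(history)
print(epoch, value)  # 5 0.85
```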

Examples

For more detailed examples and advanced use cases, check out the examples/ directory in the repository. The example_usage.py file includes various functionalities and applications of DeepSynth.

Contributing

We welcome contributions to DeepSynth! If you would like to contribute, please follow these steps:

  1. Fork the repository: Create a personal copy of the repository on GitHub.
  2. Create a branch: Develop your changes on a new branch.
  3. Commit changes: Make and commit your changes with clear messages.
  4. Push changes: Push your branch to your forked repository.
  5. Create a pull request: Submit a pull request describing your changes and improvements.

License

DeepSynth is licensed under the MIT License. See the LICENSE file for more details.


Contact

For further questions, support, or to report issues, please open an issue on the GitHub repository or contact the maintainers.

Download files

Download the file for your platform.

Source Distribution

deepsynth-0.1.3.tar.gz (4.8 kB), uploaded as a source distribution.

Built Distribution

deepsynth-0.1.3-py3-none-any.whl (6.8 kB), uploaded for Python 3.

File details

Details for the file deepsynth-0.1.3.tar.gz.

File metadata

  • Download URL: deepsynth-0.1.3.tar.gz
  • Upload date:
  • Size: 4.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.11.9 Linux/6.5.0-1025-azure

File hashes

Hashes for deepsynth-0.1.3.tar.gz:

  • SHA256: e04f11e695ef3e4fb559da70b9c1c20eaf6ccfc37ab1a57ca25a3c3eb4878593
  • MD5: db6d518b27001a0569cbcade2fc51018
  • BLAKE2b-256: ac8b594c4b7144459980026cacc21e43eeeeaa136f11d49cee635fbc9cf6337b

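The published hashes can be checked locally after downloading, by streaming the file through Python's standard hashlib and comparing digests:

```python
import hashlib

def file_sha256(path, chunk_size=8192):
    # Stream the file through SHA-256 and return the hex digest.
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            h.update(chunk)
    return h.hexdigest()

def verify(path, expected_sha256):
    # Digests are compared case-insensitively.
    return file_sha256(path) == expected_sha256.lower()

# Usage, after downloading the sdist:
# verify('deepsynth-0.1.3.tar.gz',
#        'e04f11e695ef3e4fb559da70b9c1c20eaf6ccfc37ab1a57ca25a3c3eb4878593')
```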

File details

Details for the file deepsynth-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: deepsynth-0.1.3-py3-none-any.whl
  • Upload date:
  • Size: 6.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.11.9 Linux/6.5.0-1025-azure

File hashes

Hashes for deepsynth-0.1.3-py3-none-any.whl:

  • SHA256: 48fa0cdfabde1b8d901d6bf0a7334d9dcca7f1647271be3172c99a99aeec48e4
  • MD5: 79f42de96b5afb9c557d529e5c853524
  • BLAKE2b-256: e2a2a454370a7e419a130bbd03315837deab50680497b71b3a913c569ea971f8

