Search Data Collector
Thread-safe and atomic collection of tabular data into csv-files.
Introduction
The search-data-collector provides a single class with the following methods to manage data (see the sketch after this list):
- save
- append
- load
- remove
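A minimal sketch of how these methods fit together. This is not taken from the package documentation: the append and load calls mirror the examples further below, while the remove call is an assumption based on the method list above and its exact signature may differ.
import pandas as pd
from search_data_collector import CsvSearchData

collector = CsvSearchData("./search_data.csv")  # the csv file is created automatically

# append single rows as dictionaries (e.g. during an optimization run)
collector.append({"x": 1.0, "y": 2.0, "score": -5.0})
collector.append({"x": -0.5, "y": 3.0, "score": -9.25})

# alternatively, save a whole pandas DataFrame at once after the run
collector.save(pd.DataFrame([{"x": 0.0, "y": 0.0, "score": 0.0}]))

# load everything back as a dataframe
search_data = collector.load()
print(search_data)

# remove the collected data again (call signature assumed; only the method name is documented above)
collector.remove()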
The Search-Data-Collector was created as a utility for the Gradient-Free-Optimizers and Hyperactive packages. It is intended as a tool to collect search-data from an optimization run. The search-data can be collected during the optimization run as a dictionary via the append-method, or after the run as a dataframe with the save-method.
The append-method is thread-safe to work with hyperactive-multiprocessing (see the sketch below). The save-method is atomic to avoid accidental data loss when the save process is interrupted.
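To illustrate the thread-safety claim, here is a small sketch (not from the package documentation) that appends result rows from several Python threads at once; with a thread-safe append, every row should end up in the csv file without corrupting the others.
from concurrent.futures import ThreadPoolExecutor

from search_data_collector import CsvSearchData

collector = CsvSearchData("./search_data.csv")  # the csv file is created automatically


def evaluate(x):
    score = -x * x
    # each worker appends its own result row; append is documented as thread-safe
    collector.append({"x": x, "score": score})
    return score


# run the evaluations concurrently from multiple threads
with ThreadPoolExecutor(max_workers=4) as pool:
    list(pool.map(evaluate, range(20)))

search_data = collector.load()  # all 20 rows should be present
print(search_data)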
For the Hyperactive-package, the search-data-collector handles functions in the data by converting them to strings. When loading the data, you can pass the search-space to convert the strings back to functions.
Disclaimer
This project is in an early development stage and is sparsely tested. If you encounter bugs or have suggestions for improvements, please open an issue.
Installation
pip install search-data-collector
Examples
Append search-data
import numpy as np
from hyperactive import Hyperactive
from search_data_collector import CsvSearchData

collector = CsvSearchData("./search_data.csv")  # the csv is created automatically


def parabola_function(para):
    loss = para["x"] * para["x"] + para["y"] * para["y"]

    data_dict = dict(para)  # copy the parameter dictionary
    data_dict["score"] = -loss  # add the score to the dictionary

    collector.append(data_dict)  # you can append a dictionary to the csv

    return -loss


search_space = {
    "x": list(np.arange(-10, 10, 0.1)),
    "y": list(np.arange(-10, 10, 0.1)),
}

hyper = Hyperactive()
hyper.add_search(parabola_function, search_space, n_iter=1000)
hyper.run()

search_data = hyper.search_data(parabola_function)
search_data = collector.load(search_space)  # load data

print("\n search_data \n", search_data)
Save search-data
import numpy as np
from hyperactive import Hyperactive
from search_data_collector import CsvSearchData

collector = CsvSearchData("./search_data.csv")  # the csv is created automatically


def parabola_function(para):
    loss = para["x"] * para["x"] + para["y"] * para["y"]
    return -loss


search_space = {
    "x": list(np.arange(-10, 10, 0.1)),
    "y": list(np.arange(-10, 10, 0.1)),
}

hyper = Hyperactive()
hyper.add_search(parabola_function, search_space, n_iter=1000)
hyper.run()

search_data = hyper.search_data(parabola_function)

collector.save(search_data)  # save a dataframe instead of appending a dictionary
search_data = collector.load(search_space)  # load data

print("\n search_data \n", search_data)
Functions in the search-space/search-data
import numpy as np
from hyperactive import Hyperactive
from search_data_collector import CsvSearchData

collector = CsvSearchData("./search_data.csv")  # the csv is created automatically


def parabola_function(para):
    loss = para["x"] * para["x"] + para["y"] * para["y"]
    return -loss


# just some dummy functions to show how this works
def function1():
    print("this is function1")


def function2():
    print("this is function2")


def function3():
    print("this is function3")


search_space = {
    "x": list(np.arange(-10, 10, 0.1)),
    "y": list(np.arange(-10, 10, 0.1)),
    "string.example": ["string1", "string2", "string3"],
    "function.example": [function1, function2, function3],
}

hyper = Hyperactive()
hyper.add_search(parabola_function, search_space, n_iter=30)
hyper.run()

search_data = hyper.search_data(parabola_function)

collector.save(search_data)  # save a dataframe instead of appending a dictionary
search_data = collector.load()  # load data without the search-space

print(
    "\n In this dataframe the 'function.example'-column contains strings, which are the '__name__' of the functions. \n search_data \n ",
    search_data,
    "\n",
)
search_data = collector.load(search_space) # load data with search-space
print(
    "\n In this dataframe the 'function.example'-column contains the functions again. \n search_data \n ",
    search_data,
    "\n",
)