safer package
Project description
SAFER
This guide describes the SAFER model package.

Baseline

- data_processing_m1
  - crf_data.py
  - location_data.py
  - sensor_data.py
- data_processing_m2
  - crf_data.py
  - location_data.py
  - sensor_data.py
- model1
  - dataloader.py
  - model.py
  - predictor.py
- model2
  - dataloader.py
  - model.py
  - predictor.py
- setup.py
- README.md
How To Use

Install the package with pip: `pip install g_saf` (the distribution name comes from the wheel file listed below).

Data processing

- Location

You can load location data and preprocess it as follows:
```python
from location_processor import LocationProcessor

file_path = ''
processed_location_data = LocationProcessor.load_data_from_csv(file_path)

location_dict = {
    (37.7749, -122.4194): 'ward',
    (34.0522, -118.2437): 'hallway',
    (40.7128, -74.0060): 'other',
}
labeled_location_data = LocationProcessor.assign_location_labels(
    processed_location_data, location_dict
)
```
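The package does not document how `assign_location_labels` matches readings to labels. As a rough, hypothetical sketch (the `nearest_label` helper below is illustrative and not part of the package), coordinate-to-label assignment could work by picking the closest known location:

```python
import math

# Hypothetical sketch: label each (lat, lon) reading with the label of the
# nearest known coordinate in location_dict. Not the package's actual code.
location_dict = {
    (37.7749, -122.4194): 'ward',
    (34.0522, -118.2437): 'hallway',
    (40.7128, -74.0060): 'other',
}

def nearest_label(lat, lon, locations):
    """Return the label of the closest known coordinate (Euclidean distance)."""
    best = min(locations, key=lambda c: math.hypot(c[0] - lat, c[1] - lon))
    return locations[best]

readings = [(37.77, -122.42), (40.71, -74.01)]
labels = [nearest_label(lat, lon, location_dict) for lat, lon in readings]
print(labels)  # ['ward', 'other']
```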
- Sensor

You can load sensor data and preprocess it as follows:
```python
from sensor_processor import SensorDataProcessor

file_path = ''
sensing_data = SensorDataProcessor.load_sensing_data(file_path)
sensing_data = SensorDataProcessor.process_sensing_data(sensing_data)
sensing_data = SensorDataProcessor.aggregate_sensing_data(sensing_data)
sensing_data = SensorDataProcessor.reorganize_column_names(sensing_data)
```
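The internals of `aggregate_sensing_data` are not shown here. As a hedged sketch of what per-patient aggregation of raw sensor readings might look like (field names are assumptions, not the package's schema):

```python
from collections import defaultdict

# Hypothetical sketch of sensor aggregation: average each sensor's readings
# per patient. Record fields are illustrative only.
raw_readings = [
    {'patient': 'p1', 'sensor': 'heart_rate', 'value': 80},
    {'patient': 'p1', 'sensor': 'heart_rate', 'value': 90},
    {'patient': 'p1', 'sensor': 'steps', 'value': 1200},
]

def aggregate(readings):
    sums = defaultdict(lambda: [0, 0])  # (patient, sensor) -> [total, count]
    for r in readings:
        key = (r['patient'], r['sensor'])
        sums[key][0] += r['value']
        sums[key][1] += 1
    return {k: total / count for k, (total, count) in sums.items()}

print(aggregate(raw_readings))
# {('p1', 'heart_rate'): 85.0, ('p1', 'steps'): 1200.0}
```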
- Patient data (two types of data)

You can load patient data and preprocess it as follows:
```python
import pandas as pd

from crf_data import DataProcessor

status_file_path = ''
trait_file_path = ''

processor = DataProcessor()
processor.load_data(
    location_file=labeled_location_data,
    sensor_file=sensing_data,
    crf_file=status_file_path,
    trait_file=trait_file_path,
)
processor.merge_location_and_sensor()
processor.process_crf_data()
processor.merge_trait_data()

suicide_flags = [
    ('patient_key', pd.to_datetime('2023-12-02 00:00:00')),
    # ...
]
final_data = processor.clean_and_set_suicide_flag(suicide_flags)
final_data = filter_data_for_self_harm_and_random(final_data, suicide_flags)
```
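Beyond its name, `clean_and_set_suicide_flag` is not documented. A rough illustration of one plausible interpretation (hypothetical logic, not the package's implementation): mark a record as flagged when its patient appears in `suicide_flags` and the record's timestamp falls on or before the flagged datetime.

```python
from datetime import datetime

# Hypothetical sketch: set suicide_flag = 1 for records whose patient is in
# the flag list and whose timestamp is on/before the flagged datetime.
suicide_flags = [('p1', datetime(2023, 12, 2))]

def set_flag(records, flags):
    flag_map = dict(flags)
    for rec in records:
        cutoff = flag_map.get(rec['patient_key'])
        rec['suicide_flag'] = int(cutoff is not None and rec['timestamp'] <= cutoff)
    return records

records = [
    {'patient_key': 'p1', 'timestamp': datetime(2023, 12, 1)},
    {'patient_key': 'p2', 'timestamp': datetime(2023, 12, 1)},
]
flagged = set_flag(records, suicide_flags)
print([r['suicide_flag'] for r in flagged])  # [1, 0]
```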
Model

- m1

You can run m1 prediction on the preprocessed data as follows:

```python
from model1.model import TemporalFusionTransformer
from model1.predictor import PredictionHandler

data_paths = ['']
predictor = PredictionHandler(data_paths, batch_size=16, device='cpu')
predictions = predictor.predict()
```

- m2
```python
import torch

from model2.model import CNNGRUClassificationModel
from model2.predictor import Predictor

data_path = ''
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
predictor = Predictor(device=device)
data_loader = predictor.preprocess_data(data_path)
predictions = predictor.predict(data_loader)
print(predictions)
```
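Both predictors batch their input before inference (m1 via `batch_size=16`, m2 via a data loader). As a minimal, framework-free illustration of that batching pattern (the `fake_model` below is a stand-in, not either model):

```python
# Hypothetical sketch of batched inference: split samples into fixed-size
# batches, run a model over each batch, and concatenate per-batch outputs.
def batches(samples, batch_size):
    for i in range(0, len(samples), batch_size):
        yield samples[i:i + batch_size]

def fake_model(batch):
    # Stand-in for model1/model2 inference: one score per sample.
    return [x * 2 for x in batch]

samples = list(range(5))
predictions = []
for batch in batches(samples, batch_size=2):
    predictions.extend(fake_model(batch))
print(predictions)  # [0, 2, 4, 6, 8]
```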
File details
Details for the file g_saf-0.0.1-py3-none-any.whl.
File metadata
- Download URL: g_saf-0.0.1-py3-none-any.whl
- Upload date:
- Size: 26.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.7.16
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 66e8b7a0100efeb4113770ac8e9b5f7b39b508ed83dfd971dea2deb1f83cc6a8 |
| MD5 | 4d9803029c6ba0a30a73f01a1f3e5322 |
| BLAKE2b-256 | 42dd0c5cd00ec8810ebb31efbb189a53b8fbb13b4e7f2318d8b8ee00842fefae |