Finetuner lets you tune the weights of any deep neural network for better embeddings on search tasks.
Project description
Fine-tuning embeddings on domain-specific data for better performance on neural search tasks.
Fine-tuning deep neural networks (DNNs) significantly improves performance on domain-specific neural search tasks. However, fine-tuning for neural search is not trivial: it requires expertise in both machine learning and information retrieval. Finetuner makes fine-tuning simple and fast by handling all related complexity and infrastructure in the cloud. With Finetuner, you can easily make models more performant and production-ready.
📈 Performance boost: Finetuner significantly increases the performance of pretrained models on domain-specific neural search applications.
🔱 Simple yet powerful: Interacting with Finetuner is simple and seamless, yet it supports rich features such as a choice of loss functions (e.g. Siamese/triplet loss), metric learning, layer pruning, weight freezing, dimensionality reduction, and much more; see the sketch after this list.
☁ Fine-tune in the cloud: Finetuner runs your fine-tuning jobs in the cloud, so you never have to worry about provisioning resources. Finetuner handles all related complexity and infrastructure.
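As a concrete illustration of these options, the sketch below passes loss, weight-freezing, and dimensionality settings to finetuner.fit. Note that the loss, freeze, and output_dim parameter names are assumptions for illustration and may differ between client versions:

import finetuner

# A minimal sketch of an advanced run. The parameter names below
# (loss, freeze, output_dim) are assumptions for illustration;
# consult the documentation of your client version for the exact API.
run = finetuner.fit(
    model='resnet50',
    train_data='resnet-tll-train-data',
    loss='TripletMarginLoss',  # metric-learning loss, e.g. triplet loss
    freeze=True,               # freeze backbone weights during tuning
    output_dim=512,            # reduce the embedding dimensionality
)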
What is the purpose of Finetuner?
Finetuner enables performance gains on domain-specific neural search tasks by fine-tuning models in the cloud. We have conducted experiments on various neural search tasks in different domains to illustrate these performance improvements.
Finetuner also aims to make fine-tuning simple and fast. When you interact with Finetuner, the API takes care of all your fine-tuning jobs in the cloud; this requires only a few lines of code from you, as demonstrated below.
How does it work?
Install
Requires Python 3.7+ on Linux/macOS.
pip install -U finetuner-client
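To check that the client installed correctly, you can import it; the package installs as the finetuner module. Printing __version__ assumes the package exposes that attribute, which is common but not guaranteed:

# Verify the installation (assumes the package exposes __version__,
# which is common but not guaranteed).
import finetuner
print(finetuner.__version__)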
Fine-tuning ResNet50 on Totally Looks Like dataset
import finetuner
from finetuner.callback import EvaluationCallback

# Log in to the Finetuner cloud service.
finetuner.login()

# Create an experiment to group related fine-tuning runs.
finetuner.create_experiment(name='tll-experiment')

# Start a fine-tuning run with a ResNet50 backbone.
run = finetuner.fit(
    model='resnet50',
    train_data='resnet-tll-train-data',
    callbacks=[EvaluationCallback(query_data='resnet-tll-eval-data')],
)

# Monitor the run.
print(run.status())
print(run.logs())

# Save the tuned model for further use.
run.save_model('resnet-tll')
This minimal example starts a fine-tuning run with only the required arguments. It performs the following steps:
- Log in to Finetuner: This is required to run fine-tuning jobs with Finetuner in the cloud.
- Create an experiment: The experiment groups runs with different configurations.
- Start the fine-tuning run: Select the backbone model, the training data, and the evaluation data for your evaluation callback.
- Monitor: Check the status and logs of your fine-tuning run; see the polling sketch after this list.
- Save the model: If your fine-tuning run has completed successfully, save the model for further use and integration.
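Since the run executes in the cloud, it is often convenient to wait for it to finish before saving the model. Below is a minimal polling sketch; it assumes run.status() returns a dict with a 'status' key and that the terminal status values are 'FINISHED' and 'FAILED', both of which may differ across client versions:

import time

# Poll the run until it reaches a terminal state. The 'status' key and
# the 'FINISHED'/'FAILED' values are assumptions for illustration.
while run.status()['status'] not in ('FINISHED', 'FAILED'):
    time.sleep(30)

if run.status()['status'] == 'FINISHED':
    run.save_model('resnet-tll')  # safe to save once the run has finished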
Support
- Take a look at the step-by-step documentation for an overview of how Finetuner works.
- Get started with our example use-cases in the Finetuner in action section.
- Use Discussions to talk about your use cases, ask questions, and get support.
- Join our Slack community and chat with other Jina AI community members about ideas.
- Join our Engineering All Hands meet-up to discuss your use case and learn about Jina AI's new features.
  - When? The second Tuesday of every month
  - Where? Zoom (see our public events calendar/.ical) and live stream on YouTube
- Subscribe to the latest video tutorials on our YouTube channel
Join Us
Finetuner is backed by Jina AI and licensed under Apache-2.0. We are actively hiring AI engineers and solution engineers to build the next neural search ecosystem in open source.
File details
Details for the file finetuner-client-0.2.2.tar.gz
File metadata
- Download URL: finetuner-client-0.2.2.tar.gz
- Upload date:
- Size: 25.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.8.10
File hashes
Algorithm | Hash digest
---|---
SHA256 | 3c2dccedde465bf9730a447478d465d7cc95e4cf8514e4d1291f11205e27ef13
MD5 | 340e96f7f94eedce7636d177cfd084e0
BLAKE2b-256 | d6f62cc102fa2dae1f266cccac66d102af76ec939eaeedbf61fde75aafeab100