Highlights • Overview • Install • Getting Started • Documentation • Tutorial • Contributing • Release Notes • Blog
What is it
⚡NBoost is a scalable, search-api-boosting platform for developing and deploying SOTA models to improve the relevance of search results.
Overview
**This project is still under development and the core package is not ready for distribution.**
Install NBoost
There are two ways to get NBoost, either as a Docker image or as a PyPi package. For cloud users, we highly recommend using NBoost via Docker.
Run NBoost as a Docker Container
docker run koursaros/nboost:latest-alpine
This command downloads the latest NBoost image (based on Alpine Linux) and runs it in a container. When the container runs, it prints an informational message and exits.
📦 Install NBoost via pip
You can also install NBoost as a Python3 package via:
pip install nboost
Note that this will only install a "barebones" version of NBoost, consisting of the minimal dependencies needed to run NBoost.
🚸 TensorFlow, PyTorch and torchvision are not part of the NBoost installation. Depending on your model, you may have to install them in advance.
Though not recommended, you can install NBoost with full dependencies via:
pip install nboost[all]
Either way, if you see the usage message after running $ nboost --help or $ docker run koursaros/nboost --help, then you are ready to go!
Getting Started
- Preliminaries
- Setting up a Neural Proxy for Elasticsearch in 1 minute
- Elastic made easy
- Deploying a distributed proxy via Docker Swarm/Kubernetes
- Take-home messages
Preliminaries
Before we start, let me first introduce the most important concept, the Proxy.
📡The Proxy
The proxy object is the core of NBoost. It has four components: the model, server, db, and codex. The only role of the proxy is to manage these four components.
- Model: ranks search results before they are sent to the client, and trains on client feedback;
- Server: receives incoming client requests and passes them to the other components;
- Db: stores past searches in order to learn from client feedback, and handles logging/benchmarking;
- Codex: translates incoming messages to and from specific search APIs (e.g. Elasticsearch).
Setting up a Neural Proxy for Elasticsearch in 1 minute
In this example we will set up a proxy to sit in between the client and Elasticsearch and boost the results!
Command line
🚧 Under construction.
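While the official walkthrough is under construction, the core idea of the proxy, namely intercepting an Elasticsearch response and re-ordering its hits before they reach the client, can be sketched as follows. The response shape mirrors Elasticsearch's `hits.hits` envelope; the overlap scorer is a stand-in for the neural model:

```python
# Sketch: re-rank an Elasticsearch-style response before returning it.
# The term-overlap scorer below stands in for a neural relevance model.

def rerank_response(query, es_response):
    """Return a copy of the response with hits re-ordered by model score."""
    def score(hit):
        text = hit["_source"].get("passage", "").lower()
        return sum(term in text for term in query.lower().split())
    hits = sorted(es_response["hits"]["hits"], key=score, reverse=True)
    return {**es_response, "hits": {**es_response["hits"], "hits": hits}}

response = {
    "hits": {"total": 2, "hits": [
        {"_id": "1", "_source": {"passage": "gardening tips for beginners"}},
        {"_id": "2", "_source": {"passage": "boosting search relevance with neural models"}},
    ]}
}
reranked = rerank_response("neural search relevance", response)
print([h["_id"] for h in reranked["hits"]["hits"]])  # doc 2 is promoted first
```

Because the proxy returns the same envelope it received, the client keeps talking plain Elasticsearch and never needs to know a re-ranker is in the middle.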
Elastic made easy
To increase the number of parallel proxies, simply increase --workers:
🚧 Under construction.
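Conceptually, a --workers flag fans incoming requests out over a pool of concurrent handlers. A minimal sketch of that pattern, using Python's standard thread pool (the handler and worker count are hypothetical, not NBoost internals):

```python
# Sketch: fanning requests out over N workers, as a --workers flag might.
from concurrent.futures import ThreadPoolExecutor

def handle(request_id):
    # Stand-in for one proxy worker serving a single search request.
    return f"ranked results for request {request_id}"

workers = 4  # analogous to --workers 4
with ThreadPoolExecutor(max_workers=workers) as pool:
    results = list(pool.map(handle, range(8)))
print(len(results))  # all 8 requests handled
```

The same idea scales past one machine by running more containers instead of more workers, which is what the orchestration section below covers.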
Deploying a distributed proxy via Docker Swarm/Kubernetes
🚧 Under construction.
Take-home messages
Let's make a short recap of what we have learned.
- NBoost is a result-boosting proxy with four fundamental components: model, server, db and codex.
- One can increase the number of concurrent proxies with --workers or by deploying more containers.
- NBoost can be deployed using an orchestration engine to coordinate load-balancing. It supports Kubernetes, Docker Swarm, and a built-in multi-process/thread solution.
Documentation
The official NBoost documentation is hosted on nboost.readthedocs.io. It is automatically built, updated and archived on every new release.
Tutorial
🚧 Under construction.
Benchmark
We have set up /benchmarks to track the network/model latency across different NBoost versions.
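Separating model latency from total request latency is the crux of such a benchmark. A minimal sketch of the timing approach, where the sleeping function is a placeholder for real re-ranking work:

```python
# Sketch: timing the model step inside a proxied request.
import time

def model_rank(results):
    time.sleep(0.01)  # stand-in for neural re-ranking work
    return sorted(results)

def proxied_search(results):
    start = time.perf_counter()
    ranked = model_rank(results)
    model_ms = (time.perf_counter() - start) * 1000
    return ranked, model_ms

ranked, model_ms = proxied_search(["b", "a", "c"])
print(ranked, f"model latency: {model_ms:.1f} ms")
```

Subtracting the model time from the end-to-end time measured at the client isolates the network overhead the proxy adds.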
Contributing
Contributions are greatly appreciated! You can make corrections or updates and commit them to NBoost. Here are the steps:
- Create a new branch, say fix-nboost-typo-1
- Fix/improve the codebase
- Commit the changes. Note that the commit message must follow the naming style, e.g. Fix/model-bert: improve the readability and move sections
- Make a pull request. Note that the pull request title must also follow the naming style. It can simply be one of your commit messages, copied and pasted, e.g. Fix/model-bert: improve the readability and move sections
- Submit your pull request and wait for all checks to pass (usually 10 minutes):
- Coding style
- Commit and PR styles check
- All unit tests
- Request reviews from one of the developers from our core team.
- Merge!
More details can be found in the contributor guidelines.
Citing NBoost
If you use NBoost in an academic paper, we would love to be cited. Here are the two ways of citing NBoost:
- \footnote{https://github.com/koursaros-ai/nboost}
- @misc{koursaros2019NBoost,
    title={NBoost: Neural Boosting Search Results},
    author={Thienes, Cole and Pertschuk, Jack},
    howpublished={\url{https://github.com/koursaros-ai/nboost}},
    year={2019}
  }
License
If you have downloaded a copy of the NBoost binary or source code, please note that the NBoost binary and source code are both licensed under the Apache License, Version 2.0.
Koursaros AI is excited to bring this open source software to the community. Copyright (C) 2019. All rights reserved.