
A Quantitative-research Platform



:newspaper: What's NEW!   :sparkling_heart:

Recent released features

Feature Status
ADD model Released on Nov 22, 2021
ADARNN model Released on Nov 14, 2021
TCN model Released on Nov 4, 2021
Temporal Routing Adaptor (TRA) Released on July 30, 2021
Transformer & Localformer Released on July 22, 2021
Release Qlib v0.7.0 Released on July 12, 2021
TCTS Model Released on July 1, 2021
Online serving and automatic model rolling :star: Released on May 17, 2021
DoubleEnsemble Model Released on Mar 2, 2021
High-frequency data processing example Released on Feb 5, 2021
High-frequency trading example Part of code released on Jan 28, 2021
High-frequency data(1min) Released on Jan 27, 2021
Tabnet Model Released on Jan 22, 2021

Features released before 2021 are not listed here.

Qlib is an AI-oriented quantitative investment platform, which aims to realize the potential, empower the research, and create the value of AI technologies in quantitative investment.

It contains the full ML pipeline of data processing, model training, back-testing; and covers the entire chain of quantitative investment: alpha seeking, risk modeling, portfolio optimization, and order execution.

With Qlib, users can easily try ideas to create better Quant investment strategies.

For more details, please refer to our paper "Qlib: An AI-oriented Quantitative Investment Platform".

Plans

New features under development (ordered by estimated release time). Your feedback about these features is very important.

Feature Status
Planning-based portfolio optimization Under review: https://github.com/microsoft/qlib/pull/280
Fund data supporting and analysis Under review: https://github.com/microsoft/qlib/pull/292
Point-in-Time database Under review: https://github.com/microsoft/qlib/pull/343
High-frequency trading Under review: https://github.com/microsoft/qlib/pull/408
Meta-Learning-based data selection Initial opensource version under development

Framework of Qlib

At the module level, Qlib is a platform that consists of the components below. The components are designed as loosely coupled modules, and each component can be used stand-alone.

Name Description
Infrastructure layer Infrastructure layer provides underlying support for Quant research. DataServer provides a high-performance infrastructure for users to manage and retrieve raw data. Trainer provides a flexible interface to control the training process of models, which enables algorithms to control the training process.
Workflow layer Workflow layer covers the whole workflow of quantitative investment. Information Extractor extracts data for models. Forecast Model focuses on producing all kinds of forecast signals (e.g. alpha, risk) for other modules. With these signals, Decision Generator generates the target trading decisions (i.e. portfolio, orders) to be executed by Execution Env (i.e. the trading market). There may be multiple levels of Trading Agent and Execution Env (e.g. an order-executor trading agent with an intraday order-execution environment can behave like an inter-day trading environment, nested inside a daily portfolio-management trading agent and inter-day trading environment).
Interface layer Interface layer tries to present a user-friendly interface for the underlying system. Analyser module provides users with detailed analysis reports of forecasting signals, portfolios and execution results.
  • The modules with hand-drawn style are under development and will be released in the future.
  • The modules with dashed borders are highly user-customizable and extendible.

Quick Start

This quick start guide tries to demonstrate

  1. It is easy to build a complete Quant research workflow and try your ideas with Qlib.
  2. Even with public data and simple models, machine learning technologies can work very well in practical Quant investment.

Here is a quick demo showing how to install Qlib and run LightGBM with qrun. But please make sure you have already prepared the data following the instructions.

Installation

This table shows the Python versions supported by Qlib:

            install with pip     install from source    plot
Python 3.7  :heavy_check_mark:   :heavy_check_mark:     :heavy_check_mark:
Python 3.8  :heavy_check_mark:   :heavy_check_mark:     :heavy_check_mark:
Python 3.9  :x:                  :heavy_check_mark:     :x:

Note:

  1. Conda is suggested for managing your Python environment.
  2. Please note that installing Cython under Python 3.6 raises errors when installing Qlib from source. If you use Python 3.6 on your machine, it is recommended to upgrade to Python 3.7 or use conda's Python to install Qlib from source.
  3. For Python 3.9, Qlib supports running workflows such as training models, running backtests, and plotting most of the related figures (those included in the notebook). However, plotting model performance is not supported for now; we will fix this when the dependent packages are upgraded.

Install with pip

Users can easily install Qlib with pip using the following command:

  pip install pyqlib

Note: pip installs the latest stable qlib. However, the main branch of qlib is under active development. If you want to test the latest scripts or functions on the main branch, please install qlib with the method below.

Install from source

Users can also install the latest development version of Qlib from source with the following steps:

  • Before installing Qlib from source, users need to install some dependencies:

    pip install numpy
    pip install --upgrade  cython
    
  • Clone the repository and install Qlib as follows.

    • If you haven't installed qlib by the command pip install pyqlib before:
      git clone https://github.com/microsoft/qlib.git && cd qlib
      python setup.py install
      
    • If you have already installed the stable version by the command pip install pyqlib:
      git clone https://github.com/microsoft/qlib.git && cd qlib
      pip install .
      

    Note: Only the command pip install . can overwrite the stable version installed by pip install pyqlib, while the command python setup.py install can't.

Tips: If you fail to install Qlib or run the examples in your environment, comparing your steps with the CI workflow may help you find the problem.

Data Preparation

Load and prepare data by running the following code:

# get 1d data
python scripts/get_data.py qlib_data --target_dir ~/.qlib/qlib_data/cn_data --region cn

# get 1min data
python scripts/get_data.py qlib_data --target_dir ~/.qlib/qlib_data/cn_data_1min --region cn --interval 1min

This dataset is created from public data collected by crawler scripts, which have been released in the same repository. Users can create the same dataset with them.

Please note that the data are collected from Yahoo Finance and might not be perfect. We recommend that users prepare their own data if they have a high-quality dataset. For more information, refer to the related document.
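After downloading, it can help to sanity-check the data directory before running workflows. Below is a minimal sketch: the layout check reflects Qlib's usual on-disk format, and the query at the end is commented out because it requires pyqlib and the prepared data.

```python
from pathlib import Path

# Qlib's binary data layout usually contains these top-level directories.
expected_layout = ["calendars", "instruments", "features"]

data_dir = Path("~/.qlib/qlib_data/cn_data").expanduser()
missing = [d for d in expected_layout if not (data_dir / d).is_dir()]
if missing:
    print(f"Data not fully prepared; missing: {missing}")

# With pyqlib installed, the prepared data can then be queried, e.g.:
# import qlib
# from qlib.data import D
# qlib.init(provider_uri="~/.qlib/qlib_data/cn_data", region="cn")
# df = D.features(["SH600000"], ["$close", "$volume"], start_time="2020-01-01")
```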

Automatic update of daily frequency data (from yahoo finance)

It is recommended that users update the data manually once (--trading_date 2021-05-25) and then set it to update automatically.

For more information refer to: yahoo collector

  • Automatic update of data to the "qlib" directory each trading day (Linux)

    • use crontab: crontab -e

    • set up timed tasks:

      * * * * 1-5 python <script path> update_data_to_bin --qlib_data_1d_dir <user data dir>
      
      • script path: scripts/data_collector/yahoo/collector.py
  • Manual update of data

    python scripts/data_collector/yahoo/collector.py update_data_to_bin --qlib_data_1d_dir <user data dir> --trading_date <start date> --end_date <end date>
    
    • trading_date: start of trading day
    • end_date: end of trading day (not included)

Auto Quant Research Workflow

Qlib provides a tool named qrun to run the whole workflow automatically (including building the dataset, training models, backtesting and evaluation). You can start an auto quant research workflow and get graphical report analysis with the following steps:

  1. Quant Research Workflow: Run qrun with the LightGBM workflow config (workflow_config_lightgbm_Alpha158.yaml) as follows.

      cd examples  # Avoid running the program under a directory that contains `qlib`
      qrun benchmarks/LightGBM/workflow_config_lightgbm_Alpha158.yaml
    

    If users want to use qrun under debug mode, please use the following command:

    python -m pdb qlib/workflow/cli.py examples/benchmarks/LightGBM/workflow_config_lightgbm_Alpha158.yaml
    

    The result of qrun is shown below; please refer to Intraday Trading for more details about the result.

    'The following are analysis results of the excess return without cost.'
                           risk
    mean               0.000708
    std                0.005626
    annualized_return  0.178316
    information_ratio  1.996555
    max_drawdown      -0.081806
    'The following are analysis results of the excess return with cost.'
                           risk
    mean               0.000512
    std                0.005626
    annualized_return  0.128982
    information_ratio  1.444287
    max_drawdown      -0.091078
    

    Here are detailed documents for qrun and workflow.

  2. Graphical Reports Analysis: Run examples/workflow_by_code.ipynb with jupyter notebook to get graphical reports

    • Forecasting signal (model prediction) analysis

      • Cumulative Return of groups
      • Return distribution
      • Information Coefficient (IC) and monthly IC
      • Auto Correlation of forecasting signal (model prediction)
    • Portfolio analysis

      • Backtest return
    • Explanation of above results
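The risk table in the qrun output above can be reproduced from a daily excess-return series. Below is a minimal NumPy sketch, assuming 252 trading days per year (which matches the annualized figures above, e.g. 0.000708 × 252 ≈ 0.178); the max-drawdown definition here uses simple cumulative returns and may differ slightly from Qlib's implementation.

```python
import numpy as np

def risk_analysis(r, n=252):
    """Summarize a daily excess-return series, mirroring qrun's risk table."""
    mean = r.mean()
    std = r.std(ddof=1)
    cum = np.cumsum(r)                           # simple (non-compounded) cumulative return
    drawdown = cum - np.maximum.accumulate(cum)  # distance below the running peak
    return {
        "mean": mean,
        "std": std,
        "annualized_return": mean * n,
        "information_ratio": mean / std * np.sqrt(n),
        "max_drawdown": drawdown.min(),
    }

rng = np.random.default_rng(0)
report = risk_analysis(rng.normal(0.0007, 0.005, size=500))
```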

Building Customized Quant Research Workflow by Code

The automatic workflow may not suit the research workflow of all Quant researchers. To support a flexible Quant research workflow, Qlib also provides a modularized interface to allow researchers to build their own workflow by code. Here is a demo for customized Quant research workflow by code.
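In the modularized interface, components are described by config dicts with class / module_path / kwargs keys (the same convention used in the benchmark YAML files) and instantiated with init_instance_by_config. A hedged sketch follows: the kwargs values and date segments are illustrative, and the training calls are commented out because they require pyqlib and prepared data.

```python
# Qlib builds components from dicts; the shape below follows the benchmark
# configs (e.g. workflow_config_lightgbm_Alpha158.yaml).
model_config = {
    "class": "LGBModel",
    "module_path": "qlib.contrib.model.gbdt",
    "kwargs": {"learning_rate": 0.05, "num_leaves": 128},  # illustrative values
}
dataset_config = {
    "class": "DatasetH",
    "module_path": "qlib.data.dataset",
    "kwargs": {
        "handler": {
            "class": "Alpha158",
            "module_path": "qlib.contrib.data.handler",
            "kwargs": {"instruments": "csi300"},
        },
        # train/valid/test splits; the dates are illustrative
        "segments": {
            "train": ("2008-01-01", "2014-12-31"),
            "valid": ("2015-01-01", "2016-12-31"),
            "test": ("2017-01-01", "2020-08-01"),
        },
    },
}

# With pyqlib installed and data prepared:
# import qlib
# from qlib.utils import init_instance_by_config
# qlib.init(provider_uri="~/.qlib/qlib_data/cn_data", region="cn")
# model = init_instance_by_config(model_config)
# dataset = init_instance_by_config(dataset_config)
# model.fit(dataset)
# pred = model.predict(dataset)
```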

Quant Model (Paper) Zoo

Here is a list of models built on Qlib.

Your PR of new Quant models is highly welcomed.

The performance of each model on the Alpha158 and Alpha360 dataset can be found here.

Run a single model

All the models listed above are runnable with Qlib. Users can find the config files we provide and some details about the model through the benchmarks folder. More information can be retrieved at the model files listed above.

Qlib provides three different ways to run a single model, users can pick the one that fits their cases best:

  • Users can use the tool qrun mentioned above to run a model's workflow based on a config file.

  • Users can create a workflow_by_code python script based on the one listed in the examples folder.

  • Users can use the script run_all_model.py listed in the examples folder to run a model. Here is an example of the specific shell command to be used: python run_all_model.py run --models=lightgbm, where the --models argument can take any number of the models listed above (the available models can be found in benchmarks). For more use cases, please refer to the file's docstrings.

    • NOTE: Each baseline has different environment dependencies; please make sure that your Python version aligns with the requirements (e.g. TFT only supports Python 3.6~3.7 due to the limitation of tensorflow==1.15.0).

Run multiple models

Qlib also provides a script run_all_model.py which can run multiple models for several iterations. (Note: the script only supports Linux for now; other OSes will be supported in the future. It also does not yet support running the same model multiple times in parallel; this will be addressed in future development.)

The script will create a unique virtual environment for each model, and delete the environments after training. Thus, only experiment results such as IC and backtest results will be generated and stored.

Here is an example of running all the models for 10 iterations:

python run_all_model.py run 10

It also provides the API to run specific models at once. For more use cases, please refer to the file's docstrings.

Quant Dataset Zoo

Dataset plays a very important role in Quant. Here is a list of the datasets built on Qlib:

Dataset US Market China Market
Alpha360
Alpha158

Here is a tutorial to build dataset with Qlib. Your PR to build new Quant dataset is highly welcomed.
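Datasets in Qlib are typically built from factor expressions over raw fields such as $close or $volume, composed with lag/rolling operators like Ref, Mean and Std. A small illustrative sketch (the expressions and names below are examples, not part of any shipped dataset; the evaluation call is commented out because it requires pyqlib and prepared data):

```python
# Each expression defines one feature column; operators act on the raw
# price/volume series element-wise or over rolling windows.
fields = [
    "Ref($close, 1) / $close - 1",  # previous close over today's close, minus 1
    "Mean($close, 5) / $close",     # 5-day moving average relative to today's close
    "Std($close, 10) / $close",     # 10-day close volatility relative to today's close
]
names = ["REV1", "MA5", "STD10"]

# With pyqlib installed and data prepared, the expressions can be evaluated:
# from qlib.data import D
# df = D.features(["SH600000"], fields, start_time="2020-01-01", freq="day")
# df.columns = names
```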

More About Qlib

The detailed documents are organized in docs. Sphinx and the readthedocs theme are required to build the documentation in HTML format.

cd docs/
conda install sphinx sphinx_rtd_theme -y
# Otherwise, you can install them with pip
# pip install sphinx sphinx_rtd_theme
make html

You can also view the latest document online directly.

Qlib is in active and continuing development. Our plan is in the roadmap, which is managed as a github project.

Offline Mode and Online Mode

The data server of Qlib can be deployed in either Offline mode or Online mode. The default is Offline mode.

Under Offline mode, the data will be deployed locally.

Under Online mode, the data will be deployed as a shared data service. The data and their cache will be shared by all the clients. The data retrieval performance is expected to be improved due to a higher rate of cache hits. It will consume less disk space, too. The documents of the online mode can be found in Qlib-Server. The online mode can be deployed automatically with Azure CLI based scripts. The source code of online data server can be found in Qlib-Server repository.

Performance of Qlib Data Server

The performance of data processing is important to data-driven methods like AI technologies. As an AI-oriented platform, Qlib provides a solution for data storage and data processing. To demonstrate the performance of Qlib data server, we compare it with several other data storage solutions.

We evaluate the performance of several storage solutions by finishing the same task, which creates a dataset (14 features/factors) from the basic OHLCV daily data of a stock market (800 stocks each day from 2007 to 2020). The task involves data queries and processing.

                          HDF5       MySQL      MongoDB    InfluxDB   Qlib -E -D  Qlib +E -D  Qlib +E +D
Total (1CPU) (seconds)    184.4±3.7  365.3±7.5  253.6±6.7  368.2±3.6  147.0±8.8   47.6±1.0    7.4±0.3
Total (64CPU) (seconds)                                                           8.8±0.6     4.2±0.2
  • +(-)E indicates with (without) ExpressionCache
  • +(-)D indicates with (without) DatasetCache

Most general-purpose databases take too much time to load data. After looking into the underlying implementation, we find that data go through too many layers of interfaces and unnecessary format transformations in general-purpose database solutions. Such overheads greatly slow down the data loading process. Qlib data are stored in a compact format, which can be efficiently combined into arrays for scientific computation.

Related Reports

Contact Us

  • If you have any issues, please create an issue here or send a message on Gitter.
  • If you want to make contributions to Qlib, please create pull requests.
  • For other reasons, you are welcome to contact us by email (qlib@microsoft.com).
    • We are recruiting new members (both FTEs and interns); your resumes are welcome!

Join IM discussion groups:

Gitter

Contributing

This project welcomes contributions and suggestions.
Here are some code standards when you submit a pull request.

If you want to contribute to Qlib's document, you can follow the steps in the figure below.

Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the right to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

