# Skip-Thoughts.torch for PyTorch
Skip-Thoughts.torch is a lightweight port of the skip-thoughts pretrained models from Theano to PyTorch.
## Install from pip

```bash
pip install skipthoughts
```

## Install from repo

```bash
git clone https://github.com/Cadene/skip-thoughts.torch.git
python setup.py install
```
## Available pretrained models

### UniSkip

Uses the `nn.GRU` layer from torch with the cudnn backend. It is the fastest implementation, but in the cudnn implementation the dropout mask is resampled after each time step, which amounts to poor regularization.

### DropUniSkip

Uses the `nn.GRUCell` layer from torch with the cudnn backend. It is slightly slower than UniSkip, but the dropout mask is sampled once for all time steps in a sequence (good regularization).

### BayesianUniSkip

Uses a custom GRU layer with a torch backend. It is at least two times slower than UniSkip, but the dropout mask is sampled once for all time steps for each Linear (best regularization).

### BiSkip

Equivalent to UniSkip, but with a bi-sequential GRU.
```python
import torch
from torch.autograd import Variable
import sys
sys.path.append('skip-thoughts.torch/pytorch')
from skipthoughts import UniSkip

dir_st = 'data/skip-thoughts'
vocab = ['robots', 'are', 'very', 'cool', '<eos>', 'BiDiBu']
uniskip = UniSkip(dir_st, vocab)

input = Variable(torch.LongTensor([
    [1, 2, 3, 4, 0],  # robots are very cool 0
    [6, 2, 3, 4, 5]   # bidibu are very cool <eos>
]))  # <eos> token is optional
print(input.size())  # batch_size x seq_len

output_seq2vec = uniskip(input, lengths=[4, 5])
print(output_seq2vec.size())  # batch_size x 2400

output_seq2seq = uniskip(input)
print(output_seq2seq.size())  # batch_size x seq_len x 2400
```
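The indexing convention in the example above can be illustrated without loading any pretrained weights: each word maps to its 1-based position in `vocab`, and 0 is reserved for padding. A minimal sketch of that convention (the `encode` helper is hypothetical, not part of the skipthoughts API):

```python
# Sketch of the 1-based word-to-index encoding used in the example above.
# `encode` is a hypothetical helper, not part of the skipthoughts package.

def encode(sentence, vocab, seq_len):
    """Map words to their 1-based vocab indices, right-padded with 0."""
    word_to_idx = {word: i + 1 for i, word in enumerate(vocab)}  # 0 = padding
    ids = [word_to_idx[w] for w in sentence]
    return ids + [0] * (seq_len - len(ids))

vocab = ['robots', 'are', 'very', 'cool', '<eos>', 'BiDiBu']
print(encode(['robots', 'are', 'very', 'cool'], vocab, 5))           # [1, 2, 3, 4, 0]
print(encode(['BiDiBu', 'are', 'very', 'cool', '<eos>'], vocab, 5))  # [6, 2, 3, 4, 5]
```

Stacking such rows into a `LongTensor` gives exactly the `input` batch shown above, with `lengths` holding the unpadded length of each row.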
| Filename | Size | File type | Python version |
|---|---|---|---|
| skipthoughts-0.0.1-py3-none-any.whl | 9.1 kB | Wheel | py3 |
| skipthoughts-0.0.1.tar.gz | 9.8 kB | Source | None |