Package for crowd counting
Project description
Crowd Counting Package
crowdcount is a library for crowd counting with PyTorch, supported by Fudan-VTS Research.
Source code: https://github.com/FDU-VTS/crowd-count
Documentation: https://crowd-count.readthedocs.io/en/latest/
Install
pip install crowdcount --user --upgrade
Introduction
Crowd counting task:
- estimate the number of people in a crowd
User guide:
- models (a minimal forward-pass sketch follows this list)
from crowdcount.models import *  # crowd counting models, including csr_net, mcnn, resnet50, resnet101, unet and vgg
- transforms
import crowdcount.transforms as cc_transforms  # data transforms
- data_loader
from crowdcount.data.data_loader import *  # includes ShanghaiTech, UCF_QNRF, UCF_CC_50 and Fudan-ShanghaiTech for now
- data_preprocess
from crowdcount.data.data_preprocess import *  # Gaussian preprocessing for the datasets
- utils
from crowdcount.utils import *  # includes loss functions, optimizers, TensorBoard and the save function
- engine
from crowdcount.engine import train  # start training with train(*args, **kwargs)
- More details in the documentation
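A minimal forward-pass sketch (not from the official documentation): it assumes that a model such as Res101 accepts a (batch, 3, H, W) image tensor and returns a density map whose spatial sum approximates the crowd count, which is the usual convention for density-based counters.

# Hedged sketch: forward pass on a dummy image tensor.
# Assumption: the model outputs a density map; summing it yields the count estimate.
import torch
from crowdcount.models import Res101

model = Res101()
model.eval()
dummy_img = torch.rand(1, 3, 384, 512)  # (batch, channels, height, width)
with torch.no_grad():
    density_map = model(dummy_img)
print("estimated count:", density_map.sum().item())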
Demo
from crowdcount.engine import train
from crowdcount.models import Res101
from crowdcount.data.data_loader import *
from crowdcount.utils import *
import crowdcount.transforms as cc_transforms
import torchvision.transforms as transforms
# init model
model = Res101()
# init transforms
img_transform = transforms.Compose([transforms.ToTensor(),
                                    transforms.Normalize(mean=[0.452016860247, 0.447249650955, 0.431981861591],
                                                         std=[0.23242045939, 0.224925786257, 0.221840232611])])
gt_transform = cc_transforms.LabelEnlarge()
both_transform = cc_transforms.ComplexCompose([cc_transforms.TransposeFlip()])
# init dataset
train_set = ShanghaiTechDataset(mode="train",
                                part="b",
                                img_transform=img_transform,
                                gt_transform=gt_transform,
                                both_transform=both_transform,
                                root="/home/vts/chensongjian/CrowdCount/crowdcount/data/datasets/shtu_dataset_sigma_15")
test_set = ShanghaiTechDataset(mode="test",
                               part="b",
                               img_transform=img_transform,
                               root="/home/vts/chensongjian/CrowdCount/crowdcount/data/datasets/shtu_dataset_sigma_15")
# init loss
train_loss = AVGLoss()
test_loss = EnlargeLoss(100)
# init save function
saver = Saver(path="../exp/2019-12-22-main_sigma15_6")
# init tensorboard
tb = TensorBoard(path="../runs/2019-12-22-main_sigma15_6")
# start to train
train(model, train_set, test_set, train_loss, test_loss, optim="Adam", saver=saver, cuda_num=[3], train_batch=2,
      test_batch=2, learning_rate=1e-5, epoch_num=500, enlarge_num=100, tensorboard=tb)
- You can find more demos in the demo directory of the repository.
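Below is a hedged inference sketch (not part of the package's documented examples): the checkpoint filename is hypothetical, and it assumes the trained model returns a density map scaled by the enlarge factor of 100 used above, so the sum is divided by 100 to recover the count.

# Illustrative inference sketch: estimate the count for a single image.
# The checkpoint path is hypothetical; adapt it to whatever file Saver actually wrote.
import torch
import torchvision.transforms as transforms
from PIL import Image
from crowdcount.models import Res101

model = Res101()
state_dict = torch.load("../exp/2019-12-22-main_sigma15_6/best_model.pth", map_location="cpu")  # hypothetical filename
model.load_state_dict(state_dict)
model.eval()

img_transform = transforms.Compose([transforms.ToTensor(),
                                    transforms.Normalize(mean=[0.452016860247, 0.447249650955, 0.431981861591],
                                                         std=[0.23242045939, 0.224925786257, 0.221840232611])])
img = img_transform(Image.open("test.jpg").convert("RGB")).unsqueeze(0)

with torch.no_grad():
    density_map = model(img)

# If training used LabelEnlarge() / enlarge_num=100, the predicted density map is
# scaled by 100, so divide the sum by 100 to recover the people count.
print("estimated count:", density_map.sum().item() / 100)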
Experiments
We will add the results soon.
Thanks for the support from Fudan-VTS Research.
Project details
Download files
Source Distribution
crowdcount-0.1.1.tar.gz (16.7 kB)
Built Distribution
crowdcount-0.1.1-py3-none-any.whl (30.8 kB)
File details
Details for the file crowdcount-0.1.1.tar.gz.
File metadata
- Download URL: crowdcount-0.1.1.tar.gz
- Upload date:
- Size: 16.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.6.0 requests-toolbelt/0.9.1 tqdm/4.28.1 CPython/3.6.8
File hashes
Algorithm | Hash digest
---|---
SHA256 | 2e26fa454ee2cb3d199c04b9bf8e1f4c452e45abcd44adb8bdee0b645c55e6fb
MD5 | 97e98fa7297f3d96945d7c2f8c75d856
BLAKE2b-256 | dd99cbc58978e5c3bc2d2caa9e205e9e7ceddf735ec9a9df67c7398f93ab66e6
File details
Details for the file crowdcount-0.1.1-py3-none-any.whl.
File metadata
- Download URL: crowdcount-0.1.1-py3-none-any.whl
- Upload date:
- Size: 30.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.6.0 requests-toolbelt/0.9.1 tqdm/4.28.1 CPython/3.6.8
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5bd03935fe987edb3ae33c3582a16eed6c5a917a1c08f925c8689efc099430a7
MD5 | c8aa955b20f5e0826cf43a9c0f31ed65
BLAKE2b-256 | a80f0bd17bbfd749f416bdc7287751dd835a70e8b2e98e6f6840f45627f09141