
Human mobility and movement analysis framework.

Project description

The trackintel Framework


trackintel is a library for the analysis of spatio-temporal tracking data with a focus on human mobility. The core of trackintel is the hierarchical data model for movement data that is used in transport planning [1]. We provide functionalities for the full life cycle of human mobility data analysis: import and export of tracking data of different types (e.g., trackpoints, check-ins, trajectories), preprocessing, data quality assessment, semantic enrichment, quantitative analysis and mining tasks, and visualization of data and results. trackintel is based on pandas and GeoPandas.

You can find the documentation on the trackintel documentation page.

Try trackintel online in a MyBinder notebook.

Data model

An overview of the data model of trackintel:

  • positionfixes (Raw tracking points, e.g., GPS recordings or check-ins)
  • staypoints (Locations where a user spent time without moving, e.g., aggregations of positionfixes or check-ins)
  • activities (Staypoints with a purpose and a semantic label, e.g., meeting to drink a coffee as opposed to waiting for the bus)
  • locations (Important places that are visited more than once, e.g., home or work location)
  • triplegs (or stages) (Continuous movement without changing mode or vehicle, and without stopping for too long, e.g., a taxi trip between pick-up and drop-off)
  • trips (The sequence of all triplegs between two consecutive activities)
  • tours (A collection of sequential trips that return to the same location)
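
Each of these representations is held as a pandas DataFrame or GeoPandas GeoDataFrame with a small set of required columns. As a minimal illustrative sketch (the column names user_id and tracked_at and the geometry column follow the documented data model, but check the documentation of your trackintel version for the exact requirements), a positionfixes table could be constructed like this:

import geopandas as gpd
import pandas as pd
from shapely.geometry import Point

# Illustrative positionfixes: one row per raw tracking point.
pfs = gpd.GeoDataFrame(
    {
        "user_id": [0, 0, 0],
        "tracked_at": pd.to_datetime(
            ["2021-01-01 08:00:00", "2021-01-01 08:05:00", "2021-01-01 08:10:00"],
            utc=True,
        ),
    },
    geometry=[Point(8.54, 47.37), Point(8.55, 47.38), Point(8.56, 47.39)],
    crs="EPSG:4326",
)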

You can enter the trackintel framework if your data corresponds to any of the movement data representations mentioned above. Here are some of the functionalities that we provide:

  • Import: Import is supported from the following data formats: GeoPandas GeoDataFrames (recommended), CSV files in a specified format, and PostGIS databases. We also provide dataset readers for popular public datasets (e.g., Geolife).
  • Aggregation: We provide functionalities to aggregate into the next level of our data model, e.g., positionfixes -> staypoints; positionfixes -> triplegs; staypoints -> locations; staypoints + triplegs -> trips; trips -> tours (see the code sketch after this list).
  • Enrichment: Activity semantics for staypoints; mode of transport semantics for triplegs; high-level semantics for locations.
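
As a minimal sketch of the aggregation chain: the accessor names (as_positionfixes, as_staypoints) and the generate_* functions follow the trackintel documentation, but return values and default parameters may differ between versions, so treat this as illustrative rather than exact.

import trackintel  # registers the as_positionfixes / as_staypoints accessors

# pfs is a positionfixes GeoDataFrame as in the data model section above.
pfs, sp = pfs.as_positionfixes.generate_staypoints()    # positionfixes -> staypoints
pfs, tpls = pfs.as_positionfixes.generate_triplegs(sp)  # positionfixes -> triplegs
sp, locs = sp.as_staypoints.generate_locations(         # staypoints -> locations
    method="dbscan", epsilon=100, num_samples=1
)
# staypoints + triplegs -> trips and trips -> tours follow analogously
# (see generate_trips and the documentation for details).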

Installation and Usage

trackintel is on pypi.org; you can install it with pip install trackintel as long as GeoPandas is already installed.

You should then be able to run the examples in the examples folder or import trackintel using:

import trackintel

Development

You can install trackintel locally using pip install . (note the trailing dot). For quick testing, use trackintel.print_version().
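
For example, a quick smoke test after a local install could look like this:

import trackintel
trackintel.print_version()  # should print the installed trackintel version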

Testing is done using pytest. Simply run the tests by calling pytest in the top-level trackintel folder. If you use pipenv, install pytest first (pip install pytest) and then run it with that version: python -m pytest. The use of fixtures for data generation (e.g., trips and trackpoints) is still an open to-do; for now, there are some smaller datasets in the tests folder.

Versions follow semantic versioning. Commits follow the Conventional Commits standard; you can generate them easily using Commitizen.

You can find the development roadmap under ROADMAP.md and coding conventions under CONTRIBUTING.md.

Documentation

The documentation follows the pandas and numpy docstring standard and is built with Sphinx. You can install Sphinx using pip install -U sphinx or conda install sphinx.

If you use additional dependencies during development, do not forget to add them to autodoc_mock_imports in docs/conf.py for readthedocs.org to work properly.
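
For example, if a hypothetical development dependency named somepackage were introduced, the corresponding entry in docs/conf.py could look like this (autodoc_mock_imports is a standard Sphinx autodoc option; the package name is purely illustrative):

# docs/conf.py (excerpt)
# Packages listed here are mocked during the autodoc build so that
# readthedocs.org can build the documentation without installing them.
autodoc_mock_imports = [
    "somepackage",  # illustrative placeholder for a new development dependency
]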

You can then generate the documentation using sphinx-build -b html docs docs.gen. This will put the documentation in docs.gen, which is in .gitignore.

Continuous Integration

Travis and AppVeyor CI pipelines are set up for Unix and Windows builds. You can find the corresponding scripts in .travis.yml and appveyor.yml. Adding Coveralls is an open to-do.

Contributors

trackintel is primarily maintained by the Mobility Information Engineering Lab at ETH Zurich (mie-lab.ethz.ch). If you want to contribute, send a pull request and add yourself to the AUTHORS.md file.

References

[1] Axhausen, K. W. (2007). Definition of Movement and Activity for Transport Modelling. In Handbook of Transport Modelling. Emerald Group Publishing Limited.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

trackintel-0.5.0.tar.gz (47.7 kB)


Built Distribution

trackintel-0.5.0-py3-none-any.whl (64.3 kB)


File details

Details for the file trackintel-0.5.0.tar.gz.

File metadata

  • Download URL: trackintel-0.5.0.tar.gz
  • Upload date:
  • Size: 47.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/53.0.0 requests-toolbelt/0.9.1 tqdm/4.56.2 CPython/3.9.1

File hashes

Hashes for trackintel-0.5.0.tar.gz:

  • SHA256: e3fe99ff790ae292457aa61b28f82255e2644273aa49735b1e330a1ecbdb398f
  • MD5: f20f6dc438b8b3465c1316407f96b48b
  • BLAKE2b-256: e7775c82270bbe658123d6593a3f6b5a1f631689b695bd14f22a5a59610b9ac6


File details

Details for the file trackintel-0.5.0-py3-none-any.whl.

File metadata

  • Download URL: trackintel-0.5.0-py3-none-any.whl
  • Upload date:
  • Size: 64.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/53.0.0 requests-toolbelt/0.9.1 tqdm/4.56.2 CPython/3.9.1

File hashes

Hashes for trackintel-0.5.0-py3-none-any.whl:

  • SHA256: 49c840fa4d5be4cd9a77f9806dd7975cdec19405b469162a393b540d60a8817d
  • MD5: 0a04ca447b0f4acb1de7933f996631d2
  • BLAKE2b-256: 473631ba4c6957a7f60985c09f4eae84d5174a5264c27d56aa788004fdc90a8c

