Markov Chains made easy

Project description

A very simple and easy-to-use Markov Chain utility for Python:

#!/usr/bin/env python

from pyMarkov import markov

text = "This is a random bunch of text"
markov_dict = markov.train([text], 2)       # 2 is the ply
print(markov.generate(markov_dict, 10, 2))  # 2 is the ply, 10 is the length
# Example output: 'random bunch of text'
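If you are wondering what "ply" means in practice, the sketch below shows one way an order-2 (ply) word-level Markov chain can be trained and sampled. It is an illustration under that assumption, not pyMarkov's actual source; the train and generate helpers here merely mimic the shape of the library calls above.

#!/usr/bin/env python
# Illustrative sketch only -- not the pyMarkov implementation.
# Assumes "ply" is the chain order: how many preceding words each
# next word is conditioned on.
import random

def train(texts, ply):
    # Map each tuple of `ply` consecutive words to the words seen after it.
    table = {}
    for text in texts:
        words = text.split()
        for i in range(len(words) - ply):
            key = tuple(words[i:i + ply])
            table.setdefault(key, []).append(words[i + ply])
    return table

def generate(table, length, ply):
    # Start from a random state and walk the chain until `length` words
    # are produced or no known continuation remains.
    state = random.choice(list(table))
    out = list(state)
    while len(out) < length:
        choices = table.get(tuple(out[-ply:]))
        if not choices:
            break
        out.append(random.choice(choices))
    return ' '.join(out)

print(generate(train(["This is a random bunch of text"], 2), 10, 2))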


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Filename, size & hash SHA256 hash help File type Python version Upload date
PyMarkov-0.1.0.tar.gz (3.1 kB) Copy SHA256 hash SHA256 Source None Sep 18, 2013
