Markov Chains made easy
Project description
A very simple and easy-to-use Markov Chain utility for Python:
    #!/usr/bin/env python
    from pyMarkov import markov

    text = "This is a random bunch of text"
    markov_dict = markov.train([text], 2)       # 2 is the ply
    print(markov.generate(markov_dict, 10, 2))  # 2 is the ply, 10 is the length
    # Example output: 'random bunch of text'
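The PyMarkov source itself is not shown on this page, but `train` and `generate` follow the standard word-level Markov chain idea: `train` records which word follows each window of `ply` consecutive words, and `generate` walks that table. The sketch below is illustrative only; the `train_sketch` and `generate_sketch` names are hypothetical and not part of the PyMarkov API.

    # Minimal sketch of the technique (not the PyMarkov implementation).
    import random
    from collections import defaultdict

    def train_sketch(texts, ply):
        """Map every ply-word window to the words observed right after it."""
        table = defaultdict(list)
        for text in texts:
            words = text.split()
            for i in range(len(words) - ply):
                table[tuple(words[i:i + ply])].append(words[i + ply])
        return table

    def generate_sketch(table, length, ply):
        """Start from a random ply-word key and follow the chain up to `length` words."""
        if not table:
            return ""
        out = list(random.choice(list(table.keys())))
        while len(out) < length:
            followers = table.get(tuple(out[-ply:]))
            if not followers:      # dead end: no observed continuation
                break
            out.append(random.choice(followers))
        return " ".join(out)

    table = train_sketch(["This is a random bunch of text"], 2)
    print(generate_sketch(table, 10, 2))  # e.g. 'random bunch of text'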
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
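In most cases the package can also be installed directly from PyPI with pip (e.g. `pip install PyMarkov`, assuming the package name matches the source distribution listed below) rather than downloading the archive by hand.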
Source Distribution
PyMarkov-0.1.0.tar.gz (3.1 kB)
File details
Details for the file PyMarkov-0.1.0.tar.gz.
File metadata
- Download URL: PyMarkov-0.1.0.tar.gz
- Upload date:
- Size: 3.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 37316c4d305c4fee13d8e22354d0f7f817ee99b2b6f97568b1ab624673fae69f |
| MD5 | 83cb02c40f93080ac66d418df07bdf15 |
| BLAKE2b-256 | 891e39e9d98be3bca1ba4aa6f6a33d50c4dead6808aefd1678ce5e891877e77f |
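To check that a manually downloaded archive matches the SHA256 digest above, the standard-library hashlib module is sufficient. This is a minimal sketch that assumes the tarball was saved in the current working directory:

    # Verify PyMarkov-0.1.0.tar.gz against the SHA256 digest listed in the table above.
    import hashlib

    EXPECTED_SHA256 = "37316c4d305c4fee13d8e22354d0f7f817ee99b2b6f97568b1ab624673fae69f"

    with open("PyMarkov-0.1.0.tar.gz", "rb") as f:  # adjust the path if saved elsewhere
        digest = hashlib.sha256(f.read()).hexdigest()

    print("OK" if digest == EXPECTED_SHA256 else "Hash mismatch")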